Disable OSX display auto-detection

Discussion in 'macOS' started by tinycg, Jan 1, 2012.

  1. tinycg macrumors member

    May 8, 2009
    I know this topic has come up before, and it seems like there is still no solid answer beyond hardware such as a Gefen DVI Detective.

    I find it hard to believe there's no way in a config somewhere to have OSX ignore EDID. I'm plagued with the common problem of having a Mac hooked up to an HDTV: the TV only broadcasts EDID while it's on, so when the TV gets turned off OSX resets the display config. (On a sidenote, it still strands windows on the dead display. If it's resetting things anyway, why doesn't OSX move them to an active display? I'd love that feature too.)

    Researching all over the interwebs it seems nobody has figured it out. If SwitchResX had a way to permanently enable a display I could work with that.
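    For what it's worth, the software-side workarounds people have tried all revolve around macOS display override files, which live under /System/Library/Displays/Overrides/DisplayVendorID-*/DisplayProductID-* and are keyed on two values from the display's EDID. As a sketch (not a confirmed fix for the detection problem, and the `parse_edid_ids` helper name is mine), here's how those two IDs are decoded from a raw EDID dump:

    ```python
    def parse_edid_ids(edid: bytes):
        """Extract the manufacturer ID and product code from a 128-byte
        EDID base block -- the two values macOS uses to name its display
        override directories and files."""
        if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
            raise ValueError("not a valid EDID base block")
        # Bytes 8-9: manufacturer ID, three 5-bit letters packed big-endian
        # ('A' = 1 .. 'Z' = 26).
        word = (edid[8] << 8) | edid[9]
        letters = "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
        # Bytes 10-11: product code, little-endian.
        product = edid[10] | (edid[11] << 8)
        # The numeric vendor ID in the override path is the packed 16-bit word.
        return letters, word, product
    ```

    You can get the raw EDID bytes to feed this from `ioreg -l -w0 -r -c AppleDisplay` (look for the IODisplayEDID property). Again, overrides change how a detected display is interpreted; whether they can suppress re-detection itself is exactly what nobody seems to have cracked.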

    The constant blue flashing fade as the HDTV switches inputs or the power is toggled is hugely disruptive to one's workflow, and it's one of the only things that genuinely frustrates me about OSX.

    Any help would be greatly appreciated for my sanity.
  2. nontroppo macrumors 6502


    Mar 11, 2009
    The forced auto-detection also breaks functionality of many modern KVMs. We've tried newer DisplayPort KVMs, and there is no way to stop them from sending the DDC2B signals, so display detection kicks in every time you switch the KVM. (There were workarounds and solutions for VGA, but not for more modern digital interfaces like DisplayPort.) Wish there was a defaults command or something...