Is it possible to disable a secondary display?

Discussion in 'macOS' started by jroad, Jan 10, 2007.

  1. jroad macrumors member

    Joined:
    Mar 24, 2006
    #1
    Hi,

    I have a multiple display setup and am experiencing a software conflict because of this. If I unplug the display, the conflict goes away.

    Is there a way to temporarily disable the display through software? In Windows, this can be done easily through Display settings, but I am unsure how to do this in OS X.

    Thanks for any suggestions.
     
  2. zakatov macrumors 6502a

    zakatov

    Joined:
    Mar 8, 2005
    Location:
    South Florida
    #2
    Off the top of my head (I've used multiple displays before, but not at the moment), it can't simply be disabled while the second monitor is plugged in. A third-party app may do something close to what you want.
     
  3. disconap macrumors 68000

    disconap

    Joined:
    Oct 29, 2005
    Location:
    Portland, OR
    #3
    Before going through a bunch of new installs and the like, try setting the displays to "mirror" and then just turning off the second monitor; that might work...
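    If you end up flipping mirroring on and off a lot, that step can also be scripted instead of done through the Displays preference pane. This is just a rough sketch against the Quartz Display Services C API, untested on your setup (the 8-display cap and the for-session option are arbitrary choices), built as a tiny command-line tool:

    Code:
    /* mirror_off.c -- collapse every secondary display onto the main one.
       Build: cc mirror_off.c -o mirror_off -framework ApplicationServices */
    #include <ApplicationServices/ApplicationServices.h>
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        CGDirectDisplayID displays[8];                 /* 8 is an arbitrary cap */
        uint32_t count = 0;

        if (CGGetActiveDisplayList(8, displays, &count) != kCGErrorSuccess || count < 2) {
            fprintf(stderr, "need at least two active displays\n");
            return 1;
        }

        CGDirectDisplayID mainDisplay = CGMainDisplayID();
        CGDisplayConfigRef config;

        CGBeginDisplayConfiguration(&config);
        for (uint32_t i = 0; i < count; i++) {
            if (displays[i] != mainDisplay)
                /* to undo mirroring, pass kCGNullDirectDisplay instead of mainDisplay */
                CGConfigureDisplayMirrorOfDisplay(config, displays[i], mainDisplay);
        }
        /* kCGConfigureForSession reverts at logout; kCGConfigurePermanently keeps it */
        CGCompleteDisplayConfiguration(config, kCGConfigureForSession);
        return 0;
    }

    One run collapses everything onto the main screen for the current session; it can be undone by passing kCGNullDirectDisplay as the mirror target.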
     
  4. jroad thread starter macrumors member

    Joined:
    Mar 24, 2006
    #4
    I would be happy with an app or even a way to do it through the terminal, but so far I haven't found anything.

    Interesting idea. I tried it, but the conflict still arose.

    I'm really surprised this is not a readily available feature of the OS. Aren't Mac notebooks able to shut their screens off? Perhaps there is a button to do so...

    In any case, any other ideas or suggestions appreciated.
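    For what it's worth, the Quartz Display Services C API will at least tell a small command-line tool what the system currently considers an active display, which makes it easy to check whether a given workaround actually removed the second screen from the list. A rough, untested sketch (the 16-display cap is arbitrary):

    Code:
    /* list_displays.c -- print what the OS currently considers active displays.
       Build: cc list_displays.c -o list_displays -framework ApplicationServices */
    #include <ApplicationServices/ApplicationServices.h>
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        CGDirectDisplayID displays[16];                /* 16 is an arbitrary cap */
        uint32_t count = 0;

        if (CGGetActiveDisplayList(16, displays, &count) != kCGErrorSuccess) {
            fprintf(stderr, "could not query the display list\n");
            return 1;
        }

        for (uint32_t i = 0; i < count; i++) {
            CGRect bounds = CGDisplayBounds(displays[i]);
            printf("display %u: id=%u  %dx%d  main=%s\n",
                   i, (unsigned)displays[i],
                   (int)bounds.size.width, (int)bounds.size.height,
                   CGDisplayIsMain(displays[i]) ? "yes" : "no");
        }
        return 0;
    }

    Running it before and after a workaround shows whether the active display count actually drops.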
     
  5. disconap macrumors 68000

    disconap

    Joined:
    Oct 29, 2005
    Location:
    Portland, OR
    #5
    Some more specifics might help (like OS version, hardware set-up, application, and details of the conflict)...
     
  6. jroad thread starter macrumors member

    Joined:
    Mar 24, 2006
    #6
    Well, I initially left these details out to keep this thread focused on how to enable or disable a display through the OS or an application. However, it seems that such a feature is not immediately obvious to me or to the others I have asked.

    My hardware is a Mac Pro running 3 displays with an ATI X1900 XT and an Nvidia 7300 GT card; the app is Motion 2. The ATI is in PCIe slot 1 and the Nvidia in PCIe slot 4, with two displays connected to the ATI and one to the Nvidia. I have previously posted about some of the issues I have encountered with this setup in OS X and XP, both here and on the Apple discussions forum.

    Motion 2 will not work properly on the displays attached to the ATI card, only on the one connected to the Nvidia card. I can only get Motion 2 to work on the ATI-connected displays if I unplug the cable to the Nvidia-connected display. I have reported this problem to AppleCare and they said it was a known issue, with no indication of whether it will ever be fixed.

    I find it disappointing that such a conflict exists since Apple controls both the hardware and software. The requirements for Motion 2 do not mention this issue.

    So, as an alternative to unplugging my display cable constantly or being forced to use Motion on the lesser card in my system, I seek a way to hide or disable the display to eliminate the conflict.
     
  7. simie macrumors 6502a

    simie

    Joined:
    Aug 26, 2004
    Location:
    Sitting
    #7
    As Motion is a pro application and your problem is a known issue, I would have thought that Apple would address it.

    My other point is that since both the Nvidia and ATI PCIe cards are supplied by Apple and not by a third party, they should address this problem.
     
  8. disconap macrumors 68000

    disconap

    Joined:
    Oct 29, 2005
    Location:
    Portland, OR
    #8
    Well, if it is a known issue, then either they are working on fixing it or it won't be an issue in the next release (but they should still patch the software).

    I might suggest picking up a cheap monitor switch box; switching the monitor to a different channel severs the link, and the computer won't detect the monitor. However, with some ATI cards, the presence of a cord in the output will generally identify that channel as in use (this is a cheap workaround when flashing ATI and Nvidia cards). I'd suggest posting at strangedogs and seeing if anyone there has a solution or can suggest anything...
     
