
jroad (macrumors member, Original poster), Mar 24, 2006
Hi,

I have a multiple display setup and am experiencing a software conflict because of this. If I unplug the display, the conflict goes away.

Is there a way to temporarily disable the display through software? In Windows, this can be done easily through Display settings, but I am unsure how to do this in OS X.

Thanks for any suggestions.
 
Off the top of my head (I've used multiple displays before, but not at the moment), it can't just be disabled if the second monitor is plugged in. A 3rd party app may do something similar to what you want.
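
If it helps, the hooks a tool like that would probably build on are in Quartz Display Services (the CGDirectDisplay calls in ApplicationServices). Just as a rough sketch, and assuming an arbitrary cap of 8 displays, something like this lists what the window server currently thinks is attached; it doesn't disable anything on its own:

/* sketch: list active displays via Quartz Display Services
 * build: gcc list_displays.c -framework ApplicationServices -o list_displays */
#include <ApplicationServices/ApplicationServices.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    CGDirectDisplayID displays[8];   /* 8 is an arbitrary upper bound */
    uint32_t count = 0;

    /* ask the window server which displays are currently active */
    if (CGGetActiveDisplayList(8, displays, &count) != kCGErrorSuccess) {
        fprintf(stderr, "couldn't get the display list\n");
        return 1;
    }

    for (uint32_t i = 0; i < count; i++) {
        printf("display 0x%x  %lux%lu %s\n",
               (unsigned int)displays[i],
               (unsigned long)CGDisplayPixelsWide(displays[i]),
               (unsigned long)CGDisplayPixelsHigh(displays[i]),
               CGDisplayIsMain(displays[i]) ? "(main)" : "");
    }
    return 0;
}

From what I can tell there is no public call in there that outright disables a connected display, which is probably why nothing obvious shows up in the Displays preference pane either.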
 
Before going through a bunch of new installs and the like, try setting the displays to "mirror" and then just turning off the second monitor; that might work...
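
If flipping mirroring on and off in System Preferences gets old, it can apparently also be toggled programmatically through the same Quartz Display Services calls. A hedged sketch (which display gets mirrored onto the main one is my own assumption; adapt it to your setup):

/* sketch: mirror the first non-main display onto the main display
 * build: gcc mirror.c -framework ApplicationServices -o mirror */
#include <ApplicationServices/ApplicationServices.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    CGDirectDisplayID displays[8];
    uint32_t count = 0;

    if (CGGetActiveDisplayList(8, displays, &count) != kCGErrorSuccess || count < 2) {
        fprintf(stderr, "need at least two active displays\n");
        return 1;
    }

    CGDirectDisplayID mainDisplay = CGMainDisplayID();

    for (uint32_t i = 0; i < count; i++) {
        if (displays[i] == mainDisplay)
            continue;

        CGDisplayConfigRef config;
        CGBeginDisplayConfiguration(&config);
        /* passing kCGNullDirectDisplay instead of mainDisplay would un-mirror it again */
        CGConfigureDisplayMirrorOfDisplay(config, displays[i], mainDisplay);
        /* kCGConfigureForSession: the change reverts at logout/restart */
        CGCompleteDisplayConfiguration(config, kCGConfigureForSession);
        printf("mirrored display 0x%x onto the main display\n",
               (unsigned int)displays[i]);
        break;  /* only the first secondary display, just as an example */
    }
    return 0;
}

No idea whether Motion cares about mirroring, but it saves a trip to the Displays pane while you test it.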
 
Off the top of my head (I've used multiple displays before, but not at the moment), it can't just be disabled if the second monitor is plugged in. A 3rd party app may do something similar to what you want.

I would be happy with an app or even a way to do it through the terminal, but so far I haven't found anything.

Before going through a bunch of new installs and the like, try setting the displays to "mirror" and then just turning off the second monitor; that might work...

Interesting idea. I tried it but the conflict still arose.

I'm really surprised this is not a readily available feature of the OS. Are Mac notebooks not able to shut the screen off? Perhaps there is a button to do so...

In any case, any other ideas or suggestions appreciated.
 
Some more specifics might help (like OS version, hardware set-up, application, and details of the conflict)...

Well, I initially left these details out to focus this thread on the action of enabling/disabling a display through the OS or an application. However, it seems that such a feature is not immediately obvious to me or to others I have asked.

My hardware is a Mac Pro running three displays with an ATI X1900 XT and an Nvidia 7300 GT card; the app is Motion 2. The ATI is in PCIe slot 1 and the Nvidia in PCIe slot 4, with two displays connected to the ATI and one to the Nvidia. I have previously posted some of the issues I have encountered with this setup in OS X and XP, both here and at the Apple discussions forum.

Motion 2 will not work properly on the displays attached to the ATI card, only on the one connected to the Nvidia card. I can only get Motion 2 to work on the ATI-connected displays if I unplug the cable to the Nvidia-connected display. I have reported this problem to AppleCare, and they said it was a known issue, with no indication of whether it will ever be fixed.

I find it disappointing that such a conflict exists since Apple controls both the hardware and software. The requirements for Motion 2 do not mention this issue.

So, as an alternative to unplugging my display cable constantly or being forced to use Motion on the lesser card in my system, I seek a way to hide or disable the display to eliminate the conflict.
 
As Motion is a Pro application and your problem is a known issue, I would have thought they would address it.

My other point is that since both the Nvidia and ATI PCIe cards are supplied by Apple rather than a third party, they should address this problem.
 
Well, if it is a known issue, then either they are working on fixing it or it won't be an issue in the next release (but they should still patch the software).

I might suggest picking up a cheap monitor switch box, as switching the monitor to a different channel will sever the link and the computer won't identify the monitor. However, with some ATI cards, the presence of a cord in the output will generally identify that channel as used (this is a cheap workaround when flashing ATI and Nvidia cards). I suggest posting at strangedogs to see if anyone there has a solution or can suggest anything...
 