
strausd
Original poster
I have a 5870, and when I previously tried to hook up 3 displays, only two would work, which was talked about here. It seems like I would need to buy a $100 adapter to make this work, and apparently those adapters suck.

So right now I only have 2 displays running, a 24" Apple (MDP) and a 23" Dell (DVI). I am considering getting a Cintiq 12WX, but I don't know if I will run into the same problem as before. Basically, I want the two monitors I am running now to keep working perfectly, and have the Cintiq plugged into the open MDP using a DVI to MDP adapter. Would this work, or would the Dell U2311H go blank like it did before?
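
If it helps to see what the card is actually detecting before buying anything, one quick check (assuming macOS, where system_profiler is the stock reporting tool) is to dump the displays report, e.g. from Python:

import subprocess

# Assumes macOS; system_profiler ships with the OS and reports what the
# graphics card currently detects, including each connected display.
report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
)
print(report.stdout)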
 
Sorry to ask these questions, but has running 3 displays on the new Mac Pro worked for anyone? Basically, can only two outputs be used simultaneously? If that's the case, then I doubt that your Cintiq will work. One option is to buy e.g. a GT 120, which doesn't cost much. Then you can have four monitors connected and not worry about the issue with the new GPUs. Apple will likely fix it sooner rather than later, but I doubt you want to wait for a fix, as Apple ain't the fastest in these things :p

I'm of course assuming that you have an empty PCIe slot. Other options are a USB video adapter, a Matrox DualHead, or something similar.
 
That's a good option; then I can finally get my TV to work too.


How?
 
Your signature says that you have (at least one) U2311H. Get an MDP to VGA adapter and power your third screen with that. Mini DisplayPort to VGA adapters are effectively active (although this seems to be very badly documented). The PC 5870s and 5970s can do this in Windows with such an adapter, so unless Apple has purposefully disabled this ability to force people to buy their expensive dual link adapters (which don't even work very well), it should work.

In Windows I run my third monitor on my 5970 in this manner and it works perfectly. I used to use an Apple Dual Link MDP adapter but found it to be poor - often not being detected properly or giving artefacts etc. The (also Apple) MDP to VGA adapter works perfectly though.
 
Would there be any display difference on the Cintiq between using single link DVI or VGA? Here's a link to the display specs on it in case they matter.
 
VGA seems to show noticeable quality degradation compared to DVI. I've had monitors with both VGA and DVI inputs. In some cases I used them over VGA for a while, thought they were really crappy displays, and then plugged them in via DVI and suddenly everything was sharp and clear.

If you are doing serious work I would avoid VGA.
 
There shouldn't be. As long as the display passes its native resolution back to the computer (as pretty much all modern displays other than some TVs do) then you should be fine. It's a comparatively low resolution you'd be running at anyway so you shouldn't see any discernible drops in signal quality over pretty much any length of cable under about 10m or so.
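
For a rough sanity check, here's a quick Python sketch, assuming the 12WX runs 1280x800 at 60 Hz natively and that single-link DVI tops out at a 165 MHz pixel clock (the ~25% blanking overhead is just a generic estimate):

# Rough estimate: active pixels plus ~25% blanking overhead, times refresh rate.
def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    return width * height * (1 + blanking_overhead) * refresh_hz / 1e6

SINGLE_LINK_DVI_LIMIT_MHZ = 165  # single-link DVI pixel clock ceiling

needed = approx_pixel_clock_mhz(1280, 800, 60)  # assumed Cintiq 12WX native mode
print(f"~{needed:.0f} MHz needed vs {SINGLE_LINK_DVI_LIMIT_MHZ} MHz available")
# prints roughly "~77 MHz needed vs 165 MHz available"

So the assumed native mode sits well inside what a single link can carry.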

VGA does lose signal quality over length but then so can DVI - try running a dual link dvi display with an extension cable and you'll often end up with weird pixel artefacting. If you use a good quality short VGA lead then there's usually no discernible difference in quality between VGA and DVI. The lower the resolution of the display, the longer the cables can be. You can even pipe through things like 720p via VGA->ethernet passive adapters for good stretches.

There is no discernible difference in quality between VGA and DVI on my 2007fp IPS panelled 1600x1200 display and yes, I can compare the two signals at the same time since I have two of these displays on my desk, one connected via VGA, one via DVI.
 
Thanks for the insight. The cable I use probably won't be any longer than 6 feet. I still might end up getting a GT 120 because I want to use my TV too. Will these work in the 2010 MPs?
 
The max cable length for DVI with no quality loss is 16 feet. On the other hand, all the displays I have owned have always looked horrible over VGA. One display even had color bleeding issues, and that was with the nicest, most shielded VGA cable I have. It just isn't pretty...
 
Obviously your mileage may vary. The max cable length for DVI with no quality loss may be 16 feet, but in reality it can be far more or far less than that. Add a 1m (high quality) extension cable to a 30" ACD connected to a 4870 and you'll get signal degradation. I've only once had a poor signal with VGA, and that was with a very long cable while trying to run a high resolution.
 
I have a 25' HDMI cable that I use for my TV. I don't see a big quality loss, just low frame rates.
 
No... By the laws of physics it's 16 feet (likely more, but 16 feet is a nice round advertisable number).... Digital can't be taken down by interference. Interference alters the level of the wavelength running through the wire. This messes with analog connections like VGA. However, in digital, the very presence of any signal, regardless of the level, is interpreted as a 1, and no signal at all is a 0. If there is a signal going through, it doesn't matter how much interference or noise it goes through, it still comes out as a 1 on the other end. The only way you can kill a digital connection is to run it so far that the current cannot carry, which is where 16 feet comes from.

Extension cables will never degrade digital/DVI. Again, even if the extension cables play with the wavelength levels, it doesn't matter. The presence of a current is 1, no current is 0.

Also, there is no such thing as a "high quality" digital cable. It either carries power or it doesn't.
 
Avoid VGA if at all possible. It always looks worse on LCDs. Not to mention, if you have large electrical items in the room, such as air conditioners, your signal will really degrade when they are turned on.

This happened all the time where I worked, and people were always putting in tickets for a fuzzy picture. We would show them that by turning off the AC or fridge in their office the signal went back to being acceptable for general use, and that they would have to move their desk if they wanted the AC on. I worked in Iraq, so most people would rather leave the AC on and move the desk :p
 
Digital *can* be taken down by interference. When interference is at a certain level, a distorted 1 and a distorted 0 start to become impossible to tell between. The higher the data frequency (i.e. for high resolution displays), the less interference is needed to break it down. By the laws of physics, digital is not infallible!
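
To make that concrete, here's a tiny Python sketch (a toy model, not actual TMDS signalling): random bits are sent as two voltage levels, Gaussian noise is added, and the receiver decides each bit with a simple threshold.

import random

def bit_error_rate(noise_sigma, n_bits=100_000, high=1.0, low=0.0, threshold=0.5):
    # Send random bits as two levels, add Gaussian noise, decide by threshold.
    errors = 0
    for _ in range(n_bits):
        bit = random.getrandbits(1)
        received = (high if bit else low) + random.gauss(0.0, noise_sigma)
        errors += ((1 if received > threshold else 0) != bit)
    return errors / n_bits

for sigma in (0.05, 0.15, 0.25, 0.40):
    print(f"noise sigma {sigma:.2f} -> bit error rate {bit_error_rate(sigma):.4%}")

# With small noise there are effectively zero errors; as the noise spread
# approaches the 0.5 decision margin, ones and zeros start flipping.
# So degradation isn't only a matter of the cable being too long to carry current.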
 
No, this is basic electrical engineering.

1 is on, 0 is off. You can distort it all you want. 1 will still be on, 0 will still be off.

The only way a 1 can become a 0 is if power is lost. It's a binary signal. On, off. That's it. Distortion does not matter. You can't distort a 1 to make it look like a 0. Even if the signal gets changed, the very presence of a signal still means 1. No signal still means 0, and you can't distort a signal that doesn't exist to turn it back into a 1.

If digital worked the way you claim it does, your computer could be taken down by wireless signals extremely easily.
 
A very high frequency digital signal such as dual link DVI *is* susceptible to interference and signal degradation over lengths shorter than what a single link DVI cable can function with.

You can theorize about this all you want. Either DVI signals are susceptible to interference and degradation other than current loss, or 30" displays with DVI extension cables bend space-time. Your call.
 
How? You've yet to offer any proof. Interference can only change the value of the signal, and the value of the signal doesn't matter under digital, only analog.

Signal degradation based on length is entirely possible, but in a very different way than with VGA. VGA degrades in a nice linear fashion, while digital stays perfect up to the length at which it runs out of power. It's entirely possible that dual link is trying to send twice the data with the same amount of power, which would reduce the transmission range. But that's entirely different from VGA degradation.

No, they aren't. Digital video signals are the same sort of transmission as ethernet, the buses on your computer, and light bulbs, none of which are affected by interference.

If you were right, we would have never actually switched to digital, as analog is capable of transmitting far more information at once than digital.

This is also why it's absolutely silly for anyone to buy premium digital cables.
 
Honumaui,
do you have 3 DVI monitors?
I want to connect 3 DVI monitors (24" Dell, 1920x1200) to my new hexacore, which has the 5770.
Are these adapters gonna work so I can avoid the $200 for the Apple ones?

Thanks
 