Can my GPU handle this without problems?

Discussion in 'Mac Pro' started by strausd, Sep 19, 2010.

  1. strausd macrumors 68030

    Joined:
    Jul 11, 2008
    Location:
    Texas
    #1
    I have a 5870, and when I tried to hook up 3 displays before, only two would work, which was talked about here. It seems like I would need to buy a $100 adapter for this to work, and apparently those adapters suck.

    So right now I only have 2 displays running, a 24" Apple (MDP) and a 23" Dell (DVI). I am considering getting a Cintiq 12WX, but don't know if I will encounter the same problem as I did before. Basically, I want the two monitors I am running now to continue to work perfectly, and to have the Cintiq plugged into the open MDP using a DVI to MDP adapter. Would this work, or would the Dell U2311H go blank like it did before?
     
  2. strausd thread starter macrumors 68030

    Joined:
    Jul 11, 2008
    Location:
    Texas
  3. Hellhammer Moderator

    Hellhammer

    Staff Member

    Joined:
    Dec 10, 2008
    Location:
    Finland
    #3
    Sorry to ask these questions, but has 3 displays on the new Mac Pro worked for anyone? Basically, can only two outputs be used simultaneously? If that's the case, then I doubt that your Cintiq will work. One option is to buy e.g. a GT 120, which doesn't cost much. Then you can have four monitors connected and not worry about the issue with the new GPUs. Apple will likely fix it sooner rather than later, but I doubt you want to wait for a fix as Apple ain't the fastest in these things :p

    I'm of course assuming that you have an empty PCIe slot. Other options are a USB video adapter, a Matrox DualHead or something similar.
     
  4. DualShock macrumors 6502

    Joined:
    Jun 29, 2008
    #4
  5. Vylen macrumors 65816

    Joined:
    Jun 3, 2010
    Location:
    Sydney, Australia
    #5
    Yes :p
     
  6. strausd thread starter macrumors 68030

    Joined:
    Jul 11, 2008
    Location:
    Texas
    #6
    That's a good option; then I can finally get my TV to work too.

    How?
     
  7. Spanky Deluxe macrumors 601

    Spanky Deluxe

    Joined:
    Mar 17, 2005
    Location:
    London, UK
    #7
    Your signature says that you have (at least one) U2311H. Get an MDP to VGA adapter and power your third screen with that. Mini DisplayPort to VGA adapters are effectively active (although this seems to be very badly documented). The PC 5870s and 5970s can do this in Windows with such an adapter, so unless Apple has purposefully disabled this ability to force people to buy their expensive dual link adapters (which don't even work very well), it should work.

    In Windows I run my third monitor on my 5970 in this manner and it works perfectly. I used to use an Apple Dual Link MDP adapter but found it to be poor - often not being detected properly or giving artefacts etc. The (also Apple) MDP to VGA adapter works perfectly though.
     
  8. strausd thread starter macrumors 68030

    Joined:
    Jul 11, 2008
    Location:
    Texas
    #8
    Would there be any display difference on the Cintiq between using single-link DVI or VGA? Here's a link to the display specs on it in case they matter.
     
  9. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #9
    VGA seems to have strong quality degradation vs. DVI. I've had monitors with both VGA and DVI. In some cases I used them on VGA for a while, thought they were really crappy displays, and then plugged them in DVI and suddenly everything was sharp and clear.

    If you are doing serious work I would avoid VGA.
     
  10. Spanky Deluxe macrumors 601

    Spanky Deluxe

    Joined:
    Mar 17, 2005
    Location:
    London, UK
    #10
    There shouldn't be. As long as the display passes its native resolution back to the computer (as pretty much all modern displays other than some TVs do) then you should be fine. It's a comparatively low resolution you'd be running at anyway so you shouldn't see any discernible drops in signal quality over pretty much any length of cable under about 10m or so.

    VGA does lose signal quality over length but then so can DVI - try running a dual link dvi display with an extension cable and you'll often end up with weird pixel artefacting. If you use a good quality short VGA lead then there's usually no discernible difference in quality between VGA and DVI. The lower the resolution of the display, the longer the cables can be. You can even pipe through things like 720p via VGA->ethernet passive adapters for good stretches.

    There is no discernible difference in quality between VGA and DVI on my 2007fp IPS panelled 1600x1200 display and yes, I can compare the two signals at the same time since I have two of these displays on my desk, one connected via VGA, one via DVI.
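    The resolution argument above can be sanity-checked with a rough bandwidth estimate. A minimal sketch in Python, assuming a ~20% blanking overhead (real CVT/GTF timings vary) and the single-link DVI TMDS clock ceiling of 165 MHz; the display list just mirrors the panels mentioned in this thread:

    ```python
    # Rough pixel-clock estimate: active pixels x refresh rate x blanking overhead.
    BLANKING = 1.2             # assumed ~20% overhead; actual timing standards differ
    SINGLE_LINK_DVI_MHZ = 165  # single-link DVI TMDS clock limit

    def pixel_clock_mhz(width, height, hz=60):
        """Approximate pixel clock in MHz for a given mode."""
        return width * height * hz * BLANKING / 1e6

    displays = [
        ("Cintiq 12WX (1280x800)",   1280, 800),
        ("Dell 2007FP (1600x1200)",  1600, 1200),
        ("30-inch display (2560x1600)", 2560, 1600),
    ]

    for name, w, h in displays:
        clk = pixel_clock_mhz(w, h)
        verdict = "fits single link" if clk <= SINGLE_LINK_DVI_MHZ else "needs dual link"
        print(f"{name}: ~{clk:.0f} MHz -> {verdict}")
    ```

    The point being: the Cintiq and the 1600x1200 panel sit well under the single-link ceiling, which is why shorter cables at those resolutions are so forgiving, while a 2560x1600 panel is the case where dual link (and its fussier cabling) comes into play.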
     
  11. strausd thread starter macrumors 68030

    Joined:
    Jul 11, 2008
    Location:
    Texas
    #11
    Thanks for the insight. The cable I use probably won't be any longer than 6 feet. I still might end up getting a GT 120 because I want to use my TV too. Will these work in the 2010 MPs?
     
  12. Spanky Deluxe macrumors 601

    Spanky Deluxe

    Joined:
    Mar 17, 2005
    Location:
    London, UK
    #12
    Yup, GT 120s work fine in combo with 5870s. :)
     
  13. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #13
    The max cable length for DVI with no quality loss is 16 feet. On the other hand, all the displays that I have just always looked horrible over VGA. One display even had color bleeding issues, and that was with the nicest, most shielded VGA cable I have. It just isn't pretty...
     
  14. Spanky Deluxe macrumors 601

    Spanky Deluxe

    Joined:
    Mar 17, 2005
    Location:
    London, UK
    #14
    Obviously your mileage may vary. The max cable length for DVI with no quality loss may be 16 feet, but in reality it can be far more or far less than that. Add a 1m (high quality) extension cable to a 30" ACD connected to a 4870 and you'll get signal degradation. I've only once had a poor signal with VGA, and that was with a very long cable while trying to run a high resolution.
     
  15. strausd thread starter macrumors 68030

    Joined:
    Jul 11, 2008
    Location:
    Texas
    #15
    I have a 25' HDMI cable that I use for my TV. I don't see a big quality loss, just low frame rates.
     
  16. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #16
    No... By the laws of physics it's 16 feet (likely more, but 16 feet is a nice round advertisable number).... Digital can't be taken down by interference. Interference alters the level of the wavelength running through the wire. This messes with analog connections like VGA. However, in digital, the very presence of any signal, regardless of the level, is interpreted as a 1, and no signal at all is a 0. If there is a signal going through, it doesn't matter how much interference or noise it goes through, it still comes out as a 1 on the other end. The only way you can kill a digital connection is to run it so far that the current cannot carry, which is where 16 feet comes from.

    Extension cables will never degrade digital/DVI. Again, even if the extension cables play with the wavelength levels, it doesn't matter. The presence of a current is 1, no current is 0.

    Also, there is no such thing as a "high quality" digital cable. It either carries power or it doesn't.
     
  17. chrono1081 macrumors 604

    chrono1081

    Joined:
    Jan 26, 2008
    Location:
    Isla Nublar
    #17
    Avoid VGA if at all possible. It always looks worse on LCDs. Not to mention, if you have large electrical items in the room, such as air conditioners, your signal will really degrade when they are turned on.

    This happened all the time where I worked, and people were always putting in tickets for a fuzzy picture. We would show them that by turning off the AC or fridge in their office the signal went back to being acceptable for general use, and that they would have to move their desk if they wanted the AC on. I worked in Iraq, so most people would rather leave the AC on and move the desk :p
     
  18. alphaod macrumors Core

    alphaod

    Joined:
    Feb 9, 2008
    Location:
    NYC
  19. Spanky Deluxe macrumors 601

    Spanky Deluxe

    Joined:
    Mar 17, 2005
    Location:
    London, UK
    #19
    Digital *can* be taken down by interference. When interference reaches a certain level, a distorted 1 and a distorted 0 start to become impossible to tell apart. The higher the data frequency (i.e. for high resolution displays), the less interference is needed to break it down. By the laws of physics, digital is not infallible!
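    The "distorted 1 vs distorted 0" point can be illustrated with a toy simulation; a minimal sketch, assuming an idealized binary link where the receiver re-slices a noisy voltage at a fixed threshold (the noise levels here are arbitrary, not real TMDS figures):

    ```python
    import random

    def transmit(bits, noise_sigma, threshold=0.5):
        """Send each bit as a 0.0/1.0 level, add Gaussian noise, re-slice at the threshold."""
        received = []
        for b in bits:
            level = float(b) + random.gauss(0.0, noise_sigma)
            received.append(1 if level > threshold else 0)
        return received

    def bit_error_rate(n_bits, noise_sigma):
        """Fraction of bits flipped by noise on a simulated run."""
        random.seed(42)  # reproducible
        bits = [random.getrandbits(1) for _ in range(n_bits)]
        rx = transmit(bits, noise_sigma)
        errors = sum(a != b for a, b in zip(bits, rx))
        return errors / n_bits

    for sigma in (0.05, 0.2, 0.4):
        print(f"noise sigma={sigma}: BER ~ {bit_error_rate(100_000, sigma):.4f}")
    ```

    With mild noise the slicer recovers every bit (which is why digital looks "perfect or dead"), but as noise approaches half the signal swing, bit errors appear long before the link loses power entirely.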
     
  20. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #20
    No, this is basic electrical engineering.

    1 is on, 0 is off. You can distort it all you want. 1 will still be on, 0 will still be off.

    The only way a 1 can become a 0 is if power is lost. It's a binary signal. On, off. That's it. Distortion does not matter. You can't distort a 1 to make it look like a 0. Even if the signal gets changed, the very presence of a signal still means 1. No signal still means 0, and you can't distort a signal that doesn't exist to turn it back into a 1.

    If digital worked the way you claim it does, your computer could be taken down by wireless signals extremely easily.
     
  21. Spanky Deluxe macrumors 601

    Spanky Deluxe

    Joined:
    Mar 17, 2005
    Location:
    London, UK
    #21
    A very high frequency digital signal such as dual link DVI *is* susceptible to interference and signal degradation over lengths shorter than what a single link DVI cable can function with.

    You can theorize about this all you want. Either DVI signals are susceptible to interference and degradation other than current loss, or 30" displays with DVI extension cables bend space-time. Your call.
     
  22. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #22
    How? You've yet to offer any proof. Interference can only change the value of the signal, and the value of the signal doesn't matter under digital, only analog.

    Signal degradation based on length is entirely possible, but in a very different way than VGA. VGA has a nice linear degradation, while digital is going to remain perfect until the length at which it runs out of power. It's entirely possible that dual link is trying to send twice the data with the same amount of power, which would reduce the transmission range. But that's entirely different than VGA degradation.

    No, they aren't. Digital video signals are the same sort of transmission as ethernet, the buses on your computer, and light bulbs, none of which are affected by interference.

    If you were right, we would have never actually switched to digital, as analog is capable of transmitting far more information at once than digital.

    This is also why it's absolutely silly for anyone to buy premium digital cables.
     
  23. Honumaui macrumors 6502a

    Joined:
    Apr 18, 2008
    #23
  24. yiagoulas macrumors newbie

    yiagoulas

    Joined:
    Sep 21, 2007
    Location:
    LOS ANGELES
    #24
    Honumaui,
    do you have 3 DVI monitors?
    I want to connect 3 DVI 24" Dell monitors (1920x1200) to my new hexacore, which has the 5770.
    Are these adapters gonna work so I can avoid the $200 for the Apple ones?

    Thanks
     