NVIDIA GeForce GT 120 & GTX 690

Discussion in 'Mac Pro' started by 5050, Apr 23, 2013.

  1. 5050 macrumors regular

    May 28, 2009
    I have a MacPro4,1 (firmware updated to MacPro5,1) 2 x 2.26 GHz Quad Core (8-core).

    I'm looking to upgrade my graphics power and one setup I've been considering is running a GeForce GT 120 as my GUI in Slot-2 (current video card) and adding a GTX 690 into Slot-1.

    Rob at Bare Feats ran a few After Effects benchmarks with the GTX 690 that smoked both the GTX 580c and GTX 680 Mac. See times below:

    After Effects CS6 - Ray-Traced 3D (minutes)

    • GTX 690: 10.7
    • GTX 580c: 12.4
    • GTX 680c Mac: 14.3
    • GTX 680 Mac: 15.3
    • GTX 570: 15.9

    Rob also mentioned that with a 6-pin to 8-pin adaptor he was able to run the GTX 690 (EVGA) and "banged on it hard and never had any issues . . . [and] was also very quiet." The only drawback he mentioned was that the GTX 690 only ran at PCIe 1.0 instead of PCIe 2.0. Nonetheless, it still managed to outperform the other cards in his After Effects benchmarking.
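To put those render times in perspective, here's a quick sanity check of the relative speedups implied by the numbers above (the times are Rob's; the ratios are just arithmetic):

```python
# Relative speedups for the After Effects CS6 ray-traced render
# times quoted above (minutes, lower is better).
times = {
    "GTX 690": 10.7,
    "GTX 580c": 12.4,
    "GTX 680c Mac": 14.3,
    "GTX 680 Mac": 15.3,
    "GTX 570": 15.9,
}
baseline = times["GTX 680 Mac"]
for card, t in sorted(times.items(), key=lambda kv: kv[1]):
    print(f"{card:12s} {t:5.1f} min  ({baseline / t:.2f}x vs GTX 680 Mac)")
```

So the GTX 690 finishes the render roughly 1.4x faster than the GTX 680 Mac in this one test.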

    With the GeForce GT 120 as my GUI I'll still have my boot screens while simultaneously leveraging the GPU power of the GTX 690 for my After Effects use.

    Thoughts on this setup? Viable?

  2. derbothaus macrumors 601


    Jul 17, 2010
    Not sure about Rob's results, but in OS X you won't see much improvement over a single GTX 670. A single GTX 680 should beat it. Again, not sure why it doesn't — but it's for the same reason the single 680c beats the 690 in all the other tests.
    The GTX 690 has two GK104 Kepler GPUs arranged in an internal SLI configuration.
    OS X does not support SLI at all. So... it will fly in Windows, but in OS X you'll effectively have a single GTX 670/680: slower clocks like the 670, but more stream processors like the GTX 680. I would not do it. Dual cards are always more trouble than they're worth. Unless you've never owned a dual card. Then go ahead and learn your lesson. :p
  3. Asgorath macrumors 68000

    Mar 30, 2012
    After Effects is probably using CUDA to talk to each GPU separately, as if you had two GTX 670 cards in the system.
  4. 5050 thread starter macrumors regular

    May 28, 2009
    All this makes a lot of sense. However, the GTX 690 managed to pull away from the rest of the pack with some really impressive render times in Rob's AE benchmarking. I'm not sure how to account for this.
  5. 5050 thread starter macrumors regular

    May 28, 2009
    Actually just read this:

    "After Effects is actually multi-GPU capable and scales well on a GTX 690. It's not quite twice as fast as a GTX 680 on AE (will complete an AE render in, say 56-59% of the time rather than 50%) but being a single card many users find it preferable to 2 x GTX 680's."
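The "56-59% of the time rather than 50%" figure implies the render is mostly, but not perfectly, parallel across the 690's two GPUs. A back-of-envelope Amdahl's-law estimate (a simplification; real AE behavior is more complicated) of the parallelizable fraction:

```python
# Simple Amdahl model: with n = 2 GPUs, completion time
# t(2) = (1 - p) + p / 2, where p is the parallelizable fraction.
# Solving for p from the quoted 56-59% completion times:
def parallel_fraction(t2):
    return 2 * (1 - t2)

for t2 in (0.56, 0.59):
    print(f"t = {t2:.0%} of single-GPU time -> ~{parallel_fraction(t2):.0%} of the work scales")
```

That works out to roughly 82-88% of the render scaling across both GPUs, which is consistent with "not quite twice as fast."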


  6. IceMacMac macrumors 6502

    Jun 6, 2010
    I'm not seeing any testing of the 690 on BareFeats. Can you post a link?
  7. 5050 thread starter macrumors regular

    May 28, 2009
    Actually, I've been in touch with Rob over email and he's been awesome and incredibly patient fielding all my video card related questions. He emailed me screenshots of his 690 tests in After Effects. Try emailing him from barefeats.com and I'm sure he'd be glad to pass it along to you.
  8. derbothaus, Apr 23, 2013
    Last edited: Apr 23, 2013

    derbothaus macrumors 601


    Jul 17, 2010
    Again, that is for Windows only. The SLI bridge is not understood by OS X. Please keep discussion of AE separated by OS, as the feature sets are different.
    It does not matter that AE is itself multi-GPU aware; OS X's driver will never pass it the information that there are two GPUs in the system.

    If anything, Asgorath may be onto the only possible explanation. But again, I'm not sure how it sees two GPUs at all. Historically this was impossible.

    Go ahead and ask Rob whether SLI works in OS X (it doesn't), and why the 690 is slower than the 680 in every other test performed, due to the lack of dual-GPU awareness.
    I am just the messenger. You can buy what you want, but the AE test anomaly should be understood before concluding that a 690 in a Mac is a good investment.

    The answer was given to you already in this thread:
    The same tech that allows two separate cards to talk is what's needed for the two GPUs on one PCB, a la the GTX 690. That is NVIDIA's SLI bridge, external or internal. No other way at the moment.
  9. 5050 thread starter macrumors regular

    May 28, 2009
    So you're suggesting that AE CS 6 via CUDA is somehow able to circumvent OS X's lack of SLI support and leverage the GTX 690's internal SLI bridge?

    Again, we must consider the results that Rob was able to achieve in the "real world," despite what we may know about OS X's lack of SLI support.
  10. derbothaus macrumors 601


    Jul 17, 2010
    Must we? It's a single test. It does not change the history of graphics drivers on OS X.
  11. 5050 thread starter macrumors regular

    May 28, 2009
    Multi-GPU Processing in OS X

    One additional angle to consider in your argument is how Blackmagic Design is able to leverage multi-GPU processing in its OS X version of DaVinci Resolve. According to Blackmagic Design, the multi-GPU processing is enabled through CUDA.

    "DaVinci Resolve uses CUDA technology for real time processing, and so typically an NVIDIA GPU with more CUDA cores and memory, will result in faster performance."

    Multi-GPU processing has been supported in the OS X version of DaVinci Resolve since its 7.1 release (i.e., Q4 2010).


    So it's entirely plausible that After Effects is accessing multi-GPU processing in the same way as DaVinci Resolve -- through CUDA.

    Who cares about arguing semantics (is it SLI? is it CUDA?) and what we "think we know"? Considering the weight of evidence quoted above, I think it's very valid to postulate that CUDA-accelerated applications in OS X are able to access highly coveted "SLI-like" performance through CUDA.
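For what it's worth, CUDA exposes each GPU on the 690 as an independent device, so an application doesn't need SLI at all: it can just run one worker per device and pull frames from a shared queue. Here's a minimal sketch of that scheduling pattern — the "devices" and the render step are stand-ins for illustration, not real CUDA calls:

```python
import queue
import threading

# Sketch of multi-GPU dispatch without SLI: each GPU is an independent
# device, and one worker thread per device pulls frames from a shared
# work queue. The device names and the fake "render" below are
# placeholders, not actual CUDA API usage.
def render_frames(frames, devices):
    work = queue.Queue()
    for f in frames:
        work.put(f)
    done = []
    lock = threading.Lock()

    def worker(device):
        while True:
            try:
                frame = work.get_nowait()
            except queue.Empty:
                return
            result = (device, frame)  # stand-in for a per-device kernel launch
            with lock:
                done.append(result)

    threads = [threading.Thread(target=worker, args=(d,)) for d in devices]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return done

results = render_frames(range(8), ["GPU0", "GPU1"])
print(f"rendered {len(results)} frames across {len(set(d for d, _ in results))} device(s)")
```

The point is that nothing in this pattern requires the OS to understand SLI; it only requires the driver to enumerate both GPUs as separate compute devices.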
  12. 5050 thread starter macrumors regular

    May 28, 2009
  13. crjackson2134 macrumors 68040


    Mar 6, 2013
    Charlotte, NC
    Seems like you've made up your mind. I think if it were me and I just had to have it, I'd try to buy the card from a place that allowed returns. Then you could run your own tests and see if the results are what you were hoping for. If not, then return or exchange the card.
  14. 666sheep macrumors 68040


    Dec 7, 2009
    It's all a matter of software. CUDA- and OpenCL-accelerated apps (if written to take advantage of multiple GPUs) will use both GPUs. Luxmark uses both (under OS X too). Read more here (GTX 690 in a MP 3,1 by etc): http://forum.netkas.org/index.php/topic,3850.0.html
  15. 5050 thread starter macrumors regular

    May 28, 2009
    No, I haven't made up my mind. I'm simply weighing the evidence and proposing the possibility that After Effects is leveraging multi-GPU processing in the same way that DaVinci Resolve does through CUDA.

    Any thoughts on how DaVinci Resolve is able to accomplish this in OS X?
  16. Boomhowler macrumors 6502

    Feb 23, 2008
    Isn't this due to Grand Central Dispatch seeing all processors (GPUs and CPUs) as just different computational devices and putting as many of them to use as it can?
  17. lewdvig, Apr 24, 2013
    Last edited: Apr 24, 2013

    lewdvig macrumors 65816


    Jan 1, 2002
    South Pole

    If all you care about is CUDA performance (After Effects and other renderers), get the 690. Any app that uses CUDA to leverage the GPU as a coprocessor will use as many CUDA cores as you can shove into your Mac.

    Games will only see one of the 670s (the 690 = two GTX 670 GPUs in SLI on one board). But since you did not mention games, I assume you don't care about them. And even if you did, one GTX 670 is still great.

    As usual, there are some people posting answers here who are ignorant (meant in its purest sense, not as an insult) of what the question is actually about.
  18. lewdvig macrumors 65816


    Jan 1, 2002
    South Pole
    NVIDIA's new cards are coming next month. Among them will be a new Titan LE variant at $600 with 2496 CUDA cores.

    2496 is fewer than the 690's total, but a single Titan LE might be better because you could potentially add another one some day (along with an external PSU). Almost 5,000 CUDA cores would be very fast.

    And because they are just doing math, not moving textures, the PCIe bus is irrelevant - probably - unless you start doing lots of 4k stuff.
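A rough bandwidth check backs this up, under some assumed numbers (PCIe 1.0 x16 at roughly 4 GB/s usable; a 4K RGBA frame at 8 bits per channel, 24 fps — real AE buffers, e.g. float, can be several times larger):

```python
# Does PCIe 1.0 bandwidth matter for streaming 4K frames?
pcie1_gbps = 4.0                      # GB/s, x16 link, approximate
frame_bytes = 4096 * 2160 * 4         # RGBA, 1 byte per channel
stream_gbps = frame_bytes * 24 / 1e9  # GB/s at 24 fps
print(f"4K stream: {stream_gbps:.2f} GB/s of {pcie1_gbps} GB/s available")
```

Even at 4K, a straight frame stream uses well under a quarter of a PCIe 1.0 x16 link, so compute-bound CUDA work has plenty of headroom.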

    Crazy talk.

    And yes, Rob ART is a credit to the Mac scene. Great guy.
  19. Topper macrumors 65816


    Jun 17, 2007
    There is something wrong with these Titan LE (GTX 780) rumors.
    If the GTX 780 is priced at $600, it will be a slap in the face to GTX Titan and GTX 680 owners.
    I've got to believe the GTX 780 will be priced at $700, maybe more.
    If it is $600, get out of my way because I am coming through. Even at $700, I'll be at the front of the line.
  20. derbothaus macrumors 601


    Jul 17, 2010
    ...and the answer to the discrepancy. Thank you.
  21. derbothaus macrumors 601


    Jul 17, 2010
    I and the OP were ignorant of the CUDA interaction. :(
    Dual cards historically are boat anchors in OS X.
    Apologies. But we all learn, "as usual".
    Where were you all several posts ago? I still thought it was 2009 or something.
  22. 5050 thread starter macrumors regular

    May 28, 2009
    Hey, it's all good. These threads are about becoming as informed as possible, and the more people contributing, the better off we all are. I actually started to dig deeper with my research because of your posts. Thanks for contributing to the thread!
  23. 5050 thread starter macrumors regular

    May 28, 2009
    Just to update this thread: I installed the GTX 690 in the MacPro4,1, powered externally by a Seasonic 660W PSU. I'm running OS X 10.8.3, the CUDA 5.0.45 driver for Mac, and Adobe After Effects CS6 11.0.2. The next step is to make the external PSU "power aware" so that it powers on/off with the main MacPro4,1 PSU.

    See images below:

  24. IceMacMac macrumors 6502

    Jun 6, 2010
    I was interested in the 690... and since the two apps I'm most interested in are AE and C4D (particularly with the VRAY renderer), I was curious what the VRAY spokesperson would have to say...

    Stefan (VRAYC4D) posted this:
    That said, he also posted this in the same thread:
    I don't want to buy into the teeth of a technology shift. MUST. BE. PATIENT. *he says, grinding his teeth.*
  25. 5050 thread starter macrumors regular

    May 28, 2009
    New Bare Feats shootout including the GTX 690

