
5050

macrumors regular
Original poster
May 28, 2009
180
2
I have a MacPro4,1 (firmware updated to MacPro5,1) 2 x 2.26 GHz Quad Core (8-core).

I'm looking to upgrade my graphics power and one setup I've been considering is running a GeForce GT 120 as my GUI in Slot-2 (current video card) and adding a GTX 690 into Slot-1.

Rob at Bare Feats ran a few After Effects benchmarks with the GTX 690 that smoked both the GTX 580c and GTX 680 Mac. See times below:

After Effects CS6 - Ray-Traced 3D (minutes)

  • GTX 690: 10.7
  • GTX 580c: 12.4
  • GTX 680c Mac: 14.3
  • GTX 680 Mac: 15.3
  • GTX 570: 15.9

Rob also mentioned that with a 6-pin to 8-pin adaptor he was able to run the GTX 690 (EVGA) and "banged on it hard and never had any issues . . . [and] was also very quiet." The only drawback he mentioned was that the GTX 690 only ran at PCIe 1.0 instead of PCIe 2.0. Nonetheless, it still managed to outperform the other cards in his After Effects benchmarking.
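(A rough reading of those times, my arithmetic rather than Rob's: 15.3 ÷ 10.7 ≈ 1.43, so the 690 finished that render roughly 43% faster than the GTX 680 Mac, and 12.4 ÷ 10.7 ≈ 1.16, about 16% faster than the GTX 580c.)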

With the GeForce GT 120 as my GUI I'll still have my boot screens while simultaneously leveraging the GPU power of the GTX 690 for my After Effects use.

Thoughts on this setup? Viable?

Thanks!!
 

derbothaus

macrumors 601
Jul 17, 2010
4,093
30
Not sure about the results from Rob, but in OS X you won't see much improvement over a single GTX 670. A single GTX 680 should beat it. Again, not sure why it doesn't, but it's for the same reason the single 680c beats the 690 in all the other tests.
The GTX 690 has two GK104 Kepler GPUs arranged in an internal SLI configuration.
OS X does not allow for SLI use at all. So... it will fly in Windows, and in OS X you'll have a single GTX 670/680: it has slower clocks like the 670 but more stream processors like the GTX 680. I would not do it. Dual cards are always more trouble than they are worth. Unless you've never owned a dual card. Then go ahead and learn your lesson. :p
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
Not sure about the results from Rob, but in OS X you won't see much improvement over a single GTX 670. A single GTX 680 should beat it. Again, not sure why it doesn't, but it's for the same reason the single 680c beats the 690 in all the other tests.
The GTX 690 has two GK104 Kepler GPUs arranged in an internal SLI configuration.
OS X does not allow for SLI use at all. So... it will fly in Windows, and in OS X you'll have a single GTX 670/680: it has slower clocks like the 670 but more stream processors like the GTX 680. I would not do it. Dual cards are always more trouble than they are worth. Unless you've never owned a dual card. Then go ahead and learn your lesson. :p

After Effects is probably using CUDA to talk to each GPU separately, as if you had two GTX 670 cards in the system.
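For anyone who wants to check that on their own machine, here's a minimal sketch against the CUDA runtime (just an illustration I put together, nothing from Adobe or from Rob's tests) that lists every GPU the driver exposes. On a GTX 690 I'd expect it to report two separate GK104 devices, SLI or no SLI:

    // list_devices.cu - sketch: ask the CUDA runtime how many GPUs it can see.
    // Build with: nvcc list_devices.cu -o list_devices
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            std::printf("No CUDA-capable device found.\n");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            std::printf("Device %d: %s, %d SMs, %.0f MB VRAM\n",
                        i, prop.name, prop.multiProcessorCount,
                        prop.totalGlobalMem / (1024.0 * 1024.0));
        }
        return 0;
    }

If two devices show up there, a CUDA application can address them individually no matter what the OS thinks about SLI.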
 

5050

macrumors regular
Original poster
May 28, 2009
180
2
Not sure about the results from Rob, but in OS X you won't see much improvement over a single GTX 670. A single GTX 680 should beat it. Again, not sure why it doesn't, but it's for the same reason the single 680c beats the 690 in all the other tests.
The GTX 690 has two GK104 Kepler GPUs arranged in an internal SLI configuration.
OS X does not allow for SLI use at all. So... it will fly in Windows, and in OS X you'll have a single GTX 670/680: it has slower clocks like the 670 but more stream processors like the GTX 680. I would not do it. Dual cards are always more trouble than they are worth. Unless you've never owned a dual card. Then go ahead and learn your lesson. :p

All this makes a lot of sense; however, the GTX 690 managed to pull away from the rest of the pack with some really impressive render times in Rob's AE benchmarking. Not sure how to account for this?
 

5050

macrumors regular
Original poster
May 28, 2009
180
2
Not sure about the results from Rob, but in OS X you won't see much improvement over a single GTX 670. A single GTX 680 should beat it. Again, not sure why it doesn't, but it's for the same reason the single 680c beats the 690 in all the other tests.
The GTX 690 has two GK104 Kepler GPUs arranged in an internal SLI configuration.
OS X does not allow for SLI use at all. So... it will fly in Windows, and in OS X you'll have a single GTX 670/680: it has slower clocks like the 670 but more stream processors like the GTX 680. I would not do it. Dual cards are always more trouble than they are worth. Unless you've never owned a dual card. Then go ahead and learn your lesson. :p

Actually just read this:

"After Effects is actually multi-GPU capable and scales well on a GTX 690. It's not quite twice as fast as a GTX 680 on AE (will complete an AE render in, say 56-59% of the time rather than 50%) but being a single card many users find it preferable to 2 x GTX 680's."

Here:

http://forums.adobe.com/message/4985643
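(For scale: finishing in 56-59% of the single-card time works out to roughly 1/0.59 ≈ 1.7x to 1/0.56 ≈ 1.8x the throughput, short of the ideal 2x, which matches the "not quite twice as fast" wording.)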
 

IceMacMac

macrumors 6502
Jun 6, 2010
394
18
All this makes a lot of sense; however, the GTX 690 managed to pull away from the rest of the pack with some really impressive render times in Rob's AE benchmarking. Not sure how to account for this?

I'm not seeing any testing of the 690 on BareFeats. Can you post a link?
 

5050

macrumors regular
Original poster
May 28, 2009
180
2
I'm not seeing any testing of the 690 on BareFeats. Can you post a link?

Actually, I've been in touch with Rob over email and he's been awesome and incredibly patient fielding all my video card related questions. He emailed me screenshots of his 690 tests in After Effects. Try emailing him from barefeats.com and I'm sure he'd be glad to pass it along to you.
 

derbothaus

macrumors 601
Jul 17, 2010
4,093
30
Actually just read this:

"After Effects is actually multi-GPU capable and scales well on a GTX 690. It's not quite twice as fast as a GTX 680 on AE (will complete an AE render in, say 56-59% of the time rather than 50%) but being a single card many users find it preferable to 2 x GTX 680's."

Here:

http://forums.adobe.com/message/4985643

Again, that is for Windows only. The SLI bridge is not understood by OS X. Please keep discussions about AE separated by OS, as the feature sets are different.
It does not matter that AE is itself multi-GPU aware; OS X's driver will never pass it the info that there are 2 GPUs in the system.

If anything, Asgorath may be onto the only possible explanation. But again, I'm not sure how it sees 2 GPUs at all. Historically this was impossible.

Go ahead and ask Rob if SLI works in OS X (it doesn't), and why the 690 is slower than the 680 in every other test performed, due to the lack of dual-GPU awareness.
I am just the messenger. You can buy what you want, but the AE test anomaly should be understood before deciding that a 690 in a Mac is a good investment.

The answer was given to you already in this thread:
https://forums.macrumors.com/threads/1573466/
The same tech that allows 2 separate cards to talk is the same tech needed for the 2 GPUs on one PCB, à la the GTX 690, to communicate. That is on Nvidia: external or internal, it's an SLI bridge, and there is no other way at the moment.
 

5050

macrumors regular
Original poster
May 28, 2009
180
2
Again, that is for Windows only. The SLI bridge is not understood by OS X. Please keep discussions about AE separated by OS, as the feature sets are different.
It does not matter that AE is itself multi-GPU aware; OS X's driver will never pass it the info that there are 2 GPUs in the system.

If anything, Asgorath may be onto the only possible explanation. But again, I'm not sure how it sees 2 GPUs at all. Historically this was impossible.

Go ahead and ask Rob if SLI works in OS X (it doesn't), and why the 690 is slower than the 680 in every other test performed, due to the lack of dual-GPU awareness.
I am just the messenger. You can buy what you want, but the AE test anomaly should be understood before deciding that a 690 in a Mac is a good investment.

The answer was given to you already in this thread:
https://forums.macrumors.com/threads/1573466/
The same tech that allows 2 separate cards to talk is the same tech needed for the 2 GPUs on one PCB, à la the GTX 690, to communicate. That is on Nvidia: external or internal, it's an SLI bridge, and there is no other way at the moment.

So you're suggesting that AE CS 6 via CUDA is somehow able to circumvent OS X's lack of SLI support and leverage the GTX 690's internal SLI bridge?

Again, we must consider the results that Rob was able to achieve in the "real world," despite what we may know about OS X's lack of SLI support.
 

derbothaus

macrumors 601
Jul 17, 2010
4,093
30
Again, we must consider the results that Rob was able to achieve in the "real world," despite what we may know about OS X's lack of SLI support.

Must we? It's a single test. It does not change the history of graphics drivers on OS X.
 

5050

macrumors regular
Original poster
May 28, 2009
180
2
Multi-GPU Processing in OS X

It does not matter that AE is itself multi-GPU aware; OS X's driver will never pass it the info that there are 2 GPUs in the system.

If anything, Asgorath may be onto the only possible explanation. But again, I'm not sure how it sees 2 GPUs at all. Historically this was impossible.

One additional angle to consider in your argument is how Blackmagic Design is able to leverage multi-GPU processing in its OS X version of DaVinci Resolve. According to Blackmagic Design, the multi-GPU processing is enabled through CUDA.

"DaVinci Resolve uses CUDA technology for real time processing, and so typically an NVIDIA GPU with more CUDA cores and memory, will result in faster performance."

Multi-GPU processing in the OS X version of DaVinci Resolve has been supported since its 7.1 release (i.e., Q4 2010).

http://www.creativeplanetnetwork.co...nounces-major-davinci-resolve-71-update/18262

So it's entirely plausible that After Effects is accessing multi-GPU processing in the same way as DaVinci Resolve -- through CUDA.

Rather than arguing semantics (is it SLI? is it CUDA?) about what we "think we know," I think the weight of evidence quoted above makes it reasonable to postulate that CUDA-accelerated applications in OS X can access the highly coveted "SLI-like" performance through CUDA.
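To make the CUDA-versus-SLI distinction concrete, here's a rough sketch of the pattern a CUDA application can use to spread work across every device the runtime reports - each GPU gets its own memory and kernel launches, and no SLI bridge is involved. This is only my illustration of the general technique; I have no idea whether After Effects or Resolve is actually written this way:

    // split_work.cu - illustration of manual multi-GPU work splitting with the CUDA runtime.
    #include <cuda_runtime.h>
    #include <vector>

    __global__ void process(float* data, size_t n) {
        size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
        if (i < n) data[i] *= 0.5f;                    // stand-in for real render work
    }

    int main() {
        const size_t N = 1 << 24;                      // divides evenly across 1 or 2 GPUs
        std::vector<float> host(N, 1.0f);

        int devices = 0;
        cudaGetDeviceCount(&devices);                  // a GTX 690 shows up as 2 devices
        if (devices == 0) return 1;

        const size_t chunk = N / devices;
        std::vector<float*> dptr(devices);
        const int threads = 256;
        const int blocks = (int)((chunk + threads - 1) / threads);

        for (int d = 0; d < devices; ++d) {
            cudaSetDevice(d);                          // subsequent calls target GPU d
            cudaMalloc(&dptr[d], chunk * sizeof(float));
            cudaMemcpy(dptr[d], host.data() + d * chunk,
                       chunk * sizeof(float), cudaMemcpyHostToDevice);
            process<<<blocks, threads>>>(dptr[d], chunk);   // launch is async, so both GPUs crunch at once
        }
        for (int d = 0; d < devices; ++d) {
            cudaSetDevice(d);                          // copy-back waits for that GPU's kernel
            cudaMemcpy(host.data() + d * chunk, dptr[d],
                       chunk * sizeof(float), cudaMemcpyDeviceToHost);
            cudaFree(dptr[d]);
        }
        return 0;
    }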
 

crjackson2134

macrumors 601
Mar 6, 2013
4,822
1,948
Charlotte, NC
One additional angle to consider in your argument is how Blackmagic Design is able to leverage multi-GPU processing in its OS X version of DaVinci Resolve. According to Blackmagic Design, the multi-GPU processing is enabled through CUDA.

"DaVinci Resolve uses CUDA technology for real time processing, and so typically an NVIDIA GPU with more CUDA cores and memory, will result in faster performance."

Multi-GPU processing in the OS X version of DaVinci Resolve has been supported since its 7.1 release (i.e., Q4 2010).

http://www.creativeplanetnetwork.co...nounces-major-davinci-resolve-71-update/18262

So it's entirely plausible that After Effects is accessing multi-GPU processing in the same way as DaVinci Resolve -- through CUDA.

Rather than arguing semantics (is it SLI? is it CUDA?) about what we "think we know," I think the weight of evidence quoted above makes it reasonable to postulate that CUDA-accelerated applications in OS X can access the highly coveted "SLI-like" performance through CUDA.

Seems like you've made up your mind. I think if it were me and I just had to have it, I'd try to buy the card from a place that allowed returns. Then you could run your own tests and see if the results are what you were hoping for. If not, then return or exchange the card.
 

5050

macrumors regular
Original poster
May 28, 2009
180
2
Seems like you've made up your mind. I think if it were me and I just had to have it, I'd try to buy the card from a place that allowed returns. Then you could run your own tests and see if the results are what you were hoping for. If not, then return or exchange the card.

No, I haven't made up my mind. I'm simply weighing the evidence and proposing the possibility that After Effects is leveraging multi-GPU processing in the same way that DaVinci Resolve does through CUDA.

Any thoughts on how DaVinci Resolve is able to accomplish this in OS X?
 

Boomhowler

macrumors 6502
Feb 23, 2008
324
19
No, I haven't made up my mind. I'm simply weighing the evidence and proposing the possibility that After Effects is leveraging multi-GPU processing in the same way that DaVinci Resolve does through CUDA.

Any thoughts on how DaVinci Resolve is able to accomplish this in OS X?

Isn't this due to Grand Central Dispatch seeing all processors (GPUs and CPUs) as just different computational devices and putting as many of them to use as it can?
 

lewdvig

macrumors 65816
Jan 1, 2002
1,416
75
South Pole
I have a MacPro4,1 (firmware updated to MacPro5,1) 2 x 2.26 GHz Quad Core (8-core).

I'm looking to upgrade my graphics power and one setup I've been considering is running a GeForce GT 120 as my GUI in Slot-2 (current video card) and adding a GTX 690 into Slot-1.

Rob at Bare Feats ran a few After Effects benchmarks with the GTX 690 that smoked both the GTX 580c and GTX 680 Mac. See times below:

After Effects CS6 - Ray-Traced 3D (minutes)

  • GTX 690: 10.7
  • GTX 580c: 12.4
  • GTX 680c Mac: 14.3
  • GTX 680 Mac: 15.3
  • GTX 570: 15.9

Rob also mentioned that with a 6-pin to 8-pin adaptor he was able to run the GTX 690 (EVGA) and "banged on it hard and never had any issues . . . [and] was also very quiet." The only drawback he mentioned was that the GTX 690 only ran at PCIe 1.0 instead of PCIe 2.0. Nonetheless, it still managed to outperform the other cards in his After Effects benchmarking.

With the GeForce GT 120 as my GUI I'll still have my boot screens while simultaneously leveraging the GPU power of the GTX 690 for my After Effects use.

Thoughts on this setup? Viable?

Thanks!!

Not sure about the results from Rob, but in OS X you won't see much improvement over a single GTX 670. A single GTX 680 should beat it. Again, not sure why it doesn't, but it's for the same reason the single 680c beats the 690 in all the other tests.
The GTX 690 has two GK104 Kepler GPUs arranged in an internal SLI configuration.
OS X does not allow for SLI use at all. So... it will fly in Windows, and in OS X you'll have a single GTX 670/680: it has slower clocks like the 670 but more stream processors like the GTX 680. I would not do it. Dual cards are always more trouble than they are worth. Unless you've never owned a dual card. Then go ahead and learn your lesson. :p

Again, that is for Windows only. The SLI bridge is not understood by OS X. Please keep discussions about AE separated by OS, as the feature sets are different.
It does not matter that AE is itself multi-GPU aware; OS X's driver will never pass it the info that there are 2 GPUs in the system.

If anything, Asgorath may be onto the only possible explanation. But again, I'm not sure how it sees 2 GPUs at all. Historically this was impossible.

Go ahead and ask Rob if SLI works in OS X (it doesn't), and why the 690 is slower than the 680 in every other test performed, due to the lack of dual-GPU awareness.
I am just the messenger. You can buy what you want, but the AE test anomaly should be understood before deciding that a 690 in a Mac is a good investment.

The answer was given to you already in this thread:
https://forums.macrumors.com/threads/1573466/
The same tech that allows 2 separate cards to talk is the same tech needed for the 2 GPUs on one PCB, à la the GTX 690, to communicate. That is on Nvidia: external or internal, it's an SLI bridge, and there is no other way at the moment.

Must we? It's a single test. It does not change the history of graphics drivers on OS X.


If all you care about is CUDA performance (After Effects and other renderers), get the 690. Any app that uses CUDA to leverage the GPU as a coprocessor will use as many CUDA cores as you can shove into your Mac.

Games will only see one of the two GPUs (the 690 = two GTX 670-class GPUs in SLI on one board). But since you did not mention games, I assume you don't care about them. And even if you did, a single GTX 670 is still great.

There are some ignorant (meant in its purest sense, not as an insult) people here who do not understand the question but are posting answers anyway, as usual.
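As a footnote to the "as many CUDA cores as you can shove in" point, here's a small sketch that totals the cores across every device CUDA reports. The 192-cores-per-SM figure is specific to Kepler GK104, so treat the total as an estimate on anything else:

    // cuda_cores.cu - rough sketch: total CUDA cores across all devices.
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0, total = 0;
        cudaGetDeviceCount(&count);
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);
            int cores = p.multiProcessorCount * 192;   // 192 cores per SM on Kepler GK104
            total += cores;
            std::printf("%s: ~%d CUDA cores\n", p.name, cores);
        }
        std::printf("Total across %d device(s): ~%d CUDA cores\n", count, total);
        return 0;
    }

On a stock 690 that should come out to roughly 2 x 1536 = 3072.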
 

lewdvig

macrumors 65816
Jan 1, 2002
1,416
75
South Pole
Nvidia's new cards are coming next month. Among them will be a new Titan LE variant at $600 with 2496 CUDA Cores.

2496 is fewer than the 690's total, but a single Titan LE might be better because you could potentially add another one some day (along with an external PSU). Almost 5000 CUDA cores would be very fast.

And because they are just doing math, not moving textures, the PCIe bus is irrelevant - probably - unless you start doing lots of 4k stuff.

Crazy talk.

And yes, Rob ART is a credit to the Mac scene. Great guy.
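Back-of-envelope on that PCIe point (rough numbers, just for scale): an uncompressed 8-bit 4K RGBA frame is about 3840 × 2160 × 4 bytes ≈ 33 MB, so even at PCIe 1.0 x16's theoretical ~4 GB/s it moves in roughly 8 ms, versus ~4 ms at PCIe 2.0 x16. Next to ray-traced render times measured in minutes, that difference is noise, which fits with the 690 winning Rob's AE test despite being stuck at PCIe 1.0.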
 

Topper

macrumors 65816
Jun 17, 2007
1,186
0
Nvidia's new cards are coming next month. Among them will be a new Titan LE variant at $600 with 2496 CUDA Cores.

There is something wrong with these Titan LE (GTX 780) rumors.
If the GTX 780 is priced at $600, it will be a slap in the face to GTX Titan and GTX 680 owners.
I've got to believe the GTX 780 will be priced at $700, maybe more.
If it is $600, get out of my way because I am coming through. Even at $700, I'll be at the front of the line.
 

derbothaus

macrumors 601
Jul 17, 2010
4,093
30
There are some ignorant (meant in its purest sense, not as an insult) people here that do not understand the question posting answers, as usual.

I and the OP were ignorant of the CUDA interaction. :(
Dual cards have historically been boat anchors in OS X.
Apologies. But we all learn, "as usual".
Where were you all several posts ago? I still thought it was 2009 or something.
 

5050

macrumors regular
Original poster
May 28, 2009
180
2
I and the OP were ignorant of the CUDA interaction. :(
Dual cards have historically been boat anchors in OS X.
Apologies. But we all learn, "as usual".
Where were you all several posts ago? I still thought it was 2009 or something.

Hey, it's all good. These threads are about becoming as informed as possible, and the more people contribute, the better off we all are. I actually started to dig deeper with my research because of your posts. Thanks for contributing to the thread!
 

5050

macrumors regular
Original poster
May 28, 2009
180
2
If all you care about is CUDA performance (After Effects and other renderers), get the 690. Any app that uses CUDA to leverage the GPU as a coprocessor will use as many CUDA cores as you can shove into your Mac.

Games will only see one of the two GPUs (the 690 = two GTX 670-class GPUs in SLI on one board). But since you did not mention games, I assume you don't care about them. And even if you did, a single GTX 670 is still great.

There are some ignorant (meant in its purest sense, not as an insult) people here who do not understand the question but are posting answers anyway, as usual.

Just to update this thread: I installed the GTX 690 in the MacPro4,1, powered externally by a Seasonic 660W PSU. I'm running OS X 10.8.3, the CUDA 5.0.45 driver for Mac, and Adobe After Effects CS6 11.0.2. The next step is to make the external PSU "power aware" so that it powers on/off with the main MacPro4,1 PSU.

See images below:
 

Attachments

  • SCREENSHOT_GTX_690_ABOUT_MAC.png (81.3 KB)
  • SCREENSHOT_GTX_690_AE.png (81.8 KB)

IceMacMac

macrumors 6502
Jun 6, 2010
394
18
I was interested in the 690... and since the two apps I'm most interested in are AE and C4D (particularly with the VRAY renderer), I was curious to see what the VRAY spokesperson would have to say.

Stefan (VRAYC4D) posted this:
SLI is in fact bad for GPU rendering; it is a technique to combine graphics power for the screen, but not for render use.

VRayRT can take the power of any number of cards in a better way than SLI: 2 cards give 2x speed, 4 cards give 4-fold speed, and so on - inside the computer, and even from other computers on the network. This doesn't involve SLI; in fact, combining cards via SLI would dramatically slow things down.

That said, he also posted this in the same thread:
I would not go below a 680 4GB, but maybe wait a bit; people say Nvidia will bring 7xx cards, which should finally be an improvement in the GPU.

I don't want to buy into the teeth of a technology shift. MUST. BE. PATIENT. *he says, grinding his teeth.*
 

5050

macrumors regular
Original poster
May 28, 2009
180
2
I was interested in the 690... and since the two apps I'm most interested in are AE and C4D (particularly with the VRAY renderer), I was curious to see what the VRAY spokesperson would have to say.

I don't want to buy into the teeth of a technology shift. MUST. BE. PATIENT. *he says, grinding his teeth.*

New Bare Feats shootout including the GTX 690

http://barefeats.com/gpu680v3.html
 