
twietee

macrumors 603
Jan 24, 2012
5,300
1,675
No 680 with 4 GB has two six-pin connectors; only the stock and one slightly OC'd model do (at least from EVGA).

I see. Just found the article about the Point of View 680 UC 4GB, searched for an English link, and saw two 6-pins mentioned, but that was obviously the 2GB edition. My bad.

Triple-fan cooling with low power consumption may be too good to be true. But just for the record, is Point of View a reliable manufacturer? I read a lot about EVGA in this thread.
 

silvercircle

macrumors member
Nov 18, 2010
61
7
I have the GTX 670 SC 4GB on a 5.1 with the 3690. XP ver 10 is still beta IMHO, and all that tweaking to get the new effects, like the "plausible world," was a joke, at least for professional users. You have plenty of CPU, so you are probably still GPU bound, but some settings eat up CPU too (AI aircraft, for instance). The code just isn't optimized for modern hardware. Or, as the XP fanboys may say, you are just checking too many boxes and setting rendering options too high. You can't have your cake and eat it too with XP 10...yet...or on a Mac Pro...maybe never.

PS - I lose maybe 5% FPS with HDR and gain anywhere from 15 to 25% FPS in Windows instead of ML.

----------

Beta or not, optimized or not... it is the same application, and I wonder why everything is faster on the 680 except XP.
In XP the 5870 is a lot faster. Why, when the 680 is faster in every other benchmark (OpenGL included)?
That is what I do not understand.
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
The answer is really simple: X-Plane is CPU limited with the NVIDIA cards. There are probably driver optimizations that AMD has that make it run better, or perhaps the app is using a different path on NV.

As far as I know, XP is known to be very CPU intensive, which is why you're not seeing any change in FPS when you crank up the settings.
 

xav8tor

macrumors 6502a
Mar 30, 2011
533
36
The answer is really simple: X-Plane is CPU limited with the NVIDIA cards. There are probably driver optimizations that AMD has that make it run better, or perhaps the app is using a different path on NV.

As far as I know, XP is known to be very CPU intensive, which is why you're not seeing any change in FPS when you crank up the settings.

No, not always. On modern computers, XP is seldom CPU limited, although that depends to a degree on the user's chosen settings. GHz matters most, not cores, unless, for example, you run a bunch of additional aircraft at the same time, which may be fun for gamers but is pointless for professional use. The mobo is also seldom a limiting factor. It is most often the GPU. The frame rates I am getting on the 670 are significantly higher than those reported by 5870 users with the same or similar rendering settings, hardware, and other XP options. There are so many options to choose from that hardware comparisons are almost impossible unless the built-in command-line framerate time demo is used. Unfortunately, that too is a bit buggy in the current release.

Also, at least in XP, the GTX admittedly runs better under Windows than ML. In OS X, Barefeats reported significant XP10 FPS drops in ML compared to Lion too. With my new W3690 CPU, I've yet to see a single core hit 100%. In fact, I haven't seen a second core above 30%, and the others just sit there. Again, YMMV depending on your settings.

According to Barefeats, at low screen rez, the 5870 absolutely spanks the GTX 5XX, but at high rez, the MVC GTX 570/580 are slightly ahead of it. Obviously, with improved drivers, the 670/80 will be even better. For me it definitely already is. XP is a conundrum though. If you can solve it, maybe LR will hire you.

See: http://barefeats.com/gam12.html

Bottom line: No computer on Earth will run XP with all of the boxes checked and settings at max. Where the fault lies is another matter.
 

revilate

macrumors member
Nov 4, 2011
34
0
Would two 680s in SLI be supported?

And by "supported" I mean simply being able to boot up in OS X at a screen's native resolution?

I mostly work in Windows via Boot Camp, and two 680s would be quite beneficial, but booting into OS X for Logic is necessary.

Great investigation work regardless, macvidcards :)
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
According to Barefeats, at low screen rez, the 5870 absolutely spanks the GTX 5XX, but at high rez, the MVC GTX 570/580 are slightly ahead of it. Obviously, with improved drivers, the 670/80 will be even better. For me it definitely already is. XP is a conundrum though. If you can solve it, maybe LR will hire you.

Notice how the GTX 285, GTX 570 and GTX 580 basically have the same score in the low-res test? This is what I meant -- it's completely CPU limited, and yes, the driver is slower than AMD's (or the app is taking a different path).
 

xav8tor

macrumors 6502a
Mar 30, 2011
533
36
Notice how the GTX 285, GTX 570 and GTX 580 basically have the same score in the low-res test? This is what I meant -- it's completely CPU limited, and yes, the driver is slower than AMD's (or the app is taking a different path).

Check the XP forums for an explanation...and more than a few excuses. Running a single aircraft (the norm), XP cannot come close to using half the power of a speedy hex, or even a quad. A possible limit is the way XP feeds the CPU, not the CPU itself. The other issue is that AMD runs better in OS X and NVIDIA in Windows. Also, XP is OpenGL. According to the guys who write the code, the most common limit is the GPU in combination with (excessive) rendering settings chosen by the user. The latest version is a train wreck compared to the previous one. Combine that with the performance loss in 10.8 ML as reported on Barefeats, and there are a lot of disappointed Mac users right now who cannot understand how a 4,000 USD maxed-out computer can't run the app without hiccups and/or faster than mid-20s FPS in actual use. If it sheds any light on the issue, it's still a 32-bit app too.
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
Check the XP forums for an explanation...and more than a few excuses. Running a single aircraft (the norm), XP cannot come close to using half the power of a speedy hex, or even a quad. A possible limit is the way XP feeds the CPU, not the CPU itself.

X-Plane is running on the CPU, and if it can't give the GPU enough work to saturate it, then that's what I would call a "CPU limited" case. That is, you could make the GPU infinitely fast and the FPS wouldn't change. You can confirm this by running the OpenGL Driver Monitor from Xcode and enable "GPU Core Utilization" on the NVIDIA GPU to verify that it isn't pegged at 100%.

Any combination of slow app or slow driver can result in the GPU not being fully utilized, and that's what I mean by CPU limited. If you'd like, you can think of it as "not limited by the GPU" instead, but the end result is the same: no improvement in FPS by putting a more powerful GPU in the system.
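If you want to see which side of the fence you're on from inside an app, here's a rough sketch of the idea (my own illustration, not X-Plane's actual code; it assumes GLFW, and draw_scene() is a hypothetical stand-in for the real rendering):

Code:
/* Crude CPU-vs-GPU-bound probe for an OpenGL render loop.
   If the submit time dominates, the render thread is the wall and a
   faster GPU won't raise FPS; if the glFinish() wait dominates, the
   GPU is the bottleneck. */
#include <stdio.h>
#include <GLFW/glfw3.h>   /* pulls in the system OpenGL headers */

static void draw_scene(void) {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  /* stand-in for real draw calls */
}

int main(void) {
    if (!glfwInit()) return 1;
    GLFWwindow *win = glfwCreateWindow(1280, 720, "probe", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(0);                 /* vsync off so the timings are honest */

    while (!glfwWindowShouldClose(win)) {
        double t0 = glfwGetTime();
        draw_scene();                    /* CPU cost of building/submitting work */
        double t1 = glfwGetTime();
        glFinish();                      /* block until the GPU has drained */
        double t2 = glfwGetTime();

        printf("submit %.2f ms, GPU wait %.2f ms\n",
               (t1 - t0) * 1000.0, (t2 - t1) * 1000.0);

        glfwSwapBuffers(win);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}

The Driver Monitor's "GPU Core Utilization" graph tells you the same thing without touching any code, of course.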
 

xav8tor

macrumors 6502a
Mar 30, 2011
533
36
X-Plane is running on the CPU, and if it can't give the GPU enough work to saturate it, then that's what I would call a "CPU limited" case. That is, you could make the GPU infinitely fast and the FPS wouldn't change. You can confirm this by running the OpenGL Driver Monitor from Xcode and enable "GPU Core Utilization" on the NVIDIA GPU to verify that it isn't pegged at 100%.

Any combination of slow app or slow driver can result in the GPU not being fully utilized, and that's what I mean by CPU limited. If you'd like, you can think of it as "not limited by the GPU" instead, but the end result is the same: no improvement in FPS by putting a more powerful GPU in the system.

Well, I think you may have clearly said in a few sentences what the first post in the link below is saying, at least to a great extent. So the bottom line is XP has to run graphics through the CPU, and can't make use of all the cores...just a couple really. Again, it's just the one plane you are flying we're talking about here, not additional "AI aircraft" (other planes in the sky). Therefore, for max FPS, all other things being equal (rendering settings like texture rez, AA, etc.), what's needed is the fastest reasonably modern CPU clock speed possible, coupled with a very good, but not necessarily top-of-the-line, GPU? Correct?

http://forums.x-plane.org/index.php?showtopic=55346
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
You could have an infinite number of CPU cores, and XP wouldn't get any faster. There is one thread calling OpenGL on one core, and that core can't feed work to the GPU fast enough. That's my definition of CPU limited.
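To illustrate the pattern (a minimal sketch of a generic GL app, not X-Plane's actual code): an OpenGL context can only be current on one thread at a time, so all submission funnels through whichever core that thread runs on.

Code:
#include <GLFW/glfw3.h>

int main(void) {
    if (!glfwInit()) return 1;
    GLFWwindow *win = glfwCreateWindow(640, 480, "one render thread", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }

    /* The GL context is bound to THIS thread. Other threads can load
       scenery or run AI aircraft, but they cannot issue GL calls
       against this context. */
    glfwMakeContextCurrent(win);

    while (!glfwWindowShouldClose(win)) {
        /* Every draw call for the frame is issued here, serially, on
           one core. If this loop can't keep the GPU fed, adding cores
           changes nothing; only a faster core raises FPS. */
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(win);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}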
 

xav8tor

macrumors 6502a
Mar 30, 2011
533
36
You could have an infinite number of CPU cores, and XP wouldn't get any faster. There is one thread calling OpenGL on one core, and that core can't feed work to the GPU fast enough. That's my definition of CPU limited.

Thanks, man, for finally providing a straight answer to a question many XP users have. It is what I suspected all along. The GTX 670 allowed me to crank up certain rendering quality settings to the max, which is great, but frame rate increases were minimal otherwise. XP is a GHz hog, which is why I wanted that X5687 to work in my 5.1. At least I now have the W3690 running at turbo max to feed the GTX.
 

silvercircle

macrumors member
Nov 18, 2010
61
7
What controls the fan on the GTX 680?

My GTX 680 is up and running fine in ML on a Mac Pro 5,1.
Sometimes there are some small artifacts, little squares, on the bottom left side of the screen.
OK, in XP the FPS have dropped a bit, but I can turn on HDR and 4x SSAA quad-render HDR antialiasing without loss of FPS. Shadow detail to 'global'.
My HD 5870 says no with these options set :eek:

The card is so quiet, even when it is worked hard. In fact, the noise is mostly produced by the power supply fan. Once the card is put to work, this fan spins up really fast, but it also slows back down relatively fast.

What controls the fan on the GPU? Is it Mac OS or is it the card itself?
Is there any risk of overheating this thing in my Mac Pro?

BTW:
Is there a good guide on how to make the card work in Lion? I tried a few things, but it simply won't work. Or I'm too stupid to make it work :(
 

xav8tor

macrumors 6502a
Mar 30, 2011
533
36
My GTX 680 is up and running fine in ML on a Mac Pro 5,1.
Sometimes there are some small artifacts, little squares, on the bottom left side of the screen.
OK, in XP the FPS have dropped a bit, but I can turn on HDR and 4x SSAA quad-render HDR antialiasing without loss of FPS. Shadow detail to 'global'.
My HD 5870 says no with these options set :eek:

The card is so quiet, even when it is worked hard. In fact, the noise is mostly produced by the power supply fan. Once the card is put to work, this fan spins up really fast, but it also slows back down relatively fast.

What controls the fan on the GPU? Is it Mac OS or is it the card itself?
Is there any risk of overheating this thing in my Mac Pro?

BTW:
Is there a good guide on how to make the card work in Lion? I tried a few things, but it simply won't work. Or I'm too stupid to make it work :(

Don't know about Lion or the fan control per se, but from what I can tell, my OC'd 670 runs cooler than the 4870 and is quiet as a mouse. The only things that get hot in my Pro are the power supply and the Northbridge. I too get the lower-left artifacts from time to time, and not just in XP but in any OpenGL app, so it may be a driver thing. I can also confirm not so much a huge FPS increase as the ability to crank the rendering settings up to max in XP. I think it has been established now that XP is easily CPU limited. I did get a decent increase going from the 3540 to the 3690, though. Standard benchmarks, as opposed to XP, are impressive now for sure.
 

pprior

macrumors 65816
Aug 1, 2007
1,448
9
My GTX680 has been in a few weeks now (Mountain Lion). It's totally quiet (less noise than the 4870 it replaced), and the only artifacts I've seen are when it wakes from sleep: there is a lot of digital noise on the screen for maybe 1/2 to 1 second, and then it snaps back to normal.

Running 2 screens, 1 rotated 90 degrees. Very happy thus far, but I haven't done any Premiere and AE editing yet, which is what I bought it for.
 

Topper

macrumors 65816
Jun 17, 2007
1,186
0
My GTX680 has been in a few weeks now (Mountain Lion). It's totally quiet (less noise than the 4870 it replaced), and the only artifacts I've seen are when it wakes from sleep: there is a lot of digital noise on the screen for maybe 1/2 to 1 second, and then it snaps back to normal.

Running 2 screens, 1 rotated 90 degrees. Very happy thus far, but I haven't done any Premiere and AE editing yet, which is what I bought it for.

How did you hook it up to your power supply?
 

Topper

macrumors 65816
Jun 17, 2007
1,186
0
FYI, from EVGA at least, the stock 680 and the basic superclocked 680 take two 6 pin connectors. Anything more powerful requires one or two 8 pins.

Sorry, my bad. So, it's the 4GB model that has the 8 pin connector.
 

xav8tor

macrumors 6502a
Mar 30, 2011
533
36
Sorry, my bad. So, it's the 4GB model that has the 8 pin connector.

Correct...sort of. There are many models from EVGA alone. I too was confused at first. Only the bottom two in their 680 line are dual six-pin. However, you can use the dual six-pin 4GB 670. I'm happy with it for XP and in Windows, but I still don't have OpenCL running, and don't need it, at least for now, in OS X.
 

derbothaus

macrumors 601
Jul 17, 2010
4,093
30
Some 2GB "super ultra gamma beta clocked" versions have 8-pins as well. Generally speaking, I think the "square" cooler has 1x 6-pin and 1x 8-pin, while the stock reference cooler keeps the 2x 6-pins, of which there is also a "superclocked" version. It is stupidly confusing without looking at the exact specs before buying.
 

twietee

macrumors 603
Jan 24, 2012
5,300
1,675
Sorry, my bad. So, it's the 4GB model that has the 8 pin connector.

I don't think so. The EVGA GeForce GTX 680 Classified 4GB (link in German: http://www.alternate.de/html/product/EVGA/Geforce_GTX_680_Classified/1024872/? - you can click on "mehr Info" (more info) for the specs) has two 6-pin connectors. At least that's what the specs say.
I too saw a Point of View 680 4GB with 2x 6-pins.
So xav8tor, maybe you're lucky XP is CPU limited..;)

But the more info I gather, the more confused I get. Is the 570, with its better bandwidth, maybe much more useful for me compared to the 256-bit bus of the 680?
And I even read that the 4GB models are somewhat slower than the 2GB ones.
I do CAD, PS, and rendering with it, so I would naturally go with 4GB...
 

xav8tor

macrumors 6502a
Mar 30, 2011
533
36
I don't think so. The EVGA GeForce GTX 680 Classified 4GB (link in German: http://www.alternate.de/html/product/EVGA/Geforce_GTX_680_Classified/1024872/? - you can click on "mehr Info" (more info) for the specs) has two 6-pin connectors. At least that's what the specs say.
I too saw a Point of View 680 4GB with 2x 6-pins.
So xav8tor, maybe you're lucky XP is CPU limited..;)

But the more info I gather, the more confused I get. Is the 570, with its better bandwidth, maybe much more useful for me compared to the 256-bit bus of the 680?
And I even read that the 4GB models are somewhat slower than the 2GB ones.
I do CAD, PS, and rendering with it, so I would naturally go with 4GB...

You are right about the EVGA specs not being clear. The PDF spec sheet on their USA site is correct, and I don't think it was the same when I ordered. In any case, on many sites the specs are just wrong. I raised the roof with them about it because I paid over 100 USD in shipping and restocking fees to return the 4 GB 680 Classified. It takes two EIGHT-pin connectors or FOUR six-pin connectors. I'm reasonably satisfied with the overclocked 4 GB 670.
 

pkshdk

macrumors newbie
Jun 11, 2012
17
0
I don't think so. The EVGA GeForce GTX 680 Classified 4GB (link in German: http://www.alternate.de/html/product/EVGA/Geforce_GTX_680_Classified/1024872/? - you can click on "mehr Info" (more info) for the specs) has two 6-pin connectors. At least that's what the specs say.
I too saw a Point of View 680 4GB with 2x 6-pins.
So xav8tor, maybe you're lucky XP is CPU limited..;)

But the more info I gather, the more confused I get. Is the 570, with its better bandwidth, maybe much more useful for me compared to the 256-bit bus of the 680?
And I even read that the 4GB models are somewhat slower than the 2GB ones.
I do CAD, PS, and rendering with it, so I would naturally go with 4GB...

If you look at the product pictures on EVGA's website, you can clearly see that it needs 2x 8-pins.

http://eu.evga.com/products/moreinf...features&family=GeForce 600 Series Family&sw=
 

twietee

macrumors 603
Jan 24, 2012
5,300
1,675
I'm reasonably satisfied with the overclocked 4 GB 670.

Glad to hear it; I will probably take this one too. Is the noise level the same as the 680's under load?

If you look at the product pictures on EVGA's website, you can clearly see that it needs 2x 8-pins.

http://eu.evga.com/products/moreinf...features&family=GeForce 600 Series Family&sw=



Sorry, I meant this one!

http://eu.evga.com/products/moreInfo.asp?pn=04G-P4-3687-KR&family=GeForce 600 Series Family&uc=EUR

Not sure what FTW+ means, though.
Hope this is correct now, still a noob here!
 