Old Aug 11, 2012, 12:21 PM   #226
twietee
macrumors 68030
 
Join Date: Jan 2012
Quote:
Originally Posted by xav8tor
No 680 with 4 GB has two six-pin connectors; only the stock and one slightly OC'd model do (at least from EVGA).
I see. I just found the article about the Point of View 680 UC 4GB, searched for an English link, and saw two 6-pins mentioned, but that was obviously the 2GB edition. My bad.

Triple-fan cooling with low power consumption may be too good to be true. But just for the record, is Point of View a reliable manufacturer? I read a lot about EVGA in this thread.
Old Aug 11, 2012, 01:06 PM   #227
silvercircle
macrumors member
 
Join Date: Nov 2010
Quote:
Originally Posted by xav8tor
I have the GTX 670 SC 4GB on a 5,1 with the 3690. XP version 10 is still beta IMHO, and all that tweaking to get whatever new effects like the "plausible world" was a joke, at least for professional users. You have plenty of CPU, so you probably are still GPU bound, but some settings eat up CPU too (AI aircraft, for instance). The code just isn't optimized for modern hardware. Or, as the XP fanboys may say, you are just checking too many boxes and setting rendering options too high. You can't have your cake and eat it too with XP 10...yet...or on a Mac Pro...maybe never.

PS - I lose maybe 5% FPS with HDR and gain anywhere from 15 to 25% FPS in Windows instead of ML.

Beta or not, optimized or not... it is the same application, and I wonder why everything is faster on the 680 except XP.
In XP the 5870 is a lot faster. Why, when the 680 is faster in every other benchmark (OpenGL included)?
That is what I do not understand.
Old Aug 11, 2012, 01:13 PM   #228
Asgorath
macrumors 6502a
 
Join Date: Mar 2012
The answer is really simple: X-Plane is CPU limited with the NVIDIA cards. There are probably driver optimizations that AMD has that make it run better, or perhaps the app is using a different path on NVIDIA hardware.

As far as I know, XP is known to be very CPU intensive, which is why you're not seeing any change in FPS when you crank up the settings.
Old Aug 11, 2012, 01:40 PM   #229
xav8tor
macrumors 6502
 
Join Date: Mar 2011
Quote:
Originally Posted by Asgorath
The answer is really simple: X-Plane is CPU limited with the NVIDIA cards. There are probably driver optimizations that AMD has that make it run better, or perhaps the app is using a different path on NVIDIA hardware.

As far as I know, XP is known to be very CPU intensive, which is why you're not seeing any change in FPS when you crank up the settings.
No, not always. On modern computers, XP is seldom CPU limited, although that depends to a degree on the settings the user selects. GHz matters most, not cores, unless, for example, you run a bunch of additional aircraft at the same time, which may be fun for gamers but is pointless for professional use. The mobo is also seldom a limiting factor. It is most often the GPU. The frame rates I am getting on the 670 are significantly faster than those reported by 5870 users with the same or similar rendering settings, hardware, and other XP options. There are so many to choose from that hardware comparisons are almost impossible unless the built-in command-line framerate time demo is used. Unfortunately, that too is a bit buggy in the current release.

Also, at least in XP, the GTX admittedly runs better under Windows than ML. In OS X, Barefeats reported significant XP10 FPS drops in ML compared to Lion too. With my new W3690 CPU, I've yet to see a single core hit 100%. In fact, I haven't seen a second core above 30%, and the others just sit there. Again, YMMV depending on your settings.

According to Barefeats, at low screen rez the 5870 absolutely spanks the GTX 5XX, but at high rez the MVC GTX 570/580 are slightly ahead of it. Obviously, with improved drivers, the 670/680 will be even better. For me it definitely already is. XP is a conundrum though. If you can solve it, maybe LR will hire you.

See: http://barefeats.com/gam12.html

Bottom line: No computer on Earth will run XP with all of the boxes checked and settings at max. Where the fault lies is another matter.

Old Aug 11, 2012, 08:11 PM   #230
revilate
macrumors member
 
Join Date: Nov 2011
Would two 680s in SLI be supported?

And by "supported" I mean simply being able to boot up in OS X at a screen's native resolution?

I mostly work in Windows via Boot Camp, and two 680s would be quite beneficial, but booting into OS X for Logic is necessary.

Great investigation work regardless, MacVidCards.
__________________
Mid 2010 Mac Pro, 3.33GHz x2, 64GB, 8TB, 5870 x2
Old Aug 11, 2012, 08:23 PM   #231
Asgorath
macrumors 6502a
 
Join Date: Mar 2012
Quote:
Originally Posted by xav8tor
According to Barefeats, at low screen rez the 5870 absolutely spanks the GTX 5XX, but at high rez the MVC GTX 570/580 are slightly ahead of it. Obviously, with improved drivers, the 670/680 will be even better. For me it definitely already is. XP is a conundrum though. If you can solve it, maybe LR will hire you.
Notice how the GTX 285, GTX 570 and GTX 580 basically have the same score in the low-res test? This is what I meant -- it's completely CPU limited, and yes, the driver is slower than AMD's (or the app is taking a different path).
Old Aug 11, 2012, 09:15 PM   #232
xav8tor
macrumors 6502
 
Join Date: Mar 2011
Quote:
Originally Posted by Asgorath
Notice how the GTX 285, GTX 570 and GTX 580 basically have the same score in the low-res test? This is what I meant -- it's completely CPU limited, and yes, the driver is slower than AMD's (or the app is taking a different path).
Check the XP forums for an explanation...and more than a few excuses. Running a single aircraft (the norm), XP cannot come close to using half the power of a speedy hex, or even a quad. A possible limit is the way XP feeds the CPU, not the CPU itself. The other issue is that AMD runs better in OS X and NVIDIA in Windows. Also, XP is OpenGL. According to the guys who write the code, the most common limit is the GPU in combination with (excessive) rendering settings chosen by the user. The latest version is a train wreck compared to the previous one. Combine that with the performance loss in 10.8 ML as reported on Barefeats, and there are a lot of disappointed Mac users right now who cannot understand how a 4,000 USD maxed-out computer can't run the app without hiccups and/or faster than mid-20s FPS in actual use. If it sheds any light on the issue, it's still a 32-bit app too.
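(On the 32-bit point, the ceiling is easy to compute: a 32-bit process can address at most 2^32 bytes, so however much RAM is in the box, the sim itself tops out around 4 GiB, and in practice less.)

Code:
# Address-space ceiling of a 32-bit process, independent of installed RAM.
print(2**32 / 2**30, "GiB")  # 4.0 GiB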

Old Aug 11, 2012, 10:13 PM   #233
Asgorath
macrumors 6502a
 
Join Date: Mar 2012
Quote:
Originally Posted by xav8tor
Check the XP forums for an explanation...and more than a few excuses. Running a single aircraft (the norm), XP cannot come close to using half the power of a speedy hex, or even a quad. A possible limit is the way XP feeds the CPU, not the CPU itself.
X-Plane is running on the CPU, and if it can't give the GPU enough work to saturate it, then that's what I would call a "CPU limited" case. That is, you could make the GPU infinitely fast and the FPS wouldn't change. You can confirm this by running the OpenGL Driver Monitor from Xcode and enabling "GPU Core Utilization" on the NVIDIA GPU to verify that it isn't pegged at 100%.

Any combination of slow app or slow driver can result in the GPU not being fully utilized, and that's what I mean by CPU limited. If you'd like, you can think of it as "not limited by the GPU" instead, but the end result is the same: no improvement in FPS by putting a more powerful GPU in the system.
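To put toy numbers on that: in an overlapped pipeline, the frame time is roughly the slower of the CPU and GPU stages, so FPS stops responding to GPU speed as soon as the CPU stage dominates. A minimal sketch of that model (the millisecond costs are invented for illustration; nothing here is measured from X-Plane):

Code:
# Toy model of an overlapped frame pipeline: each frame costs some CPU
# time (simulation plus draw-call submission) and some GPU time
# (rendering). With the stages overlapping, the steady-state frame time
# is roughly the larger of the two.

def fps(cpu_ms, gpu_ms):
    """Approximate frames per second when CPU and GPU work overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 40.0                      # invented single-threaded CPU cost per frame
for gpu_ms in (50.0, 25.0, 10.0):  # progressively faster hypothetical GPUs
    print(f"GPU {gpu_ms:4.0f} ms/frame -> {fps(cpu_ms, gpu_ms):.1f} FPS")
# GPU   50 ms/frame -> 20.0 FPS  (GPU limited: a faster card helps)
# GPU   25 ms/frame -> 25.0 FPS  (CPU limited: capped at 1000/40)
# GPU   10 ms/frame -> 25.0 FPS  (an infinitely fast GPU changes nothing)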
Old Aug 11, 2012, 10:37 PM   #234
xav8tor
macrumors 6502
 
Join Date: Mar 2011
Quote:
Originally Posted by Asgorath
X-Plane is running on the CPU, and if it can't give the GPU enough work to saturate it, then that's what I would call a "CPU limited" case. That is, you could make the GPU infinitely fast and the FPS wouldn't change. You can confirm this by running the OpenGL Driver Monitor from Xcode and enabling "GPU Core Utilization" on the NVIDIA GPU to verify that it isn't pegged at 100%.

Any combination of slow app or slow driver can result in the GPU not being fully utilized, and that's what I mean by CPU limited. If you'd like, you can think of it as "not limited by the GPU" instead, but the end result is the same: no improvement in FPS by putting a more powerful GPU in the system.
Well, I think you may have clearly said in a few sentences what the first post in the link below is saying, at least to a great extent. So the bottom line is that XP has to run graphics through the CPU and can't make use of all the cores...just a couple, really. Again, it's just the one plane you are flying we're talking about here, not additional "AI aircraft" (other planes in the sky). Therefore, for max FPS, all other things being equal (rendering settings like texture rez, AA, etc.), what's needed is the fastest reasonably modern CPU clock speed possible, coupled with a very good, but not necessarily top-of-the-line, GPU? Correct?

http://forums.x-plane.org/index.php?showtopic=55346
Old Aug 12, 2012, 01:32 AM   #235
Asgorath
macrumors 6502a
 
Join Date: Mar 2012
You could have an infinite number of CPU cores, and XP wouldn't get any faster. There is one thread calling OpenGL on one core, and that core can't feed work to the GPU fast enough. That's my definition of CPU limited.
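The "infinite number of CPU cores" point is just Amdahl's law: if the one GL-feeding thread is a serial fraction s of each frame, no core count can push the overall speedup past 1/s. A sketch with a made-up serial fraction (not a measured X-Plane number):

Code:
# Amdahl's law: with a serial fraction s of the work (here, the single
# thread feeding OpenGL), the best speedup from n cores is
# 1 / (s + (1 - s) / n), which converges to 1/s as n grows.

def speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

s = 0.8  # placeholder: pretend 80% of the frame is the serial GL thread
for n in (1, 2, 6, 12, 1_000_000):
    print(f"{n:>9} cores -> {speedup(s, n):.2f}x")
# Tops out near 1 / 0.8 = 1.25x no matter how many cores you add.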
Old Aug 12, 2012, 08:58 AM   #236
xav8tor
macrumors 6502
 
Join Date: Mar 2011
Quote:
Originally Posted by Asgorath
You could have an infinite number of CPU cores, and XP wouldn't get any faster. There is one thread calling OpenGL on one core, and that core can't feed work to the GPU fast enough. That's my definition of CPU limited.
Thanks, man, for finally providing a straight answer to a question many XP users have. It is what I suspected all along. The GTX 670 allowed me to crank certain rendering quality settings up to the max, which is great, but frame rate increases were minimal otherwise. XP is a GHz hog, which is why I wanted that X5687 to work in my 5,1. At least I now have the W3690 running at turbo max to feed the GTX.
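If FPS really does track clock speed in the CPU-limited case, a first-order guess at a CPU swap is just the clock ratio. A back-of-the-envelope sketch (the W3540/W3690 clocks are Intel's spec-sheet numbers; the 25 FPS starting point is invented, and real scaling is messier):

Code:
# First-order estimate for a CPU-limited app: FPS scales roughly with
# CPU clock. W3540 = 2.93 GHz base; W3690 = 3.46 GHz base, 3.73 GHz
# max turbo (Intel spec-sheet figures). Real-world gains will differ.

def scaled_fps(fps_old, ghz_old, ghz_new):
    return fps_old * ghz_new / ghz_old

print(f"{scaled_fps(25, 2.93, 3.46):.1f} FPS")  # ~29.5 at base clock
print(f"{scaled_fps(25, 2.93, 3.73):.1f} FPS")  # ~31.8 at max turbo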
Old Aug 13, 2012, 12:00 PM   #237
silvercircle
macrumors member
 
Join Date: Nov 2010
What controls the fan on the GTX 680?

My GTX 680 is up and running fine in ML on a Mac Pro 5,1.
Sometimes there are some small artifacts, little squares, at the bottom left of the screen.
OK, in XP the FPS have dropped a bit, but I can turn on HDR and 4x SSAA quad-render HDR antialiasing without loss of FPS, with shadow detail set to 'global'.
My HD 5870 says no with these options set.

The card is so quiet, even when it is worked hard. In fact, the noise is mostly produced by the power supply fan. Once the card is put to work, this fan spins up really fast, but it also slows back down relatively fast.

What controls the fan on the GPU? Is it Mac OS or is it the card itself?
Are there any risks of overheating this thing in my Mac Pro?

BTW:
Is there a good guide on how to make the card work in Lion? I tried a few things but it simply won't work. Or I'm too stupid to make it work.
Old Aug 13, 2012, 12:36 PM   #238
xav8tor
macrumors 6502
 
Join Date: Mar 2011
Quote:
Originally Posted by silvercircle
My GTX 680 is up and running fine in ML on a Mac Pro 5,1.
Sometimes there are some small artifacts, little squares, at the bottom left of the screen.
OK, in XP the FPS have dropped a bit, but I can turn on HDR and 4x SSAA quad-render HDR antialiasing without loss of FPS, with shadow detail set to 'global'.
My HD 5870 says no with these options set.

The card is so quiet, even when it is worked hard. In fact, the noise is mostly produced by the power supply fan. Once the card is put to work, this fan spins up really fast, but it also slows back down relatively fast.

What controls the fan on the GPU? Is it Mac OS or is it the card itself?
Are there any risks of overheating this thing in my Mac Pro?

BTW:
Is there a good guide on how to make the card work in Lion? I tried a few things but it simply won't work. Or I'm too stupid to make it work.
Don't know about Lion or the fan control per se, but from what I can tell, my OC'd 670 runs cooler than the 4870 and is quiet as a mouse. The only things that get hot in my Pro are the power supply and the Northbridge. I too get the lower-left artifacts from time to time, and not just in XP but in any OpenGL app, so it may be a driver thing. I can also confirm that it's not so much a huge FPS increase as the ability to crank the rendering settings up to max in XP. I think it has been established now that XP is easily CPU limited. I did get a decent increase going from the 3540 to the 3690, though. Standard benchmarks, as opposed to XP, are impressive now for sure.
Old Aug 13, 2012, 04:46 PM   #239
pprior
macrumors 65816
 
Join Date: Aug 2007
My GTX 680 has been in for a few weeks now (Mountain Lion). It's totally quiet (less noise than the 4870 it replaced), and the only artifact I've seen is on wake from sleep: there is a lot of digital noise on the screen for maybe half a second to a second, and then it snaps back to normal.

Running two screens, one rotated 90 degrees. Very happy thus far, but I haven't done any Premiere or AE editing yet, which is what I bought it for.
__________________
2013 nMP 6core 16GB RAM, Promise Pegasus2 R6 12TB RAID; uMBP 17/Matte
Old Aug 13, 2012, 05:45 PM   #240
Topper
macrumors 65816
 
Join Date: Jun 2007
Quote:
Originally Posted by pprior
My GTX 680 has been in for a few weeks now (Mountain Lion). It's totally quiet (less noise than the 4870 it replaced), and the only artifact I've seen is on wake from sleep: there is a lot of digital noise on the screen for maybe half a second to a second, and then it snaps back to normal.

Running two screens, one rotated 90 degrees. Very happy thus far, but I haven't done any Premiere or AE editing yet, which is what I bought it for.
How did you hook it up to your power supply?
__________________
ASUS Z87-Pro, Core i7-4770k, 16GB DDR 1866, EVGA Nvidia GTX Titan 6GB, HP LP3065
Old Aug 13, 2012, 06:21 PM   #241
xav8tor
macrumors 6502
 
Join Date: Mar 2011
Quote:
Originally Posted by Topper
How did you hook it up to your power supply?
FYI, from EVGA at least, the stock 680 and the basic superclocked 680 take two 6-pin connectors. Anything more powerful requires one or two 8-pins.
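For anyone totaling up the power budget: per the PCIe spec, the slot supplies up to 75 W and each aux connector adds 75 W (6-pin) or 150 W (8-pin), and the 2009-2012 Mac Pro board only offers two 6-pin feeds. A quick sketch with those spec-limit numbers (the 195 W TDP is the reference 680's; verify against your exact model):

Code:
# PCIe spec power limits: 75 W through the slot, 75 W per 6-pin aux
# connector, 150 W per 8-pin. A 2009-2012 Mac Pro supplies the slot
# plus two 6-pin feeds, so 225 W is the in-spec ceiling without adapters.

SLOT_W = 75
AUX_W = {"6-pin": 75, "8-pin": 150}

def max_spec_power(connectors):
    """In-spec power ceiling for a card with the given aux connectors."""
    return SLOT_W + sum(AUX_W[c] for c in connectors)

print(max_spec_power(["6-pin", "6-pin"]))  # 225 W: reference 680 (195 W TDP) fits
print(max_spec_power(["6-pin", "8-pin"]))  # 300 W: needs a feed the Mac Pro lacks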
Old Aug 13, 2012, 07:25 PM   #242
Topper
macrumors 65816
 
Join Date: Jun 2007
Quote:
Originally Posted by xav8tor
FYI, from EVGA at least, the stock 680 and the basic superclocked 680 take two 6-pin connectors. Anything more powerful requires one or two 8-pins.
Sorry, my bad. So, it's the 4GB model that has the 8-pin connector.
__________________
ASUS Z87-Pro, Core i7-4770k, 16GB DDR 1866, EVGA Nvidia GTX Titan 6GB, HP LP3065
Old Aug 13, 2012, 08:09 PM   #243
xav8tor
macrumors 6502
 
Join Date: Mar 2011
Quote:
Originally Posted by Topper
Sorry, my bad. So, it's the 4GB model that has the 8-pin connector.
Correct...sort of. There are many models from EVGA alone. I too was confused at first. Only the bottom two in their 680 line are dual 6-pin. However, you can use the two-6-pin 4GB 670. I'm happy with it for XP and in Windows, but I still don't have OpenCL running, and don't need it, at least for now, in OS X.
Old Aug 13, 2012, 09:42 PM   #244
derbothaus
macrumors 601
 
Join Date: Jul 2010
Some 2GB "super ultra gamma beta clocked" versions have 8-pins as well. I think, generally speaking, the "square" cooler has 1x 6-pin and 1x 8-pin, while the stock reference cooler keeps the 2x 6-pins. Of which there is also a "superclocked". It is stupidly confusing without looking at the exact specs before buying.
__________________
Mac Pro W3680, GTX 680, 12GB DDR3, SSD; MBP, 2.6GHz Core i7, 16GB DDR3, SSD; Eizo fs2333
Old Aug 13, 2012, 10:00 PM   #245
pprior
macrumors 65816
 
Join Date: Aug 2007
The one I bought (EVGA GTX 680 SC) has two 6-pin connectors, so it was just plug and play.
__________________
2013 nMP 6core 16GB RAM, Promise Pegasus2 R6 12TB RAID; uMBP 17/Matte
Old Aug 13, 2012, 10:03 PM   #246
derbothaus
macrumors 601
 
Join Date: Jul 2010
Quote:
Originally Posted by pprior
The one I bought (EVGA GTX 680 SC) has two 6-pin connectors, so it was just plug and play.
That's the reference cooler. For reference.
__________________
Mac Pro W3680, GTX 680, 12GB DDR3, SSD; MBP, 2.6GHz Core i7, 16GB DDR3, SSD; Eizo fs2333
Old Aug 14, 2012, 01:48 AM   #247
twietee
macrumors 68030
 
Join Date: Jan 2012
Quote:
Originally Posted by Topper
Sorry, my bad. So, it's the 4GB model that has the 8-pin connector.
I don't think so. The EVGA GeForce GTX 680 Classified 4GB (link in German: http://www.alternate.de/html/product...fied/1024872/? - you can click on "mehr Info" (more info) for the specs) has two 6-pin connectors. At least that's what the specs say.
I too saw a Point of View 680 4GB with 2x 6-pins.
So xav8tor, maybe you're lucky XP is CPU limited..

But the more info I gather, the more confused I get. Is the 570, with its better bandwidth, maybe much more useful for me than the 256-bit bus of the 680?
And I even read that the 4GB models are somewhat slower than the 2GB ones.
I do CAD, PS, and rendering with it, so I would naturally go with 4GB...
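Trying to sanity-check the bandwidth assumption myself: peak memory bandwidth is bus width times effective memory clock, and going by the reference spec-sheet numbers (worth double-checking for any particular vendor card), the 680's faster GDDR5 more than makes up for its narrower bus:

Code:
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate.
# Reference-card figures from the public spec sheets; vendor models vary.

def bandwidth_gb_s(bus_bits, effective_gbps):
    return bus_bits / 8 * effective_gbps

print(f"GTX 570: {bandwidth_gb_s(320, 3.8):.0f} GB/s")    # ~152 GB/s on a 320-bit bus
print(f"GTX 680: {bandwidth_gb_s(256, 6.008):.0f} GB/s")  # ~192 GB/s despite 256 bits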
Old Aug 14, 2012, 06:02 AM   #248
xav8tor
macrumors 6502
 
Join Date: Mar 2011
Quote:
Originally Posted by twietee
I don't think so. The EVGA GeForce GTX 680 Classified 4GB (link in German: http://www.alternate.de/html/product...fied/1024872/? - you can click on "mehr Info" (more info) for the specs) has two 6-pin connectors. At least that's what the specs say.
I too saw a Point of View 680 4GB with 2x 6-pins.
So xav8tor, maybe you're lucky XP is CPU limited..

But the more info I gather, the more confused I get. Is the 570, with its better bandwidth, maybe much more useful for me than the 256-bit bus of the 680?
And I even read that the 4GB models are somewhat slower than the 2GB ones.
I do CAD, PS, and rendering with it, so I would naturally go with 4GB...
You are right about the EVGA specs not being clear. The PDF spec sheet on their USA site is correct, and I don't think it was the same when I ordered. In any case, on many sites the specs are just wrong. I raised the roof with them about it because I paid over 100 USD in shipping and restocking fees to return the 4 GB 680 Classified. It takes two EIGHT-pin connectors or FOUR six-pin connectors. I'm reasonably satisfied with the overclocked 4 GB 670.
Old Aug 14, 2012, 06:31 AM   #249
pkshdk
macrumors newbie
 
Join Date: Jun 2012
Quote:
Originally Posted by twietee
I don't think so. The EVGA GeForce GTX 680 Classified 4GB (link in German: http://www.alternate.de/html/product...fied/1024872/? - you can click on "mehr Info" (more info) for the specs) has two 6-pin connectors. At least that's what the specs say.
I too saw a Point of View 680 4GB with 2x 6-pins.
So xav8tor, maybe you're lucky XP is CPU limited..

But the more info I gather, the more confused I get. Is the 570, with its better bandwidth, maybe much more useful for me than the 256-bit bus of the 680?
And I even read that the 4GB models are somewhat slower than the 2GB ones.
I do CAD, PS, and rendering with it, so I would naturally go with 4GB...
If you look at the product pictures on EVGA's website, you can clearly see it needs 2x 8-pins.

http://eu.evga.com/products/moreinfo...s%20Family&sw=
Old Aug 14, 2012, 07:00 AM   #250
twietee
macrumors 68030
 
Join Date: Jan 2012
Quote:
Originally Posted by xav8tor
I'm reasonably satisfied with the overclocked 4 GB 670.
Glad to hear it; I will probably take that one, too. Is the noise level under load the same as the 680's?

Quote:
Originally Posted by pkshdk
If you look at the product pictures on EVGA's website, you can clearly see it needs 2x 8-pins.

http://eu.evga.com/products/moreinfo...s%20Family&sw=


Sorry, I meant this one!

http://eu.evga.com/products/moreInfo...0Family&uc=EUR

Not sure what FTW+ means, though.
Hope this is correct now; still a noob here!
