
flavr · macrumors 6502 · Original poster · Nov 9, 2011
Calling all GTX 680mx / 675mx owners...please post VALLEY benchmarks! You can download it here:

http://unigine.com/products/valley/

It has some really detailed, gorgeous graphics that put these cards through their paces! I have a 675mx and need something to compare my benchmark to.

Run it with these settings:

1920x1080 resolution at 16:9
anti-aliasing x2
ultra detail
v-sync OFF, all other boxes checked ON

post your final benchmark results with a screenshot, thanks!
 
2012 iMac, i7 3.4GHz, 32GB RAM, GeForce GTX 680MX, Windows 7 64-bit, 320.49 Nvidia drivers, settings as you listed them:

Stock clocks, vsync ON + triple buffering: 43.7 fps avg, score 1829, min 22.3 fps, max 65.3 fps

225/350 clocks, vsync ON + triple buffering: 48.5 fps avg, score 2028, min 25.6 fps, max 60.5 fps

Stock clocks, vsync OFF: 44.0 fps avg, score 1840, min 22.1 fps, max 88.4 fps

225/350 clocks, vsync OFF: 55 fps avg, score 2332, min 26.4 fps, max 112.3 fps
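
Quick back-of-the-envelope on the uncapped runs, for anyone skimming: that 225/350 overclock works out to roughly a 25% gain in average fps (the vsync-on runs understate it, presumably because the 60fps cap clips the top end).

Code:
# Percentage gains from the uncapped (vsync OFF) runs above
stock_avg, oc_avg = 44.0, 55.0
stock_score, oc_score = 1840, 2332
print(f"avg fps gain: {(oc_avg / stock_avg - 1) * 100:.0f}%")      # ~25%
print(f"score gain:   {(oc_score / stock_score - 1) * 100:.0f}%")  # ~27%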

As an aside, having vsync set to off does nothing for our iMac displays except heat up our GPUs more than necessary. The iMac cannot display more than 60fps.

As an addendum, the minimum frames per second is mostly irrelevant: it drops sharply when the benchmark switches from one scene to the next, far more than it does during the scenes themselves.
 
As an aside, having vsync set to off does nothing for our iMac displays except heat up our GPUs more than necessary. The iMac cannot display more than 60fps.

Is this specifically regarding the 2012 model or iMacs in general? I haven't heard this before.
 
Is this specifically regarding the 2012 model or iMacs in general? I haven't heard this before.

No, this is for ANY display at 60hz, or 120hz, or whatever. 60hz = 60fps, 120hz = 120fps. So vsync on a 120hz display would lock the display to 120fps. Our iMacs are 60hz, as are most monitors. So anything beyond 60fps is absolutely, 100% pointless. This isn't a debate about whether you or I can see the difference. There is NO difference. All you get with vsync off is screen tearing and an unnecessarily hot GPU, since the GPU has to render those frames internally despite the fact that the display device (iMac, in this case) can't show all those frames.
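
To make that concrete, here's a toy sketch (the render rates are made up purely for illustration, not measurements): a 60Hz panel only scans out a new image every ~16.7 ms, so no matter how fast the GPU renders, you never see more than 60 new frames a second.

Code:
# Toy model: a 60 Hz panel scans out a new image every ~16.7 ms,
# so frames finished between scanouts are simply never displayed.
REFRESH_HZ = 60

def displayed_fps(render_fps, duration=10.0):
    """New (distinct) frames the panel actually shows per second."""
    frame_time = 1.0 / render_fps
    scan_interval = 1.0 / REFRESH_HZ
    shown_new, last_shown = 0, None
    scan = 0.0
    while scan < duration:
        # most recent frame finished by this scanout (epsilon guards float rounding)
        latest_done = int(scan / frame_time + 1e-9)
        if latest_done != last_shown:  # only count frames the panel hasn't shown before
            shown_new += 1
            last_shown = latest_done
        scan += scan_interval
    return shown_new / duration

for fps in (44, 60, 88, 112):
    print(f"GPU renders {fps:>3} fps -> panel shows ~{displayed_fps(fps):.0f} new frames/s")

Run it and the 88 and 112 fps cases both flatten out at 60, which is the whole point.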
 
No, this is for ANY display at 60hz, or 120hz, or whatever. 60hz = 60fps, 120hz = 120fps. So vsync on a 120hz display would lock the display to 120fps. Our iMacs are 60hz, as are most monitors. So anything beyond 60fps is absolutely, 100% pointless. This isn't a debate about whether you or I can see the difference. There is NO difference. All you get with vsync off is screen tearing and an unnecessarily hot GPU, since the GPU has to render those frames internally despite the fact that the display device (iMac, in this case) can't show all those frames.

Thanks, this is something I didn't really understand until now. I found this interesting article. Page 9 is where it covers the current topic. It seems to agree with your basic premise but then goes on to show that the full answer can be more complicated. Thanks again for your reply.
 
Thanks, this is something I didn't really understand until now. I found this interesting article. Page 9 is where it covers the current topic. It seems to agree with your basic premise but then goes on to show that the full answer can be more complicated. Thanks again for your reply.

Right, I was giving the basic premise, but you had to go and make things more complicated :) If you're going to use vsync set to on, you need a way to triple-buffer to avoid exactly what is explained on page 9 of that article.

Enter D3DOverrider. Download it, enable triple buffering. Problem solved. :)
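
If you want to see what that page-9 problem looks like in numbers, here's a rough sketch (invented frame times, and a deliberately simplified model of triple buffering): with plain double-buffered vsync the GPU stalls until the next vblank after every frame, so the average snaps down to 60/30/20 fps, while a third buffer lets it keep rendering so the average just tracks the render rate, capped at 60.

Code:
# Toy model of double- vs triple-buffered vsync on a 60 Hz panel.
# Frame times below are invented; the point is the fps quantization
# you get with only two buffers versus the smoother average with three.
import math

REFRESH = 1.0 / 60  # a vblank every ~16.7 ms

def avg_fps_double_buffered(frame_time, frames=600):
    """Two buffers: after each frame the GPU waits for the next vblank."""
    t = 0.0
    for _ in range(frames):
        t += frame_time                       # render the frame
        t = math.ceil(t / REFRESH) * REFRESH  # stall until the next vblank
    return frames / t

def avg_fps_triple_buffered(frame_time):
    """Third buffer: the GPU never stalls, so the average is simply the
    render rate, capped at the 60 Hz refresh."""
    return min(1.0 / frame_time, 1.0 / REFRESH)

for ms in (14, 20, 25):
    ft = ms / 1000.0
    print(f"{ms} ms/frame: double-buffered ~{avg_fps_double_buffered(ft):.0f} fps, "
          f"triple-buffered ~{avg_fps_triple_buffered(ft):.0f} fps")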

I believe Far Cry 3 has this GPU buffering option built right into the game, but most games, for no reason I can think of, don't have it, so you need a global app like D3DOverrider.

And, on that note, you can see my benchmarks above. I need to edit, since I didn't specify I was using triple buffering. :D :D
 
Attachment: Screen Shot 2013-08-13 at 19.41.44.png
This is from my 680mx, how does it compare to the 675mx?
 
It has some really detailed, gorgeous graphics that put these cards through their paces! I have a 675mx and need something to compare my benchmark to.

Could you post your results too? I'm curious how it compares.
 
Yep...here are my results...and iMac specs

VERY CLOSE to the 680mx as I suspected
 

Attachments: Screen Shot 2013-08-13 at 9.46.18 PM.png, Screen Shot 2013-08-13 at 9.52.24 PM.jpg
Just for comparison's sake, here are some results from my Mac Pro 5,1 with two different CPUs and two different GPUs. All tests were run using the "extreme" presets.
 

Attachments: 2.40 to 2.93.jpg
It's a benchmark, and not a very good one, either.

You'll find some differences at higher resolutions, too.

http://barefeats.com/imac12g4.html

Run the benchmark at 2560x1440 and see how you do. I game at 1440p, and it seems to me the GTX 680MX is the way to go.

I just read that article; the test results are flawed, and they admit as much. Then I read the article linked within it, which contains less flawed results, close enough for our purposes here though.

My take after looking at that material is that the 675MX is a decent GPU, but it is lacking, especially in the area of video RAM. I don't think anyone planning to play games for the next few years should be buying any GPU with less than 2 GB anymore, because as video RAM usage by games continues to increase, there will be a performance hit for having 1 GB or less. To me personally, that is the biggest reason to fork over another $150 for the upgrade to the 680MX, which otherwise is not a huge upgrade over the 675MX for gaming purposes, although it is certainly faster.

Please note I said "huge" there. I mean, it is not twice as fast, and the benchmarks bear that out. It was around 20 percent or so faster using features you'd expect to see in new games. However, and this goes back to the video RAM issue, the benchmarks used only about 900 MB of video RAM. There would be a more substantial difference if they used more memory where available, as some games already do. How much? I honestly don't know; I have not seen benchmarks highlighting the difference, but they would be interesting to see.

Anyway, I wouldn't rain on this guy's parade. He was looking to see whether his card holds up against the top end, and for the most part, it does. There's always going to be something better, if not right now then pretty soon, like the next refresh coming up, and the one after that, and so on.
 
It seems some people just won't admit the truth in this... Gaming at a non-native resolution on the iMac looks like crap compared to 1440p, and more than 1 GB of onboard VRAM is necessary for solid performance at a high resolution like 1440p. Just try playing a recent AAA game and see how much VRAM it uses.
Now, if you have very little interest in games and only play indie-type titles or older stuff for nostalgia, then the 675MX will do nicely. Even if those are your goals, once you have that great-looking 27" iMac screen you'll want to play more recent games, believe me. Another thing is framerate: you need 40+ fps in games for a decent response while playing. (Hardcore FPS gamers prefer 60-80+ fps and 120Hz monitors.)
Playing at 1440p is very demanding on hardware, and a 680MX is the minimum requirement for a decent gaming experience with modern games IMO.
 
It seems some people just won't admit the truth in this... Gaming at a non-native resolution on the iMac looks like crap compared to 1440p, and more than 1 GB of onboard VRAM is necessary for solid performance at a high resolution like 1440p. Just try playing a recent AAA game and see how much VRAM it uses.
Now, if you have very little interest in games and only play indie-type titles or older stuff for nostalgia, then the 675MX will do nicely. Even if those are your goals, once you have that great-looking 27" iMac screen you'll want to play more recent games, believe me. Another thing is framerate: you need 40+ fps in games for a decent response while playing. (Hardcore FPS gamers prefer 60-80+ fps and 120Hz monitors.)
Playing at 1440p is very demanding on hardware, and a 680MX is the minimum requirement for a decent gaming experience with modern games IMO.

Agreed. 1080p on the 27" 1440p iMac is uglier than ugly. I'd rather turn down some details and run 1440p than have it cranked at 1080p.
 
Here is the GTX 680MX running at native resolution.
 

Attachments: Screen Shot 2013-08-14 at 17.13.31.png
Here is the GTX 680MX running at native resolution.

I'll get some results on that when I reboot to Windows.

Also, it's important to note that the GTX 680MX has huge overclocking potential. I'm not sure about the 675, but it's easy to get a 25% fps boost on the 680 with a small overclock. Temperatures only go up by about 2 degrees, too. A 225 core / 350 memory offset is a 100% stable 24/7 overclock.
 
I'll get some results on that when I reboot to Windows.

Also, it's important to note that the GTX 680MX has huge overclocking potential. I'm not sure about the 675, but it's easy to get a 25% fps boost on the 680 with a small overclock. Temperatures only go up by about 2 degrees, too. A 225 core / 350 memory offset is a 100% stable 24/7 overclock.

I'll have to give it a go in Windows too. Both mine were carried out in OS X. I did notice my GPU getting pretty toasty while running these, touching 90 degrees, which I thought was the upper limit for the 680mx.
 
I'll have to give it a go in Windows too. Both mine were carried out in OS X. I did notice my GPU getting pretty toasty while running these, touching 90 degrees, which I thought was the upper limit for the 680mx.

Interesting. Not sure I hit 90C even with my overclock. It was close, though: 88C, if I recall correctly. I'm working right now, so I'll get to it in the next day or so and report back with findings.
 
Interesting. Not sure I hit 90C even with my overclock. It was close, though: 88C, if I recall correctly. I'm working right now, so I'll get to it in the next day or so and report back with findings.

What a difference Windows makes! I barely hit 80 degrees running both of these at native and 1080p while in Boot Camp. 1080p does 100% suck balls compared to native though. Almost like stepping back into '90s TV o_O
 

Attachments: 1.PNG, 2.PNG
What a difference Windows makes! I barely hit 80 degrees running both of these at native and 1080p while in Boot Camp. 1080p does 100% suck balls compared to native though. Almost like stepping back into '90s TV o_O

Yep, much more in line with my results (see second post). I seriously don't know why anyone with these high-end iMacs uses OS X for gaming. People spend all that money on a gaming video card just to use OS X to play? Madness!
 
Yep, much more in line with my results (see second post). I seriously don't know why anyone with these high-end iMacs uses OS X for gaming. People spend all that money on a gaming video card just to use OS X to play? Madness!

I know what you mean; the only time I game on OS X is when I'm not bothered about graphics and just wanna kill some time, à la Portal 2 ;). Everything else (now that I have it working) is through Boot Camp. It's a shame it has to be this way, as I was hoping never to use Windows at home again... but hey ho, games must be played :)
 
I know what you mean; the only time I game on OS X is when I'm not bothered about graphics and just wanna kill some time, à la Portal 2 ;). Everything else (now that I have it working) is through Boot Camp. It's a shame it has to be this way, as I was hoping never to use Windows at home again... but hey ho, games must be played :)

Oh don't get me wrong. In a perfect world all these games would run in OS X, and BETTER than they do in Windows. :D
 
Thanks for posting guys!

----------

Here is the GTX 680MX running at native resolution.

When I get home I will run a test at native 1440 with the 675mx... it will be in OS X so we can compare it with your OS X results and get an apples-to-apples comparison... I'm guessing I'll get awfully close to your 1183 score ;)
 