And again: R9 M295X vs 970M vs 980M, all running Windows 8.1. Obviously the Nvidia GPUs are not in iMacs. It will be interesting to hear your thoughts.

File links:-
https://drive.google.com/file/d/0B9vl5TopANpYcEZORWhXdE5JWWc/view?usp=sharing
https://drive.google.com/file/d/0B9vl5TopANpYLWtzTjJteG93dVE/view?usp=sharing

Image

Image
 
Thanks for the results guys - my main concern was 1440p performance exceeding the 780 in the 2013 iMac (my current machine).

A 30%+ improvement is nothing to be sniffed at.
 
Thanks for posting these results. Not in the same ballpark as a 980M. Sad... Still, better than I thought it would be.
 

Ahh ok, must have missed it. So it does indeed look like the 680MX is easily the equal of the R9 M295X (actually, faster) with an overclock. Boo.

I wish I knew what was up with Apple and this R9 M295X. They're clearly holding back. No question the GPU should have progressed more in two years.

Is there a standard resolution those run at, or were they all taken at 2560x1440?

----------

Thanks for the results guys - my main concern was 1440p performance exceeding the 780 in the 2013 iMac (my current machine).

A 30%+ improvement is nothing to be sniffed at.

Wha? 30%? Where are you seeing that!? :eek:
 
Ahh ok, must have missed it. So it does indeed look like the 680MX is easily the equal of the R9 M295X (actually, faster) with an overclock. Boo.

The 680MX in those numbers might already be overclocked; I don't know, as these benchmark results don't report the GPU clock speed. I guess the thing to do might be to run it yourself and report back here (if you have a 680MX)?

Hopefully there should be some improvement in the R9 M295X results over time as the drivers improve, whereas I suspect much less so for the other cards, because their drivers are already mature.
 
The 680MX in those numbers might already be overclocked; I don't know, as these benchmark results don't report the GPU clock speed. I guess the thing to do might be to run it yourself and report back here (if you have a 680MX)?

Hopefully there should be some improvement in the R9 M295X results over time as the drivers improve, whereas I suspect much less so for the other cards, because their drivers are already mature.

Good news. That 680MX is indeed overclocked in your benchmark comparison! Just booted into Windows 7 and ran the same tests on my 2012 iMac (i7, 680MX), both at stock clocks and overclocked at 225/350.

Here are the results at stock GPU clocks, much lower than the 680MX in the benchmark:

Image


And here with the GPU overclocked, almost identical to the overclocked 680MX in that benchmark (the one you posted might be clocked a tad higher):

This makes me feel much better, as 3500 (680MX stock) -> ~5000 (M295X stock) is quite a difference. And if the M295X can be overclocked as well as the 680MX..., that wouldn't be too bad! Sure, it's not 980M good, but...!

Image
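For the curious, the stock-to-stock jump quoted above works out to well over 40%. A quick sanity check, using the rough scores from the post (3500 for the 680MX at stock, ~5000 for the M295X at stock):

```python
# Percentage improvement between two benchmark scores.
# Figures are the approximate scores quoted in the post above.
def percent_improvement(old: float, new: float) -> float:
    """Return the improvement of `new` over `old` as a percentage."""
    return (new - old) / old * 100

print(f"{percent_improvement(3500, 5000):.1f}%")  # 42.9%
```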
 
Good news. That 680MX is indeed overclocked in your benchmark comparison! Just booted into Windows 7 and ran the same tests on my 2012 iMac (i7, 680MX), both at stock clocks and overclocked at 225/350.

Here are the results at stock GPU clocks, much lower than the 680MX in the benchmark:

Image

And here with the GPU overclocked, almost identical to the overclocked 680MX in that benchmark (the one you posted might be clocked a tad higher):

This makes me feel much better, as 3500 (680MX stock) -> ~5000 (M295X stock) is quite a difference. And if the M295X can be overclocked as well as the 680MX..., that wouldn't be too bad! Sure, it's not 980M good, but...!

Image

Well, hopefully this puts that issue to bed then (680MX vs M295X, in the Passmark benchmark at least). I don't know much about overclocking on a Mac, but I gather the 680MX was particularly good for it. The M295X might not have anywhere near the same headroom.
 
Well, hopefully that puts that issue to bed then (for that benchmark anyway). I don't know much about overclocking on a Mac, but I gather the 680MX was particularly good for it. The M295X might not have anywhere near the same headroom.

This is true. I'd be interested to see what the M290X might do in this same benchmark, and whether it's possible to overclock the M295X yet?
 
The thing is, overclocking the 680MX at 250/350 is 100% stable, and temperature is not an issue at all. (Only during a few especially hot summer days did I turn the overclock down to 150/250, but I'm more careful with temps than a lot of other people here.)
Maybe AMD is better suited to the 5K display than current Maxwell GPUs? Don't AMD GPUs run hotter, though? Then the overclocking headroom would be smaller than with the 680MX/Nvidia. I dunno... I play games regularly, and I'll hold on to my late 2012 iMac for now. (Still don't know how the Retina display handles a normal 1440p resolution, whether it's blurrier than a native 1440p non-Retina display...)
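The figures people quote here (250/350, 150/250, 225/350) are core/memory offsets added on top of the stock clocks. As a rough sketch, assuming the commonly cited GTX 680MX stock clocks of 720 MHz core and 1250 MHz memory (an assumption, these are not reported anywhere in this thread):

```python
# Convert overclock offsets (MHz) into absolute clocks.
# ASSUMED stock clocks for the GTX 680MX: 720 MHz core, 1250 MHz memory.
# These stock figures are an assumption, not taken from this thread.
STOCK_CORE_MHZ = 720
STOCK_MEM_MHZ = 1250

def apply_offsets(core_offset: int, mem_offset: int) -> tuple[int, int]:
    """Return (core, memory) clocks in MHz after applying the offsets."""
    return STOCK_CORE_MHZ + core_offset, STOCK_MEM_MHZ + mem_offset

print(apply_offsets(250, 350))  # the 250/350 overclock above -> (970, 1600)
print(apply_offsets(150, 250))  # the hot-weather fallback -> (870, 1500)
```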
 
Have we had a Cinebench benchmark for the R9 M295X vs the 780M already? I really can't remember.

Anyway... here is a link:-
http://www.youtube.com/watch?v=PIMWatn5L3M&t=8m40s

Yes, we have. It's not a good indication of GPU performance, since the OpenGL score is heavily influenced by single-core CPU performance, and the 4790K has the best single-core performance of any Intel CPU (at stock frequencies, of course). Cinebench is a great indicator of how well Cinema 4D will run on your system, and that's about it.
 
Yep, now I'm happy to see these numbers :)

The only thing is, why don't the M290X's numbers look good at all? Why is there such a difference between the 290 and the 295?

And the desktop R9 290X doesn't impress either - http://www.3dmark.com/3dm11/8489778 - 8578

Could it be that the M295X is a complete game changer instead of all the rebrands we're used to seeing? Let's just hope so.

My understanding is that the 290 is based on a two-year-old architecture, while the 295 is based on the new Tonga platform.
 
Has anyone installed Boot Camp and run GPU-Z on an M295X iMac yet?

Just want to know its shader core count.
 
(Still don't know how the retina display works in normal 1440p resolution, if it's blurry or not...)

It isn't. I mean, I'll try it out today when mine arrives, but I've had an rMBP for 2.5 years and 1440p is definitely not blurry.
 
Well, blurry compared to native 1440p on the ordinary iMac... :)

Why would you think it'd be blurry? It looks sharp and clear to me at that res, and I can compare it to my HP work laptop, which does look blurry to me.
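One likely reason 1440p stays clean on these panels: the Retina iMac's 5120x2880 is exactly twice 2560x1440 in each dimension, so each logical pixel maps onto a whole 2x2 block of physical pixels, with no fractional scaling. A quick check of which target resolutions divide evenly:

```python
# Check whether a panel can show a target resolution by whole-pixel duplication.
def integer_scale(panel: tuple[int, int], target: tuple[int, int]) -> bool:
    """True if the panel is the same integer multiple of the target on both axes."""
    return (panel[0] % target[0] == 0
            and panel[1] % target[1] == 0
            and panel[0] // target[0] == panel[1] // target[1])

print(integer_scale((5120, 2880), (2560, 1440)))  # True: clean 2x2 pixel doubling
print(integer_scale((5120, 2880), (1920, 1080)))  # False: needs fractional scaling
```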
 
Good news. That 680MX is indeed overclocked in your benchmark comparison! Just booted into Windows 7 and ran the same tests on my 2012 iMac (i7, 680MX), both at stock clocks and overclocked at 225/350.

Here are the results at stock GPU clocks, much lower than the 680MX in the benchmark:

And here with the GPU overclocked, almost identical to the overclocked 680MX in that benchmark (the one you posted might be clocked a tad higher):

This makes me feel much better, as 3500 (680MX stock) -> ~5000 (M295X stock) is quite a difference. And if the M295X can be overclocked as well as the 680MX..., that wouldn't be too bad! Sure, it's not 980M good, but...!


Two things are true: first, the M295X is a great card and no one should be disappointed in its performance. Second, the 980M is a freak of nature and an even better card, nearly the equivalent of running 780M SLI. I would surmise that Nvidia couldn't produce enough 980Ms to meet Apple's requirements, and AMD was willing to dedicate the 295X to Apple for a period of time, and that is what got the card into the riMac. Just a theory.

Remember that of all the cards Apple could have used, there is literally only one that is superior. With a GeForce 750M still in the MBP, I would call this a win. That Apple hasn't put an 850M, 860M, or 900-series card in the MBP makes me nuts.
 
After looking this up, I guess it doesn't really matter to me which video card I get, when comparing it to my 2012 rMBP. It's going to smoke it.


Image
 