
Fenn

macrumors member
Original poster
Dec 10, 2012
40
0
I have been working on benchmarks for around 50 games and apps over the past week, a continuation of some of the work I did when the 2012 680MX iMacs were released two years ago. I am comparing performance on a myriad of metrics across the fully loaded 2012 and 2014 iMac models, including release driver performance versus mature driver performance versus overclocked performance, at 2560x1440 and 4K/5K resolutions.

I seem to have hit a stumbling block that I cannot resolve and figured I would ask here for help before I post my results. While the 4K/5K performance in Boot Camp and OS X isn't as bad as I thought it would be, the M295X iMacs are performing nowhere near where they should be, especially at 2560x1440. There have also been several instances in which I experienced micro stutters that randomly dropped the framerate to almost half for a split second. In fact, in a dozen or more tests, the M295X actually performed worse than the 680MX at stock clock speeds: a literal decrease in performance for something that should be blowing a two-year-old card out of the water.

However, I seem to have come across an issue with the core clock throttling itself long before it should. Everywhere I look, it appears the stock M295X is intended to start throttling at 105°C and, until then, with all of the power options on Max, should run at an 850MHz core clock. However, with a clean Win7 install, the M295X starts throttling itself literally seconds after use, almost as soon as it hits 70°C. I can watch this with any of the various core/memory/usage/temperature apps in Windows, and you can see a perfect cosine curve in the core clock as it fluctuates, in my case between 720MHz and 762MHz (I have included a quick photo below). The clock always starts out at 850MHz, so I know these aren't underclocked, but for the life of me I cannot find a way to lock the clock at 850 or disable PowerPlay (which seems to be AMD's temperature-based throttle). The higher the temperature gets, the lower the clock runs, which makes repeat benchmarks an absolute PITA, and my numbers are all over the place.

I would like to confirm this is also an issue under Yosemite. Does anyone know of a GPU tool that can show a visual graph of clock speed against temperature? The best I could find are generic lists of core/memory speeds and GPU usage/temperature, but nothing that can show core throttling.

I don't mean to be alarmist, but I have a feeling that these M295X iMacs may have a lowered temperature curve for enabling throttling, and if this is the case, we are getting nowhere near the performance that we should be getting. Losing over 100MHz of core clock is a huge hit to performance, and as the clock throttles, the micro stutters that can manifest are extremely annoying.
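To put that loss in perspective, here is a quick back-of-the-envelope sketch. It assumes frame rate scales roughly linearly with core clock, which is only a first-order approximation, but it shows why a drop from 850MHz matters:

```python
# Rough estimate of performance lost to throttling. Assumes frame
# rate scales linearly with core clock -- a first-order
# approximation, not an exact model.

STOCK_CLOCK = 850  # MHz, the advertised M295X core clock

def perf_loss(observed_mhz: float, stock_mhz: float = STOCK_CLOCK) -> float:
    """Fractional performance deficit relative to the stock clock."""
    return 1.0 - observed_mhz / stock_mhz

# The throttled clocks observed above:
for clock in (762, 720):
    print(f"{clock} MHz -> ~{perf_loss(clock):.1%} below stock")
# 762 MHz -> ~10.4% below stock
# 720 MHz -> ~15.3% below stock
```

So even the best case of the oscillation is giving up roughly a tenth of the card's potential.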

Unfortunately, there is no way around it: heat is a tremendous problem with these new iMacs, and I fear we have a gimped M295X because of it. Across 6 generations of iMacs, 3 Mac Pros and 2 G5s, I have never had a computer that hit 104°C after 7 minutes of playing a game or rendering 3D, and this new iMac shocks me. I seriously question the longevity of these machines; from an engineering standpoint, nothing on the market right now is designed to run over 100°C continuously without failing. Once you add in dust, ambient heat during a non-winter season, and months of use, I would be surprised if these machines lasted longer than a couple of years without essentially burning themselves out. We can debate the 'dream' 980MX vs. M295X all day, but Apple chose to get to market with a Retina iMac, and the only option was a card substantially hotter than it should be. The entire AMD 290 line has had heat as a controversy since its release last year. The problem is that the thermal envelope on these cards is too hot for the cooling that this iMac form factor has to offer. If the GPU sits at 104°C while the card is oscillating between 720MHz and 762MHz, we know there is no way in heck we are going to get a sustained 850MHz core clock out of these cards. Even if we do find a way to stop the card from throttling and lock it at 850MHz, it will likely far exceed the 105°C limit and bring about instability. This is why I fear this issue has nothing to do with drivers and is probably hardcoded into the card BIOS.

I do not want to spoil anyone's fun; the screens on these iMacs are the best I have ever looked at, bar none, and an absolute dream to work on. Just the brightness and color contrast alone from not having a visible LCD gate separating the pixels is gorgeous. People with needs that won't tax the GPU won't have any issues and will LOVE this machine and its breathtaking screen. But the heat and throttling I have experienced are a big problem for anyone wanting full performance. As it stands, is the screen worth a GPU that is only 3-5 fps better than a stock 680MX, and worse than an overclocked 680MX? My heart is breaking; I would have paid anything for a 980MX :( Thoughts?
 

Attachments

  • M295XThrottle.png (637.6 KB)

WilliamG

macrumors G3
Mar 29, 2008
9,922
3,800
Seattle
Interesting results. I have Windows 8 installed, and I'm curious what software you used to check whether the GPU core is throttling. I have the high-end RiMac and so can check on this end with Win 8.
 

Fenn

macrumors member
Original poster
Dec 10, 2012
40
0
The easiest way for anyone to check this is to use GPU-Z.

http://www.techpowerup.com/gpuz/

The sensors tab has several bits of information related to the function of your system. Open the 3D game of your choice and go into windowed mode. From there you can watch the sensors in GPU-Z. Pay attention to GPU Core clock and GPU temperature. You can watch the oscillation of the throttling versus the rising temperature there.

You can also use MSI Afterburner which is nice because you can mouse over the graph and see the various slices of performance as it relates to time. My screenshot above is from Afterburner, but I originally noticed the issue in GPU-Z.

http://event.msi.com/vga/afterburner/download.htm
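If you want to go beyond eyeballing the graphs, GPU-Z can also log the Sensors readings to a comma-separated text file ("Log to file" on the Sensors tab), which makes it easy to count throttle events after a run. Here is a minimal Python sketch; the column headers and the default log filename are assumptions that vary by card and GPU-Z version, so check the first line of your own log and adjust:

```python
# Minimal sketch: scan a GPU-Z sensor log for throttle events.
# The header names and log filename below are assumptions -- verify
# them against the first line of your own log file.
import csv
import os

STOCK_CLOCK = 850.0                    # MHz: expected unthrottled clock
CLOCK_COL = "GPU Core Clock [MHz]"     # assumed header name
TEMP_COL = "GPU Temperature [C]"       # assumed header name

def find_throttle_events(path, tolerance=5.0):
    """Return (temperature, clock) pairs where the core clock sits
    more than `tolerance` MHz below stock."""
    events = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            try:
                clock = float(row[CLOCK_COL])
                temp = float(row[TEMP_COL])
            except (KeyError, TypeError, ValueError):
                continue  # blank trailing fields or header mismatches
            if clock < STOCK_CLOCK - tolerance:
                events.append((temp, clock))
    return events

# Typical usage after a benchmark run (assumed default log filename):
log = "GPU-Z Sensor Log.txt"
if os.path.exists(log):
    events = find_throttle_events(log)
    if events:
        worst_temp, worst_clock = max(events)  # hottest throttled sample
        print(f"{len(events)} throttled samples; "
              f"{worst_clock:.0f} MHz at {worst_temp:.1f} C")
    else:
        print("No throttling detected")
```

That gives you a hard count of throttled samples per run instead of trying to read it off a moving graph.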
 

vir3l

macrumors member
Oct 30, 2014
41
1
I haven't experienced GPU performance loss yet (but I didn't measure it either), but heat and fan noise alone made me give my iMac 5K M295X back. I'm convinced every gamer feels the same.
 

pcconvert

macrumors member
Oct 24, 2008
69
0
Thanks for your analysis, Fenn; it's logical and makes sense. It should make anyone who thinks there is a free lunch or some sort of Apple magic think twice. The Retina iMac's time will come when laptop-class GPUs can handle 5K, but it sure isn't now.
 

WilliamG

macrumors G3
Mar 29, 2008
9,922
3,800
Seattle
The easiest way for anyone to check this is to use GPU-Z.

http://www.techpowerup.com/gpuz/

The sensors tab has several bits of information related to the function of your system. Open the 3D game of your choice and go into windowed mode. From there you can watch the sensors in GPU-Z. Pay attention to GPU Core clock and GPU temperature. You can watch the oscillation of the throttling versus the rising temperature there.

You can also use MSI Afterburner which is nice because you can mouse over the graph and see the various slices of performance as it relates to time. My screenshot above is from Afterburner, but I originally noticed the issue in GPU-Z.

http://event.msi.com/vga/afterburner/download.htm

Interesting. Ok. So I installed MSI Afterburner. I've been running Darksiders II for about 20 mins at 1440p in a window. GPU temp is pegged at 106°C now. Core clock is holding between 829 and 843 MHz. It fluctuates between those numbers, but never lower.

Windows 8.1.
 

Fenn

macrumors member
Original poster
Dec 10, 2012
40
0
Interesting. Ok. So I installed MSI Afterburner. I've been running Darksiders II for about 20 mins at 1440p in a window. GPU temp is pegged at 106°C now. Core clock is holding between 829 and 843 MHz. It fluctuates between those numbers, but never lower.

Windows 8.1.

That's great, though I am surprised yours hits 106 degrees, yowza. If you can, also try something that is very graphically demanding. I'll try Darksiders II when I get home. It wasn't one of the games in my initial benchmark suite.
 

WilliamG

macrumors G3
Mar 29, 2008
9,922
3,800
Seattle
That's great, though I am surprised yours hits 106 degrees, yowza. If you can, also try something that is very graphically demanding. I'll try Darksiders II when I get home. It wasn't one of the games in my initial benchmark suite.

Bad news. I switched to Diablo 3... Ran it full screen (2560x1440, non-Retina of course, Windows 8).

Yep, the M295X is an inferior card to the GTX 680MX. The frame-rate fluctuations are insane after only a couple of minutes of gaming.

And I just sold my 2012 iMac. 2 years later, and now gaming is really not fun on the RiMac. Just... sad...

I hadn't really had any time to play anything since I got the RiMac, and so this comes as a bit of a sad shock. I'm hoping Apple can make some magic happen, or I absolutely fail to see the point of a 4GB M295X over my old 2GB GeForce GTX 680MX, which never, ever throttled, even when heavily overclocked. That card was a beast! Heck, the M290X might be the better buy if it never throttles.

Once again, I am a sad panda..
 

Attachments

  • IMG_4136.JPG (2.6 MB)
Last edited:

spyguy10709

macrumors 65816
Apr 5, 2010
1,007
659
One Infinite Loop, Cupertino CA
Bad news. I switched to Diablo 3... Ran it full screen...

Yep, the M295X is an inferior card to the GTX 680MX. The frame-rate fluctuations are insane after only a couple of minutes of gaming.

And I just sold my 2012 iMac. 2 years later, and now gaming is impossible on the RiMac. Just... sad...

The M295X at 5K is an inferior card to what the GTX 680MX was at 1440p. Huge, huge, huge difference.
 

WilliamG

macrumors G3
Mar 29, 2008
9,922
3,800
Seattle
The M295X at 5K is an inferior card to what the GTX 680MX was at 1440p. Huge, huge, huge difference.

NO! The M295X at 1440p is an inferior card to the GTX 680MX at 1440p.

Anyone reading this: The M295X is just not good enough. If I hadn't already sold my 2012 iMac with GTX 680MX, I'd probably return this system. Yes, the Retina display is amazing, but next year will probably be a 980M or newer variant, and I'll just sell this iMac.

Apple screwed this one up. I had a feeling they had all along.. *sigh*
 

Fenn

macrumors member
Original poster
Dec 10, 2012
40
0
Yup, that is the issue I have noticed. It is very apparent when you can see the graph of core clock fluctuations. I will continue to investigate. I would love it if someone with a 290X could run some tests as well. My benchmark findings are gruesome even at 1440p, and I want to make sure I have covered all my bases before I report.
 

steve62388

macrumors 68040
Apr 23, 2013
3,090
1,944
Throttling is certainly a possibility, although I find it peculiar that neither Barefeats nor Ars Technica picked it up and mentioned it in their benchmarking. I wonder if there is a wide difference from machine to machine?

I wish Anandtech would pull their finger out and get a review published. Does anyone remember how long it took them to publish before Anand left? I saw one of their tweets (can't find it right now) saying the iPad Air 2 review is out at the beginning of next week, and the riMac the week after, but I don't know which GPU they got. If they didn't get the M295X, I guess we will still be none the wiser.
 

WilliamG

macrumors G3
Mar 29, 2008
9,922
3,800
Seattle
Throttling is certainly a possibility, although I find it peculiar that neither Barefeats nor Ars Technica picked it up and mentioned it in their benchmarking. I wonder if there is a wide difference from machine to machine?

I wish Anandtech would pull their finger out and get a review published. Does anyone remember how long it took them to publish before Anand left? I saw one of their tweets (can't find it right now) saying the iPad Air 2 review is out at the beginning of next week, and the riMac the week after, but I don't know which GPU they got. If they didn't get the M295X, I guess we will still be none the wiser.

Throttling isn't just a "possibility." It's doing it. Period.

Yup, that is the issue I have noticed. It is very apparent when you can see the graph of core clock fluctuations. I will continue to investigate. I would love it if someone with a 290X could run some tests as well. My benchmark findings are gruesome even at 1440p, and I want to make sure I have covered all my bases before I report.

I'd like to see what your findings are in all cases. I don't think you're doing anything wrong at all. BTW, the reason my Darksiders II results in a window were fine is that the game wasn't maxing out the GPU.

This is just pretty depressing. Quite honestly, I feel pretty darn unhappy about paying the extra for the M295X.
 

wubsylol

macrumors 6502
Nov 6, 2014
381
391
Can you guys elaborate on your testing a little further?
I'm confused as to why nobody else is picking this up.
 

rainydays

macrumors 6502a
Nov 6, 2006
886
0
This is very interesting! Finally a clue as to why gaming benchmarks don't show a huge difference between the M290X and the M295X.

However, the M295X should still perform a lot better in the pro app area, where it's pushed in smaller bursts rather than all the time like in games.
 

WilliamG

macrumors G3
Mar 29, 2008
9,922
3,800
Seattle
This is very interesting! Finally a clue as to why gaming benchmarks don't show a huge difference between the M290X and the M295X.

However, the M295X should still perform a lot better in the pro app area, where it's pushed in smaller bursts rather than all the time like in games.

Yes, agreed.

As an update, running Diablo 3 at 1440p in OS X yields much better results than in Windows 8. I have not encountered any noticeable throttling in OS X, and the frame rate runs a solid 60 fps (confirmed with CMD+R in game to bring up the fps counter) without any frame stuttering. This is interesting, since on my 2012 iMac with the GTX 680MX, Windows performance in Diablo 3 was far superior to OS X Yosemite's. Now, with the M295X and the Windows throttling, it's the opposite!

It would be interesting to see some GPU clock speed info in OS X to see if the throttling is exclusive to Windows or not. I'm hoping the throttling can be fixed in Windows with newer drivers or something.. Hoping... hoping... hoping...

More testing to be done in OS X, but this is promising in OS X land, that's for sure.
 

forg0t

macrumors member
Aug 13, 2014
89
0
Yes, agreed.

As an update, running Diablo 3 at 1440p in OS X yields much better results than in Windows 8. I have not encountered any noticeable throttling in OS X, and the frame rate runs a solid 60 fps (confirmed with CMD+R in game to bring up the fps counter) without any frame stuttering. This is interesting, since on my 2012 iMac with the GTX 680MX, Windows performance in Diablo 3 was far superior to OS X Yosemite's. Now, with the M295X and the Windows throttling, it's the opposite!

It would be interesting to see some GPU clock speed info in OS X to see if the throttling is exclusive to Windows or not. I'm hoping the throttling can be fixed in Windows with newer drivers or something.. Hoping... hoping... hoping...

More testing to be done in OS X, but this is promising in OS X land, that's for sure.


This may be because the 295X has no proper driver support for Windows yet; it's an OS X-only GPU so far. Which sucks, since games are optimized for Windows drivers.
 

habeebhashim

macrumors member
Jun 16, 2009
60
0
Singapore
Sorry but the 680MX is inferior to the M295X by 20%-25%

Not if the 295X keeps throttling due to heat. I have to check this.

In the games I play, I reported before that I get temps of 105°C... but no apparent dropped frames due to throttling. Even if I didn't notice it playing Shadow of Mordor, Tomb Raider or Dota 2, I most definitely should notice dropped frames when playing Counter-Strike: GO** Most of my gaming sessions last at least an hour at a stretch.


**CS:GO, or any Steam game for that matter, doesn't really tax the GPU that much, I think. Maybe I need to get Crysis or something like that.
 

Serban

Suspended
Jan 8, 2013
5,159
928
The 295X has no drivers for WINDOWS!
Every game under OS X runs better than or the same as on the 780M.

With the 295X under OS X I have never seen more than 75°C on the GPU.
 

iczster

macrumors member
Oct 23, 2014
95
4
This is concerning, and I've just pulled the trigger on a fully loaded RiMac. Should I be looking at cancelling? I'm a casual gamer.
 

tillsbury

macrumors 68000
Dec 24, 2007
1,513
454
This is concerning, and I've just pulled the trigger on a fully loaded RiMac. Should I be looking at cancelling? I'm a casual gamer.

I really don't understand these figures. I'm a casual gamer and find it amazing how smoothly the riMac runs the games I want to play at 5K without breaking a sweat. I suppose it depends on what you mean by "casual gamer". What is that for you?
 