Seems like you can monitor the GPU temperature on OS X with iStat Menus, but not the MHz it's running at.



While playing Portal 2 at 5120x2880 with everything maxed out, the GPU sits at 75-79°C on OS X; that's the only game I have. The fans don't even ramp up.

Funny thing is, 105°C seems to be the maximum, and that's the number everyone here is talking about.

But in the end I have absolutely no idea what I'm talking about here, so I just wanted to throw that in.

Edit: I've tried the VehicleGame example from Unreal Engine 4, and that drives the GPU temperature up to 104°C max.
 
AppleCare will take care of any failures, and if there's a design issue you know Apple will address it, as they have with previous design flaws. Only time will tell if that's the case here. These AMD cards run hot even on the desktop PC side.

ORLY? Talk to the poor 2011 MBP 15" owners about their fried AMD GPUs, then: many are on their 2nd/3rd logic board replacement and are now suing Apple. (So glad I sold mine for a 2012 rMBP.) And you're right about hot AMD desktop cards: the better-cooled ones (which is impossible to do well in the cramped riMac) run much faster than the poorly cooled, throttled ones.

I'm sure many of the 2014 purchasers here will be lining up for the 2015 riMac with a much faster/cooler 20nm NVIDIA/AMD GPU.
 
This is not surprising coming from Apple. Apple's line of MBPs with dedicated GPUs from the 6xx and 7xx series all came gimped and throttled quite heavily under load (due to thermal and power limits). The same holds true for CPUs in the MacBook line. Apple doesn't expect both CPU and GPU to experience full load; that's evident from how they design their heat sinks and fan curves.

Test a 750M against a 660M and the 660M always comes out on top, and everyone wonders why. A gaming load is too demanding, but a session in Final Cut may not be. People who run games in OS X may or may not be aware that almost all of them ship with vsync enabled and no way of turning it off. This limits power usage and therefore heat.
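The vsync point can be made concrete with a little arithmetic: with vsync on, the GPU renders one frame per refresh interval and then idles until the next refresh. A hypothetical sketch (the numbers are made up for illustration, not measurements from any card):

```python
# Illustrative only: estimate GPU duty cycle under a vsync frame cap.
# Numbers are hypothetical, not measurements from any specific card.

def gpu_duty_cycle(render_ms: float, refresh_hz: float = 60.0) -> float:
    """Fraction of each refresh interval the GPU spends rendering.

    With vsync on, the GPU finishes a frame and then waits for the
    next refresh, so utilization is capped at render_time / interval.
    """
    interval_ms = 1000.0 / refresh_hz
    return min(render_ms / interval_ms, 1.0)

# A frame that renders in 8 ms on a 60 Hz panel keeps the GPU busy
# less than half the time; uncapped, the same card would run flat out.
print(gpu_duty_cycle(8.0))    # 0.48
print(gpu_duty_cycle(25.0))   # 1.0 -- can't keep up, vsync no longer helps
```

So a forced 60 fps cap can roughly halve the heat output of a game the GPU finds easy, which is why OS X titles often seem to run cooler than the same game under Boot Camp.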

I'm really happy with how the OP went about explaining this, with proper benchmarks and facts to back it up. Most people claim they don't notice it and say no one else should care. If you don't notice it, you're not a demanding enough user, and there is NOTHING wrong with that. You have nothing to worry about. It's the folks who want to use their 4k+ machine to play games, expecting top-of-the-line hardware to handle a simple game at 1440p. Also nothing wrong with that. Did I buy my MacBook to play Advanced Warfare on ultra? No. I use my Asus 14-inch laptop for that, so I don't have to deal with the noise and my laptop catching on fire. OK, not really.
 
Last night I played several hours of Civ 5 at 5K on OS X. I set the fans manually to max, but the temperature still sat at 98-103°C.
But I don't know how to monitor the M295X to see if it's throttling.
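For what it's worth, even without a clock readout you can infer throttling indirectly: log frame rates over a long session and compare the early average with the late average. A rough sketch of that idea (the window size and tolerance are arbitrary assumptions, not an established method):

```python
# Illustrative sketch: infer throttling from a sustained benchmark run.
# If average fps late in the run is well below fps early in the run
# (after warm-up), the GPU is likely down-clocking under heat.

def looks_throttled(fps_samples: list[float], window: int = 60,
                    tolerance: float = 0.90) -> bool:
    """Compare the first and last `window` fps samples.

    Returns True when late-run fps falls below `tolerance` times
    the early-run fps, which suggests thermal throttling.
    """
    early = sum(fps_samples[:window]) / window
    late = sum(fps_samples[-window:]) / window
    return late < tolerance * early

# Hypothetical run: steady 60 fps for 5 minutes, then sagging to 45.
samples = [60.0] * 300 + [45.0] * 300
print(looks_throttled(samples))  # True
```

It won't tell you the exact MHz, but a frame rate that sags the longer you play, while temperatures sit at the ceiling, is the classic throttling signature.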
 
Those GPU temperatures are way too high for long-term ownership, in my opinion. I've built more gaming systems over the past decade than I can count, and when things start going over 85°C it's worrying. At 100°C you're in the danger zone, where the life of the graphics card will diminish quickly.

Most desktop systems will actually shut themselves down to prevent damage between 105-110°C at the GPU die.

Personally I would not run any current-generation GPU, and that includes the 280, 290, 295 and all the latest NVIDIA and AMD cards for that matter, above 85°C.
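As a rough illustration of what "throttling" means mechanically: the firmware backs the clock off as the die approaches its limit, and cuts power entirely at the ceiling. A toy model, purely illustrative (the thresholds are just the round figures discussed in this thread, not AMD's or Apple's actual algorithm):

```python
# Toy model of a thermal throttle curve -- purely illustrative, not
# AMD's or Apple's actual firmware behaviour. Thresholds are the
# rough figures discussed in this thread.

def clock_scale(temp_c: float, throttle_at: float = 95.0,
                max_temp: float = 105.0) -> float:
    """Return the fraction of the full clock the GPU is allowed.

    Full speed below `throttle_at`; linear back-off between
    `throttle_at` and `max_temp`; hard cut at `max_temp` and above
    (a real card shuts down rather than running at zero).
    """
    if temp_c < throttle_at:
        return 1.0
    if temp_c >= max_temp:
        return 0.0  # emergency shutdown territory
    # Linearly back off from 100% down to 50% across the band.
    span = (temp_c - throttle_at) / (max_temp - throttle_at)
    return 1.0 - 0.5 * span

print(clock_scale(80.0))   # 1.0  -- comfortable
print(clock_scale(100.0))  # 0.75 -- halfway through the band
print(clock_scale(105.0))  # 0.0  -- cut-off
```

The point of the model: a card that spends its time right at the throttle point is, by definition, running slower than its rated clocks, which is exactly the behaviour being reported here.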
 
That's false, in every way, shape and form. You have to realize that even at 1440p it has to do scaling to fit the 5K monitor. Look up the benchmarks: the M295X is essentially a desktop R9 285, which is in every way faster than a mobile 680MX, which was an underclocked desktop 670.

Not correct: the 680MX is an underclocked desktop 680. I can easily overclock the 680MX to match the speed of the desktop 680 at default clocks, and it never throttles. That's pretty decent. The M295X, on the other hand, seems to throttle frequently because of overheating, so don't even think about overclocking there. I'm definitely waiting for the next model, as I game quite a bit. As Quu also said, those high GPU temperatures do take a toll on the electronics in the long run.
 

Lol someone claiming they can overclock an iMac. This is just... brilliant.

----------

If a game is running better on a 2GB 680MX with a 3.4GHz i7 than on a two-years-newer 4GB M295X with a 4GHz i7 at the SAME resolution, due to outrageous temperatures and instant GPU throttling, I fail to see your logic. I don't really care about 3DMark. I care about actual in-game performance. You could put GTX 980s in SLI in an iMac, but if they throttle the whole time, what's the point?


They're not running at the same resolution: one is pushing nearly double the pixels. Even if it's not RENDERING at that resolution, it still has to OUTPUT at that resolution.

You could also force the fans on before the throttling kicks in, but who am I to argue with someone who's obviously so well versed in this that s/he thinks a 680MX is more powerful than an M295X.
 
These forums are the worst thing in the world for someone with one of these machines on order.
True. I ended up canceling mine over the heat and noise issues. Can't really bring myself to pay 3k € for something I might not be 100% happy with.

Damn shame because it's a beautiful machine and the display is the best thing ever.
 
I'm hoping you reordered with an i5/M290X, where there are zero heat/noise issues?

I'm thrilled with mine. Maybe I'll need more CPU/GPU someday for some specialized task, but this should return my investment over 2 years on the screen alone.

Reading PDFs fullscreen :) showing more code on-screen :) actually proofing type/documents on-screen :) watching hi-res video without taking up half the screen :) usable small thumbnails :) amazing detail on large maps and data visualizations :)

Makes me feel like my 2012 rMBP (which I still like) was just a taste of things to come.
 

You can overclock any NVIDIA/AMD card in any laptop, desktop, or iMac very easily.

If they're running the same test at 1080p, the native screen resolution doesn't matter: 1080p on a 5K screen is the same as 1080p on a 1080p screen. Scalers built into the monitor's chipset scale the image, which doesn't put any extra load on the GPU, so the tests are comparable.
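For reference, here are the raw pixel counts behind this argument, since GPU shading work scales roughly with the resolution actually being rendered, whatever panel it's then shown on:

```python
# Pixel arithmetic behind the render-vs-output debate. GPU shading
# work scales roughly with the pixels actually rendered; upscaling
# 1080p to the 5K panel is then comparatively cheap, whether the
# display pipeline or a scaler handles it.

def pixels(width: int, height: int) -> int:
    return width * height

res_1080p = pixels(1920, 1080)    # 2,073,600
res_1440p = pixels(2560, 1440)    # 3,686,400
res_5k    = pixels(5120, 2880)    # 14,745,600

# 5K is ~7.1x the shading work of 1080p and exactly 4x that of 1440p.
print(res_5k / res_1080p)  # ~7.11
print(res_5k / res_1440p)  # 4.0
```

That 4x and 7x gap is why native 5K gaming is out of reach for any single mobile GPU of this generation, and why the benchmarks in dispute are run at 1080p/1440p in the first place.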
 

You are trolling really hard, and clearly don't understand how resolution works. 1080p is 1080p, no matter what display it's outputting to. That's why running 480p on a 480p display results in the same frame rate as running 480p on a 4K display.

And what do you mean, someone "claiming" they can overclock an iMac? Have you not read the threads of yesteryear, where the 680MX proved an absolute BEAST of an overclocker in the 2012 iMacs? And guess what? It didn't throttle while overclocked, all the while running cooler than an M295X at stock clocks.

Please, if you're not going to be polite, don't respond any further.
 

"BEAST of an overclocker"

*glances at my 7970 running at 1.7X stock clocks*

*glances back at thread*

*glances at open Xcode project for a game I'm working on, specifically for Retina displays*

*glances back at thread*

Yup I really don't understand how this all works, please do tell me.

EDIT - AMD knows that the Hawaii and later series cards run hot. They're supposed to run hot and not throttle. Apple's driver/BIOS implementation must be bonkers; the thermal design is definitely good enough to keep it cool.

http://www.bit-tech.net/hardware/graphics/2013/10/28/is-the-amd-radeon-r9-290x-too-hot/1
 

Resolution really is that simple. Apple's only issue here is either thermal paste quality control, or they don't expect users to fully load the GPU for long periods of time. Apple must know what they're doing though... right...

Render resolution is different from display resolution.
 
Well, it's the iPad 3 all over again. Resolutionary indeed, but the internals are not holding up very well to feed the new resolution. I'd wait the next 2 or even 3 years to see how it holds up. I believe the 5K display will be cheaper too; maybe the base Retina model will be $1999 by then.

Yes, I have the iMac with the 680MX, and I'm thankful I kept it. It runs very stable, with no problems whatsoever. I've had Macs with Radeons, and I can tell you they always get hot, like really hot, and after some months they just failed. I don't like it; I don't like Radeons in Macs. I want NVIDIA in this. Why, Apple?

Meanwhile, Macs from 2012-2013 with NVIDIA graphics rarely had graphics problems. Apple should've stuck with the 980M. Bad move, Tim.
 
As a prospective near-future purchaser, I guess the REAL question is: are there any documented cases of one of these new riMacs FAILING due to heat, or are they just throttling?

but the internals are not holding very well to feed the new resolution.

Just because 105°C is HOT does *NOT* mean it's outside the operational parameters of the hardware. As for the throttling, what do you expect from this thin a form factor? There's only so much room for heat dissipation under these circumstances.

Not saying you're wrong to be concerned, but even my desktop 780 Ti has "dynamic clock speed" under stock fan profiles.
 

Well, the answer is we don't know YET. It's a completely new concept to pair a 5K display with a ridiculously thin chassis and the limited space for a computer inside. And AFAIK, there are no consumer-grade computers designed to work well above 100°C.

Fact is, my 2012 iMac is NOT HOT at all, even though it's thinner than the old one. Under heavy load the fan revs up, but I could tell the old, 1"-thick iMac ran hotter. So by that measure, the culprit here is either the nature of the M29x chip itself, or the chip being pushed beyond its limits to feed the 5K.
 
It's funny how people are OK with 100+°C GPUs; man, that's hot :eek: . I do hope Apple's refresh of the riMac will include NVIDIA GPUs, because AMD's GPUs generally run hotter than NVIDIA's, and in a tight enclosure like the iMac we need CPUs and GPUs that run cool. As for all you buyers of the riMac with the M295X, we'll hear from you in the near future, I believe ;)
 
Jesus, all I see in this thread are (with the exception of one or two posters) a bunch of whining fools. Do you honestly believe Apple didn't at least run these against something like Crysis 3 under Boot Camp?

If you were in Apple/AMD R&D researching a suitable card for the iMac, would you not at least have some high-end games on your list of things to tax the GPU with? I realise gaming isn't their target market, but obviously they can't release a PC that will immediately fail if someone runs a game on it.

Therefore I think we can deduce one of the following:

1) Driver updates in both OS X & Windows will alleviate a lot of this.
2) The unique aluminium design of the iMac allows the GPU to safely operate at these temps.
3) The temperatures under OS X & Windows aren't being read properly.

Whatever the answer is, I'm willing to wait for a more authoritative testing process before binning my iMac.

Also for WilliamG and all other order cancellers, please grow a pair.
 

If you look back at Apple's history, you'd see this would not be the first time GPUs have failed because of heat. Don't you think they tested them with games back then?
It's not unlikely that they saw the potential risk but decided to go ahead anyway, because the potential profits would outweigh the potential losses.

That said, it might not become a problem! But there is reason to be concerned.
And it does not perform that much better than the 290X when it comes to games, which is probably because of the throttling. It's probably not an upgrade you'd want to spend that kind of money on.

Yes, there might be a driver update that addresses it. But that's unlikely, in my opinion. Perhaps they'll throttle it even more to reduce the heat? ;)
 


And do you think they would want to repeat previous mishaps, given the time and money needed to resolve them? Also, your statement about it being only a bit better than the 290X is BS at this stage... I can almost guarantee driver/firmware updates will improve the situation in the future...

----------

Who said I cancelled? I'm typing on my RiMac I've had for a week now. :p

Then stop bitching about hardware that has hardly been tested or used in anger, and making wild statements you cannot substantiate.
 

Check out the gaming benchmarks here: http://www.barefeats.com/imac5k6.html

Yes, there is a difference. But is that difference worth $250? Keep in mind that the i7 likely makes up some of it as well.

Now, as I said earlier, in pro apps we might see more of a performance increase with the M295X when it's used for shorter periods of time. But as far as gaming goes, it does not seem worth the upgrade.

I have not said that it's guaranteed to fail, just that there is a reason for concern. And I have hoped for driver updates from Apple to address issues enough times that I do not feel confident in that happening. It might! But I wouldn't make a purchase decision based on that.
 

WilliamG isn't making wild statements. In fact, he is a lot more informed than many of the recent posters in this thread. Anyway, the riMac is a great machine, but maybe this model isn't ideal for long gaming sessions every week. 100°C+ GPU temps are pushing it pretty far, and what effect that will have on the internal electronics in the long run is hard to say. It's reasonable to expect it will affect the lifespan of the GPU and possibly nearby electronics. A PC builder wouldn't accept those kinds of temps.
 
They're not running at the same resolution: one is pushing nearly double the pixels. Even if it's not RENDERING at that resolution, it still has to OUTPUT at that resolution.

While I'm no expert on this subject, I believe this to be false. I believe that once you change the output resolution, the monitor receives the signal from the video card at that resolution and then does the scaling itself.
So the video card only outputs that given resolution.

Obviously, that doesn't apply to retina mode, but that's not what's being discussed here.

If you have information to the contrary, please provide your source.
 

I'm trying to remember the last time drivers fixed an overheating problem other than by throttling the GPU's speed. Your faith in Apple is admirable, but Apple only cares about two things: that it lasts through the warranty period, after which it's no longer their problem, and that gamers are such a small subset of users that they won't really affect the overall sales picture.
 