
What GPU heat did you get?


I believe that this is completely flawed logic. My understanding is that on a Retina iMac, when you are running at 2560x1440 you are still driving 14.7 million pixels (even if you are not at 5120x2880 resolution). That is how it is able to achieve a significantly crisper 'retina' display.

If you weren't using all 14.7 million pixels on the display, then 3 out of every 4 pixels would be black. That is certainly not the case.
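For anyone who wants the arithmetic behind the "14.7 million" and "3 out of every 4 pixels" figures, here is a trivial sketch (plain pixel counting, no claims about what the GPU actually does with them):

[code]
// Pixel counts for the two resolutions being discussed.
let retinaPixels = 5120 * 2880      // 14,745,600 ≈ 14.7 million
let classicPixels = 2560 * 1440     // 3,686,400  ≈ 3.7 million

print("5K panel:   \(retinaPixels) pixels")
print("1440p mode: \(classicPixels) pixels")
print("ratio:      \(retinaPixels / classicPixels)x")   // exactly 4
[/code]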
I don't think you quite understand how 3D rendering and display work on your Retina... When you run a 3D game on your Retina iMac at 2560x1440, it is not crisper than on the classic 27" iMac; it is EXACTLY the same! Only 2560x1440 pixels are calculated by the GPU, and then each square of 4 physical pixels displays the same value, lighting them all.
Try running a benchmark or a game at the native resolution of your panel (5120x2880): then your game will be crisper than on the classic iMac, but performance will obviously be lower.
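To make the "square of 4 pixels displays the same value" part concrete, here is a minimal nearest-neighbour pixel-doubling sketch; the toy framebuffer and function name are mine for illustration, not anything the iMac's scaler or OS X actually exposes:

[code]
// Toy framebuffer: one Int per pixel, row-major.
// Each source pixel is copied into a 2x2 block of the output, so a 1440p-style
// image fills a panel with twice the width and height without any extra shading.
func pixelDouble(_ src: [Int], width: Int, height: Int) -> [Int] {
    var dst = [Int](repeating: 0, count: width * 2 * height * 2)
    for y in 0..<height {
        for x in 0..<width {
            let value = src[y * width + x]
            for dy in 0..<2 {
                for dx in 0..<2 {
                    dst[(y * 2 + dy) * (width * 2) + (x * 2 + dx)] = value
                }
            }
        }
    }
    return dst
}

// A 2x2 "rendered" image becomes a 4x4 panel image: 4 shaded values light 16 pixels.
print(pixelDouble([1, 2, 3, 4], width: 2, height: 2))
[/code]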
 

Perhaps 14.7 million pixels are not being 'calculated', but 14.7 million pixels are being 'displayed', and thus they are being 'moved around' on screen. I really do not think that displaying 14.7 million pixels on one display can be accurately compared to displaying 3.7 million pixels on another display. Regardless of the calculations the benchmark performs, it is not an accurate comparison.

The only real way to make an absolute comparison would be to put an R9 M295X in a non-Retina machine and benchmark it. Alas, at this time that is not possible as far as I know.

I am pretty sure that the target market for the Retina iMac is people who want to take advantage of the newest screen technology (5K). It is not a product designed for people who want to maximize gaming performance at a previous-generation display resolution (1440p).
 
I'm quite sure it wasn't designed for people who want to play at 2560x1440 with a "significantly crisper 'retina' display" ;) #troll
 
The only real way to make an absolute comparison would be to put an R9 M295X in a non-Retina machine and benchmark it.

While not possible with current iMacs, you can use the mid-2012 MacBook Pro to compare non-Retina with Retina performance. Those systems came with the same CPUs and GPUs. According to Macworld's testing, the classic and Retina systems with the 2.6 GHz CPU and 1 GB GPU got the same Cinebench OpenGL scores, while the former was faster in Portal 2 by about 8 percent. That's not insignificant, but far from a huge margin.
 
I'm quite sure it wasn't designed for people who want to play at 2560x1440 with a "significantly crisper 'retina' display" ;) #troll

I was not implying that the Retina display gives you a crisper picture when playing games at 1440p. I was explaining that, the way the technology works, you have 4x the pixel count being displayed on screen at all times (3D calculations aside), and there is no way to shut off 3 out of every 4 pixels to generate an accurate comparison.

You seem to still disagree with this, and I still believe that your original logic is flawed.
 
I agree with you for 2D: the GUI is 4 times more demanding. But for 3D, to me it is not the same: what is really demanding is the 3D computation (moving 3D objects with their textures), and that is done at 1440p. I don't think that sending the same value to 4 pixels is very demanding on the GPU or the TCON. But I don't think we can verify that...
 
The only real way to make an absolute comparison would be to put an R9 M295X in a non-Retina machine and benchmark it. Alas, at this time that is not possible as far as I know.

Maybe it is possible. I have an Apple LED Cinema Display attached. Is it possible for me to completely shut off the 5K display?
 
No need to do that. Just go into Displays in System Preferences, hold down Option while clicking on the Scaled modes, tick "Show low resolution modes", and pick 2560x1440 (low resolution). Voila.
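If anyone wants to double-check from code which modes the panel actually exposes (including the "low resolution" duplicates the Option-click trick reveals), a small CoreGraphics sketch like this should do it; it only lists modes rather than switching them, and it assumes the built-in 5K panel is the main display:

[code]
import Foundation
import CoreGraphics

// Ask CoreGraphics for every mode of the main display, including the
// low-resolution (non-HiDPI) duplicates that are normally hidden.
let display = CGMainDisplayID()
let options = [kCGDisplayShowDuplicateLowResolutionModes as String: true] as CFDictionary

if let modes = CGDisplayCopyAllDisplayModes(display, options) as? [CGDisplayMode] {
    for mode in modes {
        // width/height are the logical size; pixelWidth/pixelHeight are the
        // actual framebuffer the GPU has to fill.
        print("\(mode.width)x\(mode.height)  framebuffer \(mode.pixelWidth)x\(mode.pixelHeight)")
    }
}
[/code]

A 2560x1440 "low resolution" mode reports a 2560x1440 framebuffer, whereas the default scaled 2560x1440 (HiDPI) mode reports a 5120x2880 framebuffer, which is exactly the difference being argued about here.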
 
That would still render at 2560x1440 and upscale it to 5K, though, no?
 
I was not implying that the Retina display gives you a crisper picture when playing games at 1440p. I was explaining that, the way the technology works, you have 4x the pixel count being displayed on screen at all times (3D calculations aside), and there is no way to shut off 3 out of every 4 pixels to generate an accurate comparison.

You seem to still disagree with this, and I still believe that your original logic is flawed.

That's what a GPU does, though; that's where the heat generation comes from and, eventually, the throttling. Lowering the resolution reduces the calculation, i.e. the load.

That is why a GPU will benchmark higher at lower resolutions. If merely lighting up extra pixels as a group were just as much work as running them independently, then lowering the resolution wouldn't increase FPS for the reasons you stated. Besides, it's just a built-in monitor, no different than if we were testing a similarly specced desktop box with any given monitor. Resolution set, signal sent, the display components do the display legwork.

Post #19 in this thread is an example. The poster accidentally ran at 1080p and got much higher results.
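As a back-of-the-envelope illustration of why the rendered resolution, not the panel, drives the FPS numbers, here is a crude fill-rate-bound model (the 40 fps baseline and the strictly proportional scaling are invented for the example; real engines won't scale this cleanly):

[code]
// Crude model: if a scene is fragment/fill-rate bound, frame time grows roughly
// with the number of pixels the GPU has to shade, so FPS scales inversely with it.
func estimatedFPS(baselineFPS: Double, baselinePixels: Double, targetPixels: Double) -> Double {
    return baselineFPS * baselinePixels / targetPixels
}

let p1080 = 1920.0 * 1080.0
let p1440 = 2560.0 * 1440.0
let p5K   = 5120.0 * 2880.0

let base = 40.0   // hypothetical 40 fps at 1440p
print("1080p:", estimatedFPS(baselineFPS: base, baselinePixels: p1440, targetPixels: p1080))  // ~71 fps
print("1440p:", base)                                                                         // 40 fps
print("5K:   ", estimatedFPS(baselineFPS: base, baselinePixels: p1440, targetPixels: p5K))    // 10 fps
[/code]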
 
Guys, remember to turn on the fans ;)


Fans on Auto (Macs Fan Control): max 101°C
Fans at max: 91/92°C
Fans at min: 106°C + GPU throttling @ 6xx MHz

 
I was more getting at the fact that you think this is a viable solution.

Manual fan control on a £2,500 computer, blasting the fans that users are already complaining are too loud, too often.
Macs Fan Control was used mainly as proof of the fan speed.
The test was done in Windows to answer the people claiming higher temps in Windows than in OS X; in OS X the result is the same.
In OS X there's no need for manual control: during the test the max temp was 101°C with the fans at 1800 rpm, so not loud to my ears.
 

Could you post your Mac benchmark result screenshots? Sounds like you've got a one-in-a-million 5K iMac there. 1800 rpm at 101°C...?!
 
In OS X it was even better: 99°C average with 1700 rpm

[url=http://s1.postimg.org/7drz505lr/valley_bench.png]Image[/url]

I think you are missing the point here. My M290X scores only a few fps less, with a max GPU temp of 85°C and 1370 rpm! I hope the M295X is simply drawing too much vcore (heat) and that it can be fixed via an SMC update from Apple and/or AMD drivers. The M295X should beat this M290X by at least a good 20%, but it doesn't at the moment!

;)

 

Man, you are missing the point: the thread is about heat, not performance :)
 
?

He's talking about heat. 85°C is a temperature (heat). It just so happens to come at the same performance level.
Real-life tasks are not Valley benchmarks. Of course the gap is not huge, but there is an improvement with the M295X, and honestly I don't care about the temps if the system is designed to work at these levels.
My test was to verify the rumors about throttling: if the fan control is working, there is no performance drop; over 103/104°C the performance is probably the same as the 290 due to the clock jumping between 6xx and 7xx MHz.
I also think that no patch will change the situation, and for a graphics performance boost we will have to wait for the refresh.
In 3DMark (2013) there's a 20% gap between the M295X and the M290X, and 30% between the 980M and the 970M; in my opinion the only thing to discuss here is the unhappy choice Apple made to switch from Nvidia to AMD.
 

I don't think this is a good benchmark for your test of temps vs. fan operation.

It's only capable of getting my 775M to 82°C; I've seen up to 96°C playing games and such. Basically, real-life tasks put a higher strain on the GPU than this benchmark does.

How do you think performance is affected by something that actually puts a strain on the GPU, unlike this benchmark, even at a 2700 rpm fan? I'd venture a guess that performance would drop sharply. That is the order of operations: the GPU heats up, the fan speed increases accordingly, the fan nears its maximum, and the GPU finally throttles to control the heat.

The only thing this benchmark does is illustrate the limits of the 295X at what appears to be a relatively low load. Again, I say that because it can't max out the temps of other GPUs.
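The heat, then fan, then throttle ordering described above can be sketched as a toy control loop; every constant below (clocks, thresholds, ramp rates) is invented purely to show the sequence, not to match the real iMac's SMC firmware:

[code]
import Foundation

// Toy model of the sequence: load heats the GPU, the fan ramps with temperature,
// and only once the fan is maxed out does the clock drop to shed heat.
struct GPUState {
    var temperature = 60.0   // °C
    var fanRPM = 1200.0
    var clockMHz = 850.0     // assumed nominal clock
}

let maxFanRPM = 2700.0
let throttleTemp = 105.0

func step(_ s: inout GPUState) {
    // Heat generated scales with the current clock; the fan removes heat
    // in proportion to its speed.
    s.temperature += s.clockMHz / 75.0 - s.fanRPM / 300.0
    // The fan ramps with temperature until it hits its ceiling...
    s.fanRPM = min(maxFanRPM, 1200.0 + max(0.0, s.temperature - 70.0) * 50.0)
    // ...and only when that is no longer enough does the clock drop.
    s.clockMHz = s.temperature >= throttleTemp ? 650.0 : 850.0
}

var gpu = GPUState()
for i in 1...20 {
    step(&gpu)
    print("t=\(i)  " + String(format: "temp=%5.1f °C  fan=%4.0f rpm  clock=%3.0f MHz",
                              gpu.temperature, gpu.fanRPM, gpu.clockMHz))
}
[/code]

Left running, this toy model ends up oscillating around the throttle threshold with the clock bouncing between its two values, loosely the clock-jumping behaviour reported earlier in the thread.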
 
in my opinion the only thing to discuss here is the unhappy choice Apple made to switch from Nvidia to AMD

Agree 100%.
In addition to that, my understanding is that AMD's advertised clock rate really means "up to", so it's more of a maximum attainable clock rate, whereas Nvidia advertises a minimum clock rate. If this is true (I might be wrong), there is no such thing as "throttling" here, because the GPU always running at its maximum clock rate was never guaranteed in the first place.
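To spell out the distinction the post above is drawing (taking its reading of the spec sheets as an assumption, not verified vendor language), here is a tiny sketch; the 850 and 797 MHz figures are hypothetical:

[code]
// Two ways a vendor can quote a clock, per the post above (assumed, not verified).
enum ClockSpec {
    case upTo(Double)      // "up to X MHz": X is a ceiling, not a promise
    case baseOf(Double)    // "base clock X MHz": X is a floor under load
}

func withinSpec(observedMHz: Double, spec: ClockSpec) -> Bool {
    switch spec {
    case .upTo(let ceiling): return observedMHz <= ceiling  // anything at or below the ceiling is "in spec"
    case .baseOf(let floor): return observedMHz >= floor    // dropping below the floor would be throttling
    }
}

// A hypothetical 650 MHz reading under load:
print(withinSpec(observedMHz: 650, spec: .upTo(850)))    // true  -- not "throttling" by this definition
print(withinSpec(observedMHz: 650, spec: .baseOf(797)))  // false -- below the advertised floor
[/code]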
 
Interesting: I spoke to an Apple rep trying to understand whether these temperatures are OK (I got 104 degrees last time).

His recommendation was to do an SMC reset and if the temperatures are still high, I might need to have it checked.

I figured it was probably something he has to advise from a support script or something, but I thought, what the heck.

Did an SMC reset and a PRAM reset.

Now I am mostly at 101 degrees and sometimes at 102, but never 104. I wonder if this is just random variation or if an SMC reset really can help with this.

I did replace the stock RAM with 32GB and never did a PRAM reset before now, so that might have had something to do with it as well.
 