So, I've just tested my 2018 (i9, Vega Pro 20) vs. the old 2016 (i7-6920HQ, Pro 460), side by side. I don't have a 2017 machine to hand, but the 2016 should be a reasonable approximation. It's essentially the same GPU anyway. What I did was run Unigine Valley in a small window for a while, while watching the fan speed and temps.
Result: I'd say no difference, really. The 2018 machine ramps up the fans earlier, but the 2016 is louder in prolonged operation. Assuming that the fans on both systems operate with the same efficiency, that last fact suggests that (if anything) the 2018 i9 Vega Pro 20 runs more efficiently under load than the 2016 i7-6920HQ Pro 460.
Detailed result:
- Both GPUs almost immediately get up to 80C and stay there
- The 2018 model ramps its fans up quicker: it's in the 4000 RPM range after a minute, while the 2016 still sits at around 3000,
but
- after 10 minutes of running, the 2018 model is still in the high 4000s while the 2016's fans are revving like crazy at 5500. That's where equilibrium is reached, since I couldn't observe any further changes in fan speed from there on.
- Vega Pro 20 was 30% faster when benched. The difference is only this small because it was running in a small window, where API and data sync overhead is more noticeable. When run at fullscreen (1920x1080, Ultra, AA off), the Vega Pro is 70% faster in this particular benchmark.
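To illustrate the last point: a fixed per-frame overhead (API calls, data sync, compositing) added on top of the GPU work compresses the measured gap between two GPUs. The frame times and overhead below are made up for the sketch; only the 30%/70% figures match the observation above.

```python
def percent_faster(fast_ms: float, slow_ms: float) -> float:
    """How much faster (in %) a frame time of fast_ms is than slow_ms."""
    return (slow_ms / fast_ms - 1.0) * 100.0

# Fullscreen: frame time is essentially pure GPU work, so the raw
# GPU-time ratio shows through (hypothetical 17 ms vs 10 ms -> 70%).
gpu_fast_ms, gpu_slow_ms = 10.0, 17.0
print(percent_faster(gpu_fast_ms, gpu_slow_ms))  # 70.0

# Windowed: the same fixed per-frame overhead hits both GPUs, and the
# measured gap shrinks to roughly 30%.
overhead_ms = 13.3  # hypothetical overhead chosen to match the observation
print(round(percent_faster(gpu_fast_ms + overhead_ms,
                           gpu_slow_ms + overhead_ms), 1))  # 30.0
```

The point of the model: the bigger the constant overhead relative to GPU work, the smaller the apparent difference between the cards, which is why the windowed number understates the Vega's advantage.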
Yes, I am absolutely reducing it to a question of power under load. Why wouldn't I? They should be identical at idle.
Not necessarily. There are two "idle" modes after all — one when the dGPU is completely off, and one when it's on but not doing anything demanding. The second one is active when you are connected to an external monitor. And that is exactly the situation where running cooler is beneficial, since you don't want your laptop to be loud when all you are doing is driving an extra monitor. I haven't had the chance to test the new laptop with an external monitor, but according to many other reports, it is indeed much cooler than the previous GPUs.
The author of the screenshots I posted says his Vega is 13 degrees cooler at idle. His idle temps with Vega are 43C, which makes his idle temps with the 560X 56C. However you slice it, that is not normal, and I would get rid of such a laptop.
That is absolutely normal when running an external monitor. That is the temperature I saw both with the 2016 and with my previous 2018 (560X).
And your fans — they are actually maxed out at load; Apple raised the RPM limit on Vega.
The max RPM on my Vega machine is 6000; on the 2016 machine it's 5900. I don't know that that is such a large difference...
But because it has a higher TDP it will be hotter; the progress of technology hasn't broken the first law of thermodynamics yet. What I have a beef with is that we were led to believe this was going to be a 35W GPU, an Nvidia killer. Instead we got something that trades blows with the Ti Max-Q in the XPS 15 while having a higher TDP and being two years late.
At the same time, the very same chassis performs better under stress testing, and overall power consumption hasn't changed. And, at least in my experience, I haven't noticed that the 2018 machine runs any hotter than any other MBP I've tested.