By introducing the 5600m they said that you would need to pay $700 more if you want a computer without heat issues. To me it sounds like extortion.
They didn’t say anything though, and have not recognised the heat/fan issue at all.
Thanks to the huge thread you started, Apple succeeded in extorting $700 from me, and I got a GPU that is way overkill for me. Oh, well. I think I'm gonna learn Metal. I've always wanted to try GPU computing anyway...
I agree, and this is awful behavior from Apple. I wish someone had warned me about this overheating bug before I got the 16" MBP, so everyone considering this laptop should be warned.
I would have returned it ASAP if I knew!
I've registered just to post in this topic. It is clear to me now that every Apple hardware review is paid for or somehow influenced by Apple. It is simply impossible that this issue has flown under every reviewer's radar. It really needs more exposure.
I even lost $300 on returning that defective 16" because an incompetent sales associate at Apple Michigan Avenue screwed up my payment methods. The Michigan Avenue store manager behaved extremely unprofessionally too when he tried to cover for his incompetent employee.
I have not used the 5600m, but I suspect that buying a GPU that is too powerful will also make your computer hotter than if you bought a less powerful base GPU.
How so? A faster GPU has to work less hard for the same output quality as a slower GPU, resulting in less heat generated.
I know that the more powerful 6-core MacBooks run hotter than the less powerful 4-core MacBooks because of the 2 extra cores. I will be glad if the GPU world is different and more powerful (faster) GPUs like the 5600 do not run hotter than less powerful GPUs like the 5300.
Absolutely ridiculous that Apple hasn't acknowledged this issue. I test drove a 16" MBP and returned it because the fans were driving me nuts when using my external monitors. I'd be pissed if I bought the thing and didn't discover the behavior until after my return period.
In my experience, Apple dGPU machines have always turned on the discrete GPU when plugged into an external monitor (unless you manually disable it with third-party tools).
It's always been stupid behaviour. It's always caused more heat and power draw. I do not expect them to fix it any time soon, as it was a problem with my 2011 machine for its entire life.
It's a major contributor to why I do not buy discrete-GPU MacBooks any more.
Due to my experience with the 16" MBP and Apple's utter refusal to even acknowledge the issue, I too will NEVER buy another Apple dGPU laptop. More and more...a Windows desktop appears to be in my future for "pro" work.
Power dissipation = ½ × capacitance × voltage² × frequency.
Faster GPUs have higher frequency, and usually achieve that with higher voltage, so faster means more heat is generated.
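The rule above can be sketched numerically. The capacitance value and the two operating points below are invented purely for illustration; only the ratios matter:

```python
# Minimal sketch of the dynamic-power rule quoted above:
# P = 1/2 * C * V^2 * f. The numbers are made up for
# illustration; only the ratios between them matter.

def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Switching power of CMOS logic, in watts."""
    return 0.5 * capacitance_f * voltage_v ** 2 * frequency_hz

# Hypothetical "slow" vs "fast" operating points.
slow = dynamic_power(capacitance_f=1e-9, voltage_v=0.80, frequency_hz=1.0e9)
fast = dynamic_power(capacitance_f=1e-9, voltage_v=1.00, frequency_hz=1.4e9)

# Raising voltage 0.80 -> 1.00 V and clock 1.0 -> 1.4 GHz
# multiplies power by (1.00/0.80)**2 * 1.4 = 2.1875x, because
# power grows with the *square* of voltage but only linearly
# with frequency.
print(fast / slow)  # -> 2.1875
```

The voltage-squared term is why a small clock bump that also requires a voltage bump costs disproportionately more heat.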
A faster GPU does not necessarily have a higher frequency. Designers tend to play around with frequency and compute units; a GPU can achieve higher performance by increasing compute units while decreasing frequency. The trade-offs depend on how an application utilizes the GPU.
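A toy model of that trade-off, using the same P ∝ V² × f rule scaled per compute unit (all numbers invented for illustration):

```python
# Sketch of the width-vs-frequency trade-off described above.
# Power per compute unit follows P ~ V^2 * f; throughput is
# assumed to scale with units * clock, i.e. a perfectly
# parallel workload. All figures are hypothetical.

def gpu(compute_units, frequency_ghz, voltage_v):
    throughput = compute_units * frequency_ghz              # work/sec, arbitrary units
    power = compute_units * voltage_v ** 2 * frequency_ghz  # relative watts
    return throughput, power

# Narrow-and-fast vs wide-and-slow; the lower clock also
# permits a lower voltage (typical of DVFS curves).
narrow = gpu(compute_units=20, frequency_ghz=1.5, voltage_v=1.00)
wide   = gpu(compute_units=30, frequency_ghz=1.0, voltage_v=0.85)

# Both designs deliver 30 units of throughput, but the wide,
# slow-clocked one draws noticeably less power -- provided the
# workload can actually keep all the extra units busy.
print(narrow)  # (30.0, 30.0)
print(wide)    # (30.0, 21.675)
```

If the application cannot fill the extra units, the wide design's throughput advantage evaporates while the narrow design keeps its clocks, which is the application-dependence the post describes.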
It very much depends. As someone who has run third-party tools to turn the discrete GPU off when it is not required, versus Apple's strategy, I have seen WAY less heat and WAY better battery life for virtually identical performance.
Turning on the dGPU and increasing heat when not required just makes the system hotter in general and likely impacts the CPU's max boost as well. You are literally wasting battery and dumping heat into the chassis for nothing.
...
But they aren't perfect, and their discrete GPU management under macOS is some of the most stupid software I've ever seen.
Kinda surprising why MacRumors is not reporting on this...
Oh, they likely never will. They'll just post some more affiliate links to buy the MacBook 16.