By introducing the 5600m, they effectively said that you need to pay $700 more if you want a computer without heat issues. To me, that sounds like extortion.
Thanks to the huge thread you started, Apple succeeded in extorting $700 from me, and I got a GPU that is way overkill for me. Oh, well. I think I'm gonna learn Metal. I've always wanted to try GPU computing anyway...

I won't complain too much about the overall price, though. The base model is already near perfect for me, so I understand that even the base model can't be cheap. And considering how incredible the combination of the 5600m and HBM2 is, it's not extremely unreasonable for Apple to charge an extra $700. It's just that pretty much the only non-minor blemish of the 2019 16" MBP, which isn't even an issue if you don't use an external monitor, happened to be unacceptable for my use case.

But I can see how frustrating this situation can be. I wouldn't have been very happy if I had bought one before the 5600m option became available and gotten stuck with the heat issue... And even with the 5600m solution now available, there shouldn't have been a heat issue to begin with; a "pro" laptop should be able to handle a regular external monitor or two without any problem under light CPU/GPU usage.

EDIT: Oh, and thank you to those of you who reported the problem early on. Those posts saved me.
 
Last edited:
I agree, and this is awful behavior from Apple. I wish someone had warned me about this overheating bug before I got the 16" MBP, so everyone considering this laptop should be warned.

I would have returned it ASAP if I had known!

I agree with you. This is such a nuisance.
 
I've registered just to post in this thread. It is clear to me now that every Apple hardware review is either paid for or somehow influenced by Apple. It is simply impossible that this issue has flown under every reviewer's radar. It really needs more exposure.
 
Last edited:
  • Like
Reactions: Appledoesnotlisten
I've registered just to post in this thread. It is clear to me now that every Apple hardware review is either paid for or somehow influenced by Apple. It is simply impossible that this issue has flown under every reviewer's radar. It really needs more exposure.

I agree. While I'm hesitant to cry "shill!" at every reviewer, it's bewildering to see no mention of this problem. It makes doing my job very difficult, as I can't use dual monitors or keep the laptop open when connected to a single display, because the noise is too distracting to clients on video-conference calls. Just awful...
 
  • Like
Reactions: Appledoesnotlisten
Also, the AMD GPU kicks in every time the "Picture slideshow" screensaver is triggered.
I can't believe this. If you leave your Mac idle, even on battery power, a simple screensaver will cause noise, temperature, and power consumption to skyrocket.
This is the worst kind of optimization I've ever seen on any type of computing device.
Since the GPU is usually the first component to fail on laptops, and the screensaver is enabled by default for a large portion of users, it almost looks like an intentional way to kill the Mac after a few years of standard use.
 
I've registered just to post in this thread. It is clear to me now that every Apple hardware review is either paid for or somehow influenced by Apple. It is simply impossible that this issue has flown under every reviewer's radar. It really needs more exposure.

I'm inclined to believe that any kind of overpriced computing device is heavily promoted to reviewers by its manufacturer.
I've been using this 16" rMBP, which is full of problems that none of the reviews mentioned. All of my problems look like they're by design, not specific to my unit.

I've also owned a Microsoft Surface Pro 7, a true piece of cr*p all around, overheating and underperforming even for simple tasks and with much lower battery life than I expected.

Next time I'm getting a standard, non-premium ASUS or HP computer; at least I know I'll get what I'm paying for.
 
  • Like
Reactions: Appledoesnotlisten
Thanks to the huge thread you started, Apple succeeded in extorting $700 from me, and I got a GPU that is way overkill for me.
I even lost $300 on returning that defective 16" because an incompetent sales associate at Apple Michigan Avenue screwed up my payment methods. The Michigan Avenue store manager behaved extremely unprofessionally too when he tried to cover for his incompetent employee.

I have not used the 5600m, but I suspect that buying an overly powerful GPU will also make your computer run hotter than a less powerful base GPU would. It's another inconvenience.
 
Last edited:
I have not used the 5600m, but I suspect that buying an overly powerful GPU will also make your computer run hotter than a less powerful base GPU would.
How so? A faster GPU has to work less hard for the same output quality as a slower GPU, resulting in less heat generated.
 
How so? A faster GPU has to work less hard for the same output quality as a slower GPU, resulting in less heat generated.
I know that more powerful 6-core MacBooks run hotter than less powerful 4-core MacBooks because of the two extra cores. I will be glad if the GPU world is different and more powerful (faster) GPUs like the 5600 do not run hotter than less powerful GPUs like the 5300.
 
Absolutely ridiculous that Apple hasn't acknowledged this issue. I test drove a 16" MBP and returned it because the fans were driving me nuts when using my external monitors. I'd be pissed if I bought the thing and didn't discover the behavior until after my return period.
 
Absolutely ridiculous that Apple hasn't acknowledged this issue. I test drove a 16" MBP and returned it because the fans were driving me nuts when using my external monitors. I'd be pissed if I bought the thing and didn't discover the behavior until after my return period.

Yep I didn’t realize quite how bad it is until after my 30 days. It’s bad.
 
I have the 16-inch and don't have this issue. All of you who are angry had better take 30 seconds to actually report the issue you're having; otherwise, consider it unknown to Apple.
 
How so? A faster GPU has to work less hard for the same output quality as a slower GPU, resulting in less heat generated.
Power dissipation = ½ × C × V² × f, where C is capacitance, V is voltage, and f is frequency.

Faster GPUs have higher frequency, and usually achieve that with higher voltage, so faster means more heat is generated.
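To put rough numbers on that relation, here is a minimal Swift sketch. The capacitance, voltage, and clock values are made-up placeholders rather than real GPU specs; only the ratio between the two operating points matters.

```swift
// Dynamic power: P = 0.5 * C * V^2 * f
// The capacitance and operating points below are arbitrary placeholders, not real
// GPU specifications; only the ratio between the two results is meaningful.
func dynamicPower(capacitance c: Double, voltage v: Double, frequencyHz f: Double) -> Double {
    0.5 * c * v * v * f
}

let base  = dynamicPower(capacitance: 1.0, voltage: 0.90, frequencyHz: 1.0e9) // lower clock, lower voltage
let boost = dynamicPower(capacitance: 1.0, voltage: 1.10, frequencyHz: 1.3e9) // 30% higher clock, pushed voltage
print(boost / base) // ≈ 1.94 — roughly double the heat for a 30% clock increase
```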
 
In my experience with Apple dGPU machines, they have always turned on the discrete GPU when plugged into a monitor (unless you manually disable it with third-party tools).

It's always been stupid behaviour. It's always caused more heat and power draw. I do not expect them to fix it any time soon, as it was a problem with my 2011 machine for its entire life.

It's a major contributor to why I do not buy discrete-GPU MacBooks any more.

Due to my experience with the 16" MBP and Apple's utter refusal to even acknowledge the issue, I too will NEVER buy another Apple dGPU laptop. More and more...a Windows desktop appears to be in my future for "pro" work.
 
Due to my experience with the 16" MBP and Apple's utter refusal to even acknowledge the issue, I too will NEVER buy another Apple dGPU laptop. More and more...a Windows desktop appears to be in my future for "pro" work.

Why do you think they are quieter or generate less heat when attached to a high-resolution external display?

You can definitely get bulkier units that are quieter, but our top-end Lenovo P15 units heat up pretty quickly. They do throttle early, so that keeps the noise down some, but at the cost of performance.
 
Power dissipation = ½ × C × V² × f, where C is capacitance, V is voltage, and f is frequency.

Faster GPUs have higher frequency, and usually achieve that with higher voltage, so faster means more heat is generated.

A faster GPU does not necessarily have a higher frequency. Designers tend to play around with frequency and compute units; a GPU can achieve higher performance by increasing compute units while decreasing frequency. There will be trade-offs depending on how an application utilizes the GPU.
 
  • Like
Reactions: mastercheif91
A faster GPU does not necessarily have a higher frequency. Designers tend to play around with frequency and compute units; a GPU can achieve higher performance by increasing compute units while decreasing frequency. There will be trade-offs depending on how an application utilizes the GPU.

Increasing the number of compute units increases C in that equation.
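To illustrate that trade-off, here is a small Swift sketch with made-up unit counts, voltages, and clocks (not figures for any real GPU): doubling the compute units roughly doubles C, yet the wider design can still dissipate less power than a narrower one if it runs at a lower voltage and clock.

```swift
// Same relation, P = 0.5 * C * V^2 * f, with C scaled by the number of compute units.
// All numbers are illustrative placeholders, not measured GPU figures.
func power(units: Double, capacitancePerUnit c: Double, voltage v: Double, frequencyHz f: Double) -> Double {
    0.5 * units * c * v * v * f
}

// "Narrow and fast": fewer units pushed to a high clock and voltage.
let narrowFast = power(units: 20, capacitancePerUnit: 1.0, voltage: 1.10, frequencyHz: 1.3e9)
// "Wide and slow": twice the units (≈ twice the C), but a lower clock and voltage.
let wideSlow   = power(units: 40, capacitancePerUnit: 1.0, voltage: 0.85, frequencyHz: 0.8e9)

print(wideSlow / narrowFast)   // ≈ 0.73 — less heat despite twice the units
print((40 * 0.8) / (20 * 1.3)) // ≈ 1.23 — and ~23% more raw unit-GHz of throughput
```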
 
Due to my experience with the 16" MBP and Apple's utter refusal to even acknowledge the issue, I too will NEVER buy another Apple dGPU laptop. More and more...a Windows desktop appears to be in my future for "pro" work.

Well, you probably won't ever see another "discrete GPU" MacBook Pro anyway.

Apple silicon should hopefully fix these issues.

I'll likely be keen on a larger machine if this is the case.
 
How so? A faster GPU has to work less hard for the same output quality as a slower GPU, resulting in less heat generated.

It very much depends. As someone who has run third-party tools to turn the discrete GPU OFF when it is not required, compared with Apple's strategy I have seen WAY less heat and WAY better battery life for virtually identical performance.

Turning on the dGPU and increasing heat when not required just makes the system hotter in general and likely impacts the CPU's max boost as well. You are literally wasting battery and dumping heat into the chassis for nothing.

Also, given that system heat kills the longevity of batteries (I forget the numbers, but from memory it is as bad as or worse than: every 5 °C hotter than 25 °C on average = a 25% increase in battery wear), I suspect it's also why my 2011 MacBook Pro's battery is still above 85% and never swelled. I.e., even when running on AC power, you're doing damage to the battery with excessive system heat!
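Taking that half-remembered rule at face value (it is only the rough heuristic quoted above, not a validated battery-ageing model), the arithmetic looks something like this in Swift:

```swift
import Foundation

// Sketch of the rule of thumb quoted above: roughly +25% battery wear per 5 °C above 25 °C.
// This is a rough heuristic from memory, not a validated battery-ageing model.
func relativeWear(atCelsius temperature: Double) -> Double {
    pow(1.25, max(0, temperature - 25) / 5)
}

print(relativeWear(atCelsius: 25)) // 1.0   — baseline
print(relativeWear(atCelsius: 35)) // ≈1.56 — a mildly warm chassis
print(relativeWear(atCelsius: 45)) // ≈2.44 — idling hot more than doubles the wear rate
```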


Power dissipation = ½ × C × V² × f, where C is capacitance, V is voltage, and f is frequency.

Faster GPUs have higher frequency, and usually achieve that with higher voltage, so faster means more heat is generated.

Yup. And the problem is that Apple's software strategy for when to enable the discrete GPU is bone-headed, brain-damaged stupid. It turns it on FAR too easily.

It's now marginally better than it was before, but back in the Lion days, for example, it was so stupid that if you opened ANYTHING that used (IIRC) Core Animation, it turned the discrete GPU on. Even on battery. Which meant, say, Twitter caused discrete GPU use, even though Twitter was doing nothing more than scrolling basic 2D text/images. Plugging in an external monitor turned on discrete GPU use. Adobe Flash (from memory) turned on discrete GPU use.

It was (and still is) exceedingly stupid. I get it, Apple are attempting to make it "seamless" without the user having to think, but not everyone who owns/uses a Mac is mentally deficient. I'd argue that plenty of Mac users are actually pretty technical users (e.g., I know plenty of network engineers, electricians, people in education/science/university who use them).

What Apple should do IMHO is either
  1. introduce performance modes, which can include the option to turn dGPU use on/off (I think this was rumoured to be coming in macOS?)
  2. place a tick box option under "power" for "Use discrete graphics to improve performance (Note: this may impact battery life)" so that if you want to maximise battery life or reduce heat, you can turn it off without dodgy third party tools

edit:
if the above seems harsh... I'm a massive fan of Apple hardware and software normally. But they aren't perfect; and their discrete GPU management under macOS is some of the most stupid software I've ever seen.
 
Last edited:
It very much depends. As someone who has run third-party tools to turn the discrete GPU OFF when it is not required, compared with Apple's strategy I have seen WAY less heat and WAY better battery life for virtually identical performance.

Turning on the dGPU and increasing heat when not required just makes the system hotter in general and likely impacts the CPU's max boost as well. You are literally wasting battery and dumping heat into the chassis for nothing.

...

But they aren't perfect; and their discrete GPU management under macOS is some of the most stupid software I've ever seen.

The issue isn't that the dGPU is enabled when an external monitor is plugged in. AFAIK, in Boot Camp the MBP exclusively uses the dGPU and yet doesn't result in the high power usage we see with macOS.
 
Hi guys, I have also just registered here, only to post in this thread.

The MacBook Pro 16 is my first Mac (the basic "step-up" i9 with the 5500m 4 GB) and I am pretty certain it is also my last. I grew dissatisfied with Windows 10 and all its issues (updates, inconsistencies, ads, you name it, but that's a topic for another day), but I gotta say they are nothing compared to what I am experiencing daily with the Mac.

Obviously the main issue is the power draw, heat, and noise when connected to an external display. The Radeon goes immediately to 18 watts, and the fans spin up from the idle 1800 rpm to the maximum of 5200/5600 rpm. And the fans just stay there all the time, even with just light office or programming work. Surprisingly, the temperatures do not exceed 60-64°C during all that time, so it seems to me that even if the laptop is a bit hot from the extra GPU draw, the fan control is still overreacting. And the noise is driving me nuts, I gotta say.

Moreover, I have noticed that the laptop throttles quite aggressively even with light loads. Every other evening, I like to spend an hour or two with WoW Classic, which is an ancient game; it uses about half of one core and the graphics are also not that great. I mean, it even runs on the integrated graphics if I tell it to. Yet still, less than 10 minutes after I start the game, the performance drops significantly as the computer starts to throttle, with kernel_task taking 800-1000% of CPU time... and again the laptop seems to try to keep the temperatures at around 60-64°C, which is nice and all, but I would really gladly trade an extra 20°C for unthrottled performance...

I am using an HP Thunderbolt dock rated for 100 W over USB-PD (and recognized by System Information as capable of doing so), so power delivery should not be the issue here. For what it's worth, it behaves the same when I bypass the dock completely and just charge with the original charger on one side and connect a USB-C display on the other side of the laptop.

Just for sport, I am also mentioning the other issues I never expected from such an expensive machine, such as:
- crashes/panics when going to sleep with an external display (already fixed by an OS update, but still, does nobody test these machines?)
- the computer waking up from sleep with every notification from the "Reminders" app (basically I come home every day to a hot, noisy, running laptop which should have been asleep the whole time)
- incredibly annoying mouse acceleration and unpredictable scrolling with the mouse wheel (it also seems accelerated, to the point where a single slow move of the scroll wheel doesn't even register; scrolling one line in a text editor is almost impossible, and switching to another weapon in ioquake3 also does not work)

So that's it... sorry for the long essay, I just felt like getting all of this off my chest would ease the pain of €3,000 wasted on this machine, when I should have gone for an XPS or something like that instead...

tl;dr: paid a lot of money for my first MacBook and now I sorely regret it due to GPU issues, heat and noise, crashes, an unusable external mouse, and unexpected wakeups.
 
  • Angry
Reactions: bolognese
I know that more powerful 6-core MacBooks run hotter than less powerful 4-core MacBooks because of the two extra cores. I will be glad if the GPU world is different and more powerful (faster) GPUs like the 5600 do not run hotter than less powerful GPUs like the 5300.

The 5600m is more powerful with less power consumption, and thus generates less heat. That's because the HBM2 memory achieves higher data transfer rates than the GDDR6 on the 5300m or 5500m by using a much wider interface at a lower memory clock speed.

I seem to recall that the 5600m doesn't really eliminate the main problem of the 16" MBP: the dGPU drawing a lot of power just from connecting the device to an external monitor. It still draws a lot of useless power, based on some post I saw on the internet (I can't remember where; I've been following this issue for months). But since the 5600m tends to produce less heat, the "heat and fan noise problem" won't appear.

I also notice that the memory clock speed of the 5300m and 5500m is listed at 12000 MHz on a 128-bit interface, based on the numbers I found on the technical.city website. For comparison, the Radeon Pro Vega 20 runs at 1480 MHz on a 1024-bit interface, the Radeon Pro 580 runs at 6780 MHz on a 256-bit interface, and the Radeon Pro 5600m runs at 1540 MHz on a 2048-bit interface. I don't really know how much the difference in memory clock speed affects the heat produced. Maybe someone with knowledge in this area can chime in.
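For what it's worth, plugging the figures quoted above into the usual peak-bandwidth formula shows why a much wider bus at a low clock still comes out ahead. This Swift sketch takes the technical.city numbers at face value and treats the listed clock as the effective per-pin data rate, which is an assumption on my part:

```swift
// Peak memory bandwidth: GB/s = effective data rate (MT/s) × bus width (bits) / 8 / 1000.
// Inputs are the figures quoted in the post above (from technical.city), taken at face value.
func bandwidthGBps(dataRateMTps rate: Double, busBits: Double) -> Double {
    rate * busBits / 8 / 1000
}

print(bandwidthGBps(dataRateMTps: 12000, busBits: 128))  // 5300m/5500m (GDDR6): 192 GB/s
print(bandwidthGBps(dataRateMTps: 1480,  busBits: 1024)) // Radeon Pro Vega 20 (HBM2): ~189 GB/s
print(bandwidthGBps(dataRateMTps: 6780,  busBits: 256))  // Radeon Pro 580: ~217 GB/s
print(bandwidthGBps(dataRateMTps: 1540,  busBits: 2048)) // Radeon Pro 5600m (HBM2): ~394 GB/s
```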

Another thing to consider: the graphics card plays the main role in rendering and driving the monitors. On a MacBook with multiple graphics cards, there can be cases where one card renders the content while the other drives the monitors, which forces the GPUs to work harder to move the data between them. The work gets harder when they have to support monitors with different resolutions and refresh rates. You can find this information in Apple's Metal developer documentation. Has anyone tried using their MacBook with a monitor that has the same resolution and refresh rate as the built-in screen? I think I'm going to try this tomorrow with my friend's base 16" MBP.
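If anyone wants to check which GPU macOS has actually mapped to each display while testing this, here is a minimal Swift sketch using the public Metal and Core Graphics calls MTLCopyAllDevices and CGDirectDisplayCopyCurrentMetalDevice; run it with and without the external monitor attached and compare the output:

```swift
import Metal
import CoreGraphics

// List every GPU the system exposes; isLowPower is true for the integrated GPU.
for device in MTLCopyAllDevices() {
    print("GPU: \(device.name)  integrated: \(device.isLowPower)  removable: \(device.isRemovable)")
}

// For each online display, ask which Metal device is currently driving it.
var displays = [CGDirectDisplayID](repeating: 0, count: 16)
var displayCount: UInt32 = 0
if CGGetOnlineDisplayList(UInt32(displays.count), &displays, &displayCount) == .success {
    for id in displays.prefix(Int(displayCount)) {
        let driver = CGDirectDisplayCopyCurrentMetalDevice(id)
        print("Display \(id) is driven by \(driver?.name ?? "unknown")")
    }
}
```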

After reading a lot of posts in a lot of forums, I'm still unsure whether this problem is caused by a lazy driver implementation or by a failure in hardware design (an 85-watt GPU plus a 45-watt CPU in a 100-watt machine?!). I've been wanting to buy the 16" MBP, but I don't want to waste that amount of money in these hard times on a product that has a major hardware issue. A couple of friends pulled the trigger on this MBP and are experiencing the same problem.
 
Last edited: