
RumorConsumer
Original poster
Heat and overheating are the enemy. My friend's 2020 16" MacBook Pro i7 gets so freaking hot, and I think today the GPU might have given out. As a former tech, it made me wonder whether these Apple Silicon Macs are going to show more resilience, given that they run so much cooler. It's too early to have much data, but it's a seriously interesting question.
 
Under load, all Macs are designed to run "hot." Heat/overheating is indeed the enemy for a battery, or when the solder chemistry can't handle higher temps (IIRC, that was the issue with the GPUs in, I think, the 2010/11 MBPs). I've not heard anything about GPUs failing from heat in more modern Macs since they switched solder chemistry.

Now, Apple Silicon Macs can typically do more at lower wattage than Intel CPUs, so the 2020 i7 and the M1-series chips will have different heat characteristics under the same loads. But make no mistake: if you give your machine work to do, it will heat up. They are designed to work that way, and they are most performant/efficient when firing on all cylinders.
 
I mean, we can't even compare: an ARM-based SoC versus two separate dies (x86 CPU plus dGPU), both high-power chips in a small chassis...

I had the 16" and the power it drew even for simple tasks was insane. And being forced onto the dGPU whenever an external display was connected... wow.
 
I mean, we can't even compare: an ARM-based SoC versus two separate dies (x86 CPU plus dGPU), both high-power chips in a small chassis...

I had the 16" and the power it drew even for simple tasks was insane. And being forced onto the dGPU whenever an external display was connected... wow.
Same experience. Wow. Doing nothing, in a room at moderate temperature, with the fans blasting.
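For anyone curious, here's a minimal sketch of how to watch that behaviour (plain Swift on macOS, using the public CGGetActiveDisplayList and CGDirectDisplayCopyCurrentMetalDevice calls; nothing specific to this thread). On the Intel 15"/16" models it should show the discrete GPU taking over as soon as an external display is attached; on Apple Silicon there is only the one GPU to report:

```swift
import CoreGraphics
import Metal

// Rough sketch: list the active displays and report which Metal device (GPU)
// is driving each one. On Intel MacBook Pros with a dGPU, plugging in an
// external display should make the discrete GPU show up here.
var displayCount: UInt32 = 0
var displays = [CGDirectDisplayID](repeating: 0, count: 16)
_ = CGGetActiveDisplayList(UInt32(displays.count), &displays, &displayCount)

for display in displays.prefix(Int(displayCount)) {
    let gpu = CGDirectDisplayCopyCurrentMetalDevice(display)
    print("Display \(display): driven by \(gpu?.name ?? "unknown GPU")")
}
```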
 
Under load, all Macs are designed to run "hot." Heat/overheating is indeed the enemy for a battery, or when the solder chemistry can't handle higher temps (IIRC, that was the issue with the GPUs in, I think, the 2010/11 MBPs). I've not heard anything about GPUs failing from heat in more modern Macs since they switched solder chemistry.

Now, Apple Silicon Macs can typically do more at lower wattage than Intel CPUs, so the 2020 i7 and the M1-series chips will have different heat characteristics under the same loads. But make no mistake: if you give your machine work to do, it will heat up. They are designed to work that way, and they are most performant/efficient when firing on all cylinders.
For sure. But having gone from the pinnacle of the Intel world of Mac laptops to the pinnacle of the Apple Silicon world, these devices behave very differently. The issue with the solder, IIRC, was that expansion and contraction over time would weaken the joints. But I can also say that I've seen Intel machines behave erratically when temps get high in those little enclosures. And after the 2018 MBP, I don't trust that Apple was really paying close attention to what they were doing until maybe the 2019 16", and even then.
 
Apple has been running their chips at around 100°C under heavy load; this is no different for Apple Silicon than it was for Intel Macs. I am not aware of any practical impact on computer longevity because of that. In our labs the Mac laptops have been hammered for years, and I can't say we observe an abnormally high failure rate. All in all, it's not a concern worth losing sleep over.
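If you want to sanity-check those temperatures on your own machine, here's a rough sketch: it shells out to Apple's bundled powermetrics tool, which has to run as root. The "smc" sampler is my assumption for Intel Macs, where it reports the CPU die temperature; on Apple Silicon the "thermal" sampler reports a pressure level instead of a temperature.

```swift
import Foundation

// Rough sketch (assumes an Intel Mac; run the compiled script with sudo,
// since powermetrics requires root). Takes one sample of the SMC readings
// and prints any line mentioning a temperature.
let task = Process()
task.executableURL = URL(fileURLWithPath: "/usr/bin/powermetrics")
task.arguments = ["--samplers", "smc", "-n", "1"]

let pipe = Pipe()
task.standardOutput = pipe

do {
    try task.run()
    task.waitUntilExit()
    let output = String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(),
                        as: UTF8.self)
    for line in output.split(separator: "\n") where line.lowercased().contains("temperature") {
        print(line)
    }
} catch {
    print("Could not run powermetrics: \(error)")
}
```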
 
Heat and overheating are the enemy. My friend's 2020 16" MacBook Pro i7 gets so freaking hot, and I think today the GPU might have given out. As a former tech, it made me wonder whether these Apple Silicon Macs are going to show more resilience, given that they run so much cooler. It's too early to have much data, but it's a seriously interesting question.
Far too early to tell.

In my opinion, Intel's reliability issues and high running temperatures have more to do with shoddy manufacturing processes and poor design. I'm not trying to cast shade, but I can run my Xbox Series X with all the bells and whistles turned on and you'll never hear the fan.

Meanwhile, the MacBook Pro 16 (Core i7) I sold a year or two ago sounded like it was trying to take off just from joining a Microsoft Teams call.

My opinion is that Intel got sloppy and arrogant, so they made bad choices. It's not the first time it's happened to them either. Remember the Pentium 4? They literally based all the chips we have today on the Pentium III because of the same kinds of decisions.

They've also fallen, I think, about 5 years behind on their nm process, to the point where they're just rebranding 10nm as "Intel 7" to make it look like the same thing as 7nm. It's shameful.

In terms of long-term resilience, it's important to note that while Apple has been making its own Mac chips for 2 years, it has been building chips of all sizes for, I think, 12 years. The harshest test of how long something will last isn't the MacBook Air, in my opinion; it's the iPhone and iPad: much more tightly constrained cooling, and toddlers and teens hammering them with games.

Again, this is just an opinion. Who knows, maybe my MacBook Pro 16 M1 Max will catch fire on my lap. But I think I've heard the fan turn on MAYBE once?
 
Far too early to tell.

In my opinion, Intel's reliability issues and high running temperatures have more to do with shoddy manufacturing processes and poor design. I'm not trying to cast shade, but I can run my Xbox Series X with all the bells and whistles turned on and you'll never hear the fan.

Meanwhile, the MacBook Pro 16 (Core i7) I sold a year or two ago sounded like it was trying to take off just from joining a Microsoft Teams call.

My opinion is that Intel got sloppy and arrogant, so they made bad choices. It's not the first time it's happened to them either. Remember the Pentium 4? They literally based all the chips we have today on the Pentium III because of the same kinds of decisions.

They've also fallen, I think, about 5 years behind on their nm process, to the point where they're just rebranding 10nm as "Intel 7" to make it look like the same thing as 7nm. It's shameful.

In terms of long-term resilience, it's important to note that while Apple has been making its own Mac chips for 2 years, it has been building chips of all sizes for, I think, 12 years. The harshest test of how long something will last isn't the MacBook Air, in my opinion; it's the iPhone and iPad: much more tightly constrained cooling, and toddlers and teens hammering them with games.

Again, this is just an opinion. Who knows, maybe my MacBook Pro 16 M1 Max will catch fire on my lap. But I think I've heard the fan turn on MAYBE once?
Great points all. I have the same set of ideas and regularly tell people that Apple has actually been building its own silicon for a long time. They know how to make these chips, and the boards they're connected to, in the most rigorous environment possible: the mass-market handheld. Same with the fans. I'm very excited to see how it plays out.
 
Heat and overheating are the enemy. My friend's 2020 16" MacBook Pro i7 gets so freaking hot, and I think today the GPU might have given out. As a former tech, it made me wonder whether these Apple Silicon Macs are going to show more resilience, given that they run so much cooler. It's too early to have much data, but it's a seriously interesting question.
Apple Silicon, across its current two generations, already looks more resilient, given how cool it runs. The M1 has been out for almost 2 years and I have yet to see a thread about a dead M1 chip due to heat.

The M2 is showing the same trend thus far, so I expect (albeit prematurely) that it will also hold up well.
 
Apple has been running their chips at around 100°C under heavy load; this is no different for Apple Silicon than it was for Intel Macs. I am not aware of any practical impact on computer longevity because of that. In our labs the Mac laptops have been hammered for years, and I can't say we observe an abnormally high failure rate. All in all, it's not a concern worth losing sleep over.
I was about to say that my 16" M1 Max MacBook Pro "feels" as hot as my old Intel i9 16" MacBook Pro under high workloads. The biggest difference is that the fans are either barely spinning, or just off on the M1 Max, and of course the performance is higher on the Max.
 
I was about to say that my 16" M1 Max MacBook Pro "feels" as hot as my old Intel i9 16" MacBook Pro under high workloads. The biggest difference is that the fans are either barely spinning, or just off on the M1 Max, and of course the performance is higher on the Max.
Sounds like we have the same purchase history. My experience shows more of a difference: my Max gets nowhere near as hot to the touch as the i9 8-core 16" did. It's night and day, especially on "the brow" right above F4-F7, where it usually gets hottest. Nowhere near.
 
Sounds like we have the same purchase history. My experience shows more of a difference: my Max gets nowhere near as hot to the touch as the i9 8-core 16" did. It's night and day, especially on "the brow" right above F4-F7, where it usually gets hottest. Nowhere near.

Unless you are pushing both the CPU and the GPU hard, the M1 Max is likely to produce significantly less heat than the Intel models, and of course the new laptops have beefier cooling systems. But generally the chip will still run at 100°C, or close to it, under load. Less heat, cooler exterior, similar core/package temperatures.
 
Sounds like we have the same purchase history. My experience shows more of a difference: my Max gets nowhere near as hot to the touch as the i9 8-core 16" did. It's night and day, especially on "the brow" right above F4-F7, where it usually gets hottest. Nowhere near.
Unless you are pushing both the CPU and the GPU hard, the M1 Max is likely to produce significantly less heat than the Intel models, and of course the new laptops have beefier cooling systems. But generally the chip will still run at 100°C, or close to it, under load. Less heat, cooler exterior, similar core/package temperatures.

I specifically used the term "feels" because I am not (purposely) running any temp-monitoring software on my M1 Max. I used iStat Menus religiously on my Intel Mac(s). My new (well, 8 months in now) M1 Max gets hot between the display and the function key row, and I also get notable heat from the keyboard when I am pushing the system (Unreal Engine, Parallels, etc.). I don't think it is "too" hot, but it feels about the same as my Intel Macs did. My new one does run much, much quieter. Even when it is hot, I rarely hear the fans, and performance is all-around better. After all the talk of efficiency and performance, I had assumed (wrongly) that my Mac would run cooler. Granted, maybe it is running a little cooler, but not to the touch :)
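For what it's worth, you can get a rough read without installing anything: macOS exposes a coarse thermal-pressure signal through ProcessInfo. This is only a minimal sketch of that public API, not a die-temperature readout like iStat Menus gives you:

```swift
import Foundation

// Minimal sketch: observe macOS's coarse four-level thermal-pressure state.
// This is not a temperature readout, just the nominal/fair/serious/critical
// signal the OS itself reacts to.
func describe(_ state: ProcessInfo.ThermalState) -> String {
    switch state {
    case .nominal:  return "nominal"
    case .fair:     return "fair"
    case .serious:  return "serious"
    case .critical: return "critical"
    @unknown default: return "unknown"
    }
}

print("Current thermal state: \(describe(ProcessInfo.processInfo.thermalState))")

// Print a line whenever the state changes while the script keeps running.
_ = NotificationCenter.default.addObserver(
    forName: ProcessInfo.thermalStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    print("Thermal state is now: \(describe(ProcessInfo.processInfo.thermalState))")
}

RunLoop.main.run()
```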
 
I worked almost 5 years doing Apple repair at a non-authorised store outside the US. We didn't have hundreds of Macs coming in every week, the volume was more like dozens, but the Macs we worked on were, on average, 5-6 years old at that point.

I can't remember any direct CPU failure. At most, something on the logic board went south and shorted a component connected to the CPU. And all the CPUs ran as close to 100ºC as possible, even with fresh thermal paste.

Now, the GPU is another story. Like any laptop repair shop, we had quite a few GPUs causing issues.

A question I find interesting: will the unheard-of, super-powerful integrated GPUs in the more powerful Pro, Max and Ultra variants have a defect rate equal to that of dGPUs?

That is an area where we are in new territory. Maybe?
 
I worked almost 5 years doing Apple repair at a non-authorised store outside the US. We didn't have hundreds of Macs coming in every week, the volume was more like dozens, but the Macs we worked on were, on average, 5-6 years old at that point.

I can't remember any direct CPU failure. At most, something on the logic board went south and shorted a component connected to the CPU. And all the CPUs ran as close to 100ºC as possible, even with fresh thermal paste.

Now, the GPU is another story. Like any laptop repair shop, we had quite a few GPUs causing issues.

A question I find interesting: will the unheard-of, super-powerful integrated GPUs in the more powerful Pro, Max and Ultra variants have a defect rate equal to that of dGPUs?

That is an area where we are in new territory. Maybe?
Hard to know, but my sense is they won't.
 
will the unheard-of, super-powerful integrated GPUs in the more powerful Pro, Max and Ultra variants have a defect rate equal to that of dGPUs?
My WAG: given that the Max can be purchased as a 32-GPU-core version or a 24-core version, which is the same SoC binned lower, one can easily infer that the iGPU can run with x working cores, where x can be less than the full count. In other words, unlike a dGPU, the M-series iGPU is built to fail core-by-core rather than as a whole, and will gradually become less powerful but not take a total dump for probably a decade or more.
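As a side note, and purely illustrative: you can check how many GPU cores your particular (possibly binned) chip shipped with by reading the system report. The "Total Number of Cores" line is what Apple Silicon Macs appear to report; Intel machines list their GPUs differently.

```swift
import Foundation

// Illustrative sketch: read the GPU section of the system report and print
// the chipset name plus the GPU core count, where that field exists.
let task = Process()
task.executableURL = URL(fileURLWithPath: "/usr/sbin/system_profiler")
task.arguments = ["SPDisplaysDataType"]

let pipe = Pipe()
task.standardOutput = pipe

do {
    try task.run()
    task.waitUntilExit()
    let report = String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(),
                        as: UTF8.self)
    for line in report.split(separator: "\n")
    where line.contains("Chipset Model") || line.contains("Total Number of Cores") {
        print(line.trimmingCharacters(in: .whitespaces))
    }
} catch {
    print("Could not run system_profiler: \(error)")
}
```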
 
Now, the GPU is another story. Like any laptop repair shop, we had quite a few GPUs causing issues.

A question I find interesting: will the unheard-of, super-powerful integrated GPUs in the more powerful Pro, Max and Ultra variants have a defect rate equal to that of dGPUs?

That is an area where we are in new territory. Maybe?
CPUs and GPUs are both transistors connected together with wires. There's nothing special that makes transistors connected together to form a GPU less reliable than transistors connected together to form a CPU.

Any organization designing a chip, especially a high-power, high-complexity chip, does need to pay attention to several design problems that matter for long-term reliability. My sense (which may not be accurate) is that, for quite some time, Nvidia and AMD both developed a habit of skimping on reliability engineering in their consumer-grade discrete GPU products. GPUs were advancing so fast that yesterday's GPU was obsolete; that turnover helped them avoid major consequences for going light on reliability. But it did hurt laptops, and not just Apple's, as I understand it.

Any laptop with an i-series CPU that came through your shop had an integrated GPU. No problems with those chips, right? That was just Intel holding their GPU+CPU SoCs to the same reliability standard they had always used for their CPUs.
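If you're curious what a given machine actually reports, here's a small sketch using Metal's public device queries. The integrated/discrete/external guess is my own heuristic, not an official classification:

```swift
import Metal

// Sketch: enumerate the GPUs macOS exposes through Metal and roughly classify
// them. An Intel MacBook Pro with a dGPU lists two devices; an Apple Silicon
// Mac lists a single unified-memory GPU.
for device in MTLCopyAllDevices() {
    let kind: String
    if device.isRemovable {
        kind = "external (eGPU)"
    } else if device.isLowPower || device.hasUnifiedMemory {
        kind = "integrated"
    } else {
        kind = "discrete"
    }
    print("\(device.name): \(kind), unified memory: \(device.hasUnifiedMemory)")
}
```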
 