Is the cooling system inefficient and not fit for purpose? Yes.

Is it VERY VERY likely that Intel sold Apple a bill of goods and even confidently stated that their next processors will most certainly work well with the proposed cooling system? Yes.

SHOULD Apple have done what PC makers have done and changed their entire line to account for the fact that Intel failed them? Yeah, probably, but they didn’t. Well, not in the way PC makers did. The case was kept the same as they had intended, but they likely had to redo some of the internals with last-minute fixes, which, from the descriptions, sounds like what happened.
What, did you even watch the vids?
 
What, did you even watch the vids?
No, your description was enough for me and a lot quicker than either of the videos. I’m pretty sure, as you stated, they show poor planning and placement of the cooling elements, which is what I’d expect from a “Dammit, Intel screwed us again. We can’t redesign the case, so redesign the cooling system” scenario. And last-minute retooling will ALWAYS have issues.

If Intel’s solution had been more like the M1 or even more like what Intel promised, the story would be different.
 
No, your description was enough for me and a lot quicker than either of the videos. I’m pretty sure, as you stated, they show poor planning and placement of the cooling elements, which is what I’d expect from a “Dammit, Intel screwed us again. We can’t redesign the case, so redesign the cooling system” scenario. And last-minute retooling will ALWAYS have issues.

If Intel’s solution had been more like the M1 or even more like what Intel promised, the story would be different.
What? What do you mean "Intel screwed us again"? It's Apple's cooling design that is broken?
 
So the fan design for Intel was purposefully bad to make Apple’s own chip seem that much better? That seems like something Intel could prove. At least they won’t do that to their own designs (or do they?)
I don't know why Apple made that horrendously gimped fan/heatsink design, but your conclusion is the same one I came to, as I can't think of any other reason. I simply can't believe it was accidental, as the level of incompetence would be mind blowing.
 
What? What do you mean "Intel screwed us again"? It's Apple's cooling design that is broken?
There are three possibilities:
  1. Apple engineers are completely incompetent and just got lucky with their M1 version not needing more cooling than they were able to provide.
  2. Apple consciously built an inferior design so that their machines would look bad compared to competitors' systems for the several years before they were able to release their own silicon, so that they would be able to show improved performance that was artificially better.
  3. Intel promised Apple a set of specs around which Apple built a design, and did not deliver. Apple did not change it because they were promised that the next one would really meet the promised specs (and again did not). Finally, Apple did not change it because they had designed their silicon to exceed those specs and it made sense to not change that design at the same time as they were trying to lock down their new architecture.
While I understand that despite (as far as I can tell) never having designed anything, Linus is an expert on everything, I do not need to watch yet another clickbait video of his to know which of these three options I think is most likely true.
 
There are three possibilities:
  1. Apple engineers are completely incompetent and just got lucky with their M1 version not needing more cooling than they were able to provide.
  2. Apple consciously built an inferior design so that their machines would look bad compared to competitors' systems for the several years before they were able to release their own silicon, so that they would be able to show improved performance that was artificially better.
  3. Intel promised Apple a set of specs around which Apple built a design, and did not deliver. Apple did not change it because they were promised that the next one would really meet the promised specs (and again did not). Finally, Apple did not change it because they had designed their silicon to exceed those specs and it made sense to not change that design at the same time as they were trying to lock down their new architecture.
While I understand that despite (as far as I can tell) never having designed anything, Linus is an expert on everything, I do not need to watch yet another clickbait video of his to know which of these three options I think is most likely true.

Given that Intel publicly released roadmaps with TDP figures that they never actually reached, because they were stuck on 14nm for years, it seems pretty clear to me which answer is right.
 
Given that Intel publicly released roadmaps with TDP figures that they never actually reached, because they were stuck on 14nm for years, it seems pretty clear to me which answer is right.
I guess the way to think about it is like a paraphrase of the old Marx Brothers quote: who are you going to believe, your own reading of Intel’s published specs or a clickbait-y demagogue from the internet? :)
 
I agree that Intel has those shortcomings, but that's not to say they can't overcome them.

If you think Apple hasn't gimped the cooling systems of the Intel Macbooks, then you need to watch these vids. The short version is:
1) The heatsink is incorrectly mounted so it isn't flush against the chip, and thus is horribly inefficient. This same problem doesn't exist on the M1.
2) On the 2020 Intel MBA, the fan is offset a long way from the heatsink, which normally wouldn't be a problem, as you'd connect them with a heat pipe. But Apple took the heat pipe out! Yep, the fan spins up when the chip heats up (which it does quickly, because the heatsink isn't mounted correctly), but it doesn't actually cool anything because there is no heat pipe; it just sits there noisily spinning away, achieving no actual cooling, until the user stops using the laptop. The M1 MBA, by contrast, has dumped the fan and put a massive heat spreader in its place that connects efficiently to the heatsink. The same solution could of course have been used on the Intel MBA, greatly improving its cooling, and thus performance, with no fan noise. Yes, the Intel chips do run hotter, so such a solution would result in quicker throttling than on the M1s, but it would be much slower to throttle than the actual gimped design Apple used.
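The effect of the missing heat pipe can be sketched with a toy steady-state thermal-resistance model; the `die_temp` helper and every number below are invented for illustration, not measured MacBook values:

```python
# Toy lumped thermal model: heat flows from the die through the heatsink
# to ambient air. A heat pipe to the fan's airflow gives a low
# sink-to-air resistance; without it, the fan's airflow never touches
# the heat path and passive convection dominates.

POWER_W = 10.0    # assumed steady package power (illustrative)
AMBIENT_C = 25.0  # assumed room temperature

def die_temp(r_die_to_sink: float, r_sink_to_air: float) -> float:
    """Steady-state die temperature (deg C) for a series thermal path.
    Resistances are in deg C per watt: T = T_ambient + P * R_total."""
    return AMBIENT_C + POWER_W * (r_die_to_sink + r_sink_to_air)

# With a heat pipe feeding the fan's airflow (low sink-to-air resistance):
with_pipe = die_temp(r_die_to_sink=0.5, r_sink_to_air=2.0)     # 50.0 C
# Without the heat pipe, same fan spinning, much higher resistance:
without_pipe = die_temp(r_die_to_sink=0.5, r_sink_to_air=7.0)  # 100.0 C
print(with_pipe, without_pipe)
```

The point is just that removing the low-resistance path to the fan's airflow is equivalent to raising the sink-to-air resistance, and die temperature scales with total resistance no matter how fast the fan spins.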

In the second video, he talks about how he isn’t against preventing the user getting burned by the laptop! Why do Apple users put up with this problem? Stop putting the hot CPU in the laptop; put it in a separate, properly ventilated, battery-powered, new small-form-factor Mac Pro that either connects wirelessly to the laptop or uses one USB cable, if that’s necessary for Thunderbolt graphics.

(this is my 98th post!)
 
There are three possibilities:
  1. Apple engineers are completely incompetent and just got lucky with their M1 version not needing more cooling than they were able to provide.
  2. Apple consciously built an inferior design so that their machines would look bad compared to competitors' systems for the several years before they were able to release their own silicon, so that they would be able to show improved performance that was artificially better.
  3. Intel promised Apple a set of specs around which Apple built a design, and did not deliver. Apple did not change it because they were promised that the next one would really meet the promised specs (and again did not). Finally, Apple did not change it because they had designed their silicon to exceed those specs and it made sense to not change that design at the same time as they were trying to lock down their new architecture.
While I understand that despite (as far as I can tell) never having designed anything, Linus is an expert on everything, I do not need to watch yet another clickbait video of his to know which of these three options I think is most likely true.

Guess it’s a bit of everything. Intel wasn’t able to deliver the chips that Apple wanted, while Apple was unwilling to make their laptops thicker in order to accommodate better cooling.

In a way, I would blame Apple more so than Intel. Intel doesn’t have to cater to anybody. They release the processors they have available, and it’s really “take it or leave it” for the other PC manufacturers. And that is what everyone else did, because they had no other choice.

Which reminds me of that saying about how unreasonable people change the world to suit themselves. This is basically what Apple has done here. If Intel would not give them the chips in the specifications they wanted so that Apple could make their laptops the way they wanted, then Apple would just have to come up with their own chips then.

This is the Apple I fell in love with. The Apple that just marches to its own beat and doesn’t care two hoots about what the rest of the world thinks or does.
 
Guess it’s a bit of everything. Intel wasn’t able to deliver the chips that Apple wanted, while Apple was unwilling to make their laptops thicker in order to accommodate better cooling.

In a way, I would blame Apple more so than Intel. Intel doesn’t have to cater to anybody. They release the processors they have available, and it’s really “take it or leave it” for the other PC manufacturers. And that is what everyone else did, because they had no other choice.

Which reminds me of that saying about how unreasonable people change the world to suit themselves. This is basically what Apple has done here. If Intel would not give them the chips in the specifications they wanted so that Apple could make their laptops the way they wanted, then Apple would just have to come up with their own chips then.

This is the Apple I fell in love with. The Apple that just marches to its own beat and doesn’t care two hoots about what the rest of the world thinks or does.

“Intel doesn’t have to cater to anybody.” Sure, they don’t *have* to. But if you are in the business of selling stuff and you don’t sell what your customers want, it will end badly for you.

As Intel is about to find out.
 
Intel doesn’t have to cater to anybody. They release the processors they have available, and it’s really “take it or leave it” for the other PC manufacturers. And that is what everyone else did, because they had no other choice.

That is like saying AMD didn't have to listen to M$ or Sony when developing their APUs for the next-gen consoles...

How do you think that would have gone over...?

“Intel doesn’t have to cater to anybody.” Sure, they don’t *have* to. But if you are in the business of selling stuff and you don’t sell what your customers want, it will end badly for you.

As Intel is about to find out.

^ This exactly...
 
Just speculation...

Given the move to on-package RAM on the M1, would it make sense for Apple to offer a Mac Pro with the option of multiple SoCs, each sold as an upgrade board?

e.g. Base model: 16+4 cores, 64 GB RAM (so an upgrade of the current M1). Then with options for up to 4 SoCs, maxing out at a total of 64+16 cores and 256 GB RAM.

A modular Mac Pro like this seems much more doable with Apple designing/controlling all of the components...

But would individual processes that require more than 64 GB of RAM be able to access RAM split across multiple SoCs? Would that be possible? And of course, it's still a problem for tasks that require >256 GB...

To me this seems more likely than achieving a decent yield of 32- or 64-core processors, also with massive amounts of RAM (128 GB+), on the SoC.

Or will all the RAM be off-chip and upgradeable? But I feel like that is not where Apple is heading...
 
Just speculation...

Given the move to on-package RAM on the M1, would it make sense for Apple to offer a Mac Pro with the option of multiple SoCs, each sold as an upgrade board?

e.g. Base model: 16+4 cores, 64 GB RAM (so an upgrade of the current M1). Then with options for up to 4 SoCs, maxing out at a total of 64+16 cores and 256 GB RAM.

A modular Mac Pro like this seems much more doable with Apple designing/controlling all of the components...

But would individual processes that require more than 64 GB of RAM be able to access RAM split across multiple SoCs? Would that be possible? And of course, it's still a problem for tasks that require >256 GB...

To me this seems more likely than achieving a decent yield of 32- or 64-core processors, also with massive amounts of RAM (128 GB+), on the SoC.

Or will all the RAM be off-chip and upgradeable? But I feel like that is not where Apple is heading...
But there is no need for multiple Secure Enclaves, or any of the other specialized hardware Apple places on their current Mac SoCs...

Apple may have decided yields be damned & have a massive monolithic SoC (28 Performance cores / 4 Efficiency cores / 32 GPU cores / 24 Neural Engine cores / 64GB HBM3), with a memory subsystem that allows access of up to 1TB of DDR5 (eight 128GB DIMMs) & maybe even PCIe Gen5 x16 slot(s) for Apple Silicon (GP)GPU(s)... 64 & 128 core variants...! Is there local memory on these mystery ASi GPUs...? Tune in next week to find out...!

Ah, Soap...
 
“Intel doesn’t have to cater to anybody.” Sure, they don’t *have* to. But if you are in the business of selling stuff and you don’t sell what your customers want, it will end badly for you.

As Intel is about to find out.
I was going to mention the special Pentium CPU in the original MacBook Air which was created with input from Apple. Of course Intel cater to requirements from big customers. It's just that they've been unable to meet their own targets but Apple are a customer with the ability to create an alternative option without needing to switch to AMD.
 
Just speculation...

Given the move to on-package RAM on the M1, would it make sense for Apple to offer a Mac Pro with the option of multiple SoCs, each sold as an upgrade board?

e.g. Base model: 16+4 cores, 64 GB RAM (so an upgrade of the current M1). Then with options for up to 4 SoCs, maxing out at a total of 64+16 cores and 256 GB RAM.

A modular Mac Pro like this seems much more doable with Apple designing/controlling all of the components...

But would individual processes that require more than 64 GB of RAM be able to access RAM split across multiple SoCs? Would that be possible? And of course, it's still a problem for tasks that require >256 GB...

To me this seems more likely than achieving a decent yield of 32- or 64-core processors, also with massive amounts of RAM (128 GB+), on the SoC.

Or will all the RAM be off-chip and upgradeable? But I feel like that is not where Apple is heading...
I've been wondering how Apple might implement a Mac Pro-style design and keep RAM replaceable. Yes, there are efficiency benefits to be had by keeping RAM and GPU on package, to the point where 16GB of RAM on an M1 Mac appears to be theoretically the same as having 32GB on Intel (my extrapolation, based on various YouTube performance videos suggesting 8GB on an M1 system is like having 16GB on an Intel system, depending on the task at hand and native software).

Apple seem to be moving towards TBDR (tile-based deferred rendering) for graphics, which is all about efficiency again, rather than what appears by comparison to be a brute-force approach from NVIDIA and AMD. If Apple can indeed get good performance per watt out of CPU+GPU packages, that efficiency-first approach should serve them well across the range.

On the face of it, this whole discussion reminds me of planar vs. chunky graphics back in the 16-bit Amiga vs. ST/Windows PC days. Windows PCs won that because of increased investment and hardware that accepted more power-hungry upgrades, with an ever-increasing user base thanks to the popularity of Microsoft Windows, until we reach the place where we are today, with CPUs and PC graphics cards needing loads more power.

I recall from the 1980s that back-porting Atari ST software to the Amiga was relatively easy (this happened because the ST user base was larger), but the Amiga was hamstrung because it used a 7.14 MHz 68000 CPU whereas the ST used an 8 MHz one, and the back-ported software never touched the extra coprocessors that could have made the Amiga version much better.

I would suggest that all of this points towards Apple wanting to create energy-efficient hardware, which will require special coding approaches (different from today's Windows PC development styles) applied across an existing huge user base. Rather than trying to cope with Intel ports, Apple could be looking forward to using the iOS user base to get developers to think about the new techniques.

Although ARM CPUs exist elsewhere, they won't have the extra huge user base that might have persuaded coders to be lazy, do a version that worked on Android/Switch, and then back-port it to iOS. It's iOS that has the huge user base with revenue.

Back on topic though, there's still a top SKU Mac mini plus iMac and (if Apple want to) iMac Pro to update. I don't think people would blink too much if Apple didn't offer RAM upgradability on any of these - especially if the 27" variants were replaced by an all-new 4.6K 24" iMac.

The Mac Pro might be a more emotive subject though, and I can't see how Apple could compromise on the performance benefits of having everything on chip. The way things are going though, it seems possible to me that the Mac mini body shell could accommodate a much more capable headless Mac spec leaving potentially a future Apple Silicon Mac Pro to appear much better value for money as Intel's pricing for their Xeons is exorbitant to say the least.

For example, there probably wouldn't be arguments if an M2X Mac Pro came with 128GB RAM and a 1TB SSD on package as base spec for the same money, and there was an Apple PCIe 5.0 graphics accelerator in 2022. It would be the only way to sell an 'unexpandable' Mac Pro in future, if a decently specified iMac or Mac mini was capable of doing more beneath it.
 
In the second video, he talks about how he isn’t against preventing the user getting burned by the laptop! Why do Apple users put up with this problem? Stop putting the hot CPU in the laptop; put it in a separate, properly ventilated, battery-powered, new small-form-factor Mac Pro that either connects wirelessly to the laptop or uses one USB cable, if that’s necessary for Thunderbolt graphics.

(this is my 98th post!)
I presume you're joking? There's nothing particularly hot about Intel mobile processors if the cooling solution is done properly. My 2015 Retina MBP runs cool enough to sit comfortably on my lap in my boxers on a hot summer's day. The Retina generation of MBPs remain the best laptops Apple has ever made. I'm still waiting for them to make something worthy of upgrading to.

That said, the M1 MBA would be a worthy contender if it came in a 16" screen version. And yeah, I know a 16" MBP is coming, but it will be cursed by the touchbar, so...
 
There are three possibilities:
  1. Apple engineers are completely incompetent and just got lucky with their M1 version not needing more cooling than they were able to provide.
  2. Apple consciously built an inferior design so that their machines would look bad compared to competitors' systems for the several years before they were able to release their own silicon, so that they would be able to show improved performance that was artificially better.
  3. Intel promised Apple a set of specs around which Apple built a design, and did not deliver. Apple did not change it because they were promised that the next one would really meet the promised specs (and again did not). Finally, Apple did not change it because they had designed their silicon to exceed those specs and it made sense to not change that design at the same time as they were trying to lock down their new architecture.
While I understand that despite (as far as I can tell) never having designed anything, Linus is an expert on everything, I do not need to watch yet another clickbait video of his to know which of these three options I think is most likely true.
But Apple actually put the fan away from the heatsink and didn't connect it with a heat pipe. And they actually mounted the heatsink so that it wasn't even flush against the chip. That's not building to a "promised set of specs", that's a 100% gimped design x2. It doesn't matter what spec Intel delivered; those two thermal design fails wouldn't work regardless.

And Apple didn't get lucky with their M1 design; it's a simple, basic cooling solution, just a heatsink connected to a heat spreader. This isn't some sort of cutting-edge engineering we are talking about; it's a basic, simple, well-known design that any engineering team in the world could get right in their sleep. I'm an electrical engineer, and I'm telling you, this design is deliberately gimped; there's no other sane explanation. If you don't believe me and you've got any friends who are engineers, then I challenge you to show those two vids to them and watch the shock on their faces when they see what Apple have done.
 
Just speculation...

Given the move to on-package RAM on the M1, would it make sense for Apple to offer a Mac Pro with the option of multiple SoCs, each sold as an upgrade board?

e.g. Base model: 16+4 cores, 64 GB RAM (so an upgrade of the current M1). Then with options for up to 4 SoCs, maxing out at a total of 64+16 cores and 256 GB RAM.

A modular Mac Pro like this seems much more doable with Apple designing/controlling all of the components...

But would individual processes that require more than 64 GB of RAM be able to access RAM split across multiple SoCs? Would that be possible? And of course, it's still a problem for tasks that require >256 GB...

To me this seems more likely than achieving a decent yield of 32- or 64-core processors, also with massive amounts of RAM (128 GB+), on the SoC.

Or will all the RAM be off-chip and upgradeable? But I feel like that is not where Apple is heading...
This is possible, but there are technical trade-offs with multi-socket architectures. You have to make the interconnects as fast as possible, and there are physical limits (the speed of light) when components are any significant distance from each other. You can indeed share RAM across processors, where each processor favors its "local" RAM but can access RAM used by other processors (non-uniform memory access: en.wikipedia.org/wiki/Non-uniform_memory_access).

There are already 128-core ARM-based chips in production (based on ARM Neoverse N1), so building 32-64 core chips is not a big stretch for Apple. The RAM is not on the primary die (it's on the SoC package, but on separate chips), so this does not impact the yield.

I think it is far more likely that Apple will develop a 32-core Mx chip with 4-8 efficiency cores than go with a multi-socket solution, at least in the next couple of years. The unknown is whether they will also increase the performance of the integrated GPU, or break it out into a separate die. Apart from the potential yield efficiencies of two dies vs. one, I imagine it's easier to cool if the CPU and GPU cores are not on the same die.
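To put a rough number on the NUMA trade-off described above, here's a toy Python model; the `effective_latency` helper and the latency figures are invented for illustration, not measured Apple Silicon or Neoverse values:

```python
# Toy NUMA model: average memory latency as a function of how often a
# core's accesses land in its own SoC's "local" RAM vs. RAM attached to
# another SoC across the interconnect.

LOCAL_NS = 100   # assumed latency to local on-package RAM
REMOTE_NS = 300  # assumed latency across the inter-SoC interconnect

def effective_latency(local_fraction: float) -> float:
    """Average access latency (ns) when `local_fraction` of accesses
    hit local RAM and the rest go over the interconnect."""
    return local_fraction * LOCAL_NS + (1 - local_fraction) * REMOTE_NS

# A NUMA-aware OS that keeps 90% of accesses local stays close to the
# local latency; naively striping a >64 GB working set evenly across
# four SoCs (25% local) pays mostly the interconnect penalty.
print(effective_latency(0.9))   # 120.0
print(effective_latency(0.25))  # 250.0
```

This is why the answer to "can one process use RAM split across multiple SoCs" is yes in principle, but performance then depends heavily on how well the scheduler and allocator keep data local.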
 
The guy in the second video, trying to fix the gimped cooling, removed insulating material, and I don’t think he was joking about it being there to protect the user from burns. Comfort must be sacrificed to get full performance. That doesn’t have to happen if we take the CPU out of the laptop.
 
The guy in the second video, trying to fix the gimped cooling, removed insulating material, and I don’t think he was joking about it being there to protect the user from burns. Comfort must be sacrificed to get full performance. That doesn’t have to happen if we take the CPU out of the laptop.
Dude, give it up...

Apple is not going to make a laptop that requires a second chassis to house the SoC, and a cable to connect the two...

Kinda kills the whole idea of a laptop as a mobile device, and before you start on how the Mac mini that actually houses the SoC for your "Comfy Couch Laptop" has a battery & therefore is portable as well, do you REALLY think Apple would ever go with such a kludgy system...?
 
Dude, give it up...

Apple is not going to make a laptop that requires a second chassis to house the SoC, and a cable to connect the two...

Kinda kills the whole idea of a laptop as a mobile device, and before you start on how the Mac mini that actually houses the SoC for your "Comfy Couch Laptop" has a battery & therefore is portable as well, do you REALLY think Apple would ever go with such a kludgy system...?
I think a combination of a Mac Mini with a separate battery pack with AC inverter and an iPad, MacBook Air or Chromebook would meet @HowardEv 's requirement...but I don't see this as something that Apple would build.
 
I’m hoping Apple wows us with a wireless video connection. Logic would run the GUI on the cool laptop and do the audio processing on the portable CPU. Then the engineer could also be surfing the web on the couch without affecting playback of the mix or recording from the interface.
 