
MacRumors

macrumors bot
Original poster


Apple this "past summer" canceled the development of a high-performance Mac chip that would have consisted of four smaller chips stitched together, in order to free up engineering resources for a planned AI server chip, according to The Information.

M4-Extreme-Cancelled.jpg

Based on the report's description of the chip, it sounds like Apple has canceled a previously-rumored "Extreme" chip for the Mac. It was previously reported that an "M2 Extreme" chip was scrapped a few years ago, but perhaps Apple had revisited the idea since then. In any case, it now sounds like an "M4 Extreme" chip is also unlikely.

Apple likely would have introduced the "M4 Extreme" in its high-end Mac Pro tower. The chip would have offered even faster performance than the M4 Ultra chip that is expected to launch in new Mac Studio and Mac Pro models later next year.

If the "M4 Extreme" were to have been a quadrupled version of the M4 Max chip that debuted in the MacBook Pro a few months ago, it would have had massive specifications, including up to a 64-core CPU and up to a 160-core GPU.

While the "Extreme" chip may be off the table once again, it seems like Apple has repeatedly shown interest in developing such a chip, so perhaps it will eventually materialize as part of the M5 series or later. For now, though, the wait continues.

Update: After this story was published, Daring Fireball's John Gruber made a good point about how there is a long, multi-year gap between Apple designing and shipping new chips. Accordingly, it is possible the latest chip canceled actually would have been an "M5 Extreme" chip or later if development was only recently ended.

Article Link: 'M4 Extreme' Chip Unlikely After Apple 'Cancels' High-Performance Chip [Updated]
 
Well, you would also need software to do the task assignment for all those cores, and that may have been too much for Apple.
Too bad. It's probably a sign of the end of the road for those M chips; at the very least we're moving toward a plateau, which will probably be countered with higher clock speeds.
 
It was reported that Apple scrapped the "M1 Extreme" because it was too expensive to manufacture. Sounds like the same has hit the "M2 Extreme" and "M4 Extreme".

Maybe when we are in mainstream sub-micron process ranges using chiplets the cost will become acceptable.
 
I've been wondering for years now why Apple doesn't make another dual-processor Mac Pro, as they did in the PPC and early Intel days. It would greatly strengthen the case for the Mac Pro over the Mac Studio and would actually provide enough bandwidth for all the expansion slots in there (right now the SoC doesn't have enough bandwidth to run every expansion slot at full speed at the same time).
 
I've been wondering for years now why Apple doesn't make another dual-processor Mac Pro, as they did in the PPC and early Intel days.

The Apple Silicon architecture may not be designed for multi-processor operation, so performance would not scale, and you could hit issues with memory access and cache coherency.

This sounds like what is happening with these "Extreme" SoCs: the Ultra is not twice as fast as the Max even though it is two Maxes connected directly. Two Ultras talking to each other across an external connection on a motherboard would be even worse.
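The sub-linear scaling described above can be sketched with a toy Amdahl's-law calculation. The efficiency numbers here are illustrative assumptions, not Apple's measured figures:

```python
# Toy model of multi-die scaling: some fraction of the work is serialized
# by the interconnect, so doubling dies never doubles throughput.
# The 90% parallel fraction below is an illustrative assumption.

def speedup(n_dies: int, parallel_fraction: float) -> float:
    """Amdahl's law: speedup over one die when only part of the
    workload scales across dies."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_dies)

# Suppose 90% of a GPU workload scales across the die-to-die fabric:
for dies, label in [(2, "Ultra (2x Max)"), (4, '"Extreme" (4x Max)')]:
    print(f"{label}: {speedup(dies, 0.90):.2f}x one Max")
```

Even with a generous 90% parallel fraction, two dies give roughly 1.8x and four dies only about 3.1x; a slower off-die link between two Ultras would push the serial fraction higher and the scaling lower still, which is the point above.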
 
This is anecdotal, but I got an M1 Ultra a couple of years back to test training some large language models that wouldn't fit on NVIDIA cards like an RTX 6000 Ada. The latter has 48GB of VRAM, while the M1 Ultra effectively has ~98GB, albeit at much slower training speeds.

It did fine for a few models, but then something bizarre happened: anything with an odd-numbered batch size returned only infinite gradients and NaN losses. Even-numbered batch sizes could sort of train, but gradients exploded frequently. I couldn't replicate this on my NVIDIA hardware or even on an M3 Max.

This is only a hypothesis on my end, but I worry that the die-to-die connection between chips is fragile and error-prone, particularly under these demanding high-throughput workloads. That could leave a Mac behaving normally otherwise but useless for AI applications. My M1 is still under AppleCare, but I don't know how to even begin describing the problem to Apple support.
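A symptom like that can be narrowed down with a small batch-size sweep before going to support. This is a hypothetical pure-Python harness (a one-parameter linear model standing in for the real network); on working hardware every batch size should produce finite gradients, so any size that doesn't is evidence for the report:

```python
import math
import random

def grad_for_batch(batch):
    """Mean-squared-error gradient of y = w*x at w = 0.5."""
    w = 0.5
    return sum(2 * (w * x - y) * x for x, y in batch) / len(batch)

random.seed(0)
data = [(x, 2.0 * x) for x in (random.uniform(-1, 1) for _ in range(64))]

# Sweep odd and even batch sizes and flag any non-finite gradients,
# mirroring the odd/even split described in the post.
for bs in (3, 4, 5, 7, 8, 16):
    grads = [grad_for_batch(data[i:i + bs])
             for i in range(0, len(data) - bs + 1, bs)]
    ok = all(math.isfinite(g) for g in grads)
    print(f"batch size {bs}: {'finite' if ok else 'NON-FINITE gradients'}")
```

On a CPU this sweep always comes back finite; the idea would be to run the equivalent sweep through the Ultra's GPU backend and record exactly which sizes fail, which is something concrete to hand to Apple support.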
 
Well you would also need software to do the task assigning for all those cores, and that may have been too much for Apple.
Too bad, probably already a sign of the end of the road for those M chips, at the very least we’re moving towards a plateau which will probably be countered with an increase in clock speeds.
there are plenty of workloads suited for this. It's just not common for most people to need to compile browsers or train massive AI models all day
 
This is anecdotal, but I got an M1 Ultra a couple of years back to test training some large language models that wouldn't fit on NVIDIA cards like an RTX 6000 Ada. The latter has 48GB of VRAM, while the M1 Ultra effectively has ~98GB, albeit at much slower training speeds.

It did fine for a few models, but then something bizarre happened: anything with an odd-numbered batch size returned only infinite gradients and NaN losses. Even-numbered batch sizes could sort of train, but gradients exploded frequently. I couldn't replicate this on my NVIDIA hardware or even on an M3 Max.

This is only a hypothesis on my end, but I worry that the die-to-die connection between chips is fragile and error-prone, particularly under these demanding high-throughput workloads. That could leave a Mac behaving normally otherwise but useless for AI applications. My M1 is still under AppleCare, but I don't know how to even begin describing the problem to Apple support.
The M1 Ultra is a mess and more of a prototype. There are many issues with the chip-to-chip stuff on that SoC, which the M2 Ultra mostly fixed. I've seen the most problems with GPU parallelism, which is wonky as hell and doesn't scale like it should. If you have a scenario where you can repeat this behavior, I'd be very interested to see it run on an M2 Ultra or the upcoming M4 Ultra.

I still expect Apple will release a desktop focused CPU by 2028, and probably fold back in the stuff they did on the server end. I’d re-allocate resources this way also, assuming the rumor is true.

Given this news I'm glad I went for the highest-end M4 Max MBP; I'll upgrade to a Studio or Pro in 2-3 years depending on when this thing finally launches. I really want full ECC for memory configurations above 128GB and don't need E-cores for my work, so that rumored Hidra chip would have been nice, and I think it will still happen eventually.

edit: Gruber raised a good point that if teams were only recently moved around, the canceled work would be for silicon due 2+ years from now, so we could still see something in the next year or two, then a one-to-two-generation skip, then another one. Depending on their roadmap that could make sense.
 
"While the "Extreme" chip may be off the table once again, it seems like Apple has repeatedly shown interest in developing such a chip," - if there's never been anything but rumor reporting on such a chip, I don't know how you could come to the conclusion that Apple has shown any, much less 'repeated' interest in such a chip.
 
It's a great day for all the commentators here who long for the days of yore and insist Apple stop making their chips faster and devices even smaller and thinner. Big big win for team mediocrity.
Not to cast sunshine on your rainy parade, but the article does suggest that the extreme could return with a future M series.
 
Not to cast sunshine on your rainy parade, but the article does suggest that the extreme could return with a future M series.
The M4 also has the fastest single-core performance of any consumer CPU right now; they're hardly slacking.

GPU performance is way up and will reach RTX 4090 levels in many tasks this coming year, and if Nvidia keeps its 2-3 year cycle I think Apple will probably catch up on the desktop, which I would never have said 5 years ago.

They’re doing great overall.

"While the "Extreme" chip may be off the table once again, it seems like Apple has repeatedly shown interest in developing such a chip," - if there's never been anything but rumor reporting on such a chip, I don't know how you could come to the conclusion that Apple has shown any, much less 'repeated' interest in such a chip.

Multiple engineers have confirmed this, so it isn't a rumor so much as a leak. The product part is the real rumor – they can and do explore a lot of options internally and never release them.
 
I'm still holding out hope for this chip. Here's my reasoning: 1) if the M4 Max has no fusion strip, then the M4 Ultra would likely be a single-die chip. 2) If the Ultra does in fact have a fusion strip, then the "Extreme" could be two Ultras fused together rather than four Maxes stitched together. 3) While this dream chip would not be a linear 2x of the Ultra, it could be exclusive to the Mac Pro, and with appropriate pricing (plus full bandwidth on each expansion slot), the Pro's value could increase considerably, at say $9k for the Extreme version.
 