Aside: any regular Apple hardware reviewer putting out a video about the M3 Ultra having issues is entirely missing the point. They've traditionally based their reviews on the work they do - video editing - and that's iPad Pro territory these days. That problem is solved by low-to-mid consumer hardware at this point.

Those looking at Ultras should be people doing 3D renders with massive geometry/texture data, people running LLMs, and other high-end workloads. Essentially, workloads that are VRAM-bottlenecked, where it doesn't matter how fast the GPU is if you don't have enough VRAM for it.

4K or 8K video editing is no longer a high-end workload in 2025.
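The VRAM-bottleneck argument is easy to put numbers on: an LLM's weights alone need roughly parameter count × bytes per parameter, before KV cache and overhead. A quick sketch (the model sizes and precisions below are illustrative, not tied to any particular release):

```python
# Rough VRAM footprint for running an LLM: weights dominate, so
# bytes ≈ parameter count × bytes per parameter at the chosen precision.
# Sizes/precisions are illustrative assumptions for the sake of argument.

def weight_footprint_gb(params_billions: float, bits_per_param: float) -> float:
    """Approximate weight memory in GB (ignores KV cache and runtime overhead)."""
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

for params, bits in [(8, 4), (70, 4), (70, 16), (405, 4)]:
    print(f"{params}B params @ {bits}-bit ≈ {weight_footprint_gb(params, bits):.0f} GB")
```

At these sizes a 70B model quantized to 4 bits fits in under 48 GB, while a 400B-class model wants 200+ GB - territory where unified memory capacity, not GPU speed, decides whether the workload runs at all.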
 
You appear to be making the case for the Ultra, not the case for an M3 Ultra. Why is it so wrong to want an M4 Ultra?
 
There's nothing wrong with wanting one, but it doesn't exist, likely due to manufacturing teething issues with the M4 and/or the M4 design not being fully shaken out in the real world yet (i.e. Apple doesn't want to commit to building massive M4-based chips until it's certain it isn't baking in issues that could otherwise be caught and corrected in the M4 Pro/Max).

Ultras are very expensive to make, being large chips, and no CPU vendor builds chips that large on unproven designs, whether they're Intel, AMD, or Apple.

The M3 Ultra is what exists at the moment.
 

I wonder how many of the high-quality silicon wafers for the M4/M5 have gone to Apple Intelligence's PCC (Private Cloud Compute)? Maybe that is also a contributor, aside from yield.
 
It never surprises me to see the number of people who defend Apple no matter the circumstances.

It always surprises me that people can't accept that others don't share their negative views. I am not going to apologize for liking my M3 Ultra just because you have made up your mind without trying one :)
Whether it's for everyone or not is not the point. The point is Apple released a new, top end product based on their last generation technology while moving everything else to their current technology.

It's not your point. It is Apple's point. You want to complain, you do you.

Please stop criticizing people who feel that releasing a new product based on last-generation technology, when everything else has moved to current technology, is questionable. Especially when the product utilizing last-generation technology is the high-end offering.

The irony. Okay, sorry if my comments offended you. Did you attack people making sweeping statements about how it's a failure? Of course not. It matches your opinion.

But yeah, sorry, there are a lot of YouTubers that just put out clickbait. I am not even sure they believe what they say. But sure, I suppose I am criticizing them when I point out the obvious.

Apple has a history of not playing the specs game for specs' sake. They get a lot of flak for it, but they take the stance that it's not specs that drive the user experience, and I agree. I couldn't care less what the name of the chip is. It does the job very well. That's what I care about.
 
Ultras are very expensive to make, being large chips, and no CPU vendor builds chips that large on unproven designs, whether they're Intel, AMD, or Apple.
Not really - EPYC is on a better node, and they're the biggest CPUs out there.
 
The reason for the M3 Ultra showing up now could be as simple as N3B capacity freeing up as everything else moves to the M4, which may also explain why there's no M4 Ultra - Apple choosing how to allocate available manufacturing capacity.

Which is just another way of saying "TimApple just likes screwing his most loyal customers over on the products with the highest margins."
If there was a surplus of N3B capacity, putting it towards a low-volume, high-margin halo product is not the way to go - unless you just can't make them on a better node anyway.

Not monolithic chips, but then again, neither would a hypothetical M4 Ultra be.

Yes and no.
An (M1/M2) Ultra is made up of two Max dies in a specific order. So if die 24 goes with die 25 and 25 is defective, you can't pair 24 with 23 (or any other die). So if all you want is Ultras, you may as well treat the area taken up by those two Max dies as one chip (monolithic also implies certain behavior in software, but we are purely talking manufacturing here).
With the M1/M2 you could still salvage one good Max chip from a defective Ultra, but it seems that might not be possible with the way the actual M3 Max was made - and it still wouldn't matter, as Apple has no further use for M3 Max chips.
 
Not really - EPYC is on a better node, and they're the biggest CPUs out there.

They're nothing alike - EPYC is a heap of little dinky Zen dies on a modern process, with a central IO die on a cheap old process. No GPU at all, which is the majority of the die size on an Mx Max - and it's not running anywhere near the same clock (basically HALF the clock speed, which makes the silicon far more tolerant of manufacturing defects).

Much cheaper to make, as you're not dealing with the defect rate on a single huge monolithic die, or even a couple of large monolithic dies including GPU, Thunderbolt controllers, display controllers, etc.
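The defect-rate point can be sketched with the textbook Poisson yield model, yield ≈ exp(-defect density × die area). The defect density and die areas below are assumed round numbers for illustration, not actual TSMC figures:

```python
# Classic Poisson yield model: yield = exp(-defect_density * area).
# All numbers are illustrative assumptions, not real foundry data.
import math

def die_yield(defect_density_per_cm2: float, area_mm2: float) -> float:
    """Fraction of dies with zero killer defects under a Poisson model."""
    return math.exp(-defect_density_per_cm2 * area_mm2 / 100)  # mm^2 -> cm^2

d = 0.1  # assumed defects per cm^2
mono = die_yield(d, 800)    # one huge monolithic SoC-class die (~45% yield)
chiplet = die_yield(d, 70)  # one small CPU chiplet (~93% yield)
print(f"monolithic 800 mm2 yield: {mono:.0%}")
print(f"chiplet 70 mm2 yield:     {chiplet:.0%}")
```

A chiplet package only discards the individual bad chiplets, not the whole assembly, so the effective silicon waste is far lower than for a monolithic die of the same total area - which is exactly why big monolithic parts are only built on proven designs.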

Not saying what AMD is doing is bad or anything - it's a great idea for what it is and the market it's aimed at - but it is nothing like the Mx Ultra, which has a much, much faster interconnect between the two Max dies. It's not just a bunch of small dies linked together on a substrate.

A more valid comparison would be the individual 8-core Zen chiplets against M4 Max dies. They're not even close in terms of size and complexity - or in the clock they run at when constructed as an EPYC.

They're different products aimed at different markets, and AMD's design has a bunch of tradeoffs which aren't a problem for running a buttload of VMs on a server but which aren't suitable for the interactive user workloads Apple runs on the M series.
 
It always surprises me that people can't accept that others don't share their negative views. I am not going to apologize for liking my M3 Ultra just because you have made up your mind without trying one :)

I haven't used an M3 Ultra as I am still using my M1 Ultra.

The irony. Okay, sorry if my comments offended you. Did you attack people making sweeping statements about how it's a failure? Of course not. It matches your opinion.
You didn't offend me as I haven't been the target of any such criticism. I am just wondering why people here feel the need to defend Apple and criticize anyone who questions their decisions no matter what the situation.

As for how I feel about it: when the M3 Ultra was announced I said it felt "weird" to be releasing the top-end product with last year's technology, and I stand by that. However, I did say, back in post #37 of this thread:

That said a person needs to evaluate a system based on their specific needs and buy accordingly. Use the tool which makes you the most productive even if it uses "older" technology.

I have no dog in this fight.
 
You didn't offend me as I haven't been the target of any such criticism. I am just wondering why people here feel the need to defend Apple and criticize anyone who questions their decisions no matter what the situation.
It's not so much blind Apple defense as simply framing your expectations against the physical reality of CPU manufacturing.

The Ultra SoCs are really difficult to build. They have always lagged the A-series and base M-series parts in architecture.

This isn't new. It's to be expected. Every other vendor's high-core-count parts lag their low-end/consumer parts by 12 months or more.

Expecting otherwise is simply unrealistic.

No other vendor is releasing high-core-count parts on their current consumer architecture, and for many good reasons. Period.
 
I am just wondering why people here feel the need to defend Apple and criticize anyone who questions their decisions no matter what the situation.

Sorry, I wouldn't know, and I have no clue why you singled me out to ask me. It's not even remotely close to the topic of this thread. I was commenting on some of the conspiracy theories about the M3 Ultra and the broad statements of "failure" based on what is obviously YouTube clickbait.

But since you persist in asking me about people irrationally defending Apple (from your perspective), I will answer so we can move on. I see no objective evidence for an irrational defense, just a lot of accusations from people who find a technical discussion inconvenient and try to dismiss it by getting personal and suggesting bias. Saying there is bias doesn't make it true, and it is one of the weakest arguments out there. I wish people would just drop the claims of bias and get back to the technical discussion. Don't like what I say? Point out the fallacy in my logic. Can't? Voice your own opinion, but don't drag me into it. Speaking in general, not about you in particular.

I DO think there is a fundamental difference in perspective represented here on MR that results in similar accusations. You have people such as myself who see technical products as tools to be used. My major questions are: does the tool do what I want, how easy is the tool to use, how durable is it, and if it doesn't do what I want, what other tools might? I don't waste my time wondering why the company didn't make the tool I want, or putting them down for it. Frankly, they know a lot more about the business of making tools than me or anyone here. Then you have people who assume all these things as if they were some right, and obsess over their perceived negatives or, worse, over cosmetic details, as if color were all that important to tool function. Very different perspectives.
 
Sorry, I wouldn't know, and no clue why you singled me out to ask me.
I didn't single you out. I just happened to respond to your post because it was your post that stated YouTube reviewers were making these statements solely for clickbait. Did you happen to watch the review in question? IMO it was nowhere close to being clickbait.
 
I agree, OP. If I was in the market for an Ultra I'd be feeling exactly the same, and annoyed with Apple's decision on this.

I've just bought an M4 MBA and even then am slightly unhappy it's using a chip released nearly a year ago in the iPad Pro. Why the delay? Surely the MBA could've skipped the M3 generation last year, seeing as the M4 was ready and we'd now be close to starting the M5 generation? I suspect it's because of what you alluded to: Tim/Apple trying to fleece their customers and upsell as much as they can.

People will defend Apple, and rightly so for lots of reasons - they're super successful and do many things right. However, many of their recent decisions and failures are making me start to wonder if the big-picture wheels are starting to fall off…
 
Given Ultra chips are just two Max chips is there any reason they can't make an M4 Ultra?
Yes, because the Max chips have to have a special die-to-die connector (Apple calls it UltraFusion) that allows Apple to connect the two chips together to work as one. Mark Gurman said that the M4 Max chips might not have this connector.
 
I've just bought an M4 MBA and even then am slightly unhappy it's using a chip released nearly a year ago in the iPad Pro. Why the delay? Surely the MBA could've skipped the M3 generation last year, seeing as the M4 was ready and we'd now be close to starting the M5 generation?
The M4 was in the iPad Pro for its power efficiency, not its performance. While it is slightly annoying, most MacBook Air users will never reach the performance peak of the M4 chip, either because of thermal throttling or just because of Air users' workflows in general.
 
Yes, because the Max chips have to have a special die-to-die connector (Apple calls it UltraFusion) that allows Apple to connect the two chips together to work as one. Mark Gurman said that the M4 Max chips might not have this connector.
Understood. The question is: why, assuming this is the case, don't they? After all, the M1 and M2 Max chips did, and it was possibly added to the M3 Max (I seem to recall hearing that the M3 Max also lacked this connector, and no, I didn't hear it solely from Max Tech). IMO, production limitations/issues appear to be the likely reason.
 
Unless you're doing 3D rendering or 8K video editing, I don't see any need for any of these things. I wound up buying a base M4 Mac mini just to try it out, and it wound up working perfectly fine for graphic design, high-resolution photo editing, and gaming.
 
The M4 was in the iPad Pro for its power efficiency, not its performance. While it is slightly annoying, most MacBook Air users will never reach the performance peak of the M4 chip, either because of thermal throttling or just because of Air users' workflows in general.
Very true and good point but yes I still find it annoying 😂
 
Aside: any regular Apple hardware reviewer putting out a video about the M3 Ultra having issues is entirely missing the point. They've traditionally based their reviews on the work they do - video editing - and that's iPad Pro territory these days. That problem is solved by low-to-mid consumer hardware at this point.

Those looking at Ultras should be people doing 3D renders with massive geometry/texture data, people running LLMs, and other high-end workloads. Essentially, workloads that are VRAM-bottlenecked, where it doesn't matter how fast the GPU is if you don't have enough VRAM for it.

4K or 8K video editing is no longer a high-end workload in 2025.
Multi-cam 8K might still qualify, but I think 8K is just overkill for everything but cinema-level work, as digital projectors are only now being upgraded to 4K in cinemas.
 
I didn't single you out. I just happened to respond to your post because it was your post that stated YouTube reviewers were making these statements solely for clickbait. Did you happen to watch the review in question? IMO it was nowhere close to being clickbait.

Who else in this thread did you ask such an off-topic question? *shrugs* That's almost the definition of singled out. And of course I watched the review. He used to do a lot of good stuff, but there has been a shift this last year; I'm not going to speculate why. He was the first to admit he knows zilch about the intended audience, AI, and if he had stuck to that, cool. But nope - that doesn't get clicks.
 
It's not so much blind Apple defense as simply framing your expectations against the physical reality of CPU manufacturing.

The Ultra SoCs are really difficult to build. They have always lagged the A-series and base M-series parts in architecture.

This isn't new. It's to be expected. Every other vendor's high-core-count parts lag their low-end/consumer parts by 12 months or more.

Expecting otherwise is simply unrealistic.

No other vendor is releasing high-core-count parts on their current consumer architecture, and for many good reasons. Period.
Posters around here are getting hung up on a number because they are chasing specs and not thinking about the target market and the workload for the M3 Ultra. The thing is that 99% of those posters don't have a workload that needs more than an M3 or an M4, if that. It's a weird flex.
 
It's not so much blind Apple defense as simply framing your expectations against the physical reality of CPU manufacturing.

The Ultra SoCs are really difficult to build. They have always lagged the A-series and base M-series parts in architecture.

This isn't new. It's to be expected. Every other vendor's high-core-count parts lag their low-end/consumer parts by 12 months or more.

Expecting otherwise is simply unrealistic.

No other vendor is releasing high-core-count parts on their current consumer architecture, and for many good reasons. Period.
I guess one way to think about it is that we got the other M4 chips really fast, not that the M3 Ultra was late. I took a look at when all of the M-series chips were announced and the generation gaps for the base chips are 19, 16, and 7 months for M1->M2, M2->M3, and M3->M4. The gaps from base chip to ultra for the first 3 gens (since there's no M4 Ultra) are 16, 12, and 17 months. Apple only waited one month longer to announce the M3 Ultra after announcing the first M3 than they did with the M1 to M1 Ultra. The base M4 was released super quickly, though it was only in the iPad Pro at first.

The M3 generation also felt different because there was no gap between the base M3 and the Pro and Max chips. The gaps for those in each generation were 11, 7, 0, and 5 months. The M3 Pro and Max also came out relatively quickly after the M2 Pro and Max, with the gaps between Pro and Max generations at 15, 9, and 12 months.

All of this to say, the M3 Ultra wasn't really released particularly late, it's just that the other M3 chips had a relatively short time as the cutting edge, and the M2 Ultra came out a little sooner relative to the other M2 chips.
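For anyone who wants to check the arithmetic, the gaps fall out directly from the announcement months (dates per Apple's announcements, at month granularity):

```python
# Month gaps between Apple silicon announcements.
from datetime import date

announced = {
    "M1": date(2020, 11, 1), "M1 Ultra": date(2022, 3, 1),
    "M2": date(2022, 6, 1),  "M2 Ultra": date(2023, 6, 1),
    "M3": date(2023, 10, 1), "M3 Ultra": date(2025, 3, 1),
    "M4": date(2024, 5, 1),
}

def months_between(a: date, b: date) -> int:
    """Whole-month gap between two dates, ignoring day of month."""
    return (b.year - a.year) * 12 + (b.month - a.month)

for base, ultra in [("M1", "M1 Ultra"), ("M2", "M2 Ultra"), ("M3", "M3 Ultra")]:
    gap = months_between(announced[base], announced[ultra])
    print(f"{base} -> {ultra}: {gap} months")  # 16, 12, 17
```

The base-to-base gaps come out the same way: 19, 16, and 7 months for M1 to M2, M2 to M3, and M3 to M4, matching the figures quoted above.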
 
They're nothing alike - EPYC is a heap of little dinky Zen dies on a modern process, with a central IO die on a cheap old process. No GPU at all, which is the majority of the die size on an Mx Max - and it's not running anywhere near the same clock (basically HALF the clock speed, which makes the silicon far more tolerant of manufacturing defects).
There's no Zen 5c die for consumer chips, not on 3nm at least. Also, I'm not sure why the iGPU matters here. If we're talking absolute monster chips, then there's Cerebras and of course Nvidia. There's an argument to be made that the M4 Max or M3 Ultra aren't really consumer chips; they're arguably just Threadripper-class versions of what Apple makes, with an iGPU - and the iGPU was needed because Apple has made it impossible to work with Nvidia, AMD, or Intel if you wanted to go that route.
Not saying what AMD is doing is bad or anything, its a great idea for what it is and the market it is aimed at, but it is nothing like the Mx Ultra which has a much, much faster interconnect between the two Max dies - its not just a bunch of small dies linked together on a substrate.
I think everyone will move towards chiplets sooner or later. There could be edge cases where monolithic dies make sense, but with the ballooning costs at TSMC, Apple should probably also be exploring it, if not actively working towards something similar. Monolithic dies make more sense in phones or tablets, where efficiency is paramount, but for virtually everything else AMD's approach could be better.
 