I sure hope it isn’t the last one, but that was the first thought that occurred to me.

Another thought is that Apple could move to an entirely different approach for the high-end desktop: have one die with lots of CPU cores (plus the other basic SoC stuff) and move the GPU to its own die that connects via UltraFusion. That would allow for more customization in terms of CPU-to-GPU balance.

It would probably not make sense to develop such a highly custom design for the existing high-end Mac desktop market. But if Apple were to also use such a design for AI model training in their own data centers, then it might make sense.

Or maybe they could try some actual courage, and admit that they're not good at building GPUs. It would be nice to have Nvidia options on the Mac again.
 
Not to mention the inherent security flaw (GoFetch) in all M series chips.

Don’t worry, this “security flaw” will be part of all CPUs before long because everyone wants more performance.

Javascript can execute an attack that steals data

What? Where did you get that idea from?

I am not satisfied that Apple is only disabling the Prefetcher when performing encryption

We still don’t know whether the exploit even works under macOS. I think you might be jumping to conclusions very quickly.

which is incredible given how significant the flaw is.

So far it has not been demonstrated that the flaw is significant at all. What we know is that a research group has managed to guess an encryption key under some very specific, carefully set-up conditions, using a testing framework they have not disclosed. Hold off on the pitchforks at least until the data is available.
 
Mac Pro starting price: $7,999 with 2TB of RAID 0 storage (4× 512GB). Max it out at 16TB for only a $10K upgrade.
 
When will we finally stop confusing popularity with competence? It is entirely possible that this theory is correct; however, it is unlikely to be correct simply because Yuryev said so. He knows a great deal about video editing, but general computing and semiconductor technology are not really his forte.
I feel the same way about iJustine.
 
Or maybe they could try some actual courage, and admit that they're not good at building GPUs.

They are quite good at building GPUs, considering how much of a head start Nvidia had. In some key areas (instruction scheduling, resource allocation, ALU utilization) Apple is ahead of Nvidia. I mean, we have Nvidia engineers openly saying that Apple managed to achieve things Nvidia tried and failed at. There is a lot of evidence that Apple is building a superscalar GPU. It’s all quite exciting stuff for me as a GPU tech enthusiast. In fact, I think the GPU division is innovating at a much faster pace than the CPU division.
 
So far it has not been demonstrated that the flaw is significant at all. What we know is that a research group has managed to guess an encryption key under some very specific, carefully set-up conditions, using a testing framework they have not disclosed. Hold off on the pitchforks at least until the data is available.
I am under extremely restrictive NDAs so I can't go into detail, but the nature of the flaw is, to put it lightly, very bad. There aren't any full-detail proofs of concept for a very good reason: they would be immediately co-opted by bad actors. You're never going to get the evidence you're asking for unless one of them gets ahold of it and makes it public.

I am not a layman making a comment offhand.

Edit: I've updated my post above with some background information that may interest you or others.

Edit 2: I read some of your other posts on this subject because I was curious, and I think you are possibly correct regarding OS-level mitigations for complex encryption keys, which could be implemented via thread shifting (and made public by Apple once done). But exfiltration of information that is not heavily encrypted is still an issue. We'll see what source code they release; I hadn't heard any of it was going to come out to validate the PoC. I have a ton of anecdata around this area specifically, but I can't share it, as I said.

So if I'm spending many thousands on a new computer, I'd like one with a more narrowly scoped DMP, as Intel has done in their latest CPUs. It's not perfect, but it's something.
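
To make the thread-shifting idea above a bit more concrete: the DMP is reported to exist only on the performance cores, so one mitigation would be to run key handling at a QoS level the scheduler prefers to place on the efficiency cores. Here's a minimal userspace sketch of that approach; the pthread QoS calls are real macOS APIs, but the overall strategy is my assumption, not anything Apple has announced, and background QoS is a strong hint to the scheduler rather than a guarantee of E-core placement.

Code:
#include <pthread.h>
#include <pthread/qos.h>

/* Placeholder for whatever key-handling routine needs protecting. */
static void *sensitive_work(void *arg) {
    /* ... derive / decrypt / sign here ... */
    return NULL;
}

int main(void) {
    pthread_attr_t attr;
    pthread_t thread;

    pthread_attr_init(&attr);
    /* Ask for background QoS so the scheduler prefers the efficiency
       cores, which reportedly lack the data memory-dependent prefetcher. */
    pthread_attr_set_qos_class_np(&attr, QOS_CLASS_BACKGROUND, 0);

    pthread_create(&thread, &attr, sensitive_work, NULL);
    pthread_join(thread, NULL);
    pthread_attr_destroy(&attr);
    return 0;
}

The obvious trade-off is that the crypto then runs at E-core speed, which is presumably why nobody is thrilled about it as a blanket fix.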
 
Not to mention the inherent security flaw (GoFetch) in all M series chips. The M3-only mitigation is not sufficient because it requires developers to opt in and change their code when performing specific operations, and Apple didn't even disclose this functionality existed until last week. Javascript can execute an attack that steals data; I am not satisfied that Apple is only disabling the Prefetcher when performing encryption operations (which only works on M3 chips); and it doesn't (so far?) seem like a bug bounty was paid out for this, which is incredible given how significant the flaw is.
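
For what it's worth, the opt-in mechanism Apple documented for M3 is, as I understand it, the ARM data-independent timing (DIT) flag, which on M3 reportedly also switches off the data memory-dependent prefetcher for that thread. A rough sketch of the kind of change a crypto developer would have to make (arm64-only inline assembly; the wrapper and naming here are mine, not Apple's):

Code:
#include <stdint.h>
#include <stddef.h>

#if defined(__aarch64__)
/* Enable/disable data-independent timing (DIT) for the current thread.
   On M3, enabling DIT reportedly also disables the DMP.
   Requires an ARMv8.4+ target, which Apple silicon toolchains use. */
static inline void dit_enable(void)  { __asm__ volatile ("msr dit, #1"); }
static inline void dit_disable(void) { __asm__ volatile ("msr dit, #0"); }
#else
static inline void dit_enable(void)  {}
static inline void dit_disable(void) {}
#endif

/* Hypothetical example: a constant-time compare run with DIT enabled. */
int compare_secret(const uint8_t *a, const uint8_t *b, size_t len) {
    dit_enable();
    uint8_t diff = 0;
    for (size_t i = 0; i < len; i++)
        diff |= a[i] ^ b[i];
    dit_disable();
    return diff == 0;
}

Which is exactly the problem: every library handling secrets has to know to do this, and it only helps on M3.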

Unless the Ultra has some logic redesign specifically addressing this I would absolutely not spend a ton of money on a decked-out pro machine which was my plan. I'll either hobble along with my old Mac for another year or get a base level M3 Studio and replace it as soon as M4 ones are available.

I completely agree with your points also; the neural engine is likely to be massively upgraded in M4.
As long as you're looking on the bright side.
 
The problem with Apple chips has always been clock limiting due to poor thermal design. This requires Apple to overdesign the chips and not really give the increased processing power to the user. It is sad that Apple prefers to focus on toys for the masses rather than real computers, because they have the technology.
 
Given the size of the M3 Max, it’s hard to imagine how much an M3 Ultra will cost.
 
As long as you're looking on the bright side.
The bright side is that we have a security industry working to make computing more secure for everyone instead of hoarding critical information about flaws, as was the practice 15 years ago. We have come a long, long way.

This type of stuff has always existed; it's just that now the public knows about it, because nation states have figured out that protecting everyone is better than leaving gaping holes in interconnected infrastructure. There is far more research being funded and much better hygiene around vulnerability disclosures, which is a net positive for everyone.
 
Given that the M4 generation will likely have big modifications to the neural engine to support AI, and likely Thunderbolt 5 / USB4v2, I would be very wary of an M3 Ultra chip if it doesn’t have those features. In fact, I would skip it.
If Apple keeps to the 18-month replacement cycle, the first M4 chip will be out in April or May 2025.

Something to consider if you want to "wait" another 13-14 months.
 
I am under extremely restrictive NDAs so I can't go into detail, but the nature of the flaw is, to put it lightly, very bad. There aren't any full-detail proofs of concept for a very good reason: they would be immediately co-opted by bad actors. You're never going to get the evidence you're asking for unless one of them gets ahold of it and makes it public.

I am not a layman making a comment offhand.
Common sense indicates that this is probably true. But Apple's propaganda (marketing) costs Apple millions precisely to get users to forgo any common sense.
 
This is good news!
No need for an efficiency-focused chip on a desktop machine.

I got an M1 Ultra 128GB at launch, and I'm actively looking forward to getting an M3 Ultra.
I cannot afford it now, so it won't be at launch, but hopefully in late 2024.
 
Ultra, Extreme, Max… What matters today is the number of ML/CUDA cores.

If you want to know what innovation looks like:
The M chips are glorious, and I jumped on the Apple silicon bandwagon on day one.
That said...
I'm still sad Nvidia and Apple parted ways.
Nvidia is just unstoppable and at the forefront of innovation; as much as I love macOS and prefer it to Windows, there are some workflows in which Windows + an Nvidia RTX GPU is just miles ahead.

The M series will always be the most incredible SoC solution with no dedicated GPU, but they seriously need to develop some useful in-house technologies to help creatives in the AI and 3D sphere.
And that'd be only half the work...the other half being developers supporting the features.

I'm seeing this with the whole machine learning thing...
Technically speaking, you could take advantage of the specifics of M chips (with their neural accelerators), but few developers really do support such features.

I have experienced this first-hand with generative AI.
Stable Diffusion, for example, technically works on the Mac... but it's very limited and succumbs to errors and crashes all the time.
If you want to work in 3D or AI, you need Windows + an RTX GPU...

Thankfully my workflow is still 2D-based (I'm a 2D motion designer), but I still feel my M1 Ultra, at $6K, performs half as well as a comparable $3K PC; even as a Mac enthusiast, I have to accept this.
 
Honestly, an "Extreme" version has always kind of made sense. Why would the Mac Pro otherwise exist if it's just the same as a Studio but $3K more expensive? I have to assume that there was meant to be an "M2 Extreme" for the Mac Pro, but they ran into issues and couldn't get it out in time to release the Mac Pro as they had promised. So instead they threw an Ultra into it and accepted that it was basically a pointlessly large, expensive box.
 
Most of the industry is gearing up to more aggressively pursue chiplet designs and stacking techniques to get around ever-larger, more defect-prone chips that only get harder to make at smaller process nodes. Apple’s Johny Srouji has dropped hints about advances in “packaging”, which likely suggests Apple is pursuing this too. Unless this is just stuff that’s already been in the pipeline and ends up being a one-off or a dead end, I wouldn’t expect them to try this.
 
If there’s one YouTuber whose videos I constantly roll my eyes at, it’s MaxTech.
Also Luke Miani....
I have no idea why I was subscribed to the guy to begin with... (mind you, it was just before getting my M1 Ultra, so I was subscribed to pretty much all Apple silicon-related channels back then).

I don't think he's evil or anything, but the content is a little... meh...
 