> Should I buy the M2 Mac Mini now, or is there real hope for WWDC? There's zero news on it.

I don't know. But my gut says they'll refresh it at WWDC along with the Mac Studio.
I sure hope it isn’t the last one, but that was the first thought that occurred to me.
Another thought is that Apple could move to an entirely different approach for the high-end desktop: have one die with lots of CPU cores (plus the other basic SoC pieces) and move the GPU to its own die that connects via UltraFusion. That would allow for more customization in terms of CPU-to-GPU balance.

It would probably not make sense to develop such a highly custom design for the existing high-end Mac desktop market. But if Apple were to also use such a design for AI model training in their own data centers, then it might make sense.
Not to mention the inherent security flaw (GoFetch) in all M-series chips. The M3-only mitigation is not sufficient because it requires developers to opt in and change their code when performing specific operations, and Apple didn't even disclose that this functionality existed until last week. JavaScript can execute an attack that steals data, I am not satisfied that Apple is only disabling the prefetcher when performing encryption operations (which only works on M3 chips), and it doesn't (so far?) seem like a bug bounty was paid out for this, which is incredible given how significant the flaw is.
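For what it's worth, here is roughly what that "developers have to opt in and change their code" point looks like in practice. The opt-in reportedly means setting the ARMv8.4 DIT (data-independent timing) bit around constant-time code, which on M3 is said to also disable the data memory-dependent prefetcher. A minimal C sketch follows; the hw.optional.arm.FEAT_DIT sysctl check and the do_constant_time_crypto() placeholder are illustrative assumptions, not Apple's exact API.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>
#include <sys/types.h>
#include <sys/sysctl.h>

/* Returns true if the CPU advertises FEAT_DIT. The sysctl name is an
   assumption based on the hw.optional.arm.FEAT_* pattern on macOS. */
static bool cpu_has_dit(void) {
    uint32_t val = 0;
    size_t len = sizeof(val);
    if (sysctlbyname("hw.optional.arm.FEAT_DIT", &val, &len, NULL, 0) != 0)
        return false;
    return val != 0;
}

/* Read the current DIT processor state (bit 24 of the DIT register). */
static inline uint64_t read_dit(void) {
    uint64_t v;
    __asm__ volatile("mrs %0, DIT" : "=r"(v));
    return v;
}

/* Request data-independent timing for subsequent instructions.
   May need -march=armv8.4-a or newer on non-Apple toolchains. */
static inline void set_dit(void) {
    __asm__ volatile("msr DIT, #1");
}

static inline void clear_dit(void) {
    __asm__ volatile("msr DIT, #0");
}

/* Hypothetical secret-dependent work that should run in constant time. */
static void do_constant_time_crypto(void) {
    /* ... key handling, AES, etc. would go here ... */
}

int main(void) {
    if (!cpu_has_dit()) {
        fprintf(stderr, "FEAT_DIT not reported on this CPU\n");
        return 1;
    }
    uint64_t saved = read_dit();   /* remember the previous DIT state */
    set_dit();                     /* opt in before touching secrets  */
    do_constant_time_crypto();
    if (saved == 0)
        clear_dit();               /* restore the prior state */
    return 0;
}
```

Which is exactly the complaint: none of this happens unless every crypto library author knows the bit exists and wraps their sensitive paths with it.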
> When will we finally stop confusing popularity with competence? It is entirely possible that this theory is correct; however, it is unlikely to be correct simply because Yuryev said so. He knows a great deal about video editing, but general computing and semiconductor technology are not really his forte.

I feel the same way about iJustine.
Or maybe they could try some actual courage, and admit that they're not good at building GPUs.
I am under extremely restrictive NDAs so I can't go into detail, but the nature of the flaw is very bad, to put it lightly. There aren't any full-detail proofs of concept for a very good reason: they would be immediately co-opted by bad actors. You're never going to get the evidence you're asking for unless one of them gets ahold of it and makes it public.

Don't worry, this "security flaw" will be part of all CPUs before long, because everyone wants more performance.
What? Where did you get that idea from?
We still don’t know whether the exploit even works under macOS. I think you might be jumping to conclusions very quickly.
So far it has not been demonstrated that the flaw is significant at all. What we know is that a research group has managed to guess an encryption key under some very specific, carefully set-up conditions, using a testing framework they have not disclosed. Hold off on the pitchforks until the data is available, at least.
As long as you're looking on the bright side.
Unless the Ultra has some logic redesign specifically addressing this, I would absolutely not spend a ton of money on a decked-out pro machine, which was my plan. I'll either hobble along with my old Mac for another year or get a base-level M3 Studio and replace it as soon as M4 ones are available.
I completely agree with your points. Also, the neural engine is likely to be massively upgraded in the M4.
> He knows a great deal about video editing…

I long for a time when every computer product review video doesn't consist of "scrubbing the timeline in Final Cut Pro."
The Mac Pro should have four Ultra chips.
> As long as you're looking on the bright side.

The bright side is that we have a security industry working to make computing more secure for everyone, instead of hoarding critical information about flaws, as was the practice 15 years ago. We have come a long, long way.
> Given that the M4 generation will likely have big modifications to the neural engine to support AI, and likely Thunderbolt 5 / USB4v2, I would be very wary of an M3 Ultra chip if it doesn't have those features. In fact, I would skip it.

If Apple keeps to the 18-month replacement cycle, the first M4 chip will be out in April or May 2025 (the M3 arrived at the end of October 2023, so 18 months lands right about there).
> I am under extremely restrictive NDAs so I can't go into detail, but the nature of the flaw is very bad, to put it lightly. …

Common sense indicates that this is probably true. But Apple spends millions on propaganda (marketing) precisely to get users to forgo any common sense.
I am not a layman making an offhand comment.
> Whatever new M series comes out later on, JUST PLEASE DO YOURSELF A FAVOR AND DON'T WATCH IJUSTINE FOR THE REVIEW.

Agree, I can't stand her.
> Ultra, Extreme, Max… what matters today is the number of ML/CUDA cores.

The M chips are glorious, and I jumped on the Apple Silicon bandwagon on day one.
If you want to know what innovation looks like:
> They've been using chiplet designs for years. What do you think Ultra is?

A motherboard, processor, and GPU in one package.
> If there's one YouTuber whose videos I constantly roll my eyes at, it's MaxTech.

Also Luke Miani…
> Put it in a MacBook, you cowards!

The PowerBook G5 rides at long last... heat and battery life that would make an Intel chip blush 😂
> SoC design is a failure, and that's why Apple can't make anything beyond the Max chip. They really need to use a chiplet design so that they can make whatever they want.

You are wrong again.