It's time for Apple to show us what they've got, ship a performance desktop,

I mean… they have? The Mac Studio is hardly low-performance.

Engineering is hard and Apple doesn't care if anyone gives them a pass?

A Quad M* Max package is a shockingly complicated engineering challenge.

Doesn't matter. It's Apple's self-imposed deadline. They can't suddenly wiggle their way out of it with "oh, it turns out this is difficult".

If people are making personal plans based on a company saying something is going to happen in 2 years, that isn't on the company.

Of course it is.

And you actually believe that?

I don't find "they ended up with a design that would be so specialized and expensive, they wouldn't be able to make it up in volume" hard to believe.
 
Looks like Apple is having serious woes with sticking together a bunch of their modular SOCs...
 
Sure, let's base a business on a pirated product. 🤣

That's fine for personal use, but what kind of company wants that liability?
It's 100% legal in the European Union, and a lot of companies use it. macOS is free and you can install it on any device. It's the original macOS, downloaded from the Apple Store with no changes at all. Anyway, according to Apple, macOS doesn't work on PC computers, and if it doesn't officially work then it can't be illegal, logical I guess 😆
 
Most Hackintosh support sites don't say that; instead they warn people that it's a use-at-your-own-risk scenario for hobbyists. Yes, you assume all the risks of being unsupported.
 
At this point I don't see how an M-series Mac Pro would be significantly better than the Mac Studio. They can stack cores as much as they want, but that requires a lot of compromises because scaling isn't 1:1, as seen on the M1 versions. Again, Apple's best bet is to keep the Mac Pro on x86 and either switch to AMD all the way or get Nvidia back on board. I think people would have a lot more flexibility if they could boot into macOS, Windows, or a Linux distribution and have a powerful x86 CPU and GPU on board.
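To make that non-1:1 scaling concrete, here is a minimal Amdahl's-law sketch; the 10% serial fraction is a made-up number purely for illustration, not a measurement of any Apple chip:

```python
def amdahl_speedup(cores: int, serial_fraction: float) -> float:
    """Ideal speedup when only part of the workload parallelises across cores."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Hypothetical 10% serial share; doubling cores gives nowhere near double the speedup.
for cores in (10, 20, 40):
    print(f"{cores} cores -> {amdahl_speedup(cores, 0.10):.2f}x speedup")
# 10 cores -> 5.26x, 20 cores -> 6.90x, 40 cores -> 8.16x
```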
 
That would be ideal - a classic, upgradable x86 cheesegrater. But that ship has sailed, I'm afraid.
 
We’ll be right back.
Updates are coming to the Apple Store. Check back soon.
Macworld says it has seen this message twice today, for both US and UK online Apple Stores.

We've seen the U.K. and U.S. Apple Store go offline twice today, April 24, 2023, so we are curious to know what is going on. Apple is expected to launch a new 15-inch MacBook Air in the next few weeks; could the company be gearing up to launch it today?
 
The worst problem with the Mac Pro is the fact that you have to unplug every single device to get the cover off. The old Mac Pro had a nice drop-down panel. With the new one, you have to lift the entire housing up and off, and if anything is plugged in, you have to unplug it first. So dumb.
 
Engineering is hard but not that hard. It's not like they're designing a whole new chip from the ground up.
The interconnect is probably harder to design than the CPU itself, so multiplying that interconnect raises the complexity of the whole Apple Silicon design enormously.

There's a reason Apple's interconnect is pretty unique: nobody has gotten that far before (at least not that I know of; I'm a COSC graduate with heavy EE leanings, and it breaks my brain how much you have to think about in such a system).
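One way to see why gluing more dies together gets hard fast: if every die needs a coherent link to every other die, the link count grows quadratically. A tiny sketch, assuming a fully connected topology (which may not be what Apple's UltraFusion actually uses):

```python
from math import comb

# Pairwise die-to-die links in a fully connected package: n * (n - 1) / 2
for dies in (2, 4, 8):
    print(f"{dies} dies -> {comb(dies, 2)} coherent die-to-die links to keep in sync")
# 2 dies -> 1 link, 4 dies -> 6 links, 8 dies -> 28 links
```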
 
I agree. Apple's SoCs have worked well so far for:
iPhone
iPad
Apple Watch
AirPods
Apple TV
MacBook Air
MacBook Pro
iMac
Mac Mini

But when customers need the fastest processors that go in large desktop computers, SoCs have not been known to outperform dedicated processors. If a Mac Pro with an arm64 processor can't compete in this market segment, then it's harder to justify why Apple should sell one.
The current version of the iMac is essentially a giant iPad with a built-in stand but no touch screen.
 
Certainly, Nvidia has some responsibility for this state of affairs. They could, for instance, open-source CUDA so other vendors could port it to their GPUs, or third parties could develop solutions. That would allow software that uses CUDA to work with a broader range of GPUs.
?????

Why would any company give up its product?

It's like saying Apple should just open source iOS so that it can work with a broad range of phones.
 
Compare this to the cheesegrater Mac Pros; it was a pain in the ass trying to do something even as simple as flashing a new or replacement GPU in those things. You hoped and prayed the EFI ROMs would work with the Mac Pro mobo and the then-current version of OS X. There was also a time when the "Mac Pro compatible" GPUs with the right ROM were more expensive, and no one wanted to waste time trying to flash sketchy firmware "patches" to get unapproved PC GPUs to work in Mac motherboards. And that's just the GPU, i.e. one of the simplest things to install. Imagine the headaches with more esoteric gear.
Takes me back to the good ole days when I went from an ATI 5870 to an AMD RX 580 because I wasn't going to splurge on the Apple "Blessed" 7970. What a time to be alive.
I did try the GTX 680 at some point but I remember having trouble with it and ultimately gave up or was stuck on an older OS for a while. Could have been both.
 
That does seem like a conflict with their current interests. But if the right incentive came along, they might. The right game can make an amazing justification for a change in attitude. I just don't think any VR game is at a level yet where Apple would use it.
Not to mention the elephant in the room... p*rn.
 
I call bullcrap... someone has been feeding Gurman misinformation for a while now. His constant back and forth on expectations is getting ridiculous. It seems that he and Ming-Chi Kuo are constantly battling to see who can get these rumors out first.

Furthermore, this rumor makes ZERO sense... Apple is not going to release the Mac Pro with an M2 Ultra and then wait for the M3 Ultra for the Studio. It makes no sense to produce a single variant for such a low volume system and then several months later add an SoC to the Studio that is much more powerful.

It would make much more sense to update the Studio line to the M2 at WWDC, and preview a new "M3-based*" Mac Pro, that would be released later in the year.

(*Although I still believe Apple is developing a completely different chip for the Mac Pro, one that fits into a more expandable, high performance system.)
 
He, along with Ming-Chi Kuo, does seem way too quick to offer his views, with MacRumors propping up their opinions too much. You could claim that this year the paid press and financial analysts are constantly directing what we discuss and need to be put on leashes. Remember Kuo's crusade about AR/VR being Apple's last hope; most of us are sick of these two steering our interest in where Apple is going.
 
The interconnect is probably harder to design than the CPU itself, so multiplying that interconnect raises the complexity of the whole Apple Silicon design enormously.

There's a reason Apple's interconnect is pretty unique: nobody has gotten that far before (at least not that I know of; I'm a COSC graduate with heavy EE leanings, and it breaks my brain how much you have to think about in such a system).

Well, AMD's Infinity Fabric connecting multiple chiplets on a package is a very similar thing. It is certainly not impossible, but it is really tricky.

And if Apple is putting a PCIe bridge chip on their UltraFusion fabric as well, that only makes it more complicated.
 
I'm not familiar with AMD's Infinity Fabric, guess I have some reading to do! I've always thought of AMD as more of an x86 clone and not interesting enough to look into -- my problem.

No, I don't think it's impossible either, but there's the question of whether it's worth the work, and how long it's going to take. I really don't know enough about Apple's architecture to even make a guess.
 
My bad - M2 Pro memory is 5 times slower, and M2 Max memory is "only" 3 times slower than 4090-series cards :D (rough spec-sheet numbers sketched below). Let's not forget it still needs to efficiently manage and allocate the unified memory since, well, it's shared for everything - unlike VRAM in a graphics card, which sits there empty waiting only for the GPU to kick in. Also, this is only my guess, but I doubt M silicon uses its unified memory with 100% efficiency, because that would mean all the software would need to be well optimised and coded. And if it's managed by the M silicon, queueing data access will introduce latency.

What does M1 Ultra performance have to do with anything? The absolute cheapest M1 Ultra model is currently $4k.
$4k buys you a PC which SMOKES the M1 Ultra in 3D / AI / Adobe Creative Suite applications (single-core performance is the only thing that matters with Adobe, because they don't care and didn't optimise their software). The only thing it *might* be faster at is 4K/8K encoding.

All Apple can do when communicating with someone interested in high performance is lie by showing a misleading graph of how M1 outperforms a 3080 series card at low wattage, which has absolutely no meaning in real life. I have no idea who came up with that graph but it was soooo stupid...

M silicon is cool tech which freed Apple from other (sh*tty in their own ways) vendors such as Intel and Nvidia.
But the path they chose is not for prosumers. And it was a wise move - it's just a tiny niche market anyway compared to the buying potential of all other customers. The benefits of a closed architecture vastly outweigh the cons. But they should have declared clearly that there is no Mac Pro coming in the foreseeable future, since they did declare it's coming and didn't deliver. After they realised where the tech world is going (good job not noticing it sooner...), it would have been fair to announce, even between the lines, what the future holds. Are we getting semi-open M silicon? Are we getting a binned version of regular M silicon? Or maybe a dedicated, GPU-enhanced M silicon just for the Pro model? I'm in the market for a Mac Pro and I don't know if I should keep waiting or just give up. Considering they didn't deliver on their promise, they owe us at least a controlled leak of what's going on...
My guess is that at the Mac Pro level Apple will continue in their higher-efficiency direction: high performance, but not via hot, inefficient bundles of PCIe GPUs and huge amounts of off-chip RAM. In 2023 it seems that including PCIe access would make sense given users' investment in cards, but what really interests me is seeing what the 2023 Mac Pro architecture and M3 chips portend for the future. Personally, I have always felt that Apple would need M3 transistor densities to make the splash an upgraded Mac Pro needs to make.
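For what it's worth, here is a rough back-of-envelope on the bandwidth ratios quoted above, using commonly cited spec-sheet peak figures (~200 GB/s for M2 Pro, ~400 GB/s for M2 Max, ~1,008 GB/s for an RTX 4090); real-world, workload-level throughput will differ:

```python
# Spec-sheet peak memory bandwidth in GB/s (not measured; real workloads see less).
peak_bandwidth = {"M2 Pro": 200, "M2 Max": 400, "RTX 4090": 1008}

for chip in ("M2 Pro", "M2 Max"):
    ratio = peak_bandwidth["RTX 4090"] / peak_bandwidth[chip]
    print(f"RTX 4090 peak bandwidth is ~{ratio:.1f}x that of the {chip}")
# ~5.0x vs M2 Pro, ~2.5x vs M2 Max
```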
 
It's a big gamble.

If AI/ML turns out to be a flash in the pan, no harm done.

But if AI/ML turns out to be the future, they will be waaaaaaay behind, even more than they are now.

From looking at what happened to Siri, it's going to be a very, very serious problem for Apple if the latter is true.
No. Apple choosing a different track than old-style hot desktop behemoths does not mean Apple considers AI/ML a flash in the pan. They may simply be choosing a different long-term track to serve their perceived future users. We get to watch it play out.
 
If Apple really is making a VR headset (I still don't believe it) and they plan to keep ignoring gaming, they're in for some real trouble.
But it's supposed to be AR/mixed reality. Not exactly gaming hardware like Oculus. More like a wearable external screen.
 