I’m 99% certain that Apple ...
I'm not...and that's for 2D and music only.

3D is a bigger market that Apple has never properly supported. Forking back to x86 and expanding GPU compatibility would let Apple finally support all the various industries that rely on 3D with something other than half-assed measures (Arch, ArchViz, VFX, game development, industrial design, gaming, etc.)
 
I really don't understand the negative reaction these Mac Pro rumours are getting. This is exactly what was expected as soon as Apple Silicon was announced.

The problem isn't upgradability, it is:

1. Apple's prices for BTO expansion are too high;

2. Apple has been trying, and failing, to ship a "4 * Max" "M* Quadra" chip for a while now.

If Apple's built-to-order prices were reasonable, the value proposition would be better; and if Apple actually shipped an "M* Quadra", they would have a computer with approximately twice the power of the M1 Ultra Mac Studio, which would be moderately competitive.

Apple didn't really make prosumer desktops from 2012 until the Mac Studio was released. Future upgradability is clearly something Apple has moved away from. If that is what you want, or you need a TB of RAM, it is probably time to find a new computer platform.

Apple can solve expandability, even if the Mac Pro can't be upgraded in the future.

I'm still guessing the Mac Pro will include an on-package UltraFusion<->PCIe bridge chip that will turn the Mac Studio into a Mac Pro with a pile of PCIe slots and storage expansion options. The market for this is still small, so I expect that is all we're going to get.

While I'm pretty confident that Apple's video out will be limited to Apple's integrated GPU, there is nothing stopping Apple from supporting discrete PCIe GPUs purely for software acceleration (as well as all the network cards, storage controller cards, SSD cards, external hardware controller boards, capture I/O boards, etc...).

I bet: multi-sockets are not going to happen, and a dedicated SoC for the Mac Pro isn't going to happen.
 
here come the pages of people who never even planned on purchasing the thing complaining…
It was inevitable.
Steve certainly knew it: he worked on the first MacBooks without replaceable batteries, the first MacBooks without upgradable RAM came out only a few short months after his death, and he obviously knew about the trashcan Mac Pro.
But I’m sure we’ll have some “but Steve would never” comments in here.
It’s really simple: the Mac Pro doesn’t even sell in the millions per year. Neither does the Mac Studio.
The cost of making custom parts for it that make the computer totally upgradable is a huge waste of money for them.
OK, but going with your premise for a second: why would it make sense to even have the machine in the lineup at all if it's just a bigger Studio? Wouldn't the cost of an entire production line be a huge waste of money in the same way you're saying, but writ large? If they're going to do this machine without the expandability then, unless I'm missing something, it's going to be more of a waste than spending the effort to make it expandable.
 
What would be really cool, though highly unlikely, is if the M2 Ultra is a coprocessor with an Intel Xeon for the first gen AS Mac Pro, leveraging the x86 chip for extra expandability. It's not unprecedented, the Apple II for ex, but it's unlikely :)
 
If they chose to use socketed SoC board modules, each with its own bank of memory, why couldn't they easily treat this as a computer cluster?
They'd have to build a whole interconnect system and re-implement message passing and scheduling abilities, like what they used to have with Xgrid but much more generalized, for that to be useful - and it'd only really benefit *highly* parallel workloads. If they're going to spend the time on that, I would think it would make more sense to just break out RAM and GPU from the SoC instead.

They could just do MP and NUMA, having each SoC work as part of a more general multiprocessor machine rather than a segmented compute cluster, but again that seems unlikely and a lot more work than breaking out RAM and GPU.
 
Apple Silicon can’t be as good as a high-end PC with an Nvidia 4090, etc., so what’s the point? The M1 Ultra Studio is fine for 99.9% of pros. The Mac Pro was a huge con and a waste of money; they never really upgraded the cards, so what’s the point of it exactly? You spend £50,000 for what, exactly? Nothing.
It makes for an effective doorstop or paperweight. That is, if you exclude the wheels. :)
 
They should just put the SoC on a "compute card", so people can at least upgrade that way. Want more RAM, more GPU cores, or an upgrade to M3? Just buy a new compute card and swap them. Would be the most elegant way of combining upgradability with the unified architecture of Apple Silicon.
Aaaah, the Processor Direct Slot: I remember it with such fondness…
 
What the Mac Pro primarily needs is more massively parallel compute, not real-time graphics. It is possible Apple has a solution but isn't calling it a GPU: something to further accelerate 3D ray tracing and scientific compute workloads. Maybe they will have add-on MPX compute cards. As impressive as the M2 Ultra will be, it will not have enough parallel compute power for the most demanding users. This might seem like a niche use, but it serves the users Apple is most proud of representing, and they make up a large share of Mac Pro owners. If they don't have something for this, it probably just isn't ready and will launch next year. It will be interesting to see if there are any surprises.
 
Why is everyone so sure there won't be internal PCIe slots? They can be connected internally via Thunderbolt 4. It's an internal "eGPU"... macOS needs driver support and drivers for the cards, but they have engineers that can do that.
Because as far as they know, PCIe slots are ONLY for storage and GPUs. Apple has a page that lists all the current use cases for current Mac Pros; I’d imagine those hardware vendors are working on Apple Silicon drivers (because they likely don’t have a lot of competition, it’s easy money!).
 
Well Apple sure is innovative/courageous/arrogant (delete as applicable) enough to make the same mistake twice. And the recent iPhone line-up kinda shows they either don't really know, or perhaps value, what their customers actually want from each line anymore.
 
I just don't understand how Apple gets away with insulting consumers like that. It's disgusting.
They’re only insulting the folks who aren’t/weren’t buying it anyway, which is fine. Apple’s pretty good at focusing their products on the specific folks that will buy them in enough numbers to make them profitable, while at the same time ignoring all the folks that are complaining about how they’re going to not buy a product that Apple doesn’t mean for them to buy.
 
But the remedies generally slung by those who become aware are to cancel and/or return the slowed option and then pay up for the more expensive configurations that have the desired speed.
I’m surprised no one is saying to just NOT buy it. Apple, unless they’ve misjudged their products, will sell in the neighborhood of 20-30 million Macs this year. By the time they get to selling another 20-30 million next year, they’re bound to have made changes that MIGHT make it worth it for them. And, if Apple doesn’t release anything they may want, there’s always the next year or the year after that.

I agree, suggesting that someone buy more of a thing that they don’t want isn’t right for the individual and it sends the wrong message to Apple (that this is all ok with the person that plunked down the money).

Not buying anything we consider gimped would get their attention if enough people voted with their wallets.
This is the key, though. Apple will likely sell 20-30 million Macs, many configured just like this, to folks who don’t care because it’s still WAY faster than their 2018 Mac, for example. If Apple’s only market was “folks that bought Macs last year”, then I can see how that might force them to ensure incremental increases year over year on a particular model. However, as half of that number (roughly 15 million) will not have used a Mac before OR not have used ANY computer before, it’s hard to have a “call to action” for that person NOT to buy a thing that’s perfectly fine by them.
 
As far as Apple is concerned, you don't exist as an Apple customer if you are still trying to keep an 11-year-old computer running.

At some point, you have to let go of old hardware - you truly don't realize how far behind you are.
I'm pretty sure they're aware of my existence, what with the multiple services I'm subscribed to. I'm also well aware of the limitations of my now eleven-year-old computer, as I do IT at a school with many M1 MacBook Airs floating around and am typing this on an M1 13" MacBook Pro. Still, with 128GB of RAM, a 6,000MB/sec PCIe SSD RAID, and three monitors (two 4K), my tower at home still runs Lightroom, Photoshop, and even Final Cut Pro with aplomb.

I'm definitely looking to replace it soon enough with an M2 Pro mini or a Mac Studio (mainly for power saving, but the much-improved speed is also welcome), but my coffers are pretty low at the moment, and so long as my MacBeasht keeps on keepin' on, I'ma save my pennies. My original reply was more of an obnoxious semantics observation: Apple definitely *has* made expandable machines, and in fact still ships one, and you probably meant that "going forward" they won't, which is indeed possible (however undesirable that may be). But hey, this is the internet, and inarticulate language apparently compelled me to be "that guy."
 
What would be really cool, though highly unlikely, is if the M2 Ultra is a coprocessor with an Intel Xeon for the first gen AS Mac Pro, leveraging the x86 chip for extra expandability. It's not unprecedented, the Apple II for ex, but it's unlikely :)
I think what's more likely is that Apple has been developing a secret "X1" chip or some such that does in fact support PCI GPUs and perhaps even "application memory" in addition to the baked in RAM. So the on CPU memory of this unicorn machine would act like it does in the M series, but if you've a really RAM hungry application you could throw some RAM sticks in the thing to feed that program, but the OS would always just use the baked in stuff. Maybe down the line they bring eGPU options back to Pro level laptops with an M3x chip or something. Like others, I can't imagine there being enough of a market for a crazy expensive tower that only offers non-GPU PCI expansion. Mac Studio was kept secret until the very end, so maybe they can keep some wild new chip under wraps as well? Of course this is all just wild speculation.
 
I'm curious to know how many actual Mac Pro owners use Bootcamp Windows on it, and if they are willing to give up Windows when they move to Apple Silicon.
My guess would be… not that many. Folks spending that much are likely in the camp of “I need a really fast Mac to do Mac things on in order to make money”, so every minute it’s booted into Windows is a minute it’s not running Final Cut or Logic or the Mac versions of programs that user needs to make money. Those folks don’t have a problem giving up BootCamp because they never used it.

Now, I’m sure there are some that bought it because it’s the fastest Mac and they wanted it. If I had the disposable income, I wouldn’t mind having one to play around with. Might even install Windows on it just to see what that’s like. But, I’d be nowhere close to the “average” person that owns one :)
 
The real problem with the Apple chips is that they do not support nested virtualization. On my Studio, I had hoped to be able to run Parallels with Windows Subsystem for Android, but that is impossible.
Wow. I hadn't even considered this.

How about Wine/Crossover? Does that work?
I guess the problem is not so much the virtualisation, but rather that Windows has no (decent) version for ARM architecture?

edit: I checked, but it seems that there is an AS version of Parallels Desktop available that supports Windows 11. Have you tried it yet?
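The architecture question is easy to reason about from a script. Here's a minimal sketch of the rule of thumb at play (the mapping below is my own summary of how hardware virtualization works, not an official Apple or Parallels compatibility table):

```python
import platform

def guest_archs(host_machine: str) -> str:
    """Rule of thumb: hardware virtualization runs guests of the host's
    own architecture. Translation layers (Rosetta, emulators) are a
    separate, slower mechanism."""
    if host_machine in ("arm64", "aarch64"):   # Apple Silicon and other ARM hosts
        return "ARM guests only (e.g. Windows 11 on ARM)"
    if host_machine in ("x86_64", "AMD64"):    # Intel/AMD hosts
        return "x86-64 guests only (classic Boot Camp-era Windows)"
    return "unknown"

# What can this machine virtualize natively?
print(guest_archs(platform.machine()))
```

So on an Apple Silicon Mac, Parallels can virtualize Windows 11 on ARM at full speed, and that Windows build then emulates x86 apps itself; what it can't do is boot a stock x86-64 Windows image.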
 
What would be really cool, though highly unlikely, is if the M2 Ultra is a coprocessor with an Intel Xeon for the first gen AS Mac Pro, leveraging the x86 chip for extra expandability. It's not unprecedented, the Apple II for ex, but it's unlikely :)
Also, there's a more recent example of coprocessor shenanigans: the discontinued iMac Pro with its Intel Xeon CPU and T2 chip for I/O, SSD control, and such. Thing is, that machine was a hot mess, at least the one my 3D modeling friend got was. Man, did he hate that thing. So I'd wager Apple would shy away from any future chip mixing.
 
Mac Enthusiasts are JUST getting around to noticing this? :) Myself and others have been saying for YEARS that RAM upgradability is likely gone the way of the dodo across the entire lineup… some were saying that even before the first M1 shipped. And, since it shipped, that thinking has held true for every Mac they’ve released since then.

(I’m sure I can find one of my old posts somewhere out there…)

Well done you. 👍
 
Whole lot of FUD and illogical thinking up in here...

CPU upgrade, sorry... But with Intel, best hope the socket doesn't change...!

RAM upgrade... Not able for ASi, but okay with the Nvidia Grace Hopper Superchip...? Hypocrites...

Storage...? Just get what level of performance one wants on the internal...Need large amounts of storage, up to 64TB with a RAID card...

PCIe expansion is useless without third-party GPUs...! Yeah, because a GPU is the singular use-case for PCIe slots...

Apple can easily offer ASi GPGPUs, display output handled by the iGPU in the SoC, render jobs handled by ASi GPGPU(s)...

Disposable computer...? But if a "real Pro", one can slot the old computer into a render farm... And the "real Pro" is writing off their Mac Pro(s) over time anyway, right...? Right...?!?

Those going on (and on, And On, AND ON) about needing massive amounts of RAM and Nvidia GPUs, build a PC already...
 