If they continue to offer the Mac Pro with the same chip complement as the Ultra Studio, the single biggest thing they could do to make an MP purchase sensible is to make it upgradable through at least a few generations of Apple Silicon.

I.e., when the M4 MP is released, those who already own an MP should be able to simply slide out the entire M2 circuit board and replace it with an M4 board, rather than having to replace the entire machine. The case/fans/PS for the MP should easily last a decade (and any parts that fail are readily replaceable).

It's wasteful to have to throw the whole thing out with each upgrade. Apple should stop paying lip service to being green, and actually be green.
 
That sounds reasonable in terms of being green, but don't you expect the costs of that upgrade to be almost as much as the whole machine? I'd guess it is at least 75%. I think people may choose to just get a new one in such a scenario.
 
I would think the BOM for a new MP board should be less than that for an entirely new Studio with the same specs. If so, Apple could offer an MP replacement board for the same price as a new Studio and have the same or better percentage profit margin.

Consider an upper-end M2 Ultra chip with 128 GB RAM and a 2 TB SSD.
MP price: $9,200 tower; $9,700 rack mount.
Studio price: $6,200

If Apple sold an M4 MP board for the same price as the same-spec'd M4 Studio (and assuming the M4 and M2 pricing is about the same), you'd save about 1/3 by upgrading just the board. That seems significant.
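A quick back-of-the-envelope check of that 1/3 figure, using the list prices above. The board price here is the hypothetical same-as-Studio assumption from this post, not anything Apple actually offers:

```python
# Hypothetical board-swap savings, using the M2 Ultra list prices above.
mp_tower = 9_200      # Mac Pro tower, 128 GB RAM / 2 TB SSD
studio = 6_200        # same-spec Mac Studio

# Assume a replacement MP board is priced like the same-spec Studio:
board_upgrade = studio
savings = 1 - board_upgrade / mp_tower
print(f"Savings vs. a whole new Mac Pro: {savings:.0%}")  # prints "Savings vs. a whole new Mac Pro: 33%"
```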
 
Needs huge memory support (2-4TB+) for full LLM implementations, ideally as high bandwidth VRAM...

NPU processing is going to be even more important than CPU or GPU processing.

So the machines have to have upgradable memory, to expand into multi-terabyte sizes. But since the Mx chips' memory is attached directly to the CPU package, memory would likely have to be expanded alongside the CPUs. In other words, instead of adding DIMMs to expand memory, you add CPU modules, with some kind of interconnect between CPUs to access each other's memories/caches/etc.
I think if they take the Mac Studio anywhere further, it'll be something along these lines, but not with expandable memory, as that's impossible without breaking the greatest special-sauce aspect of Apple Silicon chips. I think they're more likely to create a Mac Pro that has expandable 'compute modules' that are basically just M Ultra-series chips with ultra-high-bandwidth interconnects, so that the Mac Pro of the future could basically be a mini cluster.

Apple is reportedly building an M5 Apple Silicon-based data centre for AI, and that kind of tech is exactly what would be needed there to make it worthwhile; it would also make the most sense in a Mac Pro aimed at AI. As it is, the upcoming M4 Ultra 256 GB Mac Studio will sell like hotcakes for LLM work. The biggest problem with scaling Apple Silicon up to the biggest LLM models (and with training them) is that there's no efficient way to connect multiple machines.

The Mac Pro makes zero sense as it stands with the existence of the Mac Studio. However, if you turn the Mac Pro into a mini-cluster of closely interconnected mini-Mac Studio type compute cards, then you could have an AI workstation with up to something like 8 M4 Ultra compute modules with a total of 2 TB of RAM for like $60k. That's obviously way out of the realm of the kinds of people who would have bought Mac Pros in the past, but it would make a lot of sense for people working with LLMs in a business setting.
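To put rough numbers on that mini-cluster idea (all figures are the speculative ones from this post, not announced specs):

```python
# Speculative 8-module Mac Pro cluster, using the numbers guessed above.
modules = 8
ram_per_module_gb = 256       # rumored M4 Ultra ceiling per module
price_usd = 60_000            # the ballpark figure above

total_ram_gb = modules * ram_per_module_gb
print(total_ram_gb)                      # 2048 GB, i.e. 2 TB of pooled RAM
print(price_usd / total_ram_gb)          # ~$29 per GB of pooled RAM

# Would a large open model fit? Weights alone, at 8-bit quantization:
params_billions = 671                    # e.g. a DeepSeek-R1-class model
weights_gb = params_billions * 1         # ~1 byte per parameter at 8-bit
print(weights_gb <= total_ram_gb)        # True, with room left for KV cache
```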
 
Surely it's only a small fraction of highly-specialized LLM implementations that use that much VRAM. Can you cite any workstations (as opposed to computer clusters) that offer this? Remember the MP is a workstation, so you need to compare it to other workstations.

I believe NVIDIA's highest-VRAM GPU is the dual-GPU H100 NVL, which only offers 188 GB.

When you say the MP needs to offer 2-4 TB VRAM, it sounds like you're saying each MP needs to be competitive with a multi-GPU server cluster. And I don't think that is Apple's target.
 
What is the market for a Mac Pro?
Does Apple want to continue playing in that market, “bragging” about the performance of its chips?
Time will tell.
Every computer they sell comes with a clear “if you don’t need” clause these days. That clear line where, “Oh, that’s what you want to do? Yeah, these, or higher, are the only ones for you.”

Air - mobile, thin, light - if you’re a light user so don’t need fans
MacBook Pro - mobile, fans, more ports - if you don’t need more than 128 gigs of RAM
Mac Studio - desktop, even more ports - if you don’t need slots for network cards, storage cards, etc.
Mac Pro - desktop/ports/slots <— That’s the market for a Mac Pro. Anyone who doesn’t need a macOS desktop with that many ports and internal slots should be getting something else.
 
Surely it's only a small fraction of highly-specialized LLM implementations that use that much VRAM. Can you cite any workstations (as opposed to computer clusters) that offer this? Remember the MP is a workstation, so you need to compare it to other workstations.

I believe NVIDIA's highest-VRAM GPU is the dual-GPU H100 NVL, which only offers 188 GB.

When you say the MP needs to offer 2-4 TB VRAM, it sounds like you're saying each MP needs to be competitive with a multi-GPU server cluster. And I don't think that is Apple's target.
Servers with multiple H100s exist. That's how they get these large amounts of VRAM in one machine. They don't need clusters. DeepSeek-R1 was designed to fit in a machine with 8x H100s.
 
I don't know!

The only Mac Pro I ever saw in actual use was in a recording booth for a full orchestra that my ex played in. It had 4 PCIe cards with 8 analog inputs per card, and they would record all 32 channels at every performance, then mix. Is there anything else that can do that?
A similarly configured PC, yes.
 
I can see Apple unveiling custom I/O for Mac Studio that enables high speed data transfer between stackable units, including expansion units with PCIe slots for niche I/O like audio/video ports, additional networking and storage. The Mac Pro has outlived its usefulness in the Apple silicon system on a chip era.
 
Servers with multiple H100s exist. That's how they get these large amounts of VRAM in one machine. They don't need clusters. DeepSeek-R1 was designed to fit in a machine with 8x H100s.
Sure, but that misses the central question I raised: Why do you think an MP, which is a workstation, needs to compete with a multi-GPU server for VRAM (i.e., have 2 TB–4 TB VRAM)?

Needs huge memory support (2-4TB+) for full LLM implementations, ideally as high bandwidth VRAM...

NPU processing is going to be even more important than CPU or GPU processing.

So the machines have to have upgradable memory, to expand into multi-terabyte sizes....

What's the highest VRAM currently available in any commercial workstation?

Your claim would make more sense to me if you instead said something like: "While no one expects a workstation like the MP to be able to run LLMs requiring 2 TB - 4 TB VRAM, Apple is also using its Ultra chips in its servers. If it wants to run upper-end LLMs on the latter, then those chips additionally need to be capable of inter-chip pooled memory."
 
I'd say that Apple will be making their own discrete GPUs soon for these machines (well, whatever they become in the future).
Not likely. If they do, you’ll hear about it at WWDC first. Look for any changes they make to the memory model. GPUs for Apple devices, UNLIKE AMD/Intel, MUST have access to the same RAM that the CPU has access to. If they don’t change that model, then there are no discrete GPUs.
 
I reckon the Mac Pro was designed when an M2 Extreme was still expected from the chip team. Obviously that never happened; maybe an M(something) Extreme can happen and the Mac Pro will find its market, but without it it’s just a bit lost.
There was never an M2 Extreme expected. The ONLY thing the Mac Pro was ever going to be (and I may have posted it here before its release), is a Mac Studio with slots, and that’s what it is. And, as such, just by existing, it was the fastest macOS system with slots, no need for it to be any faster.

Those unfortunate souls that leaked the M2 likely lost their jobs immediately after that as it was a planted story to see who would leak it.
 
Sure, but that misses the central question I raised: Why do you think an MP, which is a workstation, needs to compete with a multi-GPU server for VRAM (i.e., have 2 TB–4 TB VRAM)?

What's the highest VRAM currently available in any commercial workstation?

Your claim would make more sense to me if you instead said something like: "While no one expects a workstation like the MP to be able to run LLMs requiring 2 TB - 4 TB VRAM, Apple is also using its Ultra chips in its servers. If it wants to run upper-end LLMs on the latter, then those chips additionally need to be capable of inter-chip pooled memory."
Developers can and do run LLMs locally. That's a thing. Additionally, many companies ban accessing off-premise LLMs due to IP-leakage concerns.
 
Not likely. If they will, you’ll hear about it at WWDC first. Look for any changes they make to the memory model. GPU’s for Apple devices, UNLIKE AMD/INTEL, MUST have access to the same RAM that the CPU has access to. If they don’t change that model, then there’s no discrete GPU’s.
They don't have the in-house talent currently needed to undertake such an effort and would likely need to do something similar to how they acquired PA Semi. That is, in addition to your point about the whole memory architecture. 'Fraid we are on the current path for some time. Good for most, but not all.
 
Developers can and do run LLMs locally. That's a thing. Additionally, many companies ban accessing off-premise LLMs due to IP-leakage concerns.
Yes, and they can buy multi-GPU servers and use them locally if they need 2 TB - 4 TB VRAM. Still doesn't answer my question about why the MP, which is a workstation, needs to have the capabilities of a multi-GPU server, since those are two different categories of product.

Just because a long-haul jet serving a major route needs to carry a few hundred passengers, that doesn't mean it makes sense to offer that same capacity in a regional jet.

This thread's about the MP, but it sounds like you're proposing Apple produce a new product—a multi-GPU server box with extremely high pooled VRAM—in addition to the MP.

If you're going to run an LLM that requires 2-4 TB VRAM, you need not just the VRAM, but the GPU capacity to go with it. Does the current MP box have the space, cooling, and power supply to support that? If not, that's going to have to be a different product.
 
Obviously it's easy to armchair-engineer from my couch, but I'd still like to see a daughter-card-based CPU / memory upgrade path ... For instance, the Mac Pro could come with one M5 Max, but if you want more CPUs / GPUs / RAM, you could order more M5 Max cards that plug into dedicated card slots and work in tandem.

Probably not technically feasible.
 
The whole release schedule seems a bit wonky.

Why wait and upgrade an M4 Ultra when within a few months there will probably be an M5?

Maybe they need to hold off and do the M5 Ultra for this device at launch time.
The M4 now for lower-end iMacs and the MacBook Air.

Stagger the releases better, with more power going first to the few power users willing to pay more.
 
Not so sure whether there will be an M4 Ultra variant. It might be an M5 version. Also not sure how Apple will redesign the Mac Pro.
 
Some people are happy to pay a premium to have everything inside one case. I am one of them.
 
Nice! This was 12 years ago or so, and audio/video isn't my field. I had no idea analog audio bandwidth was so low, comparatively speaking.

How about latency, though, with USB?
Latency for straight-up recording like that doesn't matter much, because it's all being played together. For re-recording / inserts / double tracking it can be a problem. While USB 2 wasn't perfect for latency, it was workable.

Now, with a TB3 interface at 2-3 ms latency (and USB-C isn't much slower), it's really a non-issue; it's more a matter of tweaking the buffer size on the computer as low as you dare. Also keep in mind, as a quick reference, that sound travels at about 1 ft/ms, so if you sit 3 feet further back from the speakers, you've added 3 ms of latency. Playing a few feet further from an amp adds latency too, so headphones are important. Basically it's a non-issue with modern gear, within reason.
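Those rules of thumb can be sanity-checked with a little arithmetic. The buffer and sample-rate values below are illustrative, not from any particular interface:

```python
# Latency contributions mentioned above, in milliseconds.
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """Delay added by one audio buffer."""
    return buffer_samples / sample_rate_hz * 1000

# A small buffer on a modern interface:
print(buffer_latency_ms(128, 48_000))    # ~2.7 ms

# Acoustic delay: sound travels roughly 1 ft/ms (~343 m/s), so sitting
# 3 ft further from the speakers adds about as much delay as that buffer.
speed_ft_per_ms = 1.125
print(3 / speed_ft_per_ms)               # ~2.7 ms
```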
 