How about we forget about the Mac Pro being a fashion accessory and just make an outstanding performer that just works?
Pretty sure it was just released to look cool behind MKBHD and iJustine in their videos.
If they continue to offer the Mac Pro with the same chip complement as the Ultra Studio, the single biggest thing they could do to make an MP purchase make sense is to make it upgradable through at least a few generations of Apple Silicon.
I.e., when the M4 MP is released, those who already own an MP should be able to simply slide out the entire M2 circuit board and replace it with an M4 board, rather than having to replace the entire machine. The case/fans/PSU for the MP should be able to last a decade (and any parts that fail are easily replaceable). It's wasteful to have to throw the whole thing out with each upgrade. Apple should stop paying lip service to being green, and actually be green.
I would think the BOM for a new MP board should be less than that for an entirely new Studio with the same specs. If so, Apple could offer an MP replacement board for the same price as a new Studio, and have the same or better profit margin.
That sounds reasonable in terms of being green, but don't you expect the cost of that upgrade to be almost as much as the whole machine? I'd guess it is at least 75%. I think people may choose to just get a new one in such a scenario.
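To put rough numbers on that margin argument, here is a back-of-the-envelope sketch in Python. Every figure in it (Studio price, BOM costs) is a made-up placeholder for illustration, not an actual Apple number.

```python
# Hypothetical figures only -- chosen to illustrate the margin argument,
# not actual Apple pricing or BOM costs.
studio_price = 4000.0   # assumed retail price of a comparably specced Studio
studio_bom   = 2000.0   # assumed build cost of that whole Studio
board_bom    = 1700.0   # assumed build cost of just the MP logic board
                        # (no case, fans, or power supply)

def margin(price: float, cost: float) -> float:
    """Gross margin as a fraction of the selling price."""
    return (price - cost) / price

# Selling the replacement board at full Studio price would, with these
# placeholder numbers, give the same or better margin than the Studio itself.
print(f"Studio margin:            {margin(studio_price, studio_bom):.1%}")
print(f"Replacement-board margin: {margin(studio_price, board_bom):.1%}")
```

With those placeholder numbers the board-only margin comes out higher; whether customers would pay close to whole-machine money for a board swap is the separate question raised above.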
Needs huge memory support (2-4 TB+) for full LLM implementations, ideally as high-bandwidth VRAM...
NPU processing is going to be even more important than CPU or GPU processing.
So the machines have to have upgradable memory, to expand into multi-terabyte sizes. But since the Mx CPUs' memories are directly attached to the CPU die, it likely means that memory would have to be expandable alongside the CPUs. In other words, instead of adding DIMMs to expand memory, you add CPU modules to expand memory, and there's some kind of interconnect between CPUs to access each other's memories/caches/etc.
I think if they take the Mac Studio anywhere further, it'll be something along these lines, but not with expandable memory, as that is impossible without breaking what is the greatest special-sauce aspect of Apple Silicon chips. I think they're more likely to create a Mac Pro that has expandable 'compute modules' that are basically just M Ultra-series chips with ultra-high-bandwidth interconnects, so that the Mac Pro of the future could basically be a mini cluster.
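For a sense of where a 2-4 TB figure might come from, here is a rough Python sketch of the memory needed just to hold a model's weights for inference. The parameter counts, precisions, and 20% overhead factor are illustrative assumptions, not measurements of any particular model.

```python
# Rough memory estimate for holding LLM weights in (V)RAM for inference.
# Model sizes and the overhead factor are illustrative assumptions.
BYTES_PER_PARAM = {"fp16": 2.0, "fp8": 1.0, "int4": 0.5}

def weights_gb(params_billion: float, precision: str) -> float:
    # 1e9 params * bytes-per-param / 1e9 bytes-per-GB == params_billion * bytes-per-param
    return params_billion * BYTES_PER_PARAM[precision]

for params in (70, 400, 670):                # assumed model sizes, in billions of parameters
    for precision in ("int4", "fp8", "fp16"):
        w = weights_gb(params, precision)
        total = w * 1.2                      # assume ~20% extra for KV cache, activations, etc.
        print(f"{params:>4}B @ {precision:>4}: ~{w:6.0f} GB weights, ~{total:6.0f} GB total")
```

At fp16, a model in the several-hundred-billion-parameter class already lands north of 1.5 TB with overhead, which is roughly the territory the post above is pointing at.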
Surely it's only a small fraction of highly-specialized LLM implementations that use that much VRAM. Can you cite any workstations (as opposed to computer clusters) that offer this? Remember the MP is a workstation, so you need to compare it to other workstations.
What is the market for a Mac Pro?
Does Apple want to continue playing in that market, “bragging” about the performance of its chips?
Time will tell.
Every computer they sell comes with a clear “if you don’t need” clause these days. That clear line where, “Oh, that’s what you want to do? Yeah, these, or higher, are the only ones for you.”
Servers with multiple H100s exist. That's how they get these large VRAM pools in one machine. They don't need clusters. DeepSeek-R1 was designed to fit in a machine with 8x H100s.
I believe NVIDIA's highest-VRAM GPU is the dual-GPU H100 NVL, which only offers 188 GB.
When you say the MP needs to offer 2-4 TB VRAM, it sounds like you're saying each MP needs to be competitive with a multi-GPU server cluster. And I don't think that is Apple's target.
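To put the two product categories side by side, here is a quick sketch of aggregate VRAM. The per-card capacities are the commonly cited figures (80 GB for an H100, 188 GB for the dual-GPU H100 NVL pair); treat them as approximate.

```python
# Aggregate VRAM: a multi-GPU server vs. a single workstation card.
# Capacities are approximate, commonly cited figures.
H100_GB     = 80      # one H100 (SXM/PCIe)
H100_NVL_GB = 188     # H100 NVL, sold as a dual-GPU pair (2 x 94 GB)

server_gpus = 8
server_vram = server_gpus * H100_GB

print(f"8x H100 server:           {server_vram} GB pooled VRAM")
print(f"Single H100 NVL (2 GPUs): {H100_NVL_GB} GB")
print(f"Shortfall vs. a 2 TB target, even for the server: {2048 - server_vram} GB")
```

Even the 8-GPU box stays well under 1 TB with those cards, which is the gap between workstation-class memory and multi-terabyte LLM figures.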
I don't know!
The only Mac Pro I ever saw in actual use was in a recording booth for a full orchestra that my ex played in. It had four PCIe cards with 8 analog inputs per card, and they would record all 32 channels at every performance, then mix. Is there anything else that can do that?
A similarly configured PC, yes.
Sure, but that misses the central question I raised: Why do you think a MP, which is a workstation, needs to compete with a multi-GPU server for VRAM (i.e., have 2 TB–4 TB VRAM)?
I'd say that Apple will be making their own discrete GPUs soon, for these machines (well, whatever they become in the future).
Not likely. If they do, you’ll hear about it at WWDC first. Look for any changes they make to the memory model. GPUs for Apple devices, unlike AMD/Intel, MUST have access to the same RAM that the CPU has access to. If they don’t change that model, then there are no discrete GPUs.
I reckon the Mac Pro was designed when an M2 Extreme was still expected from the chip team. Obviously that never happened; maybe an M(something) Extreme can happen and the Mac Pro will find its market, but without it it’s just a bit lost.
There was never an M2 Extreme expected. The ONLY thing the Mac Pro was ever going to be (and I may have posted it here before its release) is a Mac Studio with slots, and that’s what it is. And, as such, just by existing, it was the fastest macOS system with slots; no need for it to be any faster.
Developers can and do run LLMs locally. That's a thing. Additionally, many companies ban the use of off-premise LLMs due to IP-leakage concerns.
What's the highest VRAM currently available in any commercial workstation?
Your claim would make more sense to me if you instead said something like: "While no one expects a workstation like the MP to be able to run LLMs requiring 2 TB - 4 TB VRAM, Apple is also using its Ultra chips in its servers. If it wants to run upper-end LLMs on the latter, then those chips additionally need to be capable of inter-chip pooled memory."
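If the route really were pooling memory across several Apple Silicon packages, the package count is simple division. The per-package capacity below is just an assumed figure in the range of current Ultra configurations, not a product spec.

```python
# How many memory-pooled packages would a multi-terabyte model need?
# Per-package capacity is an assumed figure, not a product spec.
import math

PER_PACKAGE_GB = 192   # assumed unified memory per Ultra-class package

for model_tb in (1, 2, 4):
    packages = math.ceil(model_tb * 1024 / PER_PACKAGE_GB)
    print(f"{model_tb} TB of model -> at least {packages} pooled packages")
```

So a 2 TB target implies on the order of a dozen pooled packages at that capacity, which is server-rack thinking rather than a single workstation.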
They don't have the in-house talent currently needed to undertake such an effort and would likely need to do something similar to how they acquired PA Semi. That is, in addition to your point about the whole memory architecture. 'Fraid we are on the current path for some time. Good for most but not all.
Yes, and they can buy multi-GPU servers and use them locally if they need 2 TB - 4 TB VRAM. Still doesn't answer my question about why the MP, which is a workstation, needs to have the capabilities of a multi-GPU server, since those are two different categories of product.
Support for GPU PCIe cards
Is there any reason why current Mac Pros don't support GPU PCIe cards?
Nice! This was 12 years ago or so, and audio/video isn't my field. I had no idea analog audio bandwidth was so low, comparatively speaking.
How about latency, though, with USB?
Latency for straight-up recording like that doesn't matter much, because it's all being played together. For re-recording / inserts / double tracking it can be a problem. While USB 2 wasn't perfect for latency, it was workable.
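The "analog audio bandwidth is low, comparatively speaking" point is easy to see with a little arithmetic. The sample rate, bit depth, and usable-USB-2.0 figure below are typical assumed values, not measurements of any particular interface.

```python
# Data rate of a 32-channel multitrack recording vs. USB 2.0 throughput.
# Sample rate, bit depth, and the usable-USB-2.0 figure are assumed values.
channels    = 32
sample_rate = 96_000          # samples per second, per channel
bit_depth   = 24              # bits per sample

audio_mbps       = channels * sample_rate * bit_depth / 1e6
usb2_usable_mbps = 280        # rough usable payload out of USB 2.0's 480 Mbit/s signaling rate

print(f"32 channels of 24-bit/96 kHz: ~{audio_mbps:.0f} Mbit/s")
print(f"Usable USB 2.0 bandwidth:     ~{usb2_usable_mbps} Mbit/s")
```

So the whole 32-channel session is a small fraction of what USB 2.0 can move; as the reply above says, latency rather than bandwidth is the part that can bite.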
I'd say that Apple will be making their own discrete GPUs soon.
Good for privacy.