When are they going to stop goofing around and discontinue the Mac Pro? It is still on M2 and still $7k.

Just because you might need it doesn't mean others don't.

it still has PCIe

Blackmagic DeckLink 8K Pro G2: An alternative for 8K digital cinema capture and playback, featuring quad-link 12G-SDI connections
NVMe RAID cards (with INSANE speeds)
Kona 12G-SDI and HDMI 2.0 connectivity for high-resolution formats, including 8K/UltraHD2 and 4K/UltraHD.
Various pro audio cards.

Totally agree about the price. It should start at $4k.
 
If it has that (M5U + 1TB) and it comes in under £4000, it’ll be a day one purchase for me, too. Lol. Sadly, I think I might be a tiny bit out on price. 😂
The base M5 Ultra might be that.

1TB will be at least £10,000. Probably £12,000 or so.
 
This will be the most powerful AI machine…

…that doesn’t run AI.

Ollama and ComfyUI want to have words with you.

In fact, with Apple's unified memory, you can run larger models than on a pimped-out, overpriced NVIDIA 5090. Sure, it may be slower, but at least it can do it for substantially less money.
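For anyone curious what that looks like in practice, here's a minimal sketch using Ollama's Python client on an Apple Silicon Mac. The model tag and the memory figures in the comments are illustrative assumptions, not benchmarks:

```python
# Rough sketch: querying a large local model through Ollama's Python client.
# A 70B-class model quantized to ~4 bits needs roughly 40+ GB of memory,
# which fits in a 64/128 GB Mac Studio's unified memory but not on a 32 GB 5090.
import ollama

response = ollama.chat(
    model="llama3.1:70b",  # assumption: any model tag you've pulled locally works here
    messages=[{"role": "user", "content": "Summarize unified memory in one sentence."}],
)
print(response["message"]["content"])
```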

Now if you're talking about Apple's joke of Apple Intelligence, well then I'll give you that.
 
Sounds like the Apple Mac Studio with M5 Ultra Chip is going to be very powerful indeed, and worthy of consideration for all Mac users who need that kind of speed and power.
 
Only the Max variants have had the connector, if I remember correctly. The base, Pro, and Max are all different silicon, and it's just the Max variant that gets doubled up with the connector to double everything.
My understanding is that only the Max variants, and not even all Max versions, have it. It seems to be only the odd-numbered generations so far…

Anyways, I've heard rumors here and there that Apple wants Apple Silicon in its AI servers. If true, I'd be curious to see what an M5 Ultra (or whatever it would be called) blade server would be like. Considering how small the motherboards are (and they could be even smaller if you moved some things, like power delivery and some of the cooling, into more centralized, standard blade-server components), I bet Apple could fit a heck of a lot of nodes into a single standard 42U rack.
 
lol what

pytorch runs fine

Local models work fairly well despite the drawbacks of M4 (low inference speed mostly).
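As a concrete example of what "runs fine" means here, a minimal MPS smoke test; the matrix sizes are arbitrary, just enough to exercise the GPU:

```python
# Check that PyTorch's Metal (MPS) backend is active on Apple Silicon,
# then run a small matmul on the GPU.
import torch

device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")
print(f"Running on: {device}")

a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b                      # dispatched to the GPU via Metal Performance Shaders
if device.type == "mps":
    torch.mps.synchronize()    # wait for the GPU before reading/timing results
print(c.shape)
```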

M5 will hugely improve this: per-GPU-core neural acceleration, matrix-math improvements, etc. Inference should be significantly faster, depending on how high they get the memory bandwidth and what they do with the interposer and SoIC / chip stacking, if that happens this generation.

There are Metal-optimized libraries that work pretty well, and there is an Apple-funded CUDA -> MLX workflow in progress.
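On the MLX side, local generation is already a few lines of Python with the mlx-lm package. A quick sketch; the model repo below is just an example of the quantized community releases and is my assumption, swap in whatever fits your RAM:

```python
# Load a quantized model from the mlx-community hub and generate locally on MLX.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")
text = generate(
    model,
    tokenizer,
    prompt="Explain why unified memory helps local LLM inference.",
    max_tokens=200,
)
print(text)
```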

This machine is going to be excellent for personal and professional AI development / fine-tuning, and at ~$12k it’s going to cost what one NVIDIA card with 1/5 of the GPU memory does.

If you need really big-data NVIDIA stuff, you can just connect to the cloud, but for local work nothing that isn’t 10x the cost will compare to this for a long time.

I say all this also owning a 5090 that’s going into a Linux workstation next week for local CUDA work by the way. :)


Now, if you were talking about Apple Intelligence… fair point! I have low faith, especially given the new rumors that they are not teaming up with Anthropic, which would be an enormous mistake, if true.

Apple’s small models are doing pretty cool stuff, but they’re going to take a couple more years to be really relevant relative to the best of what else is out there now, and that also depends on them retaining talent. They are doing groundbreaking R&D with on-device models, particularly in RAM-constrained areas, but everywhere else they are trailing pretty badly.
I have to wonder what things like Folding@Home would be like if fully optimized for M-series chips? Also wonder what a JeOS version of F@H would be like?
 
I would so love if they could figure out a way to have a GPU that can match the RTX 5090, with the amazing integrated RAM of Apple Silicon. I don't even need gigantic CPU performance…
The 5090 is a 600 W board. I don't think that kind of performance is going to fit in a Studio, and I don't think the market is big enough for a chip dedicated solely to the Mac Pro.

Even the infamously power-hungry dual G5, fully loaded, pulled less than just the 5090 does.
 
I think some (very few) people want to use very specific and expensive PCIe cards, and an old (but still quite powerful) Mac is better than nothing for them.
So the price makes sense. The machine exists just for those very few users, and it has to be justifiable for Apple. Apple moved away from computers with any sort of customization a decade ago; whoever still wants that has to pay.
And of course it's not consumers, it's businesses. If you compare it with any pro-oriented, low-production machine built for very specific businesses, it's not that crazy a price.

I want a Mac Pro. I absolutely hate having an external drive PERMANENTLY connected to my Mac. I want internal expansion of storage. But it’s not worth the $3,000 premium over the Mac Studio, especially now that it’s still on M2. I was actually going to just get one if it got the M3 Ultra.
 
I used to think the Mac Studio was impressive until I saw the new Mac Mini with a Pro chip. Most people don’t need more than that.
 
until I saw the new Mac Mini with a Pro chip
Here's my take: the M4 Mini is where it's at, and most consumers will be fine with the non-Pro Mini. The Pro version is more expensive, runs hot, and doesn't have the thermal headroom for extended periods of processing.

The Studio has more of everything: more RAM, more ports, more CPU cores, and more GPU cores. It also runs significantly cooler, and as such I think it's very impressive, though it's also much more expensive :)
 
Hopefully, the M5 Ultra can be ordered with more than 512GB of RAM. 512GB isn't quite enough for an open-weights, non-quantized model like DeepSeek R1. Then it could be used standalone for LLM duties without compromise, as well as by people who tackle combinatorial problems by writing highly parallelized programs. Problem is, it will be closing in on $20K; you'd need a grant or a research lab for that. If it were $10K (no chance), one could think of amortizing it at $2K/year over 5 years of use before replacement. Just thinking out loud here.
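The back-of-the-envelope version of that, assuming R1's published 671B parameter count and its native FP8 weights (the overhead figure is a loose placeholder, not a measurement):

```python
# Rough memory math for running DeepSeek R1 unquantized.
params = 671e9                 # published DeepSeek R1 parameter count
bytes_per_param = 1            # native FP8 weights, ~1 byte per parameter
weights_gb = params * bytes_per_param / 1e9
overhead_gb = 80               # assumption: KV cache + runtime overhead, ballpark only
print(f"weights ~{weights_gb:.0f} GB, total ~{weights_gb + overhead_gb:.0f} GB")
# Weights alone are ~671 GB, already past a 512 GB ceiling before any overhead.
```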
 
Wasn't there a rumor that Apple was going to move away from the local silicon interconnects (LSIs) that are used to bridge Max chips together in the Ultra package? I guess this would either be a single monolithic chip (high $$$) or a full chiplet design using a full interposer (e.g. CoWoS).
The rumor is it will be SoIC, which is compatible with the InFO-LSI advanced packaging used for UltraFusion. So it’s not either/or. The M5 Ultra could use both without using CoWoS.
 
I want a Mac Pro. I absolutely hate having an external drive PERMANENTLY connected to my Mac. I want internal expansion of storage. But it’s not worth the $3,000 premium over the Mac Studio, especially now that it’s still on M2. I was actually going to just get one if it got the M3 Ultra.
I know how you feel but sadly, that's not meant for you...
You should get an external drive and a Mac Studio, or at least that's what Apple believes. Or move to PC.
With the Apple Silicon architecture, it's been sad to see all the customization go away: the RAM, the drives, at some point even the CPU and GPU. And most of the repairability.
But we also got the greatest and cheapest consumer Macs ever. Some of them really resemble Jobs' dream computer.
 
Waiting for M7 personally, should be even better


Sorry couldn't resist :)
 