And if there are eight LPDDR5 SDRAM chips, the question is: who is making a 64GB chip for Apple? Samsung currently advertises its largest LPDDR package at 32GB, Micron at 16GB.

The M4 Max can use four 32GB LPDDR packages to get to its 128GB total RAM.

Apple could have put sixteen 32GB LPDDR packages on the M3 Ultra daughterboard (the one that is home to the SoC), but that means doubling up the memory rows on the board. Sounds tricky to me because the PCB will need to route a great many lines in a small space, but I guess that is why Apple has high-paid EEs.
Doubling density may be the simplest solution, as the footprint of the Studio is already quite filled by the previous Ultra chip. The logic board may still have space for 8 more DRAMs, but the copper heatsink structure and the heat pipe may not.
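Back-of-the-envelope package math (the per-package capacities here are assumptions based on the parts Samsung and Micron list publicly, nothing Apple has confirmed):

[CODE]
# Rough LPDDR package math for a 512GB M3 Ultra build.
# Per-package capacities are assumptions, not confirmed by Apple.
TOTAL_GB = 512

# Option A: keep the usual 8 packages around the SoC -> needs an unannounced 64GB part
packages_a = 8
per_package_a = TOTAL_GB // packages_a   # 64 GB each

# Option B: double up to 16 of the 32GB parts the M4 Max already uses (4 x 32GB = 128GB)
packages_b = 16
per_package_b = TOTAL_GB // packages_b   # 32 GB each

print(f"Option A: {packages_a} x {per_package_a} GB")
print(f"Option B: {packages_b} x {per_package_b} GB")
[/CODE]

Option B is the "double the rows on the board" scenario; Option A needs the mystery 64GB part.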
 
It seems steep, but compared to other solutions it's not bad. Even four Framework mainboards will run you $6,400, not including a case, power supplies, cables, etc.
 
Like the M4 Pro and M4 Max chips, the M3 Ultra chip supports Thunderbolt 5 for up to 120Gb/s data transfer speeds on Macs with Thunderbolt 5 ports.
[sarcasm]
That Thunderbolt 5 port will come in really handy when backing up the latest iPhone 16 with only USB 2, which is roughly 250 times slower than Thunderbolt 5. Tim Cook is a genius who can do no wrong. I love waiting 4 hours to back up an iPhone 16 over USB 2 when it could've taken 1 minute over Thunderbolt 5.
[/sarcasm]
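For anyone curious about the actual arithmetic (the 500GB backup size is just an assumed example; real USB 2 throughput is well below line rate, which is how you end up in multi-hour territory):

[CODE]
# Rough transfer-time comparison for an iPhone backup over USB 2 vs. Thunderbolt 5.
# Backup size is an assumed example; link speeds are line rates, not real-world throughput.
backup_gb = 500                 # assumed backup size in gigabytes
usb2_gbps = 0.48                # USB 2.0 line rate, 480 Mb/s
tb5_gbps = 120.0                # Thunderbolt 5 with Bandwidth Boost, 120 Gb/s

def minutes(size_gb, link_gbps):
    return size_gb * 8 / link_gbps / 60

print(f"USB 2:         {minutes(backup_gb, usb2_gbps):7.1f} min")
print(f"Thunderbolt 5: {minutes(backup_gb, tb5_gbps):7.1f} min")
print(f"Line-rate ratio: {tb5_gbps / usb2_gbps:.0f}x")
[/CODE]

That works out to about 250x at line rate (roughly 167x if you compare against Thunderbolt 5's 80Gb/s base rate), so the ballpark above holds.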
 
Apple should be optimizing ComfyUI and similar AI tools for these beasts. With the amount of unified RAM they have, they would put the 4090 to shame if Apple helped the developers optimize the code. But they are missing this opportunity and letting much cheaper NVIDIA cards outperform the Maxes and Ultras by a wide margin. It's sad.

The only AI Apple cares about is their own, I imagine.
 
Depending on benchmarks, I am considering the Ultra this time around. And I think the Ultra will fare better in terms of sales with this iteration, considering how much memory it has and how fast that memory is for those planning on running local LLMs. So, I think I'll just wait for benchmarks before I pull the trigger.
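A quick sanity check on why the memory capacity matters for local LLMs (parameter counts, quantization levels, and the ~10% overhead factor are assumptions for illustration, not benchmarks):

[CODE]
# Rough estimate of whether a quantized model's weights fit in memory.
# Overhead factor and model/quant choices are assumptions for illustration.
def weights_gb(params_billions, bits_per_weight, overhead=1.10):
    return params_billions * bits_per_weight / 8 * overhead

machines = {"RTX 4090 (24GB VRAM)": 24, "M4 Max (128GB)": 128, "M3 Ultra (512GB)": 512}
models = {"70B @ 4-bit": weights_gb(70, 4), "405B @ 4-bit": weights_gb(405, 4)}

for model, need in models.items():
    for machine, capacity in machines.items():
        verdict = "fits" if need < capacity else "does not fit"
        print(f"{model} (~{need:.0f} GB) {verdict} in {machine}")
[/CODE]

Not all of the unified memory is addressable by the GPU, and the KV cache adds more on top, but the order of magnitude is the point.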
 
But then, who makes 64GB LPDDR chips? AFAIK, Samsung's largest is 32GB, and Micron's is 24GB.
Apple getting first dibs? Like the 2TB NAND chips used in the 8TB M4 Pro mini and now the 16TB Studio configs, these are hardly used anywhere else.
 
Ah the good old days.

And now... convoluted?
 
Up to just over half a terabyte of RAM is crazy.
It is.
But you used to be able to go up to 1.5TB of RAM in the Intel Mac Pro from 2019. That is of course not "unified memory," which is what makes the M3 Ultra special – its memory can also be used as video memory by the GPU.
 

I wouldn't be shocked if the number of people who configured a 1.5 TiB RAM Mac Pro is zero.

(The RAM in the M3 Ultra also runs at more than twice the clock — LPDDR5-6400 vs. DDR4-2933.)

But yeah, in theory, there are workloads that the Mac Pro could handily fit in RAM that the Mac Studio won't be able to. Given the current pace (the M1 Ultra maxed out at 128 GiB, the M2 Ultra at 192), I guess we'll see that resolved around the M6 Ultra.
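The "around the M6 Ultra" guess roughly follows from the pace so far; a naive extrapolation (obviously not a roadmap) looks like this:

[CODE]
# Max unified memory per Ultra generation so far, then a naive extrapolation
# toward the 2019 Mac Pro's 1.5 TiB (1536 GB) ceiling. Not a roadmap.
history = {1: 128, 2: 192, 3: 512}            # GB for M1/M2/M3 Ultra
target_gb = 1.5 * 1024

growth = (history[3] / history[1]) ** 0.5     # average factor per generation, ~2x
gen, cap = 3, history[3]
while cap < target_gb:
    gen += 1
    cap *= growth
    print(f"M{gen} Ultra: ~{cap:.0f} GB (projected)")
[/CODE]

At a steady ~2x per generation that already crosses 1.5 TiB around the M5 Ultra; if smaller M1-to-M2-style steps recur, the M6 Ultra is the safer bet.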
 