This is the configuration people are buying to run local AI models. A cluster of four Mac Studios with 512GB each can run the largest models.
And how many people have bought it/are buying it in a 512GB configuration?

Apple wouldn't drop something (e.g. iPhone mini, iPhone Plus, original 12" MacBook, etc.) if it was selling in large enough numbers for it to be justified and to remain in the product mix.
 
What on earth do you need 512gb of ram for? Only people that need that are probably some of the folks who think 8gb isn’t enough to surf the internet, watch some videos, and complete light tasks.
 
$4,000 is the current market price of 256GB DDR5. Apple was definitely losing money if you chose 512GB. Strange but true.
They would only be losing money if they paid these prices (between $2,000 and $4,000) for 512GB, or whatever current market prices are.

Apple locks in prices for a pre-established supply, and once that supplier contract is fulfilled, Apple negotiates new contracts at rates based on current market prices. So I assume that once Apple's existing supply of these 512GB parts was exhausted, instead of passing along a significant price increase (and potentially getting bad PR for doing so), they decided to just remove 512GB as an option entirely and will bring it back once they can procure lower prices.

Kuo says that Apple negotiates memory prices with suppliers on a quarterly basis, so price increases are expected in the second quarter of 2026
 
They would only be losing money if they paid these prices (between $2,000 and $4,000) for 512GB, or whatever current market prices are.

Apple locks in prices for a pre-established supply, and once that supplier contract is fulfilled, Apple negotiates new contracts at rates based on current market prices. So I assume that once Apple's existing supply of these 512GB parts was exhausted, instead of passing along a significant price increase (and potentially getting bad PR for doing so), they decided to just remove 512GB as an option entirely and will bring it back once they can procure lower prices.



In all likelihood, Apple's contracts were limited to a year or less. Suppliers most likely told Apple no on 512GB (4x128GB), since hyperscalers are willing to buy at any cost.

Yes, Apple can weather the storm for iPhone, but Mac Studio doesn't hold a candle to hyperscalers who have committed to contracts through 2030.

"Quarterly" pricing is now old school. Prices are now hourly, and even determined after the DRAM has shipped.

 
Lol, they have been promoting it as an LLM beast due to the amount of shared RAM it can have, and now they remove it because of the very market they were aiming it at. Talk about irony.
 
Makes sense.

They must have different stockpiles for all the product tiers, or at least some method of calculating what's needed where and how much potential stock sits against each variant. At some point, given the current "shortage", someone wisely thought to compare overall sales of the 512GB variant against the money potentially saved if that memory-chip allocation were split across multiple lower-tier models.

And the math showed it was so 🙂

* Now while this is logical, perfectly fine, and not a big loss, might we focus on the outrageous prices Apple extorts for NVMe storage? That's definitely newsworthy to me, for each and every generation, again and again on the front page. They literally use the same controller everywhere and just solder on different flash chips depending on capacity. The controller costs them a tiny bit, but the flash? At the bulk quantities they obtain? Honestly not even close to enough to justify what they ask of you. And mind you, there's no constant, ongoing, year-end-persisting flash chip "crisis" there either.
 
I’m not a techie like some here, but isn’t 512GB of RAM, as mentioned in the second sentence, a teensy bit overkill?
It depends on what you're doing. For most people it's more than a little overkill. I could easily use that much RAM: some of my scientific computing can use basically as much RAM as exists on a system. I generally use my university's high-performance cluster, but if I could run some of those processes locally, that would be amazing.
 
Apple wouldn't drop something (e.g. iPhone mini, iPhone Plus, original 12" MacBook, etc.) if it was selling in large enough numbers for it to be justified and to remain in the product mix.
Isn't this a BTO option? Apple doesn't have to concern themselves with product stock vs. demand when it's a one-off build config. Someone mentioned Apple saving the modules for data centers, and honestly that sounds rather plausible. High-density modules are worth more going to corporate customers. It's just like how the highest-capacity NVMe SSDs and even conventional hard disks are inflated in price, but low-capacity ones aren't hit nearly as hard.
 
I drive by Micron's massive expansion occasionally. I can only assume this is to chase the demand. It's a good two years from completion, I'd guess. Huge expansion, though.

 
AI needs to crawl up its own ass and die. While I'm terrified for what the inevitable AI bubble burst will mean for the economy (and for the thousands that will lose their jobs), it really can't happen soon enough.

If you think AI is a bubble, you have no idea. AI has escaped from the labs and is moving so fast that nobody can keep track of everything. There are hundreds of thousands of really smart people building all kinds of stuff with nothing but thought. Is it production ready? Not yet, but it will be. Is it good enough? Probably, yes.

At this point the main thing holding back AI is QA test cases. I'm sure at some point someone will come up with a mechanism for AI to do a black box stare-and-compare kind of QA, if they haven't already.

The fact is, AI is probably better at any given thing than 80% of the humans out there. Could it, say, come up with a better, more cost effective and lower-impact way of making steel? Maybe. Could you? Probably not.
 
I figured this would happen, so I jumped on a second 512GB model in January to stay ahead of the memory shortages. Now we have two: one as a devkit and one for a client.

Will try to scoop up refurbs or used ones as they become available.
 
The fact is, AI is probably better at any given thing than 80% of the humans out there. Could it, say, come up with a better, more cost effective and lower-impact way of making steel? Maybe. Could you? Probably not.

Just because it holds a wealth of information doesn't mean it's more intelligent than an encyclopedia.

Better at regurgitating its training data, of course. But not at knowing what it means. It just knows that in a certain context it "read", a given word is most likely to follow the preceding ones. That's why it's all inference. There's zero intelligence behind it; otherwise companies wouldn't have to force it to lie when certain questions are asked.

AI does excel at pattern matching and identifying discrepancies.
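For what it's worth, the "this word is most likely to follow" idea can be sketched in a few lines. This is only a toy bigram counter, not how production LLMs actually work (they use learned neural representations over long contexts, not raw word counts), but it illustrates the mechanic being described:

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each word, which words follow it in the training text."""
    words = text.split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict(model, word):
    """Return the word most often seen after `word`, or None if unseen."""
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

model = train_bigram("the cat sat on the mat and the cat ran")
print(predict(model, "the"))  # prints "cat" -- it follows "the" most often here
```

Scale that up to billions of parameters and whole documents of context instead of a single preceding word, and you get something far more capable, but the "predict the likely continuation" framing is the same.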
 
Is there anything else? I don't know anyone running or trying LLMs on their own computer.
I put some of this in another comment, but as a scientist I can use basically all the RAM available on a system.

If I had the money, I'd buy the Mac Studio with 512 GB of RAM in a heartbeat. Some of my processes can use 64+ GB of RAM each, and with that much memory I could run several at once. I even have some files that are 400+ GB each. Being able to load them completely into RAM would be amazing; what I typically have to do instead is subset them into smaller files before loading.
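To illustrate the subsetting point: when a file is larger than RAM, you stream it and aggregate as you go instead of loading it whole. A minimal sketch in Python, using a hypothetical CSV (my real files are binary scientific formats, but the principle is the same):

```python
import csv

def column_sum(path, column):
    """Sum one numeric column while streaming the file row by row,
    so memory use stays flat regardless of file size."""
    total = 0.0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # reads one row at a time
            total += float(row[column])
    return total
```

Because `csv.DictReader` yields one row at a time, even a 400 GB file never has to fit in memory; you trade RAM for a sequential pass over the disk, which is exactly what having 512 GB of RAM would let me skip.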
 
Just because it holds a wealth of information doesn't mean it's more intelligent than an encyclopedia.

Better at regurgitating its training data, of course. But not at knowing what it means. It just knows that in a certain context it "read", a given word is most likely to follow the preceding ones. That's why it's all inference. There's zero intelligence behind it; otherwise companies wouldn't have to force it to lie when certain questions are asked.

AI does excel at pattern matching and identifying discrepancies.
Well, maybe. There's considerable debate about the extent and nature of the intelligence of current models.

For: https://www.nature.com/articles/d41586-026-00285-6

And a countering reply (I know this is a letter to the editor, but it's a somewhat weak reply): https://www.nature.com/articles/d41586-026-00495-y
 