What does it even matter? Apple isn't doing anything exciting with AI.
AI needs all the RAM you can throw at it. 12GB is pathetic in 2025, even for a smartphone with no AI capabilities.
My MacBook Air has 16GB, which was long overdue as standard. Apple knew they couldn't get away with 8GB anymore, but 16GB is still a joke. I do a lot of AI work and the thought of even trying to do anything AI-related on my MacBook makes me laugh. I've actually just put my MBA up for sale on eBay to get rid of it. Whatever I get for it will go towards replacing my RTX 5090. I want an RTX Pro 6000 lmao.
My main PC has 96GB (64GB RAM and 32GB VRAM) and even that limits me at times. I wish I had 128GB RAM and 96GB VRAM. Actually, I wish I had two GPUs with 96GB each. That would be 320GB of combined memory, and yes, I could hit that limit without even trying. There are some workflows/use cases where my 32GB of VRAM simply isn't enough and I keep running into out-of-memory errors. I try to run as much as I can locally, but my limited memory holds me back, so I find myself paying £90-200/month for ChatGPT/Claude services more and more. My next PC will focus on getting as much memory as possible. 128GB RAM and 96GB VRAM is the minimum upgrade for me.
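For anyone wondering why 32GB of VRAM runs out so fast, here's a rough back-of-envelope sketch. The numbers are a common rule of thumb (weights = parameter count × bytes per parameter, plus ~20% overhead for KV cache and activations), not exact figures for any specific model or runtime:

```python
# Rough, illustrative VRAM estimate for running an LLM locally.
# Assumption: weights dominate; the 20% overhead factor is a rule of
# thumb for KV cache and activations, not a measured value.

def model_vram_gb(params_billions: float, bytes_per_param: float,
                  overhead: float = 1.2) -> float:
    """Approximate VRAM needed in GB for inference."""
    return params_billions * bytes_per_param * overhead

# A 70B-parameter model at fp16 (2 bytes/param) vs ~4-bit quantised
# (~0.5 bytes/param):
fp16 = model_vram_gb(70, 2.0)  # well beyond a single 32GB card
q4 = model_vram_gb(70, 0.5)    # still over 32GB even heavily quantised
print(round(fp16), round(q4))  # prints: 168 42
```

Even a heavily quantised 70B model blows past a 32GB card under these assumptions, which is exactly the kind of out-of-memory wall I keep hitting.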
I respect Apple for being privacy-focused (or at least that's what their marketing team wants you to believe), but running only local models means you get much lower quality and you're always going to be behind the competition. Local AI models are dumb as hell. Trust me, I've downloaded loads of them. They're OK for basic questions, but if you want good reasoning/image generation/coding then forget about it. When it comes to coding, even the basic ~£20/month paid tiers of ChatGPT/Claude hold you back a lot.
If Apple actually manages to release a useful AI product, they need 24GB or even 48GB RAM as standard on Macs. iPhones should have 16GB.