This is easy to answer. It's because some bean-counter committee decided that they could add 10 cents to the bottom line by putting 2 GB less RAM in the base model.
Yup, exactly the issue with the state of Apple. And I bet it's literally right around 10 cents, too, given their scale and vendor relationships.
 
Are you for real? Apple never gives credit to their suppliers. He’ll would have to freeze over before Apple admitted they’re using ChatGPT technology.
Apple would be violating their users' privacy by hiding ChatGPT in this case; privacy is the one angle they have over their competitors. "He'll"? Looks like this tech will be handy for you too, Mr. IQ.
 
Negative. Only one iCloud stamp, not even publicly accessible, is in AWS. No Azure anywhere at all. Some GCP. But 99.9% of iCloud is Apple data centers and always has been. Hundreds of exabytes and millions of servers.

AWS/GCP at Apple is for experimentation, play, GPU, extra capacity, object storage, etc. iCloud data and your personal data is not there.

The Apple Silicon stuff is nothing new. They use that for Xcode Cloud, for simulating Apple devices like iPads and iPhones for CI/CD and builds. Think 336 SoCs (not M2 or M4 or anything retail) in a single rack.
You're wrong. Most of iCloud runs off AWS. The rest is Google Cloud or Azure.

Apple had not been in the cloud business at all until yesterday, when they announced their private cloud for Apple Intelligence. Up until that point, everything they ran in the cloud was dependent on someone else.

 
This is easy to answer. It's because some bean-counter committee decided that they could add 10 cents to the bottom line by putting 2 GB less RAM in the base model.
There is only so much room on the die. RAM takes up lots of space. Remember, everything on an M-series chip is on the same slab. They pack them to the limit. There is no extra space to add more.
 
Terrible idea, and it opens a massive privacy risk, integrating ChatGPT into Siri: millions of people's queries with potentially sensitive info will be sent to OpenAI's servers, and I'm pretty sure OpenAI stores all chat logs for further training.

I used ChatGPT maybe a couple of times and found it useless (maybe only useful for cheating on schoolwork). This is adding a ton of useless bloat no one needs or asked for.
ChatGPT is very useful. But most requests will hopefully be using Apple Intelligence. Anything that needs to go beyond on-device knowledge or Apple's Private Cloud will (at this time) be using ChatGPT.
 
And it is Apple themselves who decided not to give the 15 the 8 GB, so I don't get the point of this. The Galaxy S10 had 8 GB of RAM five years ago, at a lower price point, and it was a beautiful device for its time with no compromises in internals, camera, screen, and so on. Apple makes the decisions for both the product and the software it runs, so why skimp on the RAM when it's going to be an issue?
If the rumours of Apple genuinely being caught flat-footed on AI are true, you get the sense from today's WWDC keynote that Apple practically dropped everything they were originally working on in order to focus on AI, starting somewhere early last year.

Meanwhile, the specs and design of their iPhones are likely locked in way in advance (i.e., before 2023), meaning that at the time the iPhone 14 and 15 models were being designed, adding AI as a key feature wasn't a consideration yet. So there was no reason then to add more RAM than what they believed the iPhone needed at that particular point in time.

So my theory is that the iPhone 15 Pro got 8 GB of RAM in part to run the camera, which so happens to also be the baseline needed for Apple's AI features to run properly on their phones. Maybe they managed to squeeze an extra 2 GB of RAM into the Pro models at the last minute because they knew they would need at least one smartphone to demo this year's AI features, but I doubt it. It feels like a coincidence more than a deliberate attempt to force iPhone users to upgrade their devices early, but it's also a convenient "misstep" that so happens to work in Apple's favour.
 
There is only so much room on the die. RAM takes up lots of space. Remember, everything on an M-series chip is on the same slab. They pack them to the limit. There is no extra space to add more.
I don't think it's a space problem. The 15 and 15 Pro have mostly the same form factor. Also, the surface area of 6 GB vs. 8 GB chips can't be that different. My guess is that you could fit 16 GB in the same space. There could be a trade-off regarding power consumption, though.
 
Wonder if HomePod will benefit at all from the improved Siri..? I can choose to not use Siri on my Macs, iPad and iPhone to keep my blood pressure down, but it's not as easy to opt out when using the HomePod.

I was thinking about this. There is a news story going around that not only does the iPhone 15 Pro have a Thread radio, but a lot of Macs also have Thread radios in them. I wonder if that's so your HomePod mini can access your iPhone's or Mac's information. I don't know for sure, but one has to wonder.

Apples to oranges. Different RAM. Different architecture. Different languages. Everything running on iOS is written in compiled languages like Objective-C and Swift. The memory footprint is exceedingly efficient and fast. Contrast that with how much of the Android world runs in the JVM. It's like saying an F-350 is faster than a Mustang because it has more power.

And to your point, consumers don’t care about this. It has no bearing on anything. It’s easier to say AI works with iPhone 15 vs you need this CPU and this RAM.

Qualcomm also uses Unified Memory. Yes, Apple’s neural engine has been more powerful than theirs (up until this year) but I don’t know of any ARM chips that don’t put memory on the SoC.

And it is Apple themselves who decided not to give the 15 the 8 GB, so I don't get the point of this. The Galaxy S10 had 8 GB of RAM five years ago, at a lower price point, and it was a beautiful device for its time with no compromises in internals, camera, screen, and so on. Apple makes the decisions for both the product and the software it runs, so why skimp on the RAM when it's going to be an issue?

Yep, that's true. And it may continue to be true. Apple often locks features to their more expensive devices because they don't charge for software updates. We'll just have to wait and see if the iPhone 16 gets more RAM.

I can see why more RAM will be needed on future iPhones if this is the start of what they are doing. I'm sure they will bring some new things for the 16 Pros as well relating to AI.

Hopefully it can help the image processing for the 16 series as well.

Every AI chip on the market has 8 GB of RAM or more. Although I agree some of these AI improvements could be made to work on less, the more information it can work with at the same time, the better it will work.

So my 14 Pro Max is obsolete, but my M1 MBP, which is nearly two years older, is fine...

It’s not obsolete. It just won’t be getting AI features (apparently).

If the rumours of Apple genuinely being caught flat-footed on AI are true, you get the sense from today's WWDC keynote that Apple practically dropped everything they were originally working on in order to focus on AI, starting somewhere early last year.

Meanwhile, the specs and design of their iPhones are likely locked in way in advance (i.e., before 2023), meaning that at the time the iPhone 14 and 15 models were being designed, adding AI as a key feature wasn't a consideration yet. So there was no reason then to add more RAM than what they believed the iPhone needed at that particular point in time.

So my theory is that the iPhone 15 Pro got 8 GB of RAM in part to run the camera, which so happens to also be the baseline needed for Apple's AI features to run properly on their phones. Maybe they managed to squeeze an extra 2 GB of RAM into the Pro models at the last minute because they knew they would need at least one smartphone to demo this year's AI features, but I doubt it. It feels like a coincidence more than a deliberate attempt to force iPhone users to upgrade their devices early, but it's also a convenient "misstep" that so happens to work in Apple's favour.

I don't think that's true. If my theory above holds, they have been secretly working towards this for a while. I don't know why they didn't give the iPhone 15 8 GB of RAM, but I don't think the answer is that Apple has been behind on AI.

I think the fact that they came out with so much of their own LLM on device shows they have been working on this since before 2023.
 
It’s not obsolete. It just won’t be getting AI features (apparently).
Fair point. I have a feeling this is what most OS releases are going to be like from now on, though: pretty much just updates to AI. Even iOS 18 brings just a calculator and weird home screen stuff to pre-15 Pro devices 😅 Obviously it's not obsolete, but it won't be getting what's really new. We've never seen a cutoff this dramatic. It wasn't like this even with the Intel to Apple Silicon transition.
 
There are open-source topo maps and licensed ones. Anyway, I think Apple Maps already encompasses many layers of proprietary data (routes, traffic data, 3D buildings, etc.), so if even third-party apps in the App Store provide topo maps without breaking a sweat, I really don't understand why Apple can't afford to reach a deal to embed topo maps for the whole Earth.
I'm sure they could afford it, but Apple's primary market is urban and suburban users of phones with cellular and Wi-Fi connectivity in those environments. If anything, I think Apple should support downloaded topographical maps on Apple Watch. That seems more hiker-friendly than constantly pulling out a phone.
 
Indexing for search is not remotely the same as indexing for AI. Search is largely just searching an index. When indexing for AI, you are actually filling attributes for billions of parameters.
I know, hence why I'm stating they are doing Recall. What Recall does is create a semantic index of what is on your screen, with associated data points on your device, in a vectorized database. Then it uses a text/image encoder, etc., to parse it for whatever you ask for in natural language, in simple terms. That is simply how it has to work if you want something to work fast enough in real time. This is entirely different from a simple file and system index. It is Recall.
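The flow described above (embed snapshots into vectors, then answer a natural-language query by nearest-neighbour search) can be sketched in a few lines. This is a toy illustration, not anyone's actual implementation: the `embed()` below is a fixed-vocabulary bag-of-words stand-in for a real learned text/image encoder, and the "screenshots" are made-up strings.

```python
import math

# Toy vocabulary standing in for a learned encoder's feature space.
VOCAB = ["flight", "berlin", "sales", "spreadsheet", "dinner", "friday", "booking", "chat"]

def embed(text: str) -> list[float]:
    """Bag-of-words vector over VOCAB, L2-normalized — a crude encoder stand-in."""
    words = text.lower().split()
    v = [float(words.count(w)) for w in VOCAB]
    norm = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / norm for x in v]

def cosine(a: list[float], b: list[float]) -> float:
    return sum(x * y for x, y in zip(a, b))

# Captured "screenshots" stored as (text, vector) pairs — the vectorized index.
index = [(t, embed(t)) for t in [
    "booking confirmation for flight to Berlin",
    "spreadsheet with Q3 sales numbers",
    "chat about dinner plans on Friday",
]]

def recall(query: str) -> str:
    """Return the stored snippet whose vector is closest to the query's."""
    qv = embed(query)
    return max(index, key=lambda item: cosine(qv, item[1]))[0]

print(recall("when is my flight"))  # → booking confirmation for flight to Berlin
```

The point is only that lookup happens in vector space rather than by exact keyword match, which is what makes this different from a classic file/system index.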

Apple has published papers about the work they have done
I know, I have played with some of their models on huggingface.
in shrinking 7B+ parameter models to run on device.
No, they have not shrunk a 7B+ parameter model to run on an iPhone; they have created a specific 3B parameter model that is quantized to less than 3.4 bits per weight to make it run on device. It is not magic; there are many tiny models out there: Phi-3 Mini, Gemma 2 2B, etc. I have played with Phi-3 Mini on my iPhone 13 Pro Max and it works.
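The back-of-envelope arithmetic behind this is just `bytes = parameters × bits_per_weight / 8`. Using the figures from the post (a ~3B model at ~3.4 bits, versus unquantized 16-bit weights):

```python
def weight_gib(params: float, bits: float) -> float:
    """Approximate model weight footprint in GiB: params * bits / 8 bytes."""
    return params * bits / 8 / 1024**3

print(round(weight_gib(3e9, 16), 2))   # → 5.59  (3B at fp16 already strains an 8 GB phone)
print(round(weight_gib(3e9, 3.4), 2))  # → 1.19  (quantized to ~3.4 bits it fits comfortably)
print(round(weight_gib(7e9, 16), 2))   # → 13.04 (why an unquantized 7B+ model can't run on an iPhone)
```

Weights aren't the whole story (KV cache and activations add more on top), but it shows why aggressive quantization, not shrinking a 7B model, is what makes on-device inference viable.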

Gemini Nano models run on Android phones.

They are doing H100 level GPU semantic and contextual modeling and understanding on your phone.
They are not doing H100-level GPU semantic and contextual modeling and understanding on your phone. I don't know what that even means. An H100 can do 3,958 TOPS and an iPhone 15 Pro can do 35 TOPS. An H100 can run GPT-4o and Gemini 1.5 Advanced-class multimodal models with 1+ trillion parameters.
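For scale, the gap between those two rated peak numbers (spec-sheet throughput only; it says nothing about which models even fit in memory) works out to roughly two orders of magnitude:

```python
h100_tops = 3958       # NVIDIA H100 rated peak TOPS (sparse INT8), per the post
a17_pro_tops = 35      # iPhone 15 Pro Neural Engine rated TOPS, per the post

print(round(h100_tops / a17_pro_tops))  # → 113
```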
 
Terrible idea, and it opens a massive privacy risk, integrating ChatGPT into Siri: millions of people's queries with potentially sensitive info will be sent to OpenAI's servers, and I'm pretty sure OpenAI stores all chat logs for further training.

I used ChatGPT maybe a couple of times and found it useless (maybe only useful for cheating on schoolwork). This is adding a ton of useless bloat no one needs or asked for.
Apple says that their agreement with OpenAI prevents OpenAI from storing and using your chat logs. Also, it looks like the ChatGPT integration is only for certain kinds of queries, and you have the option to say no to each one. I am sure you will be able to turn off the ChatGPT integration entirely in Settings if you prefer. Apple's own AI was aimed at very focused, functional features, so that looks pretty useful.
 
If the rumours of Apple genuinely being caught flat-footed on AI are true, you get the sense from today's WWDC keynote that Apple practically dropped everything they were originally working on in order to focus on AI, starting somewhere early last year.

Meanwhile, the specs and design of their iPhones are likely locked in way in advance (i.e., before 2023), meaning that at the time the iPhone 14 and 15 models were being designed, adding AI as a key feature wasn't a consideration yet. So there was no reason then to add more RAM than what they believed the iPhone needed at that particular point in time.

So my theory is that the iPhone 15 Pro got 8 GB of RAM in part to run the camera, which so happens to also be the baseline needed for Apple's AI features to run properly on their phones. Maybe they managed to squeeze an extra 2 GB of RAM into the Pro models at the last minute because they knew they would need at least one smartphone to demo this year's AI features, but I doubt it. It feels like a coincidence more than a deliberate attempt to force iPhone users to upgrade their devices early, but it's also a convenient "misstep" that so happens to work in Apple's favour.
According to reports coming out of WWDC, it was in 2022 that Craig Federighi decided to change direction and embrace LLM-style AI. By then, the 14 series was ready to ship and the design for the 15 series was already locked down. They were fortunate that they had put 8 GB in the Pro phones, for whatever reason, so that they would have at least those devices to run their AI on.
 
And the iPad mini, which is older, but still: how are they selling a form factor that isn't fully capable of what was just announced? Who would buy an iPad mini now?
And basically any iPhone except the 15 Pro lineup? They still sell the 14 and 13, yet they've rendered them obsolete with regard to the direction iOS is going to take. They will still get new updates, yes, but with less and less technological relevance. And since relevance is everything in the tech world, Apple is from now on selling e-waste, if we are 100% honest.
 
People are complaining that the new AI features don’t support phones as new as the 15 or the 14 Pro.

People would also complain that Apple didn't push the envelope far enough and their new AI sucked because it didn't do this or that, or blah.

Apple decided to go cutting edge with it and I applaud them for it. If you want the premium product, pay for it. Apple isn’t selling android phones to Cricket subscribers. It’s de facto a premium product.

If you don’t want the new AI, then keep using your older iPhone with the crappy Siri that doesn’t know squat. That’s fine. You can do that. If you want the latest jawn, then fork over the sawbucks.

I upgrade every year anyway, so IDK. I have no vices to spend money on, so that is how I treat myself every year. My SO is still hanging on to an iPhone 11 with a dying battery. I offer to buy a new phone, but nope. Won't have it. 😔

As soon as I can get home and back up my 15PM I’ll be installing the developer beta just to mess with it. But not until I get the backup done. I suspect the first release will be super buggy.
 
I suspect that context memory will all be in the cloud. The local device will take local data, probably convert it into vectors, then feed the cloud-hosted GPT instance the data it needs. The remote host will run a LangChain-style tool that requests data from the small LLM on the local device. That data will go into context one way or another.

You need context memory to be on the device doing the actual computation.

No, it's actually a 3B parameter LLM that Apple has running on-device. The cloud is more for things the phone doesn't know, like flight schedules. Look at the papers they released: they fit a 3B parameter LLM into a 2 GB sliding window.
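One way to picture a fixed "window" over a larger set of weights is an LRU cache of weight blocks streamed from flash on demand. This is a hypothetical sketch to convey the idea only; the block count and sizes are made up, and Apple's actual scheme (per their papers) is considerably more sophisticated:

```python
from collections import OrderedDict

class WeightWindow:
    """Keep at most `budget` weight blocks resident in RAM; stream misses from flash."""
    def __init__(self, budget: int):
        self.budget = budget
        self.resident: OrderedDict[int, str] = OrderedDict()
        self.flash_loads = 0

    def get(self, block_id: int) -> str:
        if block_id in self.resident:
            self.resident.move_to_end(block_id)    # mark most recently used
        else:
            self.flash_loads += 1                  # simulate a read from flash
            if len(self.resident) >= self.budget:
                self.resident.popitem(last=False)  # evict least recently used block
            self.resident[block_id] = f"weights-{block_id}"
        return self.resident[block_id]

# e.g. 4 resident blocks of ~512 MB each ≈ a 2 GB window over a larger model
window = WeightWindow(budget=4)
for block in [0, 1, 2, 3, 1, 2, 4, 1]:  # blocks touched during a forward pass
    window.get(block)
print(len(window.resident), window.flash_loads)  # → 4 5
```

The resident set never exceeds the budget, so RAM use stays bounded regardless of total model size; the cost is extra flash reads on misses.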
 
According to reports coming out of WWDC, it was in 2022 that Craig Federighi decided to change direction and embrace LLM-style AI. By then, the 14 series was ready to ship and the design for the 15 series was already locked down. They were fortunate that they had put 8 GB in the Pro phones, for whatever reason, so that they would have at least those devices to run their AI on.

I tend to think it was when Apple canceled the car. I take these origin stories with a grain of salt.
 
Has no bearing on anything? You have one iPhone model that supports AI because... reasons.

Yeah, it's much easier not to explain how Apple being a cheapskate screwed over their customers. Much less discomforting that way. Let's pretend it's just an unfortunate turn of events and not decisions made by Apple. And we'll never get any clarification or explanation from them either, leaving people to argue in circles instead.

"Cheapskate"? What the hell are you talking about? There's a physics part: LLMs take memory. They increased memory to support future projects, and they made the choice to do so.

There's a thing called a power budget in a mobile device. Throwing more RAM than needed into a device just so someone feels good about having RAM is a waste of power budget, a waste of materials, a waste of space in the device, and a waste of money. It's not about being cheap. Apple has utilization data across a billion devices; if they didn't put 8 GB in until the 15 Pro, it is because there wasn't a use case for more RAM. Apple's background subsystem and kernel don't require RAM the way Android's do; iOS doesn't keep background apps open. The specs don't matter to the consumer, and fracturing the device landscape over specs like Android does just causes customer confusion.

Android and iOS, and the apps that run on them, share nothing in common in how they use memory, GPU, etc. There are hardware and software advantages Apple has that Android just can't play in. Android throws RAM at the problem to keep apps active because the SoCs and the storage subsystem are inferior; Apple can page apps in and out without the user knowing and with zero impact on battery.
 
You're wrong. Most of iCloud runs off AWS. The rest is Google Cloud or Azure.

Apple had not been in the cloud business at all until yesterday, when they announced their private cloud for Apple Intelligence. Up until that point, everything they ran in the cloud was dependent on someone else.


Uhh, I am not wrong. It's in their earnings reports and such. They do not run iCloud in AWS. That is old and wrong data.

Apple has been doing cloud stuff forever. Do you know what CloudKit is? Every app dev gets 1 PB of storage for free, 200 TB of CDN (they run their own CDN, too), 10 TB of database, plus logging, monitoring, container management, etc.



There is absolutely no Azure. GCP is a small secondary storage footprint next to AWS S3. All the compute is in Apple-owned data centers, and the lion's share of storage is on-prem. Apple is too big for AWS; they are almost the size of AWS themselves.


Apple owns 17.x.x.x because the data center footprint is so big. They maintain their own backbone, peering, etc.


The clouds serve as a capacity backstop, a place to run GPU jobs, etc.: things that are hourly workloads and fill gaps. The bulk of persistent workloads is on-prem. They have in excess of 10 million square feet of data center space and over a million bare-metal servers.




And that 2019 article is so far off base it's silly; the relationship between AWS and Apple is billions, not millions. Apple has been building data centers for over 10 years and has been the largest storage customer of NetApp, Dell, etc., many times.
 