
JPack

macrumors G5
Original poster
Mar 27, 2017
13,321
25,642
According to Kuo, AI requires less than 2GB RAM. If true, any of the iPhone 14 series should be able to run AI given all devices carry 6GB RAM. This makes you wonder if Apple is pulling another 4K ProRes situation where they lock out 128GB devices even though the storage is fast enough.

"The demand for DRAM can be verified in another way. Apple Intelligence uses an on-device 3B LLM (which should be FP16, as the M1’s NPU/ANE supports FP16 well). After compression (using a mixed 2-bit and 4-bit configuration), approximately 0.7-1.5GB of DRAM needs to be reserved at any time to run the Apple Intelligence on-device LLM."
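Kuo's 0.7-1.5GB figure holds up as back-of-the-envelope arithmetic. A quick sketch (weights only; real usage also needs room for the KV cache, activations, and runtime overhead):

```python
# Weight-storage footprint of a 3B-parameter model at various precisions.
# Illustrative estimate only: ignores KV cache, activations, and runtime overhead.

PARAMS = 3e9  # 3 billion parameters

def model_size_gb(bits_per_param: float) -> float:
    """Gigabytes needed to store the weights at a given average precision."""
    return PARAMS * bits_per_param / 8 / 1e9

print(f"FP16 (uncompressed): {model_size_gb(16):.2f} GB")  # 6.00 GB
print(f"4-bit:               {model_size_gb(4):.2f} GB")   # 1.50 GB
print(f"2-bit:               {model_size_gb(2):.2f} GB")   # 0.75 GB
print(f"mixed 2/4-bit:       {model_size_gb(3):.2f} GB")   # ~1.1 GB, inside Kuo's range
```

A mixed 2-bit/4-bit scheme averages out somewhere between 0.75GB and 1.5GB of weights, which is exactly the range Kuo quotes.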

 

GMShadow

macrumors 68020
Jun 8, 2021
2,072
8,503
The lowest supported iPhone has 3GB, but I suspect the 'average' across the in-use lineup is about 5GB-ish, probably pushing closer to 6GB. Regardless, Apple can't just take away 2GB that users previously had - that's something nearly every user would immediately notice. The 15 Pros have 8GB, with more to spare given that most iPhone development will be done with a 4-5GB RAM environment in mind.
 

JPack

macrumors G5
Original poster
Mar 27, 2017
13,321
25,642
The lowest supported iPhone has 3GB, but I suspect the 'average' across the in-use lineup is about 5GB-ish, probably pushing closer to 6GB. Regardless, Apple can't just take away 2GB that users previously had - that's something nearly every user would immediately notice. The 15 Pros have 8GB, with more to spare given that most iPhone development will be done with a 4-5GB RAM environment in mind.

An on/off toggle for Apple AI should help users who are concerned about memory. Given that iOS 18 supports the 3GB iPhone XR, a 6GB iPhone 14 with 2GB reserved should perform fine. It's strange that Google's Gemini Nano requires an 8GB phone, yet even with Apple's supposed memory efficiency, their model also ends up requiring 8GB.
 

now i see it

macrumors G4
Jan 2, 2002
11,225
24,165
Google has stated that in order to run their AI, a phone should have 12GB. And keeping everything in RAM would require 24GB.
 

ProbablyDylan

macrumors 65816
Mar 26, 2024
1,285
2,496
Los Angeles
Isn't this guy a supply chain analyst? Must be branching out, I guess.

Something to consider is that 🍎I is not a single LLM, but multiple models working in conjunction to complete the specified task. This means that, while the LLM component can run in 2GB of memory, other components might not fit alongside it. For example, the model that performs image recognition might require another 1GB of memory, the software pulling info from Calendar might need a further 0.25GB, and so on.
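Using the hypothetical component figures above, the budget adds up quickly:

```python
# Hypothetical Apple Intelligence memory budget -- the LLM is only one component.
# Figures are illustrative guesses, not real measurements.

components_gb = {
    "on-device LLM": 2.0,
    "image recognition model": 1.0,
    "Calendar context helper": 0.25,
}

total = sum(components_gb.values())
for name, gb in components_gb.items():
    print(f"{name:28s} {gb:.2f} GB")
print(f"{'total reserved':28s} {total:.2f} GB")  # 3.25 GB before anything else runs
```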

This already means that a device with 3GB of memory is a non-starter, but how about 6GB?

Let's be generous and say it takes a total of 3GB of memory to run 🍎I. As we're all aware, iOS and iPadOS are very aggressive with memory management, with apps constantly being reloaded or swapped out to keep things moving along.

If half of system memory is allocated to Siri, it's inevitable that some things will be kicked out of memory to keep the system moving. What if I'm switching between Safari, Maps, Messages and Yelp to plan a date? If I ask Siri a question, are Safari and Yelp going to get kicked from memory so that the UX stays fluid? Sounds fine, until I go back to Yelp to find it has reloaded and I've forgotten the name of the restaurant I was looking at.
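iOS's real jetsam logic is far more sophisticated, but the date-planning scenario can be sketched as a toy least-recently-used model (the app footprints and the 4GB app-space limit are invented for illustration):

```python
from collections import OrderedDict

LIMIT_GB = 4.0  # invented: RAM left for apps after the OS takes its share

class ToyMemory:
    """Toy LRU model: when usage exceeds the limit, evict the stalest app."""
    def __init__(self):
        self.apps = OrderedDict()  # ordered oldest -> most recently used

    def use(self, app: str, gb: float):
        self.apps.pop(app, None)   # touching an app moves it to most-recent
        self.apps[app] = gb
        while sum(self.apps.values()) > LIMIT_GB:
            evicted, _ = self.apps.popitem(last=False)
            print(f"evicted (will reload later): {evicted}")

mem = ToyMemory()
for app, gb in [("Yelp", 0.7), ("Safari", 0.8), ("Maps", 0.9), ("Messages", 0.4)]:
    mem.use(app, gb)

mem.use("Apple Intelligence", 2.0)  # ask Siri a question: reserve 2GB
print("still resident:", list(mem.apps))  # Yelp and Safari are gone
```

In this toy run, reserving 2GB for the assistant forces out the two least recently used apps - exactly the "Yelp reloaded" experience described above.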

It's not just a question of whether these models can run with limited memory. We also need to consider whether they should run, and if they do, whether they degrade the rest of the UX.
 

MegaBlue

macrumors 6502
Sep 19, 2022
358
858
Tennessee, United States
According to Kuo, AI requires less than 2GB RAM. If true, any of the iPhone 14 series should be able to run AI given all devices carry 6GB RAM. This makes you wonder if Apple is pulling another 4K ProRes situation where they lock out 128GB devices even though the storage is fast enough.

"The demand for DRAM can be verified in another way. Apple Intelligence uses an on-device 3B LLM (which should be FP16, as the M1’s NPU/ANE supports FP16 well). After compression (using a mixed 2-bit and 4-bit configuration), approximately 0.7-1.5GB of DRAM needs to be reserved at any time to run the Apple Intelligence on-device LLM."

This isn't factoring in how much RAM is already being used by the system. If you've got an iPhone with 4GB of RAM and 3GB of it is being used by other system resources....

Much like Stage Manager with iPadOS 16, I anticipate a few Apple Intelligence features working their way down to the A16, and maybe even the 6GB A15. It would be the far less intensive features, or potentially those that can only be done in the cloud, but I think it's a pretty reasonable possibility.
 

hovscorpion12

macrumors 68030
Sep 12, 2011
2,930
2,974
USA
Kuo is incorrect, as it has been confirmed that Apple Intelligence's on-device processing uses 4.2GB of RAM.

ALL iPhone 16 A18 models will ship with 8GB of RAM. [Maybe 12GB on the A18 pro models based on M4]
 

JPack

macrumors G5
Original poster
Mar 27, 2017
13,321
25,642
Kuo is incorrect, as it has been confirmed that Apple Intelligence's on-device processing uses 4.2GB of RAM.

ALL iPhone 16 A18 models will ship with 8GB of RAM. [Maybe 12GB on the A18 pro models based on M4]

That sounds way too high. We already know Google Gemini Nano 3.25B uses less than 2GB of RAM. Apple’s LLM is 3B, so it should only need around 2GB as well.
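The comparison is simple bits-per-parameter arithmetic, assuming both models are roughly 4-bit quantized (an assumption; actual runtime use adds KV cache and activations on top):

```python
# Weight storage for quantized on-device models, assuming ~4 bits per parameter.
# Back-of-the-envelope only; the 4-bit figure is an assumption.

def weights_gb(params_billions: float, bits: float = 4.0) -> float:
    return params_billions * bits / 8

print(f"Gemini Nano 3.25B: {weights_gb(3.25):.2f} GB")  # ~1.6 GB
print(f"Apple LLM 3B:      {weights_gb(3.0):.2f} GB")   # 1.50 GB
```

Both land well under 2GB for the weights alone, which is why a 4.2GB figure for the LLM itself looks inflated - unless it covers the whole Apple Intelligence stack rather than just the model.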
 

bryo

macrumors member
Apr 6, 2021
95
166
I’m starting to wonder: would a future iPhone SE support Apple Intelligence, or will Apple use AI as product differentiation? It would seem odd to me, because then what’s the selling point of the SE in an increasingly competitive market?
 

hovscorpion12

macrumors 68030
Sep 12, 2011
2,930
2,974
USA
That sounds way too high. We already know Google Gemini Nano 3.25B uses less than 2GB of RAM. Apple’s LLM is 3B, so it should only need around 2GB as well.

[attached image: 1718297286550.png]
 

ProbablyDylan

macrumors 65816
Mar 26, 2024
1,285
2,496
Los Angeles
That sounds way too high. We already know Google Gemini Nano 3.25B uses less than 2GB of RAM. Apple’s LLM is 3B, so it should only need around 2GB as well.

We have to consider that Apple Intelligence is more than just an LLM. The system as a whole likely uses 4-ish GB, with the LLM being only part of that usage.
 