According to Kuo, Apple Intelligence requires less than 2GB of RAM. If that is true, any iPhone 14 series device should be able to run it, since every model in that lineup carries 6GB of RAM. It makes you wonder whether Apple is pulling another 4K ProRes move, where 128GB iPhones were locked out of the feature even though their storage is fast enough.
"The demand for DRAM can be verified in another way. Apple Intelligence uses an on-device 3B LLM (which should be FP16, as the M1’s NPU/ANE supports FP16 well). After compression (using a mixed 2-bit and 4-bit configuration), approximately 0.7-1.5GB of DRAM needs to be reserved at any time to run the Apple Intelligence on-device LLM."
"The demand for DRAM can be verified in another way. Apple Intelligence uses an on-device 3B LLM (which should be FP16, as the M1’s NPU/ANE supports FP16 well). After compression (using a mixed 2-bit and 4-bit configuration), approximately 0.7-1.5GB of DRAM needs to be reserved at any time to run the Apple Intelligence on-device LLM."
Source: Kuo's Medium post, "How Apple defines on-device AI and future trends — from the perspective of analyzing supported…" (medium.com)