I agree, it’s not yet there for smartphones or tablets. It takes LOTS of memory and power to run these AI models, as I understand it, so running them on a phone at present is impossible, or even on an iPad to an extent. The tech to do that on a device that fits in your pocket just isn’t there. On the Mac, however, if they make 24GB of RAM the minimum, maybe they could launch that local, on-device Siri LLM.
And before you reply that 24GB is not enough: I know. A year ago I was researching local LLMs for the M4 Mac mini, and the bare minimum was 48GB of that fast shared memory, better still 64GB, or even the 128GB of the Mac Studio.
Yes, that’s an insane amount of unified memory. Still, I hope Apple finds a way, maybe through some revolutionary quantization, maybe through some other trick, to give us a much more powerful assistant running locally on an M5 Mac.
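For a rough sense of why the numbers land where they do, here’s a back-of-envelope sketch (my own illustration, nothing Apple has published): a model’s weight footprint is roughly parameter count × bytes per weight, which is why quantization is the lever that moves a big model from “impossible” to “maybe” on unified memory. Model sizes below are hypothetical examples.

```python
# Back-of-envelope LLM memory math (illustrative assumptions only).
# Weights dominate: bytes ~= parameters * bits_per_weight / 8.
# A real deployment also needs headroom for the KV cache and the OS.

def weight_footprint_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight memory in GB for a given quantization level."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for params in (8, 30, 70):          # hypothetical model sizes, in billions
    for bits in (16, 8, 4):         # FP16, INT8, 4-bit quantization
        gb = weight_footprint_gb(params, bits)
        print(f"{params:>3}B model @ {bits:>2}-bit ~ {gb:6.1f} GB of weights")

# An 8B model drops from ~16 GB at FP16 to ~4 GB at 4-bit (fits in 24GB),
# while a 70B model drops from ~140 GB to ~35 GB of weights alone, which,
# plus KV cache and OS, is roughly why 48-64GB was the quoted floor.
```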
If Apple wants to differentiate itself from the competition, it must take a different, unexplored approach.