Seems you don't know how "unified memory" works? That's not how unified memory works. And why do you think that AI needs constant access to 8 GB?

Shared memory - Wikipedia
It doesn't mean you "magically" have more memory. It only means your other devices (like the GPU and Neural Engine) don't have their own memory and instead also use the main memory, which creates a bottleneck: more participants contend for memory access instead of each having its own pool. If the AI needs 8 GB of memory, that memory is in use and not free for other purposes. If other processes need that memory, the AI model has to be swapped out, which causes delays when the model is needed again, since it has to be reloaded from disk (SSD). And since the local AI is supposed to "learn" from how the device is used and be "screen aware", it has to stay in memory to be further trained.
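The contention argument above can be sketched as a toy simulation. This is purely illustrative: the class name, the eviction policy, and the sizes (a hypothetical 4 GB model plus 6 GB of everyday apps) are all made-up assumptions, not how macOS memory management actually works. The point it shows is that on an 8 GB pool the model gets evicted and must be reloaded, while on a 16 GB pool it stays resident.

```python
# Toy model of unified-memory contention (illustrative numbers only,
# NOT real macOS behavior or a real paging algorithm).
class UnifiedMemory:
    def __init__(self, total_gb):
        self.total_gb = total_gb
        self.resident = {}  # name -> size in GB currently in memory

    def free_gb(self):
        return self.total_gb - sum(self.resident.values())

    def load(self, name, size_gb, evictable=("ai_model",)):
        """Load a workload; evict evictable residents if space is short."""
        for victim in evictable:
            if self.free_gb() < size_gb and victim in self.resident:
                del self.resident[victim]  # model swapped out to SSD
        if self.free_gb() < size_gb:
            raise MemoryError(f"not enough unified memory for {name}")
        self.resident[name] = size_gb

ram8 = UnifiedMemory(total_gb=8)
ram8.load("ai_model", 4)   # hypothetical 4 GB on-device model
ram8.load("apps", 6)       # everyday apps push the model out
print("model resident on 8 GB:", "ai_model" in ram8.resident)   # -> False, reload penalty

ram16 = UnifiedMemory(total_gb=16)
ram16.load("ai_model", 4)
ram16.load("apps", 6)
print("model resident on 16 GB:", "ai_model" in ram16.resident)  # -> True, no reload
```

Under these made-up numbers, only the 16 GB pool keeps both the model and the apps resident at once, which is the swapping delay the paragraph above describes.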
So I stand by my statement: memory should already be 16 GB!