Keeping your existing phone and turning off battery- and performance-hungry features for all but your most-used apps keeps you on the latest security updates without churning through a new phone every couple of years.

Great for your sanity & your savings accounts
 
What!

So they really mean that my 13 mini won't get and can't handle the Apple Intelligence 🥵
Can't be true 🙁
 
It makes perfect sense if you know anything about LLMs. To run models locally you need a lot of RAM; there's no way around it. The 15 Pro is the first one with 8GB, which allows it to run a 3B-parameter model locally. Apple avoids saying that it's about RAM, but in reality this is the sole reason that all the M-chips and the A17 Pro allow it.
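Rough back-of-the-envelope math on why 8GB matters, assuming the weights dominate the footprint (real usage adds KV cache and runtime overhead, so these are illustrative figures, not Apple's actual numbers):

```python
# Approximate RAM needed just to hold an LLM's weights in memory.

def model_ram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Memory footprint of the weights alone, in GiB."""
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

# A ~3B-parameter model at common precisions:
fp16 = model_ram_gb(3, 2)    # 16-bit floats
int4 = model_ram_gb(3, 0.5)  # 4-bit quantized

print(f"3B params @ fp16:  {fp16:.1f} GiB")  # ~5.6 GiB
print(f"3B params @ 4-bit: {int4:.1f} GiB")  # ~1.4 GiB
```

Even quantized, the model has to share RAM with the OS and whatever apps are open, which is why 6GB phones are a squeeze.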
It would have made more sense to provide 8GB starting with the 13 or 14, and not only on the Pro. It's not as if they didn't know... did they? :p Either they didn't anticipate it (because they totally underestimated AI, which is not unlikely given they were the last to come to it), or this was plain and simple planned obsolescence. Pick whichever you prefer; neither is nice.
 
Really?

You sure about that?


Because Apple says iOS has virtual memory. And since it only runs on iPhones, and iPhones have A-chips....
iOS has virtual memory, but it's not what you are thinking of. It just means the memory space is "virtual", i.e., not mapped directly to physical memory. iOS presents a virtual address space to apps so each 32-bit app can behave as if it has 4GB. Each 64-bit macOS app can behave as if it has 2^64 bytes (a sh##ton) of memory. In reality, if an iPhone app actually tried to allocate that much memory, iOS would probably just tell it to get bent.

When iOS is low on memory, it does not swap to storage. It tells the low-priority apps, "I'm going to suspend you now, so save whatever you need so that when the user switches back you can pick up where you left off." It may leave the app paused (no CPU allocation), it may reclaim the memory allocated to the app, or it may terminate the app outright.
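The suspend-instead-of-swap behavior described above can be sketched in a few lines. This is a toy model, not Apple's actual jetsam logic; the app names, priorities, and sizes are all invented for illustration:

```python
# Toy sketch of iOS-style memory-pressure handling (no swap): rather than
# paging to disk, the system suspends low-priority apps to reclaim RAM.

apps = [
    {"name": "Maps",   "priority": 1, "ram_mb": 900, "state": "running"},
    {"name": "Mail",   "priority": 2, "ram_mb": 400, "state": "running"},
    {"name": "Camera", "priority": 3, "ram_mb": 700, "state": "running"},
]

def relieve_pressure(app_list, needed_mb):
    """Suspend lowest-priority apps until enough RAM is reclaimed."""
    reclaimed = 0
    for app in sorted(app_list, key=lambda a: a["priority"]):
        if reclaimed >= needed_mb:
            break
        app["state"] = "suspended"  # the app saved its state beforehand
        reclaimed += app["ram_mb"]
    return reclaimed

print(relieve_pressure(apps, 1000))  # suspends Maps and Mail -> 1300
```

The real system is far more nuanced (it weighs foreground state, memory limits per app, and can kill outright), but the key point stands: reclaiming memory means pausing apps, not writing their pages to flash.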

Windows 3 had a swap file, and CPUs as far back as the i386 supported virtual memory and memory paging. It would be surprising if the A chips didn't, but that may be the case if the OS has no use for it. iPadOS got memory swapping with iPadOS 16 in 2022, but the swap file is limited to something like 1 or 2GB, and I think it only swaps certain system processes, not user apps.
 
That refers to the M# chips, not the A# chips.

Yes. The M# iPads with 128GB or larger drives have swap memory.
Alright, so on an M-series chip, 8GB is still equivalent to 16GB of RAM in its Windows counterparts…
Wonder if Apple would allow M chips to run AI with only 4GB of physical RAM had they pushed that configuration as default.
Also interesting to know that iPadOS supports swap but not iOS.
 
Alright, so on an M-series chip, 8GB is still equivalent to 16GB of RAM in its Windows counterparts…
Also interesting to know that iPadOS supports swap but not iOS.
Realistically it’s more like 12GB, but yeah. Also the A12-based developer Apple Silicon transition Macs had custom A12s which did support swap memory.
 
What about the MacBook Air M1 with only 7GB? Will that work, or does it have to have 8GB?
 
I'm pretty sure that was made up for this thread, as an attempt to excuse Apple's bad behavior.

Problem is that Apple says iOS absolutely DOES support virtual memory, and you can't even turn it off.
That “8GB=16GB” excuse has existed ever since critics accused Apple of price gouging on RAM and storage upgrades after the transition to Apple silicon. And since then there has been no shortage of “most people only need 8GB of RAM for their daily tasks” claims and posts everywhere.

I thought Apple would be able to work that limited 8GB of RAM so it functions like 16GB of physical memory for AI purposes. And 6GB works as 12GB and so on, going as low as 4GB. (/sarcasm)
 
Any word whether we can opt out of Apple Intelligence?
Highly unlikely you can opt out. My bet is you can choose to not use some of their generative features, but you will probably be using it one way or another in various minuscule ways.
 
Ok so Apple’s 8GB can’t be used as 16GB then, glad my basic computer knowledge is still good. (Mocking all people saying 8GB is more than enough for most)

Seriously though, A-series chips don't support swapping? So does that mean iPadOS supports swap, same as macOS?
When the M1 iPad Pro was introduced in 2021, that's when virtual memory in the form of storage swap became possible; it was introduced in iPadOS 16 for M-series iPads with at least 256GB of storage. That list included the 2021 iPad Pros and the 5th-generation iPad Air back then, and of course includes any newer iPad Pro or Air. This also helped facilitate the creation of Stage Manager, though Apple did use some tricks to get around the lack of swap in the A12X and A12Z chips (2018 and 2020 iPad Pros). Initially Stage Manager could not run on those iPads, but Apple figured out a way around it and managed to enable it for a single screen. Note that A-series chips have never supported more than one screen, while base M-series chips from M1-M4 support two displays.

8GB of RAM is still 8GB, but there are all sorts of tricks programmers can use to make RAM usage more efficient. If you remember the old days of classic Mac OS and Windows 3.x, there was a Connectix product called RAM Doubler that used on-the-fly compression to make it seem like there was more RAM. Apple uses similar methods and more to optimize how much memory is actually used.
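The compression trick is easy to demonstrate. Here zlib just stands in for the kernel's page compressor (Apple's actual compressed-memory subsystem uses different, much faster algorithms); the point is only that an idle page full of redundant data can shrink dramatically instead of being swapped out:

```python
# Memory compression in miniature: compress an idle "page" instead of
# writing it to disk. zlib is a stand-in for the OS's real compressor.
import zlib

page = b"A" * 4096              # a highly compressible 4 KiB page
compressed = zlib.compress(page)

print(len(page), "->", len(compressed), "bytes")
page_back = zlib.decompress(compressed)  # restored losslessly on access
```

Real memory contents are rarely this compressible, which is why compression stretches RAM but never literally doubles it for every workload.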
 
News flash: The phone already uses virtual memory.
Not in the way most people think of virtual memory, which typically refers to swap files temporarily located on drives that serve as backup memory. All modern OSes have virtual memory in the sense that the address where a program thinks its memory allocation lives is not its actual location in RAM. Every byte in RAM is specified by a numerical address from 0 to the maximum RAM. Every program thinks it has the entire memory space allocated to it, but the OS fools it into thinking that. In reality, an app's address 0 is somewhere else in physical memory, and the OS's memory manager tracks where the real address is.

When people talk of virtual memory, they are mostly referring to physical disk space and a swap file, where allocated memory can be removed from RAM and temporarily stored on disk, with the OS moving memory pages in and out of physical RAM according to what is needed at any specific moment. While iPadOS on M-series SoCs does have swap memory, iOS does not. In iOS, the memory manager will move RAM out of physical RAM by suspending an application so it's no longer actually running, and unsuspend the new active app. iPadOS does that to some extent as well, but allows up to four apps (eight with an external display) to remain active at a time, using swap to facilitate having that many apps open at once.
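The address-translation idea described above can be shown with a toy page table. This is a deliberately simplified sketch (real page tables are multi-level hardware structures, and the frame numbers here are made up):

```python
# Toy page table: the virtual address an app sees is translated to a
# physical location the OS chose. 4 KiB pages; mappings are invented.
PAGE_SIZE = 4096

page_table = {0: 7, 1: 42}  # virtual page number -> physical frame number

def translate(vaddr: int) -> int:
    """Translate a virtual address to a physical one."""
    vpage, offset = divmod(vaddr, PAGE_SIZE)
    frame = page_table[vpage]   # a missing entry would be a page fault
    return frame * PAGE_SIZE + offset

print(hex(translate(0x10)))  # app's address 0x10 lives in frame 7
```

On a swap-capable OS, a page fault can mean "fetch this page back from disk"; on iOS there's no user-app swap, so pressure is handled by suspending apps instead.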
 
A polite way of saying «all these years, we've saved some pennies by shipping underpowered devices with half the RAM of the competition, and now it's finally going to bite you, but don't worry, we're not incentivising you to buy a new device!»

A couple more years and all the 8GB Macs will become so useless that it's going to badly hurt the reputation of the Mac platform, while PC users keep enjoying multitasking on their reasonably affordable 32GB laptops with crisp and punchy OLED screens.
There’s something to be said on that, so credit where credit is due.

But there have also been comments here from people going from 64GB and 128GB PCs “down” to MacBook Pros with 32GB, because no matter what they upgraded in the Windows world, for their use case it would always chug.

The reason being that yes, RAM is cheaper there, but as the saying goes, “there's no free lunch”, and there's always a trade-off. With that ensemble of cheaper motherboards, RAM channels, bus widths, etc., the memory bandwidth would always saturate and underperform (or something along those lines; I forget the technical terms).
The ideal would be going with the fast RAM of the GPU world, but as we know, very fast GPU memory in higher capacities gets back into the very expensive realm.
 
Initially Stage Manager could not be run on those iPads, but Apple figured out a way to get around it and managed to enable it for a single screen. Note A-series chips have never supported more than one screen while base M-series chips from M1-M4 support two displays.

There are plenty of users with older A-series-chip iPads that have successfully enabled Stage Manager and external display support. As far as I have read and seen, the Stage Manager hack works on iPadOS 16-17.1.2. Here are a few models that have been mentioned to work:

iPad Pro 2017 10.5 (A10X Fusion)
iPad 6th gen 2018 (A10 Fusion)
iPad Air 4 (A14 Bionic)
iPad mini 6 2021 (A15 Bionic)

So, yes, A-series chips have never officially supported more than one screen or Stage Manager, but it is still possible and usable.
 
Realistically it’s more like 12GB, but yeah. Also the A12-based developer Apple Silicon transition Macs had custom A12s which did support swap memory.

The chips themselves were regular A12s, but I assume the SSD controller was different, and there was an OS difference, too.
 
Cook‘s leadership has allowed Apple to fall far behind the AI curve. They are now scurrying to catch up while desperately trying to appear well positioned. Apple was once a great company with great products. But that was then and this is now.

I think that when the dust settles, you will see that Apple will end up being right where they need to be. As for where all the other AI companies end up, well, time will tell. Too many people tend to overestimate what can happen in the short run while severely underestimating what does actually wind up happening in the long term.
 
Crazy to me that Apple apparently has not thought ahead here. If it's RAM, then they could have easily upped the RAM to 8GB in new phones sold since, say, 2022. It would have cost them almost literally nothing, even if they only did 8GB of RAM on the Pro models.

I'm only assuming, but I think this shows that Apple either (1) didn't expect generative AI to take off this fast and wasn't prepared to roll this out so early (the contracts with OpenAI support this theory), or (2) wanted to get its generative AI working on a wider variety of hardware but couldn't get the model to run on less than 8GB of RAM.

I feel like #2 also supports #1, in that with more development time they probably could have optimized a bit more to run on 6GB or even 4GB, but they probably would have bumped all phones to a minimum of 8GB eventually. Again, this is an assumption on my behalf. I know LLMs are vast, but perhaps they could be much smaller in two years (if that's even when Apple would have launched this).
It costs Apple MORE to source smaller RAM chips because there are fewer of them in circulation.

It's wild to think about that. Apple had these modules made specifically for
I disagree. AI has been, and to an extent still is, a ******** marketing word/phrase, but you can't deny AI has improved a lot recently.

16GB in "any device" has been more than enough and still is. You do NOT need more than 16GB today if you have a smartphone, tablet, gaming console, or a mid range gaming PC. If you have a high end gaming PC, workstation PC/laptop then you want 32GB.

I think 8GB in MacBooks is a joke in 2024, and 12-16GB should have been standard for the last few years.

I don't think Apple knew this was coming. Not many people or companies knew that AI was going to progress so fast in the last 2-3 years. The only companies that seem to have prepared for AI are Nvidia and Microsoft. Of course, Apple has long been doing the bare minimum in an effort to force people to upgrade.
At no point did I say we needed more than 16GB.
I'm saying 16GB should have been the minimum for a decade now.
Laptops shipping today with 8GB of RAM are either $200 throwaway Asus Windows machines OR $1,200 beautifully engineered MacBooks…
Which is a joke.
 
"Older phones simply lack the hardware required to run the new killer features.
Hardware that we purposefully designed; while we already knew about these killer features in the pipeline. Like 8 gig of ram that we said ought to be enough for any use case (except these new killer features)"

As such you'll need to upgrade to the latest phones that have just enough resources to run our new killer feature (but not enough to run our next killer feature that we're already aware of)"
 
As such, you'll need to upgrade to the latest phones that have just enough resources to run our new killer feature (but not enough to run our next killer feature, which we're already aware of)."
It’s all about Tim Cook and his greed.

But if Apple started to charge for better upgrades on older phones so that we could run the new AI, wouldn't that be a great solution? I would be ready to pay for it instead of being forced to buy a phablet. I have two iPad minis and don't need a big phone of a similar size.
I want my iPhone MINI to run the latest 😡
 