It still doesn't make sense
It makes perfect sense if you know anything about LLMs. To run models locally you need a lot of RAM; there's no way around it. The 15 Pro is the first iPhone with 8GB, which allows it to run a 3B-parameter model locally. Apple avoids saying that it's about RAM, but in reality this is the sole reason that all the M-series chips and the A17 Pro support it.
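For anyone curious, here's a rough back-of-the-envelope sketch of why 8GB is roughly the floor for a 3B-parameter model. The precisions are illustrative assumptions, and this only counts the weights; KV cache, activations, and the OS itself all add overhead on top:

```python
# Rough weight footprint for an LLM at a given numeric precision.
# This counts only the weights; real usage is higher once you add the
# KV cache, activations, and whatever the OS is already using.

def model_ram_gib(params_billions: float, bytes_per_param: float) -> float:
    """GiB needed just to hold a model's weights in memory."""
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

# A 3B-parameter model at a few common precisions:
for label, bpp in [("fp16", 2.0), ("int8", 1.0), ("4-bit", 0.5)]:
    print(f"3B @ {label}: ~{model_ram_gib(3, bpp):.1f} GiB")
```

At fp16 a 3B model alone is around 5.6 GiB, which doesn't fit comfortably next to iOS on a 6GB phone; quantized to 4-bit it drops to roughly 1.4 GiB, which an 8GB device can actually spare.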
 
Yet they didn't mention RAM in the snippet that Mac Rumors shared.
RAM seems to be a bit of a sore spot for Apple. They generally seem to try to avoid talking about it. It’s sort of like how they love talking about how fast their latest processor, NPU, or GPU is, but they don’t give any real concrete specs, such as clock speed.
 
Joswiak: "No, not at all. Otherwise, we would have been smart enough just to do our most recent iPads and Macs, too, wouldn't we?"

What does he mean here? All Macs have M2 or later, so they all support the same features as the A17 Pro/M-series chips.
 
Mind you, the A14, A15 and A16 all have a Neural Engine as powerful as the M1's. So either it isn't related to the SoC at all but solely to the RAM (in which case Apple is an absolute idiot for going so long with so little RAM in such expensive devices, something people have long warned them about: the amount of RAM they gave their devices isn't very future-proof), or this very much is just to get people to buy the next iPhone, or the current but more expensive iPhone Pro.

Either way, the sudden explosion of AI absolutely caught Apple with its pants down. The M4 is just another example of that.
 
So, the iPhone 15 and 15 Plus and everyone else get most of the software features available in iOS 18 except Apple Intelligence, whose 8GB RAM requirement and reliance on the A17 Pro's faster Neural Engine limit it to the iPhone 15 Pro and 15 Pro Max until the iPhone 16 Pro and 16 Pro Max are released...

So, iPad 6 with A10 is limited to iOS 17 while iPad 7 with A10 can get iOS 18

iPhone 5s [A7] and iPhone 6 [A8] are both limited to iOS 12
iPhone 6s [A9] and iPhone 7 [A10] are both limited to iOS 15

In the future, iPhone 13 series may get cut off earlier than iPhone 14 and 14 Plus even though they all use the same A15 Bionic, and iPhone 14 Pro and 14 Pro Max may get cut off earlier than iPhone 15 and 15 Plus even though they all use A16 Bionic

Buying the repackaged iPhone 14 Pro and 14 Pro Max [i.e., the iPhone 15 and 15 Plus] seems like a very unwise decision if you are planning to use the Apple Intelligence features offered in iOS 18... This is the first time that newly released [2023] models have been made obsolete this quickly...

Makes you wonder why anyone should buy a base iPhone 16 in the future if you cannot be sure it will support whatever is coming in iOS 19; you never know. Maybe next year something will be iPhone 17 + iPhone 16 Pro exclusive because the "repackaged" iPhone 15 Pro (iPhone 16) chip/RAM is suddenly not good enough.
 
Mind you, the A14, A15 and A16 all have a Neural Engine as powerful as the M1's. So either it isn't related to the SoC at all but solely to the RAM (in which case Apple is an absolute idiot for going so long with so little RAM in such expensive devices, something people have long warned them about: the amount of RAM they gave their devices isn't very future-proof), or this very much is just to get people to buy the next iPhone, or the current but more expensive iPhone Pro.

Either way, the sudden explosion of AI absolutely caught Apple with its pants down. The M4 is just another example of that.
It's about RAM. You need RAM to run LLMs locally, that's the sole reason.
 
I’ll continue to use ChatGPT (and be happy) through shortcuts on my iPhone 14 Pro Max. Best of luck, Siri. 👍
ChatGPT just chimed in: "Ah, iPhone 15 Pro required for Apple Intelligence? Guess that means Siri will now be smart enough to finally understand that 'call mom' doesn’t mean 'launch Apple Music and play the latest Taylor Swift album.' Baby steps, Apple, baby steps."
 
How much battery is this going to hoover up?
Depending on usage this could be significant. I suspect this is a reason why so many of the older iOS devices are not supporting this.

When he said it could be too slow to be usable… I suspect it’s a combination of slowness and power draw that makes AI on older iOS devices a no-go. They probably think it’s safer not to support it than to face an avalanche of complaints about battery drain.
 
Joswiak: "No, not at all. Otherwise, we would have been smart enough just to do our most recent iPads and Macs, too, wouldn't we?"

What does he mean here? All Macs have M2 or later, so they all support the same features as the A17 Pro/M-series chips.
What he means is they could've said it will only work on, let's say, M3 and M4 Macs and iPads. What they have said is it will work with all M-series devices, back to the M1. So my M3 Max MacBook is good to go, but my 2019 Intel iMac is SOL. (Not unexpected; I'm waiting for an M4 Max Mac Studio to replace this old warhorse.) And my M2 iPad Pro is also ready for AI, so I'm not being forced to upgrade my iPad.
 
From my experience with LLMs it's been about RAM, so I get that, and I also get that Apple hates to mention RAM.

Still, I don't update my phone that often, and I guess I expected my 14 Pro to get the latest features for a little while. It'll be a few years till I get to use that AI on a phone!
 
SE2 here, and safe from the AI overlords. I might dabble via my M1 iPad. But I don’t need a new phone.
 
The next iPhone SE is going to be so much more powerful than all the current iPhones being sold, bar the iPhone 15 Pro (Max). No way Apple sells the next SE without hardware to support AI.
 
Depending on usage this could be significant. I suspect this is a reason why so many of the older iOS devices are not supporting this.

When he said it could be too slow to be usable… I suspect it’s a combination of slowness and power draw that makes AI on older iOS devices a no-go. They probably think it’s safer not to support it than to face an avalanche of complaints about battery drain.
That’d have been another conspiracy theory and a bunch of lawsuits about Apple deliberately destroying battery life just to get people to upgrade.

From what I gather it’s mostly a RAM limitation and even then 8GB can only get you so far with LLMs.
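To put numbers on why even 8GB only gets you so far: the weights aren't the whole story, because the KV cache grows linearly with context length. The dimensions below are hypothetical for a generic ~3B transformer, not any model Apple has described:

```python
# KV cache size for a transformer: for every token in the context, each layer
# stores one key vector and one value vector of hidden_dim values apiece.
# Layer count and hidden size here are hypothetical ~3B-model dimensions.

def kv_cache_gib(layers: int, hidden_dim: int, context_len: int,
                 bytes_per_value: int = 2) -> float:
    """GiB consumed by the key/value cache at a given context length."""
    return 2 * layers * hidden_dim * context_len * bytes_per_value / (1024 ** 3)

weights_gib = 1.5  # assume a 4-bit-quantized 3B model
cache = kv_cache_gib(layers=32, hidden_dim=3072, context_len=8192)
print(f"weights ~{weights_gib} GiB + KV cache ~{cache:.1f} GiB at 8192 tokens")
```

On those assumed dimensions the cache alone reaches about 3 GiB at an 8K context, so even a heavily quantized model with a long context eats a big slice of an 8GB device that's also running the OS and apps.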
 