Has there been any word on whether it will be possible to *turn off* Apple Intelligence in iOS?
It will be opt-in; you enable it from the same menu in Settings where Siri lives. That menu is renamed "Apple Intelligence & Siri," and Apple Intelligence is a switch you turn on.
 
The things that companies make up to get you to buy their products 🤦‍♂️
My thoughts exactly lol. Usually a load of BS if it takes that long to explain.

The chips are iterative each year with minimal gains. There's a reason they don't do direct comparisons with the outgoing models of their products. Here's the iPhone 16 Pro; let's compare it to the original 2007 iPhone.
 
Nobody (outside Apple) knows what model it is. It could be just a ****** LLM with very few parameters.
It's their own ~3B-parameter on-device "small language model," which they described in detail in some research articles. It would take around 1.5 GB of RAM to run, which isn't negligible when the OS and foreground app already occupy much of the 6 GB (actual available RAM is even less). It shouldn't be impossible, though, if they really wanted to. The semantic index likely works similarly to Spotlight, and the context obviously isn't always kept in RAM. I've noticed my iPhone 15 (non-Pro) finds things way quicker on iOS 18. Shockingly so; it's instantaneous. It's very fast to resurface old messages containing the search term when looking for something from Spotlight, even before the Safari search suggestion shows up, which never used to be the case.
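That ~1.5 GB figure is consistent with a ~3B-parameter model quantized to roughly 4 bits per weight. A minimal back-of-the-envelope sketch (the parameter count and quantization levels here are assumptions for illustration, not Apple's published specs, and this ignores KV cache and activations):

```python
# Rough weight-memory estimate for an on-device language model.
# Assumptions (not Apple's specs): ~3B parameters, uniform quantization.

def model_ram_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GB, ignoring KV cache and activations."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

for bits in (16, 8, 4):
    print(f"3B params @ {bits}-bit: ~{model_ram_gb(3, bits):.1f} GB")
# 16-bit: ~6.0 GB, 8-bit: ~3.0 GB, 4-bit: ~1.5 GB
```

The 4-bit row is the one that lines up with the ~1.5 GB RAM claim above.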
 
Presumably the iPhone 16 will have 8 GB of RAM? But if 8 is the bare minimum, then why not bump it to 16? Is it hardware margins? Battery life? Both?
I wouldn't be surprised if Apple even sells the Pro models with the bare minimum of required RAM.
 
Has there been any word on whether it will be possible to *turn off* Apple Intelligence in iOS?
I'm sure it'll be possible. My question is, do you have to turn Siri off altogether? Do they have a dumb version of Siri on board? Also, even if you turn AI off, does it still do semantic indexing of all your data anyway? After all, you could decide at any time to turn it on and ask some questions; does it then tell you to chill for a couple hours? What about something like Math Notes? Can you just keep that?

This will be fun
 
I thought they were using ChatGPT, which runs in the cloud? Even if it runs on-device, it could run in the cloud, which would make it device-agnostic? I know, I know "privacy", but apparently it does send "more complex queries" to the cloud? Why not send everything to the cloud on older devices?
Step 1: first it has to be approved by Apple for the App Store.
 
Short answer: Apple really doesn't want to spend money on servers or bandwidth to do AI the way literally all their competitors do, even on the cheapest phones. Apple won't have to spend any money on the back end to implement this feature, and it drives people to upgrade earlier than they otherwise would. Doing it over the internet would mean including many old Apple devices in a feature that costs them money but doesn't sell more devices. By doing it the Apple way, they make more revenue. Or, to put it simply: F you, pay me.
 
Lol, how is saying that a company wants to make money a conspiracy? I think the conspiracy would be claiming Apple wants us all to buy as little as possible so our little wallets are happy 😊

The first point is just obvious; if you think they aren't milking the most they can, then wake up lol. It's a ruthless behemoth with seldom anything but the next quarter in mind.

We all signed the pact with the devil for convenience so let’s not be surprised when it sucks on our blood. It was in the contract.

Edit: yeah, it's RAM, but who puts so little RAM in our devices? Santa Claus? Nope, Apple. You think they couldn't just slap in 16 GB for 7 more cents? No, they'll put in 8 GB so they have opportunities like this to get people to buy the new thing. Planned obsolescence without riling up EU regulators.

Another conspiracy theorist.

This is complete garbage. The model literally won't fit in the RAM. I doubt it'll fit in the 8 GB they have either. And I doubt the battery life would survive it. If there's any wankery going on, it's that they're winging it and it might not work out yet.

As for planned obsolescence, my iPhone 6s still works fine and is getting patched.

I would have bought a 15 Pro with or without this crap. And I'll sell it after two years and buy whatever crap they're selling me then. Because it works!
 
Why are people disparaging Apple for selling AI in new phones when AI was the flagship feature for Samsung's and Google's new phones as well?
 
I'm betting it's mostly the RAM. All you people who defended the lackluster RAM in Apple devices, where you at?

Shockingly to you maybe, but we're like: if you have an iPhone with less than 8 GB of RAM, you can't use Apple Intelligence unless you buy an iPhone with 8 GB of RAM or more.

Everyone who bought a Mac with Apple Silicon in the last four years can enjoy Apple Intelligence.

And we're not angry at all. We just go through life calm and collected.
 
So is the iPhone 16 Pro gonna have enough RAM to future-proof it, or will it have exactly the 8 GB needed to run AI? I'm hoping for 16 GB of RAM.
 
Short answer: Apple really doesn't want to spend money on servers or bandwidth to do AI the way literally all their competitors do, even on the cheapest phones. Apple won't have to spend any money on the back end to implement this feature, and it drives people to upgrade earlier than they otherwise would. Doing it over the internet would mean including many old Apple devices in a feature that costs them money but doesn't sell more devices. By doing it the Apple way, they make more revenue. Or, to put it simply: F you, pay me.

Yes, but a company saving money isn't something they should be blamed for. Everyone should try to reduce cost if they can, including Apple customers.

Apple's solution also has at least three advantages:

1) Latency will be low, which is important for tasks you do on device. If I ask Siri to "send a text message to my mother," I want an instant reaction, not a wait of a second or more.

2) A lot of Apple Intelligence will work when you don't have an Internet connection. Let's say you're used to giving Siri commands and suddenly they don't work. It's not a good user experience if you expect something to work all the time.

3) A lot of people are skeptical about AI, including on privacy grounds. Having an on-device narrative for most of the AI stuff will make some of these people less skeptical.
 
So is the iPhone 16 Pro gonna have enough RAM to future-proof it, or will it have exactly the 8 GB needed to run AI? I'm hoping for 16 GB of RAM.
Rumors point towards 8 GB, with the 17 Pro getting an increase to 12 GB of RAM.

Yeah, it's making me doubt whether I should shell out almost $2,000 on a 16 Pro when it'll be severely limited in a mere year.
 
It's PR ********. It's just about the RAM: 8 GB minimum. There's plenty of recent research proving the A17 Pro NPU isn't much faster than the previous year's. 35 TOPS, but measured in INT8. The A16, A15, etc. are measured in FP16. So basically the A17 Pro is 17.5 TOPS in FP16 vs. the A16's 17.
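The arithmetic behind that comparison is just precision normalization. A quick sketch, using the numbers from this post; the 2x INT8-per-FP16 ratio is an assumption about the hardware, not a measured value:

```python
# Normalizing NPU throughput figures quoted at different precisions.
# Assumption: the hardware does 2 INT8 ops per FP16 op, so an INT8 TOPS
# figure is roughly double the FP16-equivalent figure.

def to_fp16_tops(quoted_tops: float, quoted_precision: str) -> float:
    """Convert a quoted TOPS number to an FP16-equivalent number."""
    if quoted_precision == "INT8":
        return quoted_tops / 2  # halve to get the FP16-equivalent rate
    return quoted_tops          # already FP16

print(to_fp16_tops(35, "INT8"))  # A17 Pro: 35 INT8 TOPS -> 17.5 FP16-equivalent
print(to_fp16_tops(17, "FP16"))  # A16: 17 FP16 TOPS -> 17
```

On that normalization, the year-over-year NPU gain looks marginal, which is the post's point.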

Also, the "poor" M1 is just an A14 with two more P-cores. It uses the same old 11-TOPS NPU, but sure... no problem running AI.

This channel researches the topic deeply, btw. It's mostly in French, so you have to use the translator.
Maybe someone else has mentioned this by now, but this is really Apple taking advantage of the fact that the industry has standardized around 8-bit inference. There isn't really a point in doing 16-bit inference, especially at the edge. I don't see this as a genuine attempt by Apple to cover things up.
 
They can run more requests on the cloud on older devices. They have been doing that with dictation for a long time now.

If ChatGPT, Gemini, Claude, Co-Pilot, etc. all have apps that can run capable chatbots on almost vintage iPhones via cloud processing, so can Apple.

If they wish to avoid privacy issues, they just need to make the difference clear on the settings page. But they also said that even AI on their servers is very private.

So it still seems that it is a free decision on their end, likely in consideration of their revenue, convenience, and other factors.
 
My iPhone 14 Pro Max ALSO has a 16-core Neural Engine! 6 GB of RAM, so they can't use my 1 TB of storage as swap? What nonsense, Apple!
They can; everyone whinging about RAM hasn't watched any of the WWDC sessions about ML on Apple platforms and is instead rehashing an old argument about device memory. The whole model doesn't *need* to fit into RAM to operate. It's about battery draw and thermal performance.

Which, to be fair, is part of why Apple is anemic with memory in general: that extra RAM wouldn't be used to improve user multitasking; it would be eaten up almost immediately by developers' sloppy memory management (consider the proliferation of Electron on desktop platforms).
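The "doesn't need to fit in RAM" point is basically memory-mapped weights: the OS pages in only the portions you touch and can evict them under pressure. A toy sketch with Python's `mmap` (the file name and sizes are made up; real frameworks like llama.cpp do something similar with mmap'd weight files):

```python
# Demonstrating on-demand paging of a "weight file" via mmap.
# Only the pages actually read get loaded into physical RAM.
import mmap
import os

path = "weights.bin"
with open(path, "wb") as f:                  # dummy 16 MB weight file
    f.write(os.urandom(16 * 1024 * 1024))

with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    # Slicing reads just these pages; the rest of the file stays on disk,
    # and the OS may evict resident pages under memory pressure.
    first_layer = mm[0 : 1024 * 1024]        # "load" one layer's weights
    print(len(first_layer))                  # prints 1048576
    mm.close()

os.remove(path)                              # clean up the dummy file
```

The tradeoff the post names is real, though: paging weights from flash costs time and energy, which feeds the battery/thermal argument.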
 
It was naïve to think they'd answer something like, "Yeah, man, for sure! The poors are gonna pay for this expensive new tech, you know."
In defense of Gruber, you read a transcript and couldn't pick up on elements of language such as tone and the sarcasm behind the laugh. Watch it for yourself.

Gruber literally asks the question sarcastically, after Giannandrea had finished explaining the technical reasons for the cutoff.
 