Remains to be seen. It is curious, though, that they restricted AI to only the newest iPhone chip. The M1 is supported yet has a weaker Neural Engine than the A16, for example - so it has to just be RAM, if we believe they aren't just using it as a scheme to sell new phones.
Or they are just full of ****. :)
 
Quite funny to read the comments here. The less knowledge a person has about LLMs and how AI works, the more definitive statements they make about how Apple could run Apple Intelligence on the iPhone 3GS locally. Seriously, people, have at least a little knowledge before you make fools of yourselves here.
The only valid criticism is that they did not provide 8GB of RAM, at least in the earlier Pro models (like 14 and 13).
 
I'd much rather wait 6 seconds instead of three, and still have the damn feature!!!

Can MacRumors open a survey and ask iPhone 13, 14, and 15 owners if they'd rather have AI, albeit slower?
I'm serious: could we open a poll or something?
Who's with me?
 
Crazy to me that Apple has apparently not thought ahead here. If it's RAM, then they could have easily just upped the RAM to 8GB in new phones sold since - say - 2022. Would have cost them almost literally nothing. Even if they just did 8GB of RAM on the Pro models.

I'm only assuming, but I think this shows that Apple either (1) didn't expect generative AI to take off this fast and wasn't prepared to roll this out so early - the contracts with OpenAI support this theory - or (2) wanted to get their generative AI working on a wider variety of hardware but couldn't get the model to run on less than 8GB of RAM.

I feel like #2 also supports #1, in that if they had more development time they probably could have optimized a bit more to run on 6GB or even 4GB, but they probably would have bumped all phones to a minimum of 8GB eventually. Again, this is an assumption on my behalf. I know LLMs are vast, but perhaps they could be much smaller in 2 years (if that's even when Apple would have launched this).

Occam's razor - the simplest explanation is the best one - they were blindsided by AI. They were wasting their engineering effort on the car, the Vision Pro, and FineWoven and totally missed it. So now they're catching up by outsourcing it to Microsoft/OpenAI.
 
The next iPhone SE is going to be so much more powerful than all the current iPhones being sold, bar the iPhone 15 Pro (Max). No way Apple sells the next SE without hardware to support AI.
Doubt it. The point of the SE is to be a cheap entry-level phone for those that don't want to pay more for the latest features. No way they will add 8GB of RAM; it would jump the cost up too much. This is exactly why it still has Touch ID and not Face ID.

Entry level iPads also won’t be getting AI anytime soon.
 
so iPhone 14 Pro Max is ancient tech already?
On-device AI has created an inflection point, so it's no surprise there is a new threshold for minimum performance for an entirely new (and demanding) application. My iPhone 14 Pro is getting ready for a refresh anyway ;-)
 
Apple knew what its AI requirements were when developing the 15 series. They intentionally didn’t give the 15 and 15 Plus 8GB; if not to drive 15 Pro sales, then as a way to force iPhone 16 upgrades sooner.
Apple knew the minimum spec for a feature that doesn't even enter beta until Fall 2024 when developing a phone to launch in Fall 2023? Heck, some parts of the iPhone 15 development probably started very shortly after the iPhone 14 shipped. So Apple knew all about LLM requirements in Fall 2022?

You're also complaining that Apple made a less expensive phone with cheaper parts. Of course they did. That's the whole point of a non-Pro phone--to make it more affordable.
 
I think Apple should introduce a mid-year iPhone upgrade cycle, around March, call it the iPhone 16.5, and render all previous models obsolete (including the iPhone 16.0). Starting in March, iOS 18 will be exclusively compatible with the iPhone 16.5. 🤡
 
Joswiak: "No, not at all. Otherwise, we would have been smart enough just to do our most recent iPads and Macs, too, wouldn't we?"

What does he mean here? All Macs have M2 or later, so they all support the same features as the A17 Pro/M-series chips.
It runs on older M1 Macs too, including the nearly 4 year old MacBook Air.
 
Apple saying that this isn't a scheme to sell new iPhones is hilarious. Of COURSE it is. If it wasn't, then every phone, iPad, and Mac that can take the update should be able to run "Apple Intelligence." I just wish they would be honest with their customers and stop trying to be a "caring" company.
They have to strike a balance. Too many “forced” upgrades and people will give Android and Windows another look. What’s interesting is that Microsoft is limiting Copilot+ to new PCs with the arbitrary 40 TOPS threshold. While Copilot+ is a bit more aggressive than Apple Intelligence, it could likely run on older PCs with sufficient RAM.
 
Microsoft is doing the same with the new Copilot+.

Yep. Still struggling to see what I get out of buying a Copilot+ PC. The most exciting part is their Arm chips finally getting up to par.

Reviews have been delayed on these because of the Recall feature being removed or changed. It’s a mess.

I don't really get it though. Let Windows finish your drawing? Then it's not my drawing anymore. Not that I draw anything ever. Studio effects on a Zoom meeting? Ugh.

Feel the same about anything I saw in Apple's demo.
 
So what if older devices aren’t supported. Most people will likely turn off all the Artificial Intelligence stuff after playing with it for a while anyway
 


With iOS 18, iPadOS 18, and macOS Sequoia, Apple is introducing a new personalized AI experience called Apple Intelligence that uses on-device, generative large-language models to enhance the user experience across iPhone, iPad, and Mac.


On iPhone, these new AI features require Apple's latest iPhone 15 Pro and iPhone 15 Pro Max models, while on Mac and iPad, only devices with M1 or later chips will support Apple Intelligence. Since the news came to light, many users have been asking what the reason is for the cutoff.

In The Talk Show Live From WWDC 2024, Daring Fireball's John Gruber put the question to Apple's AI/machine learning head John Giannandrea, marketing chief Greg Joswiak, and software engineering chief Craig Federighi, and this was the response.
Apple's software engineering chief Craig Federighi said that the company's first move with any new feature is to work out how to bring it back to older devices as far as possible. But when it comes to Apple Intelligence, "This is the hardware that it takes... It's a pretty extraordinary thing to run models of this power on an iPhone," he added.

The iPhone 15 Pro models use the A17 Pro chip, which has a 16-core Neural Engine that's up to 2x faster than the A16 chip found in the iPhone 15 and iPhone 15 Plus, performing nearly 35 trillion operations per second. Federighi hinted that RAM is another aspect of the system that the new AI features require, so it is perhaps no coincidence that all the devices compatible with Apple Intelligence have at least 8GB of RAM.

Despite the cutoff, owners of older iPhones still have plenty to look forward to in Apple's upcoming software update: iOS 18 boasts several new features besides Apple Intelligence, and every iPhone that can run iOS 17 is compatible with iOS 18. That includes the iPhone XR from 2018.

If you still want Apple Intelligence in your pocket but don't have an iPhone 15 Pro or iPhone 15 Pro Max, you may want to hold out for the iPhone 16 series, which is expected to launch when iOS 18 is released in the fall.

Article Link: Apple Explains iPhone 15 Pro Requirement for Apple Intelligence
All the fanboys that cried "RaM iS nOt NeEdEd 🤓" are suddenly real quiet now
 
It makes perfect sense if you know anything about LLMs. To run models locally you need a lot of RAM; there's no way around it. The 15 Pro is the first one with 8GB, which allows it to run a 3B-parameter model locally. Apple avoids saying that it's about RAM, but in reality this is the sole reason that all the M-series chips and the A17 Pro allow it.
So why didn't they include more RAM for the past couple of years? You know, like other manufacturers.

If this has been many years in the making, they probably understood the hardware requirements a looooong time ago, no? Some might give them a pass, but it is intentional. As they always brag, they basically own the whole stack - HW and SW - always puffing their chest about how they can do more with less... except when they need any SW excuse to sell more product.
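For anyone who wants to sanity-check the 8GB claim with napkin math, here is a rough sketch. Every figure in it (a ~3B-parameter model, FP16 vs ~4-bit quantized weights, a ~0.5GB allowance for the KV cache) is my own assumption for illustration, not anything Apple has published:

```swift
// Back-of-the-envelope memory estimate for an on-device LLM.
// All numbers are illustrative assumptions, not Apple's actual model specs.
func estimatedMemoryGB(parameters: Double,    // e.g. 3e9 for a ~3B-parameter model
                       bitsPerWeight: Double, // 16 = FP16, 4 = 4-bit quantized
                       kvCacheGB: Double) -> Double {
    let weightsGB = parameters * bitsPerWeight / 8.0 / 1e9
    return weightsGB + kvCacheGB
}

// ~3B parameters held in FP16: about 6.5GB with cache - hopeless on a 6GB phone.
print(estimatedMemoryGB(parameters: 3e9, bitsPerWeight: 16, kvCacheGB: 0.5))

// The same model quantized to ~4 bits: roughly 2GB total, which is tight but
// workable once the device has 8GB to share with the rest of iOS.
print(estimatedMemoryGB(parameters: 3e9, bitsPerWeight: 4, kvCacheGB: 0.5))
```

On those assumptions, the weights alone cost about 6GB at FP16 and about 1.5GB at 4-bit, which is roughly why 8GB of total RAM reads like the practical floor and 6GB does not.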
 
I'd much rather wait 6 seconds instead of three, and still have the damn feature!!!

Can MacRumors open a survey and ask iPhone 13, 14, and 15 owners if they'd rather have AI, albeit slower?
I'm serious: could we open a poll or something?
Who's with me?

It'll be interesting to see if Apple outright prevents users of older iPhones from using Apple Intelligence or simply allows them to use it with a less-than-ideal experience.
 
Doubt it. The point of the SE is to be a cheap entry-level phone for those that don't want to pay more for the latest features. No way they will add 8GB of RAM; it would jump the cost up too much. This is exactly why it still has Touch ID and not Face ID.
There is precedent. The original SE got the same 2GB of RAM as the then-latest iPhone 6S, a step up from the 1GB in the iPhone 6.
 
New features aren’t intended to sell new phones? Ok Apple. Also, Macrumors posters: surprised that new features are intended to sell new iphones and question this? What planet are we living on here?
 
It's PR bullsh*t. It's just about the RAM - 8GB minimum. There is plenty of recent research showing the A17 Pro NPU is not much faster than the previous year's. 35 TOPS, but measured in INT8. The A16, A15, etc. are measured in FP16. So basically the A17 Pro is about 17.5 TOPS in FP16 vs the A16's 17.

Also, the "poor" M1 is just an A14 with 2 more P cores. It's using the same old 11 TOPS NPU, but sure... no problem running AI.

This channel has been researching the topic in depth, btw. It's mostly in French, so you have to use the translator.
The problem with your logic is that the A17 Pro and M4 ANE are optimized for INT8 inference.

On M1 through M3, they can utilize the GPU as well.

RAM is a factor, but power consumption is also a factor on phones.
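To put actual numbers on the INT8 vs FP16 point, here is the conversion being argued about, assuming the common rule of thumb that an NPU's INT8 throughput is roughly double its FP16 throughput (that 2x factor is my assumption, not a published Apple spec):

```swift
// Quick sanity check on the TOPS comparison above.
// Assumption: INT8 throughput is roughly 2x FP16 throughput on the same NPU.
let a17ProTopsINT8 = 35.0                 // Apple's quoted A17 Pro figure (INT8)
let a17ProTopsFP16 = a17ProTopsINT8 / 2.0 // ~17.5 TOPS on the FP16 scale
let a16TopsFP16 = 17.0                    // figure commonly quoted for the A16 (FP16)

print("A17 Pro ~\(a17ProTopsFP16) FP16 TOPS vs A16 ~\(a16TopsFP16) FP16 TOPS")
// If that conversion holds, the generational Neural Engine gain is small,
// which is why RAM looks like the real gatekeeper rather than raw NPU speed.
```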
 
Quite funny to read the comments here. The less knowledge a person has about LLMs and how AI works, the more definitive statements they make about how Apple could run Apple Intelligence on the iPhone 3GS locally. Seriously, people, have at least a little knowledge before you make fools of yourselves here.
The only valid criticism is that they did not provide 8GB of RAM, at least in the earlier Pro models (like 14 and 13).
Quite funny how the defenders are basically saying either Apple was late to the party with AI, or late to the party with HW requirements to run AI. Either way, they "win" by making the features exclusive to new devices.
 