I called this out in Dec 2024: they could have used the Action Button for the 15 PM, or even a software button in the Camera app. There was no reason this couldn't work on older phones that meet the hardware requirements but lack the actual button. Come on Apple, bring this to the 15 PM then, as you have no damn excuse now!

Indeed you did, I remember reading it.
 
Definitely no need for it, but I really do like it. It makes zooming in and changing some settings a little quicker, and feels more like a normal camera.
The camera button is THE single thing I miss MOST from my Sony Xperia. I'd like a simpler one on my iPhone, not one that can also pick up your dog from doggy daycare, make you an espresso, and serenade you to sleep. (I.e. the Apple button is too complex and convoluted...)
 
There’s no excuse for the lack of iPhone 15 Pro support.

There was a hidden Control Center toggle that was uncovered in one of the betas, so I have no doubt it’s coming eventually. But it should’ve been available with iOS 18.1 for the iPhone 15 Pro. I get it though, they wanted to upsell the iPhone 16.
 
I’d like to see them bring it to the iPad mini. That’s pretty much my mobile device. Of course, if it is no more accurate than Visual Look Up, it’s a moot point.
 
It's possible that Apple will add Visual Intelligence support to the iPhone 15 Pro models in an upcoming iOS update, but it's equally possible that it will remain an iPhone 16 feature.
What the article doesn’t mention, though, is that it’s likely a hardware limitation, specifically the A18 chip.

Since Visual Intelligence is all done on device rather than being sent to a cloud AI, it needs more Neural Engine power. That’s where the A18 chip comes into play: it has more Neural Engine power than the A17 Pro chip.
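For anyone curious what "all done on device" looks like in practice, here's a minimal sketch using the public Vision framework, which Core ML schedules onto the Neural Engine when available. This is not Apple's actual Visual Intelligence pipeline, just an illustration of the kind of local analysis being described:

```swift
import Foundation
import Vision

// Hedged sketch: classify a still image entirely on device with the Vision framework.
// Not Apple's Visual Intelligence implementation, just the flavor of analysis the
// Neural Engine accelerates. No image data leaves the device in this flow.
func classifyOnDevice(_ imageURL: URL) throws -> [String] {
    let request = VNClassifyImageRequest()                          // built-in on-device classifier
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])                                  // runs locally (ANE/GPU/CPU as available)
    return (request.results ?? [])
        .filter { $0.confidence > 0.3 }                             // keep reasonably confident labels
        .map { "\($0.identifier) (\(Int($0.confidence * 100))%)" }
}
```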
 
They have done this before, haven't they?
  • Not having Siri on the iPhone 4 (I jailbroke mine and had Siri on it, and it didn't have any performance issues).
  • Not having Apple Music Sing on the 2021 Apple TV (this feature was enabled for a while but later removed, and is only available on the 2022 Apple TV).
I mean, you can set Google Lens on the Action Button for a similar AI experience, and Apple had better realize that only benefits Google, not Apple. Nobody is going to upgrade from a 15 Pro to a 16 Pro JUST for this feature, right? I still hope they end up adding it to the 15 Pro, because if they don't, it will look completely dumb. Let's see.
 
The iPhone 15 Pro not having the Visual Intelligence feature is a hassle because the core functionality is there … you can ask Siri about what you’re seeing in the Camera app, it’s just not as elegant as the Visual Intelligence feature.

Then there’s no reason this feature couldn’t come to the M4 iPad Pro either, if you think about it, or the iPad mini for that matter, I guess.
 
What the article doesn’t mention, though, is that it’s likely a hardware limitation, specifically the A18 chip.

Since Visual Intelligence is all done on device rather than being sent to a cloud AI, it needs more Neural Engine power. That’s where the A18 chip comes into play: it has more Neural Engine power than the A17 Pro chip.
Not a hardware limitation in the true sense. The feature would work older hardware harder, which, coupled with an aging battery, would reopen the cultural perception that “new software features” = “forced obsolescence”.

This marketing-first strategy should have been obvious when they announced Apple Intelligence support for M1 devices but not the A16; everyone just ran with the notion that 8 GB of RAM was a hard limitation (as if Apple hasn't done several WWDC presentations about their frameworks for pipelining model memory). Executing their model doesn't take all the device memory, but the added memory pressure makes swap reads/writes more likely, which draws more power. It's all about managing battery-life expectations.
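To make that point concrete, here's a hedged Core ML sketch. The model file is hypothetical (a compiled model bundled with the app, not anything Apple ships); the relevant knob is where the model is allowed to execute, which affects memory pressure and power draw far more than whether it can run at all:

```swift
import CoreML

// Hedged sketch, assuming a hypothetical compiled .mlmodelc bundled with the app.
// The point from the post above: the model *can* run on older chips; the cost is
// memory pressure and power, which the compute-unit setting influences.
func loadVisionModel(at compiledModelURL: URL) throws -> MLModel {
    let config = MLModelConfiguration()
    // Let Core ML use the Neural Engine/GPU where it can; on weaker hardware
    // it falls back and simply works the device (and battery) harder.
    config.computeUnits = .all
    return try MLModel(contentsOf: compiledModelURL, configuration: config)
}
```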
 
What the article doesn’t mention, though, is that it’s likely a hardware limitation, specifically the A18 chip.

Since Visual Intelligence is all done on device rather than being sent to a cloud AI, it needs more Neural Engine power. That’s where the A18 chip comes into play: it has more Neural Engine power than the A17 Pro chip.
It’s not a hardware limitation. The parts where data is not sent to the cloud are just Visual Look Up; the XS could’ve supported Visual Intelligence if they’d really wanted.
 
It’s not a hardware limitation. The parts where data is not sent to the cloud are just Visual Look Up; the XS could’ve supported Visual Intelligence if they’d really wanted.
Nope. It is a hardware limitation. The real-time analysis of the image is all done on device via the Neural Engine. The data from that analysis is then processed on device before the results are returned to the user.

The real-time analysis requires a significant amount of neural processing power that isn't available even on the A17 Pro chip.

The only time it requires analysis via the cloud is when it needs ChatGPT assistance to get the result.
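As a rough illustration of that split (on-device pass first, cloud only as a fallback), here's a hedged sketch; `askChatGPT` is a placeholder of my own, not a real Apple API:

```swift
import Foundation
import Vision

// Hedged sketch of the flow described above: try an on-device Vision pass first,
// and only reach out to a cloud model (e.g. ChatGPT) when the local pass can't answer.
func describe(_ imageURL: URL) async throws -> String {
    let request = VNClassifyImageRequest()                               // local, Neural Engine-backed analysis
    try VNImageRequestHandler(url: imageURL, options: [:]).perform([request])
    if let best = request.results?.max(by: { $0.confidence < $1.confidence }),
       best.confidence > 0.5 {
        return best.identifier                                           // answered entirely on device
    }
    return try await askChatGPT(about: imageURL)                         // opt-in cloud assistance as a fallback
}

// Placeholder for the cloud step; intentionally not implemented in this sketch.
func askChatGPT(about imageURL: URL) async throws -> String {
    return "No confident on-device answer; cloud lookup not shown here."
}
```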
 
Update --
It is a hardware limitation. The real-time analysis of the image is all done on device via the Neural Engine. The data from that analysis is then processed on device before the results are returned to the user.

The real-time analysis requires a significant amount of neural processing power that isn't available even on the A17 Pro chip.

nope!

The iPhone 15 Pro, with the A17 Pro chip, is getting Visual Intelligence.

 
Surprised. If the new 16e can support this, the 15 Pro should also have been given this feature. It's disappointing to see Apple keep the feature away from the previous year's flagship, but it is very common to keep features exclusive to the new devices. Would love to have this feature on my 15 Pro Max.
 
Can't do that ... it requires new hardware*



*for the moment, standby please...
Well, I do wish my mini had the A18 instead of the A17 Pro. And skipping the M3 on the iPad Pro makes my M2 feel older than it actually is. This is the second time Apple has done this to me. I’m still reeling from having my iPad 3 superseded by the iPad 4 in only six months.
 
That's fine. I'm suspicious that the Camera Control button is even going to survive to the next generation. I keep trying to adjust my picture settings with it and almost always find using the touch screen faster and more convenient for finding and selecting the right options. It is nice as a basic button to open the camera and take a quick picture, but the touch functionality is weak IMO.
It’s really dumb. Nobody asked for it.
 
I made an update to my post. Looks like Apple is way, WAY ahead of schedule on optimizing Apple Intelligence and Neural Engine performance on the A17 Pro chip. (The roadmap I expected was that, with luck, Apple might be able to optimize Neural Engine usage on the A17 Pro enough to support on-device Visual Intelligence in iOS 19, and even then not with full support. Looks like I was proven wrong, and Apple is way ahead of schedule on running AI fully on device without needing the cloud.)

Would not be surprised if all of the iPads capable of running Apple Intelligence get Visual Intelligence support in iPadOS 18.4 as well. (Considering the iPad mini runs on the A17 Pro, and the entry-level iPad is planned to run the A17 Pro as well.)
 
Nope. It is a hardware limitation. The real-time analysis of the image is all done on device via the Neural Engine. The data from that analysis is then processed on device before the results are returned to the user.

The real-time analysis requires a significant amount of neural processing power that isn't available even on the A17 Pro chip.

The only time it requires analysis via the cloud is when it needs ChatGPT assistance to get the result.
Nah, it’s software locking.

All Visual Intelligence features are just a compilation of already existing ones under new branding.

Absolute tosh that older phones can’t use a more centralised version of an already existing feature.
 
Nah, it’s software locking.

All Visual Intelligence features are just a compilation of already existing ones under new branding.
It’s coming to the iPhone 15 Pro, and hopefully iPads, in the next iOS and iPadOS update, which Apple confirmed; meaning Apple did actually make some significant software optimization that finally allows the feature to run well on the A17 Pro's Neural Engine.
 