
MacRumors

macrumors bot
Original poster


Apple has announced a major Visual Intelligence update at WWDC 2025, enabling users to search and take action on anything displayed across their iPhone apps.


The feature, which previously worked only with the camera to identify real-world objects, now analyzes on-screen content. Users can ask ChatGPT questions about what they're viewing or search Google, Etsy, and other supported apps for similar items and products.

Visual Intelligence recognizes specific objects within apps – like highlighting a lamp to find similar items online. The system also detects events on screen and suggests adding them to Calendar, automatically extracting dates, times, and locations.
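Apple hasn't detailed how this Calendar hand-off is implemented, but the described behavior maps onto existing public APIs. Below is a minimal Swift sketch under that assumption, using Foundation's NSDataDetector to pull a date out of recognized on-screen text and EventKit to create the event; the sample text, event title, and function name are illustrative, not Apple's code.

```swift
import EventKit
import Foundation

// Illustrative sketch only: detect a date in recognized on-screen text,
// then offer to save it as a Calendar event via public EventKit APIs.
// The sample text and event title are hypothetical.
func suggestCalendarEvent(from recognizedText: String) {
    // 1. Pull a date out of the recognized text with NSDataDetector.
    guard let detector = try? NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue),
          let match = detector.firstMatch(in: recognizedText, options: [],
                                          range: NSRange(recognizedText.startIndex..., in: recognizedText)),
          let startDate = match.date else {
        print("No date found in recognized text")
        return
    }

    // 2. Ask for Calendar access (iOS 17+ API shown) and create the event.
    let store = EKEventStore()
    store.requestFullAccessToEvents { granted, error in
        guard granted, error == nil else { return }

        let event = EKEvent(eventStore: store)
        event.title = "Design review"                  // hypothetical title
        event.startDate = startDate
        event.endDate = startDate.addingTimeInterval(60 * 60)
        event.calendar = store.defaultCalendarForNewEvents

        do {
            try store.save(event, span: .thisEvent)    // event now appears in Calendar
        } catch {
            print("Could not save event: \(error)")
        }
    }
}

suggestCalendarEvent(from: "Design review on June 20 at 3:00 PM")
```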

Accessing the feature appears straightforward: users press the same button combination used for screenshots. They can then choose to save or share the screenshot, or explore further with Visual Intelligence.

The update turns Visual Intelligence into more of a universal search-and-action tool across the entire iPhone experience. Apple says the feature builds on Apple Intelligence's on-device processing approach, maintaining user privacy while delivering contextual assistance across apps.


Article Link: iOS 26: Visual Intelligence Now Searches On-Screen Content
 
I’m more interested in whether it’ll be available on iPhones prior to the 15 Pros - I mean it’s straight up a copy/paste of a feature that was available on the Pixel 8 with a less capable chipset. Why should my 14 Pro not be able to handle it?!
 
I kinda expected it to do that last year, so I'm glad they're catching up. It should've been possible without having to move the screenshot into the ChatGPT app.
 
Good feature, but does it always see what’s on screen, or does it only look when I take a screenshot and tap one of the VI buttons? I don’t want it looking at my screen or even my screenshot until I tap that button.
 
This seems nice! But it’s a little strange that you access it by taking a screenshot…
The Action Button on the 16 series defaults to Visual Intelligence - maybe this year's phones will get a two-stage version of the button where a light press triggers this feature. Just speculating.
 
Is this in the developer beta? I can't seem to find the toggle to turn it on. I know it may come out in a future release, but I'm just curious.
 