Visual Intelligence, an Apple Intelligence feature that Apple introduced last year, has some new capabilities in iOS 26 that make it more useful and better able to compete with the functionality available through some Android smartphones.

[Image: visual-lookup-ios-26.jpg]

Onscreen Awareness

In iOS 18, Visual Intelligence only works with the camera, but in iOS 26, it also works with what's on your screen. You can capture a screenshot and then use Visual Intelligence on it to identify what you're looking at, run an image search, and get more information through ChatGPT.

How to Use Onscreen Awareness for Visual Intelligence

Visual Intelligence for screenshots works much the same as Visual Intelligence with the camera, but it lives in the screenshot interface. Take a screenshot (press the volume up button and the side button at the same time), and then exit the Markup interface if it's showing.

To get out of Markup (the default view), tap the pen icon at the top of the display. From there, you should see the Visual Intelligence options.

Highlight to Search

With Highlight to Search, part of Visual Intelligence's onscreen awareness, you can use a finger to draw over the object in the screenshot that you want to look up. It's similar to Android's Circle to Search feature.

[Image: visual-intelligence-screenshot-ios-26.jpg]

Highlight to Search lets you conduct an image search for a specific object in a screenshot, even if there are multiple things in the picture. It uses Google Image search by default, but Apple showed off the feature working with other apps like Etsy during its keynote event. Apps will likely need to add support for the feature.

In some cases, Visual Intelligence will identify individual objects in an image on its own, and you can tap without needing to use Highlight to Search. This is similar to the object identification feature in the Photos app, but it still leads to an image search.
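Apple hasn't said how the automatic object isolation works under the hood, but you can approximate the effect with the Vision framework's objectness-based saliency request, which returns a bounding box for each prominent object in an image. A minimal sketch, not Apple's actual pipeline:

```swift
import UIKit
import Vision

/// Returns normalized bounding boxes (0...1, lower-left origin) for prominent
/// objects in a screenshot, approximating the automatic object isolation above.
func salientObjectBoxes(in image: UIImage) throws -> [CGRect] {
    guard let cgImage = image.cgImage else { return [] }

    let request = VNGenerateObjectnessBasedSaliencyImageRequest()
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])

    // The request produces a single observation whose salientObjects array
    // holds one rectangle per detected object.
    guard let observation = request.results?.first else { return [] }
    return observation.salientObjects?.map(\.boundingBox) ?? []
}
```

Each box could then be converted to image coordinates and used to crop the screenshot down to the highlighted object before running the image search.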

Ask and Search

If you don't need to isolate one object in your screenshot, you can simply tap on the Ask button to ask questions about what you're seeing. Questions will be relayed to ChatGPT, and ChatGPT will provide the information. The Search button queries Google Search for more information.
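Apple's ChatGPT hand-off isn't a public API, but conceptually the Ask button boils down to sending the screenshot and a question to a multimodal model. A rough illustration using OpenAI's public chat completions endpoint; the model name and key handling are placeholder assumptions, not Apple's implementation:

```swift
import Foundation

/// Illustration only: sends a screenshot and a question to a multimodal model
/// via OpenAI's chat completions API. Not how Apple's private relay works.
func askAboutScreenshot(jpeg: Data, question: String, apiKey: String) async throws -> String {
    let body: [String: Any] = [
        "model": "gpt-4o",  // placeholder model name
        "messages": [[
            "role": "user",
            "content": [
                ["type": "text", "text": question],
                ["type": "image_url",
                 "image_url": ["url": "data:image/jpeg;base64," + jpeg.base64EncodedString()]]
            ]
        ]]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let message = (json?["choices"] as? [[String: Any]])?.first?["message"] as? [String: Any]
    return message?["content"] as? String ?? ""
}
```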

[Image: visual-intelligence-ios-26-screenshots.jpg]

As with camera-based Visual Intelligence, if your screenshot includes dates, times, and related details for an event, the event can be added directly to your calendar.
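Apple hasn't documented the pipeline, but the effect resembles running the screenshot's recognized text through NSDataDetector and handing the match to EventKit. A sketch under those assumptions, with a placeholder title and calendar access already granted:

```swift
import EventKit
import Foundation

/// Drafts a calendar event from the first date found in recognized screenshot
/// text. Assumes the user has already granted calendar access.
func draftEvent(from recognizedText: String, in store: EKEventStore) throws -> EKEvent? {
    let detector = try NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
    let range = NSRange(recognizedText.startIndex..., in: recognizedText)
    guard let match = detector.firstMatch(in: recognizedText, options: [], range: range),
          let start = match.date else { return nil }

    let event = EKEvent(eventStore: store)
    event.title = "Event from screenshot"  // placeholder title
    event.startDate = start
    // NSDataDetector reports a duration when the text includes one (e.g. "7-9 PM").
    event.endDate = start.addingTimeInterval(match.duration > 0 ? match.duration : 3600)
    event.calendar = store.defaultCalendarForNewEvents
    try store.save(event, span: .thisEvent)
    return event
}
```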

New Object Identification

Apple didn't mention it, but Visual Intelligence adds support for quick identification of new types of objects. It can now identify art, books, landmarks, natural landmarks, and sculptures, in addition to the animals and plants it was able to provide information on before.

[Image: visual-lookup-books-ios-26.jpg]

If you use Visual Intelligence on an object that it is able to recognize, you'll see a small glowing icon pop up. Tapping on it reveals information about what's in view. What's neat about this aspect of Visual Intelligence is that it works with the live camera view or with a snapped photo.

For standard Ask and Search requests using Visual Intelligence, you have to take a photo so that it can be relayed to sources like ChatGPT or Google Image Search. Art, books, landmarks, natural landmarks, sculptures, plants, and animals can be identified on-device without contacting another service.
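The photo-based lookup appears to build on the same machinery VisionKit already exposes to developers: ImageAnalyzer can scan an image for visual look-up subjects and attach the results to an interaction overlay. A minimal UIKit sketch, with availability checks and error handling trimmed:

```swift
import UIKit
import VisionKit

/// Runs visual look-up on an image view's photo and attaches the results,
/// similar to the glowing-icon behavior described above.
@MainActor
func enableVisualLookUp(on imageView: UIImageView) async throws {
    guard let image = imageView.image else { return }

    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)

    let analyzer = ImageAnalyzer()
    let configuration = ImageAnalyzer.Configuration([.visualLookUp, .text])
    interaction.analysis = try await analyzer.analyze(image, configuration: configuration)
    interaction.preferredInteractionTypes = .automatic
}
```

Whether a lookup actually resolves still depends on the device, region, and subject categories the system supports.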

Compatibility

Visual Intelligence is limited to devices that support Apple Intelligence, which includes the iPhone 15 Pro models and the iPhone 16 models. It is activated by a long press on the Camera Control button on devices that have Camera Control, or using the Action Button or a Control Center toggle.

Launch Date

iOS 26 is in beta testing right now, but it will launch to the public in September.

Article Link: Here's What's New With Visual Intelligence in iOS 26
 
I don’t mean to sound ungrateful after all the stuff that’s already coming to my iPad this fall, but it sure would be nice if this weren’t fenced off by product differentiation.
In particular, since Apple Intelligence is a mix of on-device and online functions, it would really make sense to offer a cloud fallback for devices that lack the resources to run it on-device. They can still upsell newer hardware for on-device privacy, offline capability, and lower latency.
 
Discoverability: 0.00pts

Hopefully this also works if you ask Siri about what's on your screen. "Hey Siri, tell me about this bird feeder". Siri assumes it's about something you're looking at on screen (and if not, it asks you to point your camera at it) and runs through the actions outlined above: screenshot, circling, Ask button.
 
My 16 Pro Max can’t run last year’s AI features without constantly crashing Safari due to memory issues, so there’s no way older phones have a chance.
The fine article says that the new Visual Intelligence features in iOS 26 offload at least some of the work to ChatGPT. There’s no reason they couldn’t offload *all* the work. I can use Google’s Gemini for the same stuff today on my iPad, so there’s no reason Apple can’t do it too.
 
The fine article says that the new Visual Intelligence features in iOS 26 offload at least some of the work to ChatGPT. There’s no reason they couldn’t offload *all* the work. I can use Google’s Gemini for the same stuff today on my iPad, so there’s no reason Apple can’t do it too.
Apple’s motto is to be different from what the rest of the industry already has, so the screenshot approach will stay for a while.
 
Can it translate what is currently shown on the screen? I use this almost daily on my Android device, and it is just very convenient not having to copy and paste anything to translate it in another app.
It’s not a single tap, but you can do that now: select the text in an image (or anywhere) and then tap Translate from the menu. I do this almost daily.
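For the curious, that selection step leans on Live Text's OCR, which Vision exposes as a text recognition request; the recognized strings can then be handed to the system's Translate features. A minimal sketch:

```swift
import UIKit
import Vision

/// Extracts the text Live Text would find in a screenshot; the strings can
/// then be passed to the system Translate sheet or app.
func recognizeText(in image: UIImage) throws -> [String] {
    guard let cgImage = image.cgImage else { return [] }

    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    request.usesLanguageCorrection = true
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])

    // Keep the most confident transcription candidate for each line of text.
    return (request.results ?? []).compactMap { $0.topCandidates(1).first?.string }
}
```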
 
Can it translate what is currently shown on the screen? I use this almost daily on my Android device, and it is just very convenient not having to copy and paste anything to translate it in another app.

It's also dumb that this is somehow linked to the screenshot functionality and isn't its own thing like Circle to Search.
Yes, there is a translate button
 
So we will be able to send a QR code image to someone now, and it will recognise and open the link?

That would be very handy for a lot of businesses. No more signs to scan to get staff to download apps, etc.
 
The screenshot thing is poor, and many apps prohibit screenshots, by the way. But I guess "Circle to Search" is patented and Apple is forced to eat its own medicine, since Circle to Search is just a gesture that takes a screenshot and defines the area of interest at the same time.
 
The fact that you have to take a screenshot (with the screenshot sound and everything) is the dumbest design I never thought I'd see.
Yeah I hate being somewhere public like on a train and taking a screenshot because I forget it stupidly makes the camera shutter noise, so people near me get edgy that I’m taking sneaky photos
 
Some very good improvements to Visual Intelligence. Happy that all these features are coming to my 15 Pro Max.
 
Yeah I hate being somewhere public like on a train and taking a screenshot because I forget it stupidly makes the camera shutter noise, so people near me get edgy that I’m taking sneaky photos
Unless you have a Japanese iPhone (they have mandatory shutter sounds, though I don’t know if the requirement extends to screenshots), you can turn on Silent Mode manually, or create a Shortcut that turns it on temporarily while taking a screenshot, and bind that to Back Tap.
 
Is this US only? When I press the camera button, the camera appears. Screenshot awareness isn’t there either
 
I am using Back Tap on my iPhone to take a screenshot and search with Google Lens (via Shortcuts) so that I get a better result. I have been using this for years! And the screenshots are not saved.
 
QR codes are recognized in images since iOS 16 (“Live Text”). You have to tap and hold and select the option to open the link in the menu that appears.
Thanks. It's been a few years since I was supporting an app, and back then you couldn't do it... staff would always ask, and it seemed like something that should have been possible. Cheers.
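For apps that receive QR code images and don't want to rely on Live Text, Vision has decoded barcodes in still images for years. A minimal sketch:

```swift
import UIKit
import Vision

/// Decodes QR code payloads (e.g. URLs) from a received image, the
/// programmatic counterpart to the Live Text behavior described above.
func qrPayloads(in image: UIImage) throws -> [String] {
    guard let cgImage = image.cgImage else { return [] }

    let request = VNDetectBarcodesRequest()
    request.symbologies = [.qr]  // only look for QR codes
    try VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])

    return (request.results ?? []).compactMap(\.payloadStringValue)
}
```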
 