Visual Intelligence is an Apple Intelligence feature that's exclusive to the iPhone 16, iPhone 16 Pro, and iPhone 16e models, but it is rumored to be coming to the iPhone 15 Pro in the future. Visual Intelligence is available as of iOS 18.2, and this guide outlines what it can do.

Visual-Intelligence-Feature-2.jpg

Activating and Using Visual Intelligence

To use Visual Intelligence, hold down the Camera Control button for a few seconds to activate Visual Intelligence mode. On the iPhone 16e, which has no Camera Control button, use a Control Center toggle or the Action Button with Visual Intelligence assigned to it.

A single press of Camera Control just opens the camera, so you need a distinct press-and-hold gesture to get to Visual Intelligence. Make sure you're not already in the Camera app, because the gesture doesn't work if the camera is already active.

The Visual Intelligence interface features a view from the camera, a button to capture a photo, and dedicated "Ask" and "Search" buttons. Ask queries ChatGPT, and Search sends an image to Google Search.

Using Visual Intelligence requires taking a photo of whatever you're looking at: snap a photo, which you can do with the Camera Control button, and then select an option. It does not work with a live camera view, and you cannot use photos that you took previously.

Get Details About Places

If you're out somewhere and want to get more information about a restaurant or a retail store, click and hold Camera Control, and then click Camera Control again to take a photo or tap the name of the location at the top of the display.

Apple-Visual-Intelligence.jpg

From there, you can see the hours when the business is open, place an order for delivery at relevant locations, view the menu, view offered services, make a reservation, call the business, or visit the location's website.
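
Apple hasn't documented how Visual Intelligence looks up places behind the scenes, but for developers curious how a similar lookup can be built with public frameworks, here is a minimal sketch using MapKit's MKLocalSearch. The function name and example query are made up for illustration.

```swift
import MapKit

// Hypothetical helper: look up a business by a recognized name using the
// public MKLocalSearch API. This is only an illustration, not how Visual
// Intelligence itself is implemented.
func lookUpPlace(named name: String, near region: MKCoordinateRegion) {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = name   // e.g. the storefront name in the photo
    request.region = region               // bias results toward the user's area

    let search = MKLocalSearch(request: request)
    search.start { response, error in
        guard let item = response?.mapItems.first else {
            print("No match:", error?.localizedDescription ?? "unknown error")
            return
        }
        // MKMapItem exposes the same kinds of details the place card shows.
        print(item.name ?? "Unnamed place")
        print(item.phoneNumber ?? "No phone number")
        print(item.url?.absoluteString ?? "No website")
    }
}
```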

Summarize Text

Take a photo of text from the Visual Intelligence interface. Choose the "Summarize" option to get a summary of what's written.

visual-intelligence-summarize.jpg

The Summarize option is useful for long blocks of text, but it is similar to other Apple Intelligence summaries so it is brief and not particularly in-depth.

Read Text Out Loud

Whenever you capture an image of text with Visual Intelligence, there is an option to hear it read aloud. To use this, just tap the "Read Aloud" button at the bottom of the display, and Siri will read it out loud in your selected Siri voice.

visual-intelligence-read-aloud.jpg
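
Visual Intelligence's own text pipeline isn't public, but the same kind of read-aloud behavior can be sketched with Apple's public Vision and AVFoundation frameworks. The function name below is hypothetical, and the voice choice is an assumption.

```swift
import Vision
import AVFoundation

// Keep the synthesizer alive for the duration of speech.
let synthesizer = AVSpeechSynthesizer()

// Recognize text in a captured photo and read it aloud. Illustrative only;
// this is not Apple's Visual Intelligence implementation.
func readTextAloud(from cgImage: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Join the top candidate for each detected line of text.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: " ")
        guard !text.isEmpty else { return }

        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // assumed voice
        synthesizer.speak(utterance)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```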

Translate Text

If text that you capture with Visual Intelligence is not in your language (limited to English at this time), you'll see a "Translate" option. You can tap it to get an instant translation.

visual-intelligence-translate.jpg

Go to Website Links

If there's a link in an image that you capture with Visual Intelligence, you'll see a link that you can tap to visit the website.

visual-intelligence-websites.jpg

Send Emails and Make Phone Calls

If there is an email address in an image, you can tap it to compose an email in the Mail app. Similarly, if there is a phone number, you'll see an option to call it.

visual-intelligence-email.jpg
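
How Visual Intelligence picks out links, email addresses, and phone numbers isn't documented, but the public NSDataDetector API does the same kind of detection on recognized text. The function below is a hypothetical example, not Apple's implementation.

```swift
import Foundation

// Find tappable items (websites, emails, phone numbers) in recognized text.
func detectActionableItems(in text: String) {
    let types: NSTextCheckingResult.CheckingType = [.link, .phoneNumber]
    guard let detector = try? NSDataDetector(types: types.rawValue) else { return }

    let range = NSRange(text.startIndex..., in: text)
    for match in detector.matches(in: text, options: [], range: range) {
        switch match.resultType {
        case .link:
            // Email addresses surface from the detector as mailto: links.
            if let url = match.url {
                print(url.scheme == "mailto" ? "Email: \(url)" : "Website: \(url)")
            }
        case .phoneNumber:
            print("Phone:", match.phoneNumber ?? "")
        default:
            break
        }
    }
}
```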

Create a Calendar Event

Using Visual Intelligence on something that has a date will give you an option to add that event to your calendar.

visual-intelligence-events.jpg
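
For a sense of how a detected date can become a calendar entry using public APIs, here is a sketch that pairs NSDataDetector with EventKit. The event title and one-hour duration are placeholder assumptions, and calendar access is required; this is not necessarily how Apple's feature is built.

```swift
import EventKit
import Foundation

// Detect the first date in recognized text and save a calendar event for it.
// Requires NSCalendarsFullAccessUsageDescription in Info.plist (iOS 17+).
func addEvent(from text: String) {
    guard let detector = try? NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue),
          let match = detector.firstMatch(in: text, options: [],
                                          range: NSRange(text.startIndex..., in: text)),
          let date = match.date else { return }

    let store = EKEventStore()
    store.requestFullAccessToEvents { granted, _ in
        guard granted else { return }
        let event = EKEvent(eventStore: store)
        event.title = "Event from Visual Intelligence"    // placeholder title
        event.startDate = date
        event.endDate = date.addingTimeInterval(60 * 60)  // assume one hour
        event.calendar = store.defaultCalendarForNewEvents
        try? store.save(event, span: .thisEvent)
    }
}
```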

Detect and Save Contact Info

For phone numbers, email addresses, and addresses, Apple says you can add the information to a contact in the Contacts app. You can also open addresses in the Maps app.
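
Saving detected details to Contacts can likewise be sketched with the public Contacts framework. The values and labels below are stand-ins, not real Visual Intelligence output.

```swift
import Contacts

// Create and save a contact from detected details. Requires contacts
// permission (NSContactsUsageDescription in Info.plist).
func saveContact(name: String, phone: String?, email: String?) {
    let contact = CNMutableContact()
    contact.givenName = name
    if let phone = phone {
        contact.phoneNumbers = [CNLabeledValue(label: CNLabelPhoneNumberMain,
                                               value: CNPhoneNumber(stringValue: phone))]
    }
    if let email = email {
        contact.emailAddresses = [CNLabeledValue(label: CNLabelWork,
                                                 value: email as NSString)]
    }

    let saveRequest = CNSaveRequest()
    saveRequest.add(contact, toContainerWithIdentifier: nil)
    try? CNContactStore().execute(saveRequest)
}
```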

Scan QR Codes

Visual Intelligence can be used to scan a QR code. With QR codes, you don't actually need to snap an image; simply point the camera at the QR code and then tap the link that pops up.

visual-intelligence-qr-code.jpg

QR code scanning also works in the Camera app without Visual Intelligence active.
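
Third-party apps can do the same kind of QR detection with the public Vision framework; the sketch below (hypothetical function name) shows the general approach, not Apple's own scanner.

```swift
import Vision

// Detect QR codes in a still image and print their payloads.
func detectQRCodes(in cgImage: CGImage) {
    let request = VNDetectBarcodesRequest { request, _ in
        guard let results = request.results as? [VNBarcodeObservation] else { return }
        for barcode in results where barcode.symbology == .qr {
            // payloadStringValue holds the encoded link or text.
            print("QR payload:", barcode.payloadStringValue ?? "<binary payload>")
        }
    }
    request.symbologies = [.qr]  // only look for QR codes

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```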

Ask ChatGPT

You can take a photo of anything and tap the "Ask" option to send it to ChatGPT along with a question about it. If you take a picture of an item with Visual Intelligence and want to know what it is, for example, tap Ask and then type "What is this?" to bring up a ChatGPT interface.

visual-intelligence-chatgpt.jpg

ChatGPT will respond, and you can type back if you have follow-up questions.

Visual Intelligence uses the ChatGPT Siri integration, which is opt-in. By default, no data is collected, but if you sign in with an OpenAI account, ChatGPT can remember conversations.
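
The Siri/ChatGPT bridge that Visual Intelligence uses has no public API, so the following is purely an illustration of how a third-party app could send a photo and a question to OpenAI's public chat completions endpoint. The model name and API key are placeholders, and this is not how Apple's integration works internally.

```swift
import Foundation

// Send an image plus a question to OpenAI's chat completions API and return
// the first reply. Illustrative sketch only.
func ask(question: String, imageData: Data, apiKey: String) async throws -> String {
    let body: [String: Any] = [
        "model": "gpt-4o",  // placeholder model name
        "messages": [[
            "role": "user",
            "content": [
                ["type": "text", "text": question],
                ["type": "image_url",
                 "image_url": ["url": "data:image/jpeg;base64,\(imageData.base64EncodedString())"]]
            ]
        ]]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)
    // Pull the first reply out of the response JSON.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return message?["content"] as? String ?? ""
}
```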

Search Google for Items

You can take a picture of any item that you see and tap on the "Search" option to use Google Image Search to find it on the web. This is a feature that's useful for locating items that you might want to buy.

visual-intelligence-google-search.jpg

Read More

For more on the features that you get with Apple Intelligence, we have a dedicated Apple Intelligence guide.

Article Link: Visual Intelligence: How to Use It, Features and More
 
Wow so one redundant button is keeping me from having this on my 15 Pro that has all other Apple Intelligence features? Does Apple have no shame left?
This is extra annoying as it's the only truly useful feature that stands out among the extremely lacklustre Apple Intelligence features released so far.
When you bought your 15 Pro, did you do that in anticipation of getting this feature?
 
Wow so one redundant button is keeping me from having this on my 15 Pro that has all other Apple Intelligence features? Does Apple have no shame left?
This is extra annoying as it's the only truly useful feature that stands out among the extremely lacklustre Apple Intelligence features released so far.
Maybe you can retrofit it?
 
The features on offer continue to be mid; it's probably cooler if you haven't had it on a Pixel or Galaxy already. Apple really needs to do better by users and do something only they can do with their hardware and software integration, and this ain't it.
 
None of this sounds useful. I’m sure it’s faster in most cases to just go up to the restaurant and look at the hours or go inside and make a res. If it’s closed, open Google Maps (Apple Maps doesn’t have enough good, worldwide data yet) and click on the place that’s right across the street? These don’t require the use of wasteful AI compute power.

What’s the logic of disallowing AI analysis of previously taken photos?
 
None of this sounds useful. I’m sure it’s faster in most cases to just go up to the restaurant and look at the hours or go inside and make a res. If it’s closed, open Google Maps (Apple Maps doesn’t have enough good, worldwide data yet) and click on the place that’s right across the street? These don’t require the use of wasteful AI compute power.

What’s the logic of disallowing AI analysis of previously taken photos?
You're right, none of it's particularly useful. Even the stuff on offer with the Pixel and the Galaxy, while much better as a whole in terms of offerings, is meh aside from their cool photo tools that aren't hampered like Apple's. I'd say Galaxy is the winner right now. Apple chasing this fad instead of holding strong and waiting to see what is actually of utility reeks of desperation, frankly. I was fine with my 16PM not having any of this stuff, but they are gonna keep pushing software updates and doing goofy, ill-executed stuff like summarization.
 
I use the 16 PM, but there was no reason this couldn't have been on the 15 Pro; there is no excuse other than laziness and trying to upsell the 16 Pro. The hardware and software are there, just no easy shortcut to it. Dick move by Apple, to be honest.

There are four ways they could have made it work without the new shortcut button: one, make it a separate app you could open; two, add the option to the camera context menu when you long press; three, add a shortcut button in the actual Camera app.

And four, use the bloody dedicated shortcut button they added on the 15 Pro! Come on Apple, it is customisable, remember?
 
The features on offer continue to be mid; it's probably cooler if you haven't had it on a Pixel or Galaxy already. Apple really needs to do better by users and do something only they can do with their hardware and software integration, and this ain't it.
Thank you! And from what I've gathered from reading this article, Samsung has much cooler AI features and so does Pixel, and there are rumours that Google will release the AI on older Pixel models soon.
 
None of this sounds useful. I’m sure it’s faster in most cases to just go up to the restaurant and look at the hours or go inside and make a res. If it’s closed, open Google Maps (Apple Maps doesn’t have enough good, worldwide data yet) and click on the place that’s right across the street? These don’t require the use of wasteful AI compute power.

What’s the logic of disallowing AI analysis of previously taken photos?

I think the search function is okay, and I guess the ask thing could maybe be useful, but ChatGPT is so unreliable. I took a picture of a label of some vegan chips and asked if the chips were vegan, and it said no, the chips had cheese. The label had vegan cheese flavor listed, not cheese.

This is a feature that does seem helpful for those with low vision. It's good at reading things out loud and describing what's in front of you.
 
Honestly, these are all features that the Google app has been able to do for years. Pretty sad that Apple couldn't have built this as a quality-of-life update that could get people excited about Apple Intelligence on all models. If you wanna try out these features, just download the Google app and see what you're missing. It's not anything special that should make anyone with an iPhone 13/14/15 "need" to update.
 
Pure upsell tactics. Joke’s on Apple though, I’m all out of FOMO.

I’m content with my 15Pro.
Right there with you. After some back and forth I finally got rid of a personal 13 Pro Max and got an S24 Ultra, and it's amazing; I've missed having so much engagement and delight with a phone. I've also got a Pixel 9 Pro and a 16 Pro Max for work, and the 16 felt like my 15 before it, and the 14 before that, and the 13 I just sold off (all of them feeling no different than the last, going all the way back to the X, which was amazing). Apple is not making anything engaging these last 5+ years, and I fear they've lost the bead on fun devices you want to pick up and noodle around with.
 
Thank you! And from what I've gathered from reading this article, Samsung has much cooler AI features and so does Pixel, and there are rumours that Google will release the AI on older Pixel models soon.

What features are you talking about from personal experience? There are a thousand things one can say, some good, some right, so without specifics... yeah, I have read that the Samsung and Google features are artificial dumb features.
 
What features are you talking about from personal experience? There are a thousand things one can say, some good, some right, so without specifics... yeah, I have read that the Samsung and Google features are artificial dumb features.
Side-by-side I find text formatting better on Android. Smart removal of people in the background of a photo, or even taking a photo and inserting someone (like the camera person holding it and then going and standing in while one of the other participants takes the second photo). Just a few things I find useful. Gemini as an assistant is genuinely helpful and I've had zero wrong answers or "I'll search the web for" moments. Context building based on conversation is impressive as well. For me, converting handwriting on the S24 Ultra is deadly accurate even with my chicken scratch. The Google Pixel weather app trounces Apple's built-in offering (Apple ruined Dark Sky). I could go on, but by every metric Apple is lagging pretty badly. I desperately want nouveau Siri to be better, but she sucks in much the same way as before and in a few excitingly frustrating ways that had me impressed. Not even mad. You almost have to try to be this awful. Message summaries don't work well, with hilarious results. One cool feature is being able to draw in objects and have AI fill it in with an appropriately generated image. Moving items around in images seems better and more consistent on Pixel and Galaxy too. Just my observations since I own all 3.

I'm so frustrated; I want Siri, and by extension Apple Intelligence, to be best in class. But from their slightly misleading marketing of as-yet-unreleased features prior to this new update, to their goofy diversions with Image Playground and Genmoji, I keep asking myself who finds utility in this after the novelty wears off?
 
Side-by-side I find text formatting better on Android. Smart removal of people in the background of a photo, or even taking a photo and inserting someone (like the camera person holding it and then going and standing in while one of the other participants takes the second photo). Just a few things I find useful. Gemini as an assistant is genuinely helpful and I've had zero wrong answers or "I'll search the web for" moments. Context building based on conversation is impressive as well. For me, converting handwriting on the S24 Ultra is deadly accurate even with my chicken scratch. The Google Pixel weather app trounces Apple's built-in offering (Apple ruined Dark Sky). I could go on, but by every metric Apple is lagging pretty badly. I desperately want nouveau Siri to be better, but she sucks in much the same way as before and in a few excitingly frustrating ways that had me impressed. Not even mad. You almost have to try to be this awful. Message summaries don't work well, with hilarious results. One cool feature is being able to draw in objects and have AI fill it in with an appropriately generated image. Moving items around in images seems better and more consistent on Pixel and Galaxy too. Just my observations since I own all 3.

I'm so frustrated; I want Siri, and by extension Apple Intelligence, to be best in class. But from their slightly misleading marketing of as-yet-unreleased features prior to this new update, to their goofy diversions with Image Playground and Genmoji, I keep asking myself who finds utility in this after the novelty wears off?

If my phone was my only device, I might agree with you. Might. Well okay, probably not. But I also have several computers, tablets, and watches and I have tried the alternatives to Apple's ecosystems and I find the total experience to be a pale comparison. So go on with naming this feature or that, but the sum total of the Apple experience trounces what you are talking about, to use your phrasing. So yeah, really don't care. Apple's weather app does the job for me. I don't write on my screen that often, the keypad works fine for me. And I can google fine without the aid of some AI assistance. But okay, draw your images :)

Choice is good. Glad you have found your choice. I have found mine.
 
If my phone was my only device, I might agree with you. Might. Well okay, probably not. But I also have several computers, tablets, and watches and I have tried the alternatives to Apple's ecosystems and I find the total experience to be a pale comparison. So go on with naming this feature or that, but the sum total of the Apple experience trounces what you are talking about, to use your phrasing. So yeah, really don't care. Apple's weather app does the job for me. I don't write on my screen that often, the keypad works fine for me. And I can google fine without the aid of some AI assistance. But okay, draw your images :)

Choice is good. Glad you have found your choice. I have found mine.
Neutrally, I was responding to your question and nothing more. I'm not here to convince anyone. If anything, I want my iPhone to be better than the subpar experience it currently offers, and get closer to what is on offer with the other platforms. I think Apple has leaned too far into novelty and kitsch without a critical eye on whether the effort to design Genmoji could have been better applied to make summaries not suck, or to offer something useful along the lines of Circle to Search.

I find this more interesting than Apple's vision because of the actual day-to-day utility it offers.
 