
MacRumors

macrumors bot
Original poster
Apr 12, 2001


With the new Liquid Glass design taking the spotlight, Apple didn't spend much time discussing Apple Intelligence at WWDC 2025, and there was no mention of the delayed Siri features. Apple Intelligence wasn't a focus, but Apple is continuing to build it out in iOS 26, with new features and updates to some existing ones.

iOS-26-Apple-Intelligence-Features.jpg

We've outlined what's new with Apple Intelligence below.

Live Translation

Live Translation works in Messages, FaceTime, and Phone. It automatically translates both spoken and text conversations when the people conversing don't speak the same language.

messages-live-translation.jpg

In a Messages conversation, tap the person's name and then toggle on the Automatically Translate option. From there, you can select the language you want your conversations translated into. Language packs vary in size, but they are around 900MB.

ios-26-select-language-messages.jpg

Language options include English (US), English (UK), Chinese (Mandarin, Simplified), French, German, Italian, Japanese, Korean, Portuguese (Brazil), and Spanish (Spain). You can set both the translate to and the translate from languages.

The messages that you send to someone will show up in both your language and the translated language on your iPhone, while the person on the other end sees the message only in their language. Messages they respond with will show both their language and the translated language.

messages-ios-26-live-translation-2.jpg

Live Translation works similarly in the Phone and FaceTime apps; it needs to be turned on for each conversation, and language assets need to be downloaded. In the Phone app, Live Translation uses an AI voice to translate spoken content quickly and efficiently, but you can also see a transcript of the conversation.

ios-26-phone-live-translation.jpg

In FaceTime, you'll see translated captions for speech, so you'll hear what the person is saying in their own language while also being able to read live captions with a translation.

ios-26-live-translation-facetime.jpg

To use these features, both participants need Live Translation, meaning an Apple Intelligence-enabled iPhone, iPad, or Mac running the 26 series software. In Messages, though, if you have Live Translation turned on and you're chatting with someone who has an older device, they can type in their language and you will see the translation. Your responses aren't translated into their language.

Onscreen Visual Intelligence

In iOS 26, you can use Visual Intelligence with content that's on your iPhone, asking questions about what you're seeing, looking up products, and more.

visual-lookup-ios-26.jpg

Visual Intelligence works with screenshots, so if you take a screenshot on your iPhone and tap into it, you'll see new Ask and Image Search buttons. With Ask, you can ask ChatGPT a question about what's in the screenshot. The image is sent to ChatGPT, and ChatGPT is able to provide a response.

ios-26-visual-intelligence-ask.jpg

Image Search offers two options. You can send a whole screenshot to Google or another app, or use the Highlight to Search feature to select something specific in the screenshot. Just use a finger to draw over what you want to look up, then swipe up to conduct a search.

ios-26-highlight-search.jpg

You can search Google Images, Etsy, and other apps that implement support for the feature.

If there's an event in your screenshot, Visual Intelligence will pop up an "Add to Calendar" option so the event can be added directly to the Calendar app. It will also automatically suggest identifications for animals, plants, sculptures, landmarks, art, and books.

ios-26-visual-intelligence-identification.jpg

Wallet Order Tracking

Apple Wallet can scan your emails to identify order and tracking information, adding it to the Orders section of the Wallet app. The feature works for all of your purchases, even those not made using Apple Pay.

ios-26-order-tracking-wallet.jpg

Automatic order detection can be enabled in the Wallet app settings under Order Tracking. Once turned on, you can see your orders by opening up Wallet, tapping on the "..." button, and choosing the Orders section.

Tapping into an order will provide you with the merchant name, order number, and tracking number, if available. You can also see the relevant email that the order information came from, and tap it to go straight to the message in the Mail app.

Image Playground

Apple quietly upgraded Image Playground, and the images that it generates using the built-in Animation, Sketch, and Illustration styles have improved. Faces and eyes look more natural, hair is more realistic, and it's overall better at generating a cartoonish image that looks similar to a person.

ios-26-image-playground-emotion.jpg
Image Playground in iOS 18 on left, iOS 26 version on right


The change is most notable with people, but objects, food, and landscapes have improved too. We have a full Image Playground guide with more info.

Article Link: iOS 26: All the New Apple Intelligence Features
 
I'm honestly not upset about the delayed features because it wasn't promised with the 15 Pro. Of all the A.I. options, I have enjoyed Grok the most. Co-pilot is ok.
 
I've been hoping that the language translation will at some point work with the text in any app.
So if there's an app you wish to use that is only available in a language other than your own, iOS would translate the displayed text into your chosen language.
That wouldn't be too difficult to do, I'd think, being an app developer myself.
It would be handy for travelling if there's a popular app in the country you're in that provides handy services or info.
Yes, it would also be nice if I, as an app developer, didn't need to deal with all the translation strings to extend my user-base reach. EDIT: It could be opt-in for app publishers, much like allowing iOS apps to be installed on macOS.
That would keep those who want tight control of their translations happy.
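For readers curious what the "translation strings" busywork being described looks like, here's a minimal sketch of the standard Foundation approach an iOS developer maintains today (the string keys and comments are illustrative, not from any real app):

```swift
import Foundation

// Each user-facing string is looked up by key in a Localizable.strings
// table; the developer must ship, and keep in sync, one table per language.
let greeting = NSLocalizedString(
    "greeting.title",                      // key, translated in every .strings file
    comment: "Shown on the welcome screen" // context note for human translators
)

// Formatted strings (and plural rules, via .stringsdict) also need
// per-language entries maintained by hand or via a translation service.
let itemCount = String.localizedStringWithFormat(
    NSLocalizedString("%d items", comment: "Count of items in a list"),
    42
)

print(greeting)   // falls back to the key when no .strings table is bundled
print(itemCount)
```

An opt-in, system-level translation layer like the one suggested above would let apps skip shipping these per-language tables while publishers who want tight control could keep doing it manually.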
 
As I have an iPhone 14 Pro Max, iOS 26 has nothing for me, so I will not update my phone or watch. Hopefully the new iPhone will have a good upgrade; then I will get a new phone. If not, I will keep my 14 Pro Max.
 
I'm really hoping that Onscreen Visual Intelligence doesn't automatically pop up with every screenshot you take, because I frequently take screenshots that I want to immediately get out of the way so I can keep doing whatever I was doing and/or possibly take another screenshot.

I'm just worried they'll do a "here, we added a kewl new feature, and since it's new we're sure you'll want to use it all the time".
 
Live translation is a useful, if occasional, feature. The rest? Big meh. Call me when Siri has a brain: this morning it again managed to start playing random audiobook tracks when I asked it to play a music playlist from my local library. Yesterday it found things for me on the web when I asked what time it was in another city. The day before, it ignored me when I asked it to change the volume on my headphones while running.

Most of the apps listed have basic problems that don't need AI to solve (like Wallet not letting you use nicknames for credit cards or reorder payment methods). Apple needs to get the basic stuff fixed, then worry about generating emoji for middle school kids.
 
To use these features, both participants should have Live Translation, so an Apple Intelligence-enabled iPhone, iPad, or Mac that is running the 26 series software
What? How did they manage to ruin even this feature? Pixel phones implemented this feature years ago and don't require the second person to use a Pixel phone.
 
I don’t need any of these. But please fix Siri; have it actually understand and do tasks, especially organizing my Apple Music library.
 