
MacRumors

macrumors bot
Original poster
Apr 12, 2001


In iOS 15, Apple is introducing a new feature called Live Text that can recognize text when it appears in your camera's viewfinder or in a photo you've taken and let you perform several actions with it.


For example, Live Text allows you to capture a phone number from a storefront with the option to place a call, or look up a location name in Maps to get directions. It also incorporates optical character recognition, so you can search for a picture of a handwritten note in your photos and save it as text.

Live Text's content awareness extends to everything from QR codes to emails that appear in pictures, and this on-device intelligence feeds into Siri suggestions, too.


For instance, if you take a picture that shows an email address and then open the Mail app and start composing a message, Siri's keyboard suggestions will offer up the option to add "Email from Camera" to the To field of your message.

Other Live Text options include the ability to copy text from the camera viewfinder or from photos and paste it elsewhere, share it, look it up in the dictionary, or translate it into English, Chinese (Simplified and Traditional), French, Italian, German, Spanish, or Portuguese.


Live Text also feeds into Photos search, which can find your photos by location, people, scenes, objects, and more, including the text in pictures. For example, searching for a word or phrase in Spotlight will bring up pictures from your Camera Roll in which that text occurs.

Live Text works in Photos, Screenshot, Quick Look, and Safari, and in live previews with the Camera. In the Camera app, it's available whenever you point your iPhone's camera at anything that displays text; a small icon appears in the bottom-right corner whenever textual content is recognized in the viewfinder. Tapping the icon lets you select the recognized text and perform an action with it. A similar icon appears in the Photos app when you're viewing a photo you've taken.
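For anyone curious how this looks from the developer side: Live Text itself has no public API in the iOS 15 beta, but the same kind of on-device text recognition has been exposed to apps through the Vision framework since iOS 13. A minimal sketch (the function name and image handling here are illustrative, not Apple's Live Text implementation):

```swift
import UIKit
import Vision

// Sketch: run on-device OCR on a UIImage and return the recognized lines.
// VNRecognizeTextRequest is the public Vision API; Live Text's internals
// are not documented, so treat this as an analogue, not the real thing.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, error in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Each observation carries ranked candidate strings; take the best one.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate      // favor accuracy over speed
    request.usesLanguageCorrection = true     // apply language-model cleanup

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Vision runs this on the Neural Engine where available, which is presumably why the same hardware debate below applies to Live Text as well.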


In another Neural Engine feature, Apple is introducing Visual Look Up, which lets you take photos of objects and scenes to get more information about them. Point your iPhone's camera at a piece of art, a plant or animal, a landmark, or a book, and the Camera app will indicate with an icon that it recognizes the content and has relevant Siri Knowledge to add context.

Since Live Text relies heavily on Apple's Neural Engine, the feature is only available on iPhones and iPads with an A12 Bionic chip or later. That means if you have an iPhone X or earlier, or an iPad older than the iPad mini (5th generation), iPad Air (2019, 3rd generation), or iPad (2020, 8th generation), you unfortunately won't have access to it.

The iOS 15 beta is currently in the hands of developers, with a public beta set to be released next month. The official launch of iOS 15 is scheduled for the fall.

Article Link: iOS 15's Live Text Feature Lets You Digitize Written Notes, Call a Number on a Sign, Translate a Menu, and Much More
 
Since Live Text relies heavily on Apple's Neural Engine, the feature is only available on iPhones and iPads with an A12 Bionic chip or later, which means if you have an iPhone X or earlier, or an iPad older than the iPad mini (5th generation), iPad Air (2019, 3rd generation), or iPad (2020, 8th generation), you won't have access to it.
Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.
 
I remember using an app called Word Lens in 2013 or so, which did live translation through the camera. It helped me a lot when I traveled to various countries in Europe. Was it acquired by Google later? Anyway, it’s great that Apple now offers the function natively, and even better.
 
Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.
They don’t want to. Hence why they aren’t enabling it for Intel Macs. It's just a cash grab: they hope everybody will sell their newish Intel Mac, which they paid handsomely for, at a huge loss, and then pay handsomely again for a new Apple Silicon Mac. Apple isn’t the company from heaven most seem to think it is.
 
They don’t want to. Hence why they aren’t enabling it for the intel macs. Just a cash grab as they hope everybody will sell their newish intel Mac, which they paid handsomely for, at a huge loss and then go and pay handsomely again for a new Apple silicon Mac.
I'm sure Apple could get it running on the A11, but people seem to forget there would be trade-offs to enable that to happen. What if enabling it on the A11 halved the battery life? Would that be a trade-off you'd be willing to make? Or what if the phone's performance dropped by 25%?

Apple can't (and shouldn't be expected to) support older hardware with every single new feature that's released. Apple does a far better job of supporting older hardware than its competitors do (the iPhone 6s, a 2015 phone, gets iOS 15), and those devices still get at least some new features and improvements.

I'm sure some people will want to upgrade their hardware to support the latest software additions, but I (and I'm sure like a lot of other folks like me with newish machines), won't be bothered enough. Nice features, but I won't lose sleep over not having them.

Apple isn’t the company from heaven as most seem to think.
Of course they're not - Apple is a money making enterprise.
 
They don’t want to. Hence why they aren’t enabling it for the intel macs. Just a cash grab as they hope everybody will sell their newish intel Mac, which they paid handsomely for, at a huge loss and then go and pay handsomely again for a new Apple silicon Mac. Apple isn’t the company from heaven as most seem to think.
Haha it seems that it was you who thought Apple is the company from heaven and now you’re disappointed?

Yeah, the A11 has a Neural Engine, just like your 1993 Intel Pentium PC has a CPU. That alone doesn’t mean anything. The performance would probably be terrible if they brought it to the iPhone X.

Speaking of the iPhone X, that phone easily gets hot and eats battery like nothing. Now that the battery has degraded, it’s even worse. I only use it as a backup phone, but even as a backup, the battery life is not that good.
 
The old Word Lens app (and its subsequent iterations after Google acquired it) was able to both capture and translate text in real time on substantially slower hardware.

Live Text would just be this, without the translation (whether on-device or not). I can’t see why the Neural Engine would be an absolute requirement, even if you try to make a ‘performance’ argument.

Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.

The later phones have an improved Neural Engine. I'm guessing that these are required in order to ensure good performance.
 
The old Word Lens app (and its subsequent iterations after Google acquired it) was able to both capture and translate text in real time on substantially slower hardware.

Live Text would just be this, without the translation (whether on-device or not). I can’t see why the Neural Engine would be an absolute requirement, even if you try to make a ‘performance’ argument.
If you ever used Word Lens for any sustained period, you'd know it absolutely hammered your battery. And just to be clear, that isn't a criticism of Word Lens - it was an exceptional app for its time; it just couldn't take advantage of optimised silicon.

The performance argument is *the* argument. Apple more than likely could implement it on all hardware, but what if it caused your battery life to be half of usual, or made Safari and the Camera app considerably slower while running CPU-bound neural networks on every downloaded image or every camera frame?

Also worth noting, text recognition isn't a trivial task, especially with handwriting.
 
I remember using an app called Word Lens in 2013 or so, which did live translation through the camera. It helped me a lot when I traveled to various countries in Europe. Was it acquired by Google later?
Correct - and the original functionality has been baked into the Google Translate app.
 
They don’t want to. Hence why they aren’t enabling it for the intel macs. Just a cash grab as they hope everybody will sell their newish intel Mac, which they paid handsomely for, at a huge loss and then go and pay handsomely again for a new Apple silicon Mac. Apple isn’t the company from heaven as most seem to think.
Ok it’s a cash grab…now what?
 
Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.
From memory, I recall reading that the A11 Bionic's Neural Engine was very limited in what it could do; for example, third-party apps couldn’t utilise it.

EDIT: I also believe it was something like 600 billion, then 1 trillion, up to 11 trillion operations per second, going from the A11's Neural Engine to later chips.
 
The feature is so quick! Just long-tap an image, then tap on “Show Text” and boom!
 
Forgive me if I'm wrong, but wasn't the A11 the first chip with a Neural Engine? So surely they could make this work on the X (and even the 8 and 8 Plus) if they REALLY wanted to.

Not wrong, and no doubt the same is true of many hundreds of other features if they REALLY wanted to. The reality is that all companies, even those of Apple's immense size, have limits on what can be done at a particular point in time, given finite resources and schedules.

Just because Apple is one of the largest and most successful companies in the world doesn't mean that everything under the sun that could be done can be done; resources are finite.
 
This feature is turning out to be quite surprising. Looking forward to getting my hands on it. Really liking the translation of foreign text.
 
Not disagreeing, but how so? I have an iPhone but use a lot of Google apps so I've had these abilities for years now. Why is the Apple one better?
You've been here long enough to know that there's a large cohort of posters who post stuff like this as fast as they can without putting any thought behind it.

There's a term for them, but we're not allowed to say it :p

I have used Google Lens, and I'm looking forward to testing this feature out on my iPhone. But people jumping in and claiming "Apple did it better!" before this thing is even in the wild is just pure fandom.
 