
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,889
31,449


One of the new iOS 18 accessibility features that Apple previewed today is a Reader Mode in the Magnifier app, which will convert words in images to uniform lines of text. Apple did not provide any specific details about the feature, but it shared a screenshot showing that users will be able to change the font and have the text read aloud.

Magnifier-Reader-Mode-Article.jpg

Apple also revealed that iOS 18 will allow iPhone users to easily launch the Magnifier app's Detection Mode with the Action button, which debuted on the iPhone 15 Pro models and is rumored to expand to the entire iPhone 16 lineup later this year. Detection Mode can identify and read aloud all text within an iPhone camera's field of view.

It is already possible to set the Action button to open the Magnifier app in general, so this will be an expansion of the app's integration with the button.

iOS 18 is expected to be unveiled at Apple's developers conference, WWDC, on June 10, and the update should be widely released in September.

Article Link: Apple Previews iOS 18's Upgraded Magnifier App With New Reader Mode
 

steve123

macrumors 65816
Aug 26, 2007
1,012
594
Sounds good. I hope Apple gives some thought to providing a way to help people remotely manage an elderly parent's device. I would like to be able to block certain types of websites and phishing attempts, and to see the screen of my Mom's phone or iPad remotely so I can help her when she encounters a problem. The only option at the moment is to have her point her phone camera at the screen of the iPad, but this is confusing for her. If I could use ARD and iCloud to view and control the device, that would be ideal.
 

Lounge vibes 05

macrumors 68040
May 30, 2016
3,654
10,615
steve123 said:
Sounds good. I hope Apple gives some thought to providing a way to help people remotely manage an elderly parent's device. I would like to be able to block certain types of websites and phishing attempts, and to see the screen of my Mom's phone or iPad remotely so I can help her when she encounters a problem. The only option at the moment is to have her point her phone camera at the screen of the iPad, but this is confusing for her. If I could use ARD and iCloud to view and control the device, that would be ideal.
Not exactly what you're looking for, of course, but screen sharing is built into FaceTime and has been for quite a while.
 

amaze1499

macrumors 65816
Oct 16, 2014
1,016
998
Why would Apple do something like this, revealing features four weeks before the actual introduction of iOS 18?
 

dandy1117

macrumors regular
Sep 18, 2012
145
361
Damage control ... they are behind on AI integration and playing catch up.
Apple has been previewing accessibility features of its upcoming OS for years. (But don't let a little research get in the way of a hot take! 😜)

 

zhtfreak

macrumors member
Oct 23, 2021
34
38
I upgraded to a 15 Pro expecting something new that would make use of the LiDAR sensor, but I'm glad I did anyway. Being able to launch Detection Mode with the Action button will make me use it more.

VoiceOver has a custom gesture to bring it up, but the Action button will be so much easier. My beef right now is that there isn't much documentation on how some of these features work. Point and Speak, which they added last year, had potential, but I don't know anyone who uses it. I've tried and can't figure it out. Granted, I have no usable vision at all, but there's nowhere that explains what the different sounds mean. The YouTube video on their accessibility channel shows it in perfect conditions and doesn't explain much about how to use it optimally. </rant>
 

tivoboy

macrumors 68040
May 15, 2005
3,997
803
steve123 said:
Sounds good. I hope Apple gives some thought to providing a way to help people remotely manage an elderly parent's device. I would like to be able to block certain types of websites and phishing attempts, and to see the screen of my Mom's phone or iPad remotely so I can help her when she encounters a problem. The only option at the moment is to have her point her phone camera at the screen of the iPad, but this is confusing for her. If I could use ARD and iCloud to view and control the device, that would be ideal.
But you can already do this: just start a screen share in FaceTime.

As for the Magnifier: I wouldn't waste the Action button's single assignment on this, when a double or triple tap of the main button can activate it, or a select list of other actions.

Everyone I show this to (I'm early Gen X, with friends and colleagues my age or older) says it changes their world, with use cases that make the phone an even greater tool for interacting with the world. Labels, menus, prescription text: so many companies print text in something like 1-2 point font, which is ridiculous. Even at my best I couldn't read the print on some documents today.

I would really like to see some type of AI (more just contextual interpretation) that looks at what it's, well, looking at and thinks: "OK, this is text, the font is 1-3 point or so, my owner likes to read text at 12-14 point, so I'm going to automagically present the content that way." Then it's just an activate, aim, read type of usage.
 

CarAnalogy

macrumors 601
Jun 9, 2021
4,301
7,908
The AI completely fails to transcribe "LUNCH BOX" in the screenshot.

I don't know if it's the image recognition library they are using or what, but I have noticed it completely missing things in Live Text, translation, etc.
 