
MacRumors

macrumors bot
Original poster
Apr 12, 2001
64,968
33,120


Apple today previewed a wide range of new accessibility features for the iPhone, iPad, and Mac that are set to arrive later this year.

[Image: Apple accessibility features on an iPad and iPhone 14 Pro Max Home Screen]

Apple says that the "new software features for cognitive, speech, and vision accessibility are coming later this year," which strongly suggests that they will be part of iOS 17, iPadOS 17, and macOS 14. The new operating systems are expected to be previewed at WWDC in early June before launching in the fall.

Assistive Access

Assistive Access distills iPhone and iPad apps and experiences to their core features. The mode includes a customized experience for Phone and FaceTime, which are combined into a single Calls app, as well as Messages, Camera, Photos, and Music. The feature offers a simplified interface with high-contrast buttons and large text labels, as well as tools to help tailor the experience. For example, users can choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for those who prefer text.

[Image: Assistive Access mode apps]


Live Speech and Personal Voice Advance Speech Accessibility

Live Speech on the iPhone, iPad, and Mac allows users to type what they want to say and have it spoken out loud during phone and FaceTime calls, as well as in-person conversations. Users can also save commonly used phrases to chime into conversations quickly.

[Image: Live Speech]

Users at risk of losing their ability to speak, such as those with a recent diagnosis of amyotrophic lateral sclerosis (ALS), can use Personal Voice to create a digital voice that sounds like them. Users simply need to read along with a randomized set of text prompts to record 15 minutes of audio on an iPhone or iPad. The feature uses on-device machine learning to keep users' information secure and private, and integrates with Live Speech so users can speak with their Personal Voice.

Detection Mode in Magnifier and Point and Speak

In the Magnifier app, Point and Speak helps users interact with physical objects that have several text labels. For example, while using a household appliance, Point and Speak combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on buttons as users move their finger across the keypad.



Point and Speak is built into the Magnifier app on iPhone and iPad, works with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment more effectively.

Other Features

  • Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to a Mac with specific customization options.
  • Voice Control gains phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound similar, like "do," "due," and "dew."
  • Voice Control Guide helps users learn tips and tricks about using voice commands as an alternative to touch and typing.
  • Switch Control can now turn any switch into a virtual video game controller.
  • Text Size is now easier to adjust across Mac apps including Finder, Messages, Mail, Calendar, and Notes.
  • Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
  • Users can customize the speed at which Siri speaks to them, with options ranging from 0.8x to 2x.
  • Shortcuts gains a new "Remember This" action, helping users with cognitive disabilities create a visual diary in the Notes app.

Article Link: Apple Previews iOS 17 Accessibility Features Ahead of WWDC
 
Last edited:

drumcat

macrumors 65816
Feb 28, 2008
1,176
2,880
Otautahi, Aotearoa
I don't see the point of divulging them now when WWDC is less than a month away. Any ideas?
From the linked Apple site:

> Celebrating Global Accessibility Awareness Day Around the World
>
> To celebrate Global Accessibility Awareness Day, this week Apple is introducing new features, curated collections, and more:

etc. etc.

Accessibility is a win for everyone. Once again, a heartfelt thanks to Apple for continuing to push this relentlessly. You never know when you'll need it.
 

CarAnalogy

macrumors 601
Jun 9, 2021
4,724
8,633
This is all excellent stuff. Some of these things I think they might just make standard, like Calls instead of a separate FaceTime app.

I would like to see a little more thought to accessibility in their default design though, like higher contrast and buttons that aren’t so hard to hit on the Mac.

Tangentially related: One time, exactly one time, I used my iPhone to look up how to pair a hearing aid with an iPhone to help an elderly relative. Pro tip, don’t do that. Now every ad I get in Apple News is for hearing aids and retirement plans.
 

Crowbot

macrumors 68000
May 29, 2018
1,786
4,070
NYC
Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.

This is good for me. I can't stand all those moving objects in my periphery when I'm trying to read the newspaper.
 

phill85

macrumors regular
Jul 19, 2010
212
1,190
I have personally had patients suffering from ALS. Check out the Steve Gleason movie if you need a refresher… And the current gold-standard iPhone app that offers a voice-banking feature costs $300. So, to have this for free, this part is absolutely epic:

Live Speech and Personal Voice Advance Speech Accessibility

For users at risk of losing their ability to speak — such as those with a recent diagnosis of amyotrophic lateral sclerosis (ALS) or other conditions that can progressively impact speaking ability — Personal Voice is a simple and secure way to create a voice that sounds like them.
Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on ‌iPhone‌ or ‌iPad‌. This speech accessibility feature uses on-device machine learning to keep users' information private and secure, and integrates with Live Speech so users can speak with their Personal Voice when connecting with loved ones.
 
Last edited:

Lounge vibes 05

macrumors 68040
May 30, 2016
3,769
10,861
Detection Mode in Magnifier and Point and Speak

In the Magnifier app, Point and Speak helps users interact with physical objects that have several text labels. For example, while using a household appliance, Point and Speak combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on buttons as users move their finger across the keypad.

Point and Speak is built into the Magnifier app on iPhone and iPad, works with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment more effectively.
For those wondering when Apple is going to enter the AI game… it's this. This is the kind of stuff Apple has been, and will keep, using AI and machine learning to create for years.
Not a silly, false-info-littered essay-writing machine, but actually useful stuff that will help the people who need it without vacuuming up their data.
 

NotTooLate

macrumors 6502
Jun 9, 2020
444
891
For those wondering when Apple is going to enter the AI game… it's this. This is the kind of stuff Apple has been, and will keep, using AI and machine learning to create for years.
Not a silly, false-info-littered essay-writing machine, but actually useful stuff that will help the people who need it without vacuuming up their data.
Both are remarkable use cases. No need to call Midjourney or ChatGPT silly; they are far from silly and are changing the world as we speak. And sure, for folks with ALS, Apple's use of ML is life-changing as well!
 