
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,544
30,855


Apple today previewed a range of new accessibility features, including Door Detection, Apple Watch Mirroring, Live Captions, and more.

[Image: Apple-Accessibility-OS-features-2022]

Door Detection will allow individuals who are blind or have low vision to use their iPhone or iPad to locate a door upon arriving at a new destination, understand how far away it is, and get a description of the door's attributes, including how it can be opened and any nearby signs or symbols. The feature will be part of a new "Detection Mode" in Magnifier, alongside People Detection and Image Descriptions. Door Detection will only be available on iPhones and iPads with a LiDAR scanner.

Users with physical disabilities who rely on assistive features like Voice Control and Switch Control will be able to fully control their Apple Watch Series 6 or Series 7 from their iPhone using Apple Watch Mirroring, which works via AirPlay. This lets them control the watch with inputs such as voice commands, sound actions, head tracking, and more.

New Quick Actions on the Apple Watch will allow users to use a double-pinch gesture to answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout.

Deaf users and those who are hard of hearing will be able to follow Live Captions across the iPhone, iPad, and Mac, providing a way for users to follow any audio content more easily, such as during a phone call or when watching video content. Users can adjust the font size, see Live Captions for all participants in a group FaceTime call, and type responses that are spoken aloud. English Live Captions will be available in beta on the iPhone 11 and later, iPad models with the A12 Bionic and later, and Macs with Apple silicon later this year.

Apple will expand support for VoiceOver, its screen reader for blind and low vision users, with 20 new languages and locales, including Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese. In addition, users will be able to select from dozens of new optimized voices across languages, and a new Text Checker tool will help find formatting issues in text.

There will also be Sound Recognition for unique home doorbells and appliances, adjustable response times for Siri, new themes and customization options in Apple Books, and sound and haptic feedback for VoiceOver users in Apple Maps to find the starting point for walking directions.

The new accessibility features will be released later this year via software updates. For more information, see Apple's full press release.

To celebrate Global Accessibility Awareness Day, Apple also announced plans to launch SignTime in Canada on May 19 to support customers with American Sign Language (ASL) interpreters, launch live sessions in Apple Stores and social media posts to help users discover accessibility features, expand the Accessibility Assistant shortcut to the Mac and Apple Watch, highlight accessibility features in Apple Fitness+ such as Audio Hints, release a Park Access for All guide in Apple Maps, and flag accessibility-focused content in the App Store, Apple Books, the TV app, and Apple Music.

Article Link: Apple Previews New Door Detection, Apple Watch Mirroring, and Live Captions Accessibility Features
 
  • Love
Reactions: Apple$

t0rqx

macrumors 68000
Nov 27, 2021
1,591
3,717
Sounds great. Still, they broke Siri smart features via HomeKit with watchOS 8. So what's the deal? You always need to manually confirm a voice command via the Apple Watch.
 

MikhailT

macrumors 601
Nov 12, 2007
4,582
1,325
Apple again leads in accessibility. Love the Live captions and door detection.
To be fair, Android already has this Live Captions feature, as does Google Chrome. I've had to rely on it on all platforms.

Microsoft announced Live Captions and has been testing it in Windows 11 Insider builds for a few months now.

Apple is late as usual, but I'm sure theirs will be the best-implemented version, as that's just how they are.

Regardless, everyone wins here. We need more accessibility support across the industry.
 

arnoz

macrumors regular
Jun 20, 2007
233
194
Switzerland
Sounds great. Still, they broke Siri smart features via HomeKit with watchOS 8. So what's the deal? You always need to manually confirm a voice command via the Apple Watch.
Looks like most, if not all, of my recent watchOS bugs/issues, including Siri being more useless than ever, have been fixed with yesterday's update.
 

eilavid

macrumors regular
Oct 25, 2021
109
841
To be fair, Android already has this Live Captions feature, as does Google Chrome. I've had to rely on it on all platforms.

Microsoft announced Live Captions and has been testing it in Windows 11 Insider builds for a few months now.

Apple is late as usual, but I'm sure theirs will be the best-implemented version, as that's just how they are.

Regardless, everyone wins here. We need more accessibility support across the industry.
I think the difference is that Google does all processing on their servers, while Apple's implementation is on-device only and works offline (not to mention your conversation stays private).
 

kc9hzn

macrumors 68000
Jun 18, 2020
1,582
1,896
To be fair, Android already has this Live Captions feature, as does Google Chrome. I've had to rely on it on all platforms.

Microsoft announced Live Captions and has been testing it in Windows 11 Insider builds for a few months now.

Apple is late as usual, but I'm sure theirs will be the best-implemented version, as that's just how they are.

Regardless, everyone wins here. We need more accessibility support across the industry.
I wouldn't say that Apple is late as usual when it comes to accessibility features, as Apple usually does make those a priority. It's just that there's quite a bit more work involved in developing AI workflows that run locally than ones in the cloud, where you can add a couple thousand blade servers to address performance and availability issues. Especially where accessibility is concerned, you want to get new accessibility features into the hands of as many users who need them as possible. That means you want to avoid depending on the hardware features and performance of the newest phones, so you ideally need local performance that still works well on even the oldest currently supported device. That means a lot more time is spent optimizing for performance than in a cloud-based solution (not to say that no optimization occurs in the cloud, just less).
 
  • Like
Reactions: Gasu E. and eilavid

Apple$

macrumors 6502
May 21, 2021
349
649
Better late than never, Apple. As a CI Android user, I love the live captions feature so much! It's just so handy when you're watching a YouTube video that doesn't have captions. Instead of skipping it as I did in the past, I just turn on the live captions.
 

NoGood@Usernames

macrumors regular
Dec 3, 2020
235
287
United States
I think the difference is that Google does all processing on their servers, while Apple's implementation is on-device only and works offline (not to mention your conversation stays private).
Actually, Google's Live Caption is done entirely on-device and does not require an internet connection to function. They have been moving more and more voice request processing on-device over the past few years.
 

iStorm

macrumors 68000
Sep 18, 2012
1,766
2,201
Actually, Google's Live Caption is done entirely on-device and does not require an internet connection to function. They have been moving more and more voice request processing on-device over the past few years.
This is correct. Taken from the Android Accessibility Help page: "All captions are processed locally, never stored, and never leave your device."

When it comes to accessibility, users need anything that can help them now. They can't sit around and wait for something else, so I would say Apple is late to the game here. A co-worker of mine switched to Android several years ago so he could use the Live Caption feature for meetings. Previously, he was using a captioning service over the phone but was not a fan of having another live person listening in on his meetings.
 

Cameront9

macrumors 6502a
Aug 6, 2006
961
499
I have been waiting for Live Captions in FaceTime for years--ever since they first had the technology in the Clips app. I watched as Live Captions got added to Teams and then Zoom during the pandemic. I currently start up a Zoom call and just don't dial anyone to get live captions for meetings and conversations in the car. I'm hoping the live captions on FaceTime work even better than that.

I hope the feature is easy to turn off and on, as it sounds like it can caption any on-device audio. I don't need live captions on content that is already subtitled, for example. It also remains to be seen what the in-person interface looks like.

As a deaf user who doesn't use ASL, this will be a godsend.
 
  • Like
Reactions: MisterSavage

MikhailT

macrumors 601
Nov 12, 2007
4,582
1,325
I think the difference is that Google does all processing on their servers, while Apple's implementation is on-device only and works offline (not to mention your conversation stays private).
As others have mentioned, Google has moved it to be on-device only, and they're moving more and more of their ML work onto the device with their Tensor chips.

Remember, Apple did in fact upload Siri voice recordings to their servers as well, which got them in trouble: https://www.macrumors.com/how-to/delete-siri-audio-history-opt-out-audio-sharing/ Apple isn't fully innocent here.

Apple, just like Google, has been trying to move things on-device, with each generation of their silicon adding more dedicated hardware support for this (the Neural Engine).

Also, Microsoft is the same; on-device only.

Importantly, live captions on Windows 11 leverage state-of-the-art speech recognition while staying completely local to your device. This means that once they are set up, they are always available, without an internet connection; they are instantly responsive; and they can be trusted to respect users’ privacy because they are generated on device, and don’t send any content to the cloud. Live captions are available with support for U.S. English.

Source: https://blogs.windows.com/windowsex...-accessibility-features-coming-to-windows-11/
 

Lounge vibes 05

macrumors 68040
May 30, 2016
3,576
10,517
For anyone who thinks AR glasses are a solution in search of a problem, I'm telling you right now that door detection on a pair of glasses would be a life changer.
Coming from someone who has no usable vision.
 

polyphenol

macrumors 68000
Sep 9, 2020
1,894
2,247
Wales
It is always welcome to read of new accommodations being provided to improve accessibility.

I often use captions to avoid having to listen to the burble or shouting or whatever it is on the soundtrack of so many videos.

I wish they would provide an equaliser and a way of tuning it to my ears. I'm not deaf, just have a notch in my hearing and the usual effects of aging. I find I have to turn the volume up higher than I'd like in order to catch everything. Would be so helpful for an app to make sounds, and me to respond, through the whole audible range. At the end, it should have come up with settings that suit me, with the device I am using, in the environment I'm in.

As a regular speech listener, it might also help to adjust the dynamic range, so I don't miss important whispers!

Obviously, including external non-Apple speakers and the ability to switch automatically if I change output device.
 

MikhailT

macrumors 601
Nov 12, 2007
4,582
1,325
So accessibility features don’t deserve a keynote anymore?
They're previewing their upcoming changes because of the upcoming Global Accessibility Awareness Day, which is May 19.

WWDC is too late for this announcement, but they will talk about it there as well, as they have for the past few years.
 

MikhailT

macrumors 601
Nov 12, 2007
4,582
1,325
It is always welcome to read of new accommodations being provided to improve accessibility.

I often use captions to avoid having to listen to the burble or shouting or whatever it is on the soundtrack of so many videos.

I wish they would provide an equaliser and a way of tuning it to my ears. I'm not deaf, just have a notch in my hearing and the usual effects of aging. I find I have to turn the volume up higher than I'd like in order to catch everything. Would be so helpful for an app to make sounds, and me to respond, through the whole audible range. At the end, it should have come up with settings that suit me, with the device I am using, in the environment I'm in.

As a regular speech listener, it might also help to adjust the dynamic range, so I don't miss important whispers!

Obviously, including external non-Apple speakers and the ability to switch automatically if I change output device.
If you haven't, I suggest seeing an audiologist to test your hearing; there are hearing aids that have this tuning built in to match your frequency loss. What you're describing is what many hard-of-hearing (HoH) users, myself included, experience: many have loss in specific frequency ranges and wear hearing aids to amplify the frequencies they can't hear while leaving the frequencies they can hear alone. I can hear certain sounds at a perfect level, but others I can't.

The issue with increasing the volume to catch everything is that you're going to harm the remaining hair cells, and they cannot recover, so you'll end up with even more hearing loss. You should avoid that by getting hearing aids instead.

Also, AirPods and iPhone already have an EQ for this via their hearing accessibility support: https://support.apple.com/en-us/HT211218
 

_Spinn_

macrumors 601
Nov 6, 2020
4,857
10,041
Wisconsin
This is great.
Deaf users and those who are hard of hearing will be able to follow Live Captions across the iPhone, iPad, and Mac, providing a way for users to follow any audio content more easily, such as during a phone call or when watching video content. Users can adjust the font size, see Live Captions for all participants in a group FaceTime call, and type responses that are spoken aloud.
I can see this being useful for people without hearing impairments as well.
 

polyphenol

macrumors 68000
Sep 9, 2020
1,894
2,247
Wales
If you haven't, I suggest seeing an audiologist to test your hearing; there are hearing aids that have this tuning built in to match your frequency loss. What you're describing is what many hard-of-hearing (HoH) users, myself included, experience: many have loss in specific frequency ranges and wear hearing aids to amplify the frequencies they can't hear while leaving the frequencies they can hear alone. I can hear certain sounds at a perfect level, but others I can't.

The issue with increasing the volume to catch everything is that you're going to harm the remaining hair cells, and they cannot recover, so you'll end up with even more hearing loss. You should avoid that by getting hearing aids instead.

Also, AirPods and iPhone already have an EQ for this via their hearing accessibility support: https://support.apple.com/en-us/HT211218
I do appreciate your reply.

When my hearing was diagnosed, it was a full hospital job including an MRI. I don't need it so loud that I think it would be threatening my hearing, just a bit louder than I would like. Yes, I understand the hair cells issue.

I do have some earbuds, but they are not supported (Xiaomi). And I tend not to like headphones; I prefer the feeling of hearing my environment. But what that does sounds just right - please extend it to other (non-Apple) devices!
 