
MacRumors

macrumors bot
Original poster

iOS 18 introduced an accessibility feature called Music Haptics that has value for everyone. When the feature is turned on, the iPhone's Taptic Engine taps and vibrates to match the audio of a song playing in Apple Music, Shazam, and supported third-party apps, so long as the device is connected to a Wi-Fi or cellular network.
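Apple doesn't document how the Music Haptics tracks themselves are produced, but the general idea (tapping the Taptic Engine in time with the audio) is easy to picture with the public Core Haptics framework. The following is a minimal illustrative sketch, not Apple's implementation; the beat timestamps and strengths are invented example values.

import Foundation
import CoreHaptics

// Illustrative only: plays sharp transient taps at hard-coded "beat" times.
// Music Haptics ships its own haptic tracks; these timestamps are invented.
final class BeatHaptics {
    private var engine: CHHapticEngine?

    func play() throws {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

        let engine = try CHHapticEngine()
        try engine.start()
        self.engine = engine

        // Hypothetical beat timestamps (seconds) and strengths.
        let beats: [(time: TimeInterval, strength: Float)] = [
            (0.0, 1.0), (0.5, 0.6), (1.0, 1.0), (1.5, 0.6), (2.0, 1.0)
        ]

        // One transient "tap" per beat, stronger taps on the downbeats.
        let events = beats.map { beat in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: beat.strength),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
                ],
                relativeTime: beat.time
            )
        }

        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }
}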

iOS-18-Music-Haptics.jpg

With iOS 19, Music Haptics will get better in two ways.

Apple today announced that Music Haptics will be even more customizable starting later this year. First, users will have the option to receive haptic feedback for vocals only. Second, users will be able to adjust the overall intensity of taps, textures, and vibrations. These enhancements are expected to roll out with iOS 19, which will be unveiled during the WWDC 2025 keynote on June 9 and released to the general public in September.
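Apple hasn't said how the new intensity setting will be implemented, but Core Haptics already gives developers a rough analogue: a dynamic intensity parameter that scales every event in a playing pattern. As a sketch only, assuming the player from the example above and a hypothetical user-facing slider value:

import CoreHaptics

// Sketch: scale the strength of an already-playing Core Haptics pattern.
// `player` is the CHHapticPatternPlayer from the sketch above;
// `sliderValue` (0.0-1.0) stands in for a hypothetical user setting.
func applyHapticIntensity(_ sliderValue: Float, to player: CHHapticPatternPlayer) throws {
    let intensity = CHHapticDynamicParameter(
        parameterID: .hapticIntensityControl,
        value: sliderValue,      // 1.0 = full strength, 0.5 = roughly half, etc.
        relativeTime: 0
    )
    try player.sendParameters([intensity], atTime: CHHapticTimeImmediate)
}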

Music Haptics is supported on the iPhone 12 and newer, excluding the latest iPhone SE.

Article Link: iOS 19 to Upgrade iPhone's Music Haptics Feature in Two Ways
 
Huh. There’s a Play Sample button in the Music Haptics section. I tried it out. It’s an interesting feature, but not for me.
 
Had no idea this was even a thing. Just played Midnight City by M83, and it’s spot-on. The accessibility team at Apple is still immune to service-revenue cancer, and it shows.
 
It is for users with hearing disabilities, something the article doesn't care to mention.
I get that, and no offence to anyone with hearing disabilities. But the haptics don't show what the song is, so it seems pretty pointless. The phone just vibrates wherever there's a bit of bass, like when vibration is synced to a ringtone. I don't see the point of it. They should focus more on sorting out the OS, as it is lagging massively behind.
 


iOS 18 introduced an accessibility feature called Music Haptics that has value for everyone. …
Apple continues to add features no one asks for or cares about…
 
Oh yeah, the "haptic engine". It hasn't been enabled on my iPhones since, well, forever.
I even forgot that it can actually vibrate when there's a call; I turned that off a long time ago too.
 
Dearest Apple engineers - isn't the point of a music app sound quality?! Give us a fully manual graphic equalizer *please* - 10 or 12 bands, and remember individual settings for each Bluetooth device a user may have (car, headphones, speaker, etc.).

Relying on some earbud manufacturer's crappy app for EQ, and having to fiddle with settings every time you connect to a different device, is not a high-end experience in 2025.

And if you really want to be nice to us, allow swiping left or right on album art to skip tracks.
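For what it's worth, third-party players can already build something like the per-device EQ memory described above: AVAudioUnitEQ supplies a multi-band equalizer, and AVAudioSession identifies the current output route, which could key saved presets. A rough sketch under those assumptions; the storage keys, band frequencies, and class name are invented for illustration.

import Foundation
import AVFoundation

// Sketch: a 10-band EQ whose gains are saved and restored per output device.
// The UserDefaults key scheme and band frequencies are invented for illustration.
final class PerDeviceEQ {
    let eq = AVAudioUnitEQ(numberOfBands: 10)
    private let frequencies: [Float] = [32, 64, 125, 250, 500, 1_000, 2_000, 4_000, 8_000, 16_000]

    init() {
        for (band, freq) in zip(eq.bands, frequencies) {
            band.filterType = .parametricEQ
            band.frequency = freq
            band.bandwidth = 1.0
            band.bypass = false
        }
    }

    // Identifier for whatever the audio is currently routed to (car, buds, speaker...).
    private var currentOutputID: String {
        AVAudioSession.sharedInstance().currentRoute.outputs.first?.uid ?? "builtin"
    }

    func saveCurrentGains() {
        UserDefaults.standard.set(eq.bands.map { $0.gain }, forKey: "eq.gains.\(currentOutputID)")
    }

    func restoreGainsForCurrentDevice() {
        guard let stored = UserDefaults.standard.array(forKey: "eq.gains.\(currentOutputID)") as? [NSNumber],
              stored.count == eq.bands.count else { return }
        for (band, gain) in zip(eq.bands, stored) {
            band.gain = gain.floatValue   // dB
        }
    }
}

The EQ node would still need to be attached to an AVAudioEngine graph to process any audio; that wiring is omitted here.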
 
I get that, and no offence to anyone with hearing disabilities. But the haptics don't show what the song is, so it seems pretty pointless. The phone just vibrates wherever there's a bit of bass, like when vibration is synced to a ringtone. I don't see the point of it. They should focus more on sorting out the OS, as it is lagging massively behind.
There's a wide range of disabilities, and frankly you probably have no idea what would or would not be helpful to anyone in particular.
 


iOS 18 introduced an accessibility feature called Music Haptics that has value for everyone. …
Saying this has value for everyone is a massive stretch and quite frankly untrue - it's for those who have impaired hearing.
 