One thing I was never able to verify about Apple's use of Bluetooth AAC is whether they synchronise the bitrate of the playback with the bitrate of the Bluetooth transmission.
If I play a song from Apple Music on my iPhone 12 Pro wirelessly through my Apple AirPods Pro or AirPods Max, will it play at 256 kbps without any kind of re-encoding or transcoding taking place? And what if I move away from my phone and the Bluetooth connection gets worse: will it then suddenly start transcoding because the link can no longer sustain a stable 256 kbps stream?
Unless Apple is somehow capable of forcing the Bluetooth AAC transmission to skip transcoding entirely, i.e. preserving the Apple Music 256 kbps stream bit-perfect between the iPhone/iPad/Mac and the headphones, having access to Apple Lossless will be a benefit regardless of any changes to the Bluetooth transmission.
I would think Apple has to do Bluetooth AAC transcoding for Bluetooth AAC transmission to work at all. Meaning they take a really great 256 kbps AAC stream and transcode it between the source and the destination. Transcoding from a good 256 kbps AAC source over Bluetooth AAC can most likely keep something near bit-perfect going most of the time, but I doubt it manages to do so all of the time.
Having 16-bit, 44.1 kHz Apple Lossless created from the same great Apple Digital Masters means there is no transcoding or re-encoding of an already lossy source taking place. So instead of the worst-case scenario where Bluetooth AAC takes its already lossy 256 kbps AAC source and starts messing with it, you now have a lossless source as the only thing it messes with. Still not perfect or ideal, but much better. And considering how pretty much every double-blind test shows that even the most extreme audiophiles have a really hard time telling a ~200 kbps VBR AAC file apart from a fully lossless file from the same source, the Bluetooth AAC transcoding would have to do a really bad job to mess things up in a way that audibly impacts quality.
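To make that generational-loss argument concrete, here's a toy sketch in Python. Real AAC is vastly more sophisticated than uniform quantisation, and every number below is made up for illustration, but modelling each lossy stage as a quantiser with its own step size shows how two cascaded lossy encodes accumulate more error than one, which is exactly the extra generation a lossless source avoids.

```python
import numpy as np

# Toy model of generational loss. Each "codec" below is just a uniform
# quantiser with its own step size (real AAC is far more complex), but it
# shows the principle: two cascaded lossy stages accumulate more error
# than one. All step sizes are invented for illustration.

fs = 44_100
t = np.arange(fs) / fs
signal = 0.5 * np.sin(2 * np.pi * 440 * t)  # one second of a 440 Hz tone

def lossy_encode(x, step):
    """Simulate one lossy coding generation as uniform quantisation."""
    return np.round(x / step) * step

def snr_db(reference, degraded):
    noise = reference - degraded
    return 10 * np.log10(np.sum(reference**2) / np.sum(noise**2))

gen1 = lossy_encode(signal, step=1.0e-3)    # "the 256 kbps store encode"
gen2 = lossy_encode(gen1, step=1.7e-3)      # "the Bluetooth AAC re-encode"
direct = lossy_encode(signal, step=1.7e-3)  # lossless source, one encode only

print(f"one lossy generation:      {snr_db(signal, gen1):.1f} dB SNR")
print(f"two cascaded generations:  {snr_db(signal, gen2):.1f} dB SNR")
print(f"lossless source over BT:   {snr_db(signal, direct):.1f} dB SNR")
```

The cascaded version always measures worse than the single encode from the lossless source, even though both went through the same final "Bluetooth" stage.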
This will only make a difference if Apple doesn't already have something in place that makes sure Apple Music playback with its 256 kbps AAC source goes bit-perfect from the device to the headphones over Bluetooth AAC. If they do have something like that in place, then using Apple Lossless over Bluetooth AAC will just waste bandwidth for no reason.
They don't really mention how Spatial Audio / Dolby Atmos will work either. Bluetooth AAC is limited in its capabilities and bandwidth, so how will these tracks work? Dolby Atmos is object-based audio, which Apple renders binaurally for headphones, so it isn't tied to any specific number of audio channels. But one has to think that a song that would normally be a 256 kbps stereo (2.0) AAC stream, once it becomes available in a Spatial Audio / Dolby Atmos version, would no longer fit within the same 256 kbps, as it obviously has to carry additional spatial data. How will that work with Bluetooth AAC? Will Dolby Atmos playback happen at a lower bitrate? I don't think it will matter all that much; even down at 128 kbps, AAC still feels pretty much transparent compared to 16-bit, 44.1 kHz lossless for the majority of users. I would just love to have some technical details on what is going on here.
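For a rough sense of scale, here's some back-of-the-envelope arithmetic. The even per-channel split is my own simplification, not how AAC actually allocates bits:

```python
# If 256 kbps covers 2 stereo channels, what would a channel-based mix
# need at the same per-channel rate? (My simplification: real encoders
# share bits across channels far more cleverly than this.)

stereo_kbps = 256
per_channel_kbps = stereo_kbps / 2  # 128 kbps per channel

for name, channels in [("stereo (2.0)", 2), ("5.1", 6), ("7.1.4", 12)]:
    print(f"{name}: ~{channels * per_channel_kbps:.0f} kbps")

# stereo (2.0): ~256 kbps, 5.1: ~768 kbps, 7.1.4: ~1536 kbps.
# Since Atmos gets rendered binaurally, it collapses back to 2 channels
# before the Bluetooth hop; the open question is what bitrate that
# 2-channel render is encoded at.
```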
It would make it much easier to decide which settings are best suited for each of my devices. I'm not connecting wired headphones to my iPhone or iPad anymore, so I would really love to know the technical details; there is no point in enabling settings that just increase bandwidth without any possibility of increasing quality. On my Mac mini I will obviously enable Hi-Res Apple Lossless, as I have my Hegel HD12 DAC connected to my Schiit Magnius amplifier driving my Sennheiser HD 800S headphones, with fully balanced cables through the entire stack. I don't expect to be able to tell any difference between 256 kbps AAC and 24-bit, 192 kHz Apple Lossless, but there is no reason for me not to opt for it when it doesn't cost me anything extra, and my equipment, in theory, might sound better with these options enabled.
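As an aside on the numbers: the often-quoted 1411 kbps figure is the uncompressed rate of 16-bit, 44.1 kHz stereo PCM, while 24-bit, 192 kHz is several times higher. The arithmetic is simple:

```python
# Raw PCM bitrate = bits per sample x sample rate x channels.
def pcm_kbps(bits, rate_hz, channels=2):
    return bits * rate_hz * channels / 1000

print(pcm_kbps(16, 44_100))   # 1411.2 kbps: CD quality / regular lossless
print(pcm_kbps(24, 192_000))  # 9216.0 kbps: Hi-Res Lossless, uncompressed
# ALAC's lossless compression typically shaves these figures down to very
# roughly half, but Hi-Res still carries over an order of magnitude more
# data than a 256 kbps AAC stream.
```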
On my iPhone and iPad, it will only make sense if the technical details make sense for it to be enabled. If Apple is already ensuring bit-perfect 256 kbps AAC transmission over Bluetooth AAC, and nothing on the Bluetooth side changes to let it benefit from either Apple Lossless or Hi-Res, then I would simply be wasting bandwidth for no reason by enabling it on my iPhone and iPad.
All the information seems to point to Apple using the very same Apple Digital Masters, which are supposed to be 24-bit at up to 192 kHz, to create all these new versions. So everything from the Hi-Res 24-bit, 192 kHz Apple Lossless to the regular 16-bit, 44.1 kHz Apple Lossless and the already existing 256 kbps and 128 kbps lossy AAC versions is created from the same source masters. I highly doubt any of us will be able to tell these versions apart in a double-blind test. It's not like human ears need any resolution beyond 16-bit, 44.1 kHz, or like we need the added dynamic range for playback or anything.
https://www.mojo-audio.com/blog/the-24bit-delusion/ is a good read on the topic.
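To put numbers on the dynamic-range point: the theoretical dynamic range of linear PCM is about 6.02 dB per bit, plus 1.76 dB for a full-scale sine. That's standard quantisation theory, nothing Apple-specific:

```python
# Theoretical dynamic range of linear PCM: ~6.02 dB per bit,
# plus 1.76 dB for a full-scale sine wave.
for bits in (16, 24):
    print(f"{bits}-bit: ~{6.02 * bits + 1.76:.0f} dB of dynamic range")
# 16-bit: ~98 dB, which already spans from a quiet room to painfully loud;
# 24-bit: ~146 dB, well past the threshold of pain.
```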
Obviously, Apple needs different masters for the Spatial Audio / Dolby Atmos versions, so they will most likely sound different compared to their regular versions. And since a binaural render is so different from regular stereo or surround playback, these will sound and feel very different regardless of their masters. It will be very interesting to see what binaural audio has to offer for music playback.

Having music in surround was all the hype back in the day, but it died rather quickly and never really improved the music listening experience. Heck, a lot of music is more or less mono, not stereo at all. Listen to the most popular songs released over the last couple of years and barely any of them use any real stereo separation in their mixes. You could simply take everything played on the left channel, duplicate it to the right channel, and get 99% the same experience. That was the whole problem with music in surround: it never made much sense splitting mixes into 5-9 channels, or at least no artist came up with a clever way of recording music that really put the additional channels to good use. Now we'll see whether artists can use binaural for anything useful and interesting.
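If you want to sanity-check that "effectively mono" claim yourself, a quick mid/side energy comparison does it. The file name below is a placeholder, and soundfile is just one library that can read audio into NumPy arrays:

```python
import numpy as np
import soundfile as sf  # reads WAV/FLAC files into NumPy arrays

# How "mono" is a stereo mix? Compare the energy of the side (L - R)
# signal against the mid (L + R) signal.
audio, rate = sf.read("some_track.wav")  # shape: (frames, 2) for stereo
left, right = audio[:, 0], audio[:, 1]

mid = (left + right) / 2
side = (left - right) / 2

side_share = np.sum(side**2) / (np.sum(mid**2) + np.sum(side**2))
print(f"side-channel share of total energy: {side_share:.1%}")
# Close to 0% means effectively mono; a genuinely wide stereo mix
# scores far higher.
```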