If you're listening to lossless on AirPods or AirPods Pro, forget it: there's just no way you're going to hear any difference. AirPods Max do support Hi-Resolution Lossless – but only when used as wired headphones.
It all depends on the transmission. I don't think Apple can ensure bit-perfect transmission over Bluetooth AAC. So in most scenarios, you have a lossy source (256 kbps AAC) being transcoded over a lossy transmission (Bluetooth AAC) before the digital-to-analogue conversion happens on your headphones.
I'm sure Apple has very good control over the Bluetooth AAC transmission on their AirPods line-up (AirPods, AirPods Pro and AirPods Max), so I'd bet you'd need a pretty poor connection before you start running into trouble. But transcoding a lossy source always risks degrading its quality further.
Using Apple Lossless as your source ensures that the Bluetooth AAC encoder is working from a perfect copy of the source material before it starts transcoding it for transmission, so you are limiting the chance of having your quality degraded on the way from your phone to the headphones.
This is only true if Apple doesn't already somehow ensure that Apple Music playback is bit-perfect between the phone and the headphones. If they do, then playing back lossless won't give you any benefit at all – at least not until Apple does something with the Bluetooth transmission so that Apple Lossless actually delivers higher quality than 256 kbps AAC can provide.
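To make the "one generation of lossy encoding vs. two" point concrete, here's a rough offline sketch of the two chains. This is only an approximation done with ffmpeg on a computer, not what iOS actually does over Bluetooth; the file names and the 256 kbps figure for the Bluetooth-style re-encode are placeholders I picked, since Apple doesn't publish the real settings.

import subprocess

def encode_aac(src: str, dst: str, bitrate: str = "256k") -> None:
    # Encode src to AAC at the given bitrate using ffmpeg's built-in encoder.
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:a", "aac", "-b:a", bitrate, dst],
        check=True,
    )

# Chain A: lossless master -> one generation of AAC (roughly what feeding the
# Bluetooth encoder a lossless source corresponds to).
encode_aac("master.flac", "single_gen.m4a")

# Chain B: lossless master -> 256 kbps AAC (the Apple Music file) -> AAC again
# (standing in for the Bluetooth re-encode). Two lossy generations.
encode_aac("master.flac", "apple_music.m4a")
encode_aac("apple_music.m4a", "double_gen.m4a")

# Decode both back to PCM so you can ABX them against each other or the master.
for name in ("single_gen", "double_gen"):
    subprocess.run(["ffmpeg", "-y", "-i", f"{name}.m4a", f"{name}.wav"], check=True)

In my experience the difference between one and two generations at these bitrates is tiny, which is exactly the point: the extra generation is a risk, not a guaranteed audible problem.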
None of this really matters for 99.99% of people. Double-blind tests have shown time and time again that even on high-tier equipment, even those who call themselves audiophiles are unable to tell lossless and ~200 kbps VBR lossy apart when both are derived from the same high-quality masters.
Apple is supposedly using the very same high-quality Apple Digital Masters (24-bit, 192 kHz) for all versions. So whether you play back the Apple Lossless Hi-Res 24-bit, 192 kHz version, the Apple Lossless 16-bit, 44.1 kHz version, or the 16-bit, 44.1 kHz 256 kbps or 128 kbps AAC versions, they were all created from the same high-quality masters.
All evidence points to barely anyone having the hearing to tell any of this apart. Pretty much every double-blind test done on top-tier equipment has concluded that things become transparent (you can't tell them apart anymore) at around ~200 kbps VBR lossy with a good encoder. You'd need some truly special hearing and some expensive equipment to tell Apple's 256 kbps AAC encodings apart from the top-tier Apple Lossless Hi-Res versions. It's mostly placebo.
The biggest benefit of having access to lossless is ensuring that a bit-perfect version travels through your pipeline for as long as possible. Since Bluetooth AAC adds one additional layer of lossy transcoding, feeding it a lossless source instead of a lossy one lowers the risk of anything bad happening along the way. Unless Apple already has something in place in the Bluetooth stack to ensure that Apple Music's 256 kbps AAC is transferred bit-perfect over Bluetooth AAC – but I'm not aware of any such capability, and Apple doesn't really provide any technical details on how this works.
https://www.mojo-audio.com/blog/the-24bit-delusion/ is a good read for anyone trying to convince themselves that 24-bit and 192 kHz matter for audio playback. It's not like we are lacking dynamic range or suffering audible distortion at 16-bit, 44.1 kHz, so what do people expect the additional bit depth and sample rate to offer?
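Just to put numbers on that, here's the back-of-the-envelope arithmetic (pure math, nothing Apple-specific):

import math

def dynamic_range_db(bits: int) -> float:
    # Quantization dynamic range of linear PCM: roughly 6.02 dB per bit.
    return 20 * math.log10(2 ** bits)

print(f"16-bit: ~{dynamic_range_db(16):.0f} dB dynamic range")   # ~96 dB
print(f"24-bit: ~{dynamic_range_db(24):.0f} dB dynamic range")   # ~144 dB

# Nyquist: a sample rate can represent frequencies up to half its value.
print(f"44.1 kHz covers up to {44_100 / 2 / 1000:.2f} kHz")   # 22.05 kHz, above the ~20 kHz hearing limit
print(f"192 kHz covers up to {192_000 / 2 / 1000:.0f} kHz")   # 96 kHz, far beyond anything audible

~96 dB already spans the range from a quiet room to painfully loud, and 22.05 kHz sits above the upper limit of adult hearing, so the extra headroom of 24-bit/192 kHz is useful in the studio for editing and processing, not for playback.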
Apple is trying to ensure great masters through their Apple Digital Masters program:
"Apple Digital Masters – Studio-quality sound. For everyone." (www.apple.com)
And when tracks are encoded from high-quality 24-bit, 192 kHz sources, barely anyone will be able to tell the difference between 24-bit, 192 kHz lossless, 16-bit, 44.1 kHz lossless and 16-bit, 44.1 kHz lossy, no matter how much money you put into the equipment. That's assuming the transmission is bit-perfect. For Bluetooth playback, having lossless might prove beneficial, since I doubt your lossy playback is being transferred bit-perfect to the headphones, so using a lossless source instead of a lossy one lowers the risk of something bad happening during transmission. Whether it's 24-bit or 16-bit, 192 kHz or 44.1 kHz shouldn't make any difference. In fact, when it comes to Bluetooth you are wasting precious bandwidth by tossing in unneeded bit depth and sample rate.
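For a sense of how much bandwidth we're talking about, the raw (uncompressed) PCM bitrates look like this:

def pcm_kbps(bits: int, sample_rate_hz: int, channels: int = 2) -> float:
    # Raw PCM bitrate: bits per sample * samples per second * channels.
    return bits * sample_rate_hz * channels / 1000

print(f"16-bit / 44.1 kHz stereo: ~{pcm_kbps(16, 44_100):,.0f} kbps")   # ~1,411 kbps
print(f"24-bit / 192 kHz stereo:  ~{pcm_kbps(24, 192_000):,.0f} kbps")  # ~9,216 kbps

Bluetooth AAC tops out at a few hundred kbps, so all of that extra hi-res data has to be thrown away in the re-encode anyway – it buys you nothing over the air.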