So weird how people can't seem to tell the difference between AAC and lossless, when I can hear the difference immediately on just my mid-level desktop speakers. On my full-sized audio system the difference is glaring. And while the difference between lossless and hi-res lossless is smaller, it's still there.
Anyone with a decent sound system can tell the difference if they know what to listen for. AAC is just MP3 done correctly, but no lossy codec should be the standard when people can hook up their Apple TV, or stream from an iPhone, to a home audio system. And while 48/24 lossless sounds great, the difference between that and hi-res is in the detail.
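To put rough numbers on it, here's a back-of-the-envelope comparison of the raw data rates for stereo, uncompressed PCM, with Apple Music's 256 kbps AAC thrown in for scale (the figures are just arithmetic, not measurements of any particular stream):

    # Back-of-the-envelope data rates for stereo, uncompressed PCM.
    formats = {
        "CD (44.1 kHz / 16-bit)":     (44_100, 16),
        "Lossless (48 kHz / 24-bit)": (48_000, 24),
        "Hi-res (192 kHz / 24-bit)":  (192_000, 24),
    }
    for name, (rate, bits) in formats.items():
        kbps = rate * bits * 2 / 1000   # 2 channels
        print(f"{name}: {kbps:,.0f} kbps")
    print("AAC (Apple Music): 256 kbps")  # lossy, for comparison

So 192/24 carries roughly four times the raw data of 48/24, and about 36 times what the AAC stream does. Whether you can hear all of it is the argument, but the information is undeniably there.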
For example, if you're listening to the sound of a violin, the difference is in hearing the notes played, as opposed to hearing the sound of the bow on the strings. Or not just hearing whether it's a Strat or Telecaster, but what gauge strings are being played. Some of us want to hear that realism. Why should we be denied just because "most people" can't tell the difference? Especially when the hi-resolution version is just sitting there on Apple's servers.
Let me put it another way: why even bother with 4K televisions, when most people can't tell the difference between that and 1080p? Why bother with high-resolution photographs, or even the Retina screen on your iPhone, when most people can't tell the difference? Why do people want 120 Hz refresh rates on their screens when most people can't tell the difference?
Lastly, not being able to stream high-res audio from Apple Music is bad enough, but not being able to stream high-resolution audio files from my Mac is just stupid. Why does my Apple TV take the extra step of down-sampling those files instead of just passing the audio through at full resolution, like my nine-year-old Blu-ray player does? It's just lame.
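And the down-sampling itself is nothing exotic. Here's a rough sketch of what a 96 kHz to 48 kHz conversion looks like in Python, using scipy purely as an illustration (obviously not what the Apple TV actually runs, and the buffer here is just synthetic noise standing in for a hi-res track):

    import numpy as np
    from scipy.signal import resample_poly

    # Hypothetical one-second, 96 kHz stereo buffer standing in for a hi-res track.
    sr_in, sr_out = 96_000, 48_000
    hi_res = np.random.default_rng(0).standard_normal((sr_in, 2))

    # Polyphase resampling to half the rate -- the kind of conversion a device
    # applies when it won't pass the stream through at full resolution.
    lo_res = resample_poly(hi_res, up=1, down=2, axis=0)
    print(hi_res.shape, "->", lo_res.shape)   # (96000, 2) -> (48000, 2)

If a cheap Blu-ray player can skip that step and hand the receiver the original samples, an Apple TV certainly could too.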