> 8-tracks were a garbage format from the get-go, let alone by today's standards. I would not compare them to other contemporary analog tape formats, any of which would handily outperform it.

Yeah, I actually used Compact Cassettes with Dolby B and then Dolby C as the 1980s hit. And most still sounded worse than modern digital. I loved LPs, but as the oil crisis went on, they got thinner and thinner and the vinyl got worse, so you would get warps and crackles. I remember CDs being a major improvement just for the lack of noise, but I do recall some complaints that the early CDs were mastered like they still used the RIAA curve. We used to pay $25 for CDs when LPs were only $10, so someone must have thought they sounded better. LP prices now are breathtaking.
> So… my AirPods Pro 2 from last year won't get this lossless via software update, and I can't buy the case separately? Sigh… Apple… so not cool.

It's what they do.
> The average person's audio system 50 years ago was a half-decent stereo receiver and some speakers. The average person's audio system today is a Beats Pill or similar battery-powered Bluetooth speaker, if not just listening to the music straight out of the phone speakers. You really think this is an improvement for the average person's audio experience?

Speak for yourself. My parents and grandparents had two ****** stereo speakers hooked up to an old stereo. I have a Dolby Atmos 5.1.2 home theater system that sounds incredible 🤷🏻‍♂️
> Come on. My 2014 LG Bluetooth buds thingy did CD quality when connected to anything that had an aptX codec (like my 2011 MacBook).

Except that aptX was not lossless at that time. But of course you knew that.
Weak sauce, Apple. Weak sauce.
> I wonder if it's limited to the headset by choice or if it's because the signal needs to be 2 inches away to get that bandwidth.

But aren't all AirPods, including both APP 2 and APP 2 USB-C, still just Bluetooth at their core? Specifically, it's the H2 chip and Bluetooth 5.3 for APP 2.
> That's why it is so bizarre. Apple said they are using a new protocol here. So not Bluetooth.

Exactly. Really confusing, as it's just a side note for APP 2 USB-C, not a headline.
“The H2 chip in the latest AirPods Pro and Apple Vision Pro, combined with a groundbreaking wireless audio protocol, unlocks powerful 20-bit, 48 kHz Lossless Audio with a massive reduction in audio latency.”
That makes sense: Bluetooth lacks the bandwidth, so they had to limit it to this new hardware, as the new protocol likely needed new antennas. Where it falls down for me, though: why now? Why make the AirPods Pro even more confusing for a device that isn't out this year?
It would make more sense to launch this with the AirPods Pro 3rd generation next year, with the same chip in the iPhone 16.
How is the new pro iPhone not capable of doing this? What a mind-boggling restriction with the Vision Pro.
When Steve introduced the original iPod and said it holds 1,000 songs at 128kbps, he called it a "very high quality bitrate". lol
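For what it's worth, the arithmetic behind that 1,000-song claim is easy to sanity-check. A quick sketch, assuming the original 5 GB iPod and an average song length of about four minutes (both assumptions, not figures from this thread):

```python
# Sanity check: how many 128 kbps songs fit on the original iPod?
# Assumptions: ~4-minute average song, 5 GB treated as 5e9 bytes.
BITRATE_BPS = 128_000          # 128 kbps
SONG_SECONDS = 4 * 60          # assumed average song length
CAPACITY_BYTES = 5 * 10**9     # original iPod capacity

bytes_per_song = BITRATE_BPS * SONG_SECONDS / 8   # bits -> bytes
songs = CAPACITY_BYTES / bytes_per_song
print(f"{bytes_per_song / 1e6:.2f} MB per song, ~{songs:.0f} songs")
# -> 3.84 MB per song, ~1302 songs
```

So "1,000 songs" was, if anything, slightly conservative at that bitrate.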
> Well, he has been dead for a long time. We also used to think 720p HD looked incredible.

Lossless audio was a thing back then, even long before that.
> Strange that it was not mentioned in the keynote. It could have taken just a few minutes.

They did, for about two seconds. I thought they were talking about older ones.
This is horse ****. Audio quality has definitely gotten better over the last 50 years. Shoot, mastering and the production pipeline are crystal clear nowadays and have noticeably more punch than even 10 years ago. People can't tell the difference between high-bitrate lossy (256 kbps AAC or MP3) and lossless. You just plain can't, and it's been proven over and over. This is a complete non-issue.
> When Steve introduced the original iPod and said it holds 1,000 songs at 128kbps, he called it a "very high quality bitrate". lol

It was compared to file-sharing sites like Napster, which always had bit rates all over the place; people were copying files, converting them to a different format, re-uploading them, then converting and re-uploading them again to make it appear as if they were a higher bitrate. Compared to how things are now, it was a crappy bitrate.
> Lossless audio was a thing back then, even long before that.

I used to load AIFF files on my original iPod. I think I could fit around 30 songs on it, and the buffer was always so full that the hard drive would run constantly.
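AIFF filling an early iPod quickly checks out: uncompressed CD-quality PCM is large. A rough sketch of the math (standard CD parameters; the per-song figure assumes a four-minute track):

```python
# Size of uncompressed CD-quality audio (16-bit, 44.1 kHz, stereo),
# as stored in AIFF or WAV containers (ignoring small header overhead).
SAMPLE_RATE = 44_100   # samples per second
BIT_DEPTH = 16         # bits per sample
CHANNELS = 2           # stereo

bytes_per_second = SAMPLE_RATE * BIT_DEPTH * CHANNELS / 8   # 176,400 B/s
mb_per_minute = bytes_per_second * 60 / 1e6
print(f"~{mb_per_minute:.1f} MB per minute, ~{mb_per_minute * 4:.0f} MB per 4-min song")
# -> ~10.6 MB per minute, ~42 MB per 4-min song
```

At roughly 40 MB per song, a first-generation iPod fills up fast, and the tiny 32 MB RAM buffer holds well under a minute of audio, which is consistent with the hard drive spinning up constantly.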
> Won't be purchasing since I wirelessly charge my case anyway.

And I sure won't buy the same AirPods twice. I would have bought a USB-C case, however.
> This is horse ****. Audio quality has definitely gotten better over the last 50 years. Shoot, mastering and the production pipeline are crystal clear nowadays and have noticeably more punch than even 10 years ago. People can't tell the difference between high-bitrate lossy (256 kbps AAC or MP3) and lossless. You just plain can't, and it's been proven over and over. This is a complete non-issue.

I agree in principle, but thanks to modern mastering choices I think there is a strong argument that music sounds worse than it did in, say, the 80s. Nothing to do with codecs or formats, but maybe these mastering choices are why it is harder to tell 256 kbps AAC from 24/192 FLAC. Also, the overall quality of your earphones counts too (drivers, processing, etc.); just because your $50 earbuds do lossless does not mean they will sound better. The whole chain counts.
> 20-bit/48 kHz lossless is more than good enough; 24-bit is only really needed for mastering purposes, to prevent generational losses in processing stages. For playback, the dynamic range increase over 16-bit will be perceptually perfect. But it is a shame limiting it to the Vision Pro, even if likely understandable due to the bandwidth needed.

16/44 is more than good enough IMO, but I agree.
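The dynamic-range point above can be made concrete with the standard rule of thumb for linear PCM, roughly 6.02 dB per bit plus a small constant. A quick sketch:

```python
# Theoretical dynamic range (SNR) of ideal linear PCM quantization:
# approximately 6.02 * N + 1.76 dB for N-bit samples.
def dynamic_range_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (16, 20, 24):
    print(f"{bits}-bit: ~{dynamic_range_db(bits):.0f} dB")
# -> 16-bit: ~98 dB, 20-bit: ~122 dB, 24-bit: ~146 dB
```

Since ~98 dB already exceeds the useful range between a quiet room and the threshold of pain, the gains from 20- and 24-bit playback are inaudible in practice, which is the commenter's point about 24-bit mattering mainly during mastering.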
> This most likely has to do with the significant bandwidth required by lossless audio and the fact that Bluetooth can't do full bandwidth over distance. To me it makes sense that the Vision Pro can do it, with the AirPods being so close, but with the phone the AirPods are roughly 3 ft away from your pocket, or more if you leave your phone on a table or your outside deck while listening.

I'm surprised they haven't dual-stacked Bluetooth and their own new wireless audio protocol. Seems like the obvious thing to do, given their Ultra Wideband custom silicon. Maybe they're ramping up to do it? Or maybe Bluetooth 6 will have a second set of frequencies using a different technology.
I don't like this restriction, but it is definitely because of Bluetooth. Bluetooth 5.3 has a max bandwidth cap of 2 Mbps, and that drops significantly over distance. In a BT 5.1 test I saw, the bandwidth started dropping at around 4 ft and kept falling after that. There was a lot more detail to the test, and it was done with solid Bluetooth hardware.
When I'm on my laptop again tomorrow, I'll try to remember to find the link and share it here.