8-tracks were a garbage format from the get-go, let alone by today’s standards. I would not compare them to other contemporary analog tape formats, any of which would handily outperform them.
Yeah, I actually used Compact Cassettes with Dolby B and then Dolby C as the 1980s hit. And most still sounded worse than modern digital. I loved LPs, but as the oil crisis went on, they got thinner and thinner and the vinyl got worse, so you would get warps and crackles. I remember CDs being a major improvement just for the lack of noise, but I do recall some complaints that the early CDs were mastered like they still used the RIAA curve. We used to pay $25 for CDs when LPs were only $10, so someone must have thought they sounded better. LP prices now are breathtaking.
 
This keynote was really terrible. It seems like they’ve virtually forgotten to mention the most important stuff. Lmao.
 
The average person’s audio system 50 years ago was a half decent stereo receiver and some speakers. The average person’s audio system today is a Beats Pill or similar battery powered bluetooth speaker, if not just listening to the music straight out of the phone speakers. You really think this is an improvement for the average person’s audio experience?
Speak for yourself. My parents and grandparents had 2 ****** stereo speakers hooked up to an old stereo. I have a Dolby Atmos 5.1.2 home theater system that sounds incredible🤷🏻‍♂️
 
Come on. My 2014 LG Bluetooth buds thingy did CD quality when connected to anything that had an aptX codec (like my 2011 MacBook).

Weak sauce, Apple. Weak sauce.
Except that AptX was not lossless at that time. But of course you knew that.

True lossless over BT barely exists at this point, but we may get it, eventually.
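
If anyone wants the back-of-envelope math on why classic aptX couldn’t be lossless (rough numbers only; the ~4:1 figure is the commonly cited compression ratio for plain aptX, nothing Apple-specific):

```python
# Rough numbers: CD-quality PCM vs. what classic aptX actually transmits.
SAMPLE_RATE = 44_100   # Hz, CD standard
BIT_DEPTH = 16         # bits per sample
CHANNELS = 2           # stereo

cd_pcm_kbps = SAMPLE_RATE * BIT_DEPTH * CHANNELS / 1000   # ~1411 kbps uncompressed
aptx_kbps = cd_pcm_kbps / 4                               # classic aptX: ~4:1 fixed compression, ~350 kbps

print(f"Uncompressed CD PCM: {cd_pcm_kbps:.0f} kbps")
print(f"Classic aptX stream: {aptx_kbps:.0f} kbps")       # far below PCM, so necessarily lossy
```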

 
I wonder if it’s limited to the headset by choice, or if it’s because the signal needs to be 2 inches away to get that bandwidth.
But aren’t all AirPods, including both APP 2 and APP 2 USB-C, still just Bluetooth at their core? Specifically, it’s the H2 chip and Bluetooth 5.3 for APP 2.

But it’s all wireless via Bluetooth, and, surely, Apple Vision Pro cannot alter the hardware of APP 2 USB-C, but will just transmit audio to them like any other Apple device?

Or do APP 2 USB-C and Apple Vision Pro have some undisclosed wireless radio that we don’t know of?

If APP 2 Lightning and APP 2 USB-C have the exact same hardware internals, apart from their respective charging ports, then the Lightning version is only limited artificially via software, not anything else.

As it stands, it doesn’t add up unless it’s a sales strategy and not an actual hardware limitation for APP 2 Lightning.
 
That’s why it is so bizarre. Apple said they are using a new protocol here. So not Bluetooth.

“The H2 chip in the latest AirPods Pro and Apple Vision Pro, combined with a groundbreaking wireless audio protocol, unlocks powerful 20-bit, 48 kHz Lossless Audio with a massive reduction in audio latency.”

That makes sense, as Bluetooth lacks the bandwidth, so they had to limit it to this new hardware; the new protocol likely needed new antennas. Where it falls down for me, though: why now? Why make the AirPods Pro even more confusing for a device that isn’t out this year?

It would make more sense to launch this with the 3rd-generation AirPods Pro next year, with the same chip on the iPhone 16.
Exactly. Really confusing as it’s just a side note for APP 2 USB-C, not a headline.
 
This is horse ****. Audio quality has definitely gotten better over the last 50 years. Shoot, mastering and the production pipeline are crystal clear nowadays and have noticeably more punch than even 10 years ago. People can’t tell the difference between high-bitrate lossy (256 kbps AAC or MP3) and lossless. You just plain can’t and it’s been proven over and over. This is a complete non-issue.

When Steve introduced the original iPod and said it holds 1,000 songs at 128kbps, he called it a "very high quality bitrate". lol
It was compared to file sharing sites like Napster, which always had bitrates all over the place, and people were copying files, converting them to a different format, and re-uploading them over and over to make it appear as if they were a higher bitrate. Compared to how things are now, it was a crappy bitrate.
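
For fun, the arithmetic behind the "1,000 songs" claim works out roughly like this (the 4-minute average track length is just an assumption, but it’s about what the original 5 GB iPod marketing implied):

```python
# Back-of-envelope: 1,000 songs at 128 kbps on the original 5 GB iPod.
BITRATE_KBPS = 128
AVG_SONG_MINUTES = 4      # assumed average track length
SONGS = 1_000

bytes_per_song = BITRATE_KBPS * 1000 / 8 * AVG_SONG_MINUTES * 60   # ~3.84 MB per song
total_gb = bytes_per_song * SONGS / 1e9

print(f"Per song: {bytes_per_song / 1e6:.2f} MB")
print(f"1,000 songs: {total_gb:.2f} GB")   # ~3.8 GB, which fit on the 5 GB original iPod
```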
 
 
Lol I feel even worse than you guys who got the AirPods Pro 2 when they came out last year, considering I got mine like... a month ago :D

If I'm not too worried about USB-C (I already have to carry the Lightning cable for my phone...), am I missing any other features? I'm getting mixed messages about the lossless; is it only the USB-C version that supports it, or will the Lightning one too?

Thirdly, the new lossless would only be for the Vision Pro?
 
This is horse ****. Audio quality has definitely gotten better over the last 50 years. Shoot, mastering and the production pipeline are crystal clear nowadays and have noticeably more punch than even 10 years ago. People can’t tell the difference between high-bitrate lossy (256 kbps AAC or MP3) and lossless. You just plain can’t and it’s been proven over and over. This is a complete non-issue.
I agree in principle, but thanks to modern mastering choices I think there is a strong argument that music sounds worse than it did, say, in the '80s. Nothing to do with codecs or formats, but maybe these mastering choices are why it is harder to tell 256 kbps AAC from 24/192 FLAC. Also, the overall quality of your earphones counts too (driver, processing, etc.); just because your $50 earbuds do lossless does not mean they will sound better. The whole chain counts.

I'd take a well-mastered version at 256 AAC over brickwalled 24/192 all day long. Everything else being equal though, I also think that CD quality is more than good enough.

Straying into another conversation though. As for the OP, I am interested in exactly how Apple will implement lossless on the Bluetooth-equipped APP, even if I do not really care!
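
To put a number on what brickwalling does, here's a throwaway sketch (a made-up sine test signal, nothing to do with any real master): hard limiting pushes the average level up toward the peak, so the crest factor, the peak-to-RMS headroom, collapses.

```python
import numpy as np

# Illustrative only: crest factor of a clean tone vs. a hard-limited ("brickwalled") copy.
t = np.linspace(0, 1, 48_000, endpoint=False)
clean = 0.5 * np.sin(2 * np.pi * 440 * t)        # plenty of headroom
brickwalled = np.clip(clean * 4, -1.0, 1.0)      # slammed into the ceiling

def crest_factor_db(x):
    return 20 * np.log10(np.max(np.abs(x)) / np.sqrt(np.mean(x ** 2)))

print(f"Clean tone:  {crest_factor_db(clean):.1f} dB crest factor")   # ~3 dB for a sine
print(f"Brickwalled: {crest_factor_db(brickwalled):.1f} dB")          # lower = squashed dynamics
```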
 
20-bit/48 kHz lossless is more than good enough; 24-bit is really only needed for mastering purposes, to prevent generational losses in processing stages. For playback, the level of dynamic range increase over 16-bit will be perceptually perfect. But it is a shame if it's limited to Vision Pro, even if that's likely understandable due to the bandwidth needed.
16/44 is more than good enough IMO but I agree.
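
The rough math behind those bit depths, using the usual ~6 dB-per-bit rule of thumb (this ignores dither and noise shaping, which buy a bit more in practice):

```python
import math

# Theoretical dynamic range of linear PCM: about 6.02 dB per bit (20 * log10(2) per bit).
for bits in (16, 20, 24):
    dynamic_range_db = 20 * math.log10(2 ** bits)
    print(f"{bits}-bit PCM: ~{dynamic_range_db:.0f} dB dynamic range")

# 16-bit ~96 dB, 20-bit ~120 dB, 24-bit ~144 dB.
# The extra 24-bit headroom mostly matters while mastering/processing, not for playback.
```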
 
We are most likely jumping to conclusions on this topic. I highly doubt the actual hardware of the earbuds is any different between the AirPods Pro 2 (Lightning) and AirPods Pro 2 (USB-C). This sounds like it's exclusive to the combination of Vision Pro + AirPods Pro 2, most likely as a result of range restrictions when operating in "lossless" mode, just like how Sony's LDAC has terrible range when operating in "lossless".

I'd bet this will be available with the original AirPods Pro 2 as well via a firmware update. Doesn't make much sense for Apple to have two different revisions of the earbuds with different capabilities.
 
Hang on, I’m confused.
So there are new AirPods Pro that have been launched with other differences, not just the USB-C case? Reading earlier articles, I thought the existing AirPods Pro were now shipping with just a new case that has USB-C, but everything else was the same?
 
Not sure why people are bringing up 24-bit at 192 kHz. Pretty much all digital music has been using 16-bit at 44.1 kHz. That already gives roughly 96 dB of dynamic range, far more than any real-world listening situation needs. None of us above 20 years old have anything close to perfect hearing to begin with, and even if you somehow did, 44.1 kHz still covers frequencies up to about 22 kHz, beyond what even a perfect ear can hear.

What possible reason would Apple have to waste range and bandwidth by going 24-bit at 192 kHz? This is like asking for Thunderbolt 5 when you are only connecting a 5400 RPM external hard drive. Utterly useless. It's even more useless when talking about a codec for wireless earbuds, as the pointlessly high bandwidth needed for 24-bit at 192 kHz isn't just wasteful in itself; it directly affects range and power usage.
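
For anyone who wants the raw numbers behind the waste argument (plain stereo PCM, no container or codec overhead):

```python
# Raw stereo PCM data rates: 16-bit/44.1 kHz vs. 24-bit/192 kHz.
def pcm_mbps(sample_rate_hz, bit_depth, channels=2):
    return sample_rate_hz * bit_depth * channels / 1e6

cd_rate = pcm_mbps(44_100, 16)       # ~1.41 Mbps
hires_rate = pcm_mbps(192_000, 24)   # ~9.22 Mbps

print(f"16/44.1 stereo: {cd_rate:.2f} Mbps")
print(f"24/192 stereo:  {hires_rate:.2f} Mbps ({hires_rate / cd_rate:.1f}x the bandwidth)")
```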
 
This most likely has to do with the significant bandwidth that is required by lossless audio and the fact that Bluetooth can't do full bandwidth over distance. To me it makes sense that the Vision Pro can do it, with the AirPods being so close, but with the phone, the AirPods are roughly 3 ft from your pocket, or more if you leave your phone on a table, your outside deck, etc. while listening.

I don't like this restriction, but the restriction is definitely because of Bluetooth. Bluetooth 5.3 has a max bandwidth cap of 2 Mbps, and that drops significantly over distance. From a BT 5.1 test I saw, the bandwidth dropped around 4 ft, and after that it kept dropping. There was a lot more detail to this test, and it was done with solid Bluetooth hardware.
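
To put Apple's quoted 20-bit/48 kHz next to that 2 Mbps cap (raw stereo PCM before any lossless packing, which would bring it down somewhat):

```python
# Raw stereo PCM at Apple's quoted 20-bit/48 kHz vs. the ~2 Mbps Bluetooth LE ceiling.
SAMPLE_RATE = 48_000
BIT_DEPTH = 20
CHANNELS = 2
BT53_MAX_MBPS = 2.0    # theoretical LE 2M PHY rate, before overhead and before range losses

pcm_mbps = SAMPLE_RATE * BIT_DEPTH * CHANNELS / 1e6    # 1.92 Mbps
print(f"20/48 stereo PCM: {pcm_mbps:.2f} Mbps vs. {BT53_MAX_MBPS:.1f} Mbps BT cap")

# It barely fits even in theory, and real throughput drops fast with distance,
# which lines up with only enabling it when the AirPods are inches from the Vision Pro.
```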

When I'm on my laptop again tomorrow, I'll try to remember to find the link and share it here.
I'm surprised they haven't dual-stacked Bluetooth & their own new wireless audio protocol. Seems like the obvious thing to do, given their Ultra Wideband custom silicon. Maybe they're ramping up to do it? Or maybe Bluetooth 6 will have a 2nd set of frequencies using a different technology.
 