I have a special audio cable from AudioQuest. Maybe it's worth buying a USB-C cable with the same build.
Um. So expensive! And it looks like these are designed more to eliminate signal noise than "microphonics" noise...
 
This should have been there from day one of the product. It took some time, but it's good to see it now, especially on such an expensive headphone.
 
But you lose Atmos with the cable; a TRS connector can only carry stereo.

There's no way to get lossless, head-tracked spatial audio with no latency into the Lightning AirPods Max without using a different rendering solution on the computer. This removes that barrier at least, and allows head tracking and lossless audio from all modern devices.



People are really confused by this 3.5mm thing. The announced USB-C-to-3.5mm cable is incidental, and useful if you have an analog stereo source you want to use them with.

You need a C-to-C cable to get lossless and Personalized Spatial Audio / Atmos with head tracking.

The 3.5mm cable will enable low-latency, near-lossless (but not completely lossless) playback of stereo sources only.

The USB-C-to-USB-C cable will enable low-latency, fully lossless playback of both stereo and Spatial Audio/Atmos, and supports head tracking and pro-audio mixing via Personalized Spatial Audio with Apple's renderer.


The “Personalized Spatial Audio” is doing a lot of work in the marketing, but it is important because it applies an HRTF based on your profile, which makes a difference. There were previously other, mostly paid, solutions that would render on the computer and downmix to stereo, but this is a much more straightforward option for people who care about it.
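If you're curious what "applies an HRTF" means mechanically, it's a per-ear convolution: the renderer filters the source through a left-ear and a right-ear impulse response measured (or personalized) for your head. A minimal sketch in Python, with toy impulse responses standing in for real measured HRIRs (everything here is illustrative, not Apple's renderer):

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Convolve a mono source with per-ear head-related impulse
    responses (HRIRs) to produce a binaural stereo signal."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=-1)

# Toy example: a 1 kHz tone rendered through made-up 4-tap HRIRs.
# Real HRIRs are hundreds of taps and differ per direction and per person.
fs = 48_000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)
hrir_l = np.array([1.0, 0.3, 0.1, 0.05])
hrir_r = np.array([0.6, 0.4, 0.2, 0.10])
stereo = render_binaural(tone, hrir_l, hrir_r)
print(stereo.shape)  # (48003, 2)
```

Head tracking just means those impulse responses get swapped and interpolated as your head moves, which is why low latency matters.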

It also means the cable in the box isn't just a charging cable but a data cable. I guess if we had paid attention to that at the time, this could've been predicted.
Thanks for this. Someone else mentioned that the C-to-3.5mm cable won't carry the headphones' microphone audio back to the connected device?
 
Thanks for this. Someone else mentioned that the C-to-3.5mm cable won't carry the headphones' microphone audio back to the connected device?

Right, the USB-C/3.5mm cable can only be used for two channels of audio (stereo audio). If you plug the USB-C end into your AirPods Max headphones, it will be "audio input" to the headphones. If you plug it into anything else, it will be "audio output" from that device.

Using a "C-to-C" cable *might* allow the AirPods Max to be used as a wired headset (stereo audio *plus* microphone) — a USB connection could certainly handle that. But, Apple hasn't actually stated that the capability is there, so it is just speculation until someone tries it, and it cannot be tried until Apple pushes the AirPods Max firmware update out.
 
You can look at the end of the cable to tell: the 3.5mm connector only has two bands (a TRS plug), which means two audio channels (left ear and right ear). If it supported headset/mic use, it would have three bands (TRRS).

So, no. Audio output only. No microphone signal going back.
Yeah, yesterday I went looking for the adaptor on Apple.com and checked.

Super lame. Maybe it works via USB-C directly, but I doubt it; the firmware probably just doesn't allow for it.
 
Super lame. Maybe it works via USB-C directly, but I doubt it; the firmware probably just doesn't allow for it.
They are supporting other features like spatial audio / head tracking over USB-C. It seems like they are trying to offer the full AirPods Max experience over USB-C. I wouldn't be surprised if the mic is supported. (I also wouldn't be surprised if it is not.)

I am sitting here ready to test it out, as soon as the firmware update drops ...
 
True - but could these connect to a DAC with a USB-C out?
This would be a DAC-to-DAC connection, which is generally pointless: the headphones' built-in DAC is the limiting factor, since analog bypass isn't supported or possible with the current hardware design.

This is the problem with modern "active digital systems" and "smart audio devices" like the AirPods Max: they create confusion because they contain built-in DSPs and DACs that re-process incoming signals (even wired analog inputs, where those exist) through proprietary digital pathways, making it unclear whether external DACs provide any benefit or whether analog signals stay pure. It's not just an Apple problem; it's an industry problem.

Manufacturers need to adopt standardized terminology and provide transparent documentation about how signals are processed internally—whether analog inputs are digitized, if analog bypass modes exist, and what sampling/bit rates are used throughout the entire signal chain—so consumers can make informed decisions about which components actually improve their listening experience.

Most of the time I need to contact a company's tech support to get answers about their product's signal path. There's currently no standard practice or format for documenting this. The better audio companies will publish specs with signal path info, even sharing DSP chipset details, but those companies are rare.

In Apple's case, it looks like customers are stuck with the H1 chipset's capabilities, which aren't publicly documented. It's likely that 48kHz/24-bit is the maximum for the H1, due to its chip architecture and/or power limits.
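To put numbers on why this needs USB rather than Bluetooth: even at that speculated ceiling, raw stereo PCM is roughly 9x the bitrate of the 256kbps AAC stream used over Bluetooth. A quick back-of-envelope check (the 48/24 figure is my speculation above, not a published spec):

```python
# Raw PCM bandwidth at the speculated 48 kHz / 24-bit stereo ceiling.
sample_rate = 48_000   # samples per second
bit_depth = 24         # bits per sample
channels = 2           # stereo

pcm_bps = sample_rate * bit_depth * channels
aac_bps = 256_000      # Apple Music's AAC bitrate

print(f"PCM:   {pcm_bps / 1e6:.3f} Mbps")   # 2.304 Mbps
print(f"AAC:   {aac_bps / 1e6:.3f} Mbps")   # 0.256 Mbps
print(f"Ratio: {pcm_bps / aac_bps:.0f}x")   # 9x
```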
 
Not at all false. It has been proven time and time again in blind A/B tests on gear you'll never even see in your lifetime, let alone hear.

Of course you're welcome to prove you have better ears than the best mastering engineers in the industry - you'd quickly get a job with those golden ears! https://abx.digitalfeed.net/faac.320.html
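For anyone unfamiliar, that's an ABX test: you hear A, B, and an unknown X and pick which one X is, over enough trials that guessing is statistically ruled out. A minimal sketch of the scoring logic (the trial count and the simulated listener are illustrative, not how that site is implemented):

```python
import random
from math import comb

def abx_p_value(correct, trials):
    """One-sided binomial test: the probability of getting at least
    `correct` answers out of `trials` by guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

def run_abx(hears_difference, trials=16):
    """Simulate ABX trials: a listener who truly hears a difference
    always answers correctly; otherwise they guess at random."""
    correct = sum(
        1 if hears_difference or random.random() < 0.5 else 0
        for _ in range(trials)
    )
    return correct, abx_p_value(correct, trials)

correct, p = run_abx(hears_difference=False)
print(f"{correct}/16 correct, p = {p:.3f}")  # guessing stays well above p = 0.05
```

Scoring 12 or more out of 16 gives p < 0.05, the usual bar for claiming you actually hear a difference.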
The problem with these tests is that it is not the gear, but the ears and the music. To do a proper test, the listeners have to be intimately familiar with the raw, un-amplified analog version of the music and have good ears.

Audiophile listeners do NOT necessarily have good ears, just big pocketbooks.

The listener has to have the ability to detect the differences and know what the raw, un-amplified analog baseline is. And by analog, I don't mean an analog recording, but live without amplification.

Trying to notice the difference with modern, intentionally distorted digital music is another problem with these tests. Once distortion enters the picture, it hides the distortion added by the conversion. The creators of these tests simply are not controlling all of the variables or using good source material.

Then you get to the part where the creators of these tests are "in the industry" and have a vested interest in 48K and below. It is cheaper and requires less skill to produce. So I would say these tests are doing exactly what they were created to do: create the mindset that resolution does not matter so people will not be critical. Apparently it works.

I agree with you that most people cannot hear the difference, because all they have listened to their entire lives is distorted digital music. That does not mean there is not a difference.

And finally, it is not necessarily the sampling frequency that makes the difference. The difference is in the reconstructed audio, which is influenced by both the sampling frequency and how well the post-DAC filtering is designed and matched to the sampling frequency. It is the post-DAC filtering that falsifies the argument that sampling frequency is the end-all, be-all. If you listen to raw DAC output without any post-DAC filtering, what you hear is not pleasant.
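That last point is easy to demonstrate numerically. Zero-stuffing a signal to a higher rate (the first stage of an oversampling DAC) mirrors the spectrum above the original band, and it's the reconstruction low-pass that removes those images. A rough sketch with numpy/scipy, where the 4x factor and filter length are arbitrary choices of mine:

```python
import numpy as np
from scipy.signal import firwin, lfilter

fs, up = 48_000, 4                     # original rate, upsampling factor
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 1000 * t)       # 1 kHz test tone

# Zero-stuff: what the DAC's upsampling stage emits before filtering.
raw = np.zeros(len(x) * up)
raw[::up] = x

# Reconstruction (anti-imaging) low-pass at the original Nyquist.
lp = firwin(255, cutoff=fs / 2, fs=fs * up)
filtered = lfilter(lp, 1.0, raw) * up  # gain of `up` restores amplitude

def image_energy(sig):
    """Fraction of spectral energy above the original Nyquist (the images)."""
    spec = np.abs(np.fft.rfft(sig)) ** 2
    freqs = np.fft.rfftfreq(len(sig), d=1 / (fs * up))
    return spec[freqs > fs / 2].sum() / spec.sum()

print(f"image energy, unfiltered: {image_energy(raw):.1%}")      # ~75%
print(f"image energy, filtered:   {image_energy(filtered):.1%}")  # near 0%
```

The unfiltered signal carries most of its energy in ultrasonic images, which is exactly the "not pleasant" raw DAC output described above; how gracefully the low-pass removes them is where reconstruction filters differ.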
 
They are supporting other features like spatial audio / head tracking over USB-C. It seems like they are trying to offer the full AirPods Max experience over USB-C. I wouldn't be surprised if the mic is supported. (I also wouldn't be surprised if it is not.)

I am sitting here ready to test it out, as soon as the firmware update drops ...
Fingers crossed. I really like the way the AirPods Max sound, and I would love to use them while streaming, but the audio quality drop while recording is horrendous.
 
The problem with these tests is that it is not the gear, but the ears and the music. To do a proper test, the listeners have to be intimately familiar with the raw, un-amplified analog version of the music and have good ears.

Audiophile listeners do NOT necessarily have good ears, just big pocketbooks.

The listener has to have the ability to detect the differences and know what the raw, un-amplified analog baseline is. And by analog, I don't mean an analog recording, but live without amplification.

Trying to notice the difference with modern, intentionally distorted digital music is another problem with these tests. Once distortion enters the picture, it hides the distortion added by the conversion. The creators of these tests simply are not controlling all of the variables or using good source material.

Then you get to the part where the creators of these tests are "in the industry" and have a vested interest in 48K and below. It is cheaper and requires less skill to produce. So I would say these tests are doing exactly what they were created to do: create the mindset that resolution does not matter so people will not be critical. Apparently it works.

I agree with you that most people cannot hear the difference, because all they have listened to their entire lives is distorted digital music. That does not mean there is not a difference.

And finally, it is not necessarily the sampling frequency that makes the difference. The difference is in the reconstructed audio, which is influenced by both the sampling frequency and how well the post-DAC filtering is designed and matched to the sampling frequency. It is the post-DAC filtering that falsifies the argument that sampling frequency is the end-all, be-all. If you listen to raw DAC output without any post-DAC filtering, what you hear is not pleasant.
Well said. With my closed studio and open audiophile headphones I can notice the difference between 48k and 96k, but beyond that it's harder to discern quality differences. Post-DAC filtering makes a massive difference, though. The handling of transients, and the air and spaciousness of acoustic dynamics, varies dramatically depending on which type of filtering is used and on how the material was recorded and mixed. I currently use minimum-phase filters most often. Appreciating this level of detail only works with high-quality recordings.
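For those wondering what minimum phase versus linear phase means in practice: similar magnitude response, different placement of the filter's ringing. Linear-phase filters are symmetric, so half their ringing arrives before the transient (pre-ringing); minimum-phase filters push all of it after. A small sketch using scipy (tap count and cutoff are arbitrary, and the homomorphic conversion only approximates the prototype's magnitude, so this illustrates the phase behavior rather than any commercial filter):

```python
import numpy as np
from scipy.signal import firwin, minimum_phase

# Linear-phase prototype: symmetric taps, ringing split around the peak.
h_lin = firwin(255, cutoff=20_000, fs=96_000)

# Minimum-phase conversion: all ringing pushed after the peak.
h_min = minimum_phase(h_lin, method='homomorphic')

def pre_ring_energy(h):
    """Fraction of tap energy arriving before the filter's main peak."""
    peak = np.argmax(np.abs(h))
    return (h[:peak] ** 2).sum() / (h ** 2).sum()

print(f"linear phase:  {pre_ring_energy(h_lin):.1%} of energy before the peak")
print(f"minimum phase: {pre_ring_energy(h_min):.1%} of energy before the peak")
```

That pre-ringing is what people describe as smeared transients, which is one reason minimum phase gets preferred on transient-heavy material.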
 
Does anybody else think it's a bit disingenuous of Apple to say Spatial Audio will be lossless as well, considering that on every music streaming service Dolby Atmos is delivered as lossy files?
 
Well, there you have it. These just went from no-need to must-buy. I've finally got some purple ones on the way. A nice present to myself for being patient and not letting them get away with releasing these without at least feature parity with the gen 1s.

With these additions, they finally feel like a true sequel. It's still the H1 chip, which is annoying, but I'm willing to take the plunge now at Amazon's discounted $479 price point.
 
The problem with these tests is that it is not the gear, but the ears and the music. To do a proper test, the listeners have to be intimately familiar with the raw, un-amplified analog version of the music and have good ears.

Audiophile listeners do NOT necessarily have good ears, just big pocketbooks.

The listener has to have the ability to detect the differences and know what the raw, un-amplified analog baseline is. And by analog, I don't mean an analog recording, but live without amplification.

Trying to notice the difference with modern, intentionally distorted digital music is another problem with these tests. Once distortion enters the picture, it hides the distortion added by the conversion. The creators of these tests simply are not controlling all of the variables or using good source material.

Then you get to the part where the creators of these tests are "in the industry" and have a vested interest in 48K and below. It is cheaper and requires less skill to produce. So I would say these tests are doing exactly what they were created to do: create the mindset that resolution does not matter so people will not be critical. Apparently it works.

I agree with you that most people cannot hear the difference, because all they have listened to their entire lives is distorted digital music. That does not mean there is not a difference.

And finally, it is not necessarily the sampling frequency that makes the difference. The difference is in the reconstructed audio, which is influenced by both the sampling frequency and how well the post-DAC filtering is designed and matched to the sampling frequency. It is the post-DAC filtering that falsifies the argument that sampling frequency is the end-all, be-all. If you listen to raw DAC output without any post-DAC filtering, what you hear is not pleasant.

Some of what you say is true - however, you cannot claim that everything they use in the tests is "modern distorted music" - they're using the master files as they come out of the mastering studio; it is as good as it can sound. They have everything from the Eagles in the '70s to modern-day releases. Some of which is made in bedrooms on cheap audio interfaces, of course.

I'm not sure why you're conflating sample rate with lossless compression. We're not comparing "high-resolution audio" against CD quality; we're comparing lossless CD quality against lossy CD quality. These are two cans of worms, but for the record, no, higher bit rates and higher sample rates do not make any difference to the end user either.

Higher sample rates do not require more skill to produce at, just more money and more powerful hardware - when it's pointless, you can just use oversampling for the same end result.
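To illustrate that equivalence: capture at 96 kHz, band-limit, and decimate 2:1, and the audible band comes through essentially untouched. A quick sketch with scipy (the tone and trim amounts are arbitrary choices of mine):

```python
import numpy as np
from scipy.signal import resample_poly

fs_hi, fs_lo = 96_000, 48_000
t_hi = np.arange(fs_hi) / fs_hi
tone_hi = np.sin(2 * np.pi * 1000 * t_hi)  # 1 kHz tone captured at 96 kHz

# Band-limit and decimate 2:1, as an oversampling chain does internally.
tone_lo = resample_poly(tone_hi, up=1, down=2)

# Compare against the same tone generated natively at 48 kHz.
t_lo = np.arange(fs_lo) / fs_lo
reference = np.sin(2 * np.pi * 1000 * t_lo)

# Ignore the resampler's edge transients at both ends.
err = np.abs(tone_lo[1000:-1000] - reference[1000:-1000]).max()
print(f"max deviation: {err:.2e}")  # tiny; the audible content is unchanged
```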

Regardless of that, no one is hearing the difference on AirPods Max between an Apple Lossless audio file and a 256kbps AAC file. I guarantee there are people on here who'll tell you they can clearly hear the difference when playing back lossless files on Apple Music when they've been listening to them over Bluetooth for the last 3 years.
 
Some of what you say is true - however, you cannot claim that everything they use in the tests is "modern distorted music" - they're using the master files as they come out of the mastering studio; it is as good as it can sound. They have everything from the Eagles in the '70s to modern-day releases. Some of which is made in bedrooms on cheap audio interfaces, of course.

I'm not sure why you're conflating sample rate with lossless compression. We're not comparing "high-resolution audio" against CD quality; we're comparing lossless CD quality against lossy CD quality. These are two cans of worms, but for the record, no, higher bit rates and higher sample rates do not make any difference to the end user either.

Higher sample rates do not require more skill to produce at, just more money and more powerful hardware - when it's pointless, you can just use oversampling for the same end result.

Regardless of that, no one is hearing the difference on AirPods Max between an Apple Lossless audio file and a 256kbps AAC file. I guarantee there are people on here who'll tell you they can clearly hear the difference when playing back lossless files on Apple Music when they've been listening to them over Bluetooth for the last 3 years.
I am sure you don't think that masters cannot have distortion, because they can, depending on the source material and recording techniques. And that distortion, when present, potentially covers up DAC distortion. Anything run through a DAC is distorted until processed by the post-DAC filter. Furthermore, DAC processing adds odd-harmonic distortion, not the even-harmonic distortion that is more comforting to the ear.

I never mentioned compression, not sure where that came from.

I agree no one is hearing the difference on AirPods.
 