The ABX method is explicitly designed to test whether there are discernible differences between two inputs, and it's considered valid if performed properly.

The main issue is that it's often simply not performed properly; the problem lies in the execution, not in the method itself.
No test that I know of has established how large a difference needs to be before it is consistently "discernible". I read on diyaudio about someone trying to blind-test speaker drivers; they had to cancel because all they learned was that they couldn't consistently hear ANY difference between ANY of the drivers… and I don't think anyone will argue that speaker drivers don't make a difference. All they learned was that blind testing is hard. So I still claim that blind testing only works one way: if you can tell a difference in a scientifically valid (!) blind test, then I 100% agree that you have proven there is a difference. If not, you have ONLY proven that under the given test conditions you were not able to demonstrate a difference. That does NOT mean you have proven there is no discernible difference, since you don't know what the result of different test conditions would be.

My favourite analogy: look at a bowl with 100 M&M's. Now look at a bowl with 99 M&M's. Can you tell which is which? Probably not. Now eat 100 M&M's, and compare with eating 99 M&M's. Can you tell the difference? Probably not. Now find 100 five-year-olds and tell them they can each have one M&M. You will now get a very noticeable auditory response letting you know whether you had 100 or only 99… you changed the test conditions, so now the difference matters.

I am NOT saying that there exists a setup where you can blind-test and spot the difference scientifically; I'm claiming that such a setup doesn't exist, BUT that this does not render the difference irrelevant. Because if you add up 10, or even 5 or 2, "imperceptible" differences, they may very well add up to a perceivable one. But how are you going to test that?
 
When I try to sync, it tells me I can only download my library and that it is controlled through Apple Music. Interesting. I will have to see if I have something set up wrong. And if I do, thank you for alerting me to this.

Edit: It worked. I just had to manually delete the library on my phone for the setting to be recognized; simply changing the setting wasn't enough for macOS to pick up what I wanted to do. Thank you again.
You might already be aware of this, but as an FYI, you can't have lossless audio on your phone if you have Apple Music enabled (AFAIK). To my knowledge, unless you sync directly from the computer to the phone, it isn't lossless. If you're using Apple Music, that means you're using the iCloud library to sync, which converts everything to 256 kbps AAC.
 
Completely agree; that's why I'm saying one shouldn't expect someone to easily pick which is a 256 AAC vs a 24-bit/192 kHz track without hearing both.

Sure, if it's music you usually listen to, you would most likely be able to do that too, if you have a somewhat trained ear and already know what to listen for. I've been producing music for more than 20 years, and analysing music, especially the mix of a song, is something I'm used to and tend to do even when I'm not actively trying.

I just did a very unscientific test using Michael Jackson's Billie Jean, AAC from Apple Music vs 24-bit/96 kHz from HDtracks, played back at the same level, and I can easily tell which version I'm listening to even without switching between them.

This is certainly not about having "golden ears" or anything like that; it's about picking up nuances that most people don't care enough about to "hear" (technically they do, though). I'm 45 and my hearing is definitely not 100%, but again, experience with audio helps.
Are you sure we're not talking about two different masterings of the album for the iTunes and 24/96 Hi-Res HDtracks versions to begin with? That's what people often forget to factor into the equation when comparing two recordings (besides knowing which source is which while comparing).

If you're ripping "Thriller" from the same CD (or whatever source) in both 256 AAC and lossless, comparing without knowing which format is currently playing, and still identifying the lossless version correctly more often than pure chance would allow: consider me convinced.
 
All of these online tests are themselves snake oil. Almost nothing transmits over the Internet losslessly, so the samples are themselves compressed, even the "lossless" samples.

But let's take a much better real-life example. How many of you have AV receivers with your TVs plugged into a decent set of passive speakers? Do you watch Blu-rays? That's lossless audio. The audio coming from DVDs, your streaming box, or your cable box is lossy. You mean to tell me you can't hear how crystal clear Blu-ray audio sounds compared with the comparatively dull, flat audio from other sources?

Can you hear a big difference over Bluetooth headphones or smart speakers or laptop speakers? No. But if your source is good, there is very much an audible difference. And my 47-year-old ears probably suck after too many rock concerts and too much headphone use. Now let's hope Apple gets to work on providing hardware that matches the lossless quality of the files.
 
Completely agree; that's why I'm saying one shouldn't expect someone to easily pick which is a 256 AAC vs a 24-bit/192 kHz track without hearing both.

Sure, if it's music you usually listen to, you would most likely be able to do that too, if you have a somewhat trained ear and already know what to listen for. I've been producing music for more than 20 years, and analysing music, especially the mix of a song, is something I'm used to and tend to do even when I'm not actively trying.

I just did a very unscientific test using Michael Jackson's Billie Jean, AAC from Apple Music vs 24-bit/96 kHz from HDtracks, played back at the same level, and I can easily tell which version I'm listening to even without switching between them.

This is certainly not about having "golden ears" or anything like that; it's about picking up nuances that most people don't care enough about to "hear" (technically they do, though). I'm 45 and my hearing is definitely not 100%, but again, experience with audio helps.

This is a really bad comparison. Unless you can somehow verify that HDtracks and Apple are using the same master/source, the difference you notice will most likely be a result of the master/source and not the encoding.

This is why any decent ABX test needs to be double-blind, and all tracks need to be created from the same master/source.

A good setup for testing would be to use Apple Music as the back end for the ABX test once Apple Lossless and Apple Hi-Res Lossless become available. They should all use the same master/source, which makes for a good comparison.

When you say that the track from HDtracks sounds better, there is no telling whether, if the 96 kHz/24-bit track were replaced with a good 44.1 kHz/16-bit 256 kbps AAC encode, you would be able to tell that one apart from the 96 kHz/24-bit version.
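To put a number on "more often than pure chance": with two-way forced choices, the odds of scoring at least k correct out of n trials by guessing alone follow a binomial distribution. A minimal Python sketch (the function name and the 16-trial example are illustrative, not from the thread):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Probability of scoring at least `correct` out of `trials`
    two-choice ABX trials by pure guessing (p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# A common informal threshold is p < 0.05 before claiming an audible difference.
print(round(abx_p_value(12, 16), 3))  # 0.038 -- 12/16 correct is unlikely to be luck
print(round(abx_p_value(10, 16), 3))  # 0.227 -- 10/16 correct proves nothing
```

This is also why a handful of trials is meaningless: even 5 correct out of 6 still happens by pure luck about 11% of the time.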
 
My favourite analogy: look at a bowl with 100 M&M's. Now look at a bowl with 99 M&M's. Can you tell which is which? Probably not. Now eat 100 M&M's, and compare with eating 99 M&M's. Can you tell the difference? Probably not. Now find 100 five-year-olds and tell them they can each have one M&M. You will now get a very noticeable auditory response letting you know whether you had 100 or only 99… you changed the test conditions, so now the difference matters.
These analogies get weirder and weirder… What are we talking about here? That the compressed version isn't identical to the lossless version in some 100%-identical fingerprint/DNA-blueprint sense, or in Plato's allegory of the cave? No, it isn't. But to your ears it still is, because what was changed or taken away is beyond what the human ear is anatomically able to hear.
 
I did not say blind tests are not scientific!! They certainly can be (if done right). I am saying that they are being misused. As any proper scientist will tell you, in medicine as elsewhere, a test cannot prove that there is no difference. It can either prove a difference, or be inconclusive (my wording may be scientifically inaccurate here; I'm not a native English speaker).

I fundamentally disagree with your last statement. An example: I am hypersensitive to nickel. I get rashes from several products that are officially "nickel-free", because there is a limit below which the amount of nickel is officially "not meaningful", to the extent that the manufacturer can market the product as nickel-free. Yet I get rashes that turn bloody if I keep wearing the item. I know it's not directly comparable, but the point is that when there undeniably IS a difference, we cannot set a fixed limit on when the difference matters.
You're right that statistical analysis doesn't lead to definitive proof that there is no difference. It proves there is no distinguishable difference in the current testing situation. Given the ABX tests that have been run, this means we know that with the vast majority of people's ears and audio equipment, they can't tell the difference.

I will say my last sentence was not as specific as I intended and was left open to interpretation. What I meant was: if there is no distinguishable difference for most people, then the difference is meaningless to most people. There are scientifically testable limits to the human senses. Using these limits and what we know about the physics of audio, we can work out how far audio needs to be taken to be imperceptible to the vast majority of people. There will always be outliers, like you are with nickel.

I'm also not trying to say there is no use for lossless audio, or that Apple or anyone else shouldn't provide it. I'm saying that for the vast majority of people on the vast majority of audio equipment, the difference is imperceptible.
 
 
This is like Tesla coming out with custom Tesla-designed car tires and then saying "oh, by the way, they don't work on any of our cars".
 
All of these online tests are themselves snake oil. Almost nothing transmits over the Internet losslessly, so the samples are themselves compressed, even the "lossless" samples.

But let's take a much better real-life example. How many of you have AV receivers with your TVs plugged into a decent set of passive speakers? Do you watch Blu-rays? That's lossless audio. The audio coming from DVDs, your streaming box, or your cable box is lossy. You mean to tell me you can't hear how crystal clear Blu-ray audio sounds compared with the comparatively dull, flat audio from other sources?

Can you hear a big difference over Bluetooth headphones or smart speakers or laptop speakers? No. But if your source is good, there is very much an audible difference. And my 47-year-old ears probably suck after too many rock concerts and too much headphone use. Now let's hope Apple gets to work on providing hardware that matches the lossless quality of the files.

You are not really giving us much to work with here. Are we talking music videos? Live concert videos? Movies? In what kind of scenario do you have an apples-to-apples comparison when watching something on your streaming box, DVD, or Blu-ray that makes for any decent comparison?

Most people do not have setups that support the playback of Dolby TrueHD or DTS-HD Master Audio, so it doesn't really matter, as they are not getting the TrueHD or DTS-HD Master Audio tracks anyway.

So this isn't really telling us anything.
 
This is a really bad comparison. Unless you can somehow verify that HDtracks and Apple are using the same master/source, the difference you notice will most likely be a result of the master/source and not the encoding.

This is why any decent ABX test needs to be double-blind, and all tracks need to be created from the same master/source.

A good setup for testing would be to use Apple Music as the back end for the ABX test once Apple Lossless and Apple Hi-Res Lossless become available. They should all use the same master/source, which makes for a good comparison.

When you say that the track from HDtracks sounds better, there is no telling whether, if the 96 kHz/24-bit track were replaced with a good 44.1 kHz/16-bit 256 kbps AAC encode, you would be able to tell that one apart from the 96 kHz/24-bit version.
Well, I can still spot which is which after converting the hi-res version down to 256 kbps AAC. 💁‍♂️
 
This is like Tesla coming out with custom Tesla-designed car tires and then saying "oh, by the way, they don't work on any of our cars".
Terrible analogy. It's more like wondering why your toaster can't run on 240 V, or why your iPhone power adapter doesn't charge a MacBook Pro.
 
All of these online tests are themselves snake oil. Almost nothing transmits over the Internet losslessly, so the samples are themselves compressed, even the "lossless" samples.
Huh? Of course serious tests like the NPR ABX lossless test provide real lossless samples for the comparison (alongside the compressed samples). This is getting ridiculous, and the snake oil here is lossless over AAC 256. No harm in it besides the higher bandwidth, of course, but really believing, and telling others, that it sounds so much better is the crazy part.

Apple itself said in a statement that 256 AAC and their new lossless offering will be "virtually indistinguishable". Apple, the master of overpraising its new features. Let that sink in. Dolby Atmos and/or Spatial Audio may make a difference, I don't know about that (although I'm sure that if they do, the two will get mixed up here as soon as the service starts, à la "See?! There's a CLEAR difference!"), but lossless doesn't.
 
Well, I can still spot which is which after converting the hi-res version down to 256 kbps AAC. 💁‍♂️
If the hi-res 24/96 HDtracks version had a different mastering to begin with, of course you can… 🤦‍♂️
 
You are not really giving us much to work with here. Are we talking music videos? Live concert videos? Movies? In what kind of scenario do you have an apples-to-apples comparison when watching something on your streaming box, DVD, or Blu-ray that makes for any decent comparison?

Most people do not have setups that support the playback of Dolby TrueHD or DTS-HD Master Audio, so it doesn't really matter, as they are not getting the TrueHD or DTS-HD Master Audio tracks anyway.

So this isn't really telling us anything.
It doesn't matter. I've watched concert Blu-rays, movies, TV shows; they all sound clearer and more immediate than any other source. It's not even that subtle. And this is not some radical statement; I've heard many other people comment on how much better Blu-ray audio sounds. And yes, I have a Yamaha receiver that can decode DTS-HD Master Audio and Dolby TrueHD, so I am getting true lossless audio. I don't think these kinds of receivers are all that rare.
 
In that case analog recording can't really capture true audio either; it can only approach it asymptotically. Microphones and analog storage media always have physical limitations on the highest frequencies they can detect. They are limited by the frequency response of the circuits used to record and process the signal. Because of this, analog recordings themselves are almost always capped at roughly the same rates as digital recordings.
Semantically, lossless doesn't exist, as there is loss as soon as the sound hits the microphone, and it only gets worse from there.
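On those frequency caps: by the Nyquist-Shannon sampling theorem, a sample rate of fs can represent frequencies up to fs/2, which is why even CD-rate digital already covers the audible band. A quick sketch (the helper name is mine, not from the thread):

```python
# Nyquist limit: a sample rate of fs can represent frequencies up to fs / 2.
def nyquist_khz(sample_rate_hz: int) -> float:
    """Highest representable frequency, in kHz, for a given sample rate."""
    return sample_rate_hz / 2 / 1000

print(nyquist_khz(44_100))   # 22.05 -- CD rate already exceeds the ~20 kHz adult hearing limit
print(nyquist_khz(192_000))  # 96.0
```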

As is often the case when people argue, the truth is somewhere in between. I find that in most arguments both parties are wrong to some degree… I believe all of the statements below to be true:

1: Compression is bad and should be avoided when practical
2: Compression is not as bad as some will try to make you believe
3: Blind tests are being misused and, in this context, are inherently misleading
4: There is a lot of snake oil in audio
5: Objectivists miss a lot of details that matter because they constantly demand proof

I don't consider myself either an "audiophile" or a scientist. I try to seek better performance by making sense of my experiences. Neither outlier camp seems right to me. The thing I keep returning to, though, is that if I make a top ten of my best, most mind-blowing, or most emotionally impactful experiences with hi-fi systems, they were all built by "audiophiles", and most had snake-oil components. I prefer great sound with a few non-functioning tweaks over a scientifically proven system that doesn't sound right. I'd rather have great sound than know that I was RIGHT!

Good scientists don't rest in the knowledge that everything is known; they search for truths that have yet to be discovered.
 
🤔 I just said I converted the hi-res version to 256 kbps AAC.
I'm not talking about bitrate, but about how the album was (re-)mastered. The "Thriller" album on HDtracks seems to have the SACD as its source, and I'm pretty sure that's a different mastering than the normal iTunes version.
 
Huh? Of course serious tests like the NPR ABX lossless test provide real lossless samples for the comparison (alongside the compressed samples). This is getting ridiculous, and the snake oil here is lossless over AAC 256. No harm in it besides the higher bandwidth, of course, but really believing, and telling others, that it sounds so much better is the crazy part.

Apple itself said in a statement that 256 AAC and their new lossless offering will be "virtually indistinguishable". Apple, the master of overpraising its new features. Let that sink in.
Obviously Apple is going to say that, as they don't yet have hardware that can play it back; it's a form of PR damage control. Once they do, they'll change their tune on a dime. Let that sink in.
 
I'm not talking about bitrate, but about how the album was (re-)mastered. The "Thriller" album on HDtracks seems to have the SACD as its source, and I'm pretty sure that's a different mastering than the normal iTunes version.
Sure. But again, I'm now using my 24-bit/192 kHz file as the source and creating a 256 kbps AAC from that. How is that not a valid test here?
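For scale, here is the plain arithmetic behind that conversion (not from the post itself): uncompressed 24-bit/192 kHz stereo PCM versus the 256 kbps AAC target.

```python
# Raw PCM bit rate of a 24-bit / 192 kHz stereo stream vs. a 256 kbps AAC encode.
bit_depth = 24          # bits per sample
sample_rate = 192_000   # samples per second
channels = 2            # stereo

pcm_kbps = bit_depth * sample_rate * channels / 1000
aac_kbps = 256

print(pcm_kbps)             # 9216.0 -- kbps for the uncompressed stream
print(pcm_kbps / aac_kbps)  # 36.0 -- the AAC encode is 36x smaller
```

An actual lossless file (ALAC/FLAC) lands well under the raw PCM rate, since lossless compression typically roughly halves it, but the gap to 256 kbps remains large either way.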
 
Sure. But again, I'm now using my 24-bit/192 kHz file as the source and creating a 256 kbps AAC from that. How is that not a valid test here?
Ah, okay, so you're comparing the HD source with a self-created 256 AAC of the same source, not the iTunes version anymore? In that case, I have to admit I have a hard time believing that you can spot this in a real BLIND test (which you are doing, right?), but if you say so, I believe you, of course. Congratulations to your ears. 🙂
 
I think I'm able to spot differences between ALAC and 256 kbps files, but only if:
- It's music I really know well
- I'm using my Shure SE-846, which I'm really happy with.
I've tried several blind tests with other (even very expensive) equipment and different music, but not with the same results. Intimate knowledge of the tracks you're listening to makes the difference. My 2 cents.
 
Ah, okay, so you're comparing the HD source with a self-created 256 AAC of the same source, not the iTunes version anymore? In that case, I have to admit I have a hard time believing that you can spot this in a real BLIND test (which you are doing, right?), but if you say so, I believe you, of course. Congratulations to your ears. 🙂
Correct. And again, there's no magic here; it's not that my ears are special, but that I have experience analysing minute differences in audio from many years of doing just that. I don't get why this is considered so out-of-this-world odd.

Maybe this track is easier than others to distinguish, but the snare drum is a dead giveaway: the transient is noticeably softer in the AAC version.
 
These analogies get weirder and weirder… What are we talking about here? That the compressed version isn't identical to the lossless version in some 100%-identical fingerprint/DNA-blueprint sense, or in Plato's allegory of the cave? No, it isn't. But to your ears it still is, because what was changed or taken away is beyond what the human ear is anatomically able to hear.
You insist on dismissing the premise of my argument. 1: I do not agree that a blind test is proof that a difference is undetectable. 2: Even if it is, when you add this to the other differences that objectivists tell me don't matter (DAC, amp, decoupling, cables, power cleaning, correctly phased power, etc.), all these "undetectable" errors DO add up to very significant differences. So your claim that the compression is irrelevant is incorrect, because you never listen to ONLY the difference in compression scheme, except in a silly debate like this. I actually agree (contrary to some audiophiles) that if you have a truly great system where you have addressed ALL the other points, and ONLY compare compressed vs lossless, all else being equal, then the difference is so small that it may be undetectable. I prefer a good master at 256 kbit over a bad master at 24/192 (and yes, there are several examples of exactly that) any day of the week.

BUT I reject your argument that if I can't scientifically prove I can tell the difference, then the difference doesn't matter. Because if I accept that, you and your peers will throw ten more "undetectable" errors at me, until my system sounds like crap.

I don't think you have any idea whatsoever how large a difference can be without being provable in a blind test, most likely because you only blind-test differences that are actually small. Try establishing how large a difference needs to be before the blind test comes out positive. You will be surprised. If you still think differences that you can't consistently verify, such as AMT vs dome tweeters, are pointless, then debating with you is pointless in the first place.

In other words, why should I accept THIS particular right-on-the-edge-of-indistinguishable error, but not all the other similar ones that objectivists tell me are irrelevant? Or do you fundamentally disagree that several undetectable errors can add up to a detectable one?
 