
gnasher729

Suspended
Nov 25, 2005
17,980
5,565
I plan to use the iTunes match trick to upgrade all my low bit rate tunes to 256 kbps AAC using iTunes match.

What about tunes that are 256 kbps MP3 and above? Are the iTunes Store tracks still better, given that a) AAC is better than MP3 and b) iTunes tracks are made from masters rather than from the CD?

I know that lossless tracks will be better, but are there any higher bit rate mp3 that are better quality than the 256 kbps AAC files from the iTunes store?

256 KBit AAC will be better quality than 320 KBit mp3. And it is possible that the iTunes Store has better originals than you had. There's also the problem that the encoder that was used plays a role - 320 mp3 encoded with a rubbish encoder isn't as good as 320 mp3 encoded with a very good encoder. Worst case, someone might have recorded an LP with 128 KBit mp3, then converted it to 320 KBit.
 

Aragornii

macrumors 6502a
Original poster
Jun 25, 2010
512
139
That would tend to support the theory that the 256 AAC Apple ripped from a CD would actually be better than the 320 MP3 that the OP ripped from a CD, assuming the AAC used Apple's most modern flavor of AAC. And you can probably bet that Apple is not just using the consumer encoder available in every free copy of iTunes to build its library, but top-shelf encoding hardware. That equals even better quality.

That's my thought as well. I ripped about half my CD collection using iTunes at 320 kbps MP3 before ripping the rest using Apple Lossless. I have no intention of re-ripping half of my entire CD collection, but I think I will take advantage of the fact that I can use iTunes Match to replace the MP3 tracks with 256 kbps AAC from the Apple store.
 

Lvivske

macrumors 6502a
Aug 22, 2011
600
242
🇺🇦
I really hope you are not going to cling to that. If you put either one of them into a quality stereo system, it is difficult to immediately declare "that's the rip!" or "that's the original". It's not even easy in a double-blind study where you play first one, then the other, and then ask which is which. But if you play them simultaneously and switch back and forth between them, it is abundantly clear which is the original, even to tin ears.

So you can give me access to your PayPal account whenever you are ready.;)

You're so full of **** it's unbelievable.
 

nitromac

macrumors 6502
Jul 29, 2012
282
13
US
You're so full of **** it's unbelievable.

No, he isn't.

There is a difference between 320kbps MP3 and WAV/CD/FLAC or what have you. You have to A/B to hear it and it's very subtle, but it's there. Mainly in the high-frequency range (drum cymbals and such), where the CD is a bit louder/clearer.

Now I'm not saying 320kbps MP3 is therefore inferior and awful. All my music on my Classic is in 320kbps MP3; ripped to WAV from CDs and converted in dbPowerAmp. I listen to WAV whenever I have the chance but honestly unless you are specifically listening for a difference, you won't really hear one.

Now, between 192kbps and 320kbps there is a pretty clear difference. Highs are audibly compressed and muddy-sounding; everything sounds more packed together.

Non-audiophiles probably won't notice a difference unless shown an A/B, and as for the original argument, I don't think anyone will hear a difference between 256kbps AAC and 320kbps MP3. At least I haven't.
 

buklau

macrumors newbie
May 20, 2010
28
0
Assuming they're both from good quality sources, you almost certainly won't hear a difference with casual listening.

Hell, people were content with CBR 128kbps for a very long time, and for good reason: proper audio compression is very good and highly transparent for the majority of listeners.

(People are reluctant to give an answer because at those bit rates it doesn't really matter what codec you're using, it matters much more when you get to lower rates, like sub 96kbps).
 

gnasher729

Suspended
Nov 25, 2005
17,980
5,565
We don't know this, and it's not likely that Apple has access to record company property (master tapes) or needs/wants this. iTunes' songs would probably be at a 48KHz sampling rate if this were so. I bet the record companies send over CD-quality (44.1/16) files (maybe even just a Redbook CD) for Apple to use.

There's the "made for iTunes" program - check it out on the iTunes store.

Lots of recordings are actually converted from 192,000 samples per second / 24 bit recordings, which improves the quality. And Apple provides the record companies with tools that make sure there is no clipping in the whole encoder / decoder chain.
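The clipping check mentioned above can be sketched roughly. To be clear, this is only an illustration of the idea, not Apple's actual tooling, and the decoded sample values below are made up: after an encode/decode round trip, you flag any float samples that hit or exceed digital full scale.

```python
# Sketch: flag decoded samples at or beyond digital full scale (|s| >= 1.0).
# The sample values are hypothetical; real tools also look at inter-sample
# peaks, which this deliberately ignores for simplicity.

def clipped_samples(samples, full_scale=1.0):
    """Return indices of decoded float samples at or beyond full scale."""
    return [i for i, s in enumerate(samples) if abs(s) >= full_scale]

decoded = [0.2, 0.97, 1.0, 1.03, -1.1, 0.5]  # hypothetical decoder output
print(clipped_samples(decoded))  # [2, 3, 4]
```

If that list is non-empty, the lossy encode pushed the signal into clipping and the master needs more headroom before encoding.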
 

phrehdd

macrumors 601
Oct 25, 2008
4,286
1,292
I'm using an Apple TV 2 hooked up to a Denon 4308 receiver and Bower & Wilkins 683 speakers.

Nice system you have there. Here are some thoughts, but not answers per se -

In general I can tell the difference between mp3 and aac files both at 256. A well done mp3 file at 320 can be extremely good as well.

I have a particular LP that I later got as a CD and also downloaded it from iTunes. There is a definite difference in how the 256 AAC file sounds compared to the CD. The LP to me sounds best but the CD is pretty darn close.

My problem with iTunes is that we don't know what the "masters" are. It's not much different from DVDs: some transfers just suck beyond belief, while others are done from a good master with some craft. My point is that not everything on iTunes is great, but some of it is very good and worthy.

If you want to do matching from iTunes, be sure you know what you are getting best you can. If you already downloaded a song from an album and like the quality then chances are the rest of the album via match would be to your liking. The biggest loser downloads are any recordings that are vintage by nature. Newer recordings tend to sound very good from iTunes.

I did a test where I played Carole King's "Tapestry" album via LP, then CD, and later my friend's 256 AAC files. Last, I downloaded the higher-res FLAC version from HDtracks. Here is the order of preference that we liked -
Best - FLAC 96/24
LP
CD
AAC

Later, we made an Apple Lossless file from the CD, and it was just slightly better sounding than the AAC 256. One had to listen carefully. Remember, this is all subjective and limited to equipment.

My equipment included a Marantz AVR, Oppo 103, NAS storage, Mac Mini, Audacity software, Pioneer turntable (upgraded), a garage-custom turntable, a vintage but spectacular Dynaco preamp for both turntables, Sennheiser cans, and older but moderately faithful Energy speakers. Between all this we have what might be a music hobbyist setup but certainly not audiophile level. If we can tell on this system, chances are you might too.
 

Gizmo22

macrumors regular
Oct 22, 2009
148
3
Midwest USA
I recommend opting to rip all CDs 8-12 times at various bitrates and codecs so when the popular culture tastes change on "what's best" from month to month, you can keep up with everyone else online as they bicker back and forth on preferences.
 

Parkin Pig

macrumors 6502a
Oct 23, 2009
670
141
Yorkshire-by-Gum
My 10 cents

The MP3 codec uses linear compression over the whole track, which means that portions of the track with more detail will lose more of that detail, whereas portions with less detail will sound less compressed.
AAC is a variable compression codec, which uses less compression in areas of more detail (preserving that detail), and more compression in quieter areas (where detail can be preserved even after compression).

Therefore, the quality of the MP3 on average will be poorer than the AAC.

Obviously the quality of the source track needs to be considered too, as does the playback hardware, the acoustics of the venue, and the aural health of the listener.

All things being equal except for the codec, I would plump for the AAC option.
 

gnasher729

Suspended
Nov 25, 2005
17,980
5,565
The MP3 codec uses linear compression over the whole track, which means that portions of the track with more detail will lose more of that detail, whereas portions with less detail will sound less compressed.

Actually, that's totally dependent on what the encoder does. In an mp3 file, the sound is split into little blocks of maybe 20 milliseconds, and each of these blocks can be compressed at a different bit rate.

Same with AAC; AAC can use fixed bit rate or variable bit rate as well, depending on the encoder and what settings you use for the encoder.

A difference that can lead to confusion is that AAC and mp3 interpret "bit rate" differently: a 320 KBit/sec VBR mp3 uses 320 KBit/sec on average; some blocks are smaller, some are larger. A 256 KBit/sec VBR AAC uses _at least_ 256 KBit/sec and never less; blocks can be larger, so files come out larger than the nominal rate suggests.

So if mp3 is lower quality at the same or slightly larger file size, that's because the codec is much older and less advanced; it has nothing to do with constant or variable bit rate.
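To make the averaging concrete, here is a small sketch of how a VBR stream's nominal bit rate is just the mean over many differently-sized blocks. The 1152-samples-per-frame figure is standard for MP3 at 44.1 kHz; the frame sizes themselves are made up for illustration.

```python
# Sketch: a VBR stream's "bit rate" is the average over per-frame sizes.
# An MP3 frame at 44.1 kHz carries 1152 samples, i.e. about 26.1 ms of audio.
# The frame sizes below are hypothetical.

SAMPLES_PER_FRAME = 1152
SAMPLE_RATE = 44_100
FRAME_SECONDS = SAMPLES_PER_FRAME / SAMPLE_RATE  # ~0.0261 s per frame

def average_bitrate_kbps(frame_sizes_bytes):
    """Average bit rate of a stream of variable-size frames, in kbit/s."""
    total_bits = 8 * sum(frame_sizes_bytes)
    total_seconds = FRAME_SECONDS * len(frame_sizes_bytes)
    return total_bits / total_seconds / 1000

# A quiet passage gets small frames, a busy passage large ones:
frames = [400] * 50 + [1044] * 50  # bytes per frame (hypothetical)
print(round(average_bitrate_kbps(frames)))  # → 221
```

The encoder spends few bits on the quiet blocks and many on the busy ones; the headline number is only the average of the two.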


Yeah. I think the sample rate (44.1 kHz vs. 48 kHz) is much more important. I always use 48 kHz instead of 44.1 kHz in the iTunes AAC or MP3 encoder, and the results sound great.

See also:
http://en.wikipedia.org/wiki/Compact_Disc#44.1_kHz_sample_rate
(Nyquist–Shannon sampling theorem)

That seems strange since your data source is most likely a CD with 44.1 KHz sample rate, so the very first step in your encoding process would be converting the sample rates, which would introduce errors already.
 

Mr. Retrofire

macrumors 603
Mar 2, 2010
5,064
518
www.emiliana.cl/en
The MP3 codec uses linear compression over the whole track, which means that portions of the track with more detail will lose more of that detail, whereas portions with less detail will sound less compressed.
Not if you use the VBR and/or Joint Stereo encoding options.

----------

That seems strange since your data source is most likely a CD with 44.1 KHz sample rate, so the very first step in your encoding process would be converting the sample rates, which would introduce errors already.
Yeah, I know. You can test it with iTunes. And sample rate converters use floating point numbers, which means that the errors are very small.
 

MacCruiskeen

macrumors 6502
Nov 9, 2011
321
5
It seems to me that the OP could easily test for himself on his own system. All he has to do is round up some uncompressed source originals he is familiar with, rip them into each format, play them, and then decide which is satisfactory to him. It would probably take less time than reading this thread.
 

ybz90

macrumors 6502a
Jul 10, 2009
609
2
There is a lot of "audiophile" misinformation in this thread. Very few individuals can A/B accurately above 256kbps with any sort of compression. Certainly some have better ears than others, but that's the generally accepted threshold above which the difference is difficult or impossible for most people to distinguish.

A bigger issue is the OP's equipment and whether it's even good enough. Fortunately, it seems he has some pretty great entry-level speakers (depending on how you look at it), and they are definitely good enough to tell the differences between lower and higher bitrates. In practice, I am doubtful the OP will be able to distinguish between either 320 MP3 or 256 AAC, but the simplest solution would be just to A/B it to see if it makes a difference. On a technical basis, AAC might be the better choice, if marginally, and again, I have my doubts as to whether there will be any real-world hearable difference for the OP.

On principle, I use ALAC for all of my recordings, even though I know that lower-quality rips have a negligibly perceptible (if at all) difference to my ears; but if I can only get something on iTunes or in 320 MP3, I'm not even remotely beat up about it. Let's be honest here: at these levels of compression and fidelity, regardless of your equipment, the minute compression artifacts and errors are on such a low order of magnitude that they will not reduce your listening enjoyment at all. Again, I'm talking about that "threshold"; lower bitrates are easily discernible to even the untrained ear and, for me, unlistenable. Part of why I hate Spotify and Rdio.

tl;dr: OP, try out both and see if you can tell a difference. If you can't, and chances are by limitations of biology and probability, you can't, then don't worry about it. If you're completely bonkers like me, then just get everything in FLAC/ALAC.
 

elistan

macrumors 6502a
Jun 30, 2007
997
443
Denver/Boulder, CO
I also want 48KHz or 96KHz 24-bit files offered by iTunes (I still buy CDs because I don't listen to (much) lossy music). However, the article clearly states (and it's 10 months old) that "...record labels are supposedly in discussions with Apple to begin offering 24-bit music files...". That says Apple doesn't have access to master tapes (and likely never will, since master tapes are proprietary and used for storage) or even 96/24 file copies of the master tapes at this time.

Just because Apple doesn't offer 24/96 to us customers doesn't mean they don't get those as masters from record labels.

From http://images.apple.com/itunes/mastered-for-itunes/docs/mastered_for_itunes.pdf:

Provide High Resolution Masters

To take best advantage of our latest encoders send us the highest resolution master file possible, appropriate to the medium and the project.

An ideal master will have 24-bit 96kHz resolution. These files contain more detail from which our encoders can create more accurate encodes. However, any resolution above 16-bit 44.1kHz, including sample rates of 48kHz, 88.2kHz, 96kHz, and 192kHz, will benefit from our encoding process.

(Not that this has anything to do with 320 MP3 vs 256 AAC. ;) )
 

jon3543

macrumors 6502a
Sep 13, 2010
608
265
I am doubtful the OP will be able to distinguish between either 320 MP3 or 256 AAC, but the simplest solution would be just to A/B it to see if it makes a difference.

To properly A/B it, the files need to be sourced from the same lossless file and volume-matched, and the test needs to be blind, as it is with the foobar2000 ABX comparator add-in. If the files came from different sources, e.g. Amazon vs. Apple, they could be from different masterings, which can sound wildly different even when comparing lossless versions, and just about all recordings have been the subject of one or more remastering efforts. When deliberately comparing different masterings, you need to use ReplayGain or other volume-matching system to do a valid comparison; foobar2000 makes that easy to do.

Different masterings of the same recording can sound different due to things like dynamic range compression and volume boosting (i.e. the loudness wars) and often different EQ (e.g. the boosting of high frequencies is common), all of which conspire to make remasters sound bad more often than not. Bad remasters may sound OK for background listening at low volume, or perhaps in a car or other noisy environment, but they suffer badly when you crank them up on good equipment in a decent listening environment, where they sound shrill and fatiguing over time.

Sadly, when you buy music online as MP3s or M4A files, typically there is no indication of the mastering, which is usually specified in CD liner notes.
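For what it's worth, the volume-matching step can be sketched in a few lines. This is a bare RMS match with made-up sample values, not ReplayGain itself (which uses a loudness model rather than raw RMS), but the principle is the same: equalize level first so you compare the audio, not the volume.

```python
# Sketch: RMS-based level matching before an A/B comparison.
# Sample values are hypothetical decoded floats; real tools (ReplayGain,
# EBU R128) use a perceptual loudness model instead of plain RMS.
import math

def rms(samples):
    """Root-mean-square level of a block of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def match_gain(reference, other):
    """Scale factor to apply to `other` so its RMS matches `reference`."""
    return rms(reference) / rms(other)

a = [0.5, -0.5, 0.5, -0.5]      # hypothetical decoded samples
b = [0.25, -0.25, 0.25, -0.25]  # same signal, 6 dB quieter
g = match_gain(a, b)
matched = [s * g for s in b]
print(g)  # 2.0: boost b by 6 dB before comparing
```

A louder file reliably "wins" sighted comparisons even when it is otherwise identical, which is why this step matters.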

On a technical basis, AAC might be the better choice, if marginally, and again, I have my doubts as to whether there will be any real-world hearable difference for the OP.

AAC is better than MP3 at low bitrates but probably not at the high bitrates that are the subject of this thread. On various killer samples, I find even 192 Kbps LAME MP3 non-transparent. AAC achieves transparency at 128 Kbps. That's what I have on my iPod thanks to iTunes transcoding all my lossless stuff when I sync.

On principle, I use ALAC for all of my recordings

It is smart to have master copies in a lossless format. You can encode them to any lossy format and bitrate without worrying about distortions due to transcoding one lossy format/bitrate to another. And of course, it doesn't hurt to play lossless music on your PC even if you can't tell the difference between it and lossy. Hard drive space is cheap.
 

ybz90

macrumors 6502a
Jul 10, 2009
609
2
To properly A/B it, the files need to be sourced from the same lossless file and volume-matched, and the test needs to be blind, as it is with the foobar2000 ABX comparator add-in. If the files came from different sources, e.g. Amazon vs. Apple, they could be from different masterings, which can sound wildly different even when comparing lossless versions, and just about all recordings have been the subject of one or more remastering efforts. When deliberately comparing different masterings, you need to use ReplayGain or other volume-matching system to do a valid comparison; foobar2000 makes that easy to do.

Different masterings of the same recording can sound different due to things like dynamic range compression and volume boosting (i.e. the loudness wars) and often different EQ (e.g. the boosting of high frequencies is common), all of which conspire to make remasters sound bad more often than not. Bad remasters may sound OK for background listening at low volume, or perhaps in a car or other noisy environment, but they suffer badly when you crank them up on good equipment in a decent listening environment, where they sound shrill and fatiguing over time.

Sadly, when you buy music online as MP3s or M4A files, typically there is no indication of the mastering, which is usually specified in CD liner notes.



AAC is better than MP3 at low bitrates but probably not at the high bitrates that are the subject of this thread. On various killer samples, I find even 192 Kbps LAME MP3 non-transparent. AAC achieves transparency at 128 Kbps. That's what I have on my iPod thanks to iTunes transcoding all my lossless stuff when I sync.



It is smart to have master copies in a lossless format. You can encode them to any lossy format and bitrate without worrying about distortions due to transcoding one lossy format/bitrate to another. And of course, it doesn't hurt to play lossless music on your PC even if you can't tell the difference between it and lossy. Hard drive space is cheap.

The point of it being double blind is obvious, otherwise, what's the point of A/B if you are allowing bias?

But you're totally missing my point. If you A/B any files, of any origin and source, and you can't hear a difference, then that's it. You don't have to "properly" do anything (volume-matching is a good point, but also moot in a situation where...) if you cannot personally detect any audible difference anyway. So in the OP's case, just take a listen to a bunch of tracks and see if you can tell. And if you can't, don't worry about it.

As for masters, that's usually not an issue for new music. There is generally only one master, and it's probably rubbish and very loud. I can't speak to the whole Mastered for iTunes program but again, it's quite possible (likely?) that a listener wouldn't be able to tell between that and a different source anyway, and given this fact, this is really a non-issue if you can't tell the difference. That's the operative phrase I keep using, as it applies to the vast majority of people at these bitrates.

Like I said, I personally listen to lossless anyway because I like to and it makes me feel happy. But I know I can't hear a difference above a certain bitrate, and neither can most people.
 

jon3543

macrumors 6502a
Sep 13, 2010
608
265
The point of it being double blind is obvious, otherwise, what's the point of A/B if you are allowing bias?

The point is many people don't understand this, and it always bears mentioning.

But you're totally missing my point. If you A/B any files, of any origin and source, and you can't hear a difference, then that's it. You don't have to "properly" do anything (volume-matching is a good point, but also moot in a situation where...) if you cannot personally detect any audible difference anyway. So in the OP's case, just take a listen to a bunch of tracks and see if you can tell. And if you can't, don't worry about it.

While ignorance is indeed bliss, you're still not getting it. Getting back to the subject of this thread, 256 Kbps AAC vs 320 MP3, if you unknowingly compare different masterings and find one better than the other, the takeaway would be that the bitrate or codec is the reason for it, or even more naively, that "Amazon or Apple sounds better than the other". That would be completely wrong. You have to know what you're testing, and you have to test using proper methods. That's the only way you can ever hope to reach valid conclusions on which you can base future decisions.

Here's an example of a valid test. Start with your own lossless file. Convert it to 256 Kbps AAC and 320 Kbps MP3. Compare the two lossy files in foobar2000 using its ABX comparator over at least 10 trials. See how you did. (NB: This still doesn't tell you anything about Amazon vs iTunes files, because you still don't know what mastering you'll get. Looking at the bonus tracks and comparing to CD releases is the only way to guess that I know of, but then I don't buy lossy music enough to have investigated this.)
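The scoring behind such an ABX run is just binomial statistics: if you truly can't hear a difference, your answers are coin flips, so we ask how likely it is to score at least this well by guessing. A sketch (the 9-of-10 result is a hypothetical example):

```python
# Sketch: score an ABX run. Under the null hypothesis ("can't hear a
# difference"), correct answers follow a fair-coin binomial distribution.
from math import comb

def abx_p_value(correct, trials):
    """P(getting >= `correct` right out of `trials` by pure guessing)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2**trials

# Hypothetical run: 9 right out of 10 trials.
p = abx_p_value(9, 10)
print(round(p, 4))  # → 0.0107: unlikely to be guessing
```

By convention, a small value here (say, under 0.05) suggests you really heard a difference; scoring around 5 of 10 is exactly what guessing produces.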

As for masters, that's usually not an issue for new music.

True, you're safer on that front the more your taste in music runs to the last decade or so. There's an awful lot of music before that, though, and most of it has been remastered at least once, often several times, with very different sonic qualities. You can't do a proper comparison of bitrates or codecs when the masterings represent an unknown.
 

TinHead88

macrumors regular
Oct 30, 2008
214
39
Not if you use the VBR and/or Joint Stereo encoding options.

----------


Yeah, I know. You can test it with iTunes. And sample rate converters use floating point numbers, which means that the errors are very small.

It does not make any sense to do this. There is no possibility of getting any benefit from converting to 48kHz using a 44.1kHz source. You can't go "up" in information density. The information is not there.
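A quick sketch of why: at a 44.1 kHz sample rate, anything above 22.05 kHz folds back down ("aliases") at capture time, so that content was never stored and no later conversion to 48 kHz can recreate it. The tone frequencies here are arbitrary illustrations of the folding.

```python
# Sketch: frequencies above Nyquist (fs/2) alias onto lower ones, so the
# information simply isn't in a 44.1 kHz file. A 30 kHz tone sampled at
# 44.1 kHz yields exactly the same samples as a 14.1 kHz tone.
import math

FS = 44_100            # CD sample rate
f_high = 30_000        # above Nyquist (22_050 Hz)
f_alias = FS - f_high  # 14_100 Hz: where it folds down to

high = [math.cos(2 * math.pi * f_high * n / FS) for n in range(64)]
alias = [math.cos(2 * math.pi * f_alias * n / FS) for n in range(64)]

# The two sample sequences are numerically indistinguishable:
print(max(abs(a - b) for a, b in zip(high, alias)))
```

Since the samples are identical, no resampler, however good, can tell which tone was originally played; upsampling to 48 kHz just represents the same 22.05 kHz-limited signal on a denser grid.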
 

ybz90

macrumors 6502a
Jul 10, 2009
609
2
The point is many people don't understand this, and it always bears mentioning.
This is a fair point.

While ignorance is indeed bliss, you're still not getting it. Getting back to the subject of this thread, 256 Kbps AAC vs 320 MP3, if you unknowingly compare different masterings and find one better than the other, the takeaway would be that the bitrate or codec is the reason for it, or even more naively, that "Amazon or Apple sounds better than the other". That would be completely wrong. You have to know what you're testing, and you have to test using proper methods. That's the only way you can ever hope to reach valid conclusions on which you can base future decisions.

Here's an example of a valid test. Start with your own lossless file. Convert it to 256 Kbps AAC and 320 Kbps MP3. Compare the two lossy files in foobar2000 using its ABX comparator over at least 10 trials. See how you did. (NB: This still doesn't tell you anything about Amazon vs iTunes files, because you still don't know what mastering you'll get. Looking at the bonus tracks and comparing to CD releases is the only way to guess that I know of, but then I don't buy lossy music enough to have investigated this.)

Perhaps I should clarify again, since you're still kind of missing my point. What you say is all true, but [1] one is meant to test multiple different songs, not just one, and if one gets (mostly) consistent results, the minutiae don't really matter that much, and [2] my recommendation is more specifically to see if one can discern any difference at all, not necessarily to determine which is better. If you unknowingly compare different masters and still can't tell a darn difference between the two, then it's really a total non-issue between that and the bitrates.

Of course, maybe someone can, in which case, they should go about the methodical testing you outlined. I'm not ashamed to admit that I can't though for the most part (though admittedly, the poor modern remastering is a much easier thing to tell, especially at higher volumes).

True, you're safer on that front the more your taste in music runs to the last decade or so. There's an awful lot of music before that, though, and most of it has been remastered at least once, often several times, with very different sonic qualities. You can't do a proper comparison of bitrates or codecs when the masterings represent an unknown.

I think the vast majority of music libraries of the vast majority of listeners will mostly be newer music. That's not to say a substantial number of people don't listen to older recordings, though, and on this point I completely agree. The loudness war is disgusting. A bit off topic, but besides the novelty of vinyl, I don't get the point of buying it for digitally mastered modern music.
 

gnasher729

Suspended
Nov 25, 2005
17,980
5,565
It does not make any sense to do this. There is no possibility of getting any benefit from converting to 48kHz using a 44.1kHz source. You can't go "up" in information density. The information is not there.

Here's a possibility: The "Sound Enhancer" feature in iTunes is known to play havoc with the sound it's supposed to enhance. It should always be turned off. Maybe it only works with 44.1kHz? In that case 48kHz recordings would sound audibly better.
 

Aragornii

macrumors 6502a
Original poster
Jun 25, 2010
512
139
Thought you all might be interested in the results of a test I ran.

I started with George Harrison's "What is Life" in Apple Lossless format. I converted one copy of it to 256kbps AAC using iTunes, and downloaded a 256kbps AAC from the iTunes Store using iTunes Match.

All versions sounded great, and straining to hear the slightest differences, I could not find any between the ALAC and 256 AAC versions. I could, however, spot a difference between the lossless and the iTunes Store version.

If you listen to the song, the hi-hat cymbal comes in when the verse starts (go to 0:28 here for what I'm referring to: http://www.youtube.com/watch?v=3XFfUt7HQWM). On the original lossless version and the 256kbps conversion, the hi-hat is loud and clear, and really stands out. On the iTunes Store version it is much less prominent and sounds a little muffled.

So, in my listening test, ALAC = 256 AAC > iTunes Store.

I think what that really means is that, for me, all else being equal, I can't distinguish 256 AAC from lossless, and that things like the original source are much more important to the final product.




 

Fox Fisher

macrumors member
Mar 19, 2013
31
11
AAC has some slight, case-specific advantages.

Beyond 192kbps, the difference is not distinguishable. However, AAC has some advantages because it's a newer format.

A) If you shop from the iTunes Store a lot, your library will be consistent: just one audio format across your whole library.

B) If you have any gapless albums, such as live concert recordings or DJ mix compilations, the transitions between songs will be seamless, since AAC supports gapless playback natively, whereas it was bolted onto MP3 later and its performance differs from encoder to encoder.

C) Smaller file size. Storage is cheap nowadays, but storage on portable devices is still limited. If you have a huge library, there will be a lot of space savings compared to 320kbps MP3.

The only downside of AAC may be compatibility, but I haven't encountered a device that does not recognize AAC yet. Even our 7-year-old car stereo recognizes it.
 

wtsitmn

macrumors newbie
Aug 25, 2012
17
13
Texass
"But no one needs such quality. Even Audiophiles can't tell the difference with 256 and above. 256 is enough. In fact, 128 should be."

The above statement is total fertilizer, although it's a common misconception. Assuming the audio equipment being used is of good quality, there can be quite a significant loss in fidelity. On the other hand, if you're comparing using today's typical consumer-quality audio stuff, you will indeed have trouble discerning much, if any, difference.

---

Out of curiosity, I went to a BestBuy with several digitally mastered CDs in hand (classical, jazz, and pipe organ) to compare in their private listening room. I told the guy to use the very best stuff they had, and to crank it up. Everything sounded pretty decent, except for the bass. When the guy boasted about their ridiculously expensive bass speaker(s), I was speechless. My 30 year old Infinitys blew theirs out of the water. I've since upgraded to even better used speakers (from guys whose female partners were making them dump their big old speakers for "less intrusively sized" ones because "they sounded about the same").

The same when playing lossy music, of course.

FYI, I'm no audiophile, and I'm not young. So if I can easily tell the difference, then anybody can!
 