
View Full Version : What's best: 320 kbps mp3 or 256 kbps AAC from the iTunes store?




Aragornii
Jan 7, 2012, 11:22 PM
I plan to use the iTunes Match trick to upgrade all my low bit rate tunes to 256 kbps AAC.

What about tunes that are 256 kbps mp3 and above? Are the iTunes store tracks still better, given that a) AAC is better than mp3 and b) iTunes tracks are made from masters and not the CD?

I know that lossless tracks will be better, but are there any higher bit rate mp3s that are better quality than the 256 kbps AAC files from the iTunes store?



TMRaven
Jan 7, 2012, 11:23 PM
What equipment are you using to play back the songs?

miles01110
Jan 7, 2012, 11:27 PM
Err... you seem to be confused. AAC is not lossless.

Aragornii
Jan 7, 2012, 11:31 PM
What equipment are you using to play back the songs?

I'm using an Apple TV 2 hooked up to a Denon 4308 receiver and Bowers & Wilkins 683 speakers.

Peace
Jan 7, 2012, 11:32 PM
I don't think you will notice any difference.

Aragornii
Jan 7, 2012, 11:32 PM
Err... you seem to be confused. AAC is not lossless.

I know. My point is that I know lossless is better than 256 AAC, but is 256 AAC better than 320 mp3? Especially when the mp3 is ripped from a CD and the AAC is made by Apple straight from the master (higher sample rate) recording.

miles01110
Jan 7, 2012, 11:33 PM
My point is that I know lossless is better than 256 AAC, but is 256 AAC better than 320 mp3? Especially when the mp3 is ripped from a CD and the AAC is made by Apple straight from the master (higher sample rate) recording.

No.

Aragornii
Jan 7, 2012, 11:39 PM
I don't think you will notice any difference.

That is almost certainly true. In my personal tests using successively higher bit rates from the same CD, I could tell the difference up to about 192 kbps, but anything above that I couldn't distinguish. I guess I'm still interested in what is theoretically better, as there might be some subtle differences you don't pick up on without doing comparisons across a wide variety of music.

----------

No.

That's a useful opinion. I ripped about half my library at 320k before hard drives got bigger, and I did the rest lossless. My inclination is to just leave those alone, but if an easy opportunity presented itself I'd upgrade them to higher quality.

TMRaven
Jan 8, 2012, 12:23 AM
In all honesty it probably varies from song to song. The two different codecs treat different types of complexities within a file differently.

I think the best option would be to not worry about it and give yourself some peace of mind.

RolledUp20s
Jan 8, 2012, 01:31 AM
I ripped my fave CDs to Apple Lossless; off the top of my head it's something like 928 kbps, so 500 MB an album? ...up to 1 GB, which obviously is quite a HDD muncher. I use Bose AE2i headphones. Playing them on my iPhone 4S was great because it has no EU volume cap; since I bought the new nano, though, the fricking volume has a hidden cap, so I can't really tell at that level. I no longer store music on my iPhone, so I guess it all depends: if it's portable music you're after, keep 320 kbps... the files are roughly a third of the size, so you'll save more room.
On a good home sound system you will notice a difference at a decent volume level, though. ...but you don't seem to be going for Apple Lossless anyway? Just do the iTunes Match thing, you won't notice.

By the way, if you've 'downloaded' a file at, say, 198 kbps, it will stay at that sound quality... even if converted to Apple Lossless. That only works direct from a CD rip.

Julien
Jan 8, 2012, 06:15 AM
....and the AAC is made by Apple straight from the master (higher sample rate) recording.
We don't know this, and it's not likely that Apple has access to record company property (master tapes) or needs/wants this. iTunes' songs would probably be at a 48 kHz sampling rate if this were so. I bet the record companies send over CD-quality (44.1/16) files (maybe even just a Red Book CD) for Apple to use.

Aragornii
Jan 8, 2012, 12:11 PM
We don't know this, and it's not likely that Apple has access to record company property (master tapes) or needs/wants this. iTunes' songs would probably be at a 48 kHz sampling rate if this were so. I bet the record companies send over CD-quality (44.1/16) files (maybe even just a Red Book CD) for Apple to use.

Here's where I got that information.

http://arstechnica.com/apple/news/2011/02/itunes-may-upgrade-to-24-bit-files-but-why-bother.ars

Julien
Jan 8, 2012, 04:39 PM
Here's where I got that information.

http://arstechnica.com/apple/news/2011/02/itunes-may-upgrade-to-24-bit-files-but-why-bother.ars

I also want 48 kHz or 96 kHz 24-bit files offered by iTunes (I still buy CDs because I don't listen to (much) lossy music). However, the article clearly states (and it's 10 months old) that "...record labels are supposedly in discussions with Apple to begin offering 24-bit music files...". That says Apple doesn't have access to master tapes (and likely never will, since master tapes are proprietary and used for storage) or even 96/24 file copies of the master tapes at this time.

phrehdd
Jan 9, 2012, 12:35 AM
You are going to get all sorts of responses (as you have seen)... here's my two cents -

When possible, it's best to have "master" copies at the highest bitrate possible. Lossless is ideal, and given a choice between AAC 256 and MP3 320, I'd take AAC 256, as I can tell a difference on some files. On some files you won't hear much difference due to their range or the quality of the original.

All my CDs are converted to lossless. They play great, of course. If I want, I can make AAC 256 from them with just the direct compression "loss."

My downloads from iTunes remain at AAC 256, and from days gone by I have some MP3s ranging from 128 up to 320 kbps. I hope to replace those later.

The only advantage of MP3 is if you have multiple "players" that don't play AAC files. Again, having a lossless copy, you can then make MP3 versions for those players.

I'll be short on the iTunes store: I have downloaded very good copies of songs at 256, and then all the older material (stuff from decades ago) sounds horrible. However, the LP version of the same album and some CDs were evidently cut from different "masters" than what iTunes got. I often think the problem remains with the high-speed method of conversion. No one checks the quality, and not all "masters" (I should say original sources) are the same when it comes to transfers. So if you prefer more modern stuff, iTunes is pretty darn good. If you like, say, some Glenn Miller or Ethel Waters or... it's hit and miss.

Bottom line: MP3 = good, AAC 256 = superior, and lossless = can't be better.

Aragornii
Jan 9, 2012, 01:16 PM
phrehdd - that's my belief as well, that AAC is a superior format, so the quality is better than MP3 even at a lower bitrate. I've decided not to mess with my 320k MP3s in any case, just because it's not worth the hassle.

Instead of the next step up in sampling rate, I wish the industry would start delivering multi-channel music. Most of us are hooked up to home-theater-type systems now and can take advantage of it, and the difference in moving from stereo to 5.1 music would almost certainly be more noticeable than a higher sampling rate or bitrate.

iEvolution
Jan 9, 2012, 04:40 PM
Above 192 kbps there is little difference between the two unless you have some really high-quality headphones.

That being said, if you are only using iPods or an AAC-compatible media player, I would pick AAC on account of the bit of space you'd save. If the choice were between the two at the same bitrate, I'd pick MP3 for max compatibility.

At 128 kbps AAC is superior to MP3 no doubt.

lostless
Jan 9, 2012, 10:18 PM
Also, iTunes files are not just pure CBR 256 kbps. They are actually VBR and sometimes reach up to 320 kbps. I found this out when getting iTunes Match: my old CBR 256 MP3s were redownloaded as AAC, and the files were more than marginally bigger on some songs (256 kbps is 256 kbps, regardless of file format, plus or minus a small difference for format metadata). Also, my third-party player sees them as VBR files.
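If you want to check this yourself, here's a rough way to compare a file's nominal and actual bit rates (a sketch assuming the third-party mutagen library; the file name is a placeholder):

import os
from mutagen.mp4 import MP4

path = "Some Track.m4a"  # an iTunes Match 256 kbps AAC download
audio = MP4(path)

nominal_kbps = audio.info.bitrate / 1000  # rate reported by the header
# File size over duration; includes a little container/metadata overhead.
actual_kbps = os.path.getsize(path) * 8 / audio.info.length / 1000

print(f"nominal: {nominal_kbps:.0f} kbps, actual: {actual_kbps:.0f} kbps")
# On a VBR file the actual figure can sit noticeably above the nominal 256.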

peite
Jul 27, 2012, 12:10 PM
How about "Mastered for iTunes" - doesn't that cover a vast amount of music on the iTunes Store? When I listen to audio previews on the iTunes Store (or Spotify), a lot of it sounds "wider/clearer" compared to the same tracks from my own MP3s/AACs. Perhaps it's pure imagination, or could it be that music on the iTunes Store is ripped with superior gear, some "nice compression/EQ"?

marioman38
Jul 27, 2012, 02:30 PM
Theoretically the 320 MP3 will ALWAYS be better. But that is purely in theory. In the real world it would be entirely a case-by-case basis. Song A might sound better as the iTMS AAC copy at "only" 256kbps, while song B from a different album will actually sound "better" as 320 MP3. But "better" is subjective, and rarely can anyone tell the difference without $1000 audio components.

From FLAC to V0 MP3 I can tell subtle differences, but not nearly enough to warrant the extra disk space.

As you said before, anything above 192 will sound good. 128 on the other hand, I can almost always tell the difference.

sakau2007
Jul 27, 2012, 02:49 PM
It's all about the quality of the rip and the encoder.

If all conditions were "perfect" the 320kbps rip would sound better. But if you don't really know what you are doing, it is possible to end up with a 320kbps rip that sounds inferior to a well-done 192 or even 128 rip.

I'd also like to interject that I'd wager money that most people who claim they can tell the difference between a good 320 rip and a lossless FLAC copy or even the original source are full of it.

quasinormal
Jul 27, 2012, 04:55 PM
I'd also like to interject that I'd wager money that most people who claim they can tell the difference between a good 320 rip and a lossless FLAC copy or even the original source are full of it.

I'm full of it.

I can tell the difference between 192 AAC and Apple Lossless on my equipment: lossless > Apogee Duet FireWire DAC > Burson HA-160 headphone amp > Sennheiser HD-600, and similar-quality loudspeaker systems.

TyroneShoes2
Jul 27, 2012, 07:43 PM
Theoretically the 320 MP3 will ALWAYS be better. But that is purely in theory...

Is that a personal theory? If bit rate were the only variable, yes, 320 would be better. But there are other variables. When there are other variables, "ALWAYS" goes right out the window.

For example, is a 9 GB episode of Fringe better than a 7 GB episode of Fringe? Absolutely not if the first copy is MPEG-2 and the second copy is MPEG-4 AVC, even if from the same original. The critical variable here is that MPEG-4 is a better codec; it can either do a better job at the same bit rate, or an equivalent job at a lower bit rate.

It's the same with AAC vs. MP3. AAC is a better codec, and can also either do a better job at the same bit rate, or an equivalent job at a lower bit rate.

So the OP's question, which is better, MP3 at 320 or AAC at 256, is still an open question. 320 is absolutely not ALWAYS better.

With all else held equal, it was well accepted back when Apple started using AAC that a 128 kbps AAC was equivalent to a 160 kbps Fraunhofer MP3 (other versions of MP3 were inferior). If the math is linear, that would make a 256 AAC equivalent to a 320 MP3. But then development on MP3 has stagnated while there have been a number of improvements to AAC since Apple first started using it in 2003.

That would tend to support the theory that the 256 AAC Apple ripped from a CD would actually be better than the 320 MP3 that the OP ripped from a CD, assuming the AAC used Apple's most modern flavor of AAC. And you can probably bet that Apple is not just using the consumer encoder available in every free copy of iTunes to build their library, but top-shelf encoding hardware. That equals even better quality.

And if Apple really is getting the masters, which is their stated intent, that will be significantly better. One of the basic rules of encoding is that when you restrict to a severely low bit rate in a lossy format such as consumer AAC/MP3, what really makes a difference is the quality of the original. And that is exactly why they are doing it.

Problem is, I don't think that program is all that ramped up quite yet, and it will probably take a few years for them to really get it pervasively through their library (which is why although I intend to do iTunes Match, I am waiting).

A CD is 16-bit at 44.1 kHz. That's an equivalent uncompressed bit rate of 1,411.2 kbps (for stereo). Modern masters are 24 or even 32 bit at 192 kHz, which can give them an uncompressed bit rate of up to 12,288 kbps. This means more bits per sample, taken more often, up to a factor of about 8.7. It also means quantization steps orders of magnitude closer together. That means a whole lot fewer and smaller rounding errors during final encoding, which means a whole lot fewer audio artifacts when compressed at an equivalent 256 kbps bit rate.
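The arithmetic behind those figures, for anyone who wants to check it (a minimal sketch; stereo assumed throughout):

# Uncompressed PCM bit rate = bits per sample x samples per second x channels.
def pcm_kbps(bits_per_sample: int, sample_rate_hz: int, channels: int = 2) -> float:
    return bits_per_sample * sample_rate_hz * channels / 1000

print(pcm_kbps(16, 44_100))   # 1411.2 kbps: Red Book CD audio
print(pcm_kbps(32, 192_000))  # 12288.0 kbps: a 32-bit/192 kHz master
print(pcm_kbps(32, 192_000) / pcm_kbps(16, 44_100))  # about 8.7x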

We might actually get back all of the things we have given up for the portability of MP3s, including stereo imaging, separation, real crisp highs instead of swishy highs, and real solid bass instead of mushy bass or bass MIA. Maybe even some dynamic range.

TyroneShoes2
Jul 27, 2012, 08:49 PM
... if you don't really know what you are doing, it is possible to end up with a 320kbps rip that sounds inferior to a well-done 192 or even 128 rip.

The biggest secret to "knowing what you are doing" is mostly just getting the input encoding level correct. If you normalize the digital master so that the highest peak is at 0 dBFS, and even limit some of the highest peaks beforehand, you can then take advantage of every least-significant bit during encoding, especially if the original digitization was done at the optimum level. That minimizes the quantization noise floor and means you are utilizing more of the available bits in every sample more effectively, as well as creating the fewest rounding errors, resulting in the best-quality product possible.

And if you are using 16 bit, which implies a noise floor of -96 dBFS, another trick is to use a hard gate at -95 and simply remove any potential contribution from quantization noise in the original file (which only works if your system s/n is lower than that). That will give you another 2 to 3 dB of dynamic range above the final noise floor in the encode.

And if you are using a good encoder, it is then hard to screw things up quite that badly. But if you just manually ride levels at approximately -18 dBFS, it won't be all that great. It's a combination of knowing how and taking the care to do it right.
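As a rough illustration of those two steps (a sketch with numpy; the thresholds follow the post, and the input is assumed to be floating-point samples scaled to +/-1.0):

import numpy as np

def prepare_for_encode(samples: np.ndarray) -> np.ndarray:
    # Step 1: scale so the highest peak sits at full scale (0 dBFS).
    peak = np.max(np.abs(samples))
    if peak > 0:
        samples = samples / peak
    # Step 2: hard-gate just above the 16-bit noise floor (-96 dBFS),
    # zeroing anything quiet enough to be pure quantization noise.
    gate = 10 ** (-95 / 20)  # -95 dBFS as a linear amplitude
    return np.where(np.abs(samples) < gate, 0.0, samples)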

I'd also like to interject that I'd wager money that most people who claim they can tell the difference between a good 320 rip and a lossless FLAC copy or even the original source are full of it.

I really hope you are not going to cling to that. If you put either one of them into a quality stereo system, it is difficult to immediately declare "that's the rip!" or "that's the original". It's not even easy in a double-blind study where you play first one, then the other, and then ask which is which. But if you play them simultaneously and switch back and forth between them, it is abundantly clear which is the original, even to tin ears.

So you can give me access to your PayPal account whenever you are ready.;)

Retina Display
Jul 28, 2012, 12:52 AM
FLAC/ALAC/AIFF will always be the best.

But no one needs such quality. Even audiophiles can't tell the difference at 256 and above. 256 is enough. In fact, 128 should be.

SDAVE
Jul 28, 2012, 01:13 AM
My solution throughout the years:

1. Rip in FLAC or Apple Lossless (usually this since my move to Apple)
2. Convert from Lossless to 224Kbps VBR.
3. Keep both copies, the MP3s for portable use and lossless for higher end equipment.

Hard drive space is so cheap nowadays that you can't complain about lossless taking up space.

Also, the nice thing about having a lossless version is that compressed formats change over time, and you can always go back and just do a batch convert to a new format (MP4 will be the new standard soon, but MP3 is still on its two toes for now).

gnasher729
Jul 28, 2012, 08:01 AM
I plan to use the iTunes Match trick to upgrade all my low bit rate tunes to 256 kbps AAC.

What about tunes that are 256 kbps mp3 and above? Are the iTunes store tracks still better, given that a) AAC is better than mp3 and b) iTunes tracks are made from masters and not the CD?

I know that lossless tracks will be better, but are there any higher bit rate mp3s that are better quality than the 256 kbps AAC files from the iTunes store?

256 KBit AAC will be better quality than 320 KBit mp3. And it is possible that the iTunes Store has better originals than you had. There's also the problem that the encoder that was used plays a role - 320 mp3 encoded with a rubbish encoder isn't as good as 320 mp3 encoded with a very good encoder. Worst case, someone might have recorded an LP with 128 KBit mp3, then converted it to 320 KBit.

Aragornii
Jul 29, 2012, 02:01 AM
That would tend to support the theory that the 256 AAC Apple ripped from a CD would actually be better than the 320 MP3 that the OP ripped from a CD, assuming the AAC used Apple's most modern flavor of AAC. And you can probably bet that Apple is not just using the consumer encoder available in every free copy of iTunes to build their library, but top-shelf encoding hardware. That equals even better quality.


That's my thought as well. I ripped about half my CD collection using iTunes at 320 kbps MP3 before ripping the rest using Apple Lossless. I have no intention of re-ripping half of my entire CD collection, but I think I will take advantage of the fact that I can use iTunes Match to replace the MP3 tracks with 256 kbps AAC from the Apple store.

Lvivske
Jan 30, 2013, 05:57 PM
I really hope you are not going to cling to that. If you put either one of them into a quality stereo system, it is difficult to immediately declare "that's the rip!" or "that's the original". It's not even easy in a double-blind study where you play first one, then the other, and then ask which is which. But if you play them simultaneously and switch back and forth between them, it is abundantly clear which is the original, even to tin ears.

So you can give me access to your PayPal account whenever you are ready.;)

You're so full of **** it's unbelievable.

nitromac
Jan 30, 2013, 06:29 PM
You're so full of **** it's unbelievable.

No, he isn't.

There is a difference between 320kbps MP3 and WAV/CD/FLAC or what have you. You have to A/B to hear it and it's very subtle but it's there. Mainly in the high freq range (drum cymbals and such) where the CD is a bit louder/clearer in those frequencies.

Now I'm not saying 320kbps MP3 is therefore inferior and awful. All my music on my Classic is in 320kbps MP3, ripped to WAV from CDs and converted in dBpoweramp. I listen to WAV whenever I have the chance, but honestly, unless you are specifically listening for a difference, you won't really hear one.

Now, between 192kbps and 320kbps there is a pretty clear difference. Highs are audibly compressed and muddy-sounding; everything sounds more packed together.

Non-audiophiles probably won't notice a difference unless shown an A/B, and as for the original argument, I don't think anyone will hear a difference between 256kbps AAC and 320kbps MP3. At least I haven't.

buklau
Jan 30, 2013, 06:51 PM
Assuming they're both from good quality sources, you almost certainly won't hear a difference with casual listening.

Hell, people were content with CBR 128kbps for a very long time, and for good reason: proper audio compression is very good and highly transparent for the majority of listeners.

(People are reluctant to give an answer because at those bit rates it doesn't really matter what codec you're using, it matters much more when you get to lower rates, like sub 96kbps).

gnasher729
Jan 30, 2013, 07:41 PM
We don't know this, and it's not likely that Apple has access to record company property (master tapes) or needs/wants this. iTunes' songs would probably be at a 48 kHz sampling rate if this were so. I bet the record companies send over CD-quality (44.1/16) files (maybe even just a Red Book CD) for Apple to use.

There's the "Mastered for iTunes" program - check it out on the iTunes Store.

Lots of recordings are actually converted from 192,000 samples per second / 24 bit recordings, which improves the quality. And Apple provides the record companies with tools that make sure there is no clipping in the whole encoder / decoder chain.

Mr. Retrofire
Jan 30, 2013, 08:28 PM
I don't think you will notice any difference.
Yeah. iThink the sample rate (44.1 kHz vs. 48 kHz) is much more important. I always use 48 kHz instead of 44.1 kHz in the iTunes AAC or MP3 encoder, and the results sound great.

See also:
http://en.wikipedia.org/wiki/Compact_Disc#44.1_kHz_sample_rate
(Nyquist–Shannon sampling theorem)

phrehdd
Jan 30, 2013, 10:12 PM
I'm using an Apple TV 2 hooked up to a Denon 4308 receiver and Bowers & Wilkins 683 speakers.

Nice system you have there. Here are some thoughts, though not answers per se -

In general I can tell the difference between MP3 and AAC files, both at 256. A well-done MP3 file at 320 can be extremely good as well.

I have a particular LP that I later got as a CD and also downloaded it from iTunes. There is a definite difference in how the 256 AAC file sounds compared to the CD. The LP to me sounds best but the CD is pretty darn close.

My problem with iTunes is that we don't know what the "masters" are. This is not much different from when we get a DVD whose transfer just sucks beyond belief versus those that are done from a good master with some craft. My point is that not everything on iTunes is great, but some of it is very good and worthy.

If you want to do matching from iTunes, be sure you know what you are getting as best you can. If you already downloaded a song from an album and like the quality, then chances are the rest of the album via Match would be to your liking. The biggest loser downloads are any recordings that are vintage by nature. Newer recordings tend to sound very good from iTunes.

I did a test where I played Carole King's "Tapestry" album via LP, then CD, and later my friend's 256 AAC files. Last, I downloaded the higher-res FLAC version from HDtracks. Here is the order of preference that we liked -
Best - FLAC 96/24
LP
CD
AAC

Later, we made an Apple Lossless file from the CD, and it was just slightly better sounding than the AAC 256. One had to listen carefully. Remember, this is all subjective and limited to the equipment.

My equipment included a Marantz AVR, Oppo 103, NAS storage, Mac mini, Audacity software, Pioneer turntable (upgraded), garage-custom turntable, a vintage but spectacular Dynaco preamp for both turntables, Sennheiser cans, and older but moderately faithful Energy speakers. Between all this we have what might be a music hobbyist's setup, but certainly not audiophile level. If we can tell on this system, chances are you might too.

Gizmo22
Jan 30, 2013, 10:36 PM
I recommend opting to rip all CDs 8-12 times at various bitrates and codecs so when the popular culture tastes change on "what's best" from month to month, you can keep up with everyone else online as they bicker back and forth on preferences.

Parkin Pig
Jan 31, 2013, 06:48 AM
The MP3 codec uses linear compression over the whole track, which means that portions of the track with more detail will lose more of that detail, whereas portions with less detail will sound less compressed.
AAC is a variable compression codec, which uses less compression in areas of more detail (preserving that detail), and more compression in quieter areas (where detail can be preserved even after compression).

Therefore, the quality of the MP3 on average will be poorer than the AAC.

Obviously the quality of the source track needs to be considered too, as does the playback hardware, the acoustics of the venue, and the aural health of the listener.

All things being equal except for the codec, I would plump for the AAC option.

gnasher729
Jan 31, 2013, 08:20 AM
The MP3 codec uses linear compression over the whole track, which means that portions of the track with more detail will lose more of that detail, whereas portions with less detail will sound less compressed.

Actually, that's totally dependent on what the encoder does. In an mp3 file, the sound is split into little blocks of maybe 20 milliseconds, and each of these blocks can be compressed at a different bit rate.

Same with AAC; AAC can use fixed bit rate or variable bit rate as well, depending on the encoder and what settings you use for the encoder.

A difference that can lead to confusion is that AAC and mp3 interpret "bit rate" differently: a 320 KBit/sec VBR mp3 uses 320 KBit/sec on average; some blocks are smaller, some are larger. A 256 KBit/sec VBR AAC uses _at least_ 256 KBit/sec and never less; blocks can be larger, so files are larger than they should be.

So mp3 being lower quality at the same or slightly larger file size is because the codec is much older and less advanced; it has nothing to do with constant or variable bit rate.
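For concreteness, this is one way to produce the two files the thread keeps comparing from the same lossless source (a sketch shelling out to ffmpeg, assuming it is installed; file names are placeholders):

import subprocess

SRC = "track.wav"  # a lossless rip

# 320 kbps constant-bit-rate MP3 via the LAME encoder.
subprocess.run(["ffmpeg", "-i", SRC, "-c:a", "libmp3lame", "-b:a", "320k",
                "track_320.mp3"], check=True)

# 256 kbps AAC; the encoder still varies block sizes under the hood,
# as described above, while targeting the requested rate.
subprocess.run(["ffmpeg", "-i", SRC, "-c:a", "aac", "-b:a", "256k",
                "track_256.m4a"], check=True)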


Yeah. iThink the sample rate (44.1 kHz vs. 48 kHz) is much more important. I always use 48 kHz instead of 44.1 kHz in the iTunes AAC or MP3 encoder, and the results sound great.

See also:
http://en.wikipedia.org/wiki/Compact_Disc#44.1_kHz_sample_rate
(Nyquist–Shannon sampling theorem)

That seems strange, since your data source is most likely a CD with a 44.1 kHz sample rate, so the very first step in your encoding process would be converting the sample rate, which would introduce errors from the outset.

Mr. Retrofire
Jan 31, 2013, 06:18 PM
The MP3 codec uses linear compression over the whole track, which means that portions of the track with more detail will lose more of that detail, whereas portions with less detail will sound less compressed.
Not if you use the VBR and/or Joint Stereo encoding options.

----------

That seems strange, since your data source is most likely a CD with a 44.1 kHz sample rate, so the very first step in your encoding process would be converting the sample rate, which would introduce errors from the outset.
Yeah, I know. You can test it with iTunes. And sample rate converters use floating-point numbers, which means the errors are very small.

MacCruiskeen
Feb 1, 2013, 08:40 AM
It seems to me that the OP could easily test for himself on his own system. All he has to do is round up some uncompressed source originals he is familiar with, rip them into each format, play them, and then decide which is satisfactory to him. It would probably take less time than reading this thread.

ybz90
Feb 1, 2013, 09:52 AM
There is a lot of "audiophile" misinformation in this thread. Very few individuals can A/B accurately above 256kbps in any sort of compression. Certainly some have better ears than others, but that's the generally accepted threshold above which lossy compression becomes difficult or impossible for most people to distinguish from the original.

A bigger issue is the OP's equipment and whether it's even good enough. Fortunately, it seems he has some pretty great entry-level speakers (depending on how you look at it), and they are definitely good enough to tell the difference between lower and higher bitrates. In practice, I am doubtful the OP will be able to distinguish between 320 MP3 and 256 AAC, but the simplest solution would be just to A/B it to see if it makes a difference. On a technical basis, AAC might be the better choice, if marginally, and again, I have my doubts as to whether there will be any real-world hearable difference for the OP.

On principle, I use ALAC for all of my recordings, even though I know that lower-quality rips make a negligibly perceptible (if any) difference to my ears, but if I can only get something on iTunes or in 320 MP3, I'm not even remotely beat up about it. Let's be honest here: when you're talking at these levels of compression and fidelity, regardless of your equipment, the minute compression artifacts and errors are of such a low order of magnitude that they will not reduce your listening enjoyment at all. Again, I'm talking about that "threshold"; lower bitrates are easily discernible to even the untrained ear and, for me, unlistenable. Part of why I hate Spotify and Rdio.

tl;dr: OP, try out both and see if you can tell a difference. If you can't, and chances are by limitations of biology and probability, you can't, then don't worry about it. If you're completely bonkers like me, then just get everything in FLAC/ALAC.

elistan
Feb 1, 2013, 02:18 PM
I also want 48 kHz or 96 kHz 24-bit files offered by iTunes (I still buy CDs because I don't listen to (much) lossy music). However, the article clearly states (and it's 10 months old) that "...record labels are supposedly in discussions with Apple to begin offering 24-bit music files...". That says Apple doesn't have access to master tapes (and likely never will, since master tapes are proprietary and used for storage) or even 96/24 file copies of the master tapes at this time.

Just because Apple doesn't offer 24/96 to us customers doesn't mean they don't get those as masters from record labels.

From http://images.apple.com/itunes/mastered-for-itunes/docs/mastered_for_itunes.pdf:

Provide High Resolution Masters

To take best advantage of our latest encoders send us the highest resolution master file
possible, appropriate to the medium and the project.

An ideal master will have 24-bit 96kHz resolution. These files contain more detail from
which our encoders can create more accurate encodes. However, any resolution above
16-bit 44.1kHz, including sample rates of 48kHz, 88.2kHz, 96kHz, and 192kHz, will benefit
from our encoding process.

(Not that this has anything to do with 320 MP3 vs 256 AAC. ;) )

jon3543
Feb 1, 2013, 02:54 PM
I am doubtful the OP will be able to distinguish between 320 MP3 and 256 AAC, but the simplest solution would be just to A/B it to see if it makes a difference.

To properly A/B it, the files need to be sourced from the same lossless file and volume-matched, and the test needs to be blind, as it is with the foobar2000 ABX comparator add-in. If the files came from different sources, e.g. Amazon vs. Apple, they could be from different masterings, which can sound wildly different even when comparing lossless versions, and just about all recordings have been the subject of one or more remastering efforts. When deliberately comparing different masterings, you need to use ReplayGain or another volume-matching system to do a valid comparison; foobar2000 makes that easy to do.

Different masterings of the same recording can sound different due to things like dynamic range compression and volume boosting (i.e. the loudness wars) and often different EQ (e.g. the boosting of high frequencies is common), all of which conspire to make remasters sound bad more often than not. Bad remasters may sound OK for background listening at low volume, or perhaps in a car or other noisy environment, but they suffer badly when you crank them up on good equipment in a decent listening environment, where they sound shrill and fatiguing over time.

Sadly, when you buy music online as MP3s or M4A files, typically there is no indication of the mastering, which is usually specified in CD liner notes.
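Outside foobar2000, the volume matching can be approximated like this (a sketch assuming the third-party soundfile and pyloudnorm packages; file names and the target level are placeholders):

import soundfile as sf
import pyloudnorm as pyln

def match_loudness(in_path: str, out_path: str, target_lufs: float = -18.0) -> None:
    # Measure integrated loudness (ITU-R BS.1770) and rescale to the target.
    data, rate = sf.read(in_path)
    loudness = pyln.Meter(rate).integrated_loudness(data)
    sf.write(out_path, pyln.normalize.loudness(data, loudness, target_lufs), rate)

# Bring both versions to the same level so loudness can't bias the comparison.
match_loudness("version_a.wav", "matched_a.wav")
match_loudness("version_b.wav", "matched_b.wav")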

On a technical basis, AAC might be the better choice, if marginally, and again, I have my doubts as to whether there will be any real-world hearable difference for the OP.

AAC is better than MP3 at low bitrates but probably not at the high bitrates that are the subject of this thread. On various killer samples, I find even 192 Kbps LAME MP3 non-transparent. AAC achieves transparency at 128 Kbps. That's what I have on my iPod thanks to iTunes transcoding all my lossless stuff when I sync.

On principle, I use ALAC for all of my recordings

It is smart to have master copies in a lossless format. You can encode them to any lossy format and bitrate without worrying about distortions due to transcoding one lossy format/bitrate to another. And of course, it doesn't hurt to play lossless music on your PC even if you can't tell the difference between it and lossy. Hard drive space is cheap.

ybz90
Feb 1, 2013, 03:08 PM
To properly A/B it, the files need to be sourced from the same lossless file and volume-matched, and the test needs to be blind, as it is with the foobar2000 ABX comparator add-in. If the files came from different sources, e.g. Amazon vs. Apple, they could be from different masterings, which can sound wildly different even when comparing lossless versions, and just about all recordings have been the subject of one or more remastering efforts. When deliberately comparing different masterings, you need to use ReplayGain or another volume-matching system to do a valid comparison; foobar2000 makes that easy to do.

Different masterings of the same recording can sound different due to things like dynamic range compression and volume boosting (i.e. the loudness wars) and often different EQ (e.g. the boosting of high frequencies is common), all of which conspire to make remasters sound bad more often than not. Bad remasters may sound OK for background listening at low volume, or perhaps in a car or other noisy environment, but they suffer badly when you crank them up on good equipment in a decent listening environment, where they sound shrill and fatiguing over time.

Sadly, when you buy music online as MP3s or M4A files, typically there is no indication of the mastering, which is usually specified in CD liner notes.



AAC is better than MP3 at low bitrates but probably not at the high bitrates that are the subject of this thread. On various killer samples, I find even 192 Kbps LAME MP3 non-transparent. AAC achieves transparency at 128 Kbps. That's what I have on my iPod thanks to iTunes transcoding all my lossless stuff when I sync.



It is smart to have master copies in a lossless format. You can encode them to any lossy format and bitrate without worrying about distortions due to transcoding one lossy format/bitrate to another. And of course, it doesn't hurt to play lossless music on your PC even if you can't tell the difference between it and lossy. Hard drive space is cheap.

The point of it being double blind is obvious, otherwise, what's the point of A/B if you are allowing bias?

But you're totally missing my point. If you A/B any files, of any origin and source, and you can't hear a difference, then that's it. You don't have to "properly" do anything (volume matching is a good point, but also moot in a situation where...) if you cannot personally detect any audible difference anyway. So in the OP's case, just take a listen to a bunch of tracks and see if you can tell. And if you can't, don't worry about it.

As for masters, that's usually not an issue for new music. There is generally only one master, and it's probably rubbish and very loud. I can't speak to the whole Mastered for iTunes program, but again, it's quite possible (likely?) that a listener wouldn't be able to tell between that and a different source anyway, and given that, this is really a non-issue if you can't tell the difference. That's the operative phrase I keep using, as it applies to the vast majority of people at these bitrates.

Like I said, I personally listen to lossless anyway because I like to and it makes me feel happy. But I know I can't hear a difference above a certain bitrate, and neither can most people.

jon3543
Feb 1, 2013, 03:36 PM
The point of it being double blind is obvious, otherwise, what's the point of A/B if you are allowing bias?

The point is many people don't understand this, and it always bears mentioning.

But you're totally missing my point. If you A/B any files, of any origin and source, and you can't hear a difference, then that's it. You don't have to "properly" do anything (volume matching is a good point, but also moot in a situation where...) if you cannot personally detect any audible difference anyway. So in the OP's case, just take a listen to a bunch of tracks and see if you can tell. And if you can't, don't worry about it.

While ignorance is indeed bliss, you're still not getting it. Getting back to the subject of this thread, 256 Kbps AAC vs 320 MP3, if you unknowingly compare different masterings and find one better than the other, the takeaway would be that the bitrate or codec is the reason for it, or even more naively, that "Amazon or Apple sounds better than the other". That would be completely wrong. You have to know what you're testing, and you have to test using proper methods. That's the only way you can ever hope to reach valid conclusions on which you can base future decisions.

Here's an example of a valid test. Start with your own lossless file. Convert it to 256 Kbps AAC and 320 Kbps MP3. Compare the two lossy files in foobar2000 using its ABX comparator over at least 10 trials. See how you did. (NB: This still doesn't tell you anything about Amazon vs iTunes files, because you still don't know what mastering you'll get. Looking at the bonus tracks and comparing to CD releases is the only way to guess that I know of, but then I don't buy lossy music enough to have investigated this.)
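On the "at least 10 trials" point: if you can't actually hear a difference, each trial is a coin flip, so it's easy to compute how unlikely a given score is (a minimal sketch using only the standard library):

from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    # Chance of scoring `correct` or better out of `trials` by pure guessing.
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(abx_p_value(9, 10))  # ~0.011: 9/10 is strong evidence of an audible difference
print(abx_p_value(6, 10))  # ~0.377: 6/10 is indistinguishable from guessing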

As for masters, that's usually not an issue for new music.

True, you're safer on that front the more your taste in music lies in the last decade or so. There's an awful lot of music before that, though, and most of it has been remastered at least once, often several times, and often with very different sonic qualities. You can't do a proper comparison of bitrates or codecs when the masterings represent an unknown.

TinHead88
Feb 1, 2013, 08:05 PM
Not if you use the VBR and/or Joint Stereo encoding options.

----------


Yeah, I know. You can test it with iTunes. And sample rate converters use floating-point numbers, which means the errors are very small.

It does not make any sense to do this. There is no possibility of getting any benefit from converting to 48kHz from a 44.1kHz source. You can't go "up" in information density. The information is not there: a 44.1kHz source captures nothing above its 22.05kHz Nyquist limit, and resampling cannot put it back.

ybz90
Feb 1, 2013, 10:18 PM
The point is many people don't understand this, and it always bears mentioning.
This is a fair point.


While ignorance is indeed bliss, you're still not getting it. Getting back to the subject of this thread, 256 Kbps AAC vs 320 MP3, if you unknowingly compare different masterings and find one better than the other, the takeaway would be that the bitrate or codec is the reason for it, or even more naively, that "Amazon or Apple sounds better than the other". That would be completely wrong. You have to know what you're testing, and you have to test using proper methods. That's the only way you can ever hope to reach valid conclusions on which you can base future decisions.

Here's an example of a valid test. Start with your own lossless file. Convert it to 256 Kbps AAC and 320 Kbps MP3. Compare the two lossy files in foobar2000 using its ABX comparator over at least 10 trials. See how you did. (NB: This still doesn't tell you anything about Amazon vs iTunes files, because you still don't know what mastering you'll get. Looking at the bonus tracks and comparing to CD releases is the only way to guess that I know of, but then I don't buy lossy music enough to have investigated this.)

Perhaps I should clarify again, since you're still kind of missing my point. What you say is all true, but [1] one is meant to test multiple different songs, not just one, and if one gets (mostly) consistent results, the minutiae aren't really that important, and [2] my recommendation is more specifically to see if one can discern any difference at all, not necessarily to determine which is better. If you unknowingly compare different masters and still can't tell a darn difference between the two, then the distinction between masterings and bitrates is really a non-issue.

Of course, maybe someone can, in which case they should go about the methodical testing you outlined. I'm not ashamed to admit that for the most part I can't (though admittedly, poor modern remastering is a much easier thing to tell, especially at higher volumes).


True, you're safer on that front the more your taste in music lies in the last decade or so. There's an awful lot of music before that, though, and most of it has been remastered at least once, often several times, and often with very different sonic qualities. You can't do a proper comparison of bitrates or codecs when the masterings represent an unknown.

I think the vast majority of music libraries of the vast majority of listeners will be mostly newer music. That's not to say a substantial number of people don't listen to older recordings, and on this point I completely agree. The loudness war is disgusting. A bit off topic, but beyond the novelty of vinyl, I don't get the point of buying it for digitally mastered modern music.

gnasher729
Feb 2, 2013, 02:52 AM
It does not make any sense to do this. There is no possibility of getting any benefit from converting to 48kHz from a 44.1kHz source. You can't go "up" in information density. The information is not there: a 44.1kHz source captures nothing above its 22.05kHz Nyquist limit, and resampling cannot put it back.

Here's a possibility: The "Sound Enhancer" feature in iTunes is known to play havoc with the sound it's supposed to enhance. It should always be turned off. Maybe it only works with 44.1kHz? In that case 48kHz recordings would sound audibly better.

Aragornii
Feb 3, 2013, 02:58 PM
Thought you all might be interested in the results of a test I ran.

I started with George Harrison's "What is Life" in Apple Lossless format. I converted one copy of it to 256kbps AAC using iTunes, and downloaded a 256kbps AAC from the iTunes Store using iTunes Match.

All versions sounded great, and straining to hear the slightest differences, I could not find any between the ALAC and 256 AAC versions. I could, however, spot a difference between the lossless version and the iTunes Store version.

If you listen to the song, the hi-hat comes in when the verse starts (go to 0:28 here for what I'm referring to: http://www.youtube.com/watch?v=3XFfUt7HQWM). On the original lossless version and the 256kbps conversion, the hi-hat is loud and clear and really stands out. On the iTunes Store version it is much less prominent and sounds a little muffled.

So, in my listening test, ALAC = 256 AAC > iTunes Store.

I think what that really means is that, for me, all else being equal, I can't distinguish 256 AAC from lossless, and that things like the original source are much more important to the final product.





Fox Fisher
Apr 2, 2013, 01:16 PM
Beyond 192kbps, the difference is not distinguishable. However, AAC has some advantages because it's a newer format.

A-) If you shop from the iTunes Store a lot, your library will stay consistent: just one sound format across your whole library.

B-) If you have any gapless albums, such as live concert recordings or DJ mix compilations, the transitions between songs will be seamless, since AAC supports gapless playback natively, whereas it was bolted onto MP3 later and performance differs from encoder to encoder.

C-) Smaller file size. Storage is cheap nowadays, but storage on portable devices is still limited. If you have a huge library, there will be a lot of space savings compared to 320kbps MP3 (rough arithmetic below).
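The rough arithmetic behind that (a minimal sketch):

# Approximate audio payload per minute of music at a given bit rate.
def mb_per_minute(kbps: int) -> float:
    return kbps * 60 / 8 / 1000  # kilobits per second -> megabytes per minute

print(mb_per_minute(320))  # 2.4 MB/min for 320 kbps MP3
print(mb_per_minute(256))  # 1.92 MB/min for 256 kbps AAC: a 20% saving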

The only downside of AAC may be compatibility, but I haven't encountered a device that doesn't recognize AAC yet. Even our seven-year-old car stereo recognizes it.