I decided to check out the new VBR option in iTunes 5, along with the Apple Lossless codec, which I hadn't used at all yet. I encoded the same song (Skellig by Loreena McKennitt) four ways: Apple Lossless, AAC 96kbps VBR, AAC 128kbps CBR (the iTMS format), and AAC 256kbps CBR.
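If you'd rather script the encodings than click through iTunes, here's a rough sketch using OS X's afconvert command-line tool, driven from Python so the whole batch runs at once. The file names are hypothetical, and the "-s" strategy values (0 for CBR, 3 for VBR) are my recollection of afconvert's options, so double-check against afconvert -h before trusting it:

    # Batch-encode one source file into the four test formats via afconvert.
    # Assumptions: "skellig.aiff" is the ripped source, and -s 0 / -s 3
    # select CBR / VBR respectively (verify with `afconvert -h`).
    import subprocess

    SOURCE = "skellig.aiff"

    jobs = [
        ("skellig_lossless.m4a", ["-d", "alac"]),                           # Apple Lossless
        ("skellig_96_vbr.m4a",   ["-d", "aac", "-b", "96000",  "-s", "3"]), # 96kbps AAC VBR
        ("skellig_128_cbr.m4a",  ["-d", "aac", "-b", "128000", "-s", "0"]), # 128kbps AAC CBR
        ("skellig_256_cbr.m4a",  ["-d", "aac", "-b", "256000", "-s", "0"]), # 256kbps AAC CBR
    ]

    for out_name, args in jobs:
        subprocess.run(["afconvert", "-f", "m4af", *args, SOURCE, out_name], check=True)
        print("wrote", out_name)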
I then loaded them onto my 1G iPod mini in a playlist, went into a fairly quiet room, and listened to them all with the standard iPod headphones (a fairly new pair; they came with a shuffle, not with the iPod I was testing on). I rated each track's apparent audio quality with a star rating. After syncing the iPod back to my computer, I compared the rating I'd given each track against its bitrate.
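A simple way to keep this kind of test honest is to hide which encoding is which until after everything is rated. Here's a minimal sketch of that idea, reusing the hypothetical file names from the encoding script above; you'd still play the anonymized tracks yourself and type in the star ratings:

    # Blind rating session: shuffle the tracks, collect 1-5 star ratings
    # by anonymous position only, then reveal the mapping at the end.
    import random

    tracks = [
        "skellig_lossless.m4a",
        "skellig_96_vbr.m4a",
        "skellig_128_cbr.m4a",
        "skellig_256_cbr.m4a",
    ]
    random.shuffle(tracks)

    ratings = {}
    for i, track in enumerate(tracks, start=1):
        # Play "Track i" from a renamed playlist so the encoding stays hidden.
        stars = int(input("Stars (1-5) for Track %d: " % i))
        ratings[track] = stars

    # Reveal the blind mapping, best-rated first.
    for track, stars in sorted(ratings.items(), key=lambda kv: -kv[1]):
        print("%d stars: %s" % (stars, track))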
My results ended up putting them in this order, from highest to lowest:
256kbps AAC CBR
96kbps AAC VBR
128kbps AAC CBR
Apple Lossless
Given that the 96kbps VBR file outranked the lossless original, this leads me to believe that I can't hear the difference between 96kbps VBR AAC and the source media. I wonder how common that is? Does anyone else have any anecdotal evidence of how low a bitrate they find indistinguishable from the original?