I was discussing in another thread what the ATV outputs via optical. I have my ATV connected via optical to my Denon 3805. As with my other sources, this provides a digital input; the receiver decodes the signal and then you have stereo, DD, DTS, etc. My receiver shows a PCM digital input signal from the ATV. The question I have is: is the ATV itself decoding the MP3 and AAC files into stereo PCM? I'm pretty sure my receiver doesn't decode MP3, and I'm positive it doesn't decode AAC files, so what's going on?
Thanks for any insight you can provide.
The Apple TV converts the content on the fly from its existing format, be it MP3, AAC, or 24-bit AIFF, to two-channel (stereo) 16-bit, 44.1 kHz Linear PCM. It is useful to note that while PCM is not considered a compression scheme, it is still a mathematically limited representation of the analog waveform. 16-bit samples have a finite dynamic range: the quantizer has a finite set of quantization intervals, so amplitude values are subject to some degree of quantization error. And a 44.1 kHz sampling rate puts the Nyquist frequency at half the sampling rate, i.e. 22.05 kHz, so frequency response is subject to roll-off as the sampled frequency approaches that limit.
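To put rough numbers on that, here's a quick sketch (my own helper names, not anything from the ATV or receiver) showing the theoretical dynamic range of 16-bit LPCM and the size of the quantization error:

```python
import math

def dynamic_range_db(bits):
    # Theoretical dynamic range of an n-bit linear PCM quantizer:
    # 20 * log10(2^n), i.e. roughly 6.02 dB per bit.
    return 20 * math.log10(2 ** bits)

def quantize(sample, bits):
    # Round a sample in [-1.0, 1.0) to the nearest of 2^bits levels;
    # the difference from the input is the quantization error.
    levels = 2 ** (bits - 1)
    return round(sample * levels) / levels

dr = dynamic_range_db(16)      # ~96.3 dB for 16-bit samples
q = quantize(0.3337, 16)
err = abs(q - 0.3337)          # error is bounded by half a step, 1/2^16
```

So 16-bit gives you about 96 dB of dynamic range, and any individual sample can be off by at most half a quantization step.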
However, most modern D/A converters employ sample-and-hold buffers large enough to allow internal reclocking of the signal, and during recording and/or mastering a low-pass filter is applied at the Nyquist frequency (half the sampling rate) to avoid introducing frequencies that would induce aliasing during reconstruction. (It is a common myth that aliased frequencies are an inherent result of digital encoding; they are actually an artifact of faulty signal reconstruction, or of inadequate filtering before sampling.)
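Aliasing itself is easy to demonstrate with a little arithmetic (again, a sketch of my own, not any real DSP library): a tone above the Nyquist frequency folds back down into the audible band when sampled.

```python
def alias_frequency(f, fs):
    # Apparent frequency of a tone f after sampling at rate fs:
    # energy above the Nyquist frequency (fs/2) folds back ("aliases")
    # around multiples of fs.
    f_mod = f % fs
    return min(f_mod, fs - f_mod)

# A 30 kHz tone sampled at 44.1 kHz (Nyquist frequency 22.05 kHz)
# shows up as a spurious 14.1 kHz tone unless it is filtered out first.
folded = alias_frequency(30_000, 44_100)

# A 10 kHz tone is below Nyquist and passes through unchanged.
clean = alias_frequency(10_000, 44_100)
```

That's exactly why the low-pass filter goes in before sampling: once the 30 kHz tone has folded down to 14.1 kHz, no amount of filtering afterwards can tell it apart from a real 14.1 kHz tone.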
The bottom line is that provided you use a perceptually transparent encoding scheme, e.g. 128 kbps MPEG-4 AAC (determined by the AES to be perceptually indiscernible from 16-bit LPCM), you'll mitigate the potential for reconstruction errors and artifacts, and you should not be able to discern a difference between that playback and playback of the original 16-bit LPCM source from which the AAC was transcoded.
Yes. I know what hardcore audiophiles will say to that, and they're wrong.
P.S. With regard to lostless's reply: DTS is not more efficient than Dolby Digital. Theatrical DTS uses a 1.5 Mbps ADPCM bitstream, whereas theatrical Dolby Digital uses a 640 kbps bitstream coupled with perceptual algorithms and bandpass filtering to achieve the same or better result. Therefore, Dolby Digital is more efficient than DTS. DTS is regarded by some as capable of more accurate reproduction, but that claim has yet to be substantiated in double-blind testing.
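The efficiency point reduces to one division (I'm taking the commonly cited 1,536,000 bits/s as the exact figure behind "1.5 Mbps"; that's my assumption, not something stated above):

```python
# Nominal theatrical bitstream rates from the figures above.
dts_bps = 1_536_000   # "1.5 Mbps" DTS, ADPCM-based coding
dd_bps = 640_000      # 640 kbps Dolby Digital, perceptual coding

ratio = dts_bps / dd_bps   # DTS spends ~2.4x the bits for a comparable result
```

Spending 2.4x the bits to reach the same (or worse) perceived quality is the definition of lower coding efficiency.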