
View Full Version : Are music CDs compressed?




MikaelSmoot
Nov 28, 2009, 05:49 PM
I know CDs use the Compact Disc Digital Audio standard, but are music CDs really lossless, or are they compressed like DVDs?



J the Ninja
Nov 28, 2009, 05:51 PM
DVDs use lossy compression (MPEG-2); Blu-rays also use lossy compression (typically MPEG-4). CDs are not compressed; they are raw 44.1/16 LPCM.

MikaelSmoot
Nov 28, 2009, 05:57 PM
Does DVD MPEG-2 toss out more info than Blu-ray MPEG-4? I hear that MPEG-4 is a better, more efficient codec.

Teej guy
Nov 28, 2009, 08:39 PM
However, the audio on Blu-ray is losslessly compressed. When decoded, it is identical to the original PCM data. Put bluntly, DTS-HD Master Audio and Dolby TrueHD are basically like zip files... nothing is lost (unlike the lossy Dolby and DTS codecs used on DVDs).

Also Mikael, for a more detailed answer to your question, see my post in your other thread: http://forums.macrumors.com/showpost.php?p=8883837&postcount=4
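To make the zip-file analogy concrete, here is a minimal sketch in Python, using the standard zlib module as a stand-in (the real DTS-HD MA and TrueHD codecs are audio-specific and work differently internally, but the round-trip property is the same): lossless compression shrinks the data, and decompression restores it bit-for-bit.

```python
import zlib

# Stand-in for a lossless audio codec: deflate, the algorithm behind zip.
# Fake "PCM" bytes -- roughly one second of 16-bit stereo CD audio by size.
pcm = bytes(range(256)) * 689  # ~176 KB

packed = zlib.compress(pcm, level=9)   # "encode" losslessly
restored = zlib.decompress(packed)     # "decode"

print(len(packed) < len(pcm))   # True: the stream got smaller
print(restored == pcm)          # True: every bit of the original survives
```

A lossy codec, by contrast, would shrink the data much more but could never pass that second check.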

applesupergeek
Nov 30, 2009, 08:17 PM
DVDs use lossy compression (MPEG-2), Blu-Ray's also use lossy compression (typically MPEG-4). CDs are not compressed, they are raw 44.1/16 LPCM.

I think this is inaccurate. Most films are shot on, well, film, so you can't really say the way they are encoded onto other media is "lossy"; it's just a different format, a digital one vs. an analog one. Lossy vs. non-lossy compression doesn't apply here.

Also, there is a lot of HD camera content that is itself in MPEG-2 or 4, so when it's transferred to DVD or Blu-ray you can't say it's a lossy process, since nothing is lost from the original, which is itself MPEG-2 or 4.

Does DVD MPEG-2 toss out more info than Blu-ray MPEG-4? I hear that MPEG-4 is a better, more efficient codec.

It does in a sense, though it's more accurate to say MPEG-2 uses the available information less efficiently.

Teej guy
Nov 30, 2009, 08:24 PM
I think this is inaccurate. Most films are shot on, well, film, so you can't really say the way they are encoded onto other media is "lossy"; it's just a different format, a digital one vs. an analog one. Lossy vs. non-lossy compression doesn't apply here.

No, it's accurate. MPEG-2 and MPEG-4 are lossy. Raw, lossless, exactly-what-the-sensor-captured HD footage is HUGE. MPEG-4 can contain lots of different formats, but as far as I know they're all lossy.

Also, there is a lot of HD camera content that is itself in MPEG-2 or 4, so when it's transferred to DVD or Blu-ray you can't say it's a lossy process, since nothing is lost from the original, which is itself MPEG-2 or 4.

I believe it's only consumer-level cameras that shoot "HD" footage in MPEG-2 or 4. I believe when writing to DVD/Blu-ray there's another encoding pass, so you're actually piling lossy upon lossy compression if you're starting with MPEG footage.

applesupergeek
Nov 30, 2009, 08:33 PM
No, it's accurate. MPEG-2 and MPEG-4 are lossy. Raw, lossless, exactly-what-the-sensor-captured HD footage is HUGE. MPEG-4 can contain lots of different formats, but as far as I know they're all lossy.

I believe it's only consumer-level cameras that shoot "HD" footage in MPEG-2 or 4. I believe when writing to DVD/Blu-ray there's another encoding pass, so you're actually piling lossy upon lossy compression if you're starting with MPEG footage.

Of course they are lossy formats; there's no question about that. But any film-to-digital conversion is by definition "lossy": that's what I am saying, while a digital-audio-to-digital-audio conversion CAN be non-lossy.

Plus, like I said, surely there is material on DVD already recorded in MPEG-2 or 4 that just gets put on the disc without any further transcoding.

I thought most prosumer and pro HD cameras captured in some kind of MPEG format; it can't be otherwise, now can it? Is there any camera that captures raw HD footage, even a pro camera? Wouldn't that require some TB arrays?

Teej guy
Nov 30, 2009, 08:43 PM
Of course they are lossy formats; there's no question about that. But any film-to-digital conversion is by definition "lossy": that's what I am saying

Film-to-digital isn't lossy by definition, in the same way that digital audio isn't lossy. If the file contains the exact data as picked up by the sensor, you've got a lossless file. The quality of that file depends on how good your analog-to-digital conversion is, but it's still lossless.

Plus, like I said, surely there is material on DVD already recorded in MPEG-2 or 4 that just gets put on the disc without any further transcoding.

When was the last time a film came out with absolutely no post-processing at all?

mcpryon2
Nov 30, 2009, 08:56 PM
I'd say it's relative to the source; something mastered at 44.1/16, put on a CD, then ripped at full quality would be considered "lossless."

dXTC
Nov 30, 2009, 11:09 PM
Let's throw in another complicating factor, shall we?

Standard CD audio is 16-bit, 44.1 kHz PCM. Each sample is explicitly defined in the data; it is not approximated using a set of mathematical algorithms (which is what lossy compression does).

Back when CDs first came out, that was one of the highest bitrates available to studios.

However, nowadays most studios-- even "bedroom" ones-- can easily record in 24-bit or even 32-bit depth, with sample rates often at 48 kHz or 96 kHz. The majors sometimes use 192 kHz, which of course sounds awesome at the mixing desk. The problem is this: the waveform data of the final mixdown must be resampled downward to fit the relatively old CD standard for mass retail distribution.

Thus, bits are lost, even before the product hits the shelves. Even on a CD, we're not hearing what the artist or engineers heard in the studio.
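The arithmetic behind those numbers is easy to check (a quick sketch; the 24-bit/96 kHz figures are just one of the studio formats mentioned above):

```python
# Raw CD audio data rate: 16-bit samples, 44.1 kHz, 2 channels.
sample_rate = 44_100          # samples per second, per channel
bit_depth = 16
channels = 2

bits_per_second = sample_rate * bit_depth * channels
print(bits_per_second)                 # 1,411,200 bps ~= 1411 kbps
print(bits_per_second / 8 * 60 / 1e6)  # ~10.6 MB of audio data per minute

# A 24-bit/96 kHz studio mix carries far more data per second, and the
# excess must be thrown away when resampling down to CD spec.
studio_bits = 96_000 * 24 * 2
print(studio_bits / bits_per_second)   # ~3.27x the CD data rate
```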

Discuss.

Cinematographer
Dec 1, 2009, 12:41 AM
Teej guy, if you take 16-bit, 44.1 kHz CD audio as your reference standard, then of course the CD will not be lossy within your system. But that standard is totally arbitrary.

dXTC, I totally agree.

Jolly Jimmy
Dec 1, 2009, 08:46 AM
dXTC hit the nail right on the head.

dXTC
Dec 1, 2009, 08:59 AM
There's another "compression" on most recordings that hasn't been mentioned yet.

The studio's final mixed recording (aka the final mixdown), in whatever format, must be "mastered" for retail distribution-- a sort of post-processing, if you will. During this step, recordings are often run through a sound processor called a "compressor." This effectively makes low-volume sounds louder, and relatively loud sound peaks quieter without overloading the audio signal (clipping). This evens out the sound volume, at the cost of limiting dynamic range and detail in the final retail product.
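A toy version of that processor, just to illustrate the idea (a hypothetical sketch; a real compressor tracks a running signal level with attack and release times, which this per-sample version skips):

```python
# Toy dynamic-range compressor: any level above the threshold is scaled
# down by the ratio, then makeup gain raises the whole signal back up.
def compress(samples, threshold=0.5, ratio=4.0, makeup=1.6):
    out = []
    for x in samples:
        level = abs(x)
        if level > threshold:
            # Only the part of the level above the threshold is reduced.
            level = threshold + (level - threshold) / ratio
        out.append(makeup * level * (1 if x >= 0 else -1))
    return out

quiet, loud = 0.1, 0.9
print(compress([quiet, loud]))
# The quiet sample is boosted much more than the loud one, so the
# gap between them -- the dynamic range -- shrinks.
```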

What's worse, many producers will order the mastering technician to make the signal "hot", i.e. as loud as possible so that it gets noticed on the radio/in stores/in clubs. Some songs get so heavily compressed that the final result sounds like it was recorded in 12-bit and loses detail and ambience, just for the sake of loudness.

I'd love to see a few enterprising artists offer, as a bonus download accessible from the CD (edit: or perhaps an accompanying data CD or data DVD), the uncompressed pre-mastering final mixdown of a couple of songs from the album, in whatever bit depth/sample rate was used during mixing. Most modern desktop and laptop computers have the processing power to play back 24-bit audio at up to 48 kHz, so it's feasible.

applesupergeek
Dec 1, 2009, 10:14 AM
Hey dxtc, thanks for these posts, very informative.

I had read about that loudness "plague" on Wikipedia some time ago. Does that mean, simplistically, that the loudest signals get maxed out and the quieter ones are boosted disproportionately to the increase of the louder signals?

Also, in listening tests, would a higher sampling rate than the one used on CDs provide a better listening experience? Would that include older master tapes remastered at higher-than-CD sample rates?

cube
Dec 1, 2009, 10:20 AM
CDs are not compressed, they are quantized.

Teej guy
Dec 1, 2009, 11:04 AM
CDs are not compressed, they are quantized.

This is correct! I'll probably expand on it later, I'm busy this afternoon.

Hey dxtc, thanks for these posts, very informative.

I had read about that loudness "plague" on Wikipedia some time ago. Does that mean, simplistically, that the loudest signals get maxed out and the quieter ones are boosted disproportionately to the increase of the louder signals?

Basically yes. Transients also lose their attack/impact because everything is at the same level. It turns the signal into a brickwall with no dynamic contrast which the brain interprets as noise, increasing listening fatigue and making it difficult to hear "through" the music. In my opinion, digital brickwall limiting/clipping is one of the worst things ever to happen in the music industry.
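A crude sketch of what brickwall limiting does to a transient (hypothetical numbers; real limiters use look-ahead and gain smoothing, but the flattened peaks are the point):

```python
# "Loudness war" mastering in miniature: boost the whole mix, then chop
# every peak that would exceed digital full scale.
def brickwall(samples, boost=3.0, ceiling=1.0):
    return [max(-ceiling, min(ceiling, boost * x)) for x in samples]

# A drum hit: sharp attack transient followed by a decaying tail.
hit = [0.9, 0.5, 0.3, 0.15, 0.05]
print(brickwall(hit))
# The attack and the loudest part of the decay are both pinned at 1.0:
# the transient's shape -- its "impact" -- is gone, and the average
# level is much higher, which is exactly the brickwall effect.
```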

dXTC
Dec 1, 2009, 11:49 AM
Transients also lose their attack/impact because everything is at the same level. It turns the signal into a brickwall with no dynamic contrast which the brain interprets as noise, increasing listening fatigue and making it difficult to hear "through" the music. In my opinion, digital brickwall limiting/clipping is one of the worst things ever to happen in the music industry.

Amen!

Luckily, there is a small but growing number of music mastering professionals who have organized into an association called Turn It Up, which advocates proper mixing and strongly discourages heavy compression during mastering. Some have even started to turn away projects where the producer obstinately demanded over-compression.


Also, in listening tests, would a higher khz sampling than the one used in cds, provide a better listening experience? Would that include older master tapes re-mastered at higher than cd khz rates?

In many cases, yes; a certain "airiness", also called ambience, is detected by most people when listening to higher-sample-rate, higher-bit-depth sound. That's because the quantized (as mentioned by cube and loosely defined below) representation of the waveform gets closer in shape to the original continuous analog signal at higher rates.

This, however, depends significantly on the quality of the original master recording. If the recording engineers failed to capture room reverb and attack transients properly, then it wouldn't matter much.

FYI: Sampling is the periodic measurement of a continuous signal (be it audio or video); quantization is the rounding of each measurement to the nearest of a fixed set of digital levels. Increasing the number of measurements over a given period of time obviously increases the bitrate, but it also provides a more accurate representation of the original signal.
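A quick illustration of the bit-depth side of this (a sketch: it quantizes one second of a 1 kHz sine and reports the worst rounding error; every extra bit roughly halves that error):

```python
import math

# Quantize a 1 kHz sine at a given bit depth and measure the worst-case
# rounding error against the "continuous" (floating-point) signal.
def max_quantization_error(bits, sample_rate=44_100):
    levels = 2 ** (bits - 1)          # half the levels for each polarity
    worst = 0.0
    for n in range(sample_rate):      # one second of samples
        x = math.sin(2 * math.pi * 1000 * n / sample_rate)
        q = round(x * (levels - 1)) / (levels - 1)  # snap to nearest level
        worst = max(worst, abs(x - q))
    return worst

print(max_quantization_error(8))    # ~0.004
print(max_quantization_error(16))   # ~0.000015 -- CD-depth error is tiny
```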

Mr Skills
Dec 1, 2009, 12:17 PM
dXTC is broadly correct, but since this is my professional world (see 'dayjob' bit in my sig), I can clarify some of this a bit.

Let's throw in another complicating factor, shall we?

Standard CD audio is 16-bit, 44.1KHz PCM. Each sample is explicitly defined in the data; it is not approximated using a set of mathematical algorithms (which is what lossy compression does).

Back when CDs first came out, that was one of the highest bitrates available to studios.


"Available to studios" is not quite accurate since virtually nothing was recorded digitally back then anyway, but it's certainly true that it was pretty much the limit of technology.

However, nowadays most studios-- even "bedroom" ones-- can easily record in 24-bit or even 32-bit depth, with sample rates often at 48 kHz or 96 kHz. The majors sometimes use 192 kHz, which of course sounds awesome at the mixing desk. The problem is this: the waveform data of the final mixdown must be resampled downward to fit the relatively old CD standard for mass retail distribution.

Thus, bits are lost, even before the product hits the shelves. Even on a CD, we're not hearing what the artist or engineers heard in the studio.

Whilst it is true that 24 bit is pretty much the standard for professional sessions (I'm not sure I've ever seen a real-world session in 32 bit), the situation is much less clear with sample rates. The vast majority still work in 44.1/48k (which amount to the same thing, 48 being the standard when working to video). A few sessions work in 88.2/96k but not many. I have only ever seen 192k used for archiving a final mix, and even then only occasionally, so it is certainly not the standard used by "the majors".

To explain a little more about sample rates: the most important thing for audio quality is the clock of the converters. They are taking a reading 44,100 times a second (in the case of 44.1k), and the more stable the timing, the more accurate the recording will be. On cheap home equipment, where the clocking is not as good, doubling the sample rate effectively halves the size of any errors, so it can make a noticeable difference. But on the level of equipment used in full-scale studios, the difference is often impossible to hear, and even when you can hear a subtle difference, it's rarely definitively "better". The only time I record at 96k is on classical recordings, and even then it is not the difference between a 'good' and a 'bad' recording. On pop and rock, the logistical disadvantages of 96k (which takes a great deal more computing power and storage) outweigh any theoretical audio benefit.

There is a huge amount of "emperor's new clothes" in the music industry - no-one wants to be the one to say "actually, I can't hear the difference", so for a while a few years back everyone was rushing to record at higher sample rates. Now they are much less common. In my experience, the people who obsess over things like that are often missing out on basics that make a much bigger difference, like how an instrument is being played or moving a microphone an inch.

Mr Skills
Dec 1, 2009, 12:32 PM
There's another "compression" on most recordings that hasn't been mentioned yet.

The studio's final mixed recording (aka the final mixdown), in whatever format, must be "mastered" for retail distribution-- a sort of post-processing, if you will. During this step, recordings are often run through a sound processor called a "compressor." This effectively makes low-volume sounds louder, and relatively loud sound peaks quieter without overloading the audio signal (clipping). This evens out the sound volume, at the cost of limiting dynamic range and detail in the final retail product.

What's worse...

I'm nitpicking, I know, but your use of the phrase "what's worse..." implies that there is something inherently wrong with limiting/compression of a final master. I agree that a great many records end up overcompressed (including some I've worked on myself - I've even emailed a [big name] masterer to complain that the record is clipped only to have the response "talk to the label, they wanted it that loud"). But when done right that compression can do a great deal for the sound.

Sometimes reduction of dynamic range can actually increase your sensation of dynamics, since your brain perceives dynamics more from things like tone than it does from actual volume. Imagine if in a movie they kept a whispered scene to the actual volume of a whisper; it would actually feel less intimate being so far away. Not to mention how pleasing "bad" compression artefacts (pumping/squashing) can be in some pop/rock music (an example: MGMT. Massively over-compressed, and yes, it's fatiguing, but boy is it exciting!).

I completely agree that there are a great many records in the world that have been ruined by bad mastering and over-compression. But all records benefit from mastering when it's done right, and a few even benefit from it "done wrong". I wouldn't want to throw the baby out with the bathwater.

dXTC
Dec 1, 2009, 12:59 PM
Mr Skills, I stand corrected on your point about "What's worse". Certain genres indeed thrive on overcompression, and good mastering can often compensate for less-than-stellar mixdowns.

Your expertise is definitely welcome here.

Teej guy
Dec 1, 2009, 03:20 PM
There's a solid difference between over-compressing for an effect and brickwall limiting a master so that it's "louder" than other CDs.

I don't see any benefit to clipping transients just for loudness' sake. Don't get me wrong, I love side-chain compression a la Daft Punk, but we're talking about two completely different things here. I think MGMT would have benefited from not being brickwalled. I believe the vinyl master isn't as compressed as the CD master for their latest record, but don't quote me on that. I can go check though...

Mr Skills
Dec 1, 2009, 03:31 PM
There's a solid difference between over-compressing for an effect and brickwall limiting a master so that it's "louder" than other CDs.

I don't see any benefit to clipping transients just for loudness' sake. Don't get me wrong, I love side-chain compression a la Daft Punk, but we're talking about two completely different things here. I think MGMT would have benefited from not being brickwalled. I believe the vinyl master isn't as compressed as the CD master for their latest record, but don't quote me on that. I can go check though...

It stands to reason that the vinyl version would not be as compressed - you simply can't limit to the same extent on vinyl as you can on CD.

I agree - passionately - that arbitrarily limiting CDs to compete in 'loudness wars' is a Bad Thing. I was merely pointing out that the 'artefacts' of heavy limiting (as distinct from compression over the mix) are not necessarily a bad thing, depending on the music and personal taste. Although I guess, given the choice, I'd rather live in a world where a few songs were under-compressed than the current world where lots and lots of songs are over-compressed. ;)

I guess my point is that over-compressing for the sake of pure loudness is bad; over-compressing because you like the sound of it is fine :)

I should point out that I'm not the compression fiend that these posts make me sound like! If anything I tend to be fairly moderate unless the music particularly calls for it. I just have an aversion to the phrase "you should never ..." in music production. :)

Teej guy
Dec 1, 2009, 04:03 PM
It stands to reason that the vinyl version would not be as compressed - you simply can't limit to the same extent on vinyl as you can on CD.

I agree - passionately - that arbitrarily limiting CDs to compete in 'loudness wars' is a Bad Thing. I was merely pointing out that the 'artefacts' of heavy limiting (as distinct from compression over the mix) are not necessarily a bad thing, depending on the music and personal taste. Although I guess, given the choice, I'd rather live in a world where a few songs were under-compressed than the current world where lots and lots of songs are over-compressed. ;)

I guess my point is that over-compressing for the sake of pure loudness is bad; over-compressing because you like the sound of it is fine :)

I should point out that I'm not the compression fiend that these posts make me sound like! If anything I tend to be fairly moderate unless the music particularly calls for it. I just have an aversion to the phrase "you should never ..." in music production. :)

Sick man, bloody good post! I think I agree with everything there ;)