Geez, I'll admit I was out of line lumping Reznor in with everyone else in the loudness war. He took a stand by releasing a separate version of that album for the niche audiophile audience.
Well, I did some more research, and the subject is controversial, since the dynamic range on his "audiophile" album is not that great - it's better than the CD version but not great. So, well, it remains to be seen what he will do with Apple. He's certainly aware of the issue, but it's not certain he will act on it.
I very much like the sound of that idea! Perhaps it's something along those lines that these big names have been brought in to work on. Again, very cool idea: broadband internet is plentiful and easily accessible nowadays, so I don't see why content delivery of that nature shouldn't be feasible.
Actually, we're talking about dynamic range compression here, not file compression. Compressing the dynamics has no real impact on file size. So, it doesn't cost anything, except a cultural change. Moreover, players have enough computing power that doing on-the-fly audio compression at playback would not be a problem...
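Just to make the idea concrete, here's a toy sketch of the kind of downward compressor a player could apply at playback time. Everything here is made up for illustration (no real player's API, and a real compressor would need attack/release smoothing rather than working per sample):

```python
def compress(samples, threshold=0.5, ratio=4.0):
    """Toy per-sample downward compressor.

    samples: floats in [-1, 1]. Anything louder than `threshold`
    gets the excess reduced by `ratio`; quiet parts pass through.
    Hypothetical parameters, no attack/release - illustration only.
    """
    out = []
    for s in samples:
        level = abs(s)
        if level > threshold:
            # squash only the part that sticks out above the threshold
            level = threshold + (level - threshold) / ratio
        out.append(level if s >= 0 else -level)
    return out
```

The point is that this runs in real time trivially, so the *file* can keep its full dynamics and listeners who want a flatter sound (noisy commute, earbuds) just flip a switch.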
But it still comes down to whether you have the hardware to play it back natively.
With retina displays, a lot of people actually have the resolution to play 4K on their computers and tablets... But yes, it's still awfully expensive for TVs, and you have to "downgrade" to LCD (which I still feel is inferior to plasma).
I do have some old but semi-decent hi-fi equipment too (a Denon home cinema amplifier with five Acoustic Energy AE1 speakers). There is no point in having a big TV without the sound to go with it.
Most people don't. As humans we're visually dominant; it's easier to perceive the resolution of light than the resolution of acoustic pressure.
I think it's also because the resolution for video is barely getting to the point where we don't see the pixels anymore. Fill your visual field with your TV (and yes, that means sitting very close to it) and you will see pixels, lack of resolution and compression artifacts.
But, like with audio, some people just don't see the flaws. Some people like turning on the digital gimmicks that totally destroy image quality (oversaturated colors, the awful 600Hz motion modes...), just like some people don't hear the difference between a 128kbit MP3 and a CD..
I actually think this example somewhat solidifies my point further: ''higher perceived resolution isn't always 'better' or 'good' for everything.''
Here's another perspective:
You like the look of the grain from that film? You find it 'pleasing' that things aren't crystal clear? Some people find audio more 'pleasing' when it's recorded to 2" magnetic tape.
And some people like tube amplifiers, even though they're not hi-fi... I listen to a lot of blues, and, likewise, some artists can produce a lot of emotion from instruments built from scraps (literally, for the early artists) and amplifiers that distort and saturate...
Actually, I worked on video compression during the move from MPEG to wavelet compression. The lesson is that wavelets were so successful because their compression artifacts feel a lot more natural than MPEG's. MPEG artifacts are blocky, and our brain is optimized to detect lines and patterns and focus on them. It's very bad when the artifacts are more interesting than the content.

On the other hand, wavelet artifacts induce fuzziness and blurred lines, and our brain is tuned to reconstruct details from blurry images.
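A crude 1-D way to see the difference between the two artifact families (this is my own toy analogy, not actual MPEG or wavelet math): block-based loss produces hard steps at block boundaries, while smoothing-based loss just blurs edges.

```python
def block_average(signal, block=4):
    """Toy stand-in for block-based coding loss: each block collapses
    to its mean, leaving abrupt steps at block edges (the 'blocky' look)."""
    out = []
    for i in range(0, len(signal), block):
        chunk = signal[i:i + block]
        mean = sum(chunk) / len(chunk)
        out.extend([mean] * len(chunk))
    return out

def moving_average(signal, radius=2):
    """Toy stand-in for wavelet-style loss: detail disappears smoothly,
    edges get blurred rather than stepped."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out
```

Run both on a smooth ramp and the block version introduces a big jump right at the block boundary, while the moving average stays gradual. That artificial jump is exactly the kind of sharp "line" our visual system locks onto.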
About film grain, the strange thing is that film grain actually makes the image feel sharper. If you have an image that is soft and you introduce some simulated film grain, it will feel crisper. That's because film grain induces a feeling of higher local contrast, and our brain loves local contrast and associates it with a high level of detail. But it doesn't work so well with digital noise (too regular, too colorful), just as, in audio, analog saturation, especially from tube amps, can feel good whereas digital saturation just feels awful.
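The grain-overlay idea is simple enough to sketch. Here's a minimal version on 8-bit luma values; real film-grain simulation is spatially correlated with varying grain size, whereas this toy version is purely per-pixel, so treat it only as an illustration of the principle:

```python
import random

def add_grain(pixels, strength=8, seed=42):
    """Overlay zero-mean noise on 8-bit luma values (0..255).

    Crude stand-in for film-grain simulation: each pixel gets a
    random offset in [-strength, strength], clamped to valid range.
    `strength` and the per-pixel model are illustrative choices.
    """
    rng = random.Random(seed)  # seeded so the result is reproducible
    out = []
    for p in pixels:
        g = rng.uniform(-strength, strength)
        out.append(min(255, max(0, round(p + g))))
    return out
```

Even this crude version shows the mechanism: a flat mid-gray area comes out with tiny local variations, and it's that pixel-to-pixel variation (local contrast) that the brain reads as texture and detail.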
That's why it's good to see the grain on films where the artist actually intended the film to be grainy, because it gives a texture and gritty feeling to the image.
You're a photographer, right? I'm sure you have the traditional film/light thing down (props, seriously, that's an art) but you probably also shoot with a DSLR with a high Megapixel count sensor, right?
It's a hobby; it's very hard to live off photography... I mostly shoot with a 40MP DSLR. But I also shoot with my iPhone. One reason is that you don't always feel like carrying 4kg of camera and lenses; another is that different tools make you work differently, just like most guitarists have several guitars.
But if the photos you take are ultimately going to be compressed to JPEGs to be viewable on the internet, why have such a high-quality sensor in that DSLR body?
I actually print my photos, so they don't all end up only on the web... I have several 60-100cm wide prints on fine art paper or plexiglass at home. People should really print their photos; it's just not the same as looking at them on a screen...
But, true, you don't need 40MP to print at 100cm (you can get a decent 60cm print from even a 10MP sensor).
But a camera is not just a sensor... It's also the ergonomics: with a pro DSLR, you mostly have one button per action, no need to go through menus, and you can operate the camera without leaving the viewfinder. Likewise, shooting through a viewfinder is not the same act as looking at an LCD at arm's length. You also get an autofocus that works in extremely low light, close to no lag when taking the photo...
Also, a sensor is not just resolution. It's also dynamic range and low-light performance. Even when publishing on the web, there's a real difference between a tiny sensor like the iPhone's (even if Apple did an excellent job with it) and a 24x36mm one.
You know why.. because RAW gives you more fidelity and flexibility in post. Your JPEGs will look better in the end because you originally shot with wayyy more resolution in the RAW image data from the sensor.
Actually, you don't gain much in resolution, JPEG Fine on a DSLR is not very compressed and you usually won't see any artifacts.
The gain from RAW is that you get choices, and creativity is about choices. The sensor captures a 14-bit image; JPEG is 8-bit. So, when you're using JPEG, a crude in-camera algorithm decides how to fit those 14 bits into 8. With RAW, I'm the one who decides exactly how to do it.
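The 14-into-8 squeeze is easy to show in miniature. Here's a toy mapping of one 14-bit sensor value to 8 bits; the gamma curve is just one of many possible tone curves (a real camera pipeline also does demosaicing, white balance, and a proprietary tone curve, none of which is modeled here):

```python
def raw_to_8bit(value14, gamma=2.2):
    """Map one 14-bit sensor value (0..16383) to 8 bits (0..255).

    A linear squeeze (gamma=1.0) crushes shadow detail; a gamma
    curve lifts the shadows. The choice of curve is exactly the
    kind of decision RAW leaves to the photographer and an
    in-camera JPEG makes for you. Toy illustration only.
    """
    x = value14 / 16383        # normalize the 14-bit value to 0..1
    y = x ** (1 / gamma)       # apply a simple gamma tone curve
    return round(y * 255)
```

With gamma 2.2, a deep-shadow value like 1024 lands around 72 out of 255, while a plain linear squeeze would park it near 16, i.e. nearly black. Same sensor data, very different 8-bit results, which is the whole point of keeping the RAW.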