D*I*S, with respect, that's a value judgement and there are other excellent recording engineers who have reached a different conclusion.
It's really not a value judgment, though. I'm not saying "good enough" the way Bill Gates allegedly remarked that "640K ought to be enough for anybody" back in the early '80s.
I'm saying that any human population, even professional recording engineers, when subjected to properly controlled double-blind listening tests, cannot reliably distinguish 44.1/16 bit program material playback from any higher sample rate/bit depth version of the same program material. If the difference can't be perceived, why use hi-rez codecs for final mixes?
The AES, certainly not an enemy of audiophiles, published a major study in 2007 testing this hypothesis. Using higher resolution formats like SACD and DVD-A as the hi-fi samples and 44.1/16 bit versions of the same program material as the standard resolution samples, the study's subjects, over more than 500 trials, had no better chance of picking out the hi-fi content by listening than they would have had by flipping a coin. It was a chilling indictment of the hi-rez/hi-fi industry.
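To put "no better than a coin flip" in numbers, here's a minimal sketch of the statistics using nothing but a binomial test; the trial tallies below are illustrative stand-ins, not the study's raw data.

```python
from math import comb

# Hypothetical ABX-style tally (illustrative numbers, not the study's raw data):
trials = 554      # forced-choice trials: pick which presentation is the hi-rez one
correct = 283     # correct picks, roughly 51%

# Under the null hypothesis of pure guessing, correct picks follow Binomial(n, 0.5).
# One-sided p-value: the probability of doing at least this well by coin-flipping.
p_value = sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(f"{correct}/{trials} correct ({correct / trials:.1%})")
print(f"chance of scoring at least that well by pure guessing: {p_value:.2f}")
# A p-value this large (around 0.3) means the result is entirely consistent with
# guessing -- the listeners showed no reliable ability to hear a difference.
```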
Recording at higher sample rates is akin to painting with ultraviolet pigments. Perhaps a pigeon could appreciate your art, since pigeons can see further into the UV spectrum than humans can, but neither you, your favorite art critic, nor anyone else could enjoy your painting. So spending extra time and money on ultraviolet pigments would be a waste. Likewise, unless you are mixing an album for bats and dogs to listen to, highs beyond 22k are completely irrelevant.
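And for the arithmetic behind that 22k figure, a quick sketch of the Nyquist relationship; the 20 kHz hearing ceiling is the commonly cited round number, not something from the interview.

```python
# Nyquist limit: a sample rate of fs can capture frequencies up to fs / 2.
HEARING_CEILING_HZ = 20_000  # commonly cited upper limit of adult human hearing

for fs in (44_100, 48_000, 96_000, 192_000):
    nyquist = fs / 2
    headroom = nyquist - HEARING_CEILING_HZ
    print(f"{fs / 1000:>6.1f} kHz sampling -> captures up to {nyquist / 1000:.2f} kHz "
          f"({headroom / 1000:+.2f} kHz beyond the hearing ceiling)")
# Even 44.1 kHz already extends past what human ears can detect; the higher rates
# only add bandwidth in the ultrasonic range -- the UV pigment of the analogy above.
```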
1080p HD is very easy to tell from standard def once you know what to look for, and a 4k projected image is going to be a heck of a lot smoother than 1080p. Also, while 24fps is more than sufficient to convey fluid motion to humans thanks to persistence of vision, people can perceive movement at much higher frame rates and can tell the difference.
All that is true, but only up to a point. Once pixel density reaches a certain threshold at a given viewing distance, the eye loses the ability to discern individual pixels, and past that threshold, increasing pixel density offers no benefit whatsoever. It just becomes a marketing gimmick and a p*ssing contest with the competition. Same with frame rates. I read recently that the upper threshold for perception of motion is around 67 fps; anything faster might only reduce the perception of flicker, not aid the perception of additional movement. So if a video codec using 70 fps and "retina"-level resolution (which varies with distance from the image) were established, why would anyone want to double the resolution or frame rate again? What would be the point?
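Here's a rough back-of-the-envelope sketch of that "retina" threshold, assuming the commonly cited figure of about one arcminute for normal visual acuity; the acuity value and the viewing distances are illustrative assumptions, not anything from the thread.

```python
import math

# Pixels stop being individually resolvable once a single pixel subtends less than
# the eye's acuity limit, commonly taken as roughly 1 arcminute (an assumption).
ACUITY_DEG = 1.0 / 60.0

def retina_ppi(viewing_distance_in: float) -> float:
    """PPI above which individual pixels blur together at this viewing distance."""
    pixel_size_in = 2 * viewing_distance_in * math.tan(math.radians(ACUITY_DEG) / 2)
    return 1.0 / pixel_size_in

for distance_in in (18, 36, 120):  # phone-ish, desktop-ish, living-room distances
    print(f"at {distance_in / 12:>4.1f} ft: pixels vanish above ~{retina_ppi(distance_in):.0f} ppi")
# Past that density, doubling the resolution again changes nothing the viewer can
# actually see, which is the diminishing-returns point being made above.
```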
Oh, and the move over the last two decades has been to REDUCE video frame rates in codecs, not increase them, at least for movies. Film has been 24fps for nearly a century, and we've been trained to enjoy that rate. Video used to be exclusively 30i, and now is often 30i, 30p, 60i, and, increasingly, 24p. When you see a movie shot in HD @ 24p, your eye reads it as "filmlike" and "cinematic," while program material shot in 60i looks "videolike." For sports, people like the "videolike" 60i because motion, especially slow motion, can be more precisely rendered. But for dramas, 24p is considered the most desirable.
Back to the Neil Young interview, his argument that download sellers like Apple should offer a choice of higher quality downloads is well placed, IMO. To some, 24 bit may be pie in the sky, but Apple's current delivery of 0.26 Mbit/sec falls well short of CD quality (1.35 Mbit/sec), let alone 24 bit at a 96 kHz sampling rate (4.39 Mbit/sec).
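For reference, a quick sketch of where those bitrates come from: uncompressed PCM is just sample rate × bit depth × channels. This prints decimal megabits, so CD works out to roughly 1.41 Mbit/sec; the 1.35 and 4.39 figures above appear to be in binary megabits, but the gap versus a 0.26 Mbit/sec download is the same either way.

```python
def pcm_mbps(sample_rate_hz: int, bit_depth: int, channels: int = 2) -> float:
    """Uncompressed stereo PCM bitrate in decimal megabits per second."""
    return sample_rate_hz * bit_depth * channels / 1_000_000

formats = {
    "Apple download (as quoted above)": 0.26,  # fixed lossy delivery rate
    "CD quality (44.1 kHz / 16 bit)": pcm_mbps(44_100, 16),
    "Hi-rez (96 kHz / 24 bit)": pcm_mbps(96_000, 24),
}
for name, mbps in formats.items():
    print(f"{name:<34} {mbps:5.2f} Mbit/sec")
```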
Perhaps Neil is being cagey, banging on the 24 bit drum in the hope that Apple will at least offer downloads that match CD quality.
He's being an idiot. As badly as his ears have been fried by years of touring, there is no way on God's green Earth that he could ever pass a properly controlled listening test asking him to distinguish 96k/24 bit playback from 44.1/16 bit.
Hey, I used to own a MOTU 896HD and recorded projects @ 192k/24 bit. The result? Huge file sizes, CPU-choking plug-in processing, and absolutely no additional benefit whatsoever in the final product. I was an idiot then, just as Neil is now.
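To put "huge file sizes" in perspective, a quick back-of-the-envelope sketch; the 24-track, 4-minute session is a hypothetical example, not one of my actual projects.

```python
def mb_per_track_minute(sample_rate_hz: int, bit_depth: int) -> float:
    """Megabytes per minute for one mono track of uncompressed PCM."""
    return sample_rate_hz * bit_depth / 8 * 60 / 1_000_000

for rate, depth in ((44_100, 16), (44_100, 24), (96_000, 24), (192_000, 24)):
    per_track = mb_per_track_minute(rate, depth)
    session = per_track * 24 * 4  # hypothetical 24-track session, 4-minute song
    print(f"{rate / 1000:>5.1f} kHz / {depth} bit: {per_track:5.1f} MB per track-minute, "
          f"~{session:,.0f} MB for a 24-track, 4-minute song")
```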
Apple is smarter than both of us for not caving to pressure from the hi-rez crowd. Any lossless codec that delivers the equivalent of 44.1k/16 bit is already a comfortable margin of overkill for music listening by human beings. Anything more is spec-driven nonsense.