Yes, you're correct - DAC quality has improved over time.

There are engineering compromises to both slower and 'quicker' DACs.
[...]
This challenge has spurred the development of Delta Sigma Conversion, where a very much higher oversampling rate is used, but the converter is only 1 bit. [...]

So the simple answer to your question is that direct 44.1 converters are pretty much never used these days - and it's the use of oversampling that's driven sound quality improvements over time. High accuracy, fast D to A is expensive, so Delta Sigma has become extremely popular as an alternative.
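
In case anyone wants to see what that 1-bit trick looks like in practice, here's a purely illustrative first-order delta-sigma sketch (real converter chips are far more sophisticated, so treat this as a toy): the running error between the input and the 1-bit output is fed back, which shoves the quantisation noise up to high frequencies where a filter can remove it.

```python
import numpy as np

def delta_sigma_1bit(x):
    """First-order delta-sigma modulation of a signal x scaled to [-1, 1].

    Returns a +/-1 bitstream whose low-frequency content tracks x; the
    quantisation error gets pushed ("shaped") toward high frequencies.
    """
    out = np.empty_like(x)
    integrator = 0.0
    prev_out = 0.0
    for i, sample in enumerate(x):
        integrator += sample - prev_out          # accumulate error vs. last 1-bit output
        out[i] = 1.0 if integrator >= 0 else -1.0
        prev_out = out[i]
    return out

# Toy demo: a 1 kHz tone, oversampled 64x relative to 44.1 kHz
fs = 44100 * 64
t = np.arange(28224) / fs                        # ~10 ms, and divisible by 64
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)
bitstream = delta_sigma_1bit(tone)

# A crude 64:1 block average stands in for the real decimation/low-pass filter
recovered = bitstream.reshape(-1, 64).mean(axis=1)
print(np.corrcoef(recovered, tone[::64])[0, 1])  # should come out very close to 1.0
```

The point is just that averaging a very fast stream of crude 1-bit decisions recovers a fine-grained signal, which is why the approach is cheap and still measures well.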

Thanks. With a bit of effort, I actually learned something today :), including a little about Delta Sigma, Noise Shaping, and dithering.

I don't know if I'll remember any of it tomorrow, but... much appreciated.
 
I wouldn't consider it "backtracking" so much as "coming to their senses." 16 bit recordings as a final delivery medium have reached the minimum "overkill" level of resolution. That's what the audio research indicates. Anything beyond 44.1k/16 bit is wasted data and CPU cycles.

For those of us who record music, tracking @ 24 bit guarantees a very low noise floor and fidelity at all levels, which is desirable when mixing dozens or even hundreds of individual tracks. But the final mix? 44.1k/16 bit is more than sufficient with regard to dynamic range and high end frequency resolution.
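
For anyone wanting to sanity-check those dynamic range claims, the usual rule of thumb is roughly 6 dB per bit:

```python
# Rule-of-thumb dynamic range of an N-bit recording: roughly 6.02 dB per bit
# (an ideal quantiser adds another ~1.76 dB on top, but the ratio is what matters here)
for bits in (16, 24):
    print(f"{bits}-bit: ~{6.02 * bits:.0f} dB")   # 16-bit: ~96 dB, 24-bit: ~144 dB
```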

D*I*S, with respect, that's a value judgement and there are other excellent recording engineers who have reached a different conclusion.

Back to the Neil Young interview, his argument that download sellers like Apple should offer a choice of higher quality downloads is well placed, IMO. To some, 24 bit may be pie in the sky, but Apple's current delivery of .26 Mbit/sec falls well short of CD quality (1.35 Mbit/sec) let alone 24 bit at a 96 kHz sampling rate (4.39 Mbit/sec).

Perhaps Neil is being cagey banging on the 24 bit drum in the hopes that Apple will at least offer downloads that match CD quality. :)
 
D*I*S, with respect, that's a value judgement and there are other excellent recording engineers who have reached a different conclusion.

It's really not a value judgment, though. I'm not saying "good enough" the way Bill Gates allegedly remarked that "640k is enough RAM for anybody" back in the early 80's.

I'm saying that any human population, even professional recording engineers, when subjected to properly controlled double-blind listening tests, cannot reliably distinguish 44.1/16 bit program material playback from any higher sample rate/bit depth version of the same program material. If the difference can't be perceived, why use hi-rez codecs for final mixes?

The AES, certainly not an enemy of audiophiles, published a major study in 2007 testing this hypothesis. Using higher resolution formats like SACD and DVD-A as their hi-rez samples and 44.1/16 bit versions of the same program material as their standard resolution samples, the 500-plus subjects in the study had no better chance of picking the hi-rez content by listening than they would have had by flipping a coin. It was a chilling indictment of the hi-rez/hi-fi industry.

Recording at higher sample rates is akin to painting with ultraviolet pigments. Perhaps a pigeon could appreciate your art, as they can see further into the UV spectrum than humans can, but neither you, your favorite art critic, nor anyone else can enjoy your painting. So spending extra time and money on ultraviolet pigments would be a waste. Likewise, unless you are mixing an album for bats and dogs to listen to, highs beyond 22 kHz are completely irrelevant.

1080p HD is very easy to tell from standard def once you know what to look for, and a 4k projected image is going to be a heck of a lot smoother than 1080p. Also, while 24fps is more than sufficient to convey fluid motion in humans due to the timing of our persistence of vision, people can perceive movement at much higher frame rates and can tell the difference.

All that is true, but only up to a point. Once pixel density reaches a certain threshold at a specific distance from the viewer, his/her eye loses the ability to discern individual pixels. Once that threshold is reached, increasing pixel density offers no benefit whatsoever. It just becomes a marketing gimmick and p*ssing contest with one's competition. Same with frame rates. I read recently that the upper threshold for perception of motion is around 67 fps--anything faster than that might only reduce the perception of flicker and not really aid in the perception of additional movement. So, if a video codec using 70 fps and "retina"-level resolution (which would vary based on the distance from the image) were established, why would a person want to double the resolution or frame rates again? What would be the point?
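
To put some numbers on the "varies based on the distance" part: the usual rule of thumb is that pixels stop being resolvable once they subtend about one arcminute at the eye. A rough sketch of that calculation (the viewing distances are just examples I made up):

```python
import math

def retina_ppi(viewing_distance_inches, arcmin_per_pixel=1.0):
    """Pixel density at which adjacent pixels subtend `arcmin_per_pixel` at the eye."""
    pixel_size = viewing_distance_inches * math.tan(math.radians(arcmin_per_pixel / 60))
    return 1.0 / pixel_size                       # pixels per inch

# Example viewing distances (inches) -- just illustrative guesses
for label, d in (("phone", 10), ("tablet", 18), ("desktop", 30), ("living-room TV", 120)):
    print(f"{label} at {d}\": ~{retina_ppi(d):.0f} PPI is the point of diminishing returns")
```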

Oh, and the move over the last two decades is to REDUCE video fps rates in codecs, not increase them, at least for movies. Film has been 24fps for nearly a century, and we've been trained to enjoy that rate. Video used to be exclusively 30i, and now is often 30i, 30p, 60i, and, increasingly, 24p. When you see a movie shot in HD @ 24p, your eye reads it as "filmlike" and "cinematic", while program material shot in 60i looks "videolike". For sports, people like the "videolike" 60i because motion, especially slow motion, can be more precisely rendered. But for dramas, 24p is considered the most desirable.

Back to the Neil Young interview, his argument that download sellers like Apple should offer a choice of higher quality downloads is well placed, IMO. To some, 24 bit may be pie in the sky, but Apple's current delivery of .26 Mbit/sec falls well short of CD quality (1.35 Mbit/sec) let alone 24 bit at a 96 kHz sampling rate (4.39 Mbit/sec).

Perhaps Neil is being cagey banging on the 24 bit drum in the hopes that Apple will at least offer downloads that match CD quality. :)

He's being an idiot. As badly as his ears have been fried by years of touring, there is no way on God's green Earth that he would ever be able to pass a properly controlled listening test designed for him to discriminate 96k/24bit playback from 44.1/16 bit.

Hey, I used to own a MOTU 896HD and recorded projects @ 192k/24 bit. The result? Huge file sizes, CPU-choking plug-in processing, and absolutely no additional benefit whatsoever in the final product. I was an idiot then, just as Neil is now.

Apple is smarter than the both of us by not caving to the pressures of the high-rez crowd. Any lossless codec that gives you the equivalent of 44.1k/16 bit resolution is an acceptable level of resolution/bit depth overkill for music listening by human beings. Anything more is spec-driven nonsense.
 
Recording at higher sample rates is akin to painting with ultraviolet pigments. Perhaps a pigeon could appreciate your art, as they can see further into the UV spectrum than humans can, but neither you, your favorite art critic, nor anyone else can enjoy your painting. So spending extra time and money on ultraviolet pigments would be a waste. Likewise, unless you are mixing an album for bats and dogs to listen to, highs beyond 22 kHz are completely irrelevant.

I would agree with this for the final product medium, which I suspect you are referring to. Tracking at higher sample rate, but especially higher bit depth have benefits, possibly even for the master.
 
I would agree with this for the final product medium, which I suspect you are referring to. Tracking at higher sample rate, but especially higher bit depth have benefits, possibly even for the master.

Yes, I agree. I mentioned this in one of my previous posts. Greater bit depth when tracking helps keep mixes quiet when the track count goes up. I track @ 48k/24 bit, which I think is extreme overkill, and I mix and master at that level as well, but when it comes to delivery I'll output 44.1k/16 bit for CD.

Believe it or not, I might release our next big project on vinyl as well! If only because I like the vintage vibe it has, as well as the gorgeous scale of album sleeves for iconic graphic design. Also, I find the entire production of vinyl absolutely fascinating.

BTW, recording @ 48k instead of 44.1k just future-proofs me a tad, in that if the loony 96k format were to become mainstream, uprezzing from 48 to 96k is pretty quick and easy.
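
That's mainly because 48 to 96 is an exact 2:1 ratio, while anything involving 44.1 needs the much uglier 147:160 ratio. A quick sketch with scipy, using a made-up 440 Hz test tone as a stand-in for real material:

```python
import numpy as np
from fractions import Fraction
from scipy.signal import resample_poly

print(Fraction(96000, 48000))        # 2       -> trivial integer upsample
print(Fraction(48000, 44100))        # 160/147 -> much uglier ratio

fs = 48000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 440 * t)      # one second of a 440 Hz test tone at 48 kHz

x_96k = resample_poly(x, 2, 1)       # 48 kHz -> 96 kHz
x_44k1 = resample_poly(x, 147, 160)  # 48 kHz -> 44.1 kHz
print(len(x_96k), len(x_44k1))       # 96000 44100
```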
 
The big long vinyl/digital debates are pointless. We're talking about digitally downloaded music on iTunes and why Apple still doesn't offer us lossless audio files. Once they at least let us buy CD quality 16-bit/44.1kHz, then sure, ask for higher 24-bit/48kHz and upwards if you like.
 
Oh, and the move over the last two decades is to REDUCE video fps rates in codecs, not increase them, at least for movies. Film has been 24fps for nearly a century, and we've been trained to enjoy that rate. Video used to be exclusively 30i, and now is often 30i, 30p, 60i, and, increasingly, 24p. When you see a movie shot in HD @ 24p, your eye reads it as "filmlike" and "cinematic", while program material shot in 60i looks "videolike". For sports, people like the "videolike" 60i because motion, especially slow motion, can be more precisely rendered. But for dramas, 24p is considered the most desirable.
That really surprises me still. Yes, there is something very "filmic" about the 24fps rate, but I wonder why we're just holding onto this low frame rate simply because of that nostalgic thing... If the movie industry had only just started, we would be using much higher frame rates (albeit more costly in terms of special effects). Now I am so used to watching movies on my HDTV which interpolates the frames to make them smoother due to the TV's higher frame rate, that when I go and watch a movie in a cinema I find the experience quite tiring on my eyes... The motion is juddery and makes me feel a bit queasy.

There's even been a move (in UK TV programmes) to take the studio-recorded show (drama, chat show, whatever), and just convert it from 50i to 25p just so it looks more "filmic"... And I for one hate that. I hope that some day we'll get over this idea that 24/25p is somehow better, and embrace the more natural (NOT organic! ;)) higher frame rates which are easier on the eye.

Hey, I used to own a MOTU 896HD and recorded projects @ 192k/24 bit. The result? Huge file sizes, CPU-choking plug-in processing, and absolutely no additional benefit whatsoever in the final product. I was an idiot then, just as Neil is now.
I must admit that being a new owner of a ProFire 610, I've been digitising old projects at 24-bit 48kHz. This is mainly because I can then normalise them and pull out quiet sections without losing any dB in the process... But any files I output for final listening will most definitely be 16-bit. :)
 
That really surprises me still. Yes, there is something very "filmic" about the 24fps rate, but I wonder why we're just holding onto this low frame rate simply because of that nostalgic thing... If the movie industry had only just started, we would be using much higher frame rates (albeit more costly in terms of special effects). Now I am so used to watching movies on my HDTV which interpolates the frames to make them smoother due to the TV's higher frame rate, that when I go and watch a movie in a cinema I find the experience quite tiring on my eyes... The motion is juddery and makes me feel a bit queasy.

I actually find that watching movies (and TV) at 60 fps is what makes me feel a bit queasy. It's just too natural looking and really messes with the suspension of disbelief.
 
Now I'm not in any way qualified to make any empirical statements as to whether there's an appreciable difference between vinyl and Red Book audio formats or which is superior.

With a few mastering credits to my name and a trademark licensing agreement to adhere to the mastering standards for Dolby Digital 5.1 channel surround (rather strict requirements around average loudness, dialogue normalization, etc.), I would say I'm qualified... and there is.

I've stated before that vinyl is limited to a dynamic range of 80dB, whereas 16-bit CD audio is 96.7dB and 24-bit Linear PCM is roughly 144dB. To put this in perspective, every 3dB represents a doubling of wave power. So if we compared two identical recordings on both formats, with the same minimum loudness, the maximum wave power of the CD recording can in principle extend to 50 times that supported by vinyl.
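
To spell out the arithmetic behind that "50 times" figure (using the dB numbers above and the standard decibel-to-power conversion):

```python
# Convert a dynamic-range difference in dB to a power ratio: ratio = 10 ** (dB / 10)
vinyl_dr, cd_dr = 80.0, 96.7          # dynamic ranges claimed above, in dB
extra_db = cd_dr - vinyl_dr
print(10 ** (extra_db / 10))          # ~46.8, i.e. roughly the "50 times" figure
print(2 ** (extra_db / 3.0103))       # same answer via "a doubling every ~3 dB"
```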

The other two problems around groove depth cannot be escaped by a $15,000 "laser" turntable... the digital medium does not have a physical limitation, just a bandwidth limitation. But grooves as the source limit both the frequency and amplitude range AND clarity. The amount of definition in a vinyl groove, even in pristine condition, is limited by the height, depth and width of that groove. When the original master platter is cut, from which the reproductions are pressed, the cutting stylus puts pressure on each groove, slightly altering its shape. As small as those grooves are, a very minuscule shift in groove shape can alter frequency response and amplitude dynamics noticeably, and pitch is further affected slightly because the amount of information recorded per unit of groove changes as the stylus nears the center while the speed of the turntable stays constant.
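
To give a rough feel for that last point (my own illustrative numbers, not anything from the post): at 33 1/3 rpm the groove passes under the stylus far more slowly near the label than at the outer edge, so the same audio cycle has to fit into far less vinyl by the end of a side.

```python
import math

rev_per_sec = (100 / 3) / 60          # 33 1/3 rpm

def groove_speed_cm_s(radius_cm):
    """Linear speed of the groove passing under the stylus at a given radius."""
    return 2 * math.pi * radius_cm * rev_per_sec

for label, r in (("outer groove, ~14.5 cm radius", 14.5),
                 ("inner groove, ~6 cm radius", 6.0)):
    v = groove_speed_cm_s(r)
    cycle_um = v / 20000 * 1e4        # physical length of one 20 kHz cycle, in microns
    print(f"{label}: {v:.1f} cm/s, one 20 kHz cycle ≈ {cycle_um:.0f} µm of groove")
```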

Workarounds to these problems still wouldn't compare to the lower noise floor and improved dynamic range of the digital medium, nor would they accommodate future format changes. File formats are the way to go, because all that's needed to decode them is a CPU and a quartz clock, and the software can be updated over time. You could avoid format wars for a very long time this way.

A truly novel thought might be that with vectorization technology, you could store an analogous sinewave in a digital file...?

Define what you mean by "vectorization technology".
 
The big long vinyl/digital debates are pointless. We're talking about digitally downloaded music on iTunes and why Apple still doesn't offer us lossless audio files. Once they at least let us buy CD quality 16-bit/44.1kHz, then sure, ask for higher 24-bit/48kHz and upwards if you like.

Good point.

I didn't hear Neil Young making a "vinyl is superior to digital" argument. His vinyl comment was an aside, at the end of a long interview, to point out that even Steve Jobs, when at home, listened to a form of music reproduction different from the digital downloads Apple sells today.

Maybe Steve just liked the "theatre" of playing vinyl, but it's interesting that he was a member of a group of people for whom Apple's lossy .256 Mbit/sec downloads have not replaced other formats, be they CD, SACD, HD downloads or vinyl.

If Neil Young is an "idiot" he's a pretty brilliant idiot and he's certainly not the only artist to prefer listening to his work in a higher resolution digital form than what is now offered by Apple.
 
He is comparing to a 192 kHz, 24 bit, 2 channel studio format, which means uncompressed AIFF or WAV. The bitrate of a stream like this would be 192000*24*2/1024 = 9000kbps.

And if that's what he's comparing, it's a pretty useless comparison. 24 bit is used all the time but 192 is a tiny TINY fraction of the recording that's done. And there's an argument to be made that a sampling rate that high is just wasting space and few if any people can even hear the difference.
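
For reference, here's the raw PCM arithmetic behind those comparisons (decimal kilobits, so the figures differ slightly from the quote's divide-by-1024 version):

```python
def pcm_kbps(sample_rate_hz, bits, channels=2):
    """Raw (uncompressed) PCM bitrate in decimal kilobits per second."""
    return sample_rate_hz * bits * channels / 1000

for name, sr, bits in (("CD, 44.1k/16", 44100, 16),
                       ("96k/24", 96000, 24),
                       ("192k/24", 192000, 24)):
    print(f"{name}: {pcm_kbps(sr, bits):,.0f} kbit/s")
# CD: 1,411 kbit/s; 96k/24: 4,608 kbit/s; 192k/24: 9,216 kbit/s -- vs 256 kbit/s for iTunes AAC
```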


Ok, I tried it. I would absolutely not call the difference night and day, but I will say I was able to tell.

Did you use an ABX app or did you know which was which while comparing?


I definitely recommend you listen to 24/96 Hotel California by the Eagles before you decide that it's not worth it.

And I'd definitely recommend doing an ABX comparison before deciding that you actually are hearing a difference. Placebo effect is huge with audio, people constantly convince themselves they're hearing a difference that isn't there.


take this waveform for example:

As multiple people pointed out before your post, that's a horrible article, it's obvious the guy has absolutely no understanding of how digital audio actually works.


Totally agree... in the MP3 I could not even hear...

Again, I'll ask...ABX? The ABX app is linked earlier in the thread, couldn't be simpler to do the listening comparison that way.


That really surprises me still. Yes, there is something very "filmic" about the 24fps rate, but I wonder why we're just holding onto this low frame rate simply because of that nostalgic thing...

Isn't Peter Jackson shooting The Hobbit at 48 fps? Theatres generally aren't set up for it yet but he's hoping there will be some installed before release. I believe Roger Ebert has seen material at a higher frame rate and is a huge advocate of it; he thinks it's much more of a visual boost than 3D.
 
To put this in perspective, every 3dB represents a doubling of wave power. So if we compared two identical recordings on both formats, with the same minimum loudness, the maximum wave power of the CD recording can in principle extend to 50 times that supported by vinyl.

I would turn that around - same "maximum wave power" for both - the greater headroom gives the ability to "divide by 2" more often, which gives the ability to resolve much smaller changes in the audio stream.

Optical media are built from tiny physical pits. In principle it is the same process as a "laser turntable" and will have the same fundamental limits, namely that it can only resolve physical details sufficiently large relative to the laser wavelength. If one were so inclined, one could build a Vinyl 2.0 standard with modern technology that would have staggeringly high specs. Whether it could reach 24/192 is unclear, but it could certainly blow past Red Book.

But, there isn't much point in doing so, as we've long since passed the point of meeting the vast majority of people's needs.

"Live" is the new HD, anyway.

----------

Isn't Peter Jackson shooting The Hobbit at 48 fps?

Film shot at 24fps has been projected at 48fps or 72fps for decades. Traditionally it was done by holding each frame for two or three light pulses (blade spins, whatever). The reason is that projecting a 24fps recording at 24fps playback looks like crap - flickery as hell.

Shooting at 24fps for movies makes as much sense as mastering recordings on 8-track - it's just old for old's sake. 1080p at reasonable viewing distance and high quality 60fps recording/playback gives a very visceral "3D" feeling - more so even, IMO, than the current silly-glasses approach.
 
Shooting at 24fps for movies makes as much sense as mastering recordings on 8-track - it's just old for old's sake.

When actually shooting on film, there is more to it than just nostalgia since doubling the frame rate means double the film stock, half as long filming time per reel (unless you double the amount the camera can hold), etc. More expensive and more unwieldy. I wouldn't be surprised if there are other issues like the camera being louder (and using more power).

But with digital those problems are much less of an issue, which is likely why we're finally seeing that happen.
 
To get back on point, Apple should offer at least 16-bit/44.1kHz lossless (ALAC, FLAC, WAV or whatever, etc) audio files to be bought in iTunes. Not being able to buy the original lossless audio version and being sold mp3's is an absolute abomination and needs to change.
 
Dan said:
Ok, I tried it. I would absolutely not call the difference night and day, but I will say I was able to tell.

[...]
Did you use an ABX app or did you know which was which while comparing?

And I'd definitely recommend doing an ABX comparison before deciding that you actually are hearing a difference. Placebo effect is huge with audio, people constantly convince themselves they're hearing a difference that isn't there.

I manually shuffled them in different tabs of the browser until I could not identify which was which. I had to be careful not to hover or the popover filenames could give them away. I shuffled and covered when necessary, until I had no idea which was which. Then I listened to them back to back multiple times. Not good enough for publishing scientific results, but good enough for me.
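
If anyone wants to take an informal test like that one step further, the usual way to score an ABX run is to ask how likely your result would be from pure guessing, i.e. a simple binomial test. A minimal sketch:

```python
from math import comb

def p_value_guessing(correct, trials):
    """Chance of getting at least `correct` answers right out of `trials` by pure guessing."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

for correct, trials in ((8, 10), (12, 16), (14, 16)):
    print(f"{correct}/{trials} correct: p = {p_value_guessing(correct, trials):.3f}")
# 8/10 -> ~0.055, 12/16 -> ~0.038, 14/16 -> ~0.002
```

Getting 12 or more right out of 16 is the sort of result that starts to look like more than luck; the ABX app linked earlier in the thread does this bookkeeping for you.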
 
I'm saying that any human population, even professional recording engineers, when subjected to properly controlled double-blind listening tests, cannot reliably distinguish 44.1/16 bit program material playback from any higher sample rate/bit depth version of the same program material. If the difference can't be perceived, why use hi-rez codecs for final mixes?

The AES, certainly not an enemy of audiophiles, published a major study in 2007 testing this hypothesis. Using higher resolution formats like SACD and DVD-A as their hi-rez samples and 44.1/16 bit versions of the same program material as their standard resolution samples, the 500-plus subjects in the study had no better chance of picking the hi-rez content by listening than they would have had by flipping a coin. It was a chilling indictment of the hi-rez/hi-fi industry.

For me, this is one of the most important posts in this thread. I looked for the source of the study:

Meyer, E. Brad and David R. Moran. "Audibility of a CD-Standard A/D/A Loop Inserted into High-Resolution Audio Playback," Journal of the Audio Engineering Society, Vol. 55, No. 9, Sept. 2007, pp. 775-779.

I could only find references to it, such as http://www.audioamateur.com/media/galo2941.pdf, in which the author (perhaps not surprisingly given his history) seems to want to dismiss it to some extent.

If anyone can point us to the actual study, that could be helpful.

If you're a member, I guess you could get it here:
http://www.aes.org/journal/online/JAES_V55/9/

And all can view some of the information (and a little discussion, including from the authors) from here:
https://secure.aes.org/forum/pubs/journal/?ID=2

And more info from the authors here:
http://www.bostonaudiosociety.org/explanation.htm
 
For me, this is one of the most important posts in this thread. I looked for the source of the study:

Meyer, E. Brad and David R. Moran. "Audibility of a CD-Standard A/D/A Loop Inserted into High-Resolution Audio Playback," Journal of the Audio Engineering Society, Vol. 55, No. 9, Sept. 2007, pp. 775-779.

I could only find references to it, such as http://www.audioamateur.com/media/galo2941.pdf, in which the author (perhaps not surprisingly given his history) seems to want to dismiss it to some extent.

I saw that study mentioned in another article. This seems to make a lot of sense to me:

Despite the fact that no one could hear the difference in playback systems, they reported that “virtually all of the SACD and DVD-A recordings sounded better than most CDs — sometimes much better.” As it wasn't the technology itself that was responsible for this, what was? The authors' conclusion is that they are simply engineered better. Because high-end recordings are a niche market, “Engineers and producers are being given the freedom to produce recordings that sound as good as they can make them, without having to compress or equalize the signal to suit lesser systems and casual listening conditions. These recordings seem to have been made with great care and manifest affection by engineers trying to please themselves and their peers.”
 
If anyone can point us to the actual study, that could be helpful.

PDF download: http://colors.webatu.com/wp-content...bility-of-a-cd-standard-ada-loop-inserted.pdf

An interesting study - and conducted over multiple locations with different equipment. Looks very convincing.

With a high quality D to A, there's no reason why 16 bit 44.1 shouldn't outperform human hearing, with final mastered audio material (that is, audio material that's been normalised to use the full 16 bits of CD resolution). This study confirms that - certainly amongst the sample of people they used.

For studio engineers, it still makes good sense to record at 24 bit... since at the time of recording you won't make full use of the digital resolution available - and you end up leaving a lot of headroom to stop signals from overloading. I haven't read anything convincing regarding the use of higher sample rates (96 for instance), although this may be a wise move in order to avoid antialiasing filter effects adding up (a typical digital mixing session may well bounce digital audio out to analogue effects units, summing mixers etc... so the signal will pass through multiple stages of DAC/ADC). 192 does appear to be more of a marketing exercise though.

I'd be interested to read more about mastering levels for movie soundtracks. DVD has 24 bit sound... but that could be because the average mastering level is a lot lower than on a music CD (to leave headroom for transients and effects). There's some discussion in this thread which suggests that it could be 20dB or more down... which would give some justification for using a higher bit depth.
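
A quick back-of-envelope on that, using the same rough 6 dB-per-bit rule as above: if the average level sits 20 dB below full scale, the range actually left underneath the programme shrinks by the same amount, which is where 24-bit delivery starts to look defensible for film.

```python
# Usable range (in dB) left between the average programme level and the noise floor,
# using the rough 6.02 dB-per-bit rule.
def usable_range_db(bits, headroom_db):
    return 6.02 * bits - headroom_db

for bits in (16, 24):
    for headroom in (0, 20):
        print(f"{bits}-bit with {headroom} dB headroom: ~{usable_range_db(bits, headroom):.0f} dB")
# 16-bit with 20 dB of headroom leaves ~76 dB; 24-bit still leaves ~124 dB
```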
 
The AES, certainly not an enemy of audiophiles, published a major study in 2007 testing this hypothesis. Using higher resolution formats like SACD and DVD-A as their hi-rez samples and 44.1/16 bit versions of the same program material as their standard resolution samples, the 500-plus subjects in the study had no better chance of picking the hi-rez content by listening than they would have had by flipping a coin. It was a chilling indictment of the hi-rez/hi-fi industry.

...(respectful snip)... Anything more is spec-driven nonsense.

Absolutely. The only thing I would say is that where DVD-A has potential is not in terms of noticeably better fidelity, but noticeably larger dynamic range. But that depends entirely on whether or not the original master recording was designed to take advantage of a 144dB dynamic range.

Earlier engineers like Glyn Johns and Bruce Swedien would be excellent candidates for working in that medium, but today's engineers don't know a damned thing about the process of sweetening. Recordings in the last twenty years have flatlined at a constant peak average loudness... and they sound about as exciting and dynamic as pea soup.
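
One way to put a number on "flatlined" is the crest factor, the peak-to-RMS ratio of a mix in dB; heavily limited masters end up with very low values. A toy illustration, with a hard-clipped sine standing in for a brickwalled master:

```python
import numpy as np

def crest_factor_db(x):
    """Peak-to-RMS ratio of an audio signal in dB; bigger = more dynamic."""
    peak = np.max(np.abs(x))
    rms = np.sqrt(np.mean(np.square(x)))
    return 20 * np.log10(peak / rms)

# Toy comparison: a clean sine vs. the same sine driven into hard clipping
t = np.linspace(0, 1, 44100, endpoint=False)
sine = np.sin(2 * np.pi * 440 * t)
brickwalled = np.clip(3 * sine, -1, 1)

print(f"clean sine:  {crest_factor_db(sine):.1f} dB")         # ~3 dB
print(f"brickwalled: {crest_factor_db(brickwalled):.1f} dB")  # noticeably lower
```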
 
Can't be much of an abomination if the vast majority of people are fine with it.

Is this the same "vast majority" that decide what is in the "pop" charts? The same "vast majority" that Steve Jobs determined has no clue what is best for themselves until 'what is good' is shoved down their throats?
In that case, yeah, sure, the "vast majority" are fine with it. I'm not though, and I believe I speak for the greater good.

mp3's sound terrible compared to their lossless versions. I use £110 headphones. Do you think that I am in the wrong for wanting lossless audio as it is originally created by the music makers? You think everyone should just accept a third party (Apple) compressing the original audio into mp3's and selling it? Don't you want the original audio data as the music maker intended it to sound? It should be our right, as consumers, to receive the original audio and not some degraded copy (an mp3).
 
It should be our right, as consumers, to receive the original audio and not some degraded copy (an mp3).

This I agree with. Unless the mp3s were deeply discounted in price (they're not), I'd prefer to buy in lossless format. This is why I still mainly buy CDs to rip myself - and only really use iTunes when I know I only want to buy one or two tracks of an album.
 
One of the more interesting sections of Meyer/Moran is the following note at the end of their paper:

"Though our tests failed to substantiate the claimed advantages of high-resolution encoding for two-channel audio, one trend became obvious very quickly and held up throughout our testing: virtually all of the SACD and DVD-A recordings sounded better than most CDs— sometimes much better. Had we not “degraded” the sound to CD quality and blind-tested for audible differences, we would have been tempted to ascribe this sonic superiority to the recording processes used to make them."

Those interested in the Meyer/Moran study will also want to look at Tsutomu Oohashi's 2000 study in the Journal of Neurophysiology. Professor Oohashi is a pretty distinguished Japanese scientist who is also a composer.

http://jn.physiology.org/content/83/6/3548.full#aff-2

Here's the abstract:

"Although it is generally accepted that humans cannot perceive sounds in the frequency range above 20 kHz, the question of whether the existence of such “inaudible” high-frequency components may affect the acoustic perception of audible sounds remains unanswered. In this study, we used noninvasive physiological measurements of brain responses to provide evidence that sounds containing high-frequency components (HFCs) above the audible range significantly affect the brain activity of listeners. We used the gamelan music of Bali, which is extremely rich in HFCs with a nonstationary structure, as a natural sound source, dividing it into two components: an audible low-frequency component (LFC) below 22 kHz and an HFC above 22 kHz. Brain electrical activity and regional cerebral blood flow (rCBF) were measured as markers of neuronal activity while subjects were exposed to sounds with various combinations of LFCs and HFCs. None of the subjects recognized the HFC as sound when it was presented alone. Nevertheless, the power spectra of the alpha frequency range of the spontaneous electroencephalogram (alpha-EEG) recorded from the occipital region increased with statistical significance when the subjects were exposed to sound containing both an HFC and an LFC, compared with an otherwise identical sound from which the HFC was removed (i.e., LFC alone). In contrast, compared with the baseline, no enhancement of alpha-EEG was evident when either an HFC or an LFC was presented separately. Positron emission tomography measurements revealed that, when an HFC andand an LFC were presented together, the rCBF in the brain stem and the left thalamus increased significantly compared with a sound lacking the HFC above 22 kHz but that was otherwise identical. Simultaneous EEG measurements showed that the power of occipital alpha-EEGs correlated significantly with the rCBF in the left thalamus. Psychological evaluation indicated that the subjects felt the sound containing an HFC to be more pleasant than the same sound lacking an HFC. These results suggest the existence of a previously unrecognized response to complex sound containing particular types of high frequencies above the audible range. We term this phenomenon the “hypersonic effect.”

As there are critics of the Meyer/Moran study (results "contaminated by equipment that could not reproduce high resolution, unknown equipment, material that is not high resolution"), the Oohashi study also takes hits (mostly due to the fact that the results have yet to be replicated).

This whole area of the human auditory function and the brain's response to sound is a fascinating one and far from scientifically "settled."

In the meantime, if Apple would offer downloads that are at least CD quality that would be nice.:)
 
Do you think that I am in the wrong for wanting lossless audio as it is originally created by the music makers?

The music as originally created wasn't lossless; it was live.

That aside, you're not wrong for "wanting" - anybody can "want" anything they like whenever they like. And, hey, there's nothing even stopping you from going out and getting what you want, just the way you like it.

But there is a lot wrong with believing Apple (or anyone else) owes you more than they're already providing.
 