
jasonnorm

macrumors regular
Original poster
Aug 10, 2007
112
0
Milford, MA
With Steve's statements at the keynote regarding HD rentals via AppleTV, I was very excited and hopeful that the full 720p capabilities of the AppleTV would be utilized. Based on this article, though, I am now pretty skeptical about what we might actually be getting.

I currently have Comcast and I know they compress the heck out of my signal (both analog and digital, plus Comcast's blasphemy of HD), so I am hopeful that Apple's HD will at least match, and ideally beat, Comcast's. Now I am concerned it will not...

The comparison chart from the article below doesn't give much hope for high quality "HD" rentals.

Any 'videophiles' out there with superior knowledge of how HD and compression work want to chime in and provide some comfort (or confirmation of my fears) to all the anxious AppleTV owners?
 

Attachments

  • HD_Source_Comparison.JPG (58 KB)

wmealer

macrumors regular
May 7, 2006
173
0
Not as bad as it looks

This chart is disturbing, indeed. I am an :apple:tv owner, as well as a DishHD subscriber.

One thing that is a little misleading is that 720p video doesn't need as high a bitrate as Dish's 1440x1080 picture or DTV's 1280x1080. Do the math and Dish's picture is 1.6875 times the size of iTunes's upcoming 720p picture (1440 x 1080 = 1,555,200 pixels, 1280 x 720 = 921,600 pixels, and 1,555,200 / 921,600 = 1.6875).

The fact that iTunes's bitrate is just under half Dish's leads me to believe iTunes rentals will be comparable in quality to my satellite picture.
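
If you want to sanity-check that with a quick back-of-the-envelope script (the bitrates below are stand-in guesses, not the chart's actual numbers), compare bits per pixel:

# Back-of-the-envelope bits-per-pixel comparison. The bitrates here are
# placeholder assumptions, NOT the chart's actual figures.
def bits_per_pixel(bitrate_bps, width, height, fps=30):
    """Average bits available for each pixel of each frame."""
    return bitrate_bps / (width * height * fps)

dish_bpp = bits_per_pixel(8_500_000, 1440, 1080)   # assumed ~8.5 Mbps Dish feed
itunes_bpp = bits_per_pixel(4_000_000, 1280, 720)  # assumed ~4 Mbps iTunes rental

print(f"Dish:   {dish_bpp:.3f} bits/pixel")   # ~0.182
print(f"iTunes: {itunes_bpp:.3f} bits/pixel")  # ~0.145
# Roughly 80% as many bits per pixel -- the same ballpark, which is why the
# halved bitrate isn't as alarming as the chart makes it look.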

And in my experience converting mkv downloads to 5 Mbps mp4 files for :apple:tv, that theory holds up. My 720p mp4 conversions are consistently equal to or better than my satellite picture quality.

I won't be doing any renting anyway, but I just wanted to offer my $0.02.
 

Avatar74

macrumors 68000
Feb 5, 2007
1,608
402
I've done both video and audio post...

The thing is... even an ATSC MPEG-2 stream is highly compressed. Uncompressed 4:4:4:4 HD streams at a minimum of 275 Mbps, more than ten times the MPEG-2 ATSC bitrate cited in the chart (I'm not sure that figure is accurate off the top of my head, but let's assume it's close).

H.264 is a very different paradigm from MPEG-2. The compression algorithms in MPEG-4 Part 10 (aka H.264) are drastically different, and they can hold artifacts to a similar level while needing much less data to reconstruct the original signal.

The thing you have to keep in mind is that intelligent compression algorithms aren't just chopping out data and playing back a half-assed copy.

Let me use a couple of examples to illustrate. I'm using audio examples because they're much easier to break down, but the principles of digital encoding are the same:

Linear PCM is an uncompressed audio format. At 16 bits per channel, in stereo (2 channels), with a sampling frequency of 44.1kHz, CD audio has a bit rate of roughly 1,411 kbps.
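
If you want to verify that figure, the arithmetic is just bits per sample x channels x samples per second:

# Linear PCM bit rate for Red Book CD audio.
bits_per_sample = 16
channels = 2
sample_rate_hz = 44_100

bitrate_bps = bits_per_sample * channels * sample_rate_hz
print(bitrate_bps)  # 1411200 bits/s, i.e. ~1,411 kbps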

Now even at this rate there's data missing, because it's not an exact analogue of the source waveform... it's a series of amplitude values sampled at regular intervals and stored, ultimately, as binary data. But that's not what we hear. What we hear is effectively identical to the analog source because of the process of reconstruction.

People often make the mistake of thinking that artifacts are a consequence of digital encoding. They are not. They are a consequence of errors that occur when the analog wave is reconstructed from the digital source. You have to remember that all this data is ultimately converted into a series of voltage oscillations that reproduce an analog wave through a loudspeaker.

I could write a book about the various things that happen in a digital system during reconstruction of the analog wave... in fact, a book has already been written on the subject: Principles of Digital Audio by Ken Pohlmann.

There are a few things... dithering, which introduces almost imperceptible noise into the signal to ensure that amplitude values don't fall into the wrong "bucket". There's a low-pass filter at 20kHz to cut out frequencies above the Nyquist limit, because the sampling frequency was chosen to represent only the frequencies we can hear... anything above that would cause frequency aliasing, which is not, contrary to popular perception, jaggedness in the signal, but the recreation of a different frequency than the original. (Ping me in IM if you really want to hear the longer explanation of this.)
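
To make the aliasing point concrete, here's a tiny sketch: a 25 kHz tone (above the 22.05 kHz Nyquist limit) sampled at 44.1 kHz without that low-pass filter comes back as a 19.1 kHz tone, a genuinely different frequency rather than a "jagged" version of the original:

# Frequency aliasing: a tone above Nyquist (fs/2) reconstructs as a
# different frequency, folded back into the representable band.
fs = 44_100      # sampling rate, Hz
f_in = 25_000    # input tone above the 22,050 Hz Nyquist limit

f_alias = abs(f_in - round(f_in / fs) * fs)
print(f_alias)   # 19100 -- the frequency you'd actually get back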

Then there's the digital data and transport medium itself... which inherently carries error-checking data, buffering, and reclocking of the carrier frequency to eliminate "jitter" (audiophiles LOVE to claim lesser systems today have jitter, but reclocking has been present in just about every digital-to-analog converter since the mid-1980s, so this is utter nonsense).

Ok, now let's take a look at another format for comparison...

Adaptive Delta PCM... it takes up about half the data of a Linear PCM bitstream, but it's lossless. How is this possible? Where Linear PCM stores an absolute numerical value for the amplitude of the audio at each sample, ADPCM stores only the relative change (delta) in amplitude from the previous sample. This greatly reduces the data required to reconstruct the exact same soundwave with no loss.
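
Here's a toy sketch of the delta idea (plain, non-adaptive delta coding, so the round trip is bit-exact; real ADPCM goes further and adaptively quantizes those deltas):

# Store the first sample, then only the change from the previous sample.
samples = [0, 120, 250, 245, 180, 60, -40]
deltas = [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

# Decoding: a running sum of the deltas rebuilds the original exactly.
reconstructed = []
level = 0
for d in deltas:
    level += d
    reconstructed.append(level)

assert reconstructed == samples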

Other encoding algorithms go even further. Dolby Digital, for example, uses a series of techniques to squeeze phenomenal performance out of a 448 kbps bitstream that is acoustically transparent to the source. In Dolby Digital's case there's not only a low-pass filter at 20kHz to eliminate aliasing, but also RF intermodulation and DC hum filters to cut out signal interference from the recording/mastering/encoding hardware... there are metadata parameters that tell the receiver to normalize the output to an average loudness so that dialogue stays audible no matter what the sound effects are doing, plus Dynamic Range Control that tells the receiver when and how much to compress the signal so the full dynamic range fits the playback system without distortion. There's also more than likely some form of quantization throttling (like ADPCM), where the sample size at each interval isn't fixed but varies to only the bit depth required to reconstruct that sample faithfully... and so on and so forth.

H.264 is one of the most advanced video codecs out there, considerably more advanced than MPEG-2 and capable of delivering similar performance at much lower bandwidth. I won't say flawless, because MPEG-2 isn't flawless either... optically scanned 16-bit RAW HDRI is, but good luck putting together an affordable system that can decode and project film at about 960 megaBYTES per second. Also, as the poster above stated, AppleTV is using 720p instead of 1080p... that's a big difference in bitstream requirements, since 1080p carries 2.25 times the pixels of 720p, and it's entirely unnecessary margin for most home viewing because most people cannot tell the difference.

Is it going to be as nice as Blu-ray? Probably not... but I wouldn't think the issues will be that noticeable unless you have a very large screen. Suffice it to say that 250 Mbps 2K digital theatrical projection (2048x1150) is good enough for 50-foot screens according to the Society of Motion Picture and Television Engineers (SMPTE), which is currently proposing it as the standard for digital theatrical projection systems even though 4K and 2540p formats are higher resolution, and 35mm film scanned optically in 16-bit-per-channel uncompressed HDRI runs about a gigaBYTE per second of footage.

Despite what the audiophiles say, I have yet to meet a person who can pass a blind test telling the difference between 128 kbps MPEG-4 AAC and 16-bit Linear PCM. I engineer this stuff professionally and I can't tell the difference. I would suspect the same holds true for H.264 at certain screen sizes. I have not seen double-blind test results, only anecdotes where the subject clearly knew which sample they were watching and thus biased the results.
 

wmealer

macrumors regular
May 7, 2006
173
0
Avatar74, you really should bone up on your A/V before you go spewing your obvious ignorance on the matter. ;)

All I can say is... wow.

But to further illustrate how much more advanced H.264 is than MPEG-2, I can offer my real-world, layperson's experience. DishHD likely gives the same bitrate to each of its channels, whether they're encoded in MPEG-2 or H.264. If that assumption holds, the H.264 channels are clearly distinguishable from the MPEG-2 channels. The quality difference is as obvious as night and day.
 

wmealer

macrumors regular
May 7, 2006
173
0
Are you a robot? Seriously. My response was so downright absurd, anyone with a pulse would know it was a joke.

Dude, get some sunlight every now and then. You'd be surprised there's an entire world out there you never knew existed. If you'd take a minute and read the latter half of my post, you just might recognize that I was AGREEING with you. That is, of course, unless you are in fact an inanimate object sent from a faraway planet to educate Earthlings on all things technical.

Anyway, so much for friendly camaraderie among enthusiasts. Let's all just revert to being so literal and humorless that we can't crack a smile every now and then.

SHEESH!
 

jasonnorm

macrumors regular
Original poster
Aug 10, 2007
112
0
Milford, MA
Wow. That was impressive indeed. Thanks for that breakdown, and the real-world Dish experience from the other posters is also encouraging.

I guess I will leave it up to my eyes, when it comes out, to judge how the quality compares to my current crappy Comcast HD signal.
 

Avatar74

macrumors 68000
Feb 5, 2007
1,608
402
Are you a robot? Seriously. My response was so downright absurd, anyone with a pulse would know it was a joke.

Dude, get some sunlight every now and then. You'd be surprised there's an entire world out there you never knew existed. If you'd take a minute and read the latter half of my post, you just might recognize that I was AGREEING with you. That is, of course, unless you are in fact an inanimate object sent from a faraway planet to educate Earthlings on all things technical.

Anyway, so much for friendly camaraderie among enthusiasts. Let's all just revert to being so literal and humorless that we can't crack a smile every now and then.

SHEESH!

I've deleted my response to you because of an obvious lack of sarcasm comprehension on my part. I know it's a flaw I possess... my wife and friends tell me I'm too serious.

Rest assured I do get out... sometimes. I just have a habit of being a repository for loads of technical minutiae that's ostensibly useless to the general population. Sure, I'm smiling in my profile image... but only after about five takes with no smile and my wife mocking my seriousness and calling me a "douchebag" before I cracked up.

Suffice it to say I *was* rather puzzled why you would appear to be arguing with my technical knowledge and then appear to agree with me anyway... I'm analytical to a fault. I'd backtrack to every place I last saw my wallet before thinking to just check my back pocket first.

:D
 

Avatar74

macrumors 68000
Feb 5, 2007
1,608
402
Three cheers for humanity!

Hip-hip hurray!

Hip-hip hurray!

Hip-hip hurray!

I feel like Jeff Goldblum and Will Smith at the end of Independence Day.

Yeah but in Independence Day the aliens don't go willingly. *rimshot*
 

wmealer

macrumors regular
May 7, 2006
173
0
I've deleted my response to you because of an obvious lack of sarcasm comprehension on my part. I know it's a flaw I possess... my wife and friends tell me I'm too serious.

Rest assured I do get out... sometimes. I just have a habit of being a repository for loads of technical minutiae that's ostensibly useless to the general population. Sure, I'm smiling in my profile image... but only after about five takes with no smile and my wife mocking my seriousness and calling me a "douchebag" before I cracked up.

Suffice it to say I *was* rather puzzled why you would appear to be arguing with my technical knowledge and then appear to agree with me anyway... I'm analytical to a fault. I'd backtrack to every place I last saw my wallet before thinking to just check my back pocket first.

:D
I can assure you, if there was an emoticon available for "bowing at one's feet," I would have used that one.

I appreciate the retraction. Note to self: "Sarcasm, Avatar74, and calling into question his technical knowledge = bad. You can only have 2 of these 3."

My work is done here. Nothing more to see. Move along, folks.
 

Blazer5913

macrumors 6502
Jan 20, 2004
386
14
This chart is disturbing, indeed. I am an :apple:tv owner, as well as a DishHD subscriber.

One thing that is a little misleading is that 720p video doesn't need as high a bitrate as Dish's 1440x1080 picture or DTV's 1280x1080. Do the math and Dish's picture is 1.6875 times the size of iTunes's upcoming 720p picture (1440 x 1080 = 1,555,200 pixels, 1280 x 720 = 921,600 pixels, and 1,555,200 / 921,600 = 1.6875).

The fact that iTunes's bitrate is just under half Dish's leads me to believe iTunes rentals will be comparable in quality to my satellite picture.

And in my experience converting mkv downloads to 5 Mbps mp4 files for :apple:tv, that theory holds up. My 720p mp4 conversions are consistently equal to or better than my satellite picture quality.

I won't be doing any renting anyway, but I just wanted to offer my $0.02.

I am in the process of ripping my 720p mkv movies for the AppleTV as well. Currently I use VisualHub and select the AppleTV preset with 2-pass encoding. But based on your results, would you recommend upping the bitrate to the absolute max it can be, 5 Mbps? I really don't mind using the extra space if it's visibly worth it. How large are your resulting files, usually?
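
For anyone doing the same math, here's roughly what a constant 5 Mbps video track works out to over a two-hour movie (audio and container overhead not included):

# Rough file-size estimate for a 5 Mbps Apple TV encode (video track only).
video_bitrate_bps = 5_000_000
duration_s = 2 * 60 * 60   # two-hour movie

size_bytes = video_bitrate_bps * duration_s / 8
print(f"{size_bytes / 1024**3:.1f} GiB")  # ~4.2 GiB of video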
 