FWIW, I think ATV is geared toward the casual viewer, rather than videophiles.

That doesn't answer the question, though, of why their 1080p files look so inferior to their 720p. I actually think their 720p encodes look pretty decent - superior to cable most of the time. But their 1080p encodes look terrible, because they only increased the bitrate 20% for over 2x the number of pixels.
 
I love the quality of the 1080p stuff I'm seeing on ATV! Looks great, just like the Netflix SuperHD stuff.

Not Apple's fault if Eureka isn't looking great - they deliver the content for sale/rental on behalf of the label/studio; it's up to the content owners to ensure it's encoded by a good house before being submitted to Apple. Some content owners encode from a "master" source; some just rip from the Blu-ray/DVD to submit to Apple.

Apple doesn't set the prices either; they're technically just the retailer.

Quit blaming Apple, people.

PS: If you think this Eureka looks bad, check out Star Trek: Voyager and DS9. They were submitted by Paramount to iTunes around 7 years ago, when the resolution was designed for the video iPod: 320x240. Check out the previews - that's what you get when you "buy" it. It's shocking. Thanks, Paramount!
 
My 1080p stuff on my Apple TV looks amazing. It's flawless.

Probably your internet
 
Flawless? Overstatement of the year. Check the pics earlier in the thread.

Cool story, bro... Remind me to compare screenshots on a computer screen instead of judging it for myself by watching my TV. If I like the quality and think it's flawless to my eyes, then I call it flawless. I watched some random snowboard movie in HD on Netflix yesterday and was shocked at how good it was.

Sorry you're so caught up in running benchmarks and reading comparisons all day...ha.
 
Agreed - people make a big fuss about the quality difference. I own Prometheus in HD on both iTunes and Blu-ray, and I honestly can't see a big difference - in a blind test I wouldn't know which is which.

As for 720p iTunes content versus 1080p, it seems the AVS forums disagree and say how vastly better the 1080p is.

With that said, it's experiential, and screen grabs will only do so much - we aren't looking at still images but a moving picture show :)
 

Question: how big a screen are you comparing it on? It's usually way easier to see the difference when using a good projector displaying 100+" than a good TV at 40".
 
I use an Epson projector with a screen around 150 inches or so - even at that size, iTunes provides a viewing experience similar to Blu-ray, for me at least. Feel free to disagree, but stop trying to convince people that your opinion is right or the only way to go.

I don't try to convince people that iTunes is the only way to go because the alternative is crap (which it can be, with the unskippable previews and other hassles). Both are valid viewing options - that is why we have choice in the marketplace.
 
Cool story, bro... Remind me to compare screenshots on a computer screen instead of judging it for myself by watching my TV. If I like the quality and think it's flawless to my eyes, then I call it flawless. I watched some random snowboard movie in HD on Netflix yesterday and was shocked at how good it was.

Sorry you're so caught up in running benchmarks and reading comparisons all day...ha.

I can tell the difference between Apple's 720p and 1080p encodes on my 50" TV from a normal viewing distance. 1080p looks worse for everything besides animation. The good thing about comparison screenshots is that they prove to people like you which is better; you can't argue against them. I'm not looking at them on a small computer monitor - I'm looking at them on my TV. Since you can't/don't want to see the difference, that's great for you and Apple, because it proves that normal consumers just don't give a damn about quality and they can get away with low-bitrate encodes without most people caring.

----------

As for 720p iTunes content versus 1080p, it seems the AVS forums disagree and say how vastly better the 1080p is.

That AVS thread didn't prove anything. Perhaps you should read it again. Argo preserves much more grain in the 720p and is the most faithful to the Blu-ray. Life of Pi 1080p is slightly more detailed depending on the scene, but it's a clean source with no grain. Skyfall (which you conveniently left out) looked better in 720p. The pictures posted in those threads aren't a good measure of quality because he didn't post the uncompressed .png - they were cut/cropped/zoomed into a smaller box. He even modified some of them to boost the gamma, which you will never do when watching on your TV.

http://www.avsforum.com/t/1458077/a...itunes-vudu-and-blu-ray/100_100#post_22983623

Update: A thank-you to the AVS members who brought this to my attention. iTunes 720p files can look better than their 1080p equivalents. I had to see for myself, and the results are surprising but undeniable. When it comes to fidelity, iTunes 720p has the best overall image quality, especially during difficult-to-render scenes.
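
For what it's worth, if anyone wants to do their own fair comparisons: grab untouched lossless frames and leave the gamma alone. A minimal sketch driving ffmpeg from Python (the filename and timestamp are hypothetical):

import subprocess

# Grab exactly one untouched frame as a lossless PNG for comparison.
# No scaling, cropping, or gamma/levels changes are applied.
subprocess.run([
    "ffmpeg",
    "-ss", "00:12:34",        # seek to the frame of interest (hypothetical)
    "-i", "movie_1080p.m4v",  # hypothetical filename
    "-frames:v", "1",         # extract exactly one frame
    "frame_1080p.png",        # PNG is lossless, so the grab adds no artifacts
], check=True)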
 
Yeah, I think the prevailing wisdom is that the extra compression on the iTunes 1080p encode is at fault.

It's not the compression method itself, it's the low bitrate used - unless that's what you meant by "compression". In that case I agree, and it's obviously the reason. Bump the bitrate up from 5 to 8 Mbps and they would look great. The bitrate should have been doubled in the first place: 1080p has 2.25x the pixels of 720p, but they only increased the bitrate 20%. The bitrate is not high enough for 1080p to keep detail in all scenes, especially with motion or film grain.
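
To put numbers on it, here's a quick back-of-the-envelope sketch in Python, using the approximate bitrates discussed in this thread and 24 fps film:

# Bits available per pixel per frame at the approximate iTunes bitrates.
def bits_per_pixel(bitrate_bps, width, height, fps=23.976):
    return bitrate_bps / (width * height * fps)

print(f"720p  @ 4 Mbps: {bits_per_pixel(4_000_000, 1280, 720):.3f} bpp")
print(f"1080p @ 5 Mbps: {bits_per_pixel(5_000_000, 1920, 1080):.3f} bpp")
print(f"1080p @ 8 Mbps: {bits_per_pixel(8_000_000, 1920, 1080):.3f} bpp")
# -> roughly 0.181, 0.101 and 0.161 bpp: the 1080p encode gets barely
#    half the bits per pixel of the 720p one; 8 Mbps would nearly close it.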
 

Yeah, that's what I meant. They chose to throttle it down for the 1080p stream of Skyfall.
 
I think these well-done posts will refute what people are claiming here - very cool posts at AVS:

Argo: iTunes 720p vs iTunes 1080p vs Vudu vs BluRay
http://www.avsforum.com/t/1459687/argo-itunes-vs-vudu-vs-blu-ray

Life of Pi: iTunes 720p vs iTunes 1080p vs Vudu vs BluRay
http://www.avsforum.com/t/1460032/life-of-pi-itunes-vs-vudu

Looking forward to more of these AVS posts!

Thanks! Excellent comparison. Particularly the one at http://www.avsforum.com/content/type/61/id/155081/ - it certainly shows iTunes 720p to have a lead over iTunes 1080p because of the much less aggressive compression. (The original Blu-ray is, of course, far better.) So much for "flawless" iTunes quality.
 
If anyone still thinks Apple's 1080p iTunes encodes are better than the 720p, just take a look at this, from True Blood...

http://screenshotcomparison.com/comparison/28741

Wow, just wow. :eek:

I think it's silly to compare encodes of single frames of video from different distributors who use different encoding software. Frame 11771 may be encoded differently in software A than in software B.

Your entire rant would have earned an F in my 6th grade science class. Where is your scientific method?

I'm not in 6th grade anymore either ;)
 

Thank you spacepower7!!! It's nice to see an intellectual approach to issues.
 
The encoding standards that the distributors use are strictly set by Apple. 720p uses the High@3.1 profile with 2 ref frames, no CABAC, and ~4000 kbps bitrate. 1080p uses the High@4.0 profile with 4 ref frames, CABAC, and ~5000 kbps bitrate. If you think that this picture isn't representative of 99% of the encodes that Apple sells, I don't know what to tell you. Enjoy your 1080p. If anyone would like me to post more examples, just ask. I have plenty.
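
For anyone who wants to see what those constraints do, here's a rough approximation with ffmpeg/x264, driven from Python. This is a sketch of similar settings, not Apple's actual toolchain, and the filenames are hypothetical:

import subprocess

# "720p-style": High@3.1, 2 reference frames, CABAC off, ~4000 kbps
subprocess.run([
    "ffmpeg", "-i", "master.mov", "-vf", "scale=1280:720",
    "-c:v", "libx264", "-profile:v", "high", "-level:v", "3.1",
    "-b:v", "4000k", "-x264-params", "ref=2:cabac=0",
    "out_720p.mp4",
], check=True)

# "1080p-style": High@4.0, 4 reference frames, CABAC on, ~5000 kbps
subprocess.run([
    "ffmpeg", "-i", "master.mov", "-vf", "scale=1920:1080",
    "-c:v", "libx264", "-profile:v", "high", "-level:v", "4.0",
    "-b:v", "5000k", "-x264-params", "ref=4:cabac=1",
    "out_1080p.mp4",
], check=True)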
 

I think you missed my point.

Even using the exact same settings in the same software: let's pretend the movie studio gives the exact same file to Netflix and Apple. Films have leaders, heads and tails - those countdown numbers in old movies.

If Apple or Netflix trims the intro or outro by a couple of frames to save bits and bytes transmitted, the entire encode shifts, and individual frames will be compressed in different groups.

I'm not saying one is better than the other, I'm just saying your methodology (or lack thereof) would have failed my 6th grade science class. (I say that as a student, not a teacher.) You are comparing single frames from different sources? Different codec settings? A different number of frames?

Even with the same codec settings, do you know if the files have the exact same number of frames? A different number of frames at the beginning (even black space) could cause frame 1001 of the actual movie to be compressed differently.
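
A toy illustration of that last point in Python - the keyframe interval and trim length are made up, not taken from any real encode:

# With a keyframe every 48 frames, trimming even a short leader changes
# which GOP a given frame falls in and how far it sits from its keyframe.
GOP = 48  # hypothetical keyframe interval

def gop_position(frame, leader_trimmed=0):
    n = frame - leader_trimmed
    return n // GOP, n % GOP  # (GOP index, offset from that GOP's keyframe)

print(gop_position(11771))                     # -> (245, 11)
print(gop_position(11771, leader_trimmed=24))  # -> (244, 35)
# The same movie frame sits 11 frames past a keyframe in one encode and
# 35 in the other, so it gets predicted and compressed differently.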

I don't care which is better, and it really doesn't matter; your analysis still fails.

Your opinion may be valid or may not be valid, but you don't have any science or math to support your rant.
 
Apple's 1080p content sucks

Yup, which is why I buy Blu-rays. Apple content is fine for something you just want to throw on, but when I want a quality picture (sharper, fewer artifacts and better colour) - oh, and better quality sound - I put on the Blu-ray.

Guess there might be a difference between the 4 GB 1080p files from Apple and the 12-20 GB files on the Blu-rays after all ;)
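
Those file sizes translate pretty directly into average bitrate. A rough back-of-the-envelope in Python, assuming a two-hour movie and decimal gigabytes:

# Average bitrate implied by file size.
def avg_mbps(size_gb, runtime_min=120):
    return size_gb * 8_000 / (runtime_min * 60)  # GB -> megabits, / seconds

print(f"{avg_mbps(4):.1f} Mbps")   # ~4.4 Mbps for a 4 GB iTunes file
print(f"{avg_mbps(20):.1f} Mbps")  # ~22.2 Mbps for a 20 GB Blu-ray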
 
It's not a rant. I and many others have watched hundreds of iTunes shows in both 720p and 1080p. The reason the 1080p looks worse is that the bitrate is too low; it's as simple as that. The 1080p has over 2x as many pixels but only a 20% larger bitrate. It can't hold detail as well in motion. It breaks up. The more film grain there is, the worse the 1080p looks.

This isn't some hidden fact - people have done comparisons between these shows since they started releasing them in 1080p. It's not just noticeable in screenshots; the difference is visible at a normal viewing distance.

At best my point is to educate people to watch the 720p instead if they are going to buy these shows; at worst it's to show people that Apple is being disingenuous about the 1080p quality being better while requiring you to buy a new Apple TV to watch it. If you choose not to look at the facts, or don't care, that's fine, but it doesn't make it any less true.
 

I'm not arguing about whether iTunes 1080p sucks - maybe it does. I'm arguing that your screenshots are not scientific proof. Your screenshots assume all things are equal, which neither you nor I know.

You are posting screenshots as facts - as science - while assuming many factors.

I'm not arguing with your opinion, I'm arguing against your assumptions and screenshots, because you can't back them up. Neither of us has all the technical details.

I really don't care if iTunes 1080p is good or not; you just haven't been willing to talk about science, facts, and math.
 
Since I own a MacBook with Retina display, I have no means to play physical media on my computer. I chose to go with iTunes and have started my movie collection with 50 movies or so and some TV shows. Now I am thinking about buying my first Apple TV; however, I have an AV setup. My question is: will the Apple TV be able to connect to a Yamaha amplifier (which has HDMI ins and outs)? Also, will the sound and picture be comparable to Blu-ray? For instance, do iTunes movies use codecs like Dolby TrueHD?
 
You will be able to connect over HDMI. You will not have HD audio, though: iTunes movies carry Dolby Digital 5.1 (AC-3), not lossless codecs like Dolby TrueHD.
 