# 1080p or 720p? A Quantitative Comparison and Final Answer

Discussion in 'Apple TV and Home Theater' started by anubis, Feb 22, 2008.

1. ### anubis macrumors 6502a

Joined:
Feb 7, 2003
#1
Hey everyone.

I know that this particular topic doesn't have anything to do with the Apple TV per se, but I thought I would share with you my knowledge to help you decide if you should get a 1080p or a 720p TV.

I am getting a Master's Degree in Optical Science from the University of Arizona. I was recently shopping for a new TV and I was faced with a choice that most of you have faced or will face when shopping for a new HDTV: should I save some dough and get a 720p TV, or is the additional cost worth it for a 1080p TV? I am going to present a purely quantitative and scientific way for determining if a 1080p TV is worth it. In fact, below I will list an equation that will tell you once and for all which one you should purchase.

In order to understand the equation, you must understand the following concepts. The 20/20 human eye has a maximum visual acuity of 1 arcminute. That is, two points must subtend an angle greater than 1 arcminute in order for a 20/20 eye to resolve the two points. What this means for TVs is that pixels on a TV must have an angular subtense of 1 arcminute or greater in order for your eye to resolve the detail in the video.

In order to determine the angular subtense of one full pixel (red+green+blue) of a 16:9 television, just use the following two expressions:

To determine the angular subtense in arcminutes for a 720p TV, use 60*arctan(a*0.00005319553/b), where a is the diagonal size of the television in inches and b is the distance you will sit from the TV in feet. (One caveat: this expression actually assumes a 768p panel, since 1366x768 is the true native resolution of most "720p" TVs.) Also note that the arctangent in these expressions is evaluated in degrees, not radians.

Example: Let's say I want to find the full-pixel angular subtense of a "720p" 42-inch LCD, and I plan on sitting 7 feet away from the TV. The angle subtended by each full pixel is 1.1 arcminutes. This is a condition that borders on the very edge of the maximum detail that the 20/20 human eye is capable of resolving.

To determine the angular subtense in arcminutes for a 1080p TV, use 60*arctan(a*0.000037827932/b).

Example: Same situation: 42-inch TV, sitting 7 feet away, but this time the TV is 1080p. We find that the full-pixel angular subtense is about 0.8 arcminutes. This is a situation where the TV's video resolution is beyond the absolute resolution limits of human vision.
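For anyone who wants to plug in their own numbers, the two expressions above collapse into one short function. This is just a sketch of the same math; the function name and the 1366-pixel assumption for "720p" panels are mine:

```python
import math

def pixel_subtense_arcmin(diagonal_in, distance_ft, h_pixels):
    """Angular subtense of one full pixel on a 16:9 panel, in arcminutes."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # screen width from diagonal
    pixel_in = width_in / h_pixels                   # width of one full pixel
    # distance converted to inches; small-angle arctangent, then degrees -> arcmin
    return 60 * math.degrees(math.atan(pixel_in / (distance_ft * 12)))

print(pixel_subtense_arcmin(42, 7, 1366))  # "720p" (768p) panel: ~1.1 arcmin
print(pixel_subtense_arcmin(42, 7, 1920))  # 1080p panel: ~0.78 arcmin
```

Any result above ~1 arcminute means a 20/20 eye can resolve the pixel structure; below it, the extra resolution is invisible from that seat.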

In the example provided, spending additional money on a 1080p television is like throwing money in the garbage. Your eye simply cannot resolve the additional video resolution of the 1080p TV in this situation.

Point of interest: although 20/20 vision is often considered "perfect," it is theoretically possible to have 20/6 vision. To get this vision, your eye's anatomy must be perfect and your pupil size must be about 4 mm. Any larger, and the visual field is dominated by aberrations; any smaller, and the visual field is diffraction limited, with an ever-increasing Airy disk diameter. In this perfect situation, with perfect field irradiance, the maximum resolution of this eye would be 0.4 arcminute. Note that very few people in the world have demonstrated uncorrected 20/6 vision.

4. ### GreatDrok macrumors 6502a

Joined:
May 1, 2006
Location:
New Zealand
#4
I have a 120" 720p front projection DLP system and sit 15 feet away from it. I cannot see any pixel structure whatsoever. I have been convinced for some time that 1080p is too much for any TV you might actually want to buy. I know people talk about a 50" set being the upper limit for 720p but in my experience even 60" from 7 feet is still going to be too small to justify 1080p.

Don't get me started on 1080i versus 1080p either

5. ### rodolfo macrumors newbie

Joined:
Feb 18, 2008
#5
Great Post and Topic:

I don't understand why people want to believe that 720p24 (4:2:0) is HD, when in reality it contains less than 40% of the information of a real HD stream: 1920x1080p24 is HD, 1920x1080i59.94 is HD, 1280x720p60 is HD. 853x480p30 is not HD, 960x540p24 is not HD, 1280x720p24 is not HD, 1280x720p30 is not HD.
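The rough percentage can be sanity-checked by counting raw samples per second. This is only a sketch under assumptions of mine: 4:2:0 chroma for both streams (1.5 samples per pixel), and 1080i59.94 treated as ~29.97 full frames per second:

```python
def samples_per_sec(width, height, fps, chroma_factor=1.5):
    """Raw luma+chroma samples per second (1.5x pixels for 4:2:0 subsampling)."""
    return width * height * chroma_factor * fps

s_720p24 = samples_per_sec(1280, 720, 24)       # 720p24
s_1080i = samples_per_sec(1920, 1080, 29.97)    # 1080i59.94 ~ 29.97 frames/s
print(s_720p24 / s_1080i)  # roughly 0.36, i.e. about a third of the samples
```

Under those assumptions, 720p24 carries roughly 36% of the sample rate of a 1080i59.94 stream, consistent with the "less than 40%" figure above.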

Also "I don't understand" why Apple is obscuring the fact that 720p means 720p60 and not 720p24. It is not only about spatial resolution but also about temporal resolution and spectral resolution, including chroma information. 720p was created for fast-motion content such as sports (believe me, you don't want to watch sports at 24fps). The funny part is that even 720p60 is only "deployed" in the USA, by very few broadcasters; in most of the world, HD = 1080. Another fact about 720p is that most TVs "up-convert" 720p60 to 1080, not to 1080p60.

The problem is probably that an Apple TV will not be able to play 1920x1080 H.264 High Profile 4:2:2 @ 15~35Mbps, as that little Intel solo-core box has no H.264 decoding hardware...

Regarding display sizes: for 720p, the maximum display size I recommend is 55 inches; for 1080, the maximum size I have calculated is 103 inches. BUT full HD spectral resolution must be there to have the right experience. (Those sizes are calculated at 21 dpi for 2~3 meters viewing distance.)

6. ### rodolfo macrumors newbie

Joined:
Feb 18, 2008
#6
Render a 1-pixel B&W grid and you will see it... (otherwise you may want to consider changing the lens of your projector)

7. ### Much Ado macrumors 68000

Joined:
Sep 7, 2006
Location:
UK
#7
Is this to say that a 1080p television will display an image with the same effective resolution as a real object placed next to it (from a distance)? I find this hard to comprehend.

8. ### HiRez macrumors 603

Joined:
Jan 6, 2004
Location:
Western US
#8
I can believe it. A good television with great source material, viewed from a decent distance, produces very, very sharp images. But of course an emissive television cannot come even close to competing with the contrast ratio you get from a real object in front of you, and there is no depth perception in play.

9. ### petvas macrumors 601

Joined:
Jul 20, 2006
Location:
Mannheim, Germany
#9
720p24fps is HD....
http://en.wikipedia.org/wiki/720p

10. ### Mackilroy macrumors 68040

Joined:
Jun 29, 2006
#10
720p isn't HD to people who believe Sony never lies and that the only way to get full use out of your TV is to have 1080p.

'Full HD' is just a marketing term, not reality.

11. ### Zwhaler macrumors 604

Joined:
Jun 10, 2006
#11
I read your post, and I thought that was interesting about the different eye quality... I myself have 20/15 or 15/20 (whichever is the one better than 20/20), and I didn't think I could distinguish quality that much better, but I guess I can, at least a little bit, based on the conclusion I came to.

I have a 40" Samsung 1080p LCD TV, and I can't say I entirely need all of the pixels, but I gotta admit that the quality really stands out when it comes to games and Blu-ray Discs. My friend had a 42" 720p LCD, and after comparing them we concluded that 1080p makes a big difference for gaming and BRD. On games like Call of Duty 4, everything looks more realistic at 1080p. This doesn't mean that 720p is bad, but as everything slowly begins to support 1080p, I think it is worth spending the few hundred dollars extra. But keep in mind that this is at an ~8 ft viewing range. (My TV only cost \$1600 new off Amazon.)

12. ### rodolfo macrumors newbie

Joined:
Feb 18, 2008
#12
ATSC allows 720p24 (and 16:9 480p) for interoperability reasons; that doesn't mean that 720p24 is HD. The goal of HD was to obtain at least 4 times better quality than SD video.

13. ### motulist macrumors 601

Joined:
Dec 2, 2003
#13
Just because a difference is clearly discernible does not mean that the difference is significant.

For instance, if I'm doing great in a college class and I only need a 50% on my final exam to receive an A+ in the class, then even though I can clearly discern whether I got a 50 or a 100 on my final, the difference is not significant, because both achieve the same goal equally. It's the same with some people's opinion of HD.

To me, and many others, we can clearly see that HD is sharper than SD, but seeing that extra detail doesn't make us more entertained.

14. ### elppa macrumors 68040

Joined:
Nov 26, 2003
#14
And of course all these arguments are moot if you are watching bad content or poorly filmed pictures.

16. ### err404 macrumors 68020

Joined:
Mar 4, 2007
#16
While this is a partially true argument for recorded video content, it does not hold up for real-time content like text or gaming. The problem is that the margin of error in rendered material leads to artifacts like aliased, jagged edges on diagonal lines. A 1080p real-time source is significantly smoother than a 720p source. Basically, the extra resolution allows your eyes to carry some of the load by averaging out the errors.

But even for video, your own calculations show that 720p is lower quality than what your eye can see, while 1080p is better than you can see. The math of your example reinforces that 1080p is visibly better than 720p on a 42" screen at 7'. (720p < Perfect < 1080p)

As for 60fps vs. 24fps: at the end of the day, recorded video content is almost exclusively <30 fps.

17. ### HiRez macrumors 603

Joined:
Jan 6, 2004
Location:
Western US
#17
And feature films are still almost exclusively recorded at 24 fps. With good source material and encoding, on a quality, calibrated screen, 720p is going to look darn good. And the benefits of progressive scanning will at least partially offset 1080i's resolution advantage. That, and the fact that you're not dealing with analog transmission degradation, make it far superior to NTSC analog SD, and for 24 fps sources you eliminate 3:2 pulldown artifacts if your TV supports them natively. I don't think Apple is trying to hide or get away with anything, as some would suggest.

18. ### fivepoint macrumors 65816

Joined:
Sep 28, 2007
Location:
IOWA
#18

To the thread starter: remember that screen size, resolution, and viewing distance are all equally important. This chart convinced me to save \$1000 and get the 720p Panasonic TH-50PX75U instead of the 1080p Panasonic TH-50PZ700U. With a 50-inch screen, sitting 10 feet away, it is apparently 'impossible' for me to see any difference at all.

19. ### Spanky Deluxe macrumors 601

Joined:
Mar 17, 2005
Location:
London, UK
#19
I've got a 1080p 46" screen and my normal viewing distance is 10 feet. It's true, I can't really tell any difference between 720p and 1080p content at this distance. Sometimes when I want a real HD experience, I'll sit on the floor about 5 feet away from the TV, and there I can definitely tell the difference. Considering 1080p TVs don't seem to cost much more than 720p ones these days, I'd always go for 1080p. Sure, you won't tell the difference in most viewing conditions, but if you *do* ever sit nearer, you will notice.

20. ### fivepoint macrumors 65816

Joined:
Sep 28, 2007
Location:
IOWA
#20
You must be kidding about the price difference. From what I've found, for the EXACT same TVs with the only difference being the resolution (1080 vs. 720), the price difference is anywhere from 70-100%. If you are looking at a \$1300 720p set, the same TV at 1080p will be around \$2300 or so.

21. ### hazybluedot macrumors newbie

Joined:
Feb 13, 2008
#21
I kid you not

Perhaps for some TVs there is a large price difference, but not for all. Take this selection:

Westinghouse SK-42H240S 42" LCD HDTV (720p) \$1,049.90
Westinghouse TX-42F430S 42" 1080p LCD HDTV (1080p) \$987.88
Westinghouse VM-42F140S 42" LCD (1080p) \$1300

All prices just taken off of Amazon.
I'm not quite sure why the second one in the list is cheaper than the first, but Amazon says so. Anyway, the point is that, at least for this set of screens, the price/resolution gap is relatively small. I only searched Westinghouse because I happen to own one; I remember having to make the same 720p vs. 1080p decision, and I recalled that the price difference wasn't terribly large. Obviously this is somewhat of an apples-to-oranges comparison, as you also have to take into account the features and quality of the screen. A cheap 1080p LCD might look worse than a high-quality 720p set in terms of color, brightness, etc.

22. ### fivepoint macrumors 65816

Joined:
Sep 28, 2007
Location:
IOWA
#22
Well, you are certainly right that the price gaps differ between models and manufacturers... but the models you quoted have many other differences besides resolution. You need to compare models of the same FAMILY against each other.

Screen resolution is only one of many comparison factors between televisions. It isn't even especially high on the list, in my opinion. Something as simple as contrast ratio has a larger effect on image quality most of the time.

23. ### MACsimus19 macrumors member

Joined:
Jun 18, 2007
#23
Well, I've got 20/10 vision... so there! I've also got a 1080p plasma, and I only sit 8 feet away from the TV.

I think that my ability to resolve that detail makes it a NECESSITY!!! I don't care if my friends and loved ones can't see SDE. If I know it's there, 1080p is the only way to go.

For those of you that use your TV as a secondary monitor, the extra resolution on a 1080p set is fantastic.

24. ### err404 macrumors 68020

Joined:
Mar 4, 2007
#24
On a 1080p screen, 1080i content at <30fps is displayed progressively, not interlaced.

My point was just that the math shows that 1080 is better on a 42" display at 7'. You don't need to get the 'full benefit' of 1080 for it to be better; your conditions only need to exceed the 'full benefit' of 720.

If he is going to use math to prove a point, his numbers need to bear out that result.

For the record, I think Apple is doing the correct thing by not supporting 1080. It's an internet distribution device, and for most people their connection isn't able to transmit 1080-sized files in real time.

25. ### HiRez macrumors 603

Joined:
Jan 6, 2004
Location:
Western US
#25
It's converted to progressive scan for display, but it still bears all the disadvantages of the interlaced source material; you can't ever restore that. You can blow up a 320x240 GIF to 1920x1080 too, but that doesn't make it a high-definition image, because you're still stuck with poor source material. 1080i is still inferior to 720p in some respects. OK, so I just hate interlaced video; can you tell?