PDA

View Full Version : 1080p or 720p? A Quantitative comparison and final answer




anubis
Feb 23, 2008, 12:10 AM
Hey everyone.

I know that this particular topic doesn't have anything to do with the Apple TV per se, but I thought I would share with you my knowledge to help you decide if you should get a 1080p or a 720p TV.

I am getting a Master's Degree in Optical Science from the University of Arizona. I was recently shopping for a new TV and I was faced with a choice that most of you have faced or will face when shopping for a new HDTV: should I save some dough and get a 720p TV, or is the additional cost worth it for a 1080p TV? I am going to present a purely quantitative and scientific way for determining if a 1080p TV is worth it. In fact, below I will list an equation that will tell you once and for all which one you should purchase.

In order to understand the equation, you must understand the following concepts. The 20/20 human eye has a maximum visual acuity of 1 arcminute. That is, two points must subtend an angle greater than 1 arcminute in order for a 20/20 eye to resolve the two points. What this means for TVs is that pixels on a TV must have an angular subtense of 1 arcminute or greater in order for your eye to resolve the detail in the video.

In order to determine the angular subtense of one full pixel (red+green+blue) of a 16:9 television, just use the following two expressions:

To determine the angular subtense in arcminutes for a 720p TV, use 60*arctan(a*0.00005319553/b), where a is the diagonal size of the television in inches and b is the distance you will sit away from the TV in feet. (One caveat: this expression actually assumes a 768-line panel, since this is the actual native resolution of most "720p" TVs.) Also note that the arctangent in these expressions is evaluated in degrees, not radians.

Example: Let's say that I wanted to find the full pixel angular subtense of a "720p" 42-inch LCD and I plan on sitting 7 feet away from the TV. The angle subtended by each full pixel is 1.1 arcminute. This represents a condition that borders on the very edge of the maximum detail that the 20/20 human eye is capable of resolving.

To determine the angular subtense in arcminutes for a 1080p TV, use 60*arctan(a*0.000037827932/b).

Example: Same situation: 42-inch TV, plan on sitting 7 feet away, but this time the TV is 1080p. We find that the full pixel angular subtense is 0.7 arcminutes. This represents a situation where the TV's video resolution is beyond the absolute resolution limits of human vision.

In the example problem provided, spending additional money on a 1080p television is like throwing money down the drain. Your eye is not capable of resolving the additional video resolution of the 1080p TV in this situation.
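If you want to plug in your own numbers, here is a minimal Python sketch of the same calculation (it assumes a 16:9 panel, uses 768 rows for the "720p" case per the caveat above, and derives the constants instead of hard-coding them):

```python
import math

def pixel_subtense_arcmin(diagonal_in, distance_ft, rows):
    """Angle subtended by one full pixel, in arcminutes."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)    # panel height for 16:9
    pixel_in = height_in / rows                        # height of one pixel
    distance_in = distance_ft * 12                     # feet -> inches
    return math.degrees(math.atan(pixel_in / distance_in)) * 60

# The examples above: a 42" set viewed from 7 feet.
print(round(pixel_subtense_arcmin(42, 7, 768), 2))     # ~1.1 arcmin ("720p"/768p)
print(round(pixel_subtense_arcmin(42, 7, 1080), 2))    # ~0.78 arcmin (1080p)
```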

Point of interest: although 20/20 vision is often considered "perfect", it is theoretically possible to have 20/6 vision. In order to get this vision, your eye anatomy must be perfect, and your pupil size must be about 4mm. Any larger, and the visual field is dominated by aberrations. Any smaller, and the visual field is diffraction limited, with an ever-increasing Airy disk diameter. In this perfect situation with perfect field irradiance, the maximum resolution of this eye would be 0.4 arcminutes. Note that very few people in the world have demonstrated uncorrected 20/6 vision.
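For the curious, a rough back-of-envelope version of that diffraction limit at a 4 mm pupil (the 550 nm wavelength and the choice of criterion are assumptions; the 0.4 arcminute figure above presumably comes from a slightly different set of choices):

```python
import math

PUPIL_M = 4e-3          # pupil diameter
WAVELENGTH_M = 550e-9   # mid-visible (green) light

rayleigh_rad = 1.22 * WAVELENGTH_M / PUPIL_M   # two-point Rayleigh criterion
cutoff_rad = WAVELENGTH_M / PUPIL_M            # finest period at the incoherent MTF cutoff

to_arcmin = lambda rad: math.degrees(rad) * 60
print(f"Rayleigh limit: {to_arcmin(rayleigh_rad):.2f} arcmin")   # ~0.58
print(f"Cutoff period:  {to_arcmin(cutoff_rad):.2f} arcmin")     # ~0.47
```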



richpjr
Feb 23, 2008, 12:36 AM
http://www.avsforum.com/avs-vb/showthread.php?t=768167

anubis
Feb 23, 2008, 01:04 AM
http://www.avsforum.com/avs-vb/showthread.php?t=768167

Rich... thanks for that post...

Thought people would like to know why this is true and understand the science and the math behind it.

GreatDrok
Feb 23, 2008, 01:11 AM
I have a 120" 720p front projection DLP system and sit 15 feet away from it. I cannot see any pixel structure whatsoever. I have been convinced for some time that 1080p is too much for any TV you might actually want to buy. I know people talk about a 50" set being the upper limit for 720p but in my experience even 60" from 7 feet is still going to be too small to justify 1080p.

Don't get me started on 1080i versus 1080p either :-)

rodolfo
Feb 23, 2008, 03:51 AM
Hey everyone.

I know that this particular topic doesn't have anything to do with the Apple TV per se, but I thought I would share with you my knowledge to help you decide if you should get a 1080p or a 720p TV.

Great Post and Topic:

I don't understand why people want to believe and expect that 720p24 (4:2:0) is HD, when in reality it contains less than 40% of the information of a real HD stream: 1920x1080p24 is HD, 1920x1080i59.94 is HD, 1280x720p60 is HD. 853x480p30 is not HD, 960x540p24 is not HD, 1280x720p24 is not HD, 1280x720p30 is not HD.
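A rough back-of-envelope check of that "<40%" figure (raw luma sample throughput only, ignoring chroma subsampling and compression; the script and its labels are mine, just for illustration):

```python
# Pixels per second for each format, using nominal broadcast frame rates.
formats = {
    "1080p24":  1920 * 1080 * 24,
    "1080i59.94 (29.97 full frames)": 1920 * 1080 * 29.97,
    "720p60":   1280 * 720 * 60,
    "720p24":   1280 * 720 * 24,
}
ref = formats["1080i59.94 (29.97 full frames)"]
for name, rate in formats.items():
    print(f"{name}: {rate/1e6:.1f} Mpixel/s ({rate/ref:.0%} of 1080i)")
```

720p24 carries roughly a third of the raw pixel throughput of 1080i, before you even account for 4:2:0 chroma.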

Also "I don't understand" why Apple is obscuring the fact that 720p means 720p60 and not 720p24. It is not only about spatial resolution but also about temporal resolution and spectral resolution including chroma information. 720p was created for fast motion content, such as sports (Believe me, you don't want to see sports at 24fps). The funny part is that even 720p60 is only "deployed" on the USA by very few broadcasters, most of the world HD=1080. Another fact of 720p is that most of the TVs "up convert" 720p60 to 1080 and no to 1080p60.

Probably the problem is that an Apple TV machine will not be able to play 1920x1080 H.264 High Profile 4:2:2 @ 15~35Mbps as that little intel solo core box has not a H.264 hardware...

Regarding display sizes: For 720p the maximum display size I recommend is 55 inches. for 1080 the maximum size I have calculated is 103 inches. BUT full HD spectral resolution must be there to have the right experience. (Those sizes are calculated at 21 dpi for 2~3 mts viewing)

rodolfo
Feb 23, 2008, 03:57 AM
I have a 120" 720p front projection DLP system and sit 15 feet away from it. I cannot see any pixel structure whatsoever.

Render a 1-pixel B&W grid and you will see it... ;) (otherwise you may want to consider changing the lens of your projector)

Much Ado
Feb 23, 2008, 04:16 AM
The 20/20 human eye has a maximum visual acuity of 1 arcminute. That is, two points must subtend an angle greater than 1 arcminute in order for a 20/20 eye to resolve the two points. What this means for TVs is that pixels on a TV must have an angular subtense of 1 arcminute or greater in order for your eye to resolve the detail in the video.

Is this to say that a 1080p television will display an image with the same effective resolution as a real object placed next to it (from a distance)? I find this hard to comprehend.

HiRez
Feb 23, 2008, 04:56 AM
Is this to say that a 1080p television will display an image with the same effective resolution as a real object placed next to it (from a distance)? I find this hard to comprehend.

I can believe it. A good television with great source material from a decent distance produces very very sharp images, but of course the process of an emissive television cannot come even close to competing with the contrast ratio you will get from a real object in front of you, and there is no depth perception in play.

petvas
Feb 23, 2008, 06:24 AM
Great Post and Topic:

I don't understand why people want to believe and expect that 720p24 (4:2:0) is HD, when in reality it contains less than 40% of the information of a real HD stream: 1920x1080p24 is HD, 1920x1080i59.94 is HD, 1280x720p60 is HD. 853x480p30 is not HD, 960x540p24 is not HD, 1280x720p24 is not HD, 1280x720p30 is not HD.

Also, "I don't understand" why Apple is obscuring the fact that 720p means 720p60 and not 720p24. It is not only about spatial resolution but also about temporal resolution and spectral resolution, including chroma information. 720p was created for fast-motion content, such as sports (believe me, you don't want to see sports at 24fps). The funny part is that even 720p60 is only "deployed" in the USA, by very few broadcasters; in most of the world, HD = 1080. Another fact about 720p is that most TVs "up-convert" 720p60 to 1080 and not to 1080p60.

Probably the problem is that the Apple TV will not be able to play 1920x1080 H.264 High Profile 4:2:2 @ 15~35 Mbps, as that little Intel single-core box has no H.264 decoding hardware...

Regarding display sizes: for 720p the maximum display size I recommend is 55 inches; for 1080 the maximum size I have calculated is 103 inches. BUT full HD spectral resolution must be there to have the right experience. (Those sizes are calculated at 21 dpi for 2~3 metres viewing.)

720p24fps is HD....
http://en.wikipedia.org/wiki/720p

Mackilroy
Feb 23, 2008, 11:13 AM
720p isn't HD to people who believe Sony never lies and that the only way to get full use out of your TV is to have 1080p. ;)

'Full HD' is just a marketing term, not reality.

Zwhaler
Feb 23, 2008, 01:47 PM
I read your post, and I thought that was interesting about the different eye quality... I myself have 20/15 or 15/20 (whichever is better than 20/20), and I didn't think that I could distinguish quality that much better, but I guess that I can at least a little bit, based on the conclusion I came up with.

I have a 40" Samsung 1080p LCD TV, and I can't say that I entirely need all of the pixels, but I gotta admit that the quality really stands out when it comes to games, and Blu-Ray Disks. My friend had a 42" 720p LCD, and after comparing them we concluded that the 1080p makes a big difference for gaming and BRD. On games like Call of Duty 4, everything looks more realistic at 1080p. This doesn't mean that 720p is bad, but as everything slowly begins to support 1080p, I think it is worth it to spend the few hundred dollars extra. But keep in mind that this is ~8 ft viewing range. (My TV only cost $1600 new off Amazon)

rodolfo
Feb 24, 2008, 04:25 AM
720p24fps is HD....
http://en.wikipedia.org/wiki/720p

ATSC allows 720p24 (and 16:9 480p) for interoperability reasons; that doesn't mean that 720p24 is HD. The goal of HD was to obtain at least 4 times better quality than SD video.

motulist
Feb 24, 2008, 04:36 AM
Just because a difference is clearly discernible does not mean that the difference is significant.

For instance, if I'm doing great in a college class and I only need to get a 50% on my final exam in order to receive an A+ grade in the class, then even though I can clearly discern whether I got a 50 or a 100 on my final exam, the difference is not significant, because both achieve the same goal equally. It's the same with some people's opinion on HD.

To me, and many others, we can clearly see that HD is sharper than SD, but seeing that extra detail doesn't make us more entertained.

elppa
Feb 24, 2008, 05:23 AM
And of course all these arguments are mute if you are watching bad content or poorly filmed pictures. :D

Maui
Feb 24, 2008, 06:21 AM
And of course all these arguments are mute if you are watching bad content or poorly filmed pictures. :D

Well then un-mute your tv!

err404
Feb 24, 2008, 07:39 PM
While this is a partially true argument for recorded video content, it does not hold up for real-time content like text or gaming. The problem is that the margin of error in rendered material leads to artifacts like aliased, jagged edges on diagonal lines. A 1080p real-time source is significantly smoother than a 720p source. Basically, the extra resolution allows your eyes to carry some of the load by averaging out the errors.

But even for video, your own calculations show that 720p is lower quality than what your eye can see, while 1080p is better than you can see. The math of your example reinforces that 1080p is visibly better than 720p on a 42" screen at 7'. (720p < perfect < 1080p)

As for 60fps vs 24fps: at the end of the day, recorded video content is nearly exclusively at <30 fps.

HiRez
Feb 25, 2008, 03:44 AM
As for 60fps vs 24fps: at the end of the day, recorded video content is nearly exclusively at <30 fps.

And feature films are nearly exclusively still recorded at 24 fps. With good source material and encoding, on a quality, calibrated screen, 720p is going to look darn good. And the benefits of progressive scanning will partially offset the resolution advantage over 1080i, at least. That, and the fact that you're not dealing with analog transmission degradation, make it far superior to NTSC analog SD, and for 24 fps sources you eliminate 3:2 pulldown artifacts if your TV supports it natively. I don't think Apple is trying to hide or get away with anything, as some would suggest.

fivepoint
Feb 25, 2008, 08:32 AM
http://davidlenihan.com/WindowsLiveWriter/HDDVDvs.DVD.Fight_13EA9/resolution_chartfull.png


To the thread starter: remember that distance from the screen matters just as much as screen size and resolution. This chart convinced me to save $1000 and get the 720p Panasonic TH-50PX75U instead of the 1080p Panasonic TH-50PZ700U. With a 50-inch screen, sitting 10 feet away, it is apparently 'impossible' for me to see any difference at all.
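If you'd rather compute the threshold than read it off a chart, here's a hypothetical little script built on the same 1-arcminute acuity assumption used at the top of the thread (the nominal 720/1080 row counts and the acuity figure are assumptions, not specs of any particular set):

```python
import math

# Farthest distance at which one pixel still subtends the assumed acuity
# limit; beyond this distance the extra resolution is (by this model) invisible.
def max_useful_distance_ft(diagonal_in, rows, acuity_arcmin=1.0):
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # 16:9 panel height
    pixel_in = height_in / rows                       # height of one pixel
    limit_rad = math.radians(acuity_arcmin / 60)      # acuity limit in radians
    return (pixel_in / math.tan(limit_rad)) / 12      # inches -> feet

print(round(max_useful_distance_ft(50, 720), 1))      # ~9.8 ft for a 50" 720p set
print(round(max_useful_distance_ft(50, 1080), 1))     # ~6.5 ft for a 50" 1080p set
```

By this model, at 10 feet a 50-inch screen is already slightly past the point where even 720p pixel detail is resolvable, which is exactly what the chart shows.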

Spanky Deluxe
Feb 25, 2008, 09:50 AM
I've got a 1080p 46" screen and my normal viewing distance is 10 feet. Its true, I can't really tell any difference between 720p and 1080p content at this distance. Sometimes when I want a real HD experience, I'll sit on the floor about 5 feet away from the TV and here I can definitely tell the difference. Considering 1080p TVs don't seem to cost much more than 720p ones these days, I'd always go for 1080p. Sure you won't tell the difference in most viewing conditions but if you *do* ever sit nearer, you will notice.

fivepoint
Feb 25, 2008, 10:13 AM
I've got a 1080p 46" screen and my normal viewing distance is 10 feet. Its true, I can't really tell any difference between 720p and 1080p content at this distance. Sometimes when I want a real HD experience, I'll sit on the floor about 5 feet away from the TV and here I can definitely tell the difference. Considering 1080p TVs don't seem to cost much more than 720p ones these days, I'd always go for 1080p. Sure you won't tell the difference in most viewing conditions but if you *do* ever sit nearer, you will notice.

You must be kidding about the price difference. From what I've found, for the EXACT same TVs with the only difference being the resolution (1080 vs 720), the price difference is anywhere from 70-100%. If you are looking at a $1300 720, the same TV with 1080 resolution will be around $2300 or so.

hazybluedot
Feb 25, 2008, 02:03 PM
perhaps for some TVs there is a large price difference but not for all, take this selection:

Westinghouse SK-42H240S 42" LCD HDTV (720p) $1,049.90
Westinghouse TX-42F430S 42" 1080p LCD HDTV (1080p) $987.88
Westinghouse VM-42F140S 42" LCD (1080p) $1300

All prices just taken off of Amazon.
I'm not quite sure why the second one in the list is cheaper than the first, but Amazon says so. Anyway, the point is that at least for this set of screens the price/resolution gap is relatively small. I only did my search for Westinghouse because I happen to own one and I remember having to make the same 720p vs. 1080p decision and recalled that the price difference wasn't terribly large. Obviously this is somewhat of an apples to oranges comparison as you also have to take into account features and quality of the screen. A cheap 1080p LCD might look worse than a high quality 720p in terms of color, brightness, etc.

fivepoint
Feb 25, 2008, 02:06 PM
perhaps for some TVs there is a large price difference but not for all, take this selection:

Westinghouse SK-42H240S 42" LCD HDTV (720p) $1,049.90
Westinghouse TX-42F430S 42" 1080p LCD HDTV (1080p) $987.88
Westinghouse VM-42F140S 42" LCD (1080p) $1300

All prices just taken off of Amazon.
I'm not quite sure why the second one in the list is cheaper than the first, but Amazon says so. Anyway, the point is that at least for this set of screens the price/resolution gap is relatively small. I only did my search for Westinghouse because I happen to own one and I remember having to make the same 720p vs. 1080p decision and recalled that the price difference wasn't terribly large. Obviously this is somewhat of an apples to oranges comparison as you also have to take into account features and quality of the screen. A cheap 1080p LCD might look worse than a high quality 720p in terms of color, brightness, etc.

Well, certainly you are right that the price gaps are different between models/manufacturers... but the models you quoted have many other differences than just resolution. You need to compare models of the same FAMILY against each other.

Screen resolution is only one of many comparison factors between televisions. It isn't even extremely high on the list in my opinion. Something as simple as contrast ratio has a larger effect on image quality most of the time.

MACsimus19
Feb 25, 2008, 05:16 PM
Well, I've got 20/10 vision... so there! I've also got a 1080p plasma and I only sit 8 feet away from the TV.

I think that my ability to resolve that detail makes it a NECESSITY!!! I don't care if my friends and loved ones can't see SDE. If I know it's there, 1080p is the only way to go.

For those of you that use your TV as a secondary monitor, the extra resolution on a 1080p set is fantastic.

err404
Feb 25, 2008, 06:49 PM
And feature films are nearly exclusively still recorded at 24 fps. With good source material and encoding, on a quality, calibrated screen, 720p is going to look darn good. And the benefits of progressive scanning will partially offset the resolution advantage over 1080i, at least. That, and the fact that you're not dealing with analog transmission degradation, make it far superior to NTSC analog SD, and for 24 fps sources you eliminate 3:2 pulldown artifacts if your TV supports it natively. I don't think Apple is trying to hide or get away with anything, as some would suggest.

On a 1080p screen, 1080i content <30fps is displayed progressive, not interlaced. ;)

My point was just that the math shows that 1080 is better on a 42" display at 7'. You don't need to get the 'full benefit' from 1080 for it to be better. Your conditions only need to exceed the 'full benefit' of 720.

If he is going to use math to prove a point, his numbers need to bear out that result.

For the record, I think Apple is doing the correct thing by not supporting 1080. It's an internet distribution device, and for most people their connection isn't able to transmit 1080-size files in real time.

HiRez
Feb 25, 2008, 07:08 PM
On a 1080p screen, 1080i content <30fps is displayed progressive, not interlaced. ;)

It's converted to progressive scan for display, but it still bears all the disadvantages of the interlaced source material, you cannot ever restore that. You can blow up a 320x240 GIF image to 1920x1080 too, but that doesn't make it a high-definition image because you're still stuck with poor source material. 1080i is still inferior to 720p in some respects. OK, so I just hate interlaced video, can you tell? :p

err404
Feb 25, 2008, 08:58 PM
It's converted to progressive scan for display, but it still bears all the disadvantages of the interlaced source material, you cannot ever restore that. You can blow up a 320x240 GIF image to 1920x1080 too, but that doesn't make it a high-definition image because you're still stuck with poor source material. 1080i is still inferior to 720p in some respects. OK, so I just hate interlaced video, can you tell? :p

Sorry, I really do understand the confusion and the baggage inherent to the term "interlaced", but this is from the 480i days, when the interlaced frames were received at 24fps with each frame being only half of the image. This resulted in combing and other artifacts that are nearly impossible to remove.

1080i video is displayed as full 1920x1080 frames at 30fps on a 1080p display. (True 1080p can run at 60fps, but it usually doesn't, since most sources are recorded at 24 or 30fps.)
Here's how it works...

A single 1080i frame consists of two fields of 1920x540. Both fields are created from a single progressive frame's odd and even lines.
Both fields are received and reconstituted into a single 1920x1080 progressive frame. This is a lossless process that is part of the 1080i spec. Since the two fields come from the same temporal frame, there is no combing or other artifacts typically inherent in 480i. The resulting frame is pixel-for-pixel identical to the original 1080p frame*.

Since the 1080i spec runs at 60 (half-)fields per second, it can deliver up to 30 full progressive frames per second.
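If it helps, here's a tiny NumPy sketch of that weave reconstruction (the array names and the random test frame are just for illustration, not any particular decoder's code):

```python
import numpy as np

def split_into_fields(frame):
    """Split a progressive frame into its odd/even-line fields (1080i style)."""
    return frame[0::2, :], frame[1::2, :]          # top field, bottom field

def weave(top_field, bottom_field):
    """Recombine two same-moment fields into one progressive frame."""
    frame = np.empty((top_field.shape[0] * 2, top_field.shape[1]), top_field.dtype)
    frame[0::2, :] = top_field
    frame[1::2, :] = bottom_field
    return frame

original = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)  # fake luma frame
top, bottom = split_into_fields(original)          # two 1920x540 fields
rebuilt = weave(top, bottom)
print(np.array_equal(original, rebuilt))           # True: lossless when both fields
                                                   # come from the same moment in time
```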

Keep in mind that this is only true for 1080p displays. A 1080i display will show only half of the frame at a time every 1/60 of a second, leading to a shaky image that I find much worse than 720p.

Another interesting point is that while the temporal pixel density for 720p and 1080i is nearly the same, this assumes 720p at 60fps. Since 720p content is typically at <30fps, whereas 1080i is at <60 (each single frame is sent in two halves), the typical temporal pixel density of 1080i is twice that of 720p.

Did you know that Sony's first 1080p displays only had 1080i inputs? That's because for movies, it doesn't matter.

I hope this was clear, because I know it's not obvious.

* I'm not 100% sure, but 1080i may be limited to a chroma space of 4:2:2, whereas a 1080p signal can be 4:4:4 (but usually isn't).

** For gaming I would use 720p over 1080i, since frame rates are likely over 30fps ;)

rodolfo
Feb 26, 2008, 04:48 AM
It's converted to progressive scan for display, but it still bears all the disadvantages of the interlaced source material, you cannot ever restore that. You can blow up a 320x240 GIF image to 1920x1080 too, but that doesn't make it a high-definition image because you're still stuck with poor source material. 1080i is still inferior to 720p in some respects. OK, so I just hate interlaced video, can you tell? :p

Right: The ITU-R BT.709-5 defines: “A high-definition system is a system designed to allow viewing at about three times the picture height, such that the system is virtually, or nearly, transparent to the quality of portrayal that would have been perceived in the original scene or performance by a discerning viewer with normal visual acuity.”

rodolfo
Feb 26, 2008, 04:53 AM
720p24fps is HD....
http://en.wikipedia.org/wiki/720p

The information on Wikipedia is kind of misleading... so I did a little research.

The ITU-R BT.709-5 (latest version, 2002) defines: "A high-definition system is a system designed to allow viewing at about three times the picture height, such that the system is virtually, or nearly, transparent to the quality of portrayal that would have been perceived in the original scene or performance by a discerning viewer with normal visual acuity." (Vague; yet another very subjective definition.)

The ATSC standard defines: "High-definition television has a resolution of approximately twice that of conventional television in both the horizontal (H) and vertical (V) dimensions and a picture aspect ratio (H × V) of 16:9. ITU-R Recommendation 1125 further defines “HDTV quality” as the delivery of a television picture which is subjectively identical with the interlaced HDTV studio standard." (another subjective definition)

Ironically, and according to the ATSC definition, 1280x720 should not even be considered HD! I.e., conventional TV = SDTV (as defined by ATSC) = BT.601 = 960 (16:9) / 720 (4:3) x 480 4:2:2 @ >50 fields; therefore HD = 1440~1920 x 960 4:2:2 @ >50 fields.

Take a look at this old controversy: http://findarticles.com/p/articles/m...39/ai_54827393

Current status of HD: the USA is the only country to "deploy" 720p HD (in a few channels, to be fair, like CBS, ABC & ESPN); Apple and Adobe Flash (On2) are the only ones to define HD = 720p24 4:2:0. ATSC (the Advanced TV spec, covering SDTV, EDTV and HDTV; a spec that may be valid to reference if the Apple TV has at least a digital ATSC tuner) uses SMPTE 296M-2001 as a normative reference, and SMPTE 296M-2001 by itself uses BT.709 as a normative reference. BT.709-5 = 1920x1080 4:2:2 / 4:4:4. Also in Europe the DVB standard (ETSI TR 102 154 V1.1.1 (2001-04)) = 1920x1080 4:2:2, and in Japan ARIB B24 = 1920x1080 4:2:2.

In addition, consider that HDMI (High-Definition Multimedia Interface) doesn't have such a thing as 720p24, only 720p50/59.94/60; EIA 770.3 also doesn't support 720p24 (i.e. there is no way to connect a 720p24 analog source to an HDTV with component inputs), only 720p59.94/60. (Yes, you can argue that people have the freedom to encode a film into 720p60 -- a format developed to portray fast action well -- but that doesn't mean that 720p24 is HDTV, because 1920x1080p24 was designed for that specific scenario.) Yes, ATSC has a table that defines 28 different MPEG-2 compression "constraints", "allowing 720p24 MPEG-2 encoding", but this is more properly considered enhanced TV rather than HD (like 480p60).

(http://findarticles.com/p/articles/m...39/ai_54827393 )

If you find something else on this technical HD definition puzzle, I will be more than happy to learn about it... So far, what I have found from interpreting all those documents is that 720p24 is not HD, and Apple HD movies don't look like HD on my Sony 46" (1920x1080p) or my JVC HD Pro monitor.

err404
Feb 26, 2008, 09:19 AM
On the 720p issue, some of the confusion seems to be over displays vs. content. An HDTV needs to be able to display 720 horizontal lines at 60fps.

However, content is only filmed at 24fps. This is still HD, unless you want to make an argument that HD movies don't even exist (in that case the definition is wrong, or at the very least meaningless).

I think a more pragmatic approach would be to treat HD as any content that exceeds 480p. Resolution is not the end-all of quality anyway. A highly compressed film at 1080p x 60fps will still look awful.
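To put a rough number on the compression point, here's a small bits-per-pixel comparison (the bitrates are made-up round figures for illustration, not measurements of any real service):

```python
# Bits available per pixel: bitrate divided by pixels per second.
def bits_per_pixel(bitrate_mbps, width, height, fps):
    return bitrate_mbps * 1e6 / (width * height * fps)

scenarios = [
    ("720p24 at 4 Mbps",   4, 1280, 720, 24),
    ("1080p24 at 4 Mbps",  4, 1920, 1080, 24),
    ("1080p60 at 4 Mbps",  4, 1920, 1080, 60),
    ("1080p24 at 25 Mbps", 25, 1920, 1080, 24),
]
for label, mbps, w, h, fps in scenarios:
    print(f"{label}: {bits_per_pixel(mbps, w, h, fps):.2f} bits/pixel")
```

Pushing the same bit budget across more pixels and more frames just starves the encoder.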

ChrisA
Feb 26, 2008, 01:12 PM
One problem with most people's analysis of this problem is that they focus on pixels. This is wrong. What we should be looking at is the spatial frequencies that the screen can reproduce. We should be talking about "cycles per millimeter", or maybe, referenced to viewer distance, "cycles per degree". The Nyquist sampling theorem tells us that we need two pixels (well, a slight bit more than two really) to make a cycle. Leaving this out throws most analyses I've read off by a factor of two. You have to treat digitally sampled video the same way we treat digital audio -- a music CD uses 44,100 samples per second but can only reproduce music up to about 20 kHz. Digital video is the same, but of course the samples are in two dimensions (or three if you consider motion). A digital screen with 1080 pixels can produce no more than 540 cycles.

The bottom line is that if we take as a given that the human eye can only resolve two lines if they are an arcminute apart, then we would need a screen with pixels spaced at 1/2 arcminute or better. But a screen with 1/4 arcminute pixels could produce arcminute-spaced lines with better contrast.

Next, we should not set a hard limit on what the eye can resolve. This depends a lot on the contrast. With high-contrast images the eye may be able to see better than 60 cycles per degree. I'd guess much better.

Another issue is the assumed center-to-center distance of the pixels. Most simple analyses look at the horizontal or vertical pixel pitch. Well, these are special cases. The detail in the picture may not be aligned with the pixels, and in general it is not. If the bars in a test image are rotated 45 degrees, then the pixels are in effect spaced farther apart.

There is another thing that is left out. Motion. The human eye is very sensitive to motion.


So my recommendations...

Use the "worst case" pixel spacing. That is the space between diagonally adjacent pixels, not the best case, which is the horizontal pixel pitch. We are not watching test screens with vertical bars; the details in a movie can appear in any random orientation. The diagonal distance is 1.414 times the nominal pixel pitch. If you think the worst case is too conservative, then compromise and use the average, (1 + 1.414)/2, or about 1.2.

I don't know how to quantify the other effects: (1) that the LCD screen might have higher contrast than the paper eye charts used to characterize human vision, and (2) the eye's ability to detect motion at very small scales.

Bottom line: while the equation given may be correct, the assumptions made are not, and the result is that the answer is "off" by at least a factor of two and maybe even close to four.
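Here's a rough script that follows these recommendations (my assumptions: a 16:9 panel, the 1.2 compromise factor from above, and an eye limit of about 30 cycles per degree, which is what the thread's 1-arcminute figure works out to at two pixels per cycle; as I said, high-contrast targets may do better):

```python
import math

def screen_cycles_per_degree(diagonal_in, distance_ft, rows, diag_factor=1.2):
    height_in = diagonal_in * 9 / math.hypot(16, 9)      # 16:9 panel height
    pitch_in = (height_in / rows) * diag_factor          # de-rated pixel pitch
    pitch_deg = math.degrees(math.atan(pitch_in / (distance_ft * 12)))
    return (1 / pitch_deg) / 2                           # Nyquist: 2 pixels per cycle

EYE_LIMIT_CPD = 30  # assumed ~20/20 limit; high-contrast targets may exceed it

for rows, label in [(768, '"720p" (768 rows)'), (1080, "1080p")]:
    cpd = screen_cycles_per_degree(42, 7, rows)
    print(f"{label}: {cpd:.0f} cycles/degree (eye limit ~{EYE_LIMIT_CPD})")
```

With these assumptions the 42-inch, 7-foot example flips: the 768-line set delivers only about 23 cycles per degree, below the assumed limit, while the 1080p set lands right around 32.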

timmell
Feb 26, 2008, 02:00 PM
I can't wait to see you guys post in the future about Ultra HD or whatever they call it when 4K or 2K is the standard broadcast signal. What's better, a 2160p or 1556p UHDTV? What should I get? Help, someone?? Will I be able to see the difference? I'm so stressed about this.


You can see the difference between 720p and 1080p, it is all a matter of cost.

Mike

www.persistentproductions.net (http://www.persistentproductions.net)

tronic72
Feb 26, 2008, 04:00 PM
Most people are missing the point. This talk is all numbers and doesn't really translate into even small visual differences.

My neighbour has a new Sony 52" 1080p LCD and we are both hard pressed to see the difference between it and my Sony 720p HD CRT!

It's also important to know that content ripped for the ATV will be compressed, so the difference will be even less visually significant.

I would assume this is the reason that Apple won't bother with 1080 HD content, which would greatly increase the download time without any large change in visual quality.

CNET.com has an article about the comparison between these three formats, in which they state:

"Conclusions
While this isn't the most scientific test, both Katzmaier and I agreed that, after scanning through Mission: Impossible III for an hour, it would be very difficult--practically impossible--for the average consumer to tell the difference between a high-definition image displayed on a 1080p-capable TV and one with lower native resolution at the screen sizes mentioned above. At larger screen sizes, the differences might become somewhat more apparent, especially if you sit close to the screen."

URL is http://reviews.cnet.com/4520-6449_7-6661274-1.html

These guys have done the tests, as have many others. All this is doing is making more money for the electronics manufacturers.

Bottom line: If you can afford it, get 1080p, but if you have 720 or 1080i it really doesn't matter.

Now let's all enjoy our Apple TVs.

MikieMikie
Feb 26, 2008, 04:17 PM
Most people are missing the point. This talk is all numbers and doesn't really translate into even small visual differences.

My neighbour has a new Sony 52" 1080p LCD and we are both hard pressed to see the difference between it and my Sony 720p HD CRT!

It's also important to know that content ripped for the ATV will be compressed, so the difference will be even less visually significant.

snipped the rest


First, what sources were you using for your comparison between the two sets? SD? HD? Were they side-by-side, and were you the same distance from the sets?

Secondly, what has compression got to do with anything? I certainly have enough experience with compression technologies to understand that H.264 is a significantly different approach to compression than what is being used on SD DVDs. Delta compressions are extremely effective in reducing size without necessarily losing content.

To state it simply: DVDs are compressed. Handbrake, among others, reconstructs the images then compresses them with a different codec, notably H.264 for the Apple TV.

What makes you think there's a loss of information there? Following your conclusion to its natural end, why stop there? Since in your opinion compression renders it impossible to see the difference on these two sets, why not just stick with 480i?

err404
Feb 26, 2008, 04:17 PM
I'd just add that you should avoid 1080i displays. Just use 720p; it's a more stable image.

I agree that for movies 1080p is on the verge of pointless, but for gaming there is an obvious difference (that is, when/if native 1080p games start coming out).

(1080i content is fine, but not interlaced displays)

moreAAPLplz
Feb 26, 2008, 04:29 PM
I use my 42" 1080p as my computer monitor and I should note that I have 20/11 vision. Buying the 1080p was worth it to me because 1) it's a better picture for my eyes 2) it gives me more desktop real estate than a 720p and 3) I got my Sharp Aquos 42" 1080p for $1189, $210 less than what they had the Sharp Aquos 42" 720p marked at.

Given these reasons, the extra money for the 720p wouldn't have been worth it.

fivepoint
Feb 26, 2008, 05:00 PM
You can see the difference between 720p and 1080p, it is all a matter of cost.

Mike

www.persistentproductions.net (http://www.persistentproductions.net)

This is, quite simply, false.
Please refer to my graph on the first page. Your eyes can only input so much data. You CAN see a difference (I agree with you), but only at certain distances. Basically, to get the benefits of 1080p and UltraHD you'll have to keep moving closer and closer to the TV, quickly making it 'unwatchable', similar to sitting in an IMAX theatre in the front row.

MikieMikie
Feb 26, 2008, 05:07 PM
You CAN see a difference (I agree with you), but only at certain distances. Basically, to get the benefits of 1080p and UltraHD you'll have to keep moving closer and closer to the TV, quickly making it 'unwatchable', similar to sitting in an IMAX theatre in the front row.

Yeah, but when it's IMAX 3-D, it's worth sitting in the front. ;)

rodolfo
Feb 26, 2008, 06:24 PM
I can't wait to see you guys post in the future about Ultra HD or whatever they call it when 4K or 2K is the standard broadcast signal. What's better, a 2160p or 1556p UHDTV? What should I get? Help, someone?? Will I be able to see the difference? I'm so stressed about this.


You can see the difference between 720p and 1080p, it is all a matter of cost.

Mike

www.persistentproductions.net (http://www.persistentproductions.net)

UltraHD, or better said Ultra High Vision (NHK's Super Hi-Vision), is 7680x4320p/60. I saw a demo in Japan and it was great... (including 22.2 audio channels). They displayed the demo on an 11 m diagonal screen, and I saw it also at half-res on a 65-inch 4K monitor (the intention for SHV is >100-inch screens). There is a noticeable difference... to the point that it looks virtually real at a distance of 1x the height of the display. BTW: the broadcasting standard is expected to start very soon...

err404
Feb 26, 2008, 06:26 PM
This is, quite simply, false.
Please refer to my graph on the first page. Your eyes can only input so much data. You CAN see a difference (I agree with you), but only at certain distances. Basically, to get the benefits of 1080p and UltraHD you'll have to keep moving closer and closer to the TV, quickly making it 'unwatchable', similar to sitting in an IMAX theatre in the front row.

I think that is a bit of an exaggeration. It's becoming more and more common for people to have setups where 1080p is a noticeable improvement. I have a 50" screen that's about 8-10 feet away, and the difference is basically:
720p - that looks really good
1080p - Damn that is SHARP!

One thing that I have noticed is that the filming of a lot of movies is not that sharp. The softness of the frame makes the improvement less noticeable. That said, a lot of movies now are being filmed with better cameras, and the step up to 1080p is more apparent.

wanchaiman
Feb 26, 2008, 07:54 PM
They displayed the demo on an 11 m diagonal screen

I want one!

Is it a plasma or an LCD?? :)

rodolfo
Feb 27, 2008, 12:32 AM
I want one!

Is it a plasma or an LCD?? :)

The 11 m one :( is a projection screen; the image came from two impressive projectors (one reproducing magenta -- red+blue -- and the other reproducing the green channel).

cenetti
Feb 27, 2008, 10:39 PM
I am very happy with my Optoma 720p projector. I am getting an 86" diagonal image from it... and to me a 1080p projector is overkill at this point. I'll probably get one when they come down in price though...

http://img206.imageshack.us/img206/3443/img1194rl9.jpg (http://imageshack.us)
http://img530.imageshack.us/img530/8607/img1193or0.jpg (http://imageshack.us)
http://img256.imageshack.us/img256/7547/img1195os6.jpg (http://imageshack.us)

motulist
Feb 28, 2008, 02:25 PM
I am very happy with my Optoma 720p projector. I am getting an 86" diagonal image from it... and to me a 1080p projector is overkill at this point. I'll probably get one when they come down in price though...

Yes, and your contrast level ranges all the way from medium gray to dull off-white. Forget higher resolution or size; I much prefer a high-contrast picture with actually black blacks and bright whites, even if it's a small SD picture, over a larger HD picture with low contrast. But to each his own.

err404
Feb 28, 2008, 02:49 PM
While I wouldn't want to use a projector as my casual viewing screen, the right setup can look fantastic. Resolution aside, they bring a lot to a movie-watching 'experience' that smaller screens just can't compete with.