
err404

macrumors 68030
Mar 4, 2007
2,525
623
It's converted to progressive scan for display, but it still bears all the disadvantages of the interlaced source material, you cannot ever restore that. You can blow up a 320x240 GIF image to 1920x1080 too, but that doesn't make it a high-definition image because you're still stuck with poor source material. 1080i is still inferior to 720p in some respects. OK, so I just hate interlaced video, can you tell? :p

Sorry, I really do understand the confusion and the baggage inherent to the term "interlaced", but that baggage is from the 480i days, when the interlaced fields were captured at different moments in time, each carrying only half of the image. This resulted in combing and other artifacts that are nearly impossible to remove.

On a 1080p display, 1080i video is shown as full 1920x1080 frames at 30fps. (True 1080p can run at 60fps, but in practice it rarely does, since most sources are recorded at 24 or 30fps.)
Here's how it works...

A single 1080i frame consists of two fields of 1920x540. Both interlaced fields are taken from a single progressive frame: one holds the odd lines, the other the even lines.
Both fields are received and reconstituted into a single 1920x1080 progressive frame. This is a lossless process that is part of the 1080i spec. Since the two fields come from the same temporal frame, there is none of the combing or other artifacts typically inherent in 480i. The resulting frame is pixel-for-pixel identical to the original 1080p frame*.

Since the 1080i spec runs at 60 fields per second, it can carry up to 30 full progressive frames per second.
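
To make the weave concrete, here's a rough sketch of the reconstruction (using NumPy arrays as stand-ins for decoded fields; the function and variable names are just for illustration):

```python
import numpy as np

def weave_fields(top_field, bottom_field):
    """Recombine two 1920x540 fields from the same source frame into one
    1920x1080 progressive frame (a simple 'weave'). When both fields were
    taken from the same instant, this is lossless."""
    height = top_field.shape[0] + bottom_field.shape[0]     # 540 + 540 = 1080
    frame = np.empty((height,) + top_field.shape[1:], dtype=top_field.dtype)
    frame[0::2] = top_field      # lines 1, 3, 5, ... (the "odd" field)
    frame[1::2] = bottom_field   # lines 2, 4, 6, ... (the "even" field)
    return frame

# Quick sanity check with dummy data: two 540-line fields weave back to 1080 lines.
top = np.zeros((540, 1920, 3), dtype=np.uint8)
bottom = np.ones((540, 1920, 3), dtype=np.uint8)
assert weave_fields(top, bottom).shape == (1080, 1920, 3)
```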

Keep in mind that this is only true for 1080p displays. A 1080i display will show only half of the frame at a time every 1/60 of a second, leading to a shaky image that I find much worse than 720p.

Another interesting point: while the temporal pixel density for 720p and 1080i is nearly the same, that assumes 720p at 60fps. Since 720p content typically runs at 30fps or less, while 1080i carries up to 60 fields per second (each frame sent in two halves), the typical temporal pixel density of 1080i is roughly twice that of 720p.
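
For a rough sense of the numbers (a back-of-the-envelope comparison using the frame rates assumed above, not spec figures):

```python
# Pixels delivered per second under the assumptions above.
px_720p60 = 1280 * 720 * 60    # 720p at its maximum 60 fps
px_1080i  = 1920 * 540 * 60    # 1080i: 60 fields of 540 lines per second
px_720p30 = 1280 * 720 * 30    # 720p content as typically delivered

print(px_720p60, px_1080i)     # 55,296,000 vs 62,208,000 -- nearly the same
print(px_1080i / px_720p30)    # ~2.25 -- roughly twice the temporal pixel density
```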

Did you know that Sony's first 1080p displays only had 1080i inputs? It's because, for movies, it doesn't matter.

I hope this was clear, because I know it's not obvious.

* I'm not 100% sure, but 1080i may be limited to 4:2:2 chroma subsampling, whereas a 1080p signal can be 4:4:4 (though it usually isn't).

** For gaming I would use 720p over 1080i, since frame rates are likely over 30fps ;)
 

rodolfo

macrumors newbie
Feb 18, 2008
24
0
It's converted to progressive scan for display, but it still bears all the disadvantages of the interlaced source material, you cannot ever restore that. You can blow up a 320x240 GIF image to 1920x1080 too, but that doesn't make it a high-definition image because you're still stuck with poor source material. 1080i is still inferior to 720p in some respects. OK, so I just hate interlaced video, can you tell? :p

Right: The ITU-R BT.709-5 defines: “A high-definition system is a system designed to allow viewing at about three times the picture height, such that the system is virtually, or nearly, transparent to the quality of portrayal that would have been perceived in the original scene or performance by a discerning viewer with normal visual acuity.”
 

rodolfo

macrumors newbie
Feb 18, 2008
24
0

The information on Wikipedia is kind of misleading... so I did a little research...

The ITU-R BT.709-5 (latest version -2002-) defines: "A high-definition system is a system designed to allow viewing at about three times the picture height, such that the system is virtually, or nearly, transparent to the quality of portrayal that would have been perceived in the original scene or performance by a discerning viewer with normal visual acuity.” (vague, yet another very subjective definition)

The ATSC standard defines: "High-definition television has a resolution of approximately twice that of conventional television in both the horizontal (H) and vertical (V) dimensions and a picture aspect ratio (H × V) of 16:9. ITU-R Recommendation 1125 further defines “HDTV quality” as the delivery of a television picture which is subjectively identical with the interlaced HDTV studio standard." (another subjective definition)

Ironically, according to the ATSC definition, 1280x720 should not even be considered HD! That is: conventional TV = SDTV (as defined by ATSC) = BT.601 = 960 (16:9) / 720 (4:3) x 480 4:2:2 at >50 fields; therefore HD = 1440~1920 x 960 4:2:2 at >50 fields.

Take a look of this old controversy: http://findarticles.com/p/articles/m...39/ai_54827393

Current status of HD: the USA is the only country to "deploy" 720p HD (to be fair, on a few channels like CBS, ABC & ESPN); Apple and Adobe Flash (On2) are the only ones to define HD as 720p24 4:2:0. ATSC (the Advanced TV spec covering SDTV, EDTV and HDTV, a spec that may be valid to reference if the Apple TV has at least a digital ATSC tuner) uses SMPTE 296M-2001 as a normative reference, and SMPTE 296M-2001 itself uses BT.709 as a normative reference. BT.709-5 = 1920x1080 4:2:2 / 4:4:4. Also, in Europe the DVB standard (ETSI TR 102 154 V1.1.1 (2001-04)) = 1920x1080 4:2:2, and in Japan ARIB B24 = 1920x1080 4:2:2.

In addition, consider that HDMI (High-Definition Multimedia Interface) doesn't have such a thing as 720p24, only 720p50/59.94/60; EIA 770.3 also doesn't support 720p24 (i.e. there is no way to connect a 720p24 analog source to an HDTV with component inputs), only 720p59.94/60. (Yes, you can argue that people have the freedom to encode a film into 720p60, a format developed to portray fast action well, but that doesn't mean 720p24 is HDTV, because 1920x1080p24 was designed for that specific scenario.) Yes, the ATSC has a table that defines 28 different MPEG-2 compression "constraints", "allowing 720p24 MPEG-2 encoding", but this is considered more of an enhanced TV format than HD (like 480p60).

(http://findarticles.com/p/articles/m...39/ai_54827393 )

If you find something else on this technical HD-definition puzzle, I will be more than happy to learn about it... So far, what I found from interpreting all those documents is that 720p24 is not HD, and Apple HD movies don't look like HD on my Sony 46" (1920x1080p) or JVC HD Pro monitor.
 

err404

macrumors 68030
Mar 4, 2007
2,525
623
On the 720p issue, some of the confusion seems to be over displays vs. content. An HDTV needs to be able to display 720 horizontal lines at 60fps.

However, content is only filmed at 24fps. This is still HD, unless you want to argue that HD movies don't even exist (in which case the definition is wrong, or at the very least meaningless).

I think a more pragmatic approach would be to define HD as any content that exceeds 480p. Resolution isn't the be-all and end-all of quality anyway; a highly compressed film at 1080p60 will still look awful.
 

ChrisA

macrumors G5
Jan 5, 2006
12,584
1,700
Redondo Beach, California
One problem with most people's analysis of this problem is that they focus on pixels. This is wrong. What we should be looking at is the spatial frequencies the screen can reproduce. We should be talking about "cycles per millimeter", or maybe, referenced to viewer distance, "cycles per degree". The Nyquist sampling theorem tells us that we need two pixels (well, slightly more than two, really) to make a cycle. Leaving this out throws most analyses I've read off by a factor of two. You have to treat digitally sampled video the same way we treat digital audio: a music CD uses 44,100 samples per second but can only reproduce music up to about 20kHz. Digital video is the same, except of course the samples are in two dimensions (or three if you consider motion). A digital screen with 1080 pixels can produce no more than 540 cycles.
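
As a rough illustration of the Nyquist point (a sketch with made-up viewing geometry, not a definitive model):

```python
import math

def max_cycles_per_degree(pixels_vertical, screen_height_m, viewing_distance_m):
    """Highest spatial frequency a panel can reproduce, in cycles per degree of
    visual angle. Per Nyquist, two pixels are needed for one cycle."""
    degrees_subtended = math.degrees(2 * math.atan(screen_height_m / (2 * viewing_distance_m)))
    pixels_per_degree = pixels_vertical / degrees_subtended
    return pixels_per_degree / 2   # two pixels per cycle

# Hypothetical setup: a 1080-line panel about 0.62 m tall (roughly a 50" 16:9 set),
# viewed from 2.5 m away.
print(max_cycles_per_degree(1080, 0.62, 2.5))   # ~38 cycles per degree
```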

The bottom line is that if we take as a given that the human eye can only resolve two lines if they are an arc-minute apart, then we would need a screen with pixels spaced at 1/2 arc-minute or better. But a screen with 1/4 arc-minute pixels could still produce arc-minute-spaced lines, only with better contrast.

Next, we should not set a hard limit on what the eye can resolve. This depends a lot on the contrast. With high-contrast images the eye may be able to see better than 60 cycles per degree. I'd guess much better.

Another issue is the assumed center-to-center distance of the pixels. Most simple analyses look at the horizontal or vertical pixel pitch. Well, these are special cases. The detail in the picture may not be aligned with the pixels, and in general it is not. If the bars in a test image are rotated 45 degrees, then the pixels are, in effect, spaced farther apart.

There is another thing that is left out: motion. The human eye is very sensitive to motion.


So my recommendations...

Use the "worse case" pixel spacing. That is the space between diagonally adjacent pixels not the best case which is the horizontal pixel pitch. We are not watching test screens with vertical bars the details in a movie can appear in any random orientation. The diagonal distance is 1.414 times the nominal pixel pitch distance. If you think the worst case is to conservative then compromise and use the average (1 + 1.4.14)/2 or about 1.2

I don't know how to quantify the other effects: (1) that the LCD screen might have higher contrast than the paper eye charts used to characterize human vision, and (2) the eye's ability to detect motion at very small scales.

Bottom line is that while the equation given mayb e correct the asumptions made are not and the result is that the answer is"off" by at least factor of two and maybe even close to four
 

timmell

macrumors newbie
Nov 16, 2007
9
0
Redford, MI
Oh my lord.

I can't wait to see you guys post in the future about Ultra HD, or whatever they call it when 4K or 2K is the standard broadcast signal. What's better, a 2160p or 1556p UHDTV? What should I get? Help, someone?? Will I be able to see the difference? I'm so stressed about this.


You can see the difference between 720p and 1080p, it is all a matter of cost.

Mike

www.persistentproductions.net
 

tronic72

macrumors regular
Feb 10, 2007
106
0
Missing the point

Most people are missing the point. This talk is all numbers and doesn't really equate to even small visual differences.

My neighbour has a new Sony 52" 1080p LCD, and we are both hard pressed to see the difference between it and my Sony 720p HD CRT!

It's also important to know that content ripped for the ATV will be compressed so the difference will be even less visually significant.

I would assume this is the reason that Apple won't bother with 1080 HD content, which would greatly increase the download time without any large change in visual quality.

CNET.com has an article comparing these three formats, in which they state:

"Conclusions
While this isn't the most scientific test, both Katzmaier and I agreed that, after scanning through Mission: Impossible III for an hour, it would be very difficult--practically impossible--for the average consumer to tell the difference between a high-definition image displayed on a 1080p-capable TV and one with lower native resolution at the screen sizes mentioned above. At larger screen sizes, the differences might become somewhat more apparent, especially if you sit close to the screen."


URL is http://reviews.cnet.com/4520-6449_7-6661274-1.html

These guys have done the tests, as have many others. All this is doing is making more money for the electronics manufacturers.

Bottom line: If you can afford it, get 1080p, but if you have 720 or 1080i it really doesn't matter.

Now let's all enjoy our Apple TVs.
 

MikieMikie

macrumors 6502a
Aug 7, 2007
705
0
Newton, MA
Most people are missing the point. This talk is all numbers and doesn't really equate to even small visual differences.

My neighbour has a new Sony 52" 1080p LCD, and we are both hard pressed to see the difference between it and my Sony 720p HD CRT!

It's also important to know that content ripped for the ATV will be compressed so the difference will be even less visually significant.

snipped the rest

First, what sources were you using for your comparison between the two sets? SD? HD? Were they side-by-side, and were you the same distance from the sets?

Secondly, what does compression have to do with anything? I certainly have enough experience with compression technologies to understand that H.264 is a significantly different approach to compression than what is used on SD DVDs. Delta compression is extremely effective at reducing size without necessarily losing content.

To state it simply: DVDs are compressed. Handbrake, among others, reconstructs the images then compresses them with a different codec, notably H.264 for the Apple TV.

What makes you think there's a loss of information there? Following your conclusion to its natural end, why stop there? Since in your opinion compression renders it impossible to see the difference on these two sets, why not just stick with 480i?
 

err404

macrumors 68030
Mar 4, 2007
2,525
623
I'll just add that you should avoid 1080i displays. Just use 720p; it's a more stable image.

I agree that for movies 1080p is on the verge of pointless, but for gaming there is an obvious difference (that is when/if native 1080p games start coming out)

(1080i content is fine, but not interlaced displays)
 

moreAAPLplz

macrumors newbie
Jan 16, 2008
5
0
I use my 42" 1080p as my computer monitor and I should note that I have 20/11 vision. Buying the 1080p was worth it to me because 1) it's a better picture for my eyes 2) it gives me more desktop real estate than a 720p and 3) I got my Sharp Aquos 42" 1080p for $1189, $210 less than what they had the Sharp Aquos 42" 720p marked at.

Given these reasons, spending the extra money on the 720p wouldn't have been worth it.
 

fivepoint

macrumors 65816
Sep 28, 2007
1,175
5
IOWA
You can see the difference between 720p and 1080p, it is all a matter of cost.

Mike

www.persistentproductions.net

This is, quite simply, false.
Please refer to my graph on the first page. Your eyes can only take in so much data. You CAN see a difference (I agree with you), but only at certain distances. Basically, to get the benefits of 1080p and UltraHD you'll have to keep moving closer and closer to the TV, quickly making it 'unwatchable', similar to sitting in the front row of an IMAX theatre.
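
For a sense of the distances involved, here's a small sketch using the common 1-arc-minute (20/20) acuity rule of thumb; the 50" screen size is just an example and the cutoffs are only approximate:

```python
import math

def max_useful_distance_m(lines, screen_height_m, acuity_arcmin=1.0):
    """Farthest viewing distance at which adjacent pixel rows are still separated
    by at least `acuity_arcmin` of visual angle (1 arc minute is the usual
    20/20 rule of thumb)."""
    pixel_pitch = screen_height_m / lines
    return pixel_pitch / math.tan(math.radians(acuity_arcmin / 60.0))

# Hypothetical 50" 16:9 set, screen height about 0.62 m:
print(max_useful_distance_m(1080, 0.62))   # ~2.0 m: sit farther and 1080p detail is wasted
print(max_useful_distance_m(720, 0.62))    # ~3.0 m: sit farther and even 720p detail is wasted
```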
 

MikieMikie

macrumors 6502a
Aug 7, 2007
705
0
Newton, MA
You CAN see a difference (I agree with you), but only at certain distances. Basically, to get the benefits of 1080p and UltraHD you'll have to keep moving closer and closer to the TV, quickly making it 'unwatchable', similar to sitting in the front row of an IMAX theatre.

Yeah, but when it's IMAX 3-D, it's worth sitting in the front. ;)
 

rodolfo

macrumors newbie
Feb 18, 2008
24
0
I can't wait to see you guys post in the future about Ultra HD, or whatever they call it when 4K or 2K is the standard broadcast signal. What's better, a 2160p or 1556p UHDTV? What should I get? Help, someone?? Will I be able to see the difference? I'm so stressed about this.


You can see the difference between 720p and 1080p, it is all a matter of cost.

Mike

www.persistentproductions.net

UltraHD, or better said Ultra High Vision (Super Hi-Vision), is 7680x4320p/60. I saw a demo in Japan and it was great... (including 22.2 audio channels). They displayed the demo on an 11m diagonal screen, and I also saw it at half-res on a 65-inch 4K monitor (the intention for SHV is >100-inch screens). There is a noticeable difference... to the point that it looks virtually real at a distance of 1x the height of the display. BTW, the broadcasting standard is expected to start very soon...
 

err404

macrumors 68030
Mar 4, 2007
2,525
623
This is, quite simply, false.
Please refer to my graph on the first page. Your eyes can only take in so much data. You CAN see a difference (I agree with you), but only at certain distances. Basically, to get the benefits of 1080p and UltraHD you'll have to keep moving closer and closer to the TV, quickly making it 'unwatchable', similar to sitting in the front row of an IMAX theatre.

I think that is a bit of an exaggeration. It's becoming more and more common for people to have setups where 1080p is a noticeable improvement. I have a 50" screen that's about 8-10 ft away, and the difference is basically:
720p - that looks really good
1080p - Damn, that is SHARP!

One thing I have noticed is that the filming of a lot of movies is not that sharp. The softness of the frame makes the improvement less noticeable. That said, a lot of movies are now being filmed with better cameras, and the step up to 1080p is more apparent.
 

cenetti

macrumors 6502
Jan 30, 2008
464
47
I am very happy with my Optoma 720p projector. I am getting an 86" diagonal image from it... and to me a 1080p projector is overkill at this point. I'll probably get one when they come down in price, though...




 

motulist

macrumors 601
Dec 2, 2003
4,235
611
I am very happy with my Optoma 720p projector. I am getting an 86" diagonal image from it... and to me a 1080p projector is overkill at this point. I'll probably get one when they come down in price, though...

Yes, and your contrast level ranges all the way from medium gray to dull off-white. Forget higher resolution or size; I much prefer a high-contrast picture with truly black blacks and bright whites, even if it's a small SD image, over a larger HD picture with low contrast. But to each his own.
 

err404

macrumors 68030
Mar 4, 2007
2,525
623
While I wouldn't want to use a projector as my casual viewing screen, the right setup can look fantastic. Resolution aside, they bring a lot to the movie-watching 'experience' that smaller screens just can't compete with.
 