View Full Version : HD displays


Platform
Mar 27, 2005, 12:36 AM
Was thinking here that the 20" ACD is not HD, but the 23" and the 30" are.

How come, then, at the MWSF keynote Steve is so proud that this is the first time with HD projection, and that he has a 20" ACD with all the pixels up there on the wall? :confused:

So why is the 20" displays not HD, or how could that be HD projection with all the pixles from the 20" on the wall :confused:

john1123
Mar 27, 2005, 12:46 AM
Since the PM supports screen spanning, not just mirroring, it is entirely possible that the PM was driving the projector at HD resolution.

Platform
Mar 27, 2005, 12:52 AM
Since the PM supports screen spanning, not just mirroring, it is entirely possible that the PM was driving the projector at HD resolution.

But he says, "Here I have a 20" Cinema Display with every pixel up on the wall. Fantastic, isn't it?"

Rod Rod
Mar 27, 2005, 01:01 AM
The 20" Apple Cinema Display is capable of displaying high definition. The resolution of 720p high definition is 1280x720. 1080i HD is 1920x1200. Both 720p and 1080i are true high definition. Browse HDTVs and you'll see that the ones that display less than 1280x720 are "enhanced definition" whereas those which display 1280x720 or more are true high definition. Most plasma and LCD TVs are 720p native. Only a couple of Sharp Aquos LCD TVs are 1080 native (since they're progressive displays they upconvert everything to 1080p, although no HD satellite or broadcast signal is 1080p.)

Platform
Mar 27, 2005, 01:12 AM
The 20" Apple Cinema Display is capable of displaying high definition. The resolution of 720p high definition is 1280x720. 1080i HD is 1920x1200. Both 720p and 1080i are true high definition. Browse HDTVs and you'll see that the ones that display less than 1280x720 are "enhanced definition" whereas those which display 1280x720 or more are true high definition. Most plasma and LCD TVs are 720p native. Only a couple of Sharp Aquos LCD TVs are 1080 native (since they're progressive displays they upconvert everything to 1080p, although no HD satellite or broadcast signal is 1080p.)

OK thanks a lot

So any display with 1280x720 resolution or more is a true HD display? :confused:

Edit: does it have to be widescreen? :confused:

JackSYi
Apr 7, 2005, 03:23 AM
OK thanks a lot

So any display with 1280x720 resolution or more is a true HD display? :confused:

Edit: does it have to be widescreen? :confused:

I have the same question you do, and with some research I found this great forum thread: http://forums.applenova.com/archive/index.php/t-4489.html

Rod Rod
Apr 7, 2005, 04:40 AM
OK thanks a lot

So any display with 1280x720 resolution or more is a true HD display? :confused:

Edit: does it have to be widescreen? :confused:
Yes, any display with 1280x720 resolution or more is a true HD display.

HD is natively 16:9, but there are "square" 4:3 sets that are proper HDTVs. When they display standard content it takes up the whole screen, and widescreen content is letterboxed.

I'd like to clear up confusion about 1080i being a "higher end" HD and 720p a lesser version of HD. 1080i and 720p have a similar number of pixels per second.

1080i is 60 fields per second, with each field's resolution 1920x540.
1920 x 540 x 60 = 62208000 pixels/second

720p is 60 frames per second, with each frame's resolution 1280x720.
1280 x 720 x 60 = 55296000 pixels/second
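
If you want to double-check those figures, here's the same arithmetic as a quick Python sketch (the variable names are just mine):

# 1080i: 60 interlaced fields per second, each 1920x540.
# 720p:  60 full frames per second, each 1280x720.
rate_1080i = 1920 * 540 * 60
rate_720p = 1280 * 720 * 60
print(rate_1080i)                # 62208000 pixels/second
print(rate_720p)                 # 55296000 pixels/second
print(rate_1080i / rate_720p)    # 1.125, i.e. 1080i carries 12.5% more raw pixels per second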

According to the numbers, 1080i seems to have an advantage. However, for the signal to reach you it has to get pretty heavily compressed, and for the same bandwidth the 720p signal is slightly less compressed. Therefore 720p has fewer compression artifacts than 1080i.

Besides that, interlacing is the most ancient form of compression artifact, and 1080i has that inherently. A deinterlaced 1080i signal (upconverted to 1080p) will look great depending on the quality of the deinterlacer, but I doubt it'll look better than a 720p signal upconverted to 1080p because the 720p signal has far more vertical resolution.

ftaok
Apr 7, 2005, 08:01 AM
Yes, any display with 1280x720 resolution or more is a true HD display.


Well, just to be nitpicky: for a display to be considered hi-def (meaning it can use the HDTV logo), it needs to be able to display at least 720 horizontal lines. There's also a clause that says it has to display in 16:9 widescreen. Many smaller plasmas (43" and under) have a resolution of 1024x768 and are 16:9. The pixels in this case are rectangular.
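
To see why those pixels end up rectangular, here's a quick Python sketch of the arithmetic (the numbers are just the 1024x768-on-16:9 case mentioned above):

# A 1024x768 grid is 4:3, but the panel itself is 16:9,
# so each pixel has to be stretched horizontally.
display_aspect = 16 / 9        # physical shape of the screen
grid_aspect = 1024 / 768       # 4:3 shape implied by the pixel counts
pixel_aspect = display_aspect / grid_aspect
print(pixel_aspect)            # ~1.333: each pixel is about a third wider than it is tall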

Platform
Apr 7, 2005, 08:06 AM
Yes, any display with 1280x720 resolution or more is a true HD display.

HD is natively 16:9, but there are "square" 4:3 sets that are proper HDTVs. When they display standard content it takes up the whole screen, and widescreen content is letterboxed.

I'd like to clear up confusion about 1080i being a "higher end" HD and 720p a lesser version of HD. 1080i and 720p have a similar number of pixels per second.

1080i is 60 fields per second, with each field's resolution 1920x540.
1920 x 540 x 60 = 62208000 pixels/second

720p is 60 frames per second, with each frame's resolution 1280x720.
1280 x 720 x 60 = 55296000 pixels/second

According to the numbers, 1080i seems to have an advantage. However, for the signal to reach you it has to get pretty heavily compressed, and for the same bandwidth the 720p signal is slightly less compressed. Therefore 720p has fewer compression artifacts than 1080i.

Besides that, interlacing is the most ancient form of compression artifact, and 1080i has that inherently. A deinterlaced 1080i signal (upconverted to 1080p) will look great depending on the quality of the deinterlacer, but I doubt it'll look better than a 720p signal upconverted to 1080p because the 720p signal has far more vertical resolution.

Thanks a lot for making it so clear :D ;)

MisterMe
Apr 7, 2005, 08:42 AM
....

1080i is 60 fields per second, with each field's resolution 1920x540.
1920 x 540 x 60 = 62208000 pixels/second

720p is 60 frames per second, with each frame's resolution 1280x720.
1280 x 720 x 60 = 55296000 pixels/second

According to the numbers, 1080i seems to have an advantage. However, for the signal to reach you it has to get pretty heavily compressed, and for the same bandwidth the 720p signal is slightly less compressed. Therefore 720p has fewer compression artifacts than 1080i.
Wow, have you gone off the path! There is ample bandwidth for a 1080i program stream plus one or more additional program streams in a digital broadcast. One reason that a 1080i digital program may have additional compression is that the broadcaster may be "stealing" bandwidth for more digital streams than are provided by the standard. Another reason is that the program may be a converted 720p or 480p stream. For example, VOOM upconverts everything to 1080i irrespective of source.

Besides that, interlacing is the most ancient form of compression artifact, and 1080i has that inherently. A deinterlaced 1080i signal (upconverted to 1080p) will look great depending on the quality of the deinterlacer, but I doubt it'll look better than a 720p signal upconverted to 1080p because the 720p signal has far more vertical resolution.
You don't get 1080p by upconverting 1080i. 1080i displays one 540 half-frame every 1/60th second. Two consecutive half-frames are interlaced to give a complete frame each 1/30th second. You get 1080p by buffering the first 540 half-frame until the second 540 half-frame is received. The two halves are then combined and displayed at 30 frames/second. In other words, 1080p displays the same frame rate [and pixel rate] as 1080i.
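
For what it's worth, here's a toy Python sketch of the simple field-pairing ("weave") idea being described; real TVs use much smarter motion-adaptive deinterlacers, and the scanline lists here are only placeholders:

# Pair two 1080i fields (each 540 scanlines of 1920 pixels) into one
# 1080-line progressive frame. 60 fields/second in -> 30 frames/second out.
def weave(top_field, bottom_field):
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)      # frame lines 0, 2, 4, ...
        frame.append(bottom_line)   # frame lines 1, 3, 5, ...
    return frame

fields = [["field%d_line%d" % (i, j) for j in range(540)] for i in range(4)]
frames = [weave(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]
print(len(frames), "frames,", len(frames[0]), "lines each")   # 2 frames, 1080 lines each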

Poff
Apr 7, 2005, 09:49 AM
Hmm.. I've often heard 720 referred to as half-assed or untrue HD, especially by people working in cinemas.

How come Apple calls their 23-inchers and 30-inchers Cinema HD Displays, and their 20-incher just Cinema Display, if all three are *true* HD displays?

http://www.apple.com/displays/

ftaok
Apr 7, 2005, 09:56 AM
Hmm.. I've often heard 720 referred to as half-assed or untrue HD, especially by people working in cinemas.

How come Apple calls their 23-inchers and 30-inchers Cinema HD Displays, and their 20-incher just Cinema Display, if all three are *true* HD displays?

http://www.apple.com/displays/
I'm not sure what the HD standards are in Norway, but in the US there are two standards, 720p and 1080i, as mentioned above.

You have to realize that there are strengths and weaknesses to both approaches, but both can be considered HD. The advantage of 1080i is spatial resolution. This means that on static or slow-moving scenes, 1080i will have more detail than 720p. Where 720p excels is temporal resolution. Fast-moving scenes or sports events will tend to "flow" better with 720p. Of course, there are other variables that come into play here.

In the end, the US has two different standards, which creates a mess and adds to confusion. Blame Zenith for this. Oh wait, they're no longer around.

devman
Apr 7, 2005, 10:12 AM
Hmm.. I've often heard 720 referred to as half-assed or untrue HD, especially by people working in cinemas.

How come Apple calls their 23-inchers and 30-inchers Cinema HD Displays, and their 20-incher just Cinema Display, if all three are *true* HD displays?

http://www.apple.com/displays/

Correct. Apple only gives the HD moniker to a display that has at least 1080 lines (height).

Rod Rod
Apr 7, 2005, 11:03 AM
Wow, have you gone off the path! There is ample bandwidth for a 1080i program stream plus one or more additional program streams in a digital broadcast. One reason that a 1080i digital program may have additional compression is that the broadcaster may be "stealing" bandwidth for more digital streams than are provided by the standard. Another reason is that the program may be a converted 720p or 480p stream. For example, VOOM upconverts everything to 1080i irrespective of source.
I'm talking about the transmission bandwidth PER CHANNEL, not the transmission bandwidth for all channels added together. They're all generally 19 Mbps streams per channel. I'm sorry this was hard to follow.

You don't get 1080p by upconverting 1080i. 1080i displays one 540 half-frame every 1/60th second. Two consecutive half-frames are interlaced to give a complete frame each 1/30th second. You get 1080p by buffering the first 540 half-frame until the second 540 half-frame is received. The two halves are then combined and displayed at 30 frames/second. In other words, 1080p displays the same frame rate [and pixel rate] as 1080i.
I was referring to the LCD HDTV monitors available today which are 1920x1080 native. LCDs and plasmas are progressive displays. I know how interlacing works, thank you. The "buffering" you speak of doesn't apply to the deinterlacing schemes used by 1920x1080 LCDs and plasmas to upconvert a 1080i signal to 1080p.

I suppose I have to give you a link.
http://www.crutchfield.com/S-ykkI6byHP89/cgi-bin/ProdView.asp?g=153650&id=essential_info&i=610DV3750
Read the first line under the heading "Key Features." This TV displays all signals at 1080p by way of a scaler/deinterlacer.

Hmm.. I've often heard 720 referred to as half-assed or untrue HD, especially by people working in cinemas.

How come Apple calls their 23-inchers and 30-inchers Cinema HD Displays, and their 20-incher just Cinema Display, if all three are *true* HD displays?

http://www.apple.com/displays/
People who call 720p halfway or untrue don't know what they're talking about. :) How Apple chooses to name their displays has everything to do with marketing. The current 15" PowerBook is HD, but they won't designate a PB as "HD" until/unless it has at least 1920 pixels across, as devman points out. ftaok's explanation of the spatial vs. temporal resolution advantages of 1080i compared to 720p is very succinctly put.

MisterMe
Apr 7, 2005, 01:17 PM
I'm talking about the transmission bandwidth PER CHANNEL, not the transmission bandwidth for all channels added together. They're all generally 19 Mbps streams per channel. I'm sorry this was hard to follow.
You don't seem to understand how digital TV broadcasting works. Each broadcaster transmits multiple program streams on a single NTSC channel. By compressing their HD stream more than the standard specifies, they are able to transmit more simultaneous program streams than they would if they followed the standard.
I was referring to the LCD HDTV monitors available today which are 1920x1080 native. LCDs and plasmas are progressive displays. I know how interlacing works, thank you. The "buffering" you speak of doesn't apply to the deinterlacing schemes used by 1920x1080 LCDs and plasmas to upconvert a 1080i signal to 1080p.
No broadcaster transmits in 1080p. The only way that an HDTV can display a progressive scan image is to buffer the first half-frame, combine it with the second half-frame, and display the complete frame progressively. That is "deinterlacing."
I suppose I have to give you a link.
http://www.crutchfield.com/S-ykkI6byHP89/cgi-bin/ProdView.asp?g=153650&id=essential_info&i=610DV3750
Read the first line under the heading "Key Features." This TV displays all signals at 1080p by way of a scaler/deinterlacer.
So you rely on vendors for your information? I have nothing against Crutchfield. I may buy my second HDTV from them.
People who call 720p halfway or untrue don't know what they're talking about. :) How Apple chooses to name their displays has everything to do with marketing. The current 15" PowerBook is HD, but they won't designate a PB as "HD" until/unless it has at least 1920 pixels across, as devman points out. ftaok's explanation of the spatial vs. temporal resolution advantages of 1080i compared to 720p is very succinctly put.
First, 720p is defined as HDTV but it is not an issue in this discussion. You are mistaken about the rest. 1920 horizontal pixels are not required. My 37" Sharp has a resolution of 1366 x 768. A 1280 x 720 display gives the requisite square pixel HD resolution. Only the most expensive HD monitors and integrated TV sets can display 1920 x 1080 at full resolution.

Rod Rod
Apr 7, 2005, 01:49 PM
You don't seem to understand how digital TV broadcasting works. Each broadcaster transmits multiple program streams on a single NTSC channel. By compressing their HD stream more than the standard specifies, they are able to transmit more simultaneous program streams than they would if they followed the standard.
Where did you get that information, "multiple program streams on a single NTSC channel"? My understanding is that digital and analog broadcasts are on different frequencies. OTA HD goes out on different antennas, and broadcasters were given new frequencies for DTV broadcast.

No broadcaster transmits in 1080p. The only way that an HDTV can display a progressive scan image is to buffer the first half-frame, combine it with the second half-frame, and display the complete frame progressively. That is "deinterlacing."
Red herring #1. I never said any broadcaster transmits in 1080p. However, 1080p IS part of the ATSC spec.

"The only way" you're talking about applies to CRT TVs but not all plasmas and LCDs. On plasmas and LCDs, what you described will give interlace artifacts. The point of deinterlacing is to remove those comb line artifacts.

Have you ever watched interlaced material on a computer monitor?

So you rely on vendors for your information? I have nothing against Crutchfield. I may buy my second HDTV from them.
Red herring #2.

First, 720p is defined as HDTV but it is not an issue in this discussion. You are mistaken about the rest. 1920 horizontal pixels are not required. My 37" Sharp has a resolution of 1366 x 768. A 1280 x 720 display gives the requisite square pixel HD resolution. Only the most expensive HD monitors and integrated TV sets can display 1920 x 1080 at full resolution.
Red herring #3. In the part you quoted, I was referring to what Apple calls HD for marketing purposes. If you bothered to read my other posts in this thread you'd see that I clearly stated that anything above 1280x720 is HD.

ChrisFromCanada
Apr 7, 2005, 02:40 PM
MisterMe, Rod Rod is right. I have a 152" front projection TV and I watch both 720p and 1080i. They both are HD and they both look like HD. 1080i has motion artifacts when used for things like sports, and 720p always looks smooth and excels at things like sports, but when looking at something like a news broadcast, 1080i does look better than 720p.

MisterMe
Apr 7, 2005, 08:10 PM
Where did you get that information, "multiple program streams on a single NTSC channel"? My understanding is that digital and analog broadcasts are on different frequencies. OTA HD goes out on different antennas, and broadcasters were given new frequencies for DTV broadcast.
Digital broadcasts occupy frequencies that are new to the individual broadcaster, not new to the television broadcast spectrum. Your local HDTV broadcaster uses a previously unoccupied NTSC channel. Let's pretend that WUBX TV8 was your favorite channel as a kid. You may see the station listed now as WUBX TV/DT 8 or WUBX TV8/DT 27. In these examples WUBX broadcasts its historic analog program stream using the full 6 MHz bandwidth of Channel 8. It also broadcasts several digital signals on Channel 27. However, your HDTV tuner will identify the multiple program streams as DT 8.1, DT 8.2, DT 8.3, etc. A typical mix would be the HDTV program on DT 8.1, a digital simulcast of the standard definition program on DT 8.2, a 24-hour weathercast on DT 8.3, and the local cable news channel on DT 8.4. Also included in the broadcast should be a program guide for each stream. All of these digital streams fit within the same 6 MHz spectrum space as NTSC Channel 27. Of the program streams listed above, the HDTV stream consumes the most bandwidth. By compressing this stream more than ATSC specifies, they can "steal" bandwidth for additional standard definition program streams or even non-television uses such as data communication.
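As a rough back-of-the-envelope picture of that budget, here's a small Python sketch (the ~19.39 Mbps payload is the standard ATSC figure for a 6 MHz channel, but the per-stream bitrates below are made-up examples, not real station numbers):

# One 6 MHz ATSC channel carries roughly a 19.39 Mbps payload,
# which the broadcaster divides among its sub-channels.
ATSC_PAYLOAD_MBPS = 19.39
streams = {
    "DT 8.1 HD program": 12.0,     # squeezed below a "full quality" HD rate
    "DT 8.2 SD simulcast": 3.5,
    "DT 8.3 weather": 2.0,
    "DT 8.4 local news": 1.5,
}
used = sum(streams.values())
print("used %.2f of %.2f Mbps" % (used, ATSC_PAYLOAD_MBPS))
print("%.2f Mbps left over for extra streams or data" % (ATSC_PAYLOAD_MBPS - used))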
"The only way" you're talking about applies to CRT TVs but not all plasmas and LCDs. On plasmas and LCDs, what you described will give interlace artifacts. The point of deinterlacing is to remove those comb line artifacts.I did not say there aren't artifacts. They appear on my TV. I suppose that you see them on HDTVs in the store. It is your explanation of them that is incorrect.
Red herring #3. In the part you quoted, I was referring to what Apple calls HD for marketing purposes. If you bothered to read my other posts in this thread you'd see that I clearly stated that anything above 1280x720 is HD.
It's good to know that we agree on something.