AppleTV - 1080P A Different Way




PearlMikeJam
Nov 2, 2007, 11:09 AM
I have finally upgraded to HD and am now considering an AppleTV. I am concerned about the picture quality in the demos I have seen. Many of the discussions talk about the next gen AppleTV having 1080P capabilities, but it seems that people are expecting 1080P content. Why can't the next AppleTV upconvert to 1080P? I do not understand why a $50 DVD player can do that but a more sophisticated piece of hardware like the AppleTV cannot. That could improve the picture quality and allow future HD content on iTunes to be a more reasonable size at 720P.



zakatov
Nov 2, 2007, 11:15 AM
I have finally upgraded to HD and am now considering an AppleTV. I am concerned about the picture quality in the demos I have seen. Many of the discussions talk about the next gen AppleTV having 1080P capabilities, but it seems that people are expecting 1080P content. Why can't the next AppleTV upconvert to 1080P? I do not understand why a $50 DVD player can do that but a more sophisticated piece of hardware like the AppleTV cannot. That could improve the picture quality and allow future HD content on iTunes to be a more reasonable size at 720P.

There is no 1080p content for the :apple:TV, so let your TV upconvert; it'll be the same thing.

HobeSoundDarryl
Nov 2, 2007, 04:20 PM
...And upconverting is not creating real 1080p. A real 1080p picture is going to have picture elements (pixels) based upon capturing an image at the max resolution of HD. Upconverting is using technology to take less data and basically guess what colors the pixels it is "filling in" should be.

Now, upscalers can make some good guesses, but no one should ever fool themselves into believing that you can get 1080p from, say, current SD-DVD video. An upscaler could take lowly 320x240 video and scale it up to 1080p too, but that also wouldn't be real 1080p (and you could easily tell with that much scaling).

I'm all for :apple:TV V2 having the capability to output 1080p HD source as 1080p. But it definitely will not be 1080p if it is getting there by upscaling some lower-res source.

MikieMikie
Nov 3, 2007, 07:28 AM
I have finally upgraded to HD and am now considering an AppleTV. I am concerned about the picture quality in the demos I have seen. Many of the discussions talk about the next gen AppleTV having 1080P capabilities, but it seems that people are expecting 1080P content. Why can't the next AppleTV upconvert to 1080P? I do not understand why a $50 DVD player can do that but a more sophisticated piece of hardware like the AppleTV cannot. That could improve the picture quality and allow future HD content on iTunes to be a more reasonable size at 720P.

If you are concerned about the picture quality you have seen in :apple:TV demos, I encourage you not to purchase one, especially if you have seen DVD-quality source.

In addition, I encourage you to not leap to an assumption that there will be any more revisions/versions of :apple:TV than what already exist. Therefore, don't "wait" for a 1080p version. For Apple to ship a 1080p version would imply that there was enough source material to justify the cost of development and marketing. Such source does not exist on the ITMS, nor does it appear to be likely that it ever will. Selling razors and razor blades is what Apple is all about these days.

The :apple:TV can upconvert to 1080i, which I assure you looks exactly like 1080p, since most sets convert 1080i to 1080p in their frame buffers. If you weren't impressed with the demos, you were probably watching iPod quality video, downloaded from the ITMS. Ripped and encoded DVDs produce a DVD-quality viewing experience.

But as others have said better than I: it isn't the pixels the scaler is guessing at, it's the pixels in the source material that matter.

-- Mikie

fabianjj
Nov 4, 2007, 03:37 PM
I don't know that much about video scaling, but does up-scaling basically mean that it colors each pixel separately instead of just using ten pixels to display one pixel of video content?

skye12
Nov 4, 2007, 08:36 PM
I have a Sony XBR4 LCD, which is 1080p capable. 1080i off my DirecTV dish looks great, but Blu-ray 1080p discs are in a class by themselves.

The detail has to be seen to be appreciated. Upscaling is not even close.

HobeSoundDarryl
Nov 6, 2007, 04:08 PM
fabianjj, upscaling is a technological way of trying to create something (a higher-resolution picture) from nothing. Your computer can do upscaling when you size any picture larger than 100%. The computer looks at the pixels that are now spread apart and offers a "best guess" at what colors the gaps should be.

Video scalers are basically doing the same thing, though the guessing can get more sophisticated because the scaler can look at colors that are coming (in upcoming frames) to help influence the guessing. BUT, upscaling to 1080i or 1080p from a source that is maxed out at SD levels is far from watching the same video natively in 1080i or 1080p.
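
To make that concrete, here is a minimal sketch of what a naive scaler does: spread the original pixels over a larger grid and fill the gaps with weighted averages of their neighbours (bilinear interpolation). The tiny 4x4 "image" and the plain-Python implementation are purely illustrative (real scalers, including whatever is in a TV or set-top box, use far more sophisticated filters), but the point stands: the new pixels are educated guesses, not recovered detail.

```python
# Illustrative only: a naive bilinear upscaler in plain Python.
# The 4x4 "image" is made up; a real frame would be e.g. 720x480 -> 1920x1080.

def upscale_bilinear(src, new_w, new_h):
    """Scale a 2D list of grayscale pixel values up to new_w x new_h."""
    old_h, old_w = len(src), len(src[0])
    out = []
    for y in range(new_h):
        # Map each output coordinate back into the source grid.
        sy = y * (old_h - 1) / (new_h - 1)
        y0 = int(sy)
        y1 = min(y0 + 1, old_h - 1)
        fy = sy - y0
        row = []
        for x in range(new_w):
            sx = x * (old_w - 1) / (new_w - 1)
            x0 = int(sx)
            x1 = min(x0 + 1, old_w - 1)
            fx = sx - x0
            # Each output pixel is a weighted average of its four nearest
            # source pixels -- a guess at the gap, not new detail.
            top = src[y0][x0] * (1 - fx) + src[y0][x1] * fx
            bot = src[y1][x0] * (1 - fx) + src[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

tiny = [[10, 20, 30, 40],
        [20, 30, 40, 50],
        [30, 40, 50, 60],
        [40, 50, 60, 70]]
bigger = upscale_bilinear(tiny, 8, 8)   # "upscaled", but no extra detail exists
```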

This thread is about the idea that an :apple:TV plus a scaler (built in or attached) could yield a 1080i/p :apple:TV. While this is certainly arguable, my vote would be absolutely not, much like a jpeg scaled beyond 100% is giving up picture quality for size. Technically a scaler can get SD video to a 1920x1080i/p measurement, but it gives up a lot of detail (to the guessing of what goes in the gaps) to do so.

PearlMikeJam
Nov 6, 2007, 07:52 PM
I want to make a couple of points to help clarify my original post.

First, I understand that 1080p content is not likely going to be on the iTMS any time soon; however, 720p could be more realistic in the near future. If 720p is not in the cards, it would make sense to improve the output quality of :apple:TV. Upscaling would be one way to do that.

Second, IMO, upscaling makes a difference. I have watched and compared the same DVD on both an older non-scaling DVD player and an upscaling player and the difference is noticeable.

Finally, I realize that TVs convert 1080i to 1080p, but to me, 1080i (converted to 1080p by the TV) pales in comparison to 1080p and 720p native quality.

MikieMikie
Nov 7, 2007, 03:12 PM
Finally, I realize that TVs convert 1080i to 1080p, but to me, 1080i (converted to 1080p by the TV) pales in comparison to 1080p and 720p native quality.

I usually leave posts like these alone, but I am just beside myself with curiosity (hey, self, get back over here!).

If a TV converts 1080i into 1080p, and they share the same number of pixels, how the hell can you see a difference?

-- Mikie

peeaanuut
Nov 7, 2007, 04:42 PM
Seeing the difference between 1080i and 1080p is like hearing the difference between a 320kbps mp3 and a 400kbps mp3. No one can tell the difference. 1080p is a marketing ploy and people are buying into it. While technically it makes sense, it doesn't matter because the difference can't be seen by the naked eye.

TheMechanic
Nov 7, 2007, 04:55 PM
I usually leave posts like these alone, but I am just beside myself with curiosity (hey, self, get back over here!).

If a TV converts 1080i into 1080p, and they share the same number of pixels, how the hell can you see a difference?

-- Mikie

Actually a frame in 1080i has only half the pixels of a frame in 1080p. On the other hand, 1080i in comparison has double the frame rate. The problem here is interlacing (that is what the "i" stands for); it was developed to gain better video quality on CRT displays. But non-CRT displays cannot properly display interlaced video; instead they use progressive scan to display video. So technically every HDTV can only display 1080p content, and to display 1080i it has to do something like upscaling.
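
To put rough numbers on that (back-of-the-envelope arithmetic only, not anything a TV or the :apple:TV actually computes):

```python
# Illustrative pixel arithmetic for 1080i vs 1080p.
width, height = 1920, 1080

pixels_per_progressive_frame = width * height         # 2,073,600
pixels_per_interlaced_field  = width * (height // 2)  # 1,036,800 (odd OR even lines)

# 1080p24 delivers 24 full frames per second; 1080i60 delivers 60 half-height
# fields per second, i.e. roughly 30 full-frame pairs per second.
print(pixels_per_progressive_frame, pixels_per_interlaced_field)
```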

Maybe you want to read one or all of these Wikipedia articles; they're quite interesting.

http://en.wikipedia.org/wiki/1080i
http://en.wikipedia.org/wiki/Interlacing
http://en.wikipedia.org/wiki/Deinterlacing

RubberChicken
Nov 7, 2007, 06:34 PM
I have finally upgraded to HD and am now considering an AppleTV. I am concerned about the picture quality in the demos I have seen. Many of the discussions talk about the next gen AppleTV having 1080P capabilities, but it seems that people are expecting 1080P content. Why can't the next AppleTV upconvert to 1080P? I do not understand why a $50 DVD player can do that but a more sophisticated piece of hardware like the AppleTV cannot. That could improve the picture quality and allow future HD content on iTunes to be a more reasonable size at 720P.

Indeed, it seems reasonable to expect AppleTV to be able to upscale, when so many other "less sophisticated" devices are able to. The necessity, however, is really dictated by the other bits of equipment you use. If AppleTV gets upscaling ability, it may not be as good as the dedicated upscaling chip you might find in a high-quality receiver or HDTV. Those would likely do a better job, so an upscaling AppleTV would be undesirable in that case.

If you are unhappy with the display capability of your HDTV when watching standard DVD output compared to upscaled output from a $50 DVD player, then you may have made the wrong choice with your HDTV. Not all HDTVs offer upscaling and not all upscalers are equal. I just got a new Sony XBR and DA3300ES receiver - both have decent upscaling, but the XBR actually offers more control over the aspects of the upscaling, and offers this for each input. In my mind the XBR may still be preferable to the receiver, even though the receiver has a top-class Faroudja DCDi upscaler, because the receiver has no per-input control.

And from your second post, if 720p looks better on your HDTV than 1080i, then you may not have made the right choice in HDTV.

However, I think it's quite reasonable for you to expect upscaling ability in AppleTV, even if it should not really be necessary or desirable in some cases.

tronic72
Nov 10, 2007, 11:02 AM
I think too many people are getting caught up in technical details when it comes to HD.

There are heaps of posts (Google it) that explain that there is often little or no visual difference between 720p, 1080p and 1080i. Don't get me wrong, I think if you can afford to get the highest-spec device when purchasing then go for it. But I just think it's important to remember that in many instances the difference between standards is often minuscule.

Remember; If the content isn't created in the same level HD format, then there is no benefit in the higher quality HD format.

Surround sound has suffered similar hype, with many amps supporting 6.1, 7.1 and even 9.1. But very little content has been created that supports anything greater than 5.1, so these expensive 9.1 amps are really only "making it up" when it comes to the remaining channels.

Kevin_B
Nov 26, 2007, 03:52 PM
It's a bit complicated, but the fact is that when looking at footage that originated at 24fps, it doesn't matter a lick if it's 1080i or 1080p. They're identical.

In many cases TVs don't do "something like upscaling" to display 1080i as 1080p. In order to maintain compatibility with broadcast specs, material that was originally shot at 24 frames per second progressive (every motion picture out there) has what's called "3:2 pulldown" applied to it, which essentially results in interlaced footage that plays back at 29.97fps (3 fields from one film frame, 2 from the next, 3, 2, and so on). This is how movies are broadcast/encoded for 1080i. When most modern TVs sense footage with 3:2 pulldown in it, they take the appropriate steps to reverse the 3:2 pulldown, combining the interlaced fields back into whole frames and dropping the frame rate back down to the original 24fps. The result is 24fps progressive footage that looks exactly like the source did - absolutely no loss is incurred.
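
For anyone who wants to see the cadence written out, here is a rough sketch of the idea. The "frames" are just labels rather than real video, and the cadence is simplified compared with genuine 3:2 pulldown (which interleaves fields inside a 29.97fps stream), so treat it purely as an illustration:

```python
# Simplified 3:2 pulldown and its removal (inverse telecine), illustration only.

def pulldown_3_2(film_frames):
    """Spread 24fps film frames across fields using the 3:2 cadence."""
    fields, parity = [], 0
    for i, frame in enumerate(film_frames):
        copies = 3 if i % 2 == 0 else 2          # 3 fields, then 2, then 3, ...
        for _ in range(copies):
            fields.append((frame, "top" if parity == 0 else "bottom"))
            parity ^= 1
    return fields

def remove_pulldown(fields):
    """Collapse the repeated fields back into the original whole frames."""
    frames, last = [], None
    for frame, _field in fields:
        if frame != last:                        # a new source frame starts here
            frames.append(frame)
            last = frame
    return frames

film = ["A", "B", "C", "D"]                      # 4 frames of 24fps film
broadcast = pulldown_3_2(film)                   # 10 fields of ~60i video
assert remove_pulldown(broadcast) == film        # nothing was lost
```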

Cheers,

Kevin

Peace
Nov 26, 2007, 03:55 PM
I think resolution independence will work for video in the future.;)

Killyp
Nov 26, 2007, 04:15 PM
Seeing the difference between 1080i and 1080p is like hearing the difference between a 320kbps mp3 and a 400kbps mp3. No one can tell the difference. 1080p is a marketing ploy and people are buying into it. While technically it makes sense, it doesn't matter because the difference can't be seen by the naked eye.

I disagree. In fact I tend to avoid any interlaced signals if possible; too many motion artifacts (even on the 1080i signals). 720p to my eyes looks much better.

As for upscaling, the latest algorithms really do an impressive job. In fact, I've seen footage on a 65" plasma I could swear was HD, but ended up being a normal DVD. Things have progressed a lot recently...

technocoy
Nov 26, 2007, 04:37 PM
Yeah, we had this argument years ago with 480i vs 480p.

Your average Wal-Mart shopper will, in fact, more than likely not see that much of a difference unless the sets are side by side, but those who are saying there is no difference between progressive content and interlaced content have not seen it in action with true 1080p content delivery. It's actually more pronounced than the transition from 480i to 480p.

It REALLY shines on Blu-ray/HD DVD... I feel like I'm looking at a moving painting on some films, especially Pixar films, because of the bold colors.

pilotError
Nov 26, 2007, 04:47 PM
Wow, there is so much misinformation in this thread it's mind-boggling!

The real question is what are you looking to get out of it?

Are you looking for an easy way to extend your Mac or PC / iTunes?

Are you looking for a high-quality streaming device?

The real difference in 720p vs. 1080i vs. 1080p will be in your hardware. If you have a... inexpensive TV, chances are the scaling hardware in the TV is... inexpensive as well. You really get what you pay for. Compare a Vizio to a Pioneer or Panasonic.

If you're looking for HD streaming, you're better off looking at a Slingbox Pro. If you want a less expensive way to extend your iTunes and iPhoto, then the Apple TV fits the bill. Is it the best quality? Maybe... Depends on what kind of TV you have. If you're looking for real HD streaming, expect to pay much more for it.

The Apple TV does what it's advertised to do, nothing more, nothing less, unless you hack it. Like others have said, you won't get 5.1 sound out of the current software. It remains to be seen if Apple continues to develop it; hopefully we'll get a decent refresh soon to address some of its shortcomings.

wmealer
Nov 26, 2007, 04:56 PM
I have finally upgraded to HD and am now considering an AppleTV. I am concerned about the picture quality in the demos I have seen. Many of the discussions talk about the next gen AppleTV having 1080P capabilities, but it seems that people are expecting 1080P content. Why can't the next AppleTV upconvert to 1080P? I do not understand why a $50 DVD player can do that but a more sophisticated piece of hardware like the AppleTV cannot. That could improve the picture quality and allow future HD content on iTunes to be a more reasonable size at 720P.

If you've only seen iTunes content on :apple:tv, I can certainly understand your concern. Good quality DVD rips on :apple:tv look just as good as any upconverting DVD player can output.

IMHO, upconverting DVD players are over-hyped. Any HDTV can do the upconversion on its own, simply because it has to be capable of taking 480i analog content and displaying it on that 1920 x 1080 pixel canvas. The DVD player may perform slightly better than your TV in the long run, but we all know it can't magically add detail that isn't already there.

If you're looking for pristine Blu-ray quality, I'm afraid the :apple:tv will never be able to deliver that. FWIW, the 720p content I watch on my :apple:tv puts my 1080i DishHD picture to shame (mostly due to bitrate of course). It wouldn't be a bad thing for :apple:tv to output 1080p, but I'd call it completely unnecessary until iTunes offers 1080p content (which in my estimation will be never).

wmealer
Nov 26, 2007, 05:12 PM
And the folks comparing 1080i to 1080p are leaving out one very important detail... the TV. 1080i only looks like 1080p on a 1080p set. 1080i on a 720p/1080i set looks more like 720p or 768p (depending on the number of actual pixels).

Obvious to some... not so much for others.

Kevin_B
Nov 26, 2007, 05:20 PM
Your average Wal-Mart shopper will, in fact, more than likely not see that much of a difference unless the sets are side by side, but those who are saying there is no difference between progressive content and interlaced content have not seen it in action with true 1080p content delivery. It's actually more pronounced than the transition from 480i to 480p.

Not to come off as sounding like a know-it-all, but having worked in the film industry for over a decade I consider myself a bit more discerning than a Wal-Mart shopper. ;)

There is zero difference, pixel for pixel, between a 29.97fps 1080i signal that's had 3:2 pulldown removed from it and a 23.976fps 1080p signal. Not one pixel different - 3:2 pulldown removal is a totally lossless process that has nothing to do with conversions or scaling. If you say you can see the difference, you're either lying or something is drastically wrong with your TV.

I only reinforce this point to help people separate truth from marketing. If your TV can't display 1080p, that's another story altogether (although, even in that case, 1080p content wouldn't do you any good since your player would introduce 3:2 pulldown before sending a 1080i-compatible signal out of your Blu-ray/HD DVD player).

Cheers,

Kevin

bimmzy
Nov 26, 2007, 05:51 PM
I usually leave posts like these alone, but I am just beside myself with curiosity (hey, self, get back over here!).

If a TV converts 1080i into 1080p, and they share the same number of pixels, how the hell can you see a difference?

-- Mikie

Well, they don't share the same pixels; 1080p is sharper and has nearly double the resolution of 1080i. In fact 720p looks sharper than 1080i, believe it or not!

bimmzy
Nov 26, 2007, 06:08 PM
Not to come off as sounding like a know-it-all, but having worked in the film industry for over a decade I consider myself a bit more discerning than a Wal-Mart shopper. ;)

There is zero difference, pixel for pixel, between a 29.97fps 1080i signal that's had 3:2 pulldown removed from it and a 23.976fps 1080p signal. Not one pixel different - 3:2 pulldown removal is a totally lossless process that has nothing to do with conversions or scaling. If you say you can see the difference, you're either lying or something is drastically wrong with your TV.

I only reinforce this point to help people separate truth from marketing. If your TV can't display 1080p, that's another story altogether (although, even in that case, 1080p content wouldn't do you any good since your player would introduce 3:2 pulldown before sending a 1080i-compatible signal out of your Blu-ray/HD DVD player).

Cheers,

Kevin

Continuing on that theme: 1080p's pixels are square, 1080i's pixels are non-square, so in horizontal resolution 1080i is a third softer.

On the issue of frame rates, 720p will look as sharp as 1080i when showing a film, but for sports broadcasts 720p 50/60 will look sharper than the 1080i 50/60 equivalent, especially when something in the scene is moving in frame. Add to this the need to compress the source signal and the 'p' formats perform better than the 'i' formats. :D

Kevin_B
Nov 26, 2007, 06:27 PM
Continuing on that theme: 1080p's pixels are square, 1080i's pixels are non-square, so in horizontal resolution 1080i is a third softer.

On the issue of frame rates, 720p will look as sharp as 1080i when showing a film, but for sports broadcasts 720p 50/60 will look sharper than the 1080i 50/60 equivalent, especially when something in the scene is moving in frame. Add to this the need to compress the source signal and the 'p' formats perform better than the 'i' formats. :D

Hi bimmzy - I'm not sure if you're flaming or not, but I'll take the bait and assume you aren't. Where are you getting your information from?

The fact is that 1080i is square pixels, just like 1080p. The only difference in pixel aspect ratio comes if you choose to shoot 1080i on low-end consumer products like HDV cameras.

In regard to resolution differences, 1080i footage that originated on film and has had 3:2 pulldown removed has exactly the same resolution (1920x1080) as 1080p does. No more, no less.

Agreed on the "p" vs. "i" sharpness when you start talking about the way material is shot. If it was shot progressive (as all major motion pictures are), it's always going to look sharper. Some applications are better shot interlaced (sports, for example) because, even though you get effectively half the vertical resolution, you get double the temporal resolution. That means smoother motion and, even though the resolution is technically halved on any given "frame" out of the 59.94 per second, the way your eye works at those frame rates (and the fact that each field is actually captured at the right place in the 1920x1080 framework) blends fields together and the perceived resolution loss is actually slightly less. If you pause on a frame, it'll look half-res vertically. When playing, it will look 75% or so, but smoother.

Hope that helps to clarify some myths and misperceptions that might be floating around!

Cheers,

Kevin

bimmzy
Nov 27, 2007, 07:35 AM
Hi bimmzy - I'm not sure if you're flaming or not, but I'll take the bait and assume you aren't. Where are you getting your information from?

The fact is that 1080i is square pixels, just like 1080p. The only difference in pixel aspect ratio comes if you choose to shoot 1080i on low-end consumer products like HDV cameras.

In regard to resolution differences, 1080i footage that originated on film and has had 3:2 pulldown removed has exactly the same resolution (1920x1080) as 1080p does. No more, no less.

Agreed on the "p" vs. "i" sharpness when you start talking about the way material is shot. If it was shot progressive (as all major motion pictures are), it's always going to look sharper. Some applications are better shot interlaced (sports, for example) because, even though you get effectively half the vertical resolution, you get double the temporal resolution. That means smoother motion and, even though the resolution is technically halved on any given "frame" out of the 59.94 per second, the way your eye works at those frame rates (and the fact that each field is actually captured at the right place in the 1920x1080 framework) blends fields together and the perceived resolution loss is actually slightly less. If you pause on a frame, it'll look half-res vertically. When playing, it will look 75% or so, but smoother.

Hope that helps to clarify some myths and misperceptions that might be floating around!

Cheers,

Kevin


Without sounding controversial, :eek: I think you may be adding to the myths. ;)

A pixel aspect ratio of square pixels for 1080 is the ideal. It's also the internationally agreed standard for real 1080p.
But most broadcast cameras still fall short of the 1920 x 1080 resolution. I do accept Kevin's point that square pixels in 1080 is the "technical standard", but that's not my point!

In the real world of HD broadcasting having 1920x1080 resolution is great, but the manufacturers of HD equipment aren’t really playing an image quality game. Instead they’re trying to maintain profit margins in a tough and competitive market.

About 95% of current 1080i cameras have 1440 pixels horizontally (rather than 1920 pixels), which works out to 1440/1920 = 75% of full horizontal resolution at best.
In the future all 1080i HDTV cameras will offer the full 1920 pixels, but many popular models are still limited to 1440 pixels, and if you factor in display resolutions as well, pixel stretching is very much a reality.
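
Just to show the arithmetic behind that ratio (the 1440-sample width is real for formats such as HDV and HDCAM; the share of cameras affected is the poster's estimate, not something verified here):

```python
# Horizontal detail actually sampled vs. the nominal 1920-pixel 1080 raster.
recorded_samples, nominal_width = 1440, 1920
print(recorded_samples / nominal_width)   # 0.75 -> 75% of full horizontal resolution
```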

Interlaced or progressive? ESPN, ABC Sports, and Fox Sports in HD cover and broadcast sporting events in the 720p/60 format. The BBC broadcasts in 1080i currently (though they have plans to move to 1080p for the 2012 Olympics), as do a lot of European broadcasters, despite the EBU's insistence that 720p should be the interim standard.
The reason why sport is often originated in 1080i (50/60) is that, like 720p (50/60), it's cheap and only consumes half the bandwidth of 1080p (50/60). Other than that there is no advantage to shooting interlaced over progressive.
60p gives the same or better 'temporal resolution' (smooth motion) as 60i, without the massive drop in vertical resolution attributed to interlaced systems.

"Everybody seems to agree ...... that progressive scanning gives improved motion portrayal.... The choice between 1080i and 720p is thus a balance between static resolution and motion portrayal: 1080i offers better static resolution whereas 720p offers better motion portrayal." EBU

With film and its inherent 3:2 pull-down for SD and HDTV, and the resultant cross-frame, cross-field interpolation, even feature films are at risk of interline twitter in 1080i. This results in motion artefacts being introduced, particularly where the action is fast and furious.

My real bugbear is that the displays we are being sold seldom live up to expectations.
If you put an HD signal through an HD flat-panel display, it's going to make it look great, regardless of the 720p/1080i/1080p argument. But we never see the pictures as they were originated!
It will be a while before we'll see SED and FED flat displays. These promise a leap in image quality, but FED isn't due until 2009 at the earliest, and SED for consumers later still (if ever).

So what am i saying?

..... Don't buy an HDTV quite yet! :rolleyes:


I'll shut up now! :D

MikieMikie
Nov 28, 2007, 06:54 AM
Actually a frame in 1080i has only half the pixels of a frame in 1080p. On the other hand, 1080i in comparison has double the frame rate. The problem here is interlacing (that is what the "i" stands for); it was developed to gain better video quality on CRT displays. But non-CRT displays cannot properly display interlaced video; instead they use progressive scan to display video. So technically every HDTV can only display 1080p content, and to display 1080i it has to do something like upscaling.

Maybe you want to read one or all of these Wikipedia articles; they're quite interesting.

http://en.wikipedia.org/wiki/1080i
http://en.wikipedia.org/wiki/Interlacing
http://en.wikipedia.org/wiki/Deinterlacing

I have read them. They endorse what I said: there is no difference between 1080i and 1080p, since the frame buffer is filled and blasted as one, no matter how it was filled.

-- Mikie

err404
Nov 28, 2007, 03:15 PM
It sounds to me like a lot of the misinformation is due to confusing 1080i content with 1080i displays. A 1080p display is superior to a 1080i display. True interlaced displays only apply to CRTs and some rear projection screens. Interlaced displays display only 1/2 the image at a time, alternating quickly between odd and even lines. This often makes the image appear unstable.
All flat-panel and true 1080p displays are progressive. A flat panel that is capable of 1080i is merely able to interpret the interlaced signal in order to convert it to the screen's native resolution. Not all TVs are equal at this process. In fact some low-cost or early screens will simply throw out the odd fields, resulting in a 540p image.

1080i content 'can' look the same as 1080p, but only when viewed on a 1080p display that does proper processing to reassemble the frames. Did you know that the first 1080p displays from Sony could only handle 1080i input? That's because the Sony engineers knew exactly this.

The trick is that the 1080i spec is 60 frames per second, where all of the even lines are in one frame and the odd in the next. For 24 frame per second content (like movies) both the odd and even frames are combined before being displayed on the progressive screen at full resolution. The catch is that the 1080i input tops out at 30 progressive frames per second. (This explanation is simplified; the 3:2 pulldown is described more accurately earlier in the thread.)
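
Here is a tiny sketch of the difference between properly weaving the two fields back together and the cheap shortcut of discarding one field (the "540p" result mentioned above). The scanlines are just strings and motion between fields is ignored, so this is only an illustration of the idea:

```python
# Two fields of one 1080-line frame: the even-numbered lines and the odd-numbered lines.
even_field = [f"line {n}" for n in range(0, 1080, 2)]   # 540 lines
odd_field  = [f"line {n}" for n in range(1, 1080, 2)]   # 540 lines

def weave(even, odd):
    """Reassemble a full progressive frame from its two fields."""
    frame = []
    for e, o in zip(even, odd):
        frame.extend([e, o])
    return frame                                    # 1080 lines, full vertical detail

def discard_one_field(even):
    """The cheap approach: keep one field and line-double it (~540p worth of detail)."""
    return [line for line in even for _ in (0, 1)]  # 1080 lines, only 540 distinct

assert len(weave(even_field, odd_field)) == 1080
assert len(discard_one_field(even_field)) == 1080
```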

True 1080p is also up to 60 frames per second, but each frame is a full frame. However, since the content is only 24fps for movies, much of the maximum temporal bandwidth potential is not used.

Long story short: due to current content limitations the 1080p spec is only able to use half of its potential, allowing 1080i (when pushed to its extreme) to equal the current quality.

You might be asking why your Blu-ray disc looks so much better than your 1080i cable content. This is just compression from your provider. The Blu-ray disc is recorded at a MUCH higher bitrate than your TV provider is able to transmit. Your TV provider does this in order to squeeze more channels into the same pipe.
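
To put rough numbers on "MUCH higher" (the bitrates below are ballpark figures for illustration, not measurements of any particular disc or provider):

```python
# Ballpark bits available per pixel for 1080-line, 24fps material (illustrative only).
pixels_per_second = 1920 * 1080 * 24

for label, mbps in [("Blu-ray video (up to ~40 Mbit/s)", 40),
                    ("typical broadcast/cable HD (~12 Mbit/s)", 12)]:
    bits_per_pixel = mbps * 1_000_000 / pixels_per_second
    print(f"{label}: {bits_per_pixel:.2f} bits per pixel before decoding")
```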

As a fun experiment (depending on your setup), try changing your Blu-ray player output to 1080i and see if you can tell the difference. If you have quality components throughout, they will be identical.

Once content begins to exceed 24 frames per second, 1080p will become much more important.

tronic72
Dec 3, 2007, 10:24 PM
Seeing the difference between 1080i and 1080p is like hearing the difference between a 320kbps mp3 and a 400kbps mp3. No one can tell the difference. 1080p is a marketing ploy and people are buying into it. While technically it makes sense, it doesn't matter because the difference can't be seen by the naked eye.

I'm so sick of people getting bogged down in this sort of technical cr@p. For anyone who's whinging about the lack of 1080p content: go down to your local TV store and compare the two formats. You simply cannot tell the difference.

To make matters worse, all Plasma & LCD screens are simply not sharp or accurate enough to illustrate the additional detail on the screen. I recently went to my local HiFi store to see how the latest Plasma & LCD screens compare to my Sony Wega CRT screen. They don't. I find it amazing that so many people are putting up with jumpy, blurred images for the sake of larger screens. I've had so many visitors come to our house and stand amazed at the quality of our screen after spending over $5000 on a new 1080i LCD or Plasma screen.

As Nut says, you are getting sucked in by the manufacturers of this equipment.

wmealer
Dec 4, 2007, 09:33 AM
To make matters worse, all Plasma & LCD screens are simply not sharp or accurate enough to illustrate the additional detail on the screen.

Odds are that your computer display is an LCD... Are you saying that your CRT could display the text you're reading right now more sharply or accurately than an LCD? Your 1080i Wega CRT is roughly the same pixel resolution as most modern PC displays. I think it's clear that LCD is the PC display technology of choice, especially when "sharp or accurate" is the objective.

Wouldn't you agree? I'm sorry, you lost me when you said LCD isn't sharp or accurate enough to display more detail than CRT. That isn't true in a PC environment, so how can it be true in your living room?