If you're saying there's no visible difference between 1080p on a Blu-ray disc and 720p streaming from Netflix, AppleTV, etc. when viewed on quality equipment... then you have poor vision. The difference is immediately visible to me. I think a crack-smoking hobo might agree with you, but I have never used mind-altering drugs for entertainment.

What I'm saying is apparently beyond you, because I'm not saying any such thing. You cannot compare 720p to 1080p on equipment that can show the difference, claim there's no "visible" difference, and be sane.

What I am saying is that you don't need BD's light level of compression to achieve the same relative visible quality the vast majority of the time. BD uses that level because that's the space the disc gives them, and they're going to use as little compression as possible to fill it. That's a reasonable thing to do, but it doesn't mean you couldn't get away visually with more compression. I'm saying a 1080p movie file in the 8-12GB range can be virtually indistinguishable from a 40-50GB BD movie with proper compression. That doesn't mean there are no differences (especially during certain motion events that move every pixel on the screen at random, which happen in some movies once in a blue moon), and it doesn't mean you could overlay them and get a pixel-exact result. What it does mean is that the average viewer (yes, with proper equipment, at a viewing distance where you can see the full resolution) isn't going to be able to pick out the BD from the compressed file with the movie in motion MOST of the time (i.e. when shown various repeated segments).
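To put rough numbers on that (these are illustrative figures of my own, assuming a ~2-hour runtime and decimal gigabytes, not measurements of any particular disc or encode), here is the back-of-the-envelope bitrate math:

```python
# Back-of-the-envelope average bitrates for a ~2 hour movie.
# File sizes and runtime below are illustrative assumptions.

def avg_bitrate_mbps(file_size_gb, runtime_minutes):
    """Average bitrate in Mbps for a file of the given size and runtime."""
    bits = file_size_gb * 1000 ** 3 * 8      # decimal GB -> bits
    seconds = runtime_minutes * 60
    return bits / seconds / 1_000_000

runtime = 120  # minutes, assumed
for label, size_gb in [("8 GB re-encode", 8),
                       ("12 GB re-encode", 12),
                       ("40 GB Blu-ray", 40),
                       ("50 GB Blu-ray", 50)]:
    print(f"{label:>16}: ~{avg_bitrate_mbps(size_gb, runtime):.1f} Mbps average")
```

That works out to roughly 9-13 Mbps for the re-encode versus 40-55 Mbps for the disc, which is the "5x the storage" gap being argued about.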

I realize most BD snobs wouldn't believe that for a minute, and most would make strong claims that I must be blind, etc. if I cannot tell, but strange things happen when you do double-blind tests on things like this. I've seen every claim imaginable in the high-end audio arena, and let me tell you that 95% of them are pure unadulterated BS, and this is with people who have ULTRA EXPENSIVE high-end gear. They have every psychological reason to imagine they are hearing better sound (who wants to admit they wasted all that money?), but usually no technical/scientific reason. The double-blind test is THE only way to prove perceptible differences with humans.
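For what it's worth, the double-blind idea is easy to sketch. This is a toy ABX-style trial script of my own (not any standard test rig): "A" and "B" stand for whatever two sources you're comparing, and only the script knows which one "X" is until after you answer.

```python
# Minimal ABX-style double-blind trial sketch (toy example, my own).
# "A" and "B" would be the two sources being compared (e.g. BD vs. re-encode);
# the person answering only learns which one X was after all trials are done.
import random

def run_abx(trials=16):
    correct = 0
    for _ in range(trials):
        x = random.choice(["A", "B"])              # hidden assignment
        guess = input("Is X the same as A or B? ").strip().upper()
        if guess == x:
            correct += 1
    print(f"{correct}/{trials} correct "
          f"(~{trials // 2} expected by pure guessing)")

if __name__ == "__main__":
    run_abx()
```

If the score isn't meaningfully better than coin-flipping across enough trials, the "obvious" difference wasn't being perceived.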

Most people making blanket statements don't take the other factors involved into account either. For example, you may say an Apple TV downloaded 720p movie looks like "crap" to you on your 1080p set. I've seen that claim before on here. The problem is that a 1080p set cannot show a 720p signal without scaling it upward, and there are a lot of crappy scalers out there (just as there were a lot of truly AWFUL scan doublers in the progressive DVD player era). You cannot assume what you're seeing on a non-native display is an accurate representation of the signal without a frame of reference. I have a 720p projector with a 93" screen and a 46" 720p plasma TV. They show true 720p, so I don't get distorted signals from 720p sources. Given that the vast majority of sources are 720p or 1080i (neither of which carries 1080p resolution, and both of which must be either scaled or de-interlaced to present on a 1080p display), I'm not convinced a 1080p set is automatically a "good" thing (unless all you ever watch is Blu-ray).

A good example is the pair of 24" LG monitors I have in my den on two different computers. Both have the same native resolution (1920x1200). One cost $600 at the time and the other cost $300. At 1920x1200 they look identical to my eyes. The difference, however, becomes clear at lower resolutions. The $300 display looks like CRAP at non-native resolutions (blurry/smudgy/unclear), whereas the $600 display looks only slightly off native clarity. The scaler quality is night and day between the two. If you're going to use native resolution all the time, the cheap one is just fine (I use it with my MacBook Pro and it looks great). But I use the $600 monitor with my gaming PC rig, where I can turn down the resolution on newer games my GPU can't keep up with at maximum resolution and it still looks great. The same games on the other monitor would look, well... less than optimal.

The same thing happens with 1080p televisions showing 720p, 480p, 480i and even 1080i sources. Things get a little more complicated with 3:2 pulldown and other issues, but ultimately you cannot "assume" that just because 1080p looks "great" on your set, a 720p source is total "crap" because it happens to look poor on your particular set. Yet I see that assumption and argument all the time. 720p sets are far less common today, so most people have never seen a 720p source at its native resolution and instead go by the image their particular scaler gives them, which may be good, bad or somewhere in the middle.

Then there's the matter of optical resolution perception limits. For example, if you view a 1080p signal on a 42" 1080p monitor from 10 feet away, you are not seeing 1080p worth of resolution. You are seeing something closer to 720p worth, so there is no real benefit to a 1080p monitor if that's where your seating location is. If you sit 20 feet away, you might as well just watch a DVD because you won't see more than 480p worth of resolution. I mention this because I've seen plenty of comments from people about 1080p's superiority over 720p when the people in question weren't even seeing 720p worth of resolution where they actually watch the set (telling a difference at 3 feet for demo purposes is meaningless to your actual day-to-day experience).
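You can ballpark this with the usual ~1 arcminute visual-acuity rule of thumb. The exact distances depend on which acuity figure you assume and this sketch assumes 16:9 screens, so treat the numbers as illustrative, not gospel:

```python
# Rough "resolution-limited" viewing distance, assuming the common rule of
# thumb that the eye resolves roughly 1 arcminute per pixel (an assumption).
import math

ARCMIN = math.radians(1 / 60)   # one arcminute in radians

def max_useful_distance_ft(diagonal_in, lines):
    """Distance (ft) beyond which individual scan lines can't be resolved."""
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # 16:9 screen height
    pixel_in = height_in / lines                      # vertical pixel pitch
    return pixel_in / math.tan(ARCMIN) / 12

for diag in (42, 46, 93):
    d1080 = max_useful_distance_ft(diag, 1080)
    d720 = max_useful_distance_ft(diag, 720)
    print(f'{diag}" screen: 1080p detail gone past ~{d1080:.1f} ft, '
          f'720p detail gone past ~{d720:.1f} ft')
```

For a 42" screen this comes out to roughly 5-6 feet for full 1080p benefit and about 8 feet for 720p, so at a 10-foot couch you're well past the point where the extra pixels do anything.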

Thus, I am in no way saying that 1080p isn't potentially beneficial, assuming you have the right sized screen at a distance where you can actually resolve that resolution, and I'm not saying that under no circumstances can a Blu-ray show a difference compared to using 5x more compression. I'm saying MOST of the time those differences are negligible to the point where it would not be worth using 5x the storage to keep BD levels of compression on a hard-drive storage system. It's certainly not worth it to me to keep using a disc-based playback medium. I'd rather transfer the movie into a format that is useful for broadcasting around the house and doesn't eat an entire 3TB drive for a mere 60 movies, when I could easily fit 300 movies in the same space with almost no drop in quality, and more like 500 at 720p for my current projector system with no noticeable drop in displayable resolution there (and perhaps even a quality bump, since a high quality offline transfer to 720p can beat the real-time scaling a BD player or projector would have to do to show 1080p on a 720p projector).
 
Ok, I'd say we're on the same page now. I don't have 720p, only 1080p. My playback monitor is 2560x1600, which already creates an issue when viewing 1920x1080 material, and it is exacerbated when watching 720p on my computer.

I haven't bothered to shop for 720p TVs, so I don't know how common or uncommon they are today, but I would cite this as further argument that Apple is missing the boat by using 720p instead of the more common higher resolution. Who thinks manufacturers and consumers are ever going to scale back down? Nobody.

In the meantime, we have this beautifully illustrated scenario you have written out for us, whereby people with modern equipment are repulsed at 720p "LowHD" material. I don't suffer from this, because as you pointed out, I pretty much watch only Blu-ray. I don't even watch HD programming on TV, because most TV programming bores me.
 
Missing the boat? Not really ....

This is going to be one of those situations where the marketplace decides the direction the new TVs will go. Most consumers are clueless enough about things that they always believe "bigger numbers are better" without doing much research beyond that. So that, alone, will ensure 720p TVs die off in favor of 1080p models.

Right now, you can still buy a number of 720p TVs but all of them I've seen are the lowest-priced models on the showroom floors, and are likely to become "closeouts" before long.

But Apple didn't "miss the boat" with their decision to do 720p on the AppleTV. It was clearly an issue of trying to find the best balance of file sizes vs. viewability. With a $99 price point, I wouldn't be at all surprised if the majority of people using the new AppleTVs are attaching them to smaller size LCD panels in bedrooms, kitchens, etc. People who can afford big 60" plasma TVs can probably also afford a higher-end media center solution to go with them. It's the folks with the relatively small (27" and the like) TVs who would be most likely to grab a $99 AppleTV and throw it on there just so they can watch some Netflix streams or rent a couple movies from Apple, here or there. 720p is going to be just fine for those smaller display screens.

Not only that, but consider how many people plan on using one of these boxes over Wi-Fi. A lot of them probably don't even have wireless-n compatible routers; they may still be using an older Linksys or Netgear running wireless-g only. 1080p content would really struggle to stream well over that speed of connection.
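Rough numbers (the throughput and stream bitrates below are my assumed figures for illustration, not measurements): real-world 802.11g usually delivers somewhere around 20 Mbps, which doesn't leave much headroom once a 1080p stream's bitrate climbs.

```python
# Rough check of how much headroom an 802.11g link leaves for HD streaming.
# Throughput and bitrate figures are assumptions for illustration only.
wifi_g_real_world_mbps = 20      # typical usable 802.11g throughput (assumed)

streams = {
    "720p stream (~4-5 Mbps)": 5,
    "1080p stream (~8-10 Mbps)": 10,
    "Blu-ray quality 1080p (~30+ Mbps)": 30,
}

for name, mbps in streams.items():
    verdict = "fits" if mbps < wifi_g_real_world_mbps else "does NOT fit"
    print(f"{name}: needs ~{mbps} Mbps -> {verdict} "
          f"within ~{wifi_g_real_world_mbps} Mbps")
```

A compressed 720p stream sails through; anything approaching disc-quality 1080p simply doesn't fit.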


 
USB2 was half as fast on a Mac in TIGER. That is no longer the case in Leopard and Snow Leopard.

I copied a couple hundred MB via my USB2 flash drive the other day, from Mac OS X 10.6.6 to Win7 Starter. It took 3-4 times as long on the Mac. God, maybe 10x, I wasn't timing it. After a minute and a half or whatever copying onto the drive from the Mac, I couldn't believe it copied onto the netbook in under 15 seconds. Not pleased with the disparity. And this Mac, despite being older, is clearly faster than the little netbook at almost everything (not Flash, obviously).

Of course, if Win7 Starter could actually network with other computers, I wouldn't have needed the flash drive, anyway.
 

I'm not sure what the problem would be. All my drive tests on my MBP versus XP PC using USB2.0 drives show similar results in Leopard (I don't think I've done the test since Snow Leopard appeared). Tiger was around 1/2 the speed, however.
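If anyone wants to do better than eyeballing it, timing the copy only takes a couple of lines. A generic sketch, with placeholder paths you'd swap for your own drive:

```python
# Quick-and-dirty throughput timer for a single large file copy.
# SRC/DST are placeholders; point them at your own internal and USB drives.
# Note: OS write caching can flatter the numbers somewhat.
import os, shutil, time

SRC = "/path/to/testfile.bin"            # file on the internal drive (placeholder)
DST = "/Volumes/USBSTICK/testfile.bin"   # destination on the USB drive (placeholder)

size_mb = os.path.getsize(SRC) / (1024 * 1024)
start = time.time()
shutil.copyfile(SRC, DST)
elapsed = time.time() - start
print(f"Copied {size_mb:.0f} MB in {elapsed:.1f} s "
      f"-> {size_mb / elapsed:.1f} MB/s")
```

Running the same copy on both machines would at least replace "maybe 10x" with an actual MB/s figure.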
 
I haven't bothered to shop for 720p TVs, so I don't know how common or uncommon they are today, but I would cite this as further argument that Apple is missing the boat by using 720p instead of the more common higher resolution.

And which format are they going to use for these higher-resolution files? They will, of course, also have to carry at least 5.1 sound, or what's the point?

Keep in mind that ISPs are not going to look too fondly on folks downloading a 25GB-or-larger file to get those movies.
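For scale (the connection speeds below are assumptions, not anyone's actual service), a 25GB download is not a quick rental:

```python
# How long a 25 GB movie download takes at a few assumed connection speeds.
file_gb = 25
for mbps in (5, 10, 20, 50):
    hours = file_gb * 8 * 1000 / mbps / 3600   # decimal GB -> megabits -> hours
    print(f"{mbps:>3} Mbps: ~{hours:.1f} hours for a {file_gb} GB file")
```

At a typical 10 Mbps connection that's over five hours before the movie even starts, never mind what it does to a monthly cap.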
 
Ok, I'd say we're on the same page now. I don't have 720p, only 1080p.
Almost all sporting events are broadcast at 720p. Your 1080p HDTV upconverts the 720p signal on-the-fly. So well, in fact, that you probably didn't even notice.

And if that's happening via a digital cable box instead of in your HDTV, your cable provider is probably upconverting that 720p signal to 1080i not 1080p.

A lot of people tend to think they get more from their 1080p HDTV than they actually do. Probably because that's what the salesguy at the big box store told them when he was selling them a $129 HDMI cable from Monster. ;)
 

I think some people believe that 1080i and 1080p are essentially the same thing, and they are not. They are not even close, really (save for 3:2 pulldown of film material only, same as 480i to 480p). If you watch any NTSC television material (a 480i signal, possibly with far less usable resolution) on a modern progressive display (i.e. nearly all newer TV sets), either the receiving/displaying device or the TV itself has to convert it to 480p to show it, and that means de-interlacing. Ten years ago most videophile types understood that this conversion is FAR from simple.

The quality you get (especially with things like smooth camera pans) depends on the method and quality of the de-interlacing electronics or software used to convert it. Ever notice a little "jump" every so often during smooth panning in old TV shows on DVD or in downloadable files off the internet? That is an artifact of improper de-interlacing. Handbrake still (AFAIK) doesn't have a proper conversion method for 60fps material. For example, I have the movie "Super Fuzz" on DVD and the source is clearly a television conversion, since it only pans smoothly on my old JVC progressive DVD player if its "Video2" de-interlace mode is used. No matter what setting I use with Handbrake, or if I let my projector handle the 480i display, I get a slight little "jump" in scenes with smooth panning (like under the bridge right before Dave Speed moves the sewer cover into place). The same thing happened when I converted my Red Dwarf DVDs to play on AppleTV: when the spaceship pans in the intro, there are little blurry jerks to it.

This is just one example of a motion artifact caused by converting 480i to 480p. The problem exists because the information for all 480 lines does not exist at the same moment in time in a 60-field-per-second signal; only 240 lines (every other line) are captured in any given field. While your brain tends to fill in the time gap, it doesn't change the fact that motion produces a distortion around the 1/30th-of-a-second update of every other line. The faster and more extreme the motion, the more obvious this distortion becomes to the eye. Higher quality de-interlacers do a better job with non-film material, but if the source has already been converted to 480p (i.e. a video computer file), you're stuck with the end result.
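Here's a toy illustration of why that happens (a simulation of my own, not real video): if something moves between the moment the odd field and the even field are captured, simply weaving the two fields back into one frame leaves a comb offset on every other line.

```python
# Toy demo of interlace "combing": two fields captured 1/60 s apart are woven
# into one frame.  A bright bar moves between field captures, so the woven
# frame shows the bar offset on alternating scan lines.
import numpy as np

H, W = 8, 16
frame_t0 = np.zeros((H, W), dtype=int)
frame_t1 = np.zeros((H, W), dtype=int)
frame_t0[:, 4:6] = 1       # bar position when the first field is captured
frame_t1[:, 8:10] = 1      # bar has moved by the time the second field is captured

woven = np.zeros((H, W), dtype=int)
woven[0::2] = frame_t0[0::2]   # odd scan lines come from the first field
woven[1::2] = frame_t1[1::2]   # even scan lines come from the second field

for row in woven:
    print("".join("#" if v else "." for v in row))
```

The printed frame shows the bar zig-zagging between two positions line by line, which is exactly the tearing/jerking a poor de-interlacer leaves behind on pans.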

Now imagine converting 720p to 1080i (e.g. in a cable box) and then back again to 1080p (in the TV/projector). While the finer resolution helps hide the distortion more than at 480i, you still get distortion. You're better off sending 720p to the final display device and letting it handle the conversion (720p to 1080p is a simple scaling operation by comparison, and no interlacing artifacts need exist). Of course, newer equipment should offer a 1080p output mode, but I know my HD cable box does not. The problem then is that 1080i sources would get down-converted to 720p and then re-scaled back to 1080p on the 1080p display, and you'd lose resolution. Ideally, the device should offer both 720p-to-1080p and 1080i-to-1080p conversion (though quality then becomes an issue, since cheaper consumer electronics have traditionally used poor de-interlacers), OR it should output the signal untouched and let your external equipment or TV handle the conversion directly in order to get the best possible signal.

It should be mentioned that this problem could have been avoided if (as I believe Bill Gates suggested) the 1080i mode had been left out of the HD standard entirely. That would have left all broadcast sources at 720p (which, given the distortion inherent in 1080i, could arguably be the better choice for most material), with 1080p reserved for newer devices (e.g. on-demand services and future Internet TV once bandwidth is no longer an issue) and Blu-ray.

Basically, interlaced signals suck. They are inaccurate, time-distorted signals that use a trick of the human brain to perceive more resolution at a given moment than actually exists, and they go wonky the moment something moves on the screen. And if you've ever seen an interlaced computer signal on a CRT monitor (like certain Amiga models output), it produced a flickery display to boot (usually not noticeable with TV-type material). Beyond the distance limit for a given screen size and seating position, 720p will appear identical to 1080p anyway, and that deserves mentioning, since probably over 85% of people are not seeing "1080" worth of resolution at all because they sit too far from medium and small TVs. For example, a 48" 1080p TV at a mere 12-foot seating distance only delivers roughly 720p worth of resolution to the human eye, so a 1080p set at that distance is nothing but a marketing gimmick to get more money out of you.

Modern TVs don't even display interlaced modes accurately (they don't light every other scan line the way a CRT did), so you get even more distortion than the signal contains to begin with. Sadly, the standards committees had almost no foresight in that regard: 15 years ago CRTs were the norm, and 1080i seemed like a good idea for a 1080i CRT, since 1080p CRTs were cost-prohibitive and the data rates too high anyway, while 720p and 1080i were comparable in actual stored data. They should have listened to Bill Gates... (and I never thought I'd find myself saying that). If they had, we'd have one less source of distortion in video out there. On the other hand, all lossy compression is a form of distortion that relies on human perception as well, so.... ;)
 
I think some people believe that 1080i and 1080p are essentially the same thing, and they are not. They are not even close, really (save for 3:2 pulldown of film material only, same as 480i to 480p).
Well, that's not quite true, for two reasons. One is that most 1080p material IS film material, found on Blu-ray, so in this case the exception is the majority. The other you mentioned, but let's spell it out: 1080p is virtually all stored as 24-frame, while 1080i carries 60 fields per second, which is enough to carry every speck of detail from the 1080p source. Your comparison to 480 doesn't work as well because both of those formats were 60Hz, so 480i has essentially half the data that 480p could provide. The material stored on DVD seemed to use randomly selected formats, even within the same DVD.
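The 24-frames-in-60-fields point is just the 3:2 pulldown cadence. A simplified sketch of the mapping (the field labels and ordering here are my own illustration, not any spec) shows why nothing from the film frame is thrown away and an inverse-telecine step can put the progressive frames back together:

```python
# 3:2 pulldown cadence sketch: 24 film frames/s carried in 60 fields/s.
# Each film frame is split into a top (T) and bottom (B) field; frames
# alternately contribute 2 fields and 3 fields (one field repeated), so the
# original progressive frames can be rebuilt by inverse telecine.
def pulldown_32(num_frames=4):
    fields = []
    for i in range(num_frames):
        t, b = f"{i}T", f"{i}B"
        # every other frame repeats a field to pad 24 fps up to 60 fields/s
        fields += [t, b, t] if i % 2 else [t, b]
    return fields

cadence = pulldown_32()
print(cadence)   # ['0T', '0B', '1T', '1B', '1T', '2T', '2B', '3T', '3B', '3T']
print(len(cadence), "fields carry 4 film frames (x6 -> 24 frames in 60 fields)")
```

Contrast that with true 60-field video material, where every field is a different moment in time and there is nothing to reassemble losslessly.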

They've also finally cleaned up the interlace/de-interlace tech in modern players. The endless discussions about which DVD player does a better job de-interlacing are non-existent for 1080 and BD.

Thanks for your other comment, I wonder if I've got a peripheral that's acting up on my one Mac.
 

Nowhere did I talk about viewing 1080p on a non-1080p display, which is what you seem to be talking about. I'm talking about using a 1080p native display to view non-1080p material: 480i NTSC, 1080i, and 720p that has been converted to "1080i" by a cable box and then has to be de-interlaced by the TV (creating interlaced distortion when it should simply have been scaled to 1080p by the display). You also get problems converting 1080i to 720p (de-interlacing plus scaling).

In short, I'm saying the vast majority of media out there is not 1080p. Movies can be, but even then they're transmitted as something other than 1080p, such as 1080i (1080i film broadcast at 60 fields can be converted back to 24fps 1080p by reversing the 3:2 pulldown, as I mentioned). Therefore a 1080p display is going to pick up distortions when viewing "TV" material broadcast in either 480i or 1080i; it has to be de-interlaced. Even 720p can end up being converted by many cable boxes to 1080i first (creating time distortion) and then converted back to progressive (de-interlaced) by the 1080p display.

In other words, 1080p displays aren't perfect by any means, because unless the only thing you watch is movies, you are going to get distortions either from scaling or from de-interlacing. Ironically, some of the early HD CRTs could display 480i, 480p, 720p and 1080i in their native formats. That means they were more accurate and less distorted than any set on the market today for the vast majority of formats out there. I don't know if any of them did 1080p as well, but if they did, that's your perfect (in terms of being distortion-free) playback device for everything. Flat panels are nice in terms of shape, weight and price these days, but they have to convert everything to their native resolution, and your scaling and de-interlacing hardware will determine how good a picture you get when showing anything OTHER than 1080p (and that includes 1080i sources).

This is also true of 720p sets, of course (they show 720p best and everything else has to be scaled and/or de-interlaced). My point was that 1080i should never have been allowed to exist in the first place, and that a lot of the people who think "720p" sources (like AppleTV) look "terrible" have that impression because their scaler does a crappy job scaling it up to 1080p, not because the source is putting out a crappy signal. Most people don't realize that. It's easy to see on any computer monitor if you try to run a game at a resolution lower than the native display (e.g. try 1024x768 on a 1920x1200 native monitor). If the scaler is good, it will look good, though not technically perfect. If the scaler is bad, it will look like blurred crap. I have two LG 24" monitors of this type; one was $600 and the other $299, and there is a HUGE difference between the two when viewing resolutions lower than native, yet they look virtually identical with a native-resolution signal (i.e. 1920x1200). The same is true of all 1080p displays. The ones with better scalers and de-interlacing circuitry will look good at 720p and with 480i/480p; the ones with lower quality circuits will look very poor (mucky/blurry) at those resolutions.

Thus, I say a person who doesn't watch many movies (mostly TV) might not be as well off with a 1080p display as they think, since NOTHING (that I know of) on television is broadcast in 1080p. It's all 720p or 1080i, and neither of those displays natively on a 1080p set; they need to be de-interlaced, scaled or both. Only 1080p signals (i.e. Blu-ray and some on-demand stuff or computer files encoded at 1080p) will show without processing. A 720p set will at least show some broadcast TV at native resolution, and an older CRT HDTV will show the majority of TV programming without processing (i.e. 1080i); some can do both 720p and 1080i (a family member of mine has a 57" Panasonic CRT that still works fine and does 720p and 1080i natively).

I personally have a 720p projector with a 93" screen (1080p projectors were around $5000 when I bought it four years ago), and my new Panasonic 46" plasma is 720p as well (at the 12-foot seating distance in that room you can only resolve about 720p worth of resolution anyway, so it wasn't worth paying more for a 1080p display there regardless). Thus, AppleTV works fine for me at present since it's all native, and so is some TV. In the future, if I replace my existing projector with a 1080p model (I understand 3D BD is 720p, though?), I would then need 1080p sources to get the optimum experience, but I'll deal with that when the time comes. 3D projectors are mucking things up since they're still in their infancy, and I couldn't care less about 3D on a smallish 46" screen.
 
Almost all sporting events are broadcast at 720p. Your 1080p HDTV upconverts the 720p signal on-the-fly. So well, in fact, that you probably didn't even notice.
I don't like to watch sports on TV at all. I'll see it live, or play, but TV sports bore the urine out of me.
 