Discussion in 'Digital Video' started by matt1219, Apr 2, 2009.
I know 1080p is supposedly better, but why not get 720p for cheaper? It still looks great.
1080p isn't 'supposedly' better, it is better (unless you're talking about the same data rates for each). Why would you buy a Rolex when a McDonald's Happy Meal watch tells the time just as well?
That doesn't give me any good information.
What kind of information are you looking for? You asked why anyone would want 1080p video when 720p video looks nice. I gave the sole reason: 1080p video looks nicer. Does 720p video look nice? Yes. But 1080p video looks nicer. I don't know how else to put it.
Please don't encourage someone to post that stupid distance-to-screen size chart again.
With 1080p standard on just about every single TV 40" or larger, and creeping into smaller sets at a small price premium, I don't see why anyone would intentionally hunt down and purchase a 720p set.
Oh great...Another 720p and 1080p comparison.
You know why it's better? It's got better quality, higher resolution, etc. Not to mention if you master in 1080p, it DOES look a lot better. Try editing and comparing the video quality between the two. If you fullscreen a 1080p clip and a 720p clip on a computer monitor with a resolution of 1920 x 1200 (or an actual HD monitor at 1920 x 1080), you'll definitely see a difference. 1080p is sharper, while the "upscaled" 720p looks softer.
A Mac fan site is not your only source of information. There are other sites on the World Wide Web that deal with the facts of HDTV.
That said, 720p is shorthand for 720p60, or 720 progressive scan lines refreshed 60 times each second. US broadcast networks ABC and FOX broadcast in 720p. Cable networks ESPN and ESPN2 also broadcast in 720p. Compared to the other US broadcast HDTV standard, 1080i, 720p is better for broadcasting the fast action of sporting events. It sacrifices the total data displayed onscreen, however. The other broadcast standard, 1080i, delivers about 12.5% more pixels to the screen each second. US broadcast networks NBC and CBS chose 1080i. A 1080i display shows a 540-line half-frame of odd-numbered scan lines and a second 540-line half-frame of even-numbered scan lines to produce a 30 fps moving image. Because 1080i is interlaced, in theory it suffers some of the same problems as the old analog standards. In particular, it may suffer dot-crawl and flicker.
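The pixels-per-second comparison between the two broadcast formats is easy to check with quick arithmetic; here's a short sketch (nothing here comes from any library, it's just the raster sizes and frame rates multiplied out):

```python
# Quick arithmetic: pixel throughput of the two US broadcast HD formats.

def pixels_per_second(width, height, frames_per_second):
    """Total pixels delivered to the screen each second."""
    return width * height * frames_per_second

# 720p60: 1280x720 progressive frames, 60 per second.
p720 = pixels_per_second(1280, 720, 60)      # 55,296,000 px/s

# 1080i: 1920x1080 frames, 30 per second (two 540-line fields each).
i1080 = pixels_per_second(1920, 1080, 30)    # 62,208,000 px/s

print(f"720p60 : {p720:,} px/s")
print(f"1080i30: {i1080:,} px/s")
print(f"1080i carries {i1080 / p720 - 1:.1%} more pixels per second")
# 1080i carries 12.5% more pixels per second
```

So per frame 1080 has 2.25x the pixels of 720p, but per second, at half the frame rate, the gap shrinks to 12.5%.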
The term 1080p is really more of an advertising term than a technical term. In the most naïve construction, 1080p means that each frame is composed of 1080 progressively scanned lines. However, important information is usually omitted:
All fixed-pixel flat panel 1080 displays show their images progressively. They buffer the half-frame of odd-numbered scan lines, interlace them with the half-frame of even-numbered scan lines, and then display each 1080-line frame progressively.
No commercial broadcast or cable network distributes 1080p content.
Blu-ray disc is a primary source of 1080p content.
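The field-buffering trick described above (weaving two 540-line half-frames into one progressive 1080-line frame) can be sketched in a few lines. This is only an illustration: plain lists stand in for rows of pixels, and real displays do this in hardware:

```python
# Weave deinterlace: interleave a field of odd-numbered scan lines with a
# field of even-numbered scan lines to rebuild one progressive frame.

def weave(top_field, bottom_field):
    """Interleave two half-frames (lists of scan lines) into a full frame."""
    assert len(top_field) == len(bottom_field)
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)      # lines 0, 2, 4, ... of the frame
        frame.append(bottom_line)   # lines 1, 3, 5, ... of the frame
    return frame

# Toy example: 4-line "fields" instead of 540-line ones.
odd_lines  = ["line0", "line2", "line4", "line6"]
even_lines = ["line1", "line3", "line5", "line7"]
print(weave(odd_lines, even_lines))
# ['line0', 'line1', 'line2', 'line3', 'line4', 'line5', 'line6', 'line7']
```

Simple weave works cleanly when both fields come from the same moment in time; with true interlaced motion the fields are 1/60 s apart, which is where combing artifacts and fancier deinterlacers come in.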
You may have noticed that cheap HDTV sets display 720p. All 1080 LCD models advertise themselves as 1080p. Many of these advertise 120 Hz. There is no 120 Hz content. The faster refresh rate is intended to mitigate the limitations of LCDs. With the faster clock, each frame of 60 fps content is refreshed twice, each frame of 30 fps content is refreshed four times, and each frame of 24 fps content is refreshed five times. If video cameras and monitors were all capable of 120 Hz, then there would be no need for 3:2 pull down or 2:3 pull up. You may have also seen that some manufacturers are going to 240 Hz. Pointless.
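The frame-repetition arithmetic behind a 120 Hz panel is simple division; this little sketch shows why 120 Hz suits every common frame rate while 60 Hz does not:

```python
# Why 120 Hz works for every common frame rate: 120 is evenly divisible
# by 24, 30, and 60, so each source frame is shown a whole number of times.

PANEL_HZ = 120

for fps in (24, 30, 60):
    repeats = PANEL_HZ // fps
    print(f"{fps} fps content: each frame shown {repeats} times")

# Contrast with a 60 Hz panel showing 24 fps film: 60 / 24 = 2.5, so frames
# must alternate between 2 and 3 refreshes (3:2 pulldown), causing judder.
print(f"60 Hz panel with 24 fps film: {60 / 24} refreshes per frame")
```

This is the "each frame of 24 fps content is refreshed five times" point from the post above: 120 / 24 = 5 exactly, so no pulldown cadence is needed.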
In closing, it is difficult to impossible to tell the difference between 1080i and 720p on smaller displays. If you are a spec sheet john, then you will settle for nothing less than 120 Hz 1080p. If you just want to watch the Final Four, NASCAR, or the Masters, then just about any HDTV will blow you away.
But when TAKING videos?
For acquisition, 1080p is pretty much limited to 24fps, which is fine for some things and not for others. Currently 720p has much more flexibility in terms of acquisition because you can shoot in more frame rates and have longer record times.
Interlacing can certainly cause flicker, but dot-crawl is an artifact of analog composite video encoding and so is not an issue for the digital television systems.
USD $600 makes that untrue: http://www.camcorderinfo.com/content/Sanyo-Xacti-VPC-HD2000-Camcorder-Review-36280.htm#
I said "pretty much" for a reason. There are a few cameras that can shoot at, or beyond, 1080p60 but the vast majority of cameras out there currently cannot.
I paid a few hundred extra for 1080p in my 56" screen 2 yrs ago. I pretty much wish I would've saved the money and gone with 720p. It's just not that big of a difference.
That's why. At the moment, HD broadcasts look better on a 720p HDTV. But if your sole reason for using an HDTV is Blu-ray and a PS3/Xbox 360, then get a 1080p HDTV.
If there were affordable 2000p TVs I'd buy one. I don't care much about Hollywood movies or TV but I have a DSLR and I'd like a large screen that could display a full-resolution image.
I don't own an HD video camera but I can still make and show HD content. I'll composite multiple SD sources. The end result looks better on 1080.
While 1080 really is better, your eyes can only see the difference if you have a large enough screen or a close enough viewing distance. Which to get depends on screen size and viewing distance.
1080p is a HUGE improvement over 720p. Not debatable
A 1080p set may be an improvement over a 720p set. However, this is because the 1080p set is better engineered. It is not because the 1080p set has more scan lines.
That is EXTREMELY HIGHLY DEBATABLE. IN FACT, IT'S A HUGE ISSUE. MOST PEOPLE CAN'T EVEN SEE THE DIFFERENCE.
Why not get a nice monitor? They have really high resolutions.
I wouldn't say most people...but there are definitely people who can't tell. I was pretty amazed when I heard someone say "it looks brighter but that's about it"
What? How do you figure?
But most cams that shoot 1080 can shoot 30p.
Yeah, good point. I usually forget about 30p as it's a bit of a lame-duck frame rate. People typically want either 24p for the film look or 60p for super-smooth motion.
Yea, that always puzzles me. To me, 30p is a nice middle ground. You get away from the super-smooth video look, but you have a bit more leeway with panning and such, stutter-wise.
So what is the big deal between 480i/P and 720P?
The resolution difference between 720P and 1080i/P is just as significant as it is between 480P and 720P, actually a little more. With the jump from 480P to 720P the difference is 240 lines; between 720P and 1080i/P you're gaining 360 lines of resolution.
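Counting lines only tells part of the story; total pixel counts give another view of the same jumps. A quick sketch, assuming the usual rasters (480p at 720x480 as on DVD, 720p at 1280x720, 1080 at 1920x1080):

```python
# Compare the resolution jumps in scan lines and in total pixels.
formats = {
    "480p":  (720, 480),     # standard-definition DVD raster
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
}

for name, (w, h) in formats.items():
    print(f"{name:>5}: {h} lines, {w * h:,} pixels")

print("480p -> 720p :", 720 - 480, "more lines,",
      f"{(1280 * 720) / (720 * 480):.2f}x the pixels")
print("720p -> 1080p:", 1080 - 720, "more lines,",
      f"{(1920 * 1080) / (1280 * 720):.2f}x the pixels")
# 480p -> 720p : 240 more lines, 2.67x the pixels
# 720p -> 1080p: 360 more lines, 2.25x the pixels
```

Interesting wrinkle: in lines the second jump is bigger (360 vs 240), but in total pixels the first jump is actually the larger multiplier, since 720p also widens the picture more relative to its starting point.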
Personally I find it fairly easy to tell the difference between 720P and 1080i/p sources at reasonable watching distances on my 46" Sony XBR4.
As others have pointed out, the 720p/60 broadcast format will result in a smoother image, more ideal for sports and fast action, but for everything else 24P or 30P is more than fine, and I honestly turn off "smooth motion" 120Hz processing on every TV that has it.
have a good read...
we have a 52" 1080p LCD and a PS3 (blu-ray player) and i'll say switching to Rock-Band 2 (a PS3 game displayed at 720p) from a blu-ray movie (1080p) there is a HUGE difference.
^^^This is my "consumer" HD use.
I'm a video editor during the day and 720p and HDV material is pretty taboo (even though it happens). Rest assured, if I delivered a show in 720p i'd have some pissed co-workers....on a broadcast monitor, the difference between 720p and 1080p is very noticeable.
But as someone said, there is a viewing distance/resolution chart floating around the interwebs that will help you justify buying 720p products