
rdsii64

macrumors regular
Original poster
May 14, 2008
I am looking at hi-def camcorders in my price range, and as I was reading the specs I had a scratch-your-head moment. Can my eyes tell the difference between 1920x1080i at 60 frames per second and 1920x1080p at 30 frames per second?
 
I am looking at hi-def camcorders in my price range, and as I was reading the specs I had a scratch-your-head moment. Can my eyes tell the difference between 1920x1080i at 60 frames per second and 1920x1080p at 30 frames per second?
Clarification: 1080i accepts input as 60 540-line half-frames (fields) per second. Whether 1080i or 1080p, the content is still 30 frames/second (29.97 frames/second, to be exact). To get deeper into the weeds, no flat-panel display actually displays interlaced content. 1080p originates as progressive scan and is, of course, painted progressively on the screen. 1080i content originates in interlaced mode. However, the TV set buffers the first 540-line half-frame. After it receives the second 540-line half-frame, it combines the two half-frames in memory and then progressively paints the screen with 30 complete 1080-line frames/second.
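(If it helps to picture that buffering step, here is a rough Python/NumPy sketch of the idea. It is purely a toy illustration, not anything a real TV actually runs, and the function name weave_fields is made up for the example.)

```python
import numpy as np

def weave_fields(top_field, bottom_field):
    """Toy 'weave' de-interlace: interleave two 540-line fields into
    one 1080-line progressive frame, the way the buffered half-frames
    described above get recombined."""
    height = top_field.shape[0] + bottom_field.shape[0]
    frame = np.empty((height, top_field.shape[1]), dtype=top_field.dtype)
    frame[0::2] = top_field      # lines 0, 2, 4, ... from the first field
    frame[1::2] = bottom_field   # lines 1, 3, 5, ... from the second field
    return frame

# Two fake 1920x540 fields -> one 1920x1080 frame
top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
print(weave_fields(top, bottom).shape)   # (1080, 1920)
```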

As a rule of thumb, TV sets that are advertised as 1080p are better engineered than those advertised as 1080i. However, the 1080i/1080p question is a substitute for having to explain the real engineering differences between the two. Pressed for an answer to your question, I would say that you can probably tell the difference. However, the reason that you can tell the difference is due to the better engineering rather than because "p" is better than "i."
 
I am looking at hi-def camcorders in my price range, and as I was reading the specs I had a scratch-your-head moment. Can my eyes tell the difference between 1920x1080i at 60 frames per second and 1920x1080p at 30 frames per second?

I think it's better to look at the physical size of the CCD sensor chip in the camera (bigger is better) and the quality of the lens. Compression artifacts, or the lack of them, matter a lot more than the issue you found. Don't forget to check the sound; the eyes can forgive some defects, but the ears don't.

BTW, I wish they'd stop talking about "interlaced" video now that CRTs are long gone.
 
As others have said, look at the compression formats instead of i vs p, because that will make a much bigger difference.
 
The real difference between 1080p and 1080i is that the interlaced format does not handle motion as well. Yes, flatscreens will buffer the frames and display 1080i60 as 30 progressive frames, but if the footage is recorded in an interlaced format, there can still be tearing with quick motion. Observe:

[Image: interlaced.jpg (interlace combing on a fast-moving subject)]
 
Most HDTVs have hardware/software that de-interlaces 60i and displays it as 60p. This is called "bob" de-interlacing. It simply means you take a 1920x1080 interlaced 60i signal and separate each field into a unique frame. This will maintain the smoothness of 60 Hz temporal resolution but will cut your vertical pixel resolution in half.

1920x1080 60i with bob de-interlace gets turned into 1920x540 60p. This is why 30p and 60i still look different on an HDTV ... or should.
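A rough sketch of that bob idea, assuming grayscale frames and plain line-doubling rather than any fancier interpolation (bob_deinterlace is just a made-up name for the example):

```python
import numpy as np

def bob_deinterlace(interlaced_frame):
    """Toy 'bob' de-interlace: split one interlaced frame into its two
    fields and line-double each, so one 60i frame becomes two 60p frames.
    Temporal smoothness is kept, but each output frame carries only
    540 lines of real picture information."""
    top = interlaced_frame[0::2]     # 540 lines from one instant
    bottom = interlaced_frame[1::2]  # 540 lines from 1/60 s later
    # np.repeat doubles each line to get back up to 1080 rows
    return np.repeat(top, 2, axis=0), np.repeat(bottom, 2, axis=0)

frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
first, second = bob_deinterlace(frame)
print(first.shape, second.shape)   # (1080, 1920) (1080, 1920)
```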

Computer monitors do not de-interlace automatically, so there has to be some mechanism to do this; otherwise you will see the interlace effect.
 
I know there was a time when the following was the case...

the TV set buffers the first 540-line half-frame. After it receives the second 540-line half-frame, it combines the two half-frames in memory and then progressively paints the screen with 30 complete 1080-line frames/second.

... but I would've thought (hoped!) that as of 2010 the following would be in the majority.

Most HDTVs have hardware/software that de-interlaces 60i and displays it as 60p. This is called "bob" de-interlacing. It simply means you take a 1920x1080 interlaced 60i signal and separate each field into a unique frame. This will maintain the smoothness of 60 Hz temporal resolution but will cut your vertical pixel resolution in half.

Anyway, I think most of the answers so far have tackled this from the wrong angle. 1080i59.94 and 1080p29.97 have the same spatial resolution (1920x1080); the difference is the temporal resolution (59.94 fields per second vs 29.97 frames per second). That difference dictates the motion characteristics of the video, and you will notice the difference between the two.

1080i59.94 will look like sports, news or game shows. 1080p29.97 will look more like a movie.

Three things to note:
  • It's more complicated than this;
  • I'm speaking for capture (camcorders), not display (TVs);
  • 1080i59.94 is technically considered 29.97 frames per second due to the way the signal is packaged (not because of how the image ends up looking).
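
To put rough numbers on that last bullet, here is a back-of-the-envelope sketch (my own, in Python; the 1000/1001 factor is just the NTSC-derived timing that turns 60 and 30 into 59.94 and 29.97):

```python
field_rate_i = 60000 / 1001          # 1080i59.94: ~59.94 fields per second
frame_rate_i = field_rate_i / 2      # ...packaged as ~29.97 frames per second
frame_rate_p = 30000 / 1001          # 1080p29.97: ~29.97 full frames per second

# Pixel throughput is the same either way...
pixels_per_sec_i = field_rate_i * 1920 * 540    # each field is 1920x540
pixels_per_sec_p = frame_rate_p * 1920 * 1080   # each frame is 1920x1080
print(round(pixels_per_sec_i), round(pixels_per_sec_p))   # identical

# ...but interlaced samples the motion roughly twice as often
print(round(field_rate_i, 2), "vs", round(frame_rate_p, 2), "motion samples per second")
```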
 
I am looking at hi-def camcorders in my price range, and as I was reading the specs I had a scratch-your-head moment. Can my eyes tell the difference between 1920x1080i at 60 frames per second and 1920x1080p at 30 frames per second?

Interlace was a wonderful invention back in the '30s, when Marconi-EMI started manufacturing television broadcast equipment.

'Interlace' is a really neat solution to a serious problem.
It allows for the illusion of "higher definition" imaging within the limited bandwidth of the TV transmission chain.

When there is no movement in a scene, interlaced imaging will give you the maximum resolution available in that line standard, say 525 lines (theoretical). But as soon as there is movement, the resolution of any object in the scene plummets to half the vertical resolution of the system!

This, of course, is not really a problem, as moving objects go blurry to the human eye anyway.
Where it becomes a problem is when you try to show interlaced images on progressive displays. Any movement immediately shows up some really nasty motion artefacts, as Erendiox shows us.
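
You can see that artefact without even picking up a camera. Here is a toy NumPy simulation (entirely a sketch of my own, with made-up names like capture_field): a bright block moves between the two field captures, and naively weaving the fields back together produces the comb edges shown in the image above.

```python
import numpy as np

WIDTH, HEIGHT = 1920, 1080

def capture_field(object_x, parity):
    """Pretend to photograph a 200-pixel-wide bright block at horizontal
    position object_x, but keep only every other line (one field)."""
    frame = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
    frame[:, object_x:object_x + 200] = 255
    return frame[parity::2]            # 540 lines

# The block moves 40 px in the 1/60 s between the two field captures
field_a = capture_field(400, parity=0)
field_b = capture_field(440, parity=1)

# Weave them into one frame, as a naive progressive display (or a paused frame) would
woven = np.empty((HEIGHT, WIDTH), dtype=np.uint8)
woven[0::2] = field_a
woven[1::2] = field_b

# Around the block's edges, adjacent lines now disagree: that's the combing
print(np.count_nonzero(woven[0::2] != woven[1::2]), "comb-edge pixels")
```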

So to answer rdsii64's question, the best approach would be to go for a camcorder that suits your TV/display!
 
So to answer rdsii64's question, the best approach would be to go for a camcorder that suits your TV/display!
WONDERFUL explanation bimmzy - thank you so much!

Would it be best to go with a camcorder that records both 1080p and 1080i? Then you have the best of both worlds :) (if that's even possible?)

Here is a question: assuming you always want to shoot "best case" video, could you shoot 1080p and then interlace it later if you want to display it on an interlaced screen/monitor? Is that optimal?
 

Interlaced video also gives the impression of a higher frame rate. Progressive scan at 29.x fps always looks choppy to me (like film), whereas interlaced looks perfectly smooth, since splitting up the picture gives the impression of 60 fps.

Of course dealing with interlaced video in editing can be a major pita.
 

Tell me more about how flat panels handle interlaced video -- 1080i (i.e. broadcast) still looks smoother to me, just like NTSC on a CRT. They must be refreshing the panel at 60 fps and putting the odd lines in one scan and the evens in the next.
 
Here is a question: assuming you always want to shoot "best case" video, could you shoot 1080p and then interlace it later if you want to display it on an interlaced screen/monitor? Is that optimal?

There isn't really a compatibility issue between 1080i and 1080p at the display end, and most progressive footage is actually stored as an interlaced signal. So the decision is a case of whether you want the "live" look of 1080i59.94 or the more movie-like look of 1080p29.97 — certain situations will be better suited to one than the other.
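
On the "shoot 1080p, interlace it later" part of the question: the mechanical step is trivial, because both fields are simply cut from the same progressive frame, so there is no motion offset between them and nothing is lost. Here is a toy sketch of that idea (my own illustration of the progressive-segmented-frame style of storage, not any particular camera's behaviour; segment_frame is a made-up name):

```python
import numpy as np

def segment_frame(progressive_frame):
    """Split one progressive frame into two fields taken from the same
    instant. Weaving them back together reproduces the frame exactly."""
    return progressive_frame[0::2], progressive_frame[1::2]

frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
top, bottom = segment_frame(frame)

rebuilt = np.empty_like(frame)
rebuilt[0::2], rebuilt[1::2] = top, bottom
print(np.array_equal(frame, rebuilt))   # True: nothing lost going p -> "i" -> p
```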

Tell me more about how flat panels handle interlaced video

See:

Most HDTVs have hardware/software that de-interlaces 60i and displays it as 60p. This is called "bob" de-interlacing.
 
There isn't really a compatibility issue between 1080i and 1080p at the display end, and most progressive footage is actually stored as an interlaced signal. So the decision is a case of whether you want the "live" look of 1080i59.94 or the more movie-like look of 1080p29.97 — certain situations will be better suited to one than the other.
interesting.

So 1080i59.94 vs 1080p29.97 will take up the same amount of space? The only difference is the way the images are recreated? And interlaced will always look more fluid, because of how it is redrawn, compared to progressive, which is ON off ON off? Maybe :)
 
interesting.

So 1080i59.94 vs 1080p29.97 will take up the same amount of space? ...
Not necessarily. Progressive scan can be compressed more efficiently than interlaced video. Since all digital video is compressed, one would expect progressive scan to allow more video to be stored in the same space.
 