My girlfriend loves 120 Hz for movies on our tv, but personally, I hate it! To me it makes everything look low budget. But everyone is different. :)

I agree with you there, it takes a lot of getting used to. I recently changed our main TV to one that does 1080p at 120Hz (from one that only did 1080i at a lower refresh rate). Things look too realistic. I like the 3D when it comes to animated movies, though.
 
Faster fps is of course a preference. My girlfriend loves 120 Hz for movies on our tv, but personally, I hate it! To me it makes everything look low budget. But everyone is different. :)
I hate the way 120 and 240 look for movies. It's just horrible. Things that I want to look 'real' though, like live sports or nature doc, I think it helps in giving that 'looking through a window' effect.


Lethal
 
I wonder how many extra render boxes they had to add to make up for the difference (going to 48fps)... in order to keep the schedule on time, I mean. Those extra frames equal a whole lot more time rendering GI, AO, SSS, etc.
 
I hate the way 120 and 240 look for movies. It's just horrible. Things that I want to look 'real' though, like live sports or nature doc, I think it helps in giving that 'looking through a window' effect.


Lethal

That's a good point. There are things I like the high frame rate for, including nature shows, gaming, etc. :)
 
I hate the way 120 and 240 look for movies. It's just horrible. Things that I want to look 'real' though, like live sports or nature doc, I think it helps in giving that 'looking through a window' effect.


Lethal

I really don't like the effect. I honestly feel like the movie is being fast-forwarded. I was blaming it on the fact that I'm more into movies than most people, but I went over to someone else's place where 120 Hz was on, and a girl there said the movie looked funny too.

I actually was really curious how the Hobbit is going to look when Peter was talking about the RED cameras in his blog. I just hope it will still look awesome and won't distract me with the feeling that something is wrong the entire movie. That would be a shame.
 
I had read in a digital compositing book that 24 fps is actually below the threshold rate where implied continuous motion can appear to flicker, and that theater projectors are actually designed to show each frame twice, at 48 fps, to compensate for this. Can anyone confirm or rebut this, and if it's not the case, what was the author talking about? This is a pretty popular text on the subject.
 
I had read in a digital compositing book that 24 fps is actually below the threshold rate where implied continuous motion can appear to flicker, and that theater projectors are actually designed to show each frame twice, at 48 fps, to compensate for this. Can anyone confirm or rebut this, and if it's not the case, what was the author talking about? This is a pretty popular text on the subject.

I don't know how modern the text is but I know that 48fps is being adopted for 3D to allow for true 24fps in each eye.

On paper I don't think there is any difference in playing a 24fps reel at 24fps or doubling each frame and playing it back at 48fps? If someone could tell me otherwise it would be much appreciated.

Recently the Hobbit had a ten-minute screening at 48fps and a lot of people complained that it looked too TV-like and not enough like a movie. Someone posted a comparison on vimeo and I have to say it feels too staccato.
 
Movies have a certain look to them which has been fine-tuned over many years. It is fascinating to watch a making-of video for a movie where they switch back and forth between the video footage and the movie itself. In the movie the drama is palpable. In the video you just see some guy standing there.

You can google this topic and get hundreds of hits. There are all sorts of software tools out there to help you get that Hollywood movie look in your video.

My understanding is that the combination of low frame rates, lenses, and high saturation and vibrancy gives the footage a look somewhere between real life and animation.
 
I don't know how modern the text is but I know that 48fps is being adopted for 3D to allow for true 24fps in each eye.
It's for smoothing out the motion so that 3D might be less headache/nausea inducing. Both eyes will see 48fps just like in current 3D theaters both eyes see 24fps.

On paper I don't think there is any difference in playing a 24fps reel at 24fps or doubling each frame and playing it back at 48fps? If someone could tell me otherwise it would be much appreciated.
Using a double-bladed shutter for projection (so each frame gets displayed twice) of 24fps material will help reduce the flicker and motion judder inherent in filming at that framerate.


Lethal
 
It's for smoothing out the motion so that 3D might be less headache/nausea inducing. Both eyes will see 48fps just like in current 3D theaters both eyes see 24fps.


Using a double-bladed shutter for projection (so each frame gets displayed twice) of 24fps material will help reduce the flicker and motion judder inherent in filming at that framerate.


Lethal

Cheers for this! I don't know why I wrote that earlier about true 24fps as I did an entire physics project on 3D projection and was obviously having an "off" moment!
 
Unless you are watching 24p on an LCD at a 120Hz refresh, or a plasma at 96Hz (or 72Hz if it's a Pioneer), then you're not really watching native 24p.

24fps does not display on a 60Hz TV unless you add an extra video field to every alternate film frame, i.e. 3:2 pull-down.
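For anyone who hasn't seen the cadence written out, here's a rough Python sketch of the idea (an illustration only, not how any real telecine box is implemented): 4 film frames get spread across 10 interlaced fields, which is how 24fps gets coerced into roughly 60 fields per second.

# Rough sketch of 3:2 pull-down: 4 film frames (1/6 second at 24fps)
# become 10 interlaced fields (1/6 second at 60 fields/sec).
# Frame letters are placeholders, not real footage.

def three_two_pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 3 if i % 2 == 0 else 2   # alternate 3 fields, 2 fields -- hence "3:2"
        for _ in range(repeat):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

film = ["A", "B", "C", "D"]
fields = three_two_pulldown(film)
print(len(fields))   # 10
print(fields)
# Some of the resulting video frames mix fields from two different film frames,
# which is exactly where the judder on pans comes from.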

While this definitely gives 24fps a "look" on 60Hz TV systems, I would not say it is a good one.

In European countries and most Asia-Pacific countries (50Hz countries), 24p is simply sped up by about 4% (to 104% of its original speed) to run at 25p, with no need to add the artifacts required by 60Hz countries.... After 50+ years, 60Hz countries can finally see native frame rates again thanks to updated display technology running at 72, 96 or 120Hz (all multiples of 24).

So, unless you plan to take your production to the cinema or release in Europe/Asia-Pacific, do the world a favor and DON'T shoot in 24p! Shoot in 30p if you must have the "film" look without the artifacts.....

I get so annoyed with the way 24p is thrown around by people who have NO knowledge of why it actually exists!....

Why a CONSUMER would EVER want to go anywhere near 24P is beyond me.... :)

Perhaps in commercial displays and some computer monitors, but I have yet to see "true 24p" on any consumer display. The 120Hz used in consumer displays only doubles the display's native refresh rate (60Hz converted to 120Hz). 24p signals are converted to 30fps using 3:2 pull-down to be displayed at 60Hz. That's why LCD and LED-LCD panels have ALWAYS looked overly sharpened in their motion when watching 120Hz; 24fps has a blurry/jittery picture naturally, and anything that "smooths" the motion is manipulating the image. Even the Pioneer Elite is manipulating the image at 48fps or higher, and any "true cinema" mode on any set is simply 3:2 pull-down, or a better version of it.

Consumer NTSC sets will always natively display 60fps until somebody changes the broadcast standard (29.97p or 59.94i). Until then, I'm afraid the only TRUE 24p will be in theaters or in a post house (which is where I used to work). Anybody on a retail sales floor (where I also worked doing custom install for years) who tells you anything different is simply selling you on a feature. Doesn't mean it's true.
 
...there is a theory that 48 Hz causes brainwaves to more or less synchronize with it. And because in a cinema there are no other distracting lights, the whole audience gets slightly hypnotized. So that is why, after 15 minutes, everybody stops talking and silently watches the movie...
This is the reason that Roger Ebert so vigorously lobbies for 24fps. He says that 24 is more closely related to the brain wave frequencies that reflect a state of mind of fantasy and passive acceptance, meaning that you can more easily become emotionally immersed in the drama.

Actually, those brain waves are far away from 24 Hz and nearly as far away from 24 as they are from 30 Hz (typical video frame rate) which is the frame rate he seems to abhor so resolutely, so I think the argument is pretty thin.

To answer the OP's Q I think you have to define "cinematic". I think the larger definition has much more to do with the depth of field in a Panavision camera as opposed to a garden-variety video camera, and also more to do with the gamma response curve. Both seem more important in defining "cinematic" by my personal yardstick than does having a frame rate so low that the motion artifacts it induces become an expected part of the experience.

And BTW, the critical flicker frequency relevant to human perception can't be boiled down to a single number; image brightness, screen size/distance and ambient room light, among a host of other factors, modulate the answer significantly. Movies are delivered by illuminating each frame twice, so the image rate may be 24 fps but the flicker rate is 48 fps, and that is not by any stretch a "magic number"; it is designed to be sufficient for "most viewers" under "most circumstances" only, meaning perception of fluidity is a dynamic, subjective process that can't really be quantified all that simply.

But if you define "cinematic" by how much frame-rate induced motion artifacts there are, such as flicker and judder, that can be a valid definition as well.

Some posters loathe frame rates faster than 24. They have a right to do that, even if you discount the "snob" factor, which is also alive and well, but I suspect a lot of this is because of a life-long habituation to the flicker of film and the judder of 3:2 pulldown (coupled with a stubborn refusal to adapt to change). IOW, all film and all TV that any of us were exposed to until just recently was full of flicker and judder artifacts, and it seems foreign when those artifacts are taken away. It takes some getting used to. My advice? Get used to it. 4K and 8K video is coming, and it will have at minimum a 48 fps frame rate.

60p, or 120 Hz or 240 Hz frame interpolation, removes those artifacts. Many see the result as plastic and weird, and invoke the "soap opera effect" argument. For myself, it took about a grand total of 4 seconds to abandon any love of flicker or judder artifacts when I first bought a 120Hz TV. I see it as a significant, major improvement. I will not buy a new set without 240 Hz frame interpolation, possibly 480 if it's available. I am more interested in video looking closer to how things move in real life than in having it seem closer to what film and its limitations might provide.

But then I sit very close to the screen, at about a 53-degree viewing angle. Sitting ultra-close means that the reduction of motion artifacts matters more to me than it might to some. The recommended distance for a 60" screen is 7.8 feet. The ATSC suggests a 30-degree angle (which a 60" screen at 7.8 feet roughly is); THX suggests a 36-degree angle. Most people sit so far away that their eyes can't even resolve the 1/60th of a degree of arc that HD detail subtends at that distance, meaning they don't enjoy the full benefit of HD resolution because human vision simply can't take advantage of it from that far back; they might as well be watching in SD. If you have a 42" screen and sit 12 feet away, that's you. For folks sitting at reasonable or far distances, maybe the motion artifacts of film are more comforting than they are annoying, as they are to me.
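Back-of-the-envelope, if anyone wants to check those numbers; this little Python snippet assumes a 16:9 screen and roughly 1 arc-minute (1/60 degree) of visual acuity, which are ballpark assumptions rather than gospel:

import math

# Viewing angle for a given diagonal (inches) and distance (feet),
# assuming a 16:9 screen.
def viewing_angle_deg(diagonal_in, distance_ft):
    width_in = diagonal_in * 16 / math.sqrt(16**2 + 9**2)
    return 2 * math.degrees(math.atan((width_in / 2) / (distance_ft * 12)))

# Can a viewer with ~1 arc-minute acuity resolve individual 1080p pixels?
def can_resolve_1080p(diagonal_in, distance_ft):
    width_in = diagonal_in * 16 / math.sqrt(16**2 + 9**2)
    pixel_angle = math.degrees(math.atan((width_in / 1920) / (distance_ft * 12)))
    return pixel_angle >= 1 / 60

print(round(viewing_angle_deg(60, 7.8), 1))   # ~31 degrees, close to the ATSC figure
print(can_resolve_1080p(42, 12))              # False -- too far back to see full 1080p detail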

I also predict that 720p is becoming a dead format. I was a huge supporter of 720p, and one of the reasons was the frame rate and the reduction in motion artifacts. But I see a day in the not-too-distant future where all content is sent as 1080p24, because that format is not bandwidth-intensive compared to the impractical and never-yet-used 1080p60 (it is similar in file size to 720p and 1080i30), and virtually all TVs will have frame-rate interpolation available to convert it to true 1080p60 or even up to 1080p480, which would be a much easier implementation for broadcast or internet delivery than 4K.
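For what it's worth, here's the raw (uncompressed) pixel throughput behind that comparison, in a quick Python snippet. Compressed broadcast bitrates depend heavily on the codec and the content, so treat these strictly as relative ballpark figures, not delivery bandwidths:

# Raw pixel throughput for the formats being discussed.
formats = {
    "720p60":  (1280, 720,  60),
    "1080i30": (1920, 540,  60),   # interlaced: 540 lines per field, 60 fields/sec
    "1080p24": (1920, 1080, 24),
    "1080p60": (1920, 1080, 60),
}

for name, (w, h, rate) in formats.items():
    print(f"{name}: {w * h * rate / 1e6:.0f} Mpixels/sec")

# 1080p24 (~50) lands in the same ballpark as 720p60 (~55) and 1080i30 (~62),
# while 1080p60 (~124) is roughly double the others.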

Those sets that it has not trickled down to will still be OK at 1080p24, but they will have the same motion artifacts as film. Many would probably prefer that anyway, since they seem to want to cling to those motion artifacts rather than having accurate reproduction.
 
...The 120hz that is used in consumer displays only doubles the displays native refresh rate (60hz converted to 120hz). 24p signals are converted to 30fps using 3:2 pull-down to be displayed using 60hz...
Not exactly.

60 Hz does indeed need pulldown and its associated judder to display content that is originally 24 fps, but displaying at 120 Hz makes the math work out (24 divides evenly into 120 but unevenly into 60) so that each image can be displayed the same number of times rather than alternating 3 times with 2 times, which is where the judder artifact comes from. This is the whole idea behind 120 Hz, and why it is totally free of pulldown judder while 60 Hz isn't.

What's more, most if not all modern flat panels that can do 120 Hz or higher also do frame interpolation, meaning they create unique images that are composites of the original images, which are 1/24th of a second apart. 120 Hz with frame interpolation (which works remarkably transparently) means four new unique images are created for every source image, so every one of the 120 images displayed each second is unique rather than simply repeating the same frame 5 times.
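If it helps to see the bookkeeping, here's a toy Python/NumPy sketch. Real sets use motion-compensated interpolation rather than the naive cross-fade below, so this only illustrates how 4 in-between frames per source frame gets you from 24fps to 120 distinct frames per second:

import numpy as np

def interpolate_to_120hz(frames_24fps):
    out = []
    for a, b in zip(frames_24fps, frames_24fps[1:]):
        out.append(a)
        for k in range(1, 5):                 # 4 in-between frames per source pair
            t = k / 5
            out.append((1 - t) * a + t * b)   # naive linear blend, for illustration only
    out.append(frames_24fps[-1])
    return out

# Two dummy 4x4 grayscale "frames" stand in for real video.
src = [np.zeros((4, 4)), np.ones((4, 4))]
out = interpolate_to_120hz(src)
print(len(out))        # 6 distinct frames covering one 1/24th-second interval
print(out[2].mean())   # 0.4 -- two fifths of the way from the first frame to the second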

That's not sales floor hype, that's real, and the result is a significant advance over pulldown and lower frame rates in producing fluid motion that is much closer to what one might see in real life. You don't have to like it or even appreciate the reduction of artifacts, but you do have to live with it because it is a fact of life.
 
This is the reason that Roger Ebert so vigorously lobbies for 24fps. He says that 24 is more closely related to the brain wave frequencies that reflect a state of mind of fantasy and passive acceptance, meaning that you can more easily become emotionally immersed in the drama.
I think you are thinking of someone else 'cause Ebert has been lobbying for 48fps as the new film standard for over a decade.

But I see a day in the not-too-distant future where all content is sent as 1080p24,
All content? I can't see networks shooting and streaming/broadcasting 24p for sporting events.

because that format is not bandwidth-intensive compared to the impractical and never-yet-used 1080p60 (it is similar in file size to 720p and 1080i30), and virtually all TVs will have frame-rate interpolation available to convert it to true 1080p60 or even up to 1080p480, which would be a much easier implementation for broadcast or internet delivery than 4K.
1080p60 is roughly 2x the size of 720p60 and 1080i30.


Lethal
 
While I'm on a roll here, one of the reasons why folks might consider higher frame rates and reduction of artifacts weird and unacceptable is related to the fact that even though it is closer to what real motion looks like, it is not closer to how we interpret motion.

There is a constant quick, minor movement of the eyes (physiological nystagmus) which is both universal and ubiquitous in human vision, and when we are looking at something that is moving, or moving our heads from one view to another, the little jumps our eyes make (saccades), essentially taking individual "pictures" the way a camera firing off a burst of stills does, become larger and more jumpy. This is an adaptation to the fact that we only have fine resolution in our foveal vision (the very center of the retina); the rest is fuzzier by far.

But we filter that out during perception so that we don't even realize our eyes aren't perfectly still, even though it feels to us like they are. Similarly, vision outside the fovea is poor at resolving both detail and color; our perception fills in those blanks during processing, using what the fovea picks up as we scan, so the whole visual field seems sharp and fully colored.

So smoother motion, which is more natural, feels ironically less natural.

This is true of POV motion and color temperature as well. If you go from incandescent to sunlight, the color temp changes drastically. But we filter that out in perception so it doesn't become a factor for us. Try that with a video camera and the change in color temp is very noticeable, to the same eyes that can otherwise filter that out naturally.

When you are walking down the street your head unavoidably bobs up and down changing your POV, you might tilt your head which effectively tilts the horizon, etc. Again, we filter this out during perception; we don't even notice that the POV is changing that drastically. But try to move a video camera even a little bit, or tilt to change relative to the horizon, and it is annoyingly noticeable (which is why we hate the shaky camera technique and why the Steadicam was invented).

To me, this makes it understandable why smooth motion and higher frame rates are unacceptable to a lot of folks; even though it is more natural and much freer of artifacts, it seems very unnatural at first, which is why a learning period is required. It's therefore an acquired taste to some extent, except that some acquire it and adapt immediately, and some never can (or refuse to) adapt, all of which makes perfect sense.
 
I think you are thinking of someone else 'cause Ebert has been lobbying for 48fps as the new film standard for over a decade.
Roger's comments are much more than a decade old, surely formed before the advent of the possibility of 48fps. He could easily have adapted his view to modern technological possibilities, but the principles he embraced still apply. 48 is a multiple of 24; 30 is not.


All content? I can't see networks shooting and streaming/broadcasting 24p for sporting events.
If your TV/computer can fill in the motion in between the original images with credible virtual approximations of what real 1080p60 images would be, which they have technically been able to do for years now, why not? The argument of smoother motion for action becomes moot.

1080p60 is roughly 2x the size of 720p60 and 1080i30.


Lethal
As a long-time broadcast engineer in good professional standing, I am intimately familiar with that exceptionally common fact. My point is that if stations move to 1080p24 (which is about the same file size as 1080i30 or 720p, the formats HD is broadcast in right now), they can enjoy both the advantage of higher resolution than 720p and a higher frame rate, by virtue of the TV set creating 60% of the displayed frames on the fly locally (80% assuming 120 Hz frame interpolation, 90% assuming 240, and 95% assuming 480).
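Those percentages are just straight division from a 24fps source; a quick Python check, in case anyone wants to see the arithmetic:

# Fraction of displayed frames a set would synthesize locally from a 1080p24 feed.
source_fps = 24
for display_hz in (60, 120, 240, 480):
    synthesized = 1 - source_fps / display_hz
    print(f"{display_hz} Hz: {synthesized:.0%} of displayed frames created by the TV")

# 60 Hz -> 60%, 120 Hz -> 80%, 240 Hz -> 90%, 480 Hz -> 95%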

It's going to happen, just like H.265 (not 264) encoding is going to happen.
 
Roger's comments are much more than a decade old, surely formed before the advent of the possibility of 48fps. He could easily have adapted his view to modern technological possibilities, but the principles he embraced still apply. 48 is a multiple of 24; 30 is not.
I don't follow. You said Ebert "vigorously lobbies for 24fps" which he does not and I don't think his pro-48fps comments can be much more than a decade old because the MaxiVision48 he tried to champion wasn't developed until '99.

If your TV/computer can fill in the motion in between the original images with credible virtual approximations of what real 1080p60 images would be, which they have technically been able to do for years now, why not? The argument of smoother motion for action becomes moot.
It would be interesting to see a comparison between 60p source->24p delivery->60p TV interpolation vs 60p source->60p delivery->60p TV viewing. Staying 60p all the way through will obviously retain more quality, but taking into account compression, average viewing distance and the fact that the average viewer doesn't really give a crap, I wonder if 24p as an intermediate would be 'good enough'. You can repeat, blend or interpolate frames all you want, but you can't manufacture accurate image info that's not there. Especially in a really complex scene like, say, a football or basketball game.

As a long-time broadcast engineer in good professional standing, I am intimately familiar with that exceptionally common fact. My point is that if stations move to 1080p24 (which is about the same file size as 1080i30 or 720p, the formats HD is broadcast in right now), they can enjoy both the advantage of higher resolution than 720p and a higher frame rate, by virtue of the TV set creating 60% of the displayed frames on the fly locally (80% assuming 120 Hz frame interpolation, 90% assuming 240, and 95% assuming 480).
My mistake. I thought you were saying 1080p60, not 1080p24, was similar in size to 720p60 & 1080i30.


Lethal
 
... which is more natural, feels ironically less natural...

When I walk and look around me, I don't experience depth of field, but see everything sharp. So the most natural images are produced by consumer cameras with everything sharp ;)

We don't call that cinematic


... we filter that out during perception so that we don't even realize that our eyes aren't perfectly still...

Even more than that. Look someone in the face and look at one eye and then the other. The other person sees your eyes moving. Now do the same looking in a mirror at your own eyes... (It was quite an experience for me when I saw that for the first time.)

Our brain blanks during the motion (to prevent us from becoming seasick). So, for example, if you are looking at different persons (moving your head or eyes) your brain is essentially cutting a sequence. One of the tricks of invisible cuts is to cut in those blanking moments.

(Oh, and why the flicker of tv sets doesn't slightly hypnotize us is because there is too much other light.)

Back to cinematic...
 
Not exactly.

This is the whole idea behind 120 Hz, and why it is totally free of pulldown judder while 60 Hz isn't.

I think you may have missed one of my key points, and I apologize if I didn't quite explain it properly, but 120Hz or 240Hz does not and never WILL look like 24fps, because 24fps IS NOT SMOOTH. Every pan has a jitter to the edges. These are not artifacts; these are the optics and the frame rate. 120Hz looks fake to me and to countless others who turn it off.
 
I think you may have missed one of my key points, and I apologize if I didn't quite explain it properly, but 120Hz or 240Hz does not and never WILL look like 24fps, because 24fps IS NOT SMOOTH. Every pan has a jitter to the edges. These are not artifacts; these are the optics and the frame rate. 120Hz looks fake to me and to countless others who turn it off.
I think he was saying that 24p at 120Hz and 240Hz won't display the same artifacting that it does at 60Hz, because playback at 60Hz requires pull-down whereas playback at 120Hz or 240Hz does not.


Lethal
 
I think he was saying that 24p at 120Hz and 240Hz won't display the same artifacting that it does at 60Hz, because playback at 60Hz requires pull-down whereas playback at 120Hz or 240Hz does not.


Lethal

Fair enough, but 120Hz and 240Hz by no means look like, or could possibly be considered, true 24fps. If anything they look more like an old PAL (25p / 50i) TV show that's been converted to NTSC (29.97p / 59.94i). The motion is quite off and unnatural.
 
frame rates

24fps has become multi-standard with the improvement of telecine machines that can do the silly pull-down you need for dumb TV systems like NTSC ;)

I think that 30p (which isn't even 30 frames per second, but instead a crazy 29.97p!) isn't that different in look from 24fps.
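For the curious, that "crazy 29.97" is just nominal 30fps scaled by 1000/1001, a tweak made when color was added to NTSC to avoid interference between the color and audio carriers (that history is from memory, so take it loosely); the arithmetic itself is simple:

print(30 * 1000 / 1001)   # 29.97002997... fps
print(24 * 1000 / 1001)   # 23.976... fps -- the "24p" most NTSC-world gear actually records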

24fps kinda converts into 25 and 30 fps alike.

I think cinema should go to 72 fps as this would help Kodak :p
 
When I walk and look around me, I don't experience depth of field, but see everything sharp. So the most natural images are produced by consumer cameras with everything sharp ;)

We don't call that cinematic...
Speak for yourself. "We" (you and I) may not, but "we" (anyone who correctly defines this aspect of cinematic) do. The distinction between a video camera, which uses a lens and imager combination with very deep depth of field, and a Panavision camera, which uses a lens and imager that creates comparatively much shallower depth of field under similar conditions, is a ubiquitous aspect of nearly all film content over the years. To not consider shallow depth of field one of a number of clear and distinct cinematic aspects is to not understand the term in the first place.

And unless you are from a different planet or the future, you DO NOT see everything sharp. Only your foveal vision is sharp, and ONLY when you focus on what is centered in your fovea. EVERYTHING else is not sharp, meaning all peripheral vision and anything off-axis from the fovea by a matter of just a few degrees. It only seems that everything is sharp because you only concentrate on what you look at directly. Yes, that will be sharp; everything else, the other 95%, the part that hits the rest of your retina, not so much. So for everything in an image to be sharp could not be more UN-natural.

Shallow depth of field is a cinematic (there's that word again) tool used to simulate what is natural to the eye, on film. It is not meant to duplicate what the eye sees in terms of depth perception and what is and what is not in focus, and no one ever seemed to indicate that it might, at least, that is, until you posted.
 
...It would be interesting to see a comparison between 60p source->24p delivery->60p TV interpolation vs 60p source->60p delivery->60p TV viewing. Staying 60p all the way though will obviously retain more quality but taking into account compression, average viewing distance and the fact that the average viewer not really giving a crap I wonder if 24p as an intermediate would be 'good enough'. You can repeat, blend or interpolate frames all you want but you can't manufacture accurate image info that's not there. Especially a really complex scene like, say, a football or basketball game...
I agree, it would be interesting. And I think what would be most interesting is that we would not be able to tell the difference.

If the subject is not moving but the camera is (an extremely common situation in video and film), it is easy to interpolate a frame that is 100% indistinguishable from the parent frames because all of the information is there already. It's a little harder when the camera is moving and the subject is moving at the same time, but that is rare for a number of reasons, chiefly because when the subject is moving it is not common for the director to move the camera at the same time, as that would be both confusing and cinematically tacky.

The obvious exception would be a tracking shot of someone running through the jungle, where the background moves and the arms and legs of the runner also move, even though the runner stays relatively in frame in the same place. But even in those situations I think it would be better than actual 24p.

One of the things about low frame rate that makes it acceptable is that when things move, they naturally lose resolution. This is what we see in the real world as well. This is what makes 1080i acceptable, and I think this masking effect of lower resolution during motion would also make the small degradation of interpolated frames in this rare sort of scenario acceptable. Are we really going to see the moving arms and legs of someone running through the jungle close up, all that clearly anyway? Of course not; they're in motion. Artifacts induced there will be pixel-based and hard to see, not macroblock based and easy to see.

What is very unnatural about film is the flicker, the judder of pulldown, and the stroboscopic strobing artifacts of things that are moving. Seeing the background move by as a series of still images, like a flip book, is not natural. And frame interpolation as done on modern flat panels addresses all of these problems pretty effectively.
 