
kikobarbada

macrumors regular
Original poster
Jun 28, 2007
Hey, I just have a question.

Why do I need more than 24 frames per second if movies are 24fps and are perfect? Plus, human eyes can't see the difference between 50fps and 100fps.

Thanks.
 
Human eyes (or actually human brains) can see 24 vs 30 very easily; that's why we have "filmlook" processing to mimic the flicker of 24fps.
I think your confusion comes from the notion that 24fps is the point where a human observer feels the motion is lifelike and not jumpy.
 
Some people can reasonably tell the difference between 24fps and 50fps, and I'm sure there are some who can also tell the difference between 50fps and 100fps. Also, if you're referring to games, then 24fps might be "fine", but you aren't going to have a game hold at that: sometimes you might get 50fps, and other times you might get slowdowns to 10fps or so, resulting in the dreaded stutter.
 
Actually most films are filmed at 100 frames per second, with a lot of the newer HD content being filmed at 120 frames per second.
 
I know that with CRT screens I have to boost the refresh rate up to 75-80Hz before I stop seeing the flashing. With LCDs I run into visible ghosting effects long before any issue with flashing, because of the different ways the screens physically refresh.

In applications (games) the frame rate also needs to be in that 50+ range or I start to get headaches (first-person shooters especially). Not that I play a lot of games.

uMac
 
Actually most films are filmed at 100 frames per second, with a lot of the newer HD content being filmed at 120 frames per second.

Sorry, where the heck did this come from?!
Most theatrically released movies are shot at 24fps even if they're shot digitally. The highest version of HD right now is only 60fps (1080p). The only reasons to shoot at higher frame rates are smooth slow-motion shots, instant replays in sports, and high-speed scientific analysis. It is still displayed at the normal frame rates.
 
If you have an old Nintendo 64, go play Ocarina of Time or some other game on it; you'll notice how jumpy even the title screen is. Better yet, play Diddy Kong Racing... Back in the day this used to seem great, but playing it again gives me shudders...
 
Sorry, where the heck did this come from?!
Most theatrically released movies are shot at 24fps even if they're shot digitally. The highest version of HD right now is only 60fps (1080p). The only reasons to shoot at higher frame rates are smooth slow-motion shots, instant replays in sports, and high-speed scientific analysis. It is still displayed at the normal frame rates.

Exactly. Video in the US is 29.97fps, 35mm film is projected at 24, and the human eye sees "motion" at about 16fps. But the difference with video games and computer games is that you aren't watching something static at 24fps: you are controlling pixels and characters, so the frame rate has to be higher to maintain not only the sense of motion but also to prevent choppiness. You aren't watching film or video; you are watching animated pixels designed to replicate human motion. It has to be faster so that you don't see the computer drawing in each frame.

Regardless, it's not an even comparison to put video or 35mm (or even 70mm) film running at 24fps up against a computer game or video game -- it's just totally different.
 
Actually most films are filmed at 100 frames per second, with a lot of the newer HD content being filmed at 120 frames per second.

No it isn't. The only possible explanation for your assertion is that you are claiming that 50fps interlaced footage = 100fps, which is wrong, as two interlaced fields actually only make up one real frame.
 
Hope this isn't too nitpicky, but FYI even though film is usually shot at 24 fps, it isn't exactly projected that way:

"The critical flicker frequency, or the repetition rate above which flicker is not perceived, falls between 40 and 60 repetitions per second. The exact flicker threshold depends on factors that include picture brightness and ambient lighting, but 30 flashes per second is below it under any viewing circumstances. Although theatrical motion pictures run at a rate of 24 frames per second, each frame is projected twice, raising the flash rate to 48 per second. This is above the critical flicker threshold for relatively low-brightness images in a dark movie theater, but it is well below it for bright television pictures viewed in lighted rooms, as may be attested by anyone who has viewed PAL television with its 50 flashes per second."

link: http://www.tvtechnology.com/features/Tech-Corner/f_rh_technology_corner-01.08.03.shtml

Not everyone's aware of this factoid, but anyone who's seen the feed from a motion picture camera's video assist has witnessed the camera's inherent flicker.
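
To make the arithmetic in that quote concrete, here's a tiny sketch (my own illustration, not from the article): the effective flash rate is just the frame rate times the number of times the shutter opens per frame.

    # Effective flash rate = frame rate x shutter openings per frame.
    def flash_rate(fps, flashes_per_frame):
        return fps * flashes_per_frame

    print(flash_rate(24, 2))  # 48 flashes/sec: 24fps film with a two-blade shutter, as quoted above
    print(flash_rate(24, 3))  # 72 flashes/sec: a three-blade shutter, even further above the flicker threshold
    print(flash_rate(25, 2))  # 50 flashes/sec: roughly the PAL case the quote mentions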
 
When it comes to video, I can't tell the difference between 24 and 30, but I can tell the difference between 24 and 60.

In games, I can tell a difference between 24, 30, 60, 120, and 333. Beyond 333 I cannot tell a difference. My little brother actually did a science project for school on this (8th grade, smart little bugger)... he had several video clips and tested me on video and such. Then he had Quake 3 open (the only game we have that will run at such a high FPS, lol) with the graphics on the same level, and used com_maxfps "###", which sets the max FPS in game, to test different caps.

We came to the conclusion that the reason we can't tell much of a difference in video is motion blur. In games, well, most games, there is no motion blur. So if something moves across the screen really fast at 24 frames per second, you might see one flash of him, totally still. Your mind doesn't create motion blur for something that just blinks on and then off. But when you get up to hundreds of frames per second, the guy running across the screen might be present in 60 or 100 frames. Your mind fills that in with some motion blur (or whatever) and it seems to flow a lot better.

This is all speculation, I am not a scientist.

But yeah, I'm done rambling, later. :D
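
To put rough numbers on the "one flash of him" idea above, here's a quick sketch (the screen width and object speed are made-up values, just for illustration): it works out how far a sharp, unblurred object jumps between frames at different frame rates.

    # Hypothetical example: an object crossing a 1920-pixel-wide screen in half a second.
    screen_width_px = 1920
    crossing_time_s = 0.5
    speed_px_per_s = screen_width_px / crossing_time_s  # 3840 px/s

    for fps in (24, 30, 60, 120, 333):
        jump_px = speed_px_per_s / fps              # how far it moves between two consecutive frames
        frames_on_screen = screen_width_px / jump_px
        print(f"{fps:3d} fps: jumps {jump_px:6.1f} px per frame, visible for ~{frames_on_screen:5.1f} frames")

With these numbers, 24fps means 160-pixel jumps and only about a dozen frames on screen, which reads as a series of flashes; at a few hundred fps the jumps are small enough that the eye fills in the motion itself.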
 
I'm not sure about movies, but with fps I think it relates more to games.
Most new games are 60fps... you can really tell the difference between that and 30fps.
 
I'm not sure about movies, but with fps I think it relates more to games.
Most new games are 60fps... you can really tell the difference between that and 30fps.

Huh? The framerate at which a game runs is based on the hardware it's running on. It doesn't come out of the box running at 60fps.
 
Huh? The framerate at which a game runs is based on the hardware it's running on. It doesn't come out of the box running at 60fps.

I think the poster is talking about consoles and you are talking about computers.

A great comparison in the video game world is Madden 08. It runs at 30fps on the Sony PlayStation 3 and 60fps on the Microsoft Xbox 360. You can see the difference.

Oddly enough, we had this talk at work the other day: film is at 24fps, but video games seem to want more and more fps.
 
Huh? The framerate at which a game runs is based on the hardware it's running on. It doesn't come out of the box running at 60fps.
As far as console gaming goes, many older titles were fixed at 30fps (of course due to limitations in hardware), while console gaming these days tends to be at 60fps. However, some games still cap the framerate at 30fps, perhaps to provide a more consistent experience rather than having it fluctuate between 60 and 30 (look at games like Final Fantasy X, Mario Party 8, etc.).

Now when it comes to PC gaming, you're eventually going to run into limitations of the display. You're never going to see 100fps or 300fps on screen because the displays simply do not refresh that fast (I'm not sure how LCDs work in this regard, but isn't it somewhere around 60?). I doubt you'd notice any difference above 60fps.
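
For what it's worth, the 30fps cap mentioned above is usually just a frame limiter: the game finishes its work for the frame and then waits out the rest of the time budget so the pace stays even. A minimal sketch of that idea (the function names are mine, not from any real engine):

    import time

    TARGET_FPS = 30                      # the kind of cap discussed above
    FRAME_BUDGET = 1.0 / TARGET_FPS      # ~33.3 ms per frame

    def update_and_render():
        pass  # stand-in for the game's real simulation and drawing work

    def run_capped(frames=90):
        for _ in range(frames):
            frame_start = time.perf_counter()
            update_and_render()
            elapsed = time.perf_counter() - frame_start
            if elapsed < FRAME_BUDGET:
                # finished early: idle away the leftover time so every frame lasts ~1/30 s
                time.sleep(FRAME_BUDGET - elapsed)

    run_capped()

The point is a steady 33ms cadence instead of frames that swing between 16ms and 33ms, which is exactly the "more consistent experience" argument.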
 
Just for some added input, I can consistently see the difference between 60Hz and 75Hz on a CRT monitor, so the eye can pick up motion that fast. This is particularly true when you see something at the edges of your vision. The central part of your eye/retina is more adept at seeing color and fine detail; the peripheral part of your eye/retina is better suited to seeing things in low light and picking up motion.

Just as an experiment, try these two things. First, find a CRT, set the refresh to 60Hz, and then look at something that is a couple of feet to the side of the display. Can you see the flicker in the monitor when you aren't looking right at it?

Next, go outside in the dark (the real dark, probably not available to you city folk :p) and look right at a spot where you know something is. You probably can't see it. Now look about 5 feet above that spot and pan your eyes left/right in a figure-8 pattern. Did the thing you were trying to see suddenly resolve in your vision?

Anyways, you definitely need more than 24fps, and more than 30 for interactive media (games, etc.). Once you start getting up into the 60+ range you're going to have a tough time seeing the difference. For "static" media, like movies, 24fps is acceptable, and there are lots of things you can do to make it look much sharper than it should.
 
When I'm in Europe, PAL drives me crazy because of the flicker. Film projectors get around the 24fps flicker issue by actually opening and closing the light gate twice for each frame.
 
A camera, be it video or film, takes snapshots of real moving objects that continue to move all on their own between snapshots.

A video game must compute where each object is for each frame. I think a higher FPS means more than just how many images are drawn per second; more importantly, it means the location of each object is computed at a higher rate.

Or to say it another way: actors move in a continuous fashion and the camera samples their location 24 times per second. In a video game, objects move instantaneously (faster than even "light speed") many times per second; game characters move in discrete "jumps". I think a higher FPS makes the game's simulation more like the real, continuous world.

Some day more sophisticated games might decouple the simulation from the graphics rendering, and we will then have "film-like" games, but this will require a huge jump in computer power (more than 100x faster). Basically you'd have to be able to render a Pixar-like animation in real time. Notice that Shrek, Cars, or Toy Story looks very, very good at (only) 24fps.
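
That decoupling idea already exists as a common pattern, often described as a fixed timestep with interpolation. Here's a rough sketch of it, with toy numbers and function names of my own (not from any particular engine): the world is simulated at a fixed rate while the renderer draws as often as the display allows, blending between the last two simulated states.

    import time

    SIM_DT = 1.0 / 120.0  # fixed simulation step, independent of how fast we can draw

    def simulate(state, dt):
        # Toy "physics": one object moving at constant speed.
        return {"x": state["x"] + state["vx"] * dt, "vx": state["vx"]}

    def render(prev, curr, alpha):
        # Draw a position blended between the two most recent simulated states.
        x = prev["x"] * (1 - alpha) + curr["x"] * alpha
        print(f"draw at x = {x:7.2f}")

    def game_loop(run_for_seconds=0.1):
        prev = curr = {"x": 0.0, "vx": 100.0}
        accumulator = 0.0
        start = last = time.perf_counter()
        while time.perf_counter() - start < run_for_seconds:
            now = time.perf_counter()
            accumulator += now - last
            last = now
            while accumulator >= SIM_DT:                  # step the world at a fixed rate
                prev, curr = curr, simulate(curr, SIM_DT)
                accumulator -= SIM_DT
            render(prev, curr, accumulator / SIM_DT)      # draw whenever the display is ready
            time.sleep(0.005)                             # pretend presenting a frame takes ~5 ms

    game_loop()

The simulation stays deterministic at 120 steps per second no matter what the display does, and the interpolation keeps the drawn motion smooth even when render and simulation rates don't line up.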
 
Just what are you asking?

Hey, I just have a question.

Why do I need more than 24 frames per second if movies are 24fps and are perfect? Plus, human eyes can't see the difference between 50fps and 100fps.

Thanks.

I don't think any of us know quite WHY you're asking. Some think it's video game related, others think it's film related. But here's my 2¢ of something not said that may help if you're getting into video.
24 is great for narrative expression and film. Think film and introspective documentaries. It's hugely popular and overused nowadays.

30 is great for reality TV / news stories / documentaries that give a "now" kind of feel. I believe with the new FCP you can mix the formats too. So perhaps you have an interview in 30fps, and it feels real and crisp and 'in the moment', but you're cutting to 24fps re-enactments or pickup b-roll to submerge the viewer in the narrative, then coming back to the 30 to give that 'we're here now' sense. If you have a camera capable of both, try it out.

The thing is, I don't notice too much of a difference on computer screens, especially once it hits YouTube and the frame rate gets chopped; big deal. But when I see it on TV it's a stark difference. There's also 30fps progressive, which I think meets in the middle; I believe Curb Your Enthusiasm shot (or shoots) in this because it's improvised and sits between a narrative and something 'real'. Then there's 60i, which is great for sports because it's like 30fps realness on steroids.

Anyways, most of what I'm saying is subjective and can be argued, but I'm laying it out in non-technical terms, as your post alone seems to ask for that. Basically, you need more than 24fps to play with different video formats that achieve different looks.
Good luck!
 
The General's little brother is right

Yes, it is primarily the motion blur that makes 24fps look so realistic. Video games typically have no motion blur, or very crude motion blur, and the high fps helps offset that.

Video games could look a lot smoother at 24 fps if they included motion blur, but that would require changes to the typical rendering pipeline. So far, the strategy for realistic motion is to increase the frame rates.

...We came to the conclusion that the reason we can't tell much of a difference in video is motion blur. In games, well, most games, there is no motion blur. So if something moves across the screen really fast at 24 frames per second, you might see one flash of him, totally still. Your mind doesn't create motion blur for something that just blinks on and then off. But when you get up to hundreds of frames per second, the guy running across the screen might be present in 60 or 100 frames. Your mind fills that in with some motion blur (or whatever) and it seems to flow a lot better.
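
One brute-force way to add that blur without raising the frame rate is to render several sub-frame samples inside each frame's "exposure" and average them, the way an accumulation buffer does. A toy 1-D sketch of the idea, with made-up numbers and names of my own:

    # "Render" a bright dot moving along a 40-pixel row, averaging sub-frame samples
    # to fake the smear a real film exposure would record.
    WIDTH = 40

    def render_dot(position):
        row = [0.0] * WIDTH
        row[int(position) % WIDTH] = 1.0
        return row

    def blurred_frame(frame_index, fps=24, samples=8, speed_px_per_s=240.0):
        frame_time = 1.0 / fps
        accum = [0.0] * WIDTH
        for s in range(samples):                               # sample the motion within one frame
            t = (frame_index + s / samples) * frame_time
            for i, value in enumerate(render_dot(speed_px_per_s * t)):
                accum[i] += value / samples                    # average the samples together
        return accum

    # Instead of one sharp dot, each 24fps frame now contains a smear along the dot's path.
    print([round(v, 2) for v in blurred_frame(0)])

Real engines approximate the same effect more cheaply (for example with velocity-based post-process blur), but the principle is the same: each displayed frame carries a record of the motion that happened during it.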
 
Yes, motion blur is important.

What a lot of you didn't explain clearly is that when a film camera records a fast-moving object, that object blurs, as it moves a certain distance during that single exposure.

That blur, when played back as a series of blurs, helps our eyes tell the brain that the object is moving.

Games, as mentioned, don't show that blur, hence it becomes a series of sharp, static images.

Think about nightclubs with strobe lighting (less common now, due to fears over epilepsy, but every club used to have them) - people dancing fast look like a series of freeze frames. That happens in games, hence the need for a high frame rate so that the freeze-frames merge together in the eye.

Another point is that as film was very expensive back in the day, Hollywood wanted to use as little of it as possible in a feature reel. 24fps was worked out as being the MINIMUM frame rate that appeared *reasonably smooth* to *most people* under *optimally darkened conditions* with a *non-panning camera* and actors that *didn't move too fast*.

(* indicates weasel words)

Today, 24fps flickers madly whenever you try to do much with it, especially on panning shots and shots with rapidly moving objects.
 