SonComet said:
I agree that we can probably see over 30 fps... Isn't 8-10 fps the rate that lets us follow motion easily (I think the original anime shows ran at about that rate)? Then 24 fps gives us fluid motion, and since most people can see a monitor flickering at refresh rates up to 85 Hz, I'd say the eye can see more than 30. [On a side note about monitors: even if your brain can't detect flicker above 85 Hz, your eyes still register it, which is why any CRT causes eye strain.] Also, you have to realize that unless a game averages way more than 30 fps, you can't lock it at 30. If it averages 30, you're probably going to drop to 10-15 fps fairly often.
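To illustrate that last point, here's a quick sketch in Python (the frame times are made up for illustration, not real measurements) showing how an average of ~30 fps can hide much uglier dips:

    # Toy frame-time log: 90 fast frames plus 10 slow spikes (invented
    # numbers; real games stutter in messier patterns).
    frame_times_ms = [20] * 90 + [100] * 10

    total_seconds = sum(frame_times_ms) / 1000.0
    average_fps = len(frame_times_ms) / total_seconds   # ~36 fps overall
    worst_fps = 1000.0 / max(frame_times_ms)            # 10 fps during a spike

    print(f"average: {average_fps:.0f} fps, worst frame: {worst_fps:.0f} fps")

The average works out to about 36 fps, yet every spike frame plays back at an effective 10 fps, which is exactly the kind of drop you feel in a firefight.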
Here's something I found helpful in explaining the benefits of high FPS in gaming (@ the Ars Technica forums):
Fiendish said:
JackMiller said:
As long as it's above 25 fps, I'm happy. Seriously, if your eye cannot tell the difference, why, uh, care? <*ducks from flames*>
I can tell a huge difference between a steady 30 fps and a steady 60 fps in first-person shooter games. I suppose I just shattered your theory. *Roll Eyes*
Think about it this way. Let's say a red ball moves across your vision at a speed such that it takes exactly 1/24th of a second to pass through your field of view. If you watched that ball in real life, you would see a transparent red blur, the height of the ball, sweep across your field of vision. If you rendered that ball on a computer monitor at 24 frames per second, however, you would get 23 frames of nothing and a single frame containing a red circle. Now render the ball at 1000 frames per second: you would see approximately 40 frames of red circles that actually move across the screen.

You won't see the proper blur in the 24 fps scenario, because the in-between images are missing. This has to do with persistence of vision. Even though we may not be able to distinguish moving details above certain speeds (note that moving details are much harder to distinguish than static ones; experiments have shown that humans can identify images flashed for far less than 1/24th of a second), we can certainly tell whether or not something has filled the space. This is why, for those of us who aren't blind ^_^, 30 frames per second looks jittery in fast-action computer games. If an object moves across the screen fast enough that it only gets rendered a few times before vanishing on the other side, your eye can tell that something was there and that it wasn't moving correctly.
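The arithmetic behind that thought experiment is easy to check; here's a tiny sketch (the 1/24 s crossing time is the post's own premise):

    # The ball is in view for 1/24 s (the post's assumption). Count how
    # many rendered frames catch it at different frame rates.
    CROSSING_TIME_S = 1 / 24

    for fps in (24, 60, 144, 1000):
        frames_with_ball = CROSSING_TIME_S * fps
        print(f"{fps:>4} fps: ball appears in ~{frames_with_ball:.1f} frame(s)")

At 24 fps the ball lands in roughly one frame; at 1000 fps it shows up in about 42, matching the "approximately 40" figure above.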
"Video and film get by with 24 FPS (30 for broadcast video) because they capture all the visual information for that entire fraction of a second. When a movie camera snaps 24 pictures every second, each one captures all the movement for that 24th of a second. This is why you see all that blur when you freeze the picture on a single frame during an action scene, and why the blur goes away when the scene is played at full speed.
Computers don't do this, however. Each image displayed represents "zero time." That is, if it takes your computer 1/24th of a second to render the next frame, it shows only an instantaneous snapshot of everything's new position, with none of the visual information of the movement during that 1/24th of a second. If it did, you could freeze your game and everything moving would be blurry. The motion-blur capabilities touted by some video cards don't work like a movie camera, either. They only blend together previous drawings of objects, rather than actually representing all the motion "in between" frames." - from some guy on the internet
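A minimal sketch of that distinction, using a hypothetical 1-D ball whose position we can evaluate at any instant (none of this is a real engine's API): film-style blur integrates the motion across the whole frame interval, while the card trick described above only blends two discrete snapshots.

    # Hypothetical ball moving at 1000 px/s; position(t) stands in for
    # "the scene at time t".
    def position(t):
        return 1000.0 * t

    FRAME_TIME = 1 / 24  # one film-style exposure

    def film_style_blur(t0, samples=8):
        # Sample the motion *within* the exposure window and average later,
        # like film integrating light for the whole 1/24 s.
        return [position(t0 + FRAME_TIME * i / samples) for i in range(samples)]

    def frame_blend_blur(t0):
        # Blend only the previous snapshot with the current one:
        # two instants, nothing from the time in between.
        return [position(t0 - FRAME_TIME), position(t0)]

    print("film-style positions:", film_style_blur(0.5))
    print("frame-blend positions:", frame_blend_blur(0.5))

The film-style version touches eight positions spread across the interval (a smooth smear); the frame-blend version has only two isolated snapshots, so the "blur" is really just a double image.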