When it comes to video, I can't tell the difference between 24 and 30, but I can tell the difference between 24 and 60.
In games, I can tell a difference between 24, 30, 60, 120, and 333. Beyond 333 I can't tell a difference. My little brother actually did a science project for school on this (8th grade, smart little bugger). He had several video clips and tested me on those first. Then he had Quake 3 open (the only game we have that will run at such high FPS, lol), set the graphics to the same level each time, and used com_maxfps "###", which sets the max FPS in game, to try different caps.
We came to the conclusion that the reason we can't tell much of a difference in video is motion blur. In games, well most games, there is no motion blur. So if something moves across the screen really fast at 24 frames per second, you might see one flash of him, frozen in place. Your mind doesn't create motion blur for something that just blinks on and then off. But when you get up to hundreds of frames per second, the guy running across the screen might appear in 60 or 100 frames. Your mind fills that in with some motion blur (or whatever) and it seems to flow a lot better.
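Just to illustrate what I mean (this is my own toy sketch, nothing from my brother's project): a film camera's shutter stays open for a chunk of the frame time, so a fast-moving object gets smeared across every pixel it crossed. You can fake that by averaging a bunch of sub-frame samples of the object's position, versus a game which just draws it at one instant. All the names here (`render_frame`, `exposure`, the 1D "screen") are made up for the example.

```python
# Toy illustration: approximate camera motion blur by averaging many
# sub-frame samples of a moving object's position within one exposure,
# the way film smears motion during its shutter time. 1D "screen" for
# simplicity; everything here is invented for the sketch.

def render_frame(pos, width=10):
    """Draw a 1-pixel object at position `pos` on a 1D screen (a game frame:
    one instant in time, no blur)."""
    frame = [0.0] * width
    if 0 <= int(pos) < width:
        frame[int(pos)] = 1.0
    return frame

def exposure(start, end, samples=8, width=10):
    """Average `samples` sub-frames while the object moves start -> end
    (a film frame: the whole shutter interval)."""
    acc = [0.0] * width
    for i in range(samples):
        t = i / (samples - 1) if samples > 1 else 0.0
        pos = start + t * (end - start)
        for x, v in enumerate(render_frame(pos, width)):
            acc[x] += v / samples
    return acc

# No blur: a single hard flash at one pixel, like a game at low FPS...
sharp = render_frame(2)
# ...versus an exposure that smears the object across pixels 2 through 6.
blurred = exposure(2, 6)
```

The blurred frame spreads the same total brightness across every pixel the object passed through, which is the cue your eye reads as motion even at 24 fps; the sharp frame gives it nothing to connect between flashes.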
This is all speculation, I am not a scientist.
But yeah, I'm done rambling, later.
