Moving Picture Frame Rates
Some of the discussion here displays a misunderstanding of moving picture frame rates, which work differently across different technologies. More importantly, unlike audio recording, the rate at which images are recorded is never the rate at which the image pulses during playback.
In film projection, you see a complete frame all at once. In silent films, once motor-driven equipment replaced hand cranking, 16fps was standardized as the camera speed. If you simply changed the image 16 times a second in projection, that would be not just easily visible but horribly distracting to the eye. However, the inventors of cinema had already figured out that the issue was not how many times the image actually changed, but how often it was interrupted. More frequent, shorter interruptions made the motion appear more fluid regardless of how many times the actual content of the image changed. So projectors were given three-blade shutters. Thus, for standard silent projection, within each 1/16th of a second there were three flashes of the complete image and three flashes of blackness. During one of those black flashes the claw pulled the next frame into the gate. So silent film was a 48 pulse/sec technology, which was roughly the minimum rate the early experimenters had determined would prevent visible flicker. Sound films became standardized at a 24fps camera speed. Again, standard projectors continued to employ a three-blade shutter, resulting in a 72 pulse/sec display.
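For anyone who wants the arithmetic spelled out, here's a trivial Python sketch (the function name is just mine) showing that the flicker rate is simply the camera frame rate times the number of shutter blades:

```python
# Film projection flicker: each frame is flashed once per shutter blade,
# so the pulse rate seen by the audience is frame rate x blade count.

def flicker_rate(frames_per_sec: float, shutter_blades: int) -> float:
    """Light pulses per second reaching the screen."""
    return frames_per_sec * shutter_blades

print(flicker_rate(16, 3))  # silent film: 48 pulses/sec
print(flicker_rate(24, 3))  # sound film:  72 pulses/sec
```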
On a cathode ray tube display, the image is never present all at once, but there aren't exactly black flashes either. The electron beam paints the image one point at a time, and the dot at the top of the frame represents an earlier point in time than the dot at the bottom. We see what appears to be a complete picture because the phosphors continue to glow after the beam moves on. NTSC video is 29.97fps interlaced, meaning there are 59.94 fields per second (commonly rounded to 60) - a field in this case functioning as a 'new' image or pulse.
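Same idea for interlaced video, where each frame is delivered as two fields. A quick sketch (again, just my own illustration) of the field-rate arithmetic:

```python
# Interlaced video: each frame is split into two fields (odd and even
# scan lines), so the "new image" rate is twice the frame rate.

NTSC_FRAME_RATE = 30000 / 1001   # ~29.97 frames/sec
FIELDS_PER_FRAME = 2

field_rate = NTSC_FRAME_RATE * FIELDS_PER_FRAME
print(f"{field_rate:.2f} fields/sec")  # ~59.94, commonly rounded to 60
```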
A digital display is yet again something different. Frankly, I don't know exactly how they work, but there's some kind of on-and-off issue, since 'refresh rates' clearly matter - a higher refresh rate makes for a smoother image. AFAIK, the lowest refresh rate you'd find on a digital display in the US is 60Hz (well, actually 59.94), though newer LCD displays have higher refresh rates (to help make up for the fact that LCDs basically suck as a moving-image display technology). Which means that regardless of the frame rate of the incoming signal - 720p24, 1080i60, 1080p30, 1080p60, whatever - the display is remapping the information into something that flashes at least 60 times a second.
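To make that remapping concrete, here's a toy Python sketch - a guess at the general idea, not how any actual display's electronics work - showing how lower source frame rates get repeated onto a fixed 60Hz refresh (24fps falls into the familiar 3:2 cadence):

```python
# Toy illustration: map a lower source frame rate onto a fixed refresh
# rate by repeating each source frame for some number of refresh cycles.

def repeat_pattern(source_fps: int, refresh_hz: int = 60) -> list[int]:
    """How many refresh cycles each source frame occupies over one second."""
    counts: list[int] = []
    last_frame = -1
    for cycle in range(refresh_hz):
        # which source frame is current at the start of this refresh cycle
        current = (cycle * source_fps) // refresh_hz
        if current != last_frame:
            last_frame = current
            counts.append(1)
        else:
            counts[-1] += 1
    return counts

print(repeat_pattern(24))  # [3, 2, 3, 2, ...] -> the 3:2 pulldown cadence
print(repeat_pattern(30))  # [2, 2, 2, ...]    -> each frame shown twice
```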
Now, in practice, how noticeable a pulse or flicker rate will be is determined in part by other technology specifics, and different individuals begin to perceive flicker at different rates. Near the border of recognizable flicker, the brain can also train itself to ignore the flashes and perceive more fluid motion. For example, I can clearly see flicker in 16fps, 48-pulse 16mm film projection, but I know other people cannot. I also cannot watch PAL TV (50Hz flicker) without discomfort. But, of course, billions of people around the world watch 50Hz flicker programming without complaint. Still, the original target of 48Hz flicker seems to have been a tad low for a best practice, and 60Hz would be a safer minimum standard.
The way the brain processes images and the way it processes sounds are quite different, and analogies between them are more likely to muddy the discussion than to clarify it.
Funny how the 'analog vs. digital' argument, which is basically BS as most posts here note, has replaced the old 'tube vs. solid state' arguments, which at least were about differences that do make a difference sonically.