No, you'd want a "flicker" of more like 120-200Hz, perhaps even more, to avoid eye strain. While the typical eye/cortex can only "process" a visible change about 30 times a second, the lower-order reflexes respond significantly faster. Strobing a large portion of the central visual field (i.e., a 17" laptop sitting in your lap) would absolutely kill your eye muscles and likely induce major migraines in those of us prone to them.
Remember back when we used tubes and the refresh rates had to be that high? Yeah, same idea, sort of. The problem there was that a small area of the screen would cycle from full brightness, fading to near-black, then quickly ramping back up to full brightness in that timeframe, wreaking havoc on the reptile-brain bits behind the eye as they tried to react. This would be significantly worse, in that the ramp times would be near-zero (full on, full off, very little phosphor fade, most likely) and the full area of the screen would be blinking in unison.
Of course, then there are other possibilities such as:
* sufficient light diffusion so that only a fraction of the LEDs in the display need to be on at any instant to provide even illumination.
* "blinking" nearby LEDs at different phase offsets (i.e., preventing the whole screen from going black at any one instant)
* phosphor fade effects to "even out" the blinking of each individual LED to something more like a constant but dimmer light.
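The phase-offset idea in particular is easy to demonstrate with a toy simulation (my own sketch, not anything from a real display controller): model each LED as a square-wave PWM signal, and compare the case where all LEDs switch in phase against the case where each LED's duty cycle is shifted by an equal offset. In phase, the total light output drops to zero for part of every period; staggered, some LEDs are always lit.

```python
def total_light(num_leds, duty, steps, staggered):
    """Return the per-timestep count of lit LEDs over one PWM period.

    num_leds  -- how many LEDs in the (toy) array
    duty      -- fraction of the period each LED is on (0..1)
    steps     -- timesteps per PWM period
    staggered -- if True, spread the LEDs' phase offsets evenly
    """
    totals = []
    for t in range(steps):
        lit = 0
        for led in range(num_leds):
            # Each LED is "on" for the first duty*steps ticks of its own
            # cycle; staggering shifts each LED's cycle by an equal offset.
            phase = (led * steps // num_leds) if staggered else 0
            if (t + phase) % steps < duty * steps:
                lit += 1
        totals.append(lit)
    return totals

in_phase = total_light(num_leds=8, duty=0.5, steps=16, staggered=False)
offset   = total_light(num_leds=8, duty=0.5, steps=16, staggered=True)

print(min(in_phase))  # 0 -- the whole array is dark for half of every period
print(min(offset))    # 4 -- staggered phases keep total light constant
```

At 50% duty with evenly spread offsets the total is perfectly flat, which is the "constant but dimmer light" effect the list above is after, achieved with timing alone rather than phosphor fade.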