Originally posted by 9hundred
Someone tell me about tv/computer displays please. Thankyou for your time.
TV - low resolution, low speed (roughly 25-30 fps; NTSC is ~30 fps drawn as 60 interlaced fields per second, PAL is 25 fps - the 24 fps number is film, not broadcast TV)
computer - much higher resolution (anywhere from 2X to 7X the pixel count for most standard sizes), much higher speed (60 Hz and up).
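To put rough numbers on that 2X-7X claim (the TV figure is approximate - analog NTSC has about 480 visible lines, so 640x480 is a generous digital stand-in):

```python
# Pixel-count comparison: approximate NTSC TV vs. common computer resolutions.
tv_pixels = 640 * 480  # ~307k pixels, a stand-in for standard-definition TV

for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200)]:
    ratio = (w * h) / tv_pixels
    print(f"{w}x{h}: {ratio:.1f}x the pixels of SD TV")
```

The ratios come out between roughly 1.6x and 6.3x, which is in the ballpark the post describes.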
1 Hz = 1 Hertz = 1 cycle per second. For a display, 60 Hz means the screen is redrawn 60 times per second - which is only the same as 60 fps if the source actually delivers 60 unique frames.
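The refresh rate is just the inverse of the time between redraws, so the numbers above are easy to sanity-check:

```python
# Refresh rate (Hz) vs. frame period: period = 1 / rate.
def frame_period_ms(hz: float) -> float:
    """Milliseconds between refreshes/frames at a given rate in Hz."""
    return 1000.0 / hz

print(frame_period_ms(60))     # 60 Hz refresh -> ~16.7 ms per redraw
print(frame_period_ms(29.97))  # NTSC frame rate -> ~33.4 ms per frame
print(frame_period_ms(24))     # film -> ~41.7 ms per frame
```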
LCD and CRT handle refresh in fundamentally different ways. On a CRT, each phosphor dot is flashed by the electron beam and fades until the next pass, so the refresh rate in Hz matters directly - 60 Hz is usually the minimum before most people see "flicker", and CRTs can go much higher (75, 85 Hz and beyond). On an LCD, each pixel is continuously lit and just transitions between states, so there is no flicker at 60 Hz - but the liquid crystals take time to change (response time, measured in milliseconds), which is why people say an LCD is "slower" with more "artifacts" (ghosting) than a CRT. The weak blacks on an LCD are a separate issue: the backlight is always on and some of it leaks through, so you never get a true 100% black.
This is part of why the widescreen LCD format is mostly being ignored by major computer makers. Any benefit as a computer display is nice, but it does not translate into a good experience watching a DVD on said widescreen display. There is also more than one widescreen resolution (so much for standards), and more than one method of drawing the screen (interlaced vs. progressive).
As many have said before, a DVD looks like junk on most computer screens unless you view it in a window sized to the DVD's native resolution. Couple that with the fact that LCDs are bad at resizing - they look sharp only at their native resolution, and anything else has to be interpolated...
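To see why the resizing hurts, compare the NTSC DVD frame (720x480) against a typical LCD native resolution - the panel resolution here is just an example, but the point holds for most panels:

```python
# NTSC DVD frame vs. an example LCD native resolution.
dvd_w, dvd_h = 720, 480
lcd_w, lcd_h = 1280, 1024  # e.g. a common 17" panel (example value)

scale_x = lcd_w / dvd_w  # ~1.78 - not a whole number
scale_y = lcd_h / dvd_h  # ~2.13 - and not even the same as scale_x
print(scale_x, scale_y)
# Non-integer, non-uniform scaling means each DVD pixel gets smeared across
# fractional LCD pixels when stretched to full screen - hence the blur.
```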
CRT is better than LCD for watching a DVD.
TV is best. The media is designed for TV.