If you go LCD/LED, go with Samsung. Plasma, go with Panasonic. Plasma has a faster native refresh rate of 600Hz. LCD is 60Hz with 120 and 240Hz options. With these options the TV is "guessing" and adding in extra frames to the picture displayed. IMO it looks terrible, like the video was shot with a camcorder. ... Regardless of what you choose, I would strongly recommend getting your TV professionally calibrated. This will run you $250-$400 normally, but will make all the difference in the world.
The Panasonic "refresh rate" of 600Hz refers to a different measurement than the refresh rate quoted for LCD TVs and thus should not be directly compared. The 600Hz figure is the plasma's "sub-field drive rate", which is something unique to plasma TVs. It refers to the frequency at which the individual phosphors are addressed by the electrodes in the screen.

To vary the brightness of an individual phosphor, the phosphor is addressed multiple times per frame at varying intensities; varying the combination of addressing pulses allows the phosphor to show a range of brightness from black to white. In a Panasonic plasma, each phosphor is addressed ten times per complete refresh of the screen, and each refresh of the screen happens 60 times per second (60Hz): thus, a 600Hz sub-field drive rate. (Panasonic used to address each phosphor eight times per refresh, and so quoted a 480Hz sub-field drive rate.)

So a plasma panel is not actually refreshing the image 600 times per second; rather, each phosphor is pulsed up to 10 times per refresh of the screen in order to reach the desired brightness level. From a quick glance at Panasonic's site, it looks like they don't even list this figure anymore, which is good because it was confusing (which was probably Panasonic's intent). A user over on avsforum posted a great graphic a while back that illustrates this process and compares the duty cycle of CRTs, plasmas, and LCDs (gray bars are times that the phosphor or pixel is lit):
http://www.avsforum.com/t/1019371/is-panasonic-pz800-850s-refresh-rate-limited-to-48hz#post_13646031
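The arithmetic behind the marketing number is easy to sanity-check. A quick sketch (the sub-field counts are the ones described above, not figures from any Panasonic spec sheet):

```python
# Back-of-the-envelope arithmetic for plasma "sub-field drive rate".
# Values are the ones described above, not from any official spec.
panel_refresh_hz = 60        # the panel actually refreshes 60 times per second
subfields_per_refresh = 10   # each phosphor is pulsed 10 times per refresh

print(panel_refresh_hz * subfields_per_refresh)  # 600 -- the marketing "600Hz"
print(panel_refresh_hz * 8)                      # 480 -- the older 8-sub-field figure
```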
Now let's actually discuss refresh rate. Traditionally, both LCDs and plasmas could completely refresh a frame on screen every 1/60th of a second, for a refresh rate of 60Hz. Then, LCD manufacturers started upping the available refresh rates to 120Hz or 240Hz. Furthermore, plasma manufacturers developed panels that could run at refresh rates other than 60Hz, such as 48Hz or 96Hz. But these refresh rates, in the abstract, don't imply anything about the level of motion blur a panel will display.
Let's take the LCDs first. Most content that you send to a TV contains either 30 frames per second (e.g., 1080i or standard-def TV broadcasts) or 60 frames per second (e.g., 720p TV broadcasts).

So let's assume you've got an LCD TV that runs at 60Hz. If it receives a 60 FPS source, every 1/60th of a second the TV's pixels change to display each frame of the 60 FPS content. If instead the TV receives a 30 FPS source, the TV shows each frame twice: in the first 1/60th of a second the TV shows the first frame of the content, in the next 1/60th of a second it shows that first frame again, and in the third 1/60th of a second (3/60ths of a second after the content started) the pixels change to show the second frame. That's because with 30 FPS content, the TV only has 30 frames to show over the course of a second, but can refresh the display 60 times per second. So the pixels will only be changing every 1/30th of a second--the speed required to keep up with the content.

Now let's say you've got an LCD TV with a refresh rate of 120Hz. If you send that TV 60 FPS video, it shows each frame twice (just like a 60Hz TV receiving a 30 FPS signal): in the first 1/120th of a second the TV shows frame one, in the next 1/120th of a second it shows frame one again, and at the third 1/120th of a second the pixels refresh to show the second frame. At this point you can probably guess what happens on a 240Hz TV: with 60 FPS content, it displays each frame four times before the pixels refresh for the next frame of content.
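In every one of those cases the pattern is the same bit of integer division. A minimal sketch (the function name is mine, purely illustrative):

```python
# How many times a panel shows each frame of content: refresh rate / frame rate.
# Illustrative helper, not any real TV's API.
def repeats_per_frame(refresh_hz: int, fps: int) -> int:
    assert refresh_hz % fps == 0, "content rate must divide the refresh rate evenly"
    return refresh_hz // fps

print(repeats_per_frame(60, 30))    # 2: each frame of 30 FPS content shown twice
print(repeats_per_frame(120, 60))   # 2
print(repeats_per_frame(240, 60))   # 4
```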
You might then ask, what's the point of having a TV with a refresh rate faster than 60Hz if it's just going to be showing the same frames of content multiple times? The answer lies in why LCD displays show motion blur. Old LCD displays used to blur because the pixel response time--how much time it took for a pixel to change from one color to the next--was too slow to keep up with fast-moving content on screen. Modern LCD displays don't have that problem; their pixels respond fast enough to keep up with video content. Rather, the problem comes from how an LCD panel displays an image.
The individual pixels in the LCD panel determine the color and brightness of each pixel of video content, but an LCD panel doesn't emit any light. The way an LCD panel emits light is by having, essentially, a fluorescent light bulb (or, in modern LCDs, multiple LED lights) located behind the LCD panel. The light shines through the LCD panel and creates the screen's "brightness." When you change the brightness of a laptop screen, for example, you're just adjusting the intensity of that backlight that is located behind the LCD panel.
This is a much different process than the way that old tube TVs and plasmas emit light. They emit light by exciting each individual phosphor with electrical energy: on a plasma, when a phosphor (a pixel) is charged with energy, it lights up, directly emitting light out of the panel. A plasma panel thus does not need a fluorescent light bulb or LEDs to create light; the light is emitted directly by the excited phosphors. After the phosphor is charged, it fades out to darkness. So a plasma is not constantly emitting a picture like an LCD, but rather is rapidly flashing light at the viewer as the phosphors are repeatedly charged with electricity and fade to black between charges. This is why picture tube TVs and plasmas can be seen to "flicker" at low refresh rates.
So, with that information in hand, let's get to why LCDs blur, why plasmas don't, and why having a faster refresh rate can help with the LCD blur.
Imagine a video of an (American) football game, where the video shows a football flying through the air from one side of the screen to the other. As each frame of video is displayed, the football will move across the frames from one side of the screen to the other. Let's say the video is running at 60 frames per second. On an LCD, the pixels will update every 1/60th of a second, and on each update the football will move across the screen. In between each update, however, the football will be frozen in place.
If you could watch the TV in slow motion, it would look like the football was "jumping" from frame to frame each time the TV refreshed its pixels. The first frame is displayed, held on screen for 1/60th of a second, then the next frame is immediately displayed showing the football further along, held for another 1/60th of a second, and so on. The football thus "jumps" from one frame to the next. That hold-then-jump behavior, combined with the way our eyes smoothly track moving objects, is what produces so-called "sample-and-hold" motion blur.
On a plasma or tube TV, we don't get that jumping because the picture fades to black between each refresh of the screen. So on a plasma, the first frame will be displayed by electrifying the phosphors on the screen. Then the phosphors will fade out before being electrified 1/60th of a second later to show the next frame. The result is kind of like this:
LCD:___ |Frame 1 --------------------||Frame 2 --------------------|
Plasma: |Frame 1 -- fade to black --||Frame 2 -- fade to black --|
That fade to black between each frame tricks our eyes, and we think that the football is moving smoothly from frame to frame, instead of immediately "jumping" from one frame to the next on the LCD. In effect, the fade to black between each frame creates a smoother transition to our eyes.
LCD manufacturers realized this problem, and came up with a few possible ways to fix it. One way would be to essentially "strobe" the screen between each frame to simulate how tube TVs and plasmas fade to black between frames. Such a technique, called "Black Frame Insertion" (BFI), would look like this:
| Frame 1 || Black screen || Frame 2 || Black screen||
Another way would be to try to make the transition from one frame to the next smoother, by interpolating a new frame between each frame of video content that "guesses" at where the moving object would be half-way between each frame. That's the technique murdoc158 is referring to, which Samsung, for example, calls "AMP" or Auto Motion Plus. The result is something like this:
| Frame 1 || Frame 1.5 || Frame 2 || Frame 2.5 ||
Those ".5" frames are made up by the TV, which attempts to guess where objects moving from one frame to the next would be half-way between each frame. In effect, the TV is artificially increasing the frame rate of the video content.
Let's compare all these techniques with plasma to get a feel for what's going on:
Plasma:_____ |Frame 1 -- fade to black --||Frame 2 -- fade to black --|
60Hz LCD:___ |----------Frame 1-----------||----------Frame 2-----------|
120Hz LCD:__ |--Frame 1---||--Frame 1---||--Frame 2---||--Frame 2--||
120Hz w/BFI:_ |--Frame 1---||Black Frame||--Frame 2---||Black Frame||
120Hz w/AMP: |--Frame 1---||-Frame 1.5--||--Frame 2---||-Frame 2.5-||
In order to do either BFI or motion interpolation (AMP), the LCD display must be able to refresh its screen at least twice as fast as the video content, so that it has time to show two frames (either the actual frame plus a black frame, or the actual frame plus an interpolated frame) during the time that would normally be occupied by a single frame of video. Given that the fastest common video content runs at 60 frames per second, the LCD must be able to refresh at least 120 times per second. Voila! 120Hz LCDs. Thus, an LCD running at 120Hz doesn't, by itself, look any different than an LCD running at 60Hz (see the diagram above--the 60Hz LCD and the plain 120Hz LCD, without BFI or AMP, are displaying the same picture). If, however, the 120Hz LCD implements either BFI or frame interpolation, it may be able to reduce its motion blur.
Both of those techniques have tradeoffs. The black frame insertion method, which approximates the fade-to-black that happens on plasmas and tube TVs between each frame, lowers the overall brightness of the display (that should be intuitive, because if every other frame displayed is a black frame, the TV will be half as bright as it would be if it just displayed the frame of content the entire time). The frame interpolation method increases the frame rate of the content, which can give the picture the so-called "soap opera" or "camcorder" effect, making the content look overly smooth. I can get into more about why the effect is called those things if anyone is curious.
So why, then, create a 240Hz TV? More flexibility for implementation of BFI or frame interpolation techniques. You could, for example, do things like this:
240Hz w/BFI:_ |--Frame 1---||--Frame 1---||--Frame 1---||Black Frame||--Frame 2---|
240Hz w/AMP: |--Frame 1---||-Frame 1.25-||-Frame 1.5--||-Frame 1.75||--Frame 2---|
In the case of BFI, you now only have the TV going dark 25% of the time, rather than 50% of the time as with a 120Hz TV implementing that technique, which would allow for higher brightness but still attempt to simulate the plasma fade-out between frames. For motion interpolation, the TV is essentially doubling the frame rate of the content to make it look even smoother (which, if you don't like the too-smooth effect to begin with, might be worse).
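The brightness tradeoff under BFI is just the fraction of refresh slots spent dark. A quick sketch, assuming one black slot per content frame as in the diagrams above:

```python
# Fraction of display time spent dark under black frame insertion (BFI),
# assuming one black slot per content frame, as in the diagrams above.
def bfi_dark_fraction(refresh_hz: int, fps: int) -> float:
    slots_per_frame = refresh_hz // fps
    return 1 / slots_per_frame

print(bfi_dark_fraction(120, 60))   # 0.5  -- screen dark half the time
print(bfi_dark_fraction(240, 60))   # 0.25 -- dark only a quarter of the time
```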
You might also be wondering, why do plasmas, which don't suffer from this blurring problem, offer viewing modes of 48Hz or 96Hz? Why not just stick with their tried-and-true 60Hz refresh rate?
The answer lies with a particular type of video content: movies. Dating back to when movies were shot exclusively on film, they were (and many still are) filmed at 24 frames per second, much lower than the 30 or 60 frames per second at which TV broadcasts typically run. If you consider that most TVs run at 60Hz, you can probably immediately see a problem: how do you show 24 frames per second on a device that refreshes 60 times per second? It's like trying to fit a square peg into a round hole, since 24 does not divide evenly into 60. The traditional way of doing so is to show some frames more often than others. So, on a 60Hz TV, you might show Frame 1 two times, then show Frame 2 three times, for a result like this:
| Frame 1 || Frame 1 || Frame 2 || Frame 2 || Frame 2 |
Then you'd go on to show Frame 3 two times, Frame 4 three times, and so on. The result is that from 24 frames, you get 60 frames of content. This technique is called "3:2 pulldown". A 120Hz TV receiving that already-converted 60Hz signal would just double each of those repeats (Frame 1 four times, Frame 2 six times, and so on), though it's worth noting that 120 actually is an even multiple of 24, so a 120Hz panel that detects a 24 FPS source can instead show each frame exactly five times and skip pulldown altogether.
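The 3:2 cadence is easy to sketch in code (illustrative only, not any real TV's implementation):

```python
# Sketch of 3:2 pulldown: 24 film frames per second are shown alternately
# 2 and 3 times each to fill a 60Hz panel's 60 refreshes per second.
def pulldown_3_2(frames):
    shown = []
    for i, frame in enumerate(frames):
        shown.extend([frame] * (2 if i % 2 == 0 else 3))
    return shown

one_second = pulldown_3_2(range(1, 25))   # film frames 1 through 24
print(len(one_second))                    # 60
print(one_second[:5])                     # [1, 1, 2, 2, 2]
```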
Obviously this is not good. If you're showing some frames on screen for more time than others, any kind of scene that's showing a moving object, like a slow pan across the horizon, isn't going to be smooth. It's going to kind of stutter, since the panning effect will seem to speed up and slow down at alternating intervals.
The best scenario would be to have a TV that can refresh its screen at a multiple of 24. Voila! Plasma running at 48Hz or 96Hz, which can show 24 FPS content without having to show some frames more than others. Obviously, the TV must switch back to 60Hz when displaying normal TV content running at either 30 or 60 frames per second, so the TV typically has to try to figure out when it's being sent a 24 FPS signal.
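That divisibility argument is trivial to check:

```python
# Which refresh rates can show 24 FPS film with every frame held equally long?
for hz in (48, 60, 96):
    even = hz % 24 == 0
    print(hz, "evenly divisible by 24" if even else "needs uneven pulldown")
```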
TL;DR
The 600Hz "refresh rate" on plasmas doesn't actually refer to the display's refresh rate, but something entirely different. Plasmas generally refresh at 60Hz, but that's fine because plasmas, by nature of the display technology, don't suffer from motion blur. Some LCDs, running at either 120Hz or 240Hz, include technologies that attempt to reduce the effect of motion blur, but the byproduct of those technologies may not be desirable. The fact that the LCD is running at 120Hz or 240Hz, by itself, doesn't mean anything about whether the LCD will show motion blur. Those higher refresh rates just enable the LCD to implement these other technologies (either frame interpolation or black frame insertion) that may help with the blur.
Great advice on getting the TV calibrated. Everyone should consider calibration if they're serious about viewing quality.
OP: I much prefer plasmas because I find their picture to be more similar to tubes, some of the reasons for which I discussed above. Given that many LCDs now have glass laid on top of the panel, their superiority in bright rooms is diminished. And unless the TV is only or predominantly going to be used in a bright room, I would buy a TV that will look best under critical viewing conditions (movie nights in low lighting, for example), rather than buying a TV that will look marginally better in poor lighting conditions, but worse under better conditions.