The main reason for the return is that turning the brightness down to a comfortable level affects color and sharpness, making white appear gray. This is unacceptable to me. If you can stand the strong brightness, I guess the 245T would be OK. But I need a display that stays sharp and produces normal colors even when set to a low brightness.
You're out of luck on this one, because of the way displays work: lower the brightness on ANY LCD and white will shift toward gray. The white you see is the light of the backlight shining through the panel. If you turn down the brightness, you turn down the backlight, which means a dimmer, and less white, white.
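To illustrate the point: an LCD's white level is roughly the backlight output times the panel's fixed transmittance, so white dims in direct proportion to the backlight. A toy Python sketch (the numbers here are made up for illustration, not measured from any panel):

```python
def white_luminance(backlight_nits, panel_transmittance=0.05):
    """Approximate white level: the open (white) pixel passes a fixed
    fraction of the backlight, so white scales linearly with it."""
    return backlight_nits * panel_transmittance

full = white_luminance(8000)          # hypothetical backlight at 100%
dimmed = white_luminance(8000 * 0.3)  # backlight turned down to 30%
# White drops by the same 70% -- next to anything brighter, it reads as gray.
```

There is no setting that avoids this; dimming the backlight dims white by definition.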
I've also noticed that the 30" emits a tremendous amount of heat. So much that it bothered me when I sat in front of one. Living in LA, I don't need more heat!

What's your experiences with the 30" in this regard? Should I rethink it?
Six 22" LCDs will use more power/put out more aggregate heat than two 30" Cinema Displays I'm sure.
So, do you guys have any suggestions for 20" or 22" displays that are sharp, don't strain your eyes, and have fairly accurate colors without being extremely bright? Also, the higher the resolution, the better. And the cheaper, the better.
Try BenQ. I love my 22" display. Better color and sharpness than my mom's old Mitsubishi Diamondtron (those displays rivaled Trinitrons in performance back in the day, when a 15" LCD cost you $1,000).
Also, does anyone know why the 245T has more motion blur when watching DVDs than the old 23" Apple?
And if someone could give the lowdown on the advantages/disadvantages of the various LCD panel types (TN, S-IPS, S-PVA, etc.), that would be great too.
The blurring is caused by poor pixel response. With CRTs, the image is drawn in a scanning motion, and the refresh rate is how fast each pass is: the higher, the better. On screens where the refresh rate was low, you'd notice a lot of flickering and get a headache from eyestrain. If the refresh rate got low enough and you were dealing with an interlaced scan, you'd see the image tear, since it would change before the screen had finished drawing.
LCDs work differently. When a pixel needs to change, there's a period of latency involved (the pixel response time). If the latency is too great, you'll get ghosting, muddy video, or tearing. The first two are basically the problems you're seeing. Tearing is most evident when playing a game like Quake 3 on modern hardware, where even the most basic current GPU can push astronomically high frame rates.
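A rough way to see why slow pixels ghost is to compare the panel's response time with how long each frame is on screen. A simplified Python sketch (the threshold model is an assumption; real ghosting depends on the transition, not a single number):

```python
def frame_time_ms(fps):
    """How long each frame is displayed, in milliseconds."""
    return 1000.0 / fps

def will_ghost(response_ms, fps):
    """If a pixel takes longer to settle than one frame lasts,
    it is still mid-transition when the next frame arrives."""
    return response_ms > frame_time_ms(fps)

# A 25 ms panel at 60 fps: only ~16.7 ms per frame, so pixels lag behind.
# An 8 ms panel at 60 fps settles comfortably within the frame.
```

This is why a panel that looks fine on static text can still smear fast motion in DVDs and games.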
http://en.wikipedia.org/wiki/TFT_LCD
That should take care of your other question.
Yes. Sad to say, but Samsung utilize poorer quality photons. These go floppy at lower energy levels, leading to a perceived lack of sharpness.
The nicest way I can put this is that I don't think you understand the subject well enough to comment on it.
A photon doesn't have a perceivable level of quality. It is unto itself, no different from the one traveling right next to it except in its direction or wavelength (and a few other things dealing in the physics of photons that are beyond the scope of this thread).
I think what you mean to say is that they're using lower-quality lighting and panel technology... and on those points I greatly disagree with you. They're not a DLP screen by any means (which are my current favorite in terms of HDTV performance), but they're way above no-name panels.
I'm using horrible HP panels at work, which used to be very, very bright as well. Turning down brightness and fiddling with contrast made the colors look like crap.
My solution was to, instead of turning down brightness, decrease each individual RGB value down to 60-70%.
--Erwin
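Erwin's workaround, dimming in software by scaling each RGB channel rather than turning down the backlight, can be sketched like this (the 60-70% factor is from his post; the helper function itself is hypothetical):

```python
def scale_rgb(pixel, factor=0.65):
    """Dim a pixel in software by scaling each channel toward black,
    leaving the backlight (and its effect on contrast) untouched."""
    return tuple(min(255, round(c * factor)) for c in pixel)

white = (255, 255, 255)
dimmed_white = scale_rgb(white)   # roughly (166, 166, 166)
```

The trade-off is that you lose tonal range: every channel now tops out well below 255, so bright detail gets compressed.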
Sounds like a very time- and effort-intensive way of tweaking your display's gamma. I assume you're on a Windows PC at work? ATI's Catalyst Control Center and NVIDIA's ForceWare control panel both have calibration tools to help set your gamma and color levels to more appropriate values without all the effort. If you're on Intel integrated graphics or something else, you'll have to see what tools, if any, they provide.
After living with the two new Samsung 245Ts for a few days, I have decided to keep them. The image is a little better now that I've fiddled extensively with the settings. I've also changed the lighting in my room, which lets me use them at higher brightness. The image, both color and sharpness, definitely looks better with the brightness cranked up. The only question is whether you can stand it. I know I can't, but I think most casual users will have no problem with it. Especially young people with good eyes.
However, I'm not keeping them because I think they are great. I'm keeping them because it's a hassle to return them and I don't know what to replace them with (I would have bought Apple displays if they were cheaper). However, when the next generation of Apple displays is finally out (please, please, higher resolution per inch and resolution independence in Leopard), it's hasta la vista to the Samsungs!
For anyone who is interested, today I also hooked up a 245T to an HD cable box and an upconverting DVD player via HDMI. I was not impressed. From 4-5 feet away, the image looked OK. But from 2 feet away, it wasn't very pretty. The image had a lot of motion blur and grain, and the compression artifacts from the source material were very evident. However, you probably can't compare the image of the 245T to that of an LCD TV, as I assume there is no video processor in the Samsung for de-interlacing and helping out with motion blur and all that.
You should test an unscaled movie to be sure of your results if you haven't already. The blur is once again poor pixel response time, but the rest of the image faults you're mentioning sound like a poorly implemented image interpolator (hence my comment on viewing an unscaled movie). There shouldn't be any artifacts from the encoding unless it's a home made DVD that was encoded at a very, very, low bit rate, or their upscaling code is simply worthless.
De-interlacing doesn't have anything to do with the panel itself; it has more to do with the player/decoder hardware. Some players, for example, can't do 1080p because their hardware isn't up to the task, so they'll advertise progressive scan, but only at 720 resolution. Interlacing on commercial DVDs uses the "soft" telecine method, which means it's all done in software: the video is encoded in progressive-scan format and the decoding software takes care of the interlacing. More than likely, since you're connected via a digital interface, your movie is actually playing in progressive-scan mode unless you've selected otherwise, making arguments about interlacing quite pointless. However, if you've set it to interlaced mode, your complaints may simply be because you're used to progressive scan and don't realize it. Interlaced video is less precise than progressive scan and can jitter. The reason interlacing is used is that it needs less bandwidth.
http://en.wikipedia.org/wiki/Progressive_scan
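For the curious, the classic way 24 fps film becomes 60i video is 3:2 pulldown: frames alternately contribute two then three fields, and with soft telecine the disc just stores flags telling the player to do this. A simplified Python sketch (field handling is reduced to repetition; real pulldown interleaves odd/even fields):

```python
def pulldown_32(frames):
    """Map 24 fps progressive frames to 60i fields via 3:2 pulldown:
    frames alternately contribute 2 then 3 fields (A-A, B-B-B, ...)."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeat)
    return fields

# 4 film frames -> 10 fields, i.e. 24 frames/s -> 60 fields/s on average.
fields = pulldown_32(["A", "B", "C", "D"])
```

The 2-3 cadence averages 2.5 fields per frame, which is exactly the 24-to-60 ratio; a good de-interlacer detects this cadence and reassembles the original progressive frames.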
I find your "young eyes" comment interesting when coupled with your earlier one asking why they don't make higher-resolution 22" panels. One reason for having a lower PPI on a larger screen is that it's easier on old eyes, or on those with poorer vision: without losing screen real estate, everything becomes easier to read. My dad loves using his large HDTV as a monitor because it's 13-something by 10-something and 30-something inches across, so he can read text without glasses.
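The PPI point is easy to quantify: pixel density is the diagonal pixel count divided by the diagonal size in inches. A quick Python check (the two example screens below are illustrative, not anyone's actual hardware):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the screen diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 24" 1920x1200 monitor vs. a 32" 1366x768 HDTV used as a monitor:
dense = ppi(1920, 1200, 24)   # ~94 PPI: small, crisp text
coarse = ppi(1366, 768, 32)   # ~49 PPI: big text, readable without glasses
```

Roughly half the density means every UI element is about twice as large at the same viewing distance, which is exactly the "easier on old eyes" effect.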
----
I hope all this information helps. For those interested in why I bothered learning all of this... well, it helps to know the "why" behind the things you do when editing/encoding video.