I'm wondering if anyone has noticed this with their home theater setup, and if so, why it happens.

1) Rip a DVD to MPEG-4, DivX, MOV, AVI, whatever.
2) Play that video file (in QuickTime or VLC) on a MacBook Pro, DVI to S-video, to a standard-def tube TV (flat screen).
3) Play that same video file on an up-rezzing DVD player via the player's USB input (USB flash drive), component out, to the same TV.

The video looks much better in step 3, using the DVD player. Why? In dark areas especially, there's more, and blockier, compression noise coming off the computer. Obviously there's going to be some difference between S-video and component, but this is a crappy compressed AVI; I've seen S-video and component signals that look basically the same, and component isn't going to clean up compression noise anyway.

My speculations:

- The DVD player, being a piece of dedicated video-playback hardware, is simply capable of outputting a better signal than the MacBook Pro's graphics card (Radeon X1600, 128 MB).
- The DVD player's black and white levels are adjusted for NTSC video whereas the computer's aren't. Blacks appear blacker off the DVD player, so compression noise in dark areas gets hidden.
- The DVD player's up-rezzing cleans up some of the compression noise.

Sorry for the long-winded post, but this is hard to explain. I was considering buying a Mac mini to build a home theater system, but not if my DVD player does such a better job. Thanks!
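For what it's worth, the levels speculation is the easiest one to sketch. NTSC/BT.601 video stores black at luma code 16 and white at 235 ("limited range"), while computer graphics pipelines typically treat pixel data as full range, with black at 0. Here's a rough, purely illustrative Python sketch of why that difference could lift near-black compression noise into view on the computer; all the sample values are made up, and this is a model of the idea, not of what either device actually does internally:

```python
# Sketch of the "black levels" speculation. BT.601 limited-range luma
# puts reference black at code 16 and white at 235; PC graphics assume
# full range, black = 0, white = 255. Sample values are hypothetical.

def limited_to_full(y):
    """Expand limited-range luma (16-235) to full range (0-255),
    clipping anything below reference black."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

# A near-black macroblock with typical compression noise riding on it:
noisy_blacks = [16, 18, 21, 17, 24, 16, 19, 22]

# A device that honors video levels maps reference black (16) to true
# display black, so the noise sits right at the bottom of the scale:
video_levels = [limited_to_full(y) for y in noisy_blacks]

# A device that treats the same samples as full-range PC levels leaves
# black at code 16 -- a visibly elevated gray pedestal:
pc_levels = noisy_blacks

# Approximate on-screen luminance with a 2.2 display gamma. Noise near
# true black lands at luminance values too small to see; the same noise
# on a gray pedestal is orders of magnitude brighter.
def luminance(code):
    return (code / 255) ** 2.2

print("video levels:", [round(luminance(c), 5) for c in video_levels])
print("pc levels:   ", [round(luminance(c), 5) for c in pc_levels])
```

Under this (hypothetical) model, the DVD player's correctly mapped blacks push the noise down where the display can't reproduce it, while the computer's elevated black floor keeps every noisy block visible, which matches what the original post describes seeing in dark scenes.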