I am not going to argue, so this will be my last post on the subject. My DVI cable was hooked up correctly. It looked fine and exactly the same as when connected by VGA. You seem to be under the assumption that DVI/HDMI is always a better method of transporting signals than analog cables such as VGA or component. The truth is that it's more complicated than that. Have a look at this article written by the guys at Blue Jeans Cable. They really know their stuff.
http://forum.ecoustics.com/bbs/messages/34579/122868.html
After you brought up your point originally, I actually Googled and found this article as well (it was one of the top articles found by Google, and I'm assuming you did the same).
Two key points discussed in that article are the method of conversion used by the television set and the type of cable carrying the signal. Let me also start by saying that his primary comparison is between a digital signal and an analog signal carried specifically by a component cable, not a VGA cable. This is a very important distinction that does in fact matter.
The first point is conversion. Yes, it is true: even a digital signal requires some conversion upon reception, depending on the television you are using and how it is configured. However, that article was written in 2005, and digital signal processing has advanced over the past 2-3 years. This is also the most ambiguous portion of the article, and he goes on to say the following:
As a general rule, with consumer equipment, one simply doesn't know how signals are processed, and one doesn't know how that processing varies by input. Analog and digital inputs must either be scaled through separate circuits, or one must be converted to the other to use the same scaler. How is that done? In general, you won't find an answer to that anywhere in your instruction manual, and even if you did, it'd be hard to judge which is the better scaler without viewing the actual video output. It's fair to say, in general, that even in very high-end consumer gear, the quality of circuits for signal processing and scaling is quite variable.
The next point made in the article, which is 100% true, concerns the way a DVI cable is designed and carries its signal. However, it is important to understand that DVI and VGA cables share the same problem: signal quality is lost over distance. A component cable, on the other hand, can carry a signal over a much greater length while maintaining its quality. For that reason, this part of the argument does not apply when comparing a DVI cable with a VGA cable.
As I mentioned before (and as mentioned in the article you referenced), the result you saw is probably due to one of two things: 1) your setup and configuration, and 2) the television you are using. Yes, depending on your setup, it is actually possible to get equal picture quality from VGA and DVI cables, which is why I asked in my last post what your setup was.
The point I was trying to make was that under optimal conditions, a digital connection will, in fact, provide better picture quality than an analog one. Is this necessarily always the case? No. Hope that clarifies things!
