View Full Version : DVI-VGA - How much loss?

Sep 21, 2003, 04:26 PM
I am looking to plug a CRT (Sony FW900) into an Apple G5. I am assuming that the G5 has only DVI outputs on the back. I know I am going to need a small adapter to convert the signal.

Now, I know everything is digital these days, and that those DVI ports on the back were made for a digital display. Can someone explain to me what happens during the conversion from DVI to VGA? I am studying to be a graphic designer, and would like to know how much information is lost during the conversion.


Sep 21, 2003, 04:51 PM
The little cross of pins on the DVI port (output, of course!) carries the VGA signal. Using VGA from this port is no worse than normal VGA. Of course you lose the ultra-precise pixel lock of DVI, but there is no way of keeping that with VGA anyway.

Sep 21, 2003, 04:57 PM
Oh, so there is a VGA port on the back?! :eek:

Sep 21, 2003, 05:11 PM
Sort of. The DVI connector is designed so that it also carries VGA signals along with the digital video signals. This is how a DVI-equipped PowerBook can connect to a VGA monitor: you use an included adapter that is DVI on one end and VGA on the other. The adapter does nothing more than take the output of some of the DVI pins and place it on the appropriate VGA pins.

There is no VGA connector on the back of any Mac, but the DVI connector itself outputs a VGA signal.
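To make it concrete, a passive DVI-to-VGA adapter is nothing but straight wires. Here is a sketch of the standard pin mapping (contact names from the DVI 1.0 spec; the ground wiring is simplified to one entry):

```python
# A passive DVI-to-VGA adapter does no conversion at all: the DVI-I
# connector's analog contacts (the "cross" of pins, C1-C5, plus pin 8)
# already carry the same signals a VGA plug uses. The adapter just
# wires them straight through to the matching VGA pins.
DVI_TO_VGA = {
    "C1": ("analog red",    1),   # -> VGA pin 1
    "C2": ("analog green",  2),   # -> VGA pin 2
    "C3": ("analog blue",   3),   # -> VGA pin 3
    "C4": ("analog hsync", 13),   # -> VGA pin 13
    "8":  ("analog vsync", 14),   # -> VGA pin 14
    "C5": ("analog ground", 5),   # ground return (VGA has several ground pins)
}

for dvi_pin, (signal, vga_pin) in sorted(DVI_TO_VGA.items()):
    print(f"DVI {dvi_pin:>2} -> VGA pin {vga_pin:>2} ({signal})")
```

Since no circuitry touches the signal, the image quality is exactly what the G5's graphics card puts out on its analog pins.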

Get it?

Sep 21, 2003, 08:17 PM
Yep, thank you for that information. :D :D :D