News flash: ADC is DVI - read on.
VGA is an analog signal, and gives "soft" on-screen results with LCD monitors - not as crisp as a digital signal. Going "VGA to DVI" does not turn the analog VGA signal into a high-quality digital signal; it just pipes the VGA signal into some pins in the DVI-I connector that are there to carry old-style analog signals. So "VGA to DVI" is a misnomer - you might as well use VGA to VGA if the monitor has an analog VGA port.
ADC and DVI (digital) carry the SAME digital video signals; they just have different plug configurations, and ADC also carries power and USB lines that DVI does not.
So it is easy to go ADC to DVI - a simple adaptor (about $25) routes the ADC digital signals to a DVI jack. This gives you a pure digital signal, sharper than a VGA connection.
The thing you have to check is whether your video card has enough RAM to support the native resolution of the monitor. ATI Rage video cards in early G4 towers did not have enough VRAM to run 20" widescreen monitors at native resolution, despite their ADC digital outputs.
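You can do a rough back-of-envelope check of the framebuffer requirement yourself. The sketch below is a minimal calculation, assuming a 20" widescreen at 1680 x 1050 (check your monitor's spec sheet for its actual native resolution); note that VRAM is only one constraint - the card's output hardware also has a maximum supported resolution.

```python
def framebuffer_bytes(width, height, bits_per_pixel=32):
    """Bytes of VRAM needed to hold one frame at the given mode."""
    return width * height * bits_per_pixel // 8

# Assumed native resolution of a 20" widescreen Cinema Display:
mb = framebuffer_bytes(1680, 1050) / (1024 * 1024)
print(f"{mb:.1f} MB per frame")  # roughly 6.7 MB at 32-bit colour
```

Double buffering and 3D acceleration multiply that figure, which is how a card with seemingly "enough" VRAM can still fall short.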
Remember that LCD monitors as a group perform poorly if they are set to a resolution other than the native resolution. If you are not maintaining a 1:1 ratio of video pixels to the actual LCD pixels, then the monitor has to interpolate the signal and assign a video pixel to more than one LCD pixel. Some, or many, of the video pixels will span 2 or more LCD pixels, and their edges will overlap with others on the same LCD pixel, which compromises the quality and makes the screen results look grainy and blurry.
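You can see the overlap problem with a small sketch. The function below (a hypothetical helper, not anything the monitor actually runs) works out which physical LCD pixels each video pixel covers when a row of source pixels is stretched across a wider row of LCD pixels:

```python
def pixel_span(src, dst):
    """For each of `src` video pixels stretched across `dst` LCD pixels,
    return the (first, last) LCD pixel index that video pixel covers."""
    scale = dst / src
    spans = []
    for i in range(src):
        first = int(i * scale)              # LCD pixel where this video pixel starts
        last = int((i + 1) * scale - 1e-9)  # LCD pixel where it ends
        spans.append((first, last))
    return spans

# Stretch 3 video pixels across 4 LCD pixels (a non-native mode):
print(pixel_span(3, 4))  # [(0, 1), (1, 2), (2, 3)]
# At native resolution the mapping is clean, one-to-one:
print(pixel_span(4, 4))  # [(0, 0), (1, 1), (2, 2), (3, 3)]
```

In the non-native case every video pixel spans two LCD pixels and shares an LCD pixel with its neighbour - that shared, blended pixel is exactly the graininess and blur described above.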
There are two flavours of the DVI jack - DVI-I and DVI-D. DVI-I includes pins for the analog VGA signal. DVI-D has only the pins for the digital signals and is incapable of passing through VGA signals. Most monitors and cards with DVI jacks are DVI-I but there are exceptions.
So:
If you have an ADC video card and a DVI-I or DVI-D monitor, then use an ADC-to-DVI adaptor and a DVI-to-DVI cable.
If you have a DVI video card and an ADC monitor, you're mildly hooped, because although the video signal is the same, the monitor relies on power that is not present in the DVI jack. There are some moderately expensive DVI-to-ADC adaptors which have an AC supply to feed in the required power.
http://www.addlogix.com/peripheral_sharing/embed.asp?stuff=se_adc_body.html
If you have a VGA video card and a monitor with a VGA jack, use VGA to VGA; there's no point in anything else. If the monitor has a DVI-I jack but no VGA, then you can use a VGA-to-DVI adaptor, but as noted it will still be a VGA signal. If the monitor is DVI-D only, you're hooped.
If you have a DVI-I output only on your video card and a VGA monitor, you can use a DVI-to-VGA adaptor to extract the VGA signal, then go VGA to VGA to the monitor. You would use this, for example, to hook up a second VGA monitor if one was already using the card's VGA port.
If you have an ADC card and a VGA monitor, you can similarly get an ADC-to-VGA adaptor.
More detail on what goes with what is here:
http://docs.info.apple.com/article.html?artnum=58692