This is only an issue with cables. You need to be sure to buy DVI-D cables.
This could have and should have been handled better. There are three types of DVI:
DVI-D - The digital format we know as DVI
DVI-I - Many video cards have this; it carries both digital DVI and analog video
DVI-A - Analog only.
DVI-A connectors don't really exist in practice, but they're in the specification. The problem that comes up is that many cables are DVI-I (they have the extra analog pins), while the connectors on digital-only devices are DVI-D and do NOT have the holes for those analog pins - so a DVI-I cable won't physically plug in.
The fix would have been simple had it been written into the standard: ALL devices should use the DVI-I connector even if they only support digital. Then the extra pins, even if unused, would have somewhere to go. The DVI-D connector should not exist, but it does, so we have to live with it.
Thankfully, there's a simple answer: unlike in the early days of DVI, it's now easy to find a DVI-D cable (one without the analog pins).
Boo to the manufacturers who decided that a connector blocking the insertion of a DVI-I cable (which would have worked just fine) was a good idea...
P.S. I can understand WHY the DVI-D connector seemed like a good idea to some engineer - one quick look at the pins tells you whether a device supports analog or not. Great idea, EXCEPT WHEN YOU HAVE A DEVICE WITH A DVI-D CONNECTOR AND A DVI-I CABLE! Like many ideas that sound good on paper, it's just frustration in the real world!