I just got a Mini DisplayPort to DVI adapter and tried to hook up a 24" Acer monitor. The monitor works fine via DVI from a Mac Pro at 1920x1200, but it seems to be detected as a TV when connected to the MacBook Pro: the display preferences offer an "overscan" option, for instance, and every available resolution is lower than 1920x1200 (it tops out at 1920x1080i (yes, interlaced) or 1680x1052p (yes, 1052)). The monitor is identified by model number in the display preferences, so it's not just throwing up a generic list of resolutions.
The monitor runs fine at 1920x1200 (non-interlaced) through a Mini DisplayPort to VGA adapter, but I don't want to stay on an analog connection. I'd like to catch up to five years ago and use digital.
What's the deal and how do I fix it?
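In case it helps with diagnosis, here's a rough sketch of how I've been trying to compare what each machine actually receives from the monitor during the handshake. It pulls the raw EDID blob out of the IORegistry via `ioreg`; the `IODisplayEDID` property name and the output format it parses are assumptions based on what I've seen on other Macs, not anything Apple documents for this adapter.

```python
#!/usr/bin/env python3
"""Rough sketch: dump the EDID blocks macOS has read from attached displays.

Assumes the display driver publishes an IODisplayEDID property in the
IORegistry. Comparing the EDID seen through the Mini DisplayPort -> DVI
adapter against the one the Mac Pro sees over plain DVI should show
whether the adapter is mangling the handshake (and making the monitor
look like a TV).
"""
import re
import subprocess

def read_edids():
    """Return the raw EDID byte blobs found in the IORegistry."""
    out = subprocess.run(["ioreg", "-lw0"], capture_output=True,
                         text=True, check=True).stdout
    # ioreg prints data-typed properties as hex between angle brackets.
    blobs = re.findall(r'"IODisplayEDID" = <([0-9a-fA-F]+)>', out)
    return [bytes.fromhex(b) for b in blobs]

for edid in read_edids():
    # A valid EDID starts with the fixed 00 FF FF FF FF FF FF 00 header.
    header_ok = edid[:8] == bytes.fromhex("00ffffffffffff00")
    print(f"{len(edid)}-byte EDID, header ok: {header_ok}, hex: {edid.hex()}")
```

Running this on both the MacBook Pro (through the adapter) and the Mac Pro (over straight DVI) and diffing the hex output should at least show whether the two machines are being handed different timing data.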
On a side note, I'm pretty disgusted with the whole DisplayPort thing. Previous MacBook Pros I've worked with had DVI-I connectors and came with a DVI to VGA adapter in the box: I could hook up a DVI monitor directly with no adapter at all, and I could plug in an analog display using the bundled adapter. Now I have to go out and spend $60 (if I buy Apple's adapters) just to connect standard digital and analog displays, and the damn thing still doesn't work right.