
kingofkolt
macrumors 6502, Original poster
May 2, 2007, Boston, MA
Hi all,

I just got my MacBook Pro on Monday and installed Boot Camp + Vista on it yesterday (for the sole purpose of gaming :D ). Now I've plugged in my external monitor, a Samsung SyncMaster 931BW 19" widescreen LCD, and installed the driver for it. For some reason, though, Vista will only let it go up to 1024x768, even though the screen's native resolution is 1440x900. I've installed and reinstalled the screen's driver several times, and Vista even recognizes the model name of the screen. When I installed the driver, I made sure to pick the one ending in "(Digital)", not "(Analog)", because the monitor is connected over DVI. But it still only allows up to 1024x768. Samsung.com has a support page for the screen suggesting that if you can't set a higher resolution, your video card may not support resolutions that high. That's obviously not the case here: the MBP's built-in display runs natively at 1440x900, and the machine is supposed to be able to drive a 30" Apple Cinema Display.

I also tried setting the external monitor NOT to scale the image to full screen, but instead to show it with a black border around it (if it won't do the highest resolution, I'd at least like it to look sharp). But when I did that, the external monitor didn't respond at all.

On Mac OS X, on the other hand, as soon as I logged into my account and opened System Prefs > Displays, it listed the resolutions that work with my screen, and naturally the default selection was 1440x900. I didn't have to install drivers or anything; it just knew what resolution to use (I know, throw out Vista, right? I need it for games, dammit! :D ).

Well anyway, I played with this for hours yesterday and got nowhere, so I'm kind of at my wits' end, and I was wondering if any of you guys have advice. Could the Boot Camp driver for the Nvidia GeForce 8600M GT be faulty? I only tried reinstalling the monitor's drivers, not the GPU's. I'm a little wary of reinstalling those, though, since the GPU driver has a whole control panel and I don't want to lose it by uninstalling.

Advice please?

EDIT: I've also noticed that it sometimes doesn't even keep track of which monitor I'm configuring. For example, I'll tell it to use just one monitor by itself, and set that one monitor to be the Samsung SyncMaster. When I push Apply, it changes my laptop display to 1024x768, as if THAT were the Samsung (and the Samsung doesn't even turn on!). Meanwhile, mirroring (displaying the same thing on both screens) doesn't work at all, at 1024x768 or 1440x900. I have no idea what the heck is going on...
 
Did you try the advanced settings for the video card? You might be able to force the correct resolution in there.
 
You mean the NVIDIA control panel? Yeah, I tried that; that's where all the problems I listed above happened (mirroring not working, wrong resolution, etc.). I've tried both the NVIDIA control panel and the regular built-in Windows Display dialog.
 
There's probably also an advanced section in the drivers where you can manually set a resolution, override settings, that sort of thing.

I'm not sure where it is, as I think Nvidia recently changed their control panel around (theoretically to simplify it, though I liked the old one).

I don't know. Windows seems MUCH flakier with getting dual monitors to work right, but still, this should work :confused:
 
I pretty much explored the entire NVIDIA control panel and didn't see any options like that; it's exactly what I was hoping to find. Maybe I'll try to get newer drivers from nvidia.com... :(
 
Maybe some "weird" resolutions are hidden for some reason? Have you tried right-clicking the Nvidia icon in the systray and changing it from there?
 
Same problem here. I have the latest MacBook Pro and tried using my Dell 2007WFP 20" widescreen. The highest resolution I can select is 1024x768. I had a similar issue with my old Asus notebook, and that one seemed to be related to the graphics card's drivers. Hopefully Apple releases an update...
 
Have you used the dual-monitor setup in Mac OS X? See if it runs at native resolution there; if it does, I'd say it's a Vista driver problem. You could also try installing XP (if you can be bothered) and see if it runs natively there. That would make it easy to identify the problem.
 
Yes, it runs natively at 1680x1050 on Mac OS X. Seems Apple forgot to add my display's resolution to the nv4_disp.inf installation file. Well, I've got Mac OS :)
 
Have you used the dual-monitor setup in Mac OS X? See if it runs at native resolution there; if it does, I'd say it's a Vista driver problem. You could also try installing XP (if you can be bothered) and see if it runs natively there. That would make it easy to identify the problem.

Yeah, Mac OS X recognized the native resolution instantly, as I said in my original post. It certainly seems to be a Vista problem, or at least a Vista-on-Boot-Camp problem. I had a similarly hard time getting this screen to work on my old Dell under XP, but that eventually turned out to be a matter of getting a DVI cable instead of a VGA one, plus some funky display-settings tweaks. Here I'm just at a loss. I hope Apple and/or NVIDIA update their drivers SOON.

Yes, it runs natively at 1680x1050 on Mac OS X. Seems Apple forgot to add my display's resolution to the nv4_disp.inf installation file. Well, I've got Mac OS :)

Wait, is there an INF file that I can just edit to fix the problem? If so, where is it?!
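
(From some googling while waiting: monitor "drivers" are apparently just INF files that write the supported modes into the registry. I haven't opened Samsung's actual file, so the section name and frequency ranges below are my guesses for the 931BW, but the entries are supposed to look roughly like this:

    [SM931BW.AddReg]                ; hypothetical section name
    HKR,,MaxResolution,,"1440,900"  ; highest mode the driver will offer
    HKR,"MODES\1440,900",Mode1,,"30.0-81.0,56.0-75.0,+,-"  ; H range in kHz, V range in Hz, sync polarities

If Samsung's INF really does have a 1440,900 line like that, the monitor driver isn't the bottleneck and it must be the NVIDIA side ignoring it.)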
 
I'm subscribing to this thread because my SR MBP is doing the same thing with Vista Ultimate. Can't get Vista to recognize the 20" Cinema Display...
 
I would attempt to change some of the refresh rates. That's something you can mess around with.

It's most definitely a Vista thing. I've had problems with my display drivers and such, and I was able to get them figured out by going through the Dell site. I'm not using Boot Camp, though; this is my PC.
 
Just a quick thought, are the drivers you installed the latest version (off the website or off the install disk)?

Also, perhaps a question people can mock me for: is it possible that the monitor isn't Vista-compatible? No idea, just wondering, because if OS X sees the native resolution, then why wouldn't Vista (obviously it works fine under OS X)?
 
I would attempt to change some of the refresh rates. That's something you can mess around with.

It's most definitely a Vista thing. I've had problems with my display drivers and such, and I was able to get them figured out by going through the Dell site. I'm not using Boot Camp, though; this is my PC.

I did change the refresh rates (even though I know 60Hz is the right one), but no dice.

Just a quick thought, are the drivers you installed the latest version (off the website or off the install disk)?

Also, perhaps a question people can mock me for: is it possible that the monitor isn't Vista-compatible? No idea, just wondering, because if OS X sees the native resolution, then why wouldn't Vista (obviously it works fine under OS X)?

The graphics card driver is from Boot Camp, which was updated just a week or two ago, and the screen's driver is from the disk the screen came with. I also downloaded the latest screen driver from Samsung.com, but it does no good. (Does anyone know whether I should install the monitor driver over "Generic PnP Monitor" or "Generic non-PnP Monitor"? I think I've tried both, but I'm not sure what that stuff means.)

I don't think a monitor can just not be compatible with an OS. It's a matter of getting the right drivers for it, it seems.
 
I've also been experiencing the same problem with my 1920x1200 24" display. Does anyone know if this problem is exclusive to Vista, or if it also affects XP? If XP isn't affected, I might just switch back. On a side note, it really does seem that Apple/NVIDIA botched the drivers, as can be seen from the Civ IV crashes and the slowdowns in shooters like Call of Duty 2.
 
I'm going to chime in as experiencing this as well.

My set-up:

- Acer AL2216Wbd 1680x1050 (22" widescreen LCD DVI monitor)
- MacBook Pro 15" (2.2GHz, 2GB, nVidia 8600M GT)
- Boot Camp 1.3
- Windows Vista Home Premium upgrade retail

Same problem: I can only get up to 1024x768 on the external monitor, even though the nVidia setup correctly identifies it as an AL2216W. Vista's Device Manager identifies the AL2216W as Generic non-Plug and Play and the MacBook Pro's display as Generic Plug and Play. In OS X, everything works as expected: I can get the full native 1680x1050 on the external monitor either by extending the desktop, or by putting the system to sleep, closing the lid, and waking it up on the external monitor.
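
(One more diagnostic, and I'm going from memory here, so double-check the exact path: you can see whether Windows actually read the monitor's EDID over the DVI cable by looking in regedit under

    HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\<monitor ID>\<instance>\Device Parameters\EDID

If an EDID value is there for the external display, Windows knows its supported modes and the NVIDIA driver is just ignoring them; if it's missing, the DVI handshake itself is failing, which might also explain the "Generic non-Plug and Play" label.)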
 
External Monitor on MacBook Pro

Hello,

Spent some time on this today and finally got it to work. I'm using a MacBook Pro 17", Boot Camp 1.3, and Windows XP SP2 with an external flat-screen Sony monitor (it's an older model).

First, I downloaded/installed the ATI drivers from here:
http://ati.amd.com/support/driver.html

Lots more options in the ATI UI than I found in the Apple-provided drivers.

After the install I could right-click on the desktop, choose ATI Catalyst Control Center, and then enable another display. Note that there were two additional options for external displays, and it only worked when I chose the second.

When it prompted me, I selected the "clone" option. Now my monitor displays at a much higher res than it did with the initial Boot Camp release.

Hope this helps.
Josh.
 
Hm... I've tried basically the equivalent of this, except that the new MBPs use NVIDIA graphics instead of ATI, so I had to download the NVIDIA drivers. Two problems, though: first, the new NVIDIA drivers don't install; the installer says it can't find driver software compatible with my hardware. Second, when I change the same settings you did (for example, the clone option), nothing shows up on my external screen. The only option that gets anything to appear on it is "dual-view"/extended desktop (i.e., combining the two monitors into one giant widescreen desktop), and even then the resolution on the external screen is wrong. :mad:

EDIT: I also tried the link provided by mundizzle, but to no avail...
 
Hm... I've tried basically the equivalent of this, except that the new MBPs use NVIDIA graphics instead of ATI, so I had to download the NVIDIA drivers. Two problems, though: first, the new NVIDIA drivers don't install; the installer says it can't find driver software compatible with my hardware.

You have to download modded INF files so you can install Nvidia's regular drivers. If anything, there's better support on the web for doing that than for ATI cards.
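
To give you an idea of what the mod actually is (going from memory here, and section names differ between the XP and Vista INFs): the retail INF simply doesn't list the mobile GPUs' PCI device IDs, so the installer refuses to match the card. The modded files just add a line for each mobile chip, something like this for the 8600M GT (the install-section name here is made up, and 10DE&DEV_0407 is what I believe the 8600M GT reports):

    ; device match added to the hardware IDs section
    %NVIDIA_G84.DEV_0407.1% = nv4_G84, PCI\VEN_10DE&DEV_0407

    ; friendly name added to the [Strings] section
    NVIDIA_G84.DEV_0407.1 = "NVIDIA GeForce 8600M GT"

That's all the "mod" is; the driver binaries themselves are untouched.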
 
Do those other drivers fix the problem?

EDIT: I also tried the link provided by mundizzle, but to no avail...

To elaborate... I downloaded the modded INF, and the new driver did install, but the screen resolution options stayed the same; the max external resolution was still 1024x768. On top of that, the screen became very dark. After several heart attacks, I realized I could reset the color settings to default, and the screen went back to normal. But now I'm a little wary of downloading anything "modded".
 