
RMo

macrumors 65816
Original poster
Aug 7, 2007
1,254
281
Iowa, USA
Hey everyone,

I just got my new MacBook today (well, it's refurbished, but it's new to me, and a nice replacement for my old iMac G5), and I'm having problems with an external display that previously worked fine with my PC laptop.

The monitor is a 20-inch widescreen Acer, model AL2016W (here it is on Newegg), with a native resolution of 1680x1050. It accepts both VGA and DVI-D inputs. I have a mini-DVI-to-DVI adapter on the MacBook, connected to the DVI-D cable included with the monitor, which is in turn connected to the monitor. Nothing comes up on the external monitor, however, except a "No Signal" message. Something must be getting through, though, because the MacBook detects what kind of display it is, and the "No Signal" message appears right after I hook it up (meaning the monitor must have sensed an attempt at a signal: the message disappears after a few seconds, and it wouldn't have appeared at all when I connected the computer otherwise).
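For anyone who finds this thread later on a newer Mac: you can ask the OS directly which displays it sees and what mode it's driving each one at. This is just a rough diagnostic sketch in Swift using the standard CoreGraphics (Quartz Display Services) calls, not anything specific to my setup:

Code:
import CoreGraphics

// Ask Quartz Display Services which displays are active and
// what mode macOS is currently driving each one at.
var count: UInt32 = 0
_ = CGGetActiveDisplayList(0, nil, &count)

var displays = [CGDirectDisplayID](repeating: 0, count: Int(count))
_ = CGGetActiveDisplayList(count, &displays, &count)

for id in displays {
    let kind = CGDisplayIsBuiltin(id) != 0 ? "built-in" : "external"
    if let mode = CGDisplayCopyDisplayMode(id) {
        print("\(kind) display \(id): \(mode.width)x\(mode.height) @ \(mode.refreshRate) Hz")
    }
}

If the external display shows up there with a sane mode, the Mac side is doing its job and the problem is probably on the monitor's input side.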

(One thing: the PC was connected over VGA rather than DVI, but the monitor had no problems there. I've also used it over VGA with the old iMac. I don't have a mini-DVI-to-VGA adapter handy to experiment with, nor do I want to spend the money and time to get one.)

Does anyone have any ideas? Basically, my MacBook is detecting the external display, but the external display is complaining of "No Signal".

Thanks!
 

jackiecanev2

macrumors 65816
Jul 6, 2007
1,033
4
If the monitor was formerly hooked up via VGA, make sure it's set to digital or auto-detect instead of analog. Your MB will automatically detect the monitor and come up with the appropriate settings on its own.
 

RMo

macrumors 65816
Original poster
Aug 7, 2007
1,254
281
Iowa, USA
Thank you very much! I thought that might be the problem, but I couldn't get to the on-screen menu to switch inputs without the monitor actually showing something from a computer, and unfortunately I couldn't do that because, well, it wasn't working in the first place. :D

So I unplugged the monitor for a while and hoped it would "forget" its VGA-ness, and now it's working as expected. Thanks again! :)
 

RMo

macrumors 65816
Original poster
Aug 7, 2007
1,254
281
Iowa, USA
Hmm, so nobody has any ideas? I did read on another forum (which I found via a Google search) that someone else with a widescreen Acer LCD has this same problem ... but nobody had any ideas there, either. :rolleyes:

I can get it to work at some resolutions, like 800x600, but I can't get it to work at 1680x1050, even though the monitor says it accepts that resolution and it's exactly what my Mac says it is putting out (the refresh rate checks out, too: the monitor wants 60 Hz, and that's what the Mac says it is sending).
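Again for later readers on a newer Mac: you can also dump every mode the external display advertises, to check whether 1680x1050 at 60 Hz is actually in the list the Mac sees. A rough Swift sketch with the standard CoreGraphics calls (it assumes the monitor in question is a non-built-in display):

Code:
import CoreGraphics

// Print every mode each non-built-in display advertises, so you
// can check whether 1680x1050 @ 60 Hz is actually in the list.
var count: UInt32 = 0
_ = CGGetActiveDisplayList(0, nil, &count)

var displays = [CGDirectDisplayID](repeating: 0, count: Int(count))
_ = CGGetActiveDisplayList(count, &displays, &count)

for id in displays where CGDisplayIsBuiltin(id) == 0 {
    guard let modes = CGDisplayCopyAllDisplayModes(id, nil) as? [CGDisplayMode] else { continue }
    for mode in modes {
        print("\(mode.width)x\(mode.height) @ \(mode.refreshRate) Hz")
    }
}

If the native mode isn't in that list, the monitor's EDID probably isn't making it across the DVI link intact, which would explain why only the fallback resolutions work.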

I might try another monitor, but I think the easier (though probably less satisfying) option would be to get a mini-DVI-to-VGA adapter. I'm not a graphics pro or anything, so I doubt I'd mind the probable drop in quality.

Any thoughts?
 

petervcook

macrumors newbie
Oct 29, 2007
2
0
Similar problem on MacBook and MacBook Pro

I am having the same problem. I have tried it with a MacBook and a MacBook Pro, on a Dell projector and an HP LCD monitor. The weird thing we're doing, though, is running from the Mac through a (mini-)DVI-to-VGA adapter and then into a VGA-to-DVI cable. I know that's odd, but a VGA cable is the only thing running to our remotely placed projector. Going DVI the whole way fixes the problem.

The weird thing is, it was working last week.
 