
erichk · macrumors newbie · Original poster · Aug 23, 2010
Hi, I just got myself a new monitor. It only has D-Sub as input, so DVI/HDMI is out of the question. I had been happily using an old monitor with a Mini DisplayPort to D-Sub adapter, so I used the same setup. The problem is, once I set the resolution to 1920x1080, the monitor first shows `no signal`; after a few seconds the display comes up, but with glitches every now and then. After a few minutes of use the screen goes nuts. I'm on a late 2011 MacBook Pro.

Any ideas?
 
Running 1920x1080 is technically the "1080p" digital format, which isn't commonly supported over most VGA (analog) connections. Check your monitor's actual supported resolutions; it might prefer something like 1920x1200 or 1280x800, etc.
 
Yes, it does run at full HD over VGA, but there is minor flickering.
Also, I made the mistake of installing SwitchResX, and now that damn thing is hard to get rid of.
I removed all the plists and every instance of it, but somehow it still manages to exist.
It's really starting to piss me off. :mad:
OK, anyway, enough of my OT.
Sometimes the cable might be at fault too. :)
 
I `fixed` it by setting the resolution to 1440x900 and then back to 1920x1080. It has worked fine ever since, except after sleep. I have to repeat the process every time the system wakes up. Any ideas?
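
If you get tired of doing that toggle by hand after every wake, here's a rough sketch of the same thing using the CoreGraphics display APIs. This is just an illustration, not a tested fix: it assumes the external monitor is the main display and that both 1440x900 and 1920x1080 show up in its mode list.

```swift
import CoreGraphics
import Foundation

// Rough sketch: drop the main display to 1440x900, wait a moment,
// then go back to 1920x1080 -- the same toggle described above.
// Assumes the external monitor is the main display.

func findMode(_ display: CGDirectDisplayID, width: Int, height: Int) -> CGDisplayMode? {
    guard let modes = CGDisplayCopyAllDisplayModes(display, nil) as? [CGDisplayMode] else {
        return nil
    }
    return modes.first { $0.width == width && $0.height == height }
}

let display = CGMainDisplayID()

if let low = findMode(display, width: 1440, height: 900),
   let high = findMode(display, width: 1920, height: 1080) {
    CGDisplaySetDisplayMode(display, low, nil)   // switch down first
    Thread.sleep(forTimeInterval: 2)             // give the monitor a moment to resync
    CGDisplaySetDisplayMode(display, high, nil)  // then back to full HD
} else {
    print("Couldn't find 1440x900 and/or 1920x1080 on the main display")
}
```

You'd still need something to run it on wake; a third-party tool like SleepWatcher can trigger a script when the system wakes, but that's an extra install.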
 