Hi all, I have two VGA cables. The first is a cheap one that I've been using for the past few months with my full HD (1080p) TV: via a Mini DisplayPort to VGA adapter on my 17" MacBook Pro, I've been able to get full HD output to the TV. However, that cable was poorly shielded and I got wavy interference lines from the TV's other inputs.

I've just bought a new VGA cable with far better shielding (so that problem is fixed), but I cannot get it above a resolution of 1400x1050. Selecting 1920x1080 @ 60Hz, as I did with the old cable, makes the TV show the message "Unsupported signal. Adjust your PC output." When I click the 'Detect Displays' button in System Preferences, nothing appears to happen. Help!

Edit: After a bit of research, it turns out my new cable is actually an SVGA cable. It has the same pins, layout, etc., and as far as I'm aware it should work. Why doesn't it!?

Edit2: I tried another VGA cable, and that one works. So my finding is that the 'Super-VGA' cable doesn't work, at least by default. Is there any way I can fix this?
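In case it helps diagnose, here's a small Swift sketch (untested on my machine, which predates Swift; it assumes a recent macOS toolchain and the CoreGraphics framework) that should print every mode macOS believes the connected display supports:

```swift
import CoreGraphics
import Foundation

// Sketch: list all modes macOS reports for the main display.
// If 1920x1080 is missing with the SVGA cable attached, the OS is
// presumably reading different EDID data over that cable.
let displayID = CGMainDisplayID()
if let modes = CGDisplayCopyAllDisplayModes(displayID, nil) as? [CGDisplayMode] {
    for mode in modes {
        // refreshRate can report 0 for some displays/adapters.
        print("\(mode.width)x\(mode.height) @ \(Int(mode.refreshRate))Hz")
    }
}
```

If the mode list differs between the two cables, that would at least confirm it's an EDID/handshake issue rather than a signal-quality one.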