
nuggetWRX

macrumors newbie
Original poster
Dec 8, 2008
28
0
I want to connect my MBP (from before they switched to the new black surrounds) to my 22" ViewSonic NX2232w 720p HDTV. The native resolution of the ViewSonic is actually 1680x1050, but I am running 1280x1040 @ 75Hz because my graphics card isn't recognizing the TV for what it is; it just labels it a "VGA Display." The list of resolutions under System Prefs jumps from 1600x1200 @ 100Hz to 1792x1344 @ 60Hz without so much as a wink towards 1680x1050. Does anyone know of a way to manually force my graphics card into supporting the necessary 1680x1050 resolution with this ViewSonic? Oh, and by the way, the graphics card I have is the NVIDIA GeForce 8600M GT (that should help).
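As a rough illustration of why the fallback mode looks wrong on this panel (assuming the "1280x1040" above is a typo for the standard 1280x1024), the aspect ratios simply don't match, so the desktop gets stretched by the TV's scaler:

```python
from math import gcd

def aspect(w, h):
    """Reduce a resolution to its simplest aspect ratio."""
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

print(aspect(1680, 1050))  # panel native -> "8:5", i.e. 16:10
print(aspect(1280, 1024))  # generic fallback mode -> "5:4"
```

Any mode whose ratio isn't 8:5 will come out either stretched or letterboxed on a 1680x1050 panel.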
 

nuggetWRX

macrumors newbie
Original poster
Dec 8, 2008
28
0
Whoops, forgot to mention that. I am using the DVI-to-VGA adapter I got with my Power Mac G5, then a male-to-male VGA cable into the back of the ViewSonic.

I tried a DVI-to-HDMI cable, but it just sucked, so I went with this analog signal.
 

goMac

Contributor
Apr 15, 2004
7,662
1,694
I don't know if HDTVs can necessarily accept input at their native res. They aren't like monitors.

It may only be able to accept a 720p signal, which the scaler then scales up to the native resolution.
 

geoffreak

macrumors 68020
Feb 8, 2008
2,193
2
As mentioned above, TVs aren't intended to be used as computer displays, so resolution support isn't a concern for manufacturers, who don't bother implementing it properly.
There are some applications that can force custom resolutions, but another kind MR user will have to link you to one.
 

nuggetWRX

macrumors newbie
Original poster
Dec 8, 2008
28
0
The thing is, ViewSonic built this as a monitor with a TV tuner... not a TV with a monitor added (kind of like how the first-gen iPhones were just PDAs with a phone added, not the other way around, but they fixed that). So that's why I'm confused about how I should go about lying to my graphics card.
 

Frosties

macrumors 65816
Jun 12, 2009
1,079
209
Sweden
Have you looked in the manual for the listed resolutions? You could try SwitchResX, but taking the shortcut and checking the manual first is my best advice.
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
VGA seriously sucks; I think it was initially designed for 640 by 480 video. I assume that modern TVs and computers can push more data down it, but obviously there is a limit to the amount of data you can push down 15 (uninsulated?) pins. If your TV has an HDMI input (which I assume it does), then use it. Buy a DVI-to-HDMI cable, they aren't that expensive, and your problems should disappear.

EDIT: Make that 800 by 600. From Wiki.
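For what it's worth, a back-of-the-envelope pixel-rate calculation (a sketch only; the exact pixel clock depends on the blanking intervals) suggests 1680x1050 is not an unusual load for an analog VGA link:

```python
def active_pixel_rate_mhz(width, height, refresh_hz):
    """Active pixel rate in MHz, ignoring blanking intervals."""
    return width * height * refresh_hz / 1e6

# ~106 MHz of active pixels; with blanking overhead the real pixel
# clock is higher (CVT puts 1680x1050 @ 60 Hz at 146.25 MHz), which
# analog VGA hardware of this era routinely handled.
print(active_pixel_rate_mhz(1680, 1050, 60))  # -> 105.84
```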
 

Freyqq

macrumors 601
Dec 13, 2004
4,038
181
Download SwitchResX.

It lets you force a resolution. It's a trial, but if you use it and then uninstall it, the new resolution stays.
 

areusche

macrumors regular
Jun 24, 2008
168
1
Erasmus said:
> VGA seriously sucks; I think it was initially designed for 640 by 480 video. I assume that modern TVs and computers can push more data down it, but obviously there is a limit to the amount of data you can push down 15 (uninsulated?) pins. If your TV has an HDMI input (which I assume it does), then use it. Buy a DVI-to-HDMI cable, they aren't that expensive, and your problems should disappear.
>
> EDIT: Make that 800 by 600. From Wiki.

You're incorrect, sir. VGA is simply an analog video output. Depending on the monitor, it can output up to HD and more. I've personally used VGA outputs up to 1600x1400.

VGA, DVI, and HDMI all achieve the same ends. What really matters is the application of the connection. Blu-ray's HDCP will only work over digital connections such as DVI and HDMI, with a supported module inside the monitor.

Edit: To the OP: have you tried hooking the monitor up to another computer to see if it will pull up the resolution?
 

kasakka

macrumors 68020
Oct 25, 2008
2,361
1,060
You should use a DVI-to-HDMI cable instead. Then it should give you the correct resolution, and the image quality will be better than with the VGA cable.
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
areusche said:
> You're incorrect, sir. VGA is simply an analog video output. Depending on the monitor, it can output up to HD and more. I've personally used VGA outputs up to 1600x1400.
>
> VGA, DVI, and HDMI all achieve the same ends. What really matters is the application of the connection. Blu-ray's HDCP will only work over digital connections such as DVI and HDMI, with a supported module inside the monitor.
>
> Edit: To the OP: have you tried hooking the monitor up to another computer to see if it will pull up the resolution?

Are you trying to tell me that Wikipedia lied to me? :(
 

kasakka

macrumors 68020
Oct 25, 2008
2,361
1,060
Erasmus said:
> Are you trying to tell me that Wikipedia lied to me? :(

You've simply been reading the article on the VGA graphics standard, which is a bit confusing since it refers to both the hardware and its display modes. The analog D-Sub connector is capable of a max resolution of 2048x1536 (QXGA), if I remember correctly.
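A quick comparison (a sketch only; 60 Hz assumed and blanking ignored) shows that even QXGA's raw pixel rate sits well under the roughly 400 MHz RAMDACs typical of graphics cards from that period, which is why analog VGA could reach such modes at all:

```python
modes = {
    "1680x1050 (panel native)": (1680, 1050),
    "2048x1536 (QXGA)": (2048, 1536),
}
for name, (w, h) in modes.items():
    # Active pixels per second at 60 Hz, in MHz (blanking ignored)
    print(f"{name}: {w * h * 60 / 1e6:.2f} MHz active pixel rate")
```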
 