Lying to a Graphics card... MBP...?

Discussion in 'MacBook Pro' started by nuggetWRX, Jul 6, 2009.

  1. nuggetWRX macrumors newbie

    Joined:
    Dec 8, 2008
    #1
    I want to connect my MBP (the model before they switched to the new black surrounds) to my 22" ViewSonic NX2232w 720p HDTV. The native resolution of the ViewSonic is actually 1680x1050, but I am running 1280x1024 @ 75Hz because my graphics card is not recognizing the TV for what it is; it just labels it a "VGA Display." The list of resolutions under System Preferences jumps from 1600x1200 @ 100Hz to 1792x1344 @ 60Hz without so much as a wink towards 1680x1050. Does anyone know of a way to manually force my graphics card into supporting the necessary 1680x1050 resolution with this ViewSonic? Oh, and by the way, the graphics card I have is the NVIDIA GeForce 8600M GT (that should help).
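    From what I've read, that generic "VGA Display" label usually means the Mac never received the monitor's EDID block: over VGA the monitor identifies itself via the DDC pins, and if the adapter or cable doesn't pass DDC through, OS X falls back to a guessed list of safe modes. If the EDID were arriving, 1680x1050 would be advertised in its standard-timing bytes. Here is a minimal sketch in Python of how that section decodes, using a made-up EDID fragment rather than real bytes from the NX2232w:

        # Sketch: decode the "standard timings" section of an EDID block
        # (bytes 38-53). The sample bytes below are illustrative only, not
        # dumped from a real NX2232w.
        def decode_standard_timings(edid):
            aspects = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}  # EDID 1.3
            for i in range(38, 54, 2):
                b1, b2 = edid[i], edid[i + 1]
                if (b1, b2) == (0x01, 0x01):   # 0x0101 marks an unused slot
                    continue
                h = (b1 + 31) * 8              # horizontal pixels
                num, den = aspects[b2 >> 6]    # aspect ratio from top two bits
                v = h * den // num             # vertical pixels
                hz = (b2 & 0x3F) + 60          # refresh rate
                print(f"{h}x{v} @ {hz} Hz")

        # Fake 128-byte EDID with one standard-timing slot filled in:
        # 0xB3 -> (0xB3 + 31) * 8 = 1680; 0x00 -> 16:10 aspect at 60 Hz.
        edid = bytearray([0x01, 0x01] * 64)
        edid[38], edid[39] = 0xB3, 0x00
        decode_standard_timings(edid)          # prints "1680x1050 @ 60 Hz"

    So a first thing to check is whether the adapter chain is dropping DDC.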
     
  2. Tallest Skil macrumors P6

    Tallest Skil

    Joined:
    Aug 13, 2006
    Location:
    1 Geostationary Tower Plaza
    #2
    How are you connecting it? Which adapter and cable are you using?
  3. nuggetWRX thread starter macrumors newbie

    Joined:
    Dec 8, 2008
    #3
    Whoops, forgot to mention that. I am using the DVI-to-VGA adapter I got with my Power Mac G5, then a male-VGA-to-male-VGA cable into the back of the ViewSonic.

    I tried DVI to HDMI, but it just sucked, so I went to this analog signal.
     
  4. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #4
    I don't know if you can necessarily output at the native res with HDTVs. They aren't like monitors.

    It may only accept a 720p signal that the scaler then scales up to the native resolution.
     
  5. geoffreak macrumors 68020

    geoffreak

    Joined:
    Feb 8, 2008
    #5
    As mentioned above, TVs aren't intended to be used as computer displays, so resolution isn't a concern for manufacturers, who don't bother putting in proper support.
    There are some applications that can force resolutions, but another kind MR user will need to link to one.
     
  6. nuggetWRX thread starter macrumors newbie

    Joined:
    Dec 8, 2008
    #6
    The thing is, ViewSonic built this as a monitor with a TV tuner, not a TV with a monitor (kind of like how the first-gen iPhones were just PDAs with a phone added, not the other way around, but they fixed that), so that's why I am confused as to how I should go about lying to my graphics card.
     
  7. mikes70mustang macrumors 68000

    mikes70mustang

    Joined:
    Nov 14, 2008
    Location:
    US
    #7
    I'm pretty sure you can't get full res out of VGA; I think you need DVI or HDMI into the TV for that.
     
  8. Frosties macrumors 6502a

    Frosties

    Joined:
    Jun 12, 2009
    Location:
    Sweden
    #8
    Have you looked in the manual for the listed resolutions? You could try SwitchResX, but taking the shortcut and looking in the manual is my best advice.
     
  9. Erasmus macrumors 68030

    Erasmus

    Joined:
    Jun 22, 2006
    Location:
    Hiding from Omnius in Australia
    #9
    VGA seriously sucks; I think it was initially designed for 640 by 480 video. I assume that modern TVs and computers can push more data down it, but obviously there is a limit to the amount of data you can push down 15 (uninsulated?) pins. If your TV has an HDMI input (which I assume it does), then use it. Buy a DVI-to-HDMI cable, they aren't that expensive, and your problems should disappear.

    EDIT: Make that 800 by 600. From Wiki.
     
  10. Freyqq macrumors 68040

    Joined:
    Dec 13, 2004
    #10
    Download SwitchResX.

    It lets you force a resolution. It's a trial, but if you use it and then uninstall it, the new resolution stays.
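    For the curious: the documented mechanism such tools use on OS X is a display override plist keyed to the monitor's vendor and product IDs (visible in ioreg), stored under /System/Library/Displays/Overrides. An IODisplayEDID entry in that file replaces whatever EDID the system read, or failed to read. A rough sketch of writing one, assuming you have already obtained the monitor's real 128-byte EDID (say, by reading it on a PC); the IDs and bytes here are placeholders, and on a 10.5-era system you would need root to write there:

        # Hedged sketch: write an OS X display override plist supplying a
        # replacement EDID. The vendor/product IDs and EDID bytes are
        # placeholders; real values come from ioreg and the monitor itself.
        import os
        import plistlib

        vendor_id = 0x1234        # placeholder: DisplayVendorID from ioreg
        product_id = 0x5678       # placeholder: DisplayProductID from ioreg
        edid = bytes(128)         # placeholder: the monitor's actual EDID

        override_dir = ("/System/Library/Displays/Overrides/"
                        f"DisplayVendorID-{vendor_id:x}")
        os.makedirs(override_dir, exist_ok=True)

        override = {
            "DisplayProductName": "ViewSonic NX2232w",
            "DisplayVendorID": vendor_id,
            "DisplayProductID": product_id,
            "IODisplayEDID": edid,   # used in place of the EDID read over DDC
        }

        path = os.path.join(override_dir, f"DisplayProductID-{product_id:x}")
        with open(path, "wb") as f:
            plistlib.dump(override, f)
        print(f"Wrote {path}; log out or replug the display to apply")

    One catch: if no EDID comes through at all, the system may not have vendor/product IDs to match the override against, which is why a utility like SwitchResX that lets you define custom resolutions directly is the simpler route.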
     
  11. areusche macrumors regular

    areusche

    Joined:
    Jun 24, 2008
    #11
    You're incorrect, sir. VGA is simply an analog video output. Depending on the monitor, it can output up to HD and beyond. I've personally used VGA outputs up to 1600x1200.

    VGA, DVI, and HDMI all achieve the same ends; which one matters depends on the application. Blu-ray's HDCP, for example, will only work over digital connections such as DVI and HDMI, with a supported module inside the monitor.
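    Rough numbers back this up. Analog VGA has no fixed resolution cap; the practical limit is the pixel clock the card's RAMDAC and the cable can carry cleanly. A quick back-of-the-envelope check in Python, taking the total (active plus blanking) dimensions from the common CVT modeline for 1680x1050 @ 60 Hz, and assuming the usual 400 MHz analog pixel-clock rating for GPUs of the 8600M GT's generation:

        # Rough pixel-clock check for 1680x1050 @ 60 Hz over analog VGA.
        # Totals are from the common CVT modeline for this mode; treat them
        # as approximate.
        h_total = 2240   # 1680 active pixels + horizontal blanking
        v_total = 1089   # 1050 active lines + vertical blanking
        refresh = 60     # Hz

        pixel_clock_mhz = h_total * v_total * refresh / 1e6
        print(f"Pixel clock: {pixel_clock_mhz:.1f} MHz")   # ~146 MHz

    At roughly 146 MHz, that mode sits comfortably inside a 400 MHz analog output, so raw bandwidth is not what's blocking 1680x1050 here.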

    Edit: To the OP: have you tried hooking the monitor up to another computer to see if it will pull up the resolution?
     
  12. kasakka macrumors 68000

    Joined:
    Oct 25, 2008
    #12
    You should use a DVI-to-HDMI cable instead. Then it should give you the correct resolution, and the image quality will be better than with the VGA cable.
     
  13. Erasmus macrumors 68030

    Erasmus

    Joined:
    Jun 22, 2006
    Location:
    Hiding from Omnius in Australia
    #13
    Are you trying to tell me that Wikipedia lied to me? :(
     
  14. kasakka macrumors 68000

    Joined:
    Oct 25, 2008
    #14
    You've simply been reading the article on the VGA graphics standard, which is a bit confusing since it refers to both the hardware and the display modes. The analog D-Sub connector is capable of up to 2048x1536 (QXGA), if I remember correctly.
     
