
El Burro
Original poster
Sep 7, 2009
Hi,
I have a Sony Bravia KDL-19M4000, a 19-inch LCD TV, and a 2007 Mac Mini that can output up to 1920x1200, so I bought a DVI-to-HDMI cable to connect them.

The TV has one HDMI input and a "PC Input." I went with the DVI-to-HDMI cable into the HDMI input. The cable arrived today, and when I hooked it up, the TV's native resolution of 1440x900 wasn't detected; the Mac just offered a bunch of weird resolutions. The picture WOULD show up crisp, but it either had borders around it if I unchecked "overscan," or was zoomed in too much if "overscan" was checked.

I then installed SwitchResX and typed in 1440x900 so it would give me the right resolution. After a restart, the screen is now centered, but there is still a border around it.

What makes this incredibly aggravating is that when I restart the computer, the white screen with the Apple logo at startup seems to fill the screen completely, at the right resolution of 1440x900.

I then looked at the manual, and buried somewhere in it, in fine print, it says:

Do not connect a PC to the TV’s HDMI input. Use the PC IN (RGB IN) input instead when connecting a PC.

Wtf? OK, fine. So I just wasted $25.00 on a stupid cable that I didn't need. I also ended up buying a stupid adapter for my MacBook Pro (Mini DisplayPort to DVI, so I could hook it up to the 19-inch with the DVI-to-HDMI cable I bought). Thanks a lot *******pple and Shitony.

So my question is, why the hell can't I use the HDMI input with my computer? I mean, it's outputting a picture, but with a HUGE border around it. GOD!

So am I forced to use the (from what I understand) ******** VGA connection, which won't be digital? If I understand correctly, the manual says the maximum resolution over the "PC Input" (it looks like a female VGA connector) is:

WXGA 1360 × 768, 47.7 kHz horizontal, 60 Hz vertical, VESA


but my screen's native resolution is "1,440 dots × 900 lines"

So what the hell is going on? Am I being forced to downgrade to the ******** quality analog that doesn't even display at 1440x900?

And I have a DVI-to-VGA adapter, but it has a female VGA end, and the TV's VGA input is female too, so I can't connect the two. The hell? Is there a cable with male VGA connectors on both ends that I need, then? Or am I supposed to use an XVGA cable?

I hate technology...



Here are the specs below (I BOLDED THE ONES THAT SEEM TO BE THE MOST IMPORTANT!):


Specifications
Television system: NTSC, ATSC (8VSB terrestrial), QAM on cable
Panel system: LCD (Liquid Crystal Display) panel
Native display resolution (horizontal × vertical): 1,440 dots × 900 lines
VIDEO IN 1/2: S VIDEO (4-pin mini DIN) (VIDEO 2 only):
Y: 1.0 Vp-p, 75 ohms unbalanced, sync negative
C: 0.286 Vp-p (burst signal), 75 ohms
VIDEO: 1 Vp-p, 75 ohms unbalanced, sync negative
COMPONENT IN:
YPbPr (component video): Y: 1.0 Vp-p, 75 ohms unbalanced, sync negative; Pb: 0.7 Vp-p, 75 ohms; Pr: 0.7 Vp-p, 75 ohms
Signal format: 480i, 480p, 720p, 1080i
HDMI IN:
Video: 480i, 480p, 720p, 1080i, 1080p
Audio: Two channel linear PCM 32, 44.1 and
PC IN:
D-sub 15-pin, analog RGB, 0.7 Vp-p, 75 ohms, positive
See the PC Input Signal Reference Chart on page 38
Screen size (in inches): 19"

PC Input Signal Reference Chart (page 38)

Signals   Resolution (H × V)   Horizontal frequency (kHz)   Vertical frequency (Hz)   Standard
VGA       640 × 480            31.5                         60                        VGA
          640 × 480            37.5                         75                        VESA
          720 × 400            31.5                         70                        VGA-T
SVGA      800 × 600            37.9                         60                        VESA Guidelines
          800 × 600            46.9                         75                        VESA
XGA       1024 × 768           48.4                         60                        VESA Guidelines
          1024 × 768           56.5                         70                        VESA
          1024 × 768           60.0                         75                        VESA
WXGA      1280 × 768           47.4                         60                        VESA
          1280 × 768           47.8                         60                        VESA
          1360 × 768           47.7                         60                        VESA


This TV’s PC Input does not support Sync on Green or Composite Sync. This TV’s PC Input does not support interlaced signals. This TV’s PC Input supports signals in the above chart with a 60Hz vertical frequency.
For the best picture quality, it is recommended to use signals with a 60Hz vertical frequency from a personal computer. In plug and play, signals with 60Hz vertical frequency will be selected automatically.
 
Don't blame this on Apple--you'd have exactly the same problem with any computer. The issue, unless I'm very mistaken, is that TVs with nonstandard native resolutions (that is, not 720p or 1080p) are designed to accept input at said standard resolutions, and so do badly when you try to give them a pixel-for-pixel signal.

Essentially the TV's internal processing is upscaling or downscaling the signal to fit its actual resolution, and simply doesn't have the capability for a 1:1 signal. This works fine for video and the overscan isn't noticeable, but on a computer you get exactly the problems you're describing. Most TVs, simply put, aren't designed to make very good monitors.

In yours, Sony is using a 1440x900 monitor panel, but the display hardware is set up to expect TV resolution inputs (and the "PC" input is, if anything, an afterthought, since it'd be too expensive to add additional hardware to support a native-resolution input).

I had exactly the same issue with an older Philips TV that used a nonstandard 1280x768 panel but was only designed to take a 720p input and upscale it a bit. Even the "PC" input on it wouldn't accept video at the native resolution, though it did offer enough controls that I could sort of finagle the image to fit the screen properly, even if it was still somewhat scaled horizontally.

I expect when you're turning it on it only LOOKS like it's displaying properly because of the flat grey background--were there more texture on the background you'd see that it was being scaled and/or cropped.

Nothing Apple can do about this, and it's not even HDMI's fault--I've set up a 28" "monitor" that is basically a TV with no tuner or remote--it only has an HDMI input, no VGA/DVI. And, impressively enough, the Mac recognizes its nonstandard 1920x1200 resolution (because the TV itself broadcasts that it can handle that, and can), and outputs said pixel-perfect image via HDMI.
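
(If anyone wants to check what resolutions a display is actually advertising to the Mac over EDID, the little C sketch below lists every mode the system sees for the main display using the Quartz Display Services API. It's untested against this particular TV and assumes Mac OS X 10.6, where CGDisplayCopyAllDisplayModes was added; on 10.5 the older CGDisplayAvailableModes call does roughly the same job.)

Code:
/* listmodes.c -- print every mode the attached display advertises.
 * Build: cc listmodes.c -framework ApplicationServices -o listmodes */
#include <ApplicationServices/ApplicationServices.h>
#include <stdio.h>

int main(void)
{
    CGDirectDisplayID display = CGMainDisplayID();
    CFArrayRef modes = CGDisplayCopyAllDisplayModes(display, NULL);
    if (modes == NULL) {
        fprintf(stderr, "No display modes reported\n");
        return 1;
    }
    CFIndex count = CFArrayGetCount(modes);
    for (CFIndex i = 0; i < count; i++) {
        CGDisplayModeRef mode =
            (CGDisplayModeRef)CFArrayGetValueAtIndex(modes, i);
        printf("%4zu x %4zu @ %.0f Hz\n",
               CGDisplayModeGetWidth(mode),
               CGDisplayModeGetHeight(mode),
               CGDisplayModeGetRefreshRate(mode));
    }
    CFRelease(modes);
    return 0;
}

If 1440x900 never shows up in that list over HDMI, the TV simply isn't offering it, and no cable will change that.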
 

So should I play around with the SwitchResX settings in a different way and try a different resolution, or try DisplayConfigX?

Or is my only choice to connect using the "PC Input" slot? If so, what type of cable do I need to use?

Do I need a WXVGA cable or a regular VGA cable to output at the closest analog signal (WXGA 1360 × 768, 47.7 kHz, 60 Hz, VESA)?

Or can I use a WXGA+ (<--- yes, +) cable, which supports 1440x900, my screen's native resolution?

I have a DVI-to-VGA adapter that came with my Mac Mini; if the Sony manual says that input only goes up to "1360 × 768" (WXGA), am I out of luck?

Thanks again!
 
El Burro,
You and Makosuke are already talking over my head but I'll tell you my experience.
I have a 2008 46" Sony KDL - Z series. I also bought a DVI to HDMI cable and had the exact experience you have. I got a great picture only it had the black bars all around it. I switched to the DVI to VGA cable and set the Mini to a resolution of 1920 x 1080. The picture was then perfect. I switched out to the HDMI cable a couple of times to see if I could tell any difference in the picture quality and I couldn't.
Hope this helps ease the pain.
 
Thank you Oneness! I appreciate your help!
 
I also bought a DVI to HDMI cable and had the exact experience you have. I got a great picture only it had the black bars all around it.
In your case, since it's a 1080p TV with a native resolution of 1920x1080, had you set the computer to 1080p with overscan on, and the TV to "dot-by-dot" mode (or whatever Sony calls its "no overscan" mode), you would have gotten a pixel-for-pixel image filling the screen via HDMI. Not that VGA with a decent cable doesn't work fine, but DVI, being pure digital, should give a slightly cleaner image (though it will require setting the TV's sharpness to whatever setting does no enhancement--on my Sharp that's the lowest setting, not what you'd expect).

As for El Burro, if the manual says it doesn't support 1440x900 input, as you quoted, then even if you use SwitchResX to manually pick that resolution it won't work properly (you probably won't get any image at all, though there shouldn't be any harm in trying). The issue is that the TV's signal processing isn't set up to accept a signal at the panel's native resolution.

Your best bet, I'm going to guess, will be a VGA cable with the computer set to output 1280x768, 60Hz. I'd recommend this over the slightly higher resolution because it's closer to the 16:10 aspect ratio of the screen; I'm guessing that 1360×768 will give you a somewhat horizontally squashed image, assuming the TV stretches the video to fit the screen.
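
(For the curious, the arithmetic behind that guess -- nothing official, just the figures from the manual's chart plugged into a throwaway C snippet -- looks like this:)

Code:
/* Throwaway aspect-ratio arithmetic using the modes from the manual's chart.
 * Shows how unevenly each PC-input mode is stretched to fill the 1440x900
 * panel; illustrative only. */
#include <stdio.h>

int main(void)
{
    const double panel_w = 1440.0, panel_h = 900.0;
    const double modes[][2] = { {1280.0, 768.0}, {1360.0, 768.0} };

    printf("panel       aspect %.3f (16:10)\n", panel_w / panel_h);
    for (int i = 0; i < 2; i++) {
        double w = modes[i][0], h = modes[i][1];
        printf("%.0fx%.0f    aspect %.3f   h-stretch %.3f   v-stretch %.3f\n",
               w, h, w / h, panel_w / w, panel_h / h);
    }
    return 0;
}

1280x768 comes out at an aspect of 1.667 (stretched 1.125x horizontally and 1.172x vertically), while 1360x768 is 1.771 (1.059x vs 1.172x), so the 1360-wide mode ends up noticeably more squashed relative to the panel's 1.600 shape.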

There's no harm in experimenting, though, to see what looks best to your eyes.
 
I just wanted to let you all know (and anyone else in the future who has similar problems) that the VGA cable works.

In particular, because the Mac Mini I have has a DVI port, I used a DVI-to-VGA adapter.


The problem was that the VGA end of the DVI-to-VGA adapter was a female end (meaning there were no pins, just holes), and the TV's VGA input is female too.

Thus, I needed a VGA male-to-male cable. I ordered the "Cables To Go" brand, which is supposed to be high quality; it retails as of today for $33.00 on their website, but can be found cheap on eBay, or where I got it for $10.00 after shipping (http://www.techforless.com/cgi-bin/tech4less/28011?id=8qxhGFHe&mv_pc=241). I ordered this cable because it claimed to support XVGA resolutions and I didn't want a cable that might have "bandwidth" problems. The cable quality is also very good (it's heavy and well made), so that's a plus.

The other good news is that the Mac Mini is displaying everything at 1440x900, which is my monitor's native resolution, so it's a 1:1 pixel mapping.

There is one minor problem, and that is some "clipping" in certain areas (I don't know how else to describe it) that disappears when I go over that area. I think it may be that my Mac Mini has a ****** integrated graphics chip (that Intel accelerator or whatever), so it may have some trouble outputting at 1440x900, though I don't see why that would be an issue, seeing as Apple claims this specific computer can handle resolutions up to 1920x1200.

For instance, a line of text will be repeated twice, but when I scroll up or down it corrects itself, so it seems like the screen gets refreshed.

This is the only (and minor) problem, and I can deal with it; otherwise it is fine.

I also turned up brightness a bit because the computer's native setting was too dim.

Edit: The weird horizontal lines/clipping happened because my Mac Mini was pushing the native resolution of 1440x900, but the Sony manual says the PC input can only handle up to 1360x768. So I brought it down a notch, and not only is it easier to see, the problems are gone!
 