
applefan289

...1024x768?

I guess my question is: when someone says "480i, 480p, 720p, 1080i, and 1080p", would that be along the lines of a resolution like 1024x768, 1280x800, etc.? The reason I'm asking is that I know the Mac does not have an output labeled "720p" or "1080i" the way home consoles (like the Xbox 360 and PS3) do. So how do you know if the TV you're connecting your Mac to (let's say a Mac Mini) is outputting 1080i?

Also, when a TV is running at a resolution like 1024x768 (XGA), is there an actual mode for that like 768p, or is that just the screen resolution, with the game running at 720p?
 
...the Mac does not have an output labeled "720p" or "1080i" the way home consoles (like the Xbox 360 and PS3) do...
Sure it does.

If you've got your Mac connected to a display that supports those resolutions via an HDMI cable, you'll see 1080p, 720p, and I believe 1080i as well, listed in the Displays pref pane, along with all the standard supported computer resolutions. I'm running my home TV at 1080p, explicitly, from a Mini, via exactly such a setup.

1080p is just 1920x1080 at 60Hz (for NTSC regions) or 50Hz (for PAL regions). 720p is 1280x720 @ 60Hz.

For 1080i I assume the Mac just transmits the alternate fields of the signal appropriately; I doubt you'd see interlacing artifacts. The Mac OS can handle standard-def resolutions as well, although of course they look awful. There is a checkbox under the TV resolutions, up to and including 1080p, for "overscan", which you want on--otherwise you'll get a black border around the image if your TV is set to dot-to-dot mode (which you DEFINITELY want, under almost any circumstance, for optimum quality, if it's a digital signal).
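To tie that back to the original question, here's a rough Python sketch of how those TV mode names map to plain old width-x-height resolutions. The dictionary and names are just mine for illustration--nothing here is an Apple API--and the pixel values follow the standard mode definitions:

# Rough sketch: TV "modes" are just ordinary pixel resolutions plus a scan type.
TV_MODES = {
    "480i":  (720, 480, "interlaced"),    # SD uses non-square pixels, so the width varies
    "480p":  (720, 480, "progressive"),
    "720p":  (1280, 720, "progressive"),
    "1080i": (1920, 1080, "interlaced"),
    "1080p": (1920, 1080, "progressive"),
}

for name, (width, height, scan) in TV_MODES.items():
    print(f"{name}: {width}x{height}, {scan}")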
 
480p and 720p refer to the number of vertical pixels on a screen.

Depending on the aspect ratio, the number of horizontal pixels can change.

For example, if a TV has a 4:3 aspect ratio, it has 4/3 times as many pixels horizontally as vertically. So if the screen is 480p (480 vertical pixels), there are (4/3)*480 = 640 horizontal pixels.

The more common aspect ratio on HDTVs is 16:9, hence the wider screen.

So for your question, a screen resolution like 1024x768 is, I guess, technically 768p with a 4:3 aspect ratio [(4/3)*768 = 1024].
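To make that arithmetic concrete, here's a tiny Python sketch (the function name is just made up for illustration) that computes the horizontal pixel count from the vertical line count and the aspect ratio, assuming square pixels:

from fractions import Fraction

def horizontal_pixels(vertical_lines, aspect_w, aspect_h):
    # horizontal pixels = vertical pixels * (aspect width / aspect height)
    return int(vertical_lines * Fraction(aspect_w, aspect_h))

print(horizontal_pixels(480, 4, 3))     # 640  -> 640x480
print(horizontal_pixels(768, 4, 3))     # 1024 -> 1024x768
print(horizontal_pixels(720, 16, 9))    # 1280 -> 1280x720 (720p)
print(horizontal_pixels(1080, 16, 9))   # 1920 -> 1920x1080 (1080p)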

==

Now, this I'm not sure about, but I'll take a guess; please, anyone, feel free to correct me if I'm wrong. Your Mac Mini can output at whatever resolution is necessary.

It is the physical screen that has a resolution built in, not the Mac Mini. All screens have a physical number of pixels. So, for example, if your TV is 1080p, your Mac Mini will output at 1080p to it, assuming you have it set to that in System Preferences.

Now, in terms of the 'p' and 'i' that follow the number: that is how the picture is actually displayed on the screen. I don't remember the exact terms.
 
Both great answers, thanks. I had no idea that some monitors were not exactly 16:9. Some arcade games use the 1024x768 resolution. Interesting!
 
...Now, in terms of the 'p' and 'i' that follow the number: that is how the picture is actually displayed on the screen. I don't remember the exact terms.

The "p" and "i" refer to how the picture is "Painted" on the screen. Either "Progressive" or "Interlaced". The original Television used what we refer to now as 480i - Interlaced. This meant that the electron beam that excited the phosphor of the screen, making it glow, was scanned first by odd numbered lines or "Fields" then the next scan with even number Fields. This painting happened 60 times per second, and with 30 frames of picture per second that meant that every frame got painted twice, first with odd number lines, then even number lines.

This was due to the nature of the technology - what we might refer to as a "limitation" these days.

With modern technology, it became possible to have a beam scan fast enough to paint every line in a single pass. This is "Progressive". However, the analog transmissions were all still interlaced video, so even though we could theoretically do progressive, we needed a sea change to make it happen - or a heavy A/D converter in the set.

Well, as it happens, the HD/digital conversion allowed this in two ways: 1) it allowed the technology to be rolled out in the first place, and 2) cathode-ray tubes were being replaced with plasma and LCD screens. These screens have no scanning beam, and it's even easier to paint the screen progressively.

The key difference between flat panels and cathode-ray tubes is this: the CRT had to scan continually, meaning that if it stopped, there would be no image, whereas all a flat panel needs to do is not change the state of a pixel. Flat panels are far more passive and lend themselves much better to progressive scan - in fact, progressive is best for screens like this.
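If it helps to see the odd/even-field idea in code, here's a toy Python sketch - it has nothing to do with real video hardware, it just shows the line-splitting described above:

def split_into_fields(frame_lines):
    # Scan lines are numbered from 1 here, the way TV lines usually are:
    # the odd field gets lines 1, 3, 5, ... and the even field gets 2, 4, 6, ...
    odd_field = frame_lines[0::2]
    even_field = frame_lines[1::2]
    return odd_field, even_field

frame = ["line %d" % n for n in range(1, 11)]   # a tiny 10-line "frame"
odd, even = split_into_fields(frame)
print(odd)    # the first pass paints these lines
print(even)   # the second pass paints these; 60 fields/sec -> 30 frames/sec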

And this concludes my Captain Smarty comment of the day. That'll be US$750.
 
While we're on the subject, what exactly is the difference between VGA and DVI? I hear that DVI has this special security thing (I think it's HDCP or something) whereas VGA does not. Besides the fact that VGA is analog, what exactly is the disadvantage of not having this "HDCP"? (I have a computer monitor connected via VGA and it looks really sharp - I can't imagine a clearer picture!)
 
While we're on the subject, what exactly is the difference between VGA and DVI? I hear that DVI has this special security thing (I think it's HDCP or something) whereas VGA does not. Besides the fact that VGA is analog, what exactly is the disadvantage of not having this "HDCP"? (I have a computer monitor connected via VGA and it looks really sharp - I can't imagine a clearer picture!)

Oo. You're giving me ALL sorts of reasons to show off! :)

Ok, in a nutshell: VGA is an analog video standard. If you have a flat panel, its controller is digital, meaning there's an A/D conversion happening inside the monitor. While you probably have a great picture, you are susceptible to interference, which shows up as a limit on maximum cable length.

DVI or "Digital Visual Interface", Link, is what it says: it's digital, comes out of the computer digital, goes into the screen digital, no conversions, hardened against interference and much better performance overall. It's also capable of carrying a lot more information (e.g. really huge resolutions). It does support HDCP but is not required.

HDMI is several things: 1) it pairs the video channel of DVI with digital audio, and 2) it includes the nefarious HDCP (High-bandwidth Digital Content Protection). Because HDMI is entirely digital, it allows a 'handshake' between all of the devices in the HDMI chain, which ensures it's a closed content stream and nothing is "eavesdropping" on the signal. This handshake happens every time you activate the chain. If the handshake goes sideways, you'll often still hear audio, but the screen will be overpainted with a nastygram.

You can get a DVI-to-HDMI converter to use an HDTV as a computer monitor, and that'll work just fine, because it is the responsibility of the source of the HDMI signal - the computer - to send an HDCP handshake and demand a valid response. If you are shunting plain DVI video into an HDMI connector, it is not asking for HDCP validation.

These are all standards, meaning that to get the certification, a minimum set of specs needs to be satisfied.
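For the curious, here's a very loose Python sketch of that handshake idea. It is NOT the real HDCP protocol (no keys, no crypto), just the general shape described above, with made-up class and function names:

class Sink:
    def __init__(self, name, supports_hdcp):
        self.name = name
        self.supports_hdcp = supports_hdcp

def start_chain(sinks, source_requires_hdcp=True):
    if not source_requires_hdcp:
        # e.g. plain DVI video shunted into an HDMI connector: the source
        # never asks for HDCP validation, so video just passes through.
        return "ok: no HDCP requested"
    for sink in sinks:
        if not sink.supports_hdcp:
            # Handshake goes sideways: audio may still play, but the
            # picture gets replaced with a nastygram.
            return "blocked: %s failed the handshake" % sink.name
    return "ok: protected content flows"

print(start_chain([Sink("AV receiver", True), Sink("HDTV", True)]))
print(start_chain([Sink("AV receiver", True), Sink("old monitor", False)]))
print(start_chain([Sink("HDTV", False)], source_requires_hdcp=False))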
 
For example, if a TV has a 4:3 aspect ratio, it has 4/3 times as many pixels horizontally as vertically. So if the screen is 480p (480 vertical pixels), there are (4/3)*480 = 640 horizontal pixels.
That's not actually quite true, as the pre-HDTV video formats did not assume square pixels. You can actually kind of see this if you put your face really close to a CRT TV, and in fact, due to the way analog recording methods work, the effective number of pixel columns in a signal could vary.

The standard DVD video format (which is digital), for example, is encoded at 480x720 pixels, either progressive or interlaced. Now, with a 4:3 program (an older TV show, for example), the pixels are vertical rectangles; when the video is played on a computer monitor or LCD TV that has square pixels, an adjustment is done automatically in the background to resize the image so it looks correct. A 16:9 HDTV-width program, in contrast, STILL has 480x720 pixels, but it includes a "note" to the display device that it's actually supposed to be 16:9 (this is what that "anamorphic widescreen" business in the fine print on the back of a DVD means). So the pixels get stretched horizontally into rectangles in that direction.

Again, this usually all happens invisibly in the background, and since all non-CRT monitors I'm aware of have square pixels (and a LOT more of them than 480x640 or 480x853, depending on the ratio), the image you're seeing on the screen has been "blurred" somewhat to upscale it to fit the screen.

Try playing a DVD on your Mac with VLC and experimenting with the various aspect ratio settings to see this at work.
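Here's a small Python sketch of that anamorphic idea: the stored frame size stays the same, and only the flagged aspect ratio changes how it gets stretched for a square-pixel display. The numbers follow the NTSC DVD figures above (480 lines, height listed first); the function name is just for illustration:

from fractions import Fraction

STORED_HEIGHT, STORED_WIDTH = 480, 720   # what's actually stored on the disc

def display_size(aspect_w, aspect_h):
    # Keep the 480 lines; scale the width so the picture comes out at the
    # flagged aspect ratio on a square-pixel display.
    width = int(STORED_HEIGHT * Fraction(aspect_w, aspect_h))
    return STORED_HEIGHT, width

print(display_size(4, 3))     # (480, 640) -> the 720 stored columns get squeezed
print(display_size(16, 9))    # (480, 853) -> the 720 stored columns get stretched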

Note, incidentally, that analog TV also had "overscan": pixels that were transmitted in the signal but were effectively outside the visible area of the screen. That's why when you watch a DVD of a TV show on your computer there is almost always some black border along the sides--that area would be "off the edge of the screen" on an old TV, but the computer, since it's just looking at the raw signal, exposes it. You get exactly the opposite problem if you display a computer signal that assumes you can see every single pixel on a TV that is doing overscanning--the edges of the screen (like the menu bar) get cut off. That's what that "Overscan" checkbox in the Displays pref pane does: it adds a black border to the image being sent, to compensate for the stuff that's getting cut off and make it usable.
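In Python terms, that compensation amounts to something like the sketch below. The 5% margin is just an assumption for illustration--real TVs crop different amounts:

def underscanned_size(width, height, margin=0.05):
    # Shrink the picture being sent so a TV that crops the edges
    # still shows all of it (the rest becomes a black border).
    return int(width * (1 - margin)), int(height * (1 - margin))

print(underscanned_size(1920, 1080))   # roughly (1824, 1026)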

In a perfect world none of this crap would exist, since we've reached the point where all storage, transmission, and display technologies are purely digital, with square pixels and progressive display, and are capable of displaying precisely from one edge to the other. But the broadcast video industry is apparently VERY set in its ways, so even 1080p HDTVs getting a pure-digital signal over HDMI might still assume they need to overscan, and there are STILL programs being recorded in interlaced formats, which have served no logical technological purpose--they make things look worse on a modern screen--for a decade.
 