For example, if a TV has an aspect ratio of 4:3, that means there are 4/3 times as many horizontal pixels as vertical pixels. So if the screen is 480p (480 vertical pixels), there are (4/3)*480 = 640 horizontal pixels.
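That arithmetic, assuming square pixels, fits in a few lines of Python (the function name here is just for illustration):

def horizontal_pixels(vertical_pixels, aspect_w, aspect_h):
    # Horizontal pixel count implied by a vertical resolution and an aspect
    # ratio, assuming the pixels themselves are square.
    return vertical_pixels * aspect_w / aspect_h

print(horizontal_pixels(480, 4, 3))    # 640.0  -> 640x480 on a 4:3 screen
print(horizontal_pixels(1080, 16, 9))  # 1920.0 -> 1920x1080 on a 16:9 screen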
That's not actually quite true, as the pre-HDTV video formats did not assume square pixels. You can actually kind of see this if you put your face really close to a CRT TV, and in fact, due to the way analog recording methods work, the effective number of columns of pixels in a signal could vary (the number of scan lines, on the other hand, was fixed).
The standard DVD video format (which is digital), for example, is encoded at 720x480 pixels, either progressive or interlaced. Now, with a 4:3 program (an older TV show, for example), the pixels are tall, narrow rectangles; when it's played on a computer monitor or LCD TV that has square pixels, an adjustment is done automatically in the background to resize the image so it looks correct. A 16:9 widescreen program, in contrast, STILL has 720x480 pixels, but it includes a "note" to the display device that it's actually supposed to be 16:9 (this is what that "anamorphic widescreen" business in the fine print on the back of a DVD means). So the pixels get stretched the other way, into rectangles that are wider than they are tall.
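If it helps to see that as numbers, here's a rough Python sketch of the resize the player does behind the scenes; 720x480 is the standard NTSC DVD frame size, but the exact rounding is just my own illustration, not what any particular player does:

# DVD frames are stored as 720x480 either way; the flagged display aspect
# ratio decides how wide the picture should be once the pixels are made
# square again.
STORED_W, STORED_H = 720, 480

def display_size(aspect_w, aspect_h):
    display_w = round(STORED_H * aspect_w / aspect_h)
    return display_w, STORED_H

print(display_size(4, 3))   # (640, 480) -- 4:3 program: stored pixels were tall and narrow
print(display_size(16, 9))  # (853, 480) -- anamorphic 16:9: stored pixels get stretched wide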
Again, this usually all happens invisibly in the background, and since all non-CRT monitors that I'm aware of have square pixels (and a LOT more of them than 640x480 or 853x480, depending on the ratio), the image you're seeing on the screen has also been "blurred" somewhat to upscale it to fit the screen.
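The upscale to an actual panel is more of the same arithmetic; here's a hypothetical sketch of fitting that 640x480 or 853x480 picture onto a 1920x1080 panel (the fractional scale factor is part of why it ends up looking a little soft):

def fit_to_screen(img_w, img_h, screen_w=1920, screen_h=1080):
    # Scale to fill as much of the panel as possible without changing shape,
    # then pad whatever is left with black bars.
    scale = min(screen_w / img_w, screen_h / img_h)
    out_w, out_h = round(img_w * scale), round(img_h * scale)
    pillarbox = (screen_w - out_w) // 2   # bars on the left/right, if any
    letterbox = (screen_h - out_h) // 2   # bars on the top/bottom, if any
    return out_w, out_h, pillarbox, letterbox, scale

print(fit_to_screen(640, 480))  # (1440, 1080, 240, 0, 2.25) -- 4:3 gets pillarboxed
print(fit_to_screen(853, 480))  # (1919, 1080, 0, 0, 2.25)   -- 16:9 nearly fills the panel

In practice the scaler interpolates rather than just duplicating pixels, which is where that slight blur comes from.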
Try playing a DVD on your Mac with VLC and experimenting with the various aspect ratio settings to see this at work.
Note, incidentally, that analog TV also had "overscan": part of the picture was transmitted in the signal but fell effectively outside the visible area of the screen. That's why when you watch a DVD of a TV show on your computer there is almost always some black border along the sides--that area would have been "off the edge of the screen" on an old TV, but the computer, since it's just looking at the raw signal, exposes it. You get exactly the opposite problem if you display a computer signal that assumes you can see every single pixel on a TV that is still overscanning--the edges of the image (like the menu bar) get cut off. That's what that "Overscan" checkbox in the Displays pref pane does--it adds a black border to the image being sent, to compensate for the stuff that's getting cut off and make the screen usable.
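To put rough numbers on that compensation (the 5% figure is just an assumption here; real sets varied in how much they overscanned):

def underscan(frame_w, frame_h, overscan_pct=0.05):
    # Shrink the picture and surround it with a black border so that what the
    # TV crops away is border rather than actual content.
    inner_w = round(frame_w * (1 - overscan_pct))
    inner_h = round(frame_h * (1 - overscan_pct))
    border_x = (frame_w - inner_w) // 2
    border_y = (frame_h - inner_h) // 2
    return inner_w, inner_h, border_x, border_y

print(underscan(1920, 1080))  # (1824, 1026, 48, 27)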
In a perfect world none of this crap would exist, since we've reached the point where storage, transmission, and display technologies can all be purely digital, with square pixels and progressive scan, capable of displaying precisely from one edge to the other. But the broadcast video industry is apparently VERY set in its ways, hence even 1080p HDTVs getting a pure-digital signal over HDMI might still assume they need to overscan, and there are STILL programs being recorded in interlaced formats, which have had no logical technological purpose--they just make things look worse on a modern screen--for a decade.