Hey everyone. I know this particular topic doesn't have anything to do with the Apple TV per se, but I thought I would share some knowledge to help you decide whether to get a 1080p or a 720p TV. I am getting a master's degree in optical science from the University of Arizona. I was recently shopping for a new TV and faced the choice most of you have faced or will face when shopping for a new HDTV: should I save some dough and get a 720p TV, or is the additional cost of a 1080p TV worth it? I am going to present a purely quantitative, scientific way of deciding whether a 1080p TV is worth it. Below I will give an equation that will tell you once and for all which one you should purchase.

In order to understand the equation, you must understand the following concepts. The 20/20 human eye has a maximum visual acuity of 1 arcminute. That is, two points must subtend an angle greater than 1 arcminute in order for a 20/20 eye to resolve them. What this means for TVs is that pixels on a TV must have an angular subtense of 1 arcminute or greater in order for your eye to resolve the detail in the video.

To determine the angular subtense of one full pixel (red+green+blue) of a 16:9 television, use the following two expressions. For a 720p TV, the angular subtense in arcminutes is 60*arctan(a*0.00005319553/b), where a is the diagonal size of the television in inches and b is the distance you will sit from the TV in feet. (One caveat: this equation actually assumes a 768p TV, since 768 lines is the true native resolution of most "720p" sets.) Also note that these expressions work in degrees, not radians.

Example: say I want to find the full-pixel angular subtense of a "720p" 42-inch LCD, and I plan on sitting 7 feet away from the TV. The angle subtended by each full pixel is 1.1 arcminutes.
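For anyone who wants to check the numbers, the constant in the 720p expression falls straight out of the 16:9 geometry: screen height is diagonal × 9/√(16²+9²), one pixel is 1/768 of that, and the viewing distance converts to inches. Here is a minimal Python sketch (the function name is my own, purely illustrative) that reproduces the 42-inch/7-foot example:

```python
import math

def pixel_subtense_arcmin(diagonal_in, distance_ft, lines):
    """Angular subtense of one full pixel, in arcminutes, for a 16:9 panel."""
    # Height of a 16:9 screen: diagonal * 9 / sqrt(16^2 + 9^2)
    height_in = diagonal_in * 9 / math.sqrt(16**2 + 9**2)
    pixel_in = height_in / lines       # height of one pixel, in inches
    distance_in = distance_ft * 12     # viewing distance, in inches
    return 60 * math.degrees(math.atan(pixel_in / distance_in))

# 42-inch "720p" (768-line) panel viewed from 7 feet:
print(round(pixel_subtense_arcmin(42, 7, 768), 1))  # prints 1.1
```

Note that 9/√337/768/12 ≈ 0.00005319553, which is exactly the constant in the expression above.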
This represents a condition that borders on the very edge of the maximum detail the 20/20 human eye is capable of resolving.

For a 1080p TV, the angular subtense in arcminutes is 60*arctan(a*0.000037827932/b).

Example: same situation, a 42-inch TV viewed from 7 feet away, but this time the TV is 1080p. We find that the full-pixel angular subtense is about 0.8 arcminute. This represents a situation where the TV's video resolution is beyond the absolute resolution limit of human vision. In this example, spending additional money on a 1080p television is like throwing money in the garbage: your eye is simply not capable of resolving the additional video resolution of the 1080p TV.

Point of interest: although 20/20 vision is often considered "perfect", it is theoretically possible to have 20/6 vision. To achieve it, your eye anatomy must be perfect and your pupil size must be about 4 mm. Any larger, and the visual field is dominated by aberrations; any smaller, and it is diffraction limited, with an ever-increasing Airy disk diameter. In this perfect situation, with perfect field irradiance, the maximum resolution of such an eye would be 0.4 arcminute. Note that very few people in the world have demonstrated uncorrected 20/6 vision.