I think you should think before *you* speak. If you're going to come off as a know-it-all, you might want to be right, don't you think? You make a few poor assumptions right off the bat. Just because something has an AR of 2.35:1 doesn't mean it's 720p, and it doesn't mean it's HD. Even though 16:9 (roughly 1.78:1) and 2.35:1 are both considered "widescreen," 2.35:1 doesn't equal 1.78:1.
The two common flavors of HD right now are 1280x720 (aka 720p) and 1920x1080 (aka 1080i). And Apple's H.264 trailers are, as another poster pointed out, neither of these.
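If you want to check the ratios yourself, here's a quick back-of-the-envelope sketch (the resolutions are just the standard HD rasters, not anything measured off Apple's actual trailers):

```python
# Compute the aspect ratios of the two standard HD rasters.
resolutions = {
    "720p (1280x720)": (1280, 720),
    "1080 (1920x1080)": (1920, 1080),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w / h:.2f}:1")  # both come out to ~1.78:1

# A 2.35:1 frame that's 1280 pixels wide is only ~545 lines tall,
# well short of 720 -- so 2.35:1 by itself tells you nothing about HD.
print(f"2.35:1 at 1280 wide: {1280 / 2.35:.0f} lines")
```

Both HD rasters work out to 1.78:1, which is exactly why a 2.35:1 frame can't be either of them without letterboxing or cropping.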
Anyway, you have to look beyond the physical dimensions of the image when considering its quality. I can blow up MiniDV footage to 1920x1080, but that doesn't make it HD quality even though the image has the proper dimensions. Probably the two most common HD formats used by pros, DVCProHD and HDCAM, have respective data rates of 100Mbps and 140Mbps. And a higher end version of HDCAM, HDCAM SR, has a data rate of 440Mbps.
If you can stream media of that quality over the internet you have a killer ISP.
Just as a comparison, MiniDV is 25Mbps, DVDs use a VBR between 1 and 9Mbps (typically 4-5), and broadcast HD is around 19Mbps.
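To put those data rates in perspective, here's a rough sketch of how much data an hour of footage at each rate works out to (decimal GB, ignoring audio and container overhead; the DVD figure uses the typical rate, not the peak):

```python
# Data rates from the comparison above, in megabits per second.
rates_mbps = {
    "DVD (typical VBR)": 4.5,
    "Broadcast HD": 19,
    "MiniDV": 25,
    "DVCProHD": 100,
    "HDCAM": 140,
    "HDCAM SR": 440,
}
for name, mbps in rates_mbps.items():
    # Mbps -> GB/hour: 3600 seconds, 8 bits per byte, 1000 MB per GB.
    gb_per_hour = mbps * 3600 / 8 / 1000
    print(f"{name:>18}: {mbps:>5} Mbps = {gb_per_hour:.1f} GB/hour")
```

HDCAM SR comes out to nearly 200 GB per hour, which is why nobody is streaming it over a consumer connection.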
Lethal