Look it up or hook up the hardware to see what is coming out of it. There is a hardware gate built in, so no matter what you feed it (even a blank, one-color video at the lowest possible bitrate), what comes out and flows to your HDTV is 720p.
When we hook the camcorder directly to the TV and then flip to the exact same video downconverted to 720p at max specs (for what the TV can handle), everyone at my house can see the difference. Basically, "as is" for us, there's the "convenient" video (Apple TV) and the "good" video (camcorder, Blu-ray).
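If anyone wants to repeat that side-by-side test, here's a rough sketch of one way to make the 720p copy. It assumes ffmpeg is installed and on the PATH; the file names and bitrate are placeholders, not what we actually used.

import subprocess

SOURCE = "camcorder_original_1080p.mts"   # hypothetical camcorder clip
OUTPUT = "same_clip_720p.mp4"             # 720p copy for the side-by-side test

# Scale to 1280x720 and re-encode; the bitrate is an example value, tweak to taste.
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-vf", "scale=1280:720",   # downscale to 720p
    "-c:v", "libx264",         # H.264 video
    "-b:v", "5M",              # example target bitrate
    "-c:a", "copy",            # leave the audio stream untouched
    OUTPUT,
], check=True)

Then play the original and the 720p copy on the same screen, back to back, and judge for yourself.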
1. Nope. There was a German site that demonstrated 1080p resolution with about 33% dropped frames at a medium bitrate. At that point, work stopped on developing more efficient decoding.
2. The difference you are observing is the compression/downscaling used by the camera (or your settings). If you recorded everything at the maximum theoretical settings, you'd need a serious setup to see the difference. Most likely, the default settings, the maximum camera settings, or the upscaling/downscaling features in the firmware are set such that you end up seeing a difference between 720p and 1080p. (See the ffprobe sketch after this list for a quick way to check what the camera actually recorded.)
3. You really need to understand the difference between what's available (the technical limitations in your settings/firmware) and what's actually possible (the raw data displayed at 720p vs. 1080p, and the equipment you'd need to observe a real difference at the max settings). You come across as someone who believes whatever they're sold or told rather than someone with an actual understanding of the differences.
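On point 2: before blaming the resolution, check what the camera/firmware actually wrote to disk. A minimal sketch using ffprobe (ships with ffmpeg); it assumes ffprobe is installed, and the file name is made up.

import json
import subprocess

CLIP = "whatever_the_camera_recorded.mts"  # hypothetical file name

# Ask ffprobe for the stream and container metadata as JSON.
result = subprocess.run([
    "ffprobe", "-v", "quiet",
    "-print_format", "json",
    "-show_streams", "-show_format",
    CLIP,
], capture_output=True, text=True, check=True)

info = json.loads(result.stdout)
video = next(s for s in info["streams"] if s["codec_type"] == "video")

print("resolution:", video["width"], "x", video["height"])
print("codec:     ", video["codec_name"])
# The overall bitrate lives in the container ("format") section; a per-stream
# bit_rate field is not always present, depending on the container.
print("bitrate:   ", info["format"].get("bit_rate", "unknown"))

If the recorded file is already soft, downscaled, or heavily compressed, the 720p vs. 1080p argument is moot.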