The signal indicator is a fairly arbitrary measurement and varies from one generation of hardware to another, since it is reported by a device-specific measurement system.
I just ran a quick test with my iPhone 4 and iPad:
The iPhone 4, when set on a surface with nothing touching it, shows a 5-bar 3G signal and gets 5175kbps down, 1175kbps up. Gripped tightly, it shows a 1-bar signal and gets a measly 2162kbps down and 419kbps up. Surely this is proof of just how bad the situation is?
The iPad, by contrast, sees the same 5-bar 3G signal in the same location. It gets 1949kbps down and 160kbps up (the latter presumably due to the lack of HSUPA support in its 3G chipset).
It would appear that under optimal circumstances the iPhone 4's external antenna band gets amazing reception, and that when held in direct contact with skin it merely gets reception comparable to an internal antenna. Is this really a problem? Perhaps we'll learn more tonight, but I'm far from displeased with its real-world behavior.