The i7-620M under maximum load uses 64.7 W, while its TDP is only 35 W.
The i7s do fancy power-management stuff, like Turbo Boost. If the chip is drawing 65 W and the cooling system (which is what the TDP figure is sized for) can only remove 35 W, then the CPU gets hotter until, presumably, something gives: the CPU drops out of Turbo Boost, or the user does something less power-hungry.
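For what it's worth, here's a toy back-of-the-envelope model of that behaviour in Python. Every number in it apart from the 65 W and 35 W figures from above (the thermal mass, trip temperature, post-throttle power) is made up purely for illustration; real CPUs manage this in firmware in far more sophisticated ways.

```python
# Toy throttling model with made-up numbers: a cooler sized for the 35 W TDP,
# a chip that turbos to ~65 W, and a lumped thermal mass for die + heatsink.
# A sketch of the "gets hotter until something gives" idea, not a model of
# any real CPU's behaviour.

TDP_W = 35.0                 # heat the cooler is sized to remove
TURBO_W = 65.0               # package draw under turbo (figure from the post)
BASE_W = 35.0                # draw after turbo is dropped (assumed)
THERMAL_MASS_J_PER_C = 50.0  # hypothetical combined thermal mass
T_MAX_C = 100.0              # hypothetical throttle trip point
temp_c = 40.0                # starting temperature

power_w, dt_s = TURBO_W, 0.1
for step in range(1200):     # simulate two minutes
    net_w = power_w - TDP_W  # heat generated minus heat removed
    temp_c += net_w * dt_s / THERMAL_MASS_J_PER_C
    if temp_c >= T_MAX_C and power_w > BASE_W:
        print(f"t={step * dt_s:5.1f}s: hit {T_MAX_C:.0f}C, dropping turbo")
        power_w = BASE_W

print(f"settled at {temp_c:.1f}C drawing {power_w:.0f}W")
```

The point of the sketch: a TDP-sized cooler can't sustain 65 W forever, but the thermal mass lets the chip run above TDP for a while before it has to back off.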
The 9400M has no such technology, and if the cooling system were not capable of removing its full TDP's worth of heat, it would shut down.
edit: I looked it up. TDP refers to the amount of heat generated, in watts, under load. It's not the same as power output.
Heat generated = power output.
Where else would the energy go? A CPU does no mechanical work and emits a negligible amount of energy as light or radio, so essentially every watt it draws ends up as heat.
So while the ATI 6500 might be just as energy efficient at idle as the 9400M, it could still run hotter than the 9400M. We'll have to wait for more data.
Clearly, since we have no reliable TDP data at all.
I am seriously confused as to how the points you are making are supposed to prove that the idle power draw of the 9400M is 11 W. It isn't: about 12 W is roughly the chip's maximum draw, not its idle draw. Here's how I know. Some batteries in the old 15" MBPs are 60 Wh. If the GPU alone drew 11 W at idle, it would drain that battery in about 5.5 hours with nothing else running. That is clearly not the case, since you can actually get around 5 hours of real-world usage out of the whole machine.
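The arithmetic, spelled out in Python (the battery capacity and the claimed idle figure are the numbers from above):

```python
# If the 9400M alone really drew 11 W at idle, how long could a 60 Wh
# MacBook Pro battery power just the GPU, with nothing else running?
battery_wh = 60.0            # battery capacity from the post
claimed_gpu_idle_w = 11.0    # the disputed idle figure
print(f"{battery_wh / claimed_gpu_idle_w:.1f} hours")  # -> 5.5 hours
```

That's 5.5 hours for the GPU alone, before the CPU, display, or anything else draws a single watt. Since the whole machine manages about 5 hours in real use, the GPU's idle draw has to be a small fraction of 11 W.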