Sorry to resurrect an old thread, but this is the newest one I've found on MR via Google search, and I decided it's better to post in a similar topic than to start my own.
Out of curiosity, I ran a simple benchmark myself today using a Handbrake-ripped Blu-ray movie: H.264-encoded in a QuickTime container, Low Complexity profile, with video at ~1500 kbps, 1129x480.
I played the same scene from the movie while using Activity Monitor to track how much CPU time the player (iTunes) was using. This was done on my MacBook Pro 17-inch '2011 and iMac 27-inch '2009; you can check out the detailed specs in my signature.
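For anyone who'd rather script this than eyeball Activity Monitor, here's a rough Python sketch of what I did by hand. It just polls ps once a second and averages the readings; the process name ("iTunes") and the 30-second window are my own assumptions, so adjust them for whatever player you're testing.

[CODE]
#!/usr/bin/env python3
# Rough sketch: average a player's CPU usage over a stretch of playback,
# similar to watching the process in Activity Monitor.
import subprocess
import time

def cpu_percent(process_name):
    """Sum %CPU across all processes matching the given name, via ps."""
    out = subprocess.run(
        ["ps", "-axc", "-o", "%cpu=,comm="],  # -c gives bare executable names
        capture_output=True, text=True, check=True,
    ).stdout
    total = 0.0
    for line in out.splitlines():
        parts = line.split(None, 1)  # %cpu, then name (may contain spaces)
        if len(parts) == 2 and parts[1].strip() == process_name:
            total += float(parts[0])
    return total

if __name__ == "__main__":
    samples = []
    for _ in range(30):  # sample ~30 seconds of playback
        samples.append(cpu_percent("iTunes"))
        time.sleep(1)
    print("average CPU usage: %.1f%%" % (sum(samples) / len(samples)))
[/CODE]

Start playback first, then run the script, so the whole sampling window falls inside the scene you're measuring.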
The first test compared iTunes on the two machines:
iMac '09 (iTunes): ~33% CPU usage
MBP '11 (iTunes): ~10% CPU usage
The result puzzled me for a while. Even though Sandy Bridge processors are real beasts and I've been an active supporter of them, it's hard to believe a higher-clocked desktop Core 2 processor would lose so badly to a laptop processor, so I ran a second test to confirm this was indeed a result of iTunes utilizing the hardware acceleration ability of the mobile GPU.
The second test compared iTunes with QuickTime Player X (and I think most of us know by now that QTX uses hardware acceleration):
iMac '09 (iTunes): ~30% CPU usage
iMac '09 (QuickTime Player X): ~20% CPU usage
MBP '11 (iTunes): ~10% CPU usage
MBP '11 (QuickTime Player X): ~10% CPU usage
Conclusion
Clearly iTunes was hardware accelerated all along on the MBP, since it showed no difference from QuickTime Player X there. On the iMac, by contrast, QuickTime Player X used noticeably less CPU than iTunes, which suggests iTunes wasn't getting hardware decoding on that machine.
Oh, and by the way, the laptop was on battery power with integrated graphics, and the CPU temperature shown in iStat Pro was 42°C/107°F while playing.
I know a lot of people have been complaining about the 2011 models running hotter than older models, but I've found mine to be the most amazing laptop I've ever used: it's fast, it runs cool, and it lasts for hours on battery. I couldn't be happier with it.