You're not comparing like with like. A fairer test would be for you to play back the same video in Windows on your MacBook. Also, the 8400M GS is a much better GPU than the GMA950 and even the X3100.
Go ahead and compare

The GMA950 in both of your Macs supports HWMC (hardware motion compensation) for DVD/MPEG-2 decoding. Install Windows and get the PowerDVD 8 demo. Enable hardware acceleration. See just how different your DVDs look in OS X versus Windows, and how much lower the CPU use is under Windows, both with and without HWMC enabled.
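If you'd rather have numbers than eyeball Task Manager, here's a minimal sketch (Windows-only; the PID is a placeholder you'd read off Task Manager yourself) that samples a player process's CPU time over ten seconds using the standard GetProcessTimes() call, so you can compare runs with and without HWMC:

#include <windows.h>
#include <cstdio>

// Convert a FILETIME (100-nanosecond ticks) to seconds.
static double toSeconds(const FILETIME& ft) {
    ULARGE_INTEGER u;
    u.LowPart = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart / 1e7;
}

int main() {
    DWORD pid = 1234;  // placeholder: the player's PID from Task Manager
    HANDLE h = OpenProcess(PROCESS_QUERY_INFORMATION, FALSE, pid);
    if (!h) { printf("Could not open process %lu\n", pid); return 1; }

    FILETIME created, exited, kern1, user1, kern2, user2;
    GetProcessTimes(h, &created, &exited, &kern1, &user1);
    Sleep(10000);  // let the video play for 10 seconds
    GetProcessTimes(h, &created, &exited, &kern2, &user2);

    // CPU seconds consumed over 10 wall-clock seconds, as a percentage of one core.
    double cpuSec = (toSeconds(kern2) - toSeconds(kern1))
                  + (toSeconds(user2) - toSeconds(user1));
    printf("Average CPU use: %.1f%%\n", cpuSec / 10.0 * 100.0);

    CloseHandle(h);
    return 0;
}

Run it once with hardware acceleration off and once with it on; the gap is the work the GPU is absorbing.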
For some time now, NVIDIA cards (even my 7300) have had H.264 hardware acceleration. The GMA950 doesn't have this capability. So it's completely normal for your MacBook's CPU to be doing all the heavy lifting when playing back H.264 content.
Actually, a quick Google search will reveal that the GeForce 7300 line (the mobile ones anyway, like in the Apple TV) does NOT have hardware acceleration of any kind for H.264. Its implementation of "PureVideo" only includes full decoding for MPEG-2 and some support for WMV-HD. NVIDIA's own product feature page specifically says "features may vary by product," and other pages about the mobile 7300 line clearly state that it's MPEG-2 and WMV-HD only.
http://www.nvidia.com/page/go_7300_features.html: no mention of H.264 anywhere, just MPEG-2 and WMV-HD.
Also, it doesn't really have anything to do with the OS (at least on the Windows side of things). It's not Vista that offloads the video to the GPU, it's NVIDIA's driver. Without PureVideo support in the NVIDIA driver, no decoding work is going to be offloaded to the GPU.
While it's true that the driver tells Windows how to use the hardware, it's all about DXVA, VMR7, and VMR9, as well as what software player you use. You can have the latest and greatest drivers installed, but if you're using software that doesn't support your hardware's features, it doesn't matter.
DXVA is a system-wide feature in Windows that allows nearly anything that accesses video overlay to take full advantage of the hardware.
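If you want to see what your driver actually exposes, here's a minimal sketch (assuming Windows Vista or later, where the DXVA2 API is available; error handling mostly omitted) that asks the driver for its list of hardware decode profiles. If no H.264 GUID comes back, no player on the system can offload H.264, no matter how it's configured:

#include <windows.h>
#include <initguid.h>   // so the DXVA2 mode GUIDs get defined in this file
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")
#pragma comment(lib, "dxva2.lib")

int main() {
    // Create a throwaway Direct3D 9 device so we can talk to the driver.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = GetDesktopWindow();
    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, GetDesktopWindow(),
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &dev);

    // Ask the driver for its DXVA2 decoder service.
    IDirectXVideoDecoderService* svc = NULL;
    DXVA2CreateVideoService(dev, IID_IDirectXVideoDecoderService, (void**)&svc);

    // Enumerate every hardware decode profile the driver advertises.
    UINT count = 0;
    GUID* guids = NULL;
    svc->GetDecoderDeviceGuids(&count, &guids);
    for (UINT i = 0; i < count; ++i) {
        if (guids[i] == DXVA2_ModeH264_VLD_NoFGT)  // the common H.264 profile
            printf("Driver exposes H.264 hardware decoding\n");
        if (guids[i] == DXVA2_ModeMPEG2_VLD)       // MPEG-2 (DVD) profile
            printf("Driver exposes MPEG-2 hardware decoding\n");
    }

    CoTaskMemFree(guids);
    svc->Release();
    dev->Release();
    d3d->Release();
    return 0;
}

If the MPEG-2 line is all you see (as you'd expect on a GMA950), H.264 is going to the CPU.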
Apple simply does NOT have a similar technology (it is NOT Windows-exclusive; Apple could EASILY write support for this into DVD Player and QuickTime). It IS an OS "thing," and Apple does not have it.
So it's not so much that OS X is worse than Vista/XP with respect to HD playback, but rather that Apple's hardware isn't as good as HP's. Although I'd be very interested to see some benchmarks of HD playback performance on equivalent Windows and Mac machines.
It has everything to do with OS X simply not taking advantage of the available hardware. Get anyone with one of the current iMacs or current MacBook Pros to run HD video in OS X and then again in Windows with a PureVideo-capable (or ATI-equivalent) software player. The difference will be night and day in both CPU use and image quality.
I've said this before and I'll say it again. You go to any home theater forum on the internet and you look up threads where people ask how to improve the video image quality output by their Mac. What's the answer given every single time? "Install Windows."
Like I said, I'm not trying to start a "flame war" or another "Mac vs. PC" debate. The simple truth is that OS X is not taking advantage of technology that has been available for years.
This is untrue. Can you imagine how awful the Apple TV would be at playing back 720p/H.264, considering it's using a single-core 1 GHz P4?
As Edorian said, Pentium Ms are more than capable of 720p playback. That's exactly what is in the Apple TV: a newer version of the Dothan (Pentium M) CPU running at 1 GHz.
Again, the GeForce Go 7300 that is used in the Apple TV is NOT capable of hardware H.264 acceleration or decoding of any kind:
http://www.nvidia.com/page/go_7300_features.html
Why do you think the Apple TV is limited to 720p H.264 with low bitrates? Because it is relying entirely on the CPU for playback.
You know, it is funny, though. The Apple TV costs $229 and has a MUCH better graphics processor than the Mac mini costing 3x as much, or the MacBooks costing more than $1,000 more. Why does the Apple TV get a dedicated GPU that's better than what's in consumer notebooks costing more than 5x as much, while the MacBooks get the worst GPUs on the market? That's ridiculous.
That's exactly the case. The video driver (whether for a discrete or onboard GPU) is responsible for such decisions. If the hardware supports acceleration for a particular codec, the driver should pass the work to the device. For the GMA950 and X3100, this means MPEG-2; for the NVIDIA, it means MPEG-2 and H.264. The GMA950 and X3100 do not have hardware acceleration for H.264, so it must be decoded by the playback app using the CPU(s).
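The fallback logic itself is simple. Here's a purely illustrative sketch (these names are hypothetical, not a real driver API; real drivers express the list as DXVA decoder GUIDs, as in the sketch earlier in the thread):

#include <set>
#include <string>

enum DecodePath { DECODE_ON_GPU, DECODE_ON_CPU };

// If the GPU advertises a hardware profile for the codec, decode there;
// anything the driver doesn't list has to be decoded in software.
DecodePath chooseDecodePath(const std::set<std::string>& hwProfiles,
                            const std::string& codec) {
    return hwProfiles.count(codec) ? DECODE_ON_GPU : DECODE_ON_CPU;
}

// GMA950/X3100 advertise only {"MPEG-2"}, so H.264 falls back to the CPU;
// an 8400M GS advertises {"MPEG-2", "H.264"}, so H.264 stays on the GPU.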
Again, that is not true. It has nothing to do with the video driver; Apple simply does NOT take advantage of the hardware features available to it in OS X.
Your sig says you have a MacBook Pro. Is it one with a GeForce? If so, put Vista on it, get a demo of PowerDVD 8, and enable PureVideo in the player. Compare DVDs in it with DVD Player in OS X. You will see a complete world of difference. Also use VLC in Windows (the Windows version takes advantage of DXVA) to play some H.264 video, and then play the same video in QuickTime or VLC in OS X. Again, a world of difference.
Mac OS X simply DOES NOT take advantage of hardware features.
What video card/gpu did your Dells have?
Again, that doesn't matter. OS X simply doesn't take advantage of the GPU features available to it. Neither does the Apple TV, because... well, hardware support for H.264 just isn't there.
The fact is that OS X and the Apple TV, regardless of GPU, rely entirely on the CPU for video playback of ANY kind. You can see the difference yourself if you have a GeForce-based MBP. Your iMac *might* be able to, but those older ATI cards that Apple used in the pre-aluminum iMacs are... well, old, and don't have all of the features the current and previous generations do.
Edit: I just want to point out that every time I mention installing Windows, I mean with Boot Camp, not in a VM.