Here is the screenshot of the same film at 1080p.
Ah ha! Now we have an idea where the difference lies. The NVIDIA kernel extensions (i.e. GPU drivers) appear to be significantly updated on the new machines (1.5.34 vs. 1.5.30), and the OpenGL API framework is also newer (1.5.8 vs. 1.5.7). I couldn't find the AGL and OpenGL framework numbers you were looking for, but here are my kext versions (a quick way to pull all of these programmatically is sketched after the list):
GeForce.kext - 1.5.34
NVDANV50Hal.kext - 1.5.34
NVDAResman.kext - 1.5.34
[EDIT] I'm blind...
AGL.framework - 3.0.9
OpenGL.framework - 1.5.8
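For anyone who wants to compare their own machine, the versions above live in each bundle's Info.plist under the CFBundleVersion key. Here's a minimal sketch in Python, written against the Python 2.5 / plistlib that ships with Leopard; the paths are the stock OS X 10.5 locations and may differ on your install:

```python
# Prints CFBundleVersion for the NVIDIA kexts and GL frameworks listed above.
# Assumes stock OS X 10.5 paths and XML Info.plists (plistlib on Python 2
# cannot read binary plists).
import os
import plistlib

BUNDLES = [
    "/System/Library/Extensions/GeForce.kext",
    "/System/Library/Extensions/NVDANV50Hal.kext",
    "/System/Library/Extensions/NVDAResman.kext",
    "/System/Library/Frameworks/AGL.framework",
    "/System/Library/Frameworks/OpenGL.framework",
]

def bundle_version(path):
    """Return CFBundleVersion from a bundle's Info.plist, or None."""
    # Kexts keep Info.plist under Contents/, frameworks under Resources/.
    for rel in ("Contents/Info.plist", "Resources/Info.plist"):
        info = os.path.join(path, rel)
        if os.path.isfile(info):
            return plistlib.readPlist(info).get("CFBundleVersion")
    return None

for b in BUNDLES:
    print "%-20s %s" % (os.path.basename(b), bundle_version(b) or "not found")
```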
If you play it at exactly half the original size, it will use less CPU than if you just drag the window to a random size.
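For what it's worth, a plausible reason: an exact 2:1 downscale can be done by simply dropping every other pixel, while an arbitrary window size forces real resampling work for every output pixel. A toy illustration in Python/NumPy of the cost asymmetry (not what QuickTime actually does internally):

```python
# Toy illustration only: an exact 2:1 "scale" can be pure pixel dropping
# (here a zero-copy NumPy view), while an arbitrary ratio needs per-pixel
# index math or filtering. Real players use better filters, but the
# cheap-path-at-exactly-half idea is the same.
import numpy as np

frame = np.zeros((1080, 1920), dtype=np.uint8)  # stand-in for a decoded frame

half = frame[::2, ::2]          # 2:1: a strided view, no arithmetic at all

def nearest_resize(img, out_h, out_w):
    """Nearest-neighbour resample to an arbitrary size: work per output pixel."""
    rows = np.arange(out_h) * img.shape[0] // out_h
    cols = np.arange(out_w) * img.shape[1] // out_w
    return img[rows][:, cols]   # real indexing + copying for every pixel

odd = nearest_resize(frame, 731, 1299)  # a "random" dragged window size
```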
Then how come mine is larger than half the size and still uses less CPU than the half-size case shown in the forum?
The iPhone, for instance, can only play videos with low profile settings, 1500 kbit/s, and 640x480 resolution. That's what the decoding hardware was made for; it won't play anything exceeding those parameters.
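To make those limits concrete, here's a throwaway Python check of a clip's parameters against the numbers quoted above. The limit values come from this thread, not an official Apple spec sheet, and "low profile" is assumed to mean H.264 Baseline:

```python
# Illustrative only: the envelope below is the one quoted in this thread.
IPHONE_LIMITS = {
    "profiles": ("Baseline",),   # assumption: "low profile" = H.264 Baseline
    "max_bitrate_kbps": 1500,
    "max_width": 640,
    "max_height": 480,
}

def iphone_playable(profile, bitrate_kbps, width, height, limits=IPHONE_LIMITS):
    """True if the clip stays inside the hardware decoder's envelope."""
    return (profile in limits["profiles"]
            and bitrate_kbps <= limits["max_bitrate_kbps"]
            and width <= limits["max_width"]
            and height <= limits["max_height"])

print iphone_playable("Baseline", 1200, 640, 480)  # True
print iphone_playable("High", 5000, 1280, 720)     # False: exceeds everything
```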
Sadly, the GMA X4500HD is the first mobile Intel IGP solution that offers full AVC/H.264 hardware acceleration. Will this hardware decoding come to the GMA X3100? XD
Anyone know how the hardware acceleration affects battery life?
The new MacBook not using as much CPU when playing HD video isn't much of a surprise, is it? It does have a faster FSB and RAM... or am I missing something here? Surely it doesn't have everything to do with the new video card?
Did anyone else notice that the old MacBook uses the term "H.264 Decoder" in the "Format:" details, whereas the new MacBook simply says "H.264"?
Given that both machines are running Mac OS X 10.5.5 and everything else is the same, that small difference may be an indication that QuickTime is indeed using a different decoder module.
I wonder why nobody has jumped onto NVIDIA CUDA and built a decoder and encoder for OS X.
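The embarrassingly parallel stages of a decoder (colour-space conversion, deblocking) would map naturally onto CUDA. As a rough sketch of the idea, here is a full-range BT.601 YCbCr-to-RGB kernel driven from the PyCUDA bindings; it is one stage such a decoder might offload, not a decoder itself, and the 4:4:4 layout is a simplification to keep the indexing trivial:

```python
# Sketch only: one parallel decoder stage (colour conversion) on the GPU.
# A real decoder would also handle chroma subsampling, IDCT, motion
# compensation, deblocking, etc.
import numpy as np
import pycuda.autoinit            # creates a CUDA context on import
import pycuda.driver as cuda
from pycuda.compiler import SourceModule

mod = SourceModule("""
// Full-range BT.601 YCbCr -> RGB, one thread per pixel.
__global__ void yuv2rgb(const float *y, const float *u, const float *v,
                        float *r, float *g, float *b, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float Y = y[i], U = u[i] - 128.0f, V = v[i] - 128.0f;
    r[i] = Y + 1.402f * V;
    g[i] = Y - 0.344136f * U - 0.714136f * V;
    b[i] = Y + 1.772f * U;
}
""")
yuv2rgb = mod.get_function("yuv2rgb")

n = 640 * 480
y, u, v = (np.random.rand(n).astype(np.float32) * 255.0 for _ in range(3))
r, g, b = np.empty_like(y), np.empty_like(y), np.empty_like(y)

yuv2rgb(cuda.In(y), cuda.In(u), cuda.In(v),
        cuda.Out(r), cuda.Out(g), cuda.Out(b), np.int32(n),
        block=(256, 1, 1), grid=((n + 255) // 256, 1))
```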