This could be one of those misinterpreted rumors. It seems more likely they are finding a way for the GPU to do it, since that is the coming trend.
Video cards are not made to encode and decode H.264. The processor still does that before passing it on to the video card.
Yes they are. ATI/AMD have UVD, NVIDIA have PureVideo, and Intel now have a dedicated unit in their latest chipset. This isn't new stuff either: UVD and PureVideo have been around for a couple of years, they deliver a really noticeable benefit, and they reduce power consumption on mobile platforms too.
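If you want to see whether your own machine exposes one of these hardware decode paths, a quick way is to ask ffmpeg which hwaccel methods its build supports. This is just a sketch and assumes the ffmpeg CLI is installed; the method names it prints (dxva2, videotoolbox, etc.) depend on your platform and build.

```python
import subprocess

def available_hwaccels():
    """List the hardware acceleration methods this ffmpeg build supports."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-hwaccels"],
        capture_output=True, text=True, check=True,
    ).stdout
    # First line is the header "Hardware acceleration methods:", skip it.
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

if __name__ == "__main__":
    print("Hardware decode paths:", available_hwaccels() or "none found")
```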
Those are both DEcoders, whereas this new chip would be mainly used to speed up encoding.
Wonderful! Blu-Ray tomorrow!!!
Don
h.264 ENcoding is already being sped up by modern video cards.
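For the curious, here's a minimal sketch of what handing the encode off to hardware looks like in practice, again going through ffmpeg. The encoder names h264_videotoolbox (Apple) and h264_nvenc (NVIDIA) are real ffmpeg encoders, but whether your build includes them is platform-dependent, and the file names below are just placeholders.

```python
import subprocess

def hw_encode(src, dst, encoder="h264_videotoolbox", bitrate="5M"):
    """Re-encode src to dst using a hardware H.264 encoder via ffmpeg."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", encoder, "-b:v", bitrate, dst],
        check=True,
    )

# Example (file names are placeholders):
# hw_encode("input.mov", "output.mp4")                        # Apple hardware
# hw_encode("input.mov", "output.mp4", encoder="h264_nvenc")  # NVIDIA hardware
```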
I stand corrected on that fact. But that doesn't really change the issue I was responding to: The majority of Macs out there don't have hidden hardware that can just be 'turned on.'
I was incorrect in the way I responded to that, but the central point remains the same.
Hi all, I'm asking this from a position of total ignorance as to how these things work, but would using a dedicated chip improve battery life over how things are currently done? If so, could this be a significant thing for notebooks?
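For what it's worth, here's a back-of-envelope way to think about the battery question. Every number below is an assumption picked purely for illustration, not a measured figure, but it shows why a low-power fixed-function block can matter so much on a notebook.

```python
# All numbers below are assumptions picked for illustration only.
BATTERY_WH = 50.0       # notebook battery capacity (watt-hours)
BASE_DRAW_W = 10.0      # screen, chipset, idle CPU
CPU_DECODE_W = 18.0     # extra draw if the CPU does the decoding
FIXED_FUNC_W = 2.0      # extra draw with a dedicated decode block

def runtime_hours(extra_watts):
    """Hours of playback on one charge given the extra decode power draw."""
    return BATTERY_WH / (BASE_DRAW_W + extra_watts)

print(f"CPU decode:     {runtime_hours(CPU_DECODE_W):.1f} h of playback")
print(f"Dedicated chip: {runtime_hours(FIXED_FUNC_W):.1f} h of playback")
```

With those made-up numbers the dedicated chip roughly doubles playback time; the real gain depends entirely on the actual power draw of the parts involved.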
And of course, all the discrete GPUs in the current Apple lineup already accelerate h.264 decoding, Blu-ray included, although better drivers are probably needed. Intel's GMA X4500 also provides h.264 decode acceleration, so decoding h.264 is hardly a feature that needs a separate chip.
Video cards do indeed participate in decoding h.264.
http://www.anandtech.com/video/showdoc.aspx?i=2977
http://www.digit-life.com/articles2/video/video.dec.2007-page1.html
http://www.google.com/search?num=10...lt&cd=1&q=video+card+h.264+ati+nvidia&spell=1
We're talking real-time, uncompressed HD not "good enough, compressed HD content that isn't real-time." We're also hinting at real-time raytracing that needs a chipset that OpenCL can leverage without taxing the GPU or any free cores.
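To put a number on "real-time, uncompressed HD": the raw bit rate of uncompressed 1080p is easy to work out, and it makes clear why this is a different problem from playing back compressed content. The frame size and rates below are just the usual values.

```python
# Raw bit rate of uncompressed 1080p; standard frame size, 8-bit RGB.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 3  # 8 bits each for R, G, B

for fps in (24, 30, 60):
    gbits = WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * fps / 1e9
    print(f"{fps:>2} fps: {gbits:.2f} Gbit/s")
```

Even at 24 fps that's over 1 Gbit/s of pixel data, which is why uncompressed real-time HD is a genuinely harder target than decoding a ~40 Mbit/s Blu-ray stream.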
We're talking real-time, uncompressed HD not "good enough, compressed HD content that isn't real-time."
anandtech.com said: "…the GeForce 8600, it is currently the best option for anyone looking to watch Blu-ray or HD-DVD on their PCs. The full H.264 offload onto the GPU makes HD movie playback not only painless but also possible on lower speed systems."
We're also hinting at real-time raytracing that needs a chipset that OpenCL can leverage without taxing the GPU or any free cores.
The big problem (see my previous post) is that BD movie playback can't be done without complying with a huge set of insane DRM requirements. Seriously, where is Apple with Blu-ray SuperDrives? It is borderline unacceptable that Apple does not sell these yet, especially since Apple is on the Blu-ray board!
[other stuff]
This sucks, because people (like me) want a BD-RE drive to use as a data and backup drive, and don't care about playing movies (of any kind) on the computer. But I'm certain I'm in a clear minority with that opinion.
It sounds like it would've been useful 3 years ago, but not today.
AFAIK, even the lowest-end current MacBook can encode and decode h.264 at a reasonable rate without a dedicated chip. Heck, my five-year-old 1 GHz G4 PowerBook can encode h.264 off a DVD in very high quality at something like 8 times longer than real time. So I would presume (and hope) that even without a dedicated chip, a brand-new late 2008 / early 2009 MacBook should be able to do the same job more than 8 times faster!
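Taking the numbers in that post at face value, the scaling argument looks like this. The 8x figure comes straight from the post; the 10x speedup for a current MacBook is purely a guess for illustration.

```python
# The 8x figure is from the post above; the 10x speedup is a pure guess.
G4_SLOWDOWN = 8.0        # the G4 needs ~8 hours per hour of video
ASSUMED_SPEEDUP = 10.0   # guessed overall speedup of a current MacBook

movie_hours = 2.0
g4_time = movie_hours * G4_SLOWDOWN
new_time = g4_time / ASSUMED_SPEEDUP
print(f"G4 PowerBook: {g4_time:.1f} h to encode a {movie_hours:.0f} h movie")
print(f"New MacBook:  {new_time:.1f} h (assuming {ASSUMED_SPEEDUP:.0f}x speedup)")
```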