
nicklad

macrumors 6502
Original poster
Jun 13, 2007
258
3
Nottingham, UK
Does anybody know why Apple have not implemented hardware-accelerated video decoding for the NVIDIA 8600M?

It is especially useful for Adobe Flash.

Hardware acceleration of video decoding on this chip is possible in Windows under Boot Camp with the latest drivers but not under Mac OS X. This significantly reduces battery life unnecessarily due to the processing required on the CPU. Laptops will also get unnecessarily hot and have to run the fans on high because of this, which is often distracting.

Apple have a technical note that lists the supported hardware, yet it does not explain why they have chosen to not implement it for the 8600M.

Technical Note TN2267:
http://developer.apple.com/library/mac/#technotes/tn2267/_index.html

According to NVIDIA, it is fully supported on the chip:
http://www.nvidia.com/object/gpus_supporting_adobeflash.html
 

thejadedmonkey

macrumors G3
May 28, 2005
9,183
3,343
Pennsylvania
Probably because the 8600M is an inherently "bad" card that has a 100% failure rate, given enough time. Giving it less to do will prolong its life. At least, that's my guess.
 

nicklad

macrumors 6502
Original poster
Jun 13, 2007
258
3
Nottingham, UK
Probably because the 8600M is an inherently "bad" card that has a 100% failure rate, given enough time. Giving it less to do will prolong its life. At least, that's my guess.

Interesting. It is probably that, or a wish to offer the feature only on newer models. Still, it makes little sense: decoding video on a GPU does not tax it, and thus heat it, anywhere near as much as running a 3D game does. It is also not a marketed feature.
 

derbothaus

macrumors 601
Jul 17, 2010
4,093
30
Adobe blames Apple. Apple blames Adobe. Either way we all suffer. There is no technical reason for no GPU decoding on ANY card. 9400m, 9600, 320GT are the ONLY cards supported. No ATI anything and no other Nvidia. Sucks, I know.
 

nicklad

macrumors 6502
Original poster
Jun 13, 2007
258
3
Nottingham, UK
Adobe blames Apple. Apple blames Adobe. Either way we all suffer. There is no technical reason for no GPU decoding on ANY card. 9400m, 9600, 320GT are the ONLY cards supported. No ATI anything and no other Nvidia. Sucks, I know.

This is clearly an Apple-only issue, as the API is unavailable on the 8600M, yet there is, prima facie, no technical reason for this.

Adobe then use this API in Flash to provide acceleration.

A GPU does need the ability to decode the video to accessible memory, rather than only allowing it to be drawn to a specific area of the screen. (First-generation GPUs with hardware decoding offered only the latter.)

We are also talking specifically about H.264 acceleration here.
 

ztrafe

macrumors member
Apr 25, 2010
69
0
Sweden
"The Video Decode Acceleration framework is a C programming interface providing low-level access to the H.264 decoding capabilities of compatible GPUs such as the NVIDIA GeForce 9400M, GeForce 320M, GeForce GT 330M, ATI HD Radeon GFX, Intel HD Graphics and others. It is intended for use by advanced developers who specifically need hardware accelerated decode of video frames."
 

derbothaus

macrumors 601
Jul 17, 2010
4,093
30
This is clearly an Apple-only issue, as the API is unavailable on the 8600M, yet there is, prima facie, no technical reason for this.

Adobe then use this API in Flash to provide acceleration.

A GPU does need the ability to decode the video to accessible memory, rather than only allowing it to be drawn to a specific area of the screen. (First-generation GPUs with hardware decoding offered only the latter.)

We are also talking specifically about H.264 acceleration here.

They blame each other, yes. I agree with you and blame Apple, as I have spoken to Adobe engineers and asked about this very thing. They said point blank: those APIs are all Apple has given us. We are waiting and would like to support more GPU decoding on the Mac platform, but are unable to until Apple gives us the keys. Maybe Apple is embarrassed to show others their APIs, as they may be such a mess. Who knows their reasoning. ;)

Does anyone know about Silverlight decoding? It uses 120% CPU on OS X to play an HD movie full screen, versus 3% CPU for the same movie on Windows. I am not kidding either. That is a huge discrepancy.
 

Daveoc64

macrumors 601
Jan 16, 2008
4,074
92
Bristol, UK
There is no technical reason for no GPU decoding on ANY card

... other than the fact that some of the GPUs in the most popular Mac models don't have any video decoding hardware in them.

As for cards that have the hardware support, but not the software support - that is clearly Apple's fault.
 

derbothaus

macrumors 601
Jul 17, 2010
4,093
30
... other than the fact that some of the GPUs in the most popular Mac models don't have any video decoding hardware in them.

As for cards that have the hardware support, but not the software support - that is clearly Apple's fault.

I thought that was obvious but I should have said "modern" or "recent" in my statement.
 