All Apple video drivers are written by Nvidia and ATI, not by Apple. Nvidia's driver code is 100% closed, so no company other than Nvidia can write drivers for their products, except by reverse engineering on Linux, and those reverse-engineered drivers don't run that well.

ATI's driver code is half closed, half open.

But I think the reason Nvidia can't port VDPAU to OS X is that Apple isn't supplying them with the required APIs.
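For what it's worth, VDPAU's entry point is tied to X11 in the API itself, which illustrates why a port would need equivalent window-system hooks from Apple. A minimal sketch of device creation, assuming a Linux/X11 machine with the Nvidia driver installed; the build line is an assumption:

```c
/* VDPAU device creation: note the hard X11 dependency baked into the
 * API. You cannot obtain a VdpDevice without an X Display, which is
 * one concrete reason the library can't simply be dropped onto OS X.
 * Build sketch (assumption): cc vdpau_dev.c -lvdpau -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);          /* X11 connection required */
    if (!dpy)
        return 1;

    VdpDevice device;
    VdpGetProcAddress *get_proc_address;
    if (vdp_device_create_x11(dpy, DefaultScreen(dpy),
                              &device, &get_proc_address) != VDP_STATUS_OK) {
        fprintf(stderr, "no VDPAU driver behind this X display\n");
        return 1;
    }
    /* Every other VDPAU function is then fetched at runtime through
     * get_proc_address(device, VDP_FUNC_ID_..., &fn). */
    printf("VDPAU device created on X screen %d\n", DefaultScreen(dpy));
    return 0;
}
```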
I'm not so sure Nvidia is doing a much better job on the Mac than Adobe is.

I have my 17" MBP (with the 9600M GT) hooked up to a 30" Dell monitor, and 1920x1200 plus 2560x1600 is a lot to juggle for such a tiny GPU (forget the 9400M, it almost dies under this setup). It's OK. Exposé gets a little jerky, but not in a dealbreaking way. HOWEVER... when I switch over to Windows 7, it flies, noticeably snappier with the dual screens than OS X is. When I use Flip 3D or whatever that miserable window-switching thing is called, it's smooth as butter, even on the 30" screen. I can also watch fullscreen video on the MBP screen while doing work on the 30", and the flow is smooth on both screens. OS X can't match that; if I try the same there, the fullscreen video (QuickTime Player) drops quite a lot of frames and things get a little sluggish on the 30" screen in the meantime.

Go figure...
 
Again, I'm no expert on the subject, but from what I can see Flash is more like a virtual machine: in order to work on all platforms, it works differently from a video player installed on your computer. The bottom line is that those videos in VLC are not doing all the work on the CPU, while Flash does, and a more meaningful comparison would be the same video played in GPU-accelerated Flash and in VLC. Also consider that any GPU made in the last 5 years probably has playback support for all the older video formats. That's just a logical assumption on my part.

VLC does all the work on the CPU on the Mac, at least for most of the modern codecs.
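To make "all the work on the CPU" concrete: VLC's decoding is built on libavcodec, and a pure software decode loop looks roughly like the sketch below. This is a minimal example against FFmpeg's current send/receive API, not VLC's actual code; the compile line and error handling are simplified assumptions.

```c
/* Pure software H.264 decode loop with libavcodec (the library VLC
 * uses underneath). Every frame is decoded entirely on the CPU.
 * Build sketch (assumption): cc swdecode.c -lavformat -lavcodec -lavutil */
#include <stdio.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(int argc, char **argv)
{
    AVFormatContext *fmt = NULL;
    if (argc < 2 || avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
        return 1;
    avformat_find_stream_info(fmt, NULL);

    int vs = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    const AVCodec *codec =
        avcodec_find_decoder(fmt->streams[vs]->codecpar->codec_id);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    avcodec_parameters_to_context(ctx, fmt->streams[vs]->codecpar);
    avcodec_open2(ctx, codec, NULL);        /* software decoder: all CPU */

    AVPacket *pkt  = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();
    long frames = 0;
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == vs && avcodec_send_packet(ctx, pkt) == 0)
            while (avcodec_receive_frame(ctx, frame) == 0)
                frames++;                   /* every pixel computed on the CPU */
        av_packet_unref(pkt);
    }
    printf("decoded %ld frames in software\n", frames);
    return 0;
}
```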
 
I'm not so sure Nvidia is doing a much better job on the Mac than Adobe is.

I have my 17" MBP (with the 9600M GT) hooked up to a 30" Dell monitor, and 1920x1200 plus 2560x1600 is a lot to juggle for such a tiny GPU (forget the 9400M, it almost dies under this setup). It's OK. Exposé gets a little jerky, but not in a dealbreaking way. HOWEVER... when I switch over to Windows 7, it flies, noticeably snappier with the dual screens than OS X is. When I use Flip 3D or whatever that miserable window-switching thing is called, it's smooth as butter, even on the 30" screen. I can also watch fullscreen video on the MBP screen while doing work on the 30", and the flow is smooth on both screens. OS X can't match that; if I try the same there, the fullscreen video (QuickTime Player) drops quite a lot of frames and things get a little sluggish on the 30" screen in the meantime.

Go figure...


That's a whole different issue, and it's really not up to Apple, or to Nvidia in this case. The 3D drivers for OS X or Linux will never be as good as their Windows counterparts, simply because Windows is the gaming OS right now, and gaming is the only industry that drives driver development to the lengths we see on Windows. Nvidia and ATI make tons of money selling GPUs for Windows, so they can spend tons of money on driver development.

And on top of that Microsoft, while well aware that Windows is a gaming platform more than anything, spends tons of money on Direct3D development, which is in a much better place than OpenGL right now.
 
What percentage of the CPU does it typically use?

Depends on the size of the movie. To play a 1080p Blu-ray rip, VLC will use 100% CPU in many cases, and even that won't be enough sometimes, although in scenes with less motion it'll drop to 50% CPU or so.
 
Depends on the size of the movie. To play a 1080p Blu-ray rip, VLC will use 100% CPU in many cases, and even that won't be enough sometimes, although in scenes with less motion it'll drop to 50% CPU or so.

Blu-ray uses a more modern format, so I doubt most GPUs would have support for it. I mean the older video formats: what percentage do those use?
 
All Apple video drivers are written by Nvidia and ATI, not by Apple. Nvidia's driver code is 100% closed, so no company other than Nvidia can write drivers for their products, except by reverse engineering on Linux, and those reverse-engineered drivers don't run that well.

ATI's driver code is half closed, half open.

But I think the reason Nvidia can't port VDPAU to OS X is that Apple isn't supplying them with the required APIs.

EVERYTHING I've ever read suggests that it is indeed Apple that writes the drivers for OS X, not Nvidia and not ATI. Apple has a developer agreement with these companies that gives it access to their driver code (more to the point, it is Apple that will not let Nvidia or ATI have OS X code, which is apparently top secret!).

I don't know if it's Apple being lazy or greedy, but there are many Nvidia chipsets that would support H.264 acceleration with up-to-date drivers.

I just read that Adobe now supports 720p HD video acceleration even on the Intel GMA 500 chipset on Windows (notable in the case of my Dell netbook, except that sadly it's running OS X, which apparently will only offer such acceleration to machines that are barely over a year old or less). Even the Mac Pro, with much more powerful GPU options, will be left out. I say if Apple isn't up to the task, they should find someone who is. They've got $40 billion in excess cash reserves, and yet they cannot hire a few people to keep their video drivers up to date??? WTF!?

I have this nagging suspicion that the Apple TV could handle 1080p, even with its underpowered current hardware, if only the GPU could be utilized to give it a bit of a boost.
 
I don't understand these Apple haters! What part of "Adobe Flash is proprietary" don't y'all understand? Apple is supporting open standards: CSS, JavaScript, H.264, HTML5, etc. Flash is '90s technology!
Going by your logic that anything released in the '90s hasn't evolved since then, OS X is also '90s technology.

Flash may not live forever, but at this particular point in time it does stuff that HTML5 (which is still in its infancy) simply cannot do. And with the new 10.1 player, Flash will be in better shape than it has been... ever. With all prior versions, performance improvements were so minuscule you needed a microscope to notice them, so 10.1 is quite the paradigm shift.

You move on when the alternatives are better (as in more powerful, not just "better" for political reasons like open platforms and yada yada). You don't just move on because alternatives exist. They also need to be better in every way, from a technical standpoint.

There also seems to be this peculiar notion floating around that if we only get rid of Flash, we will be rid of animated banners with bouncing balls. Does anyone seriously believe that banners are going anywhere? Flash doesn't make ads, content producers do. If you take away their tools, they will do the exact same things with other tools. In fact, once all banners go HTML5, you'll wish they had remained in Flash, because at least there were dead easy ways for ad-blocking software to detect and jettison any Flash material. What do you think will happen when those banners become one with the content code? It will be a never-ending cat-and-mouse game where ad blockers and ad developers (lovely title) try to outsmart each other.
 
I hope the fanboys take note of this. Flash sucks on OS X because of Apple. I've used it on Windows for years, and it runs perfectly. Hopefully this will help alleviate some of the differences.

So Flash sucks on OS X because of Apple, because they don't support a feature that exists only in the newest version of Flash?

So, why does the Linux version suck too?

Sorry, if it were lousy on OS X but great on Windows AND Linux, then I'd be more inclined to point my finger at Apple, but not so long as the Linux version is crappy too.
 
Blu-ray uses a more modern format, so I doubt most GPUs would have support for it. I mean the older video formats: what percentage do those use?

Blu-ray uses H.264, and H.264 is the codec for the majority of the content out there right now.

And all the Nvidia GPUs from the 8 series onward accelerate H.264 on Windows/Linux, so they will accelerate Blu-ray as well.

I don't know about older formats; it's tough to find anything that isn't H.264 lately.

You can check DVD, for example, which is MPEG-2; in that case VLC uses around 5% CPU. But there is reverse-engineered hardware acceleration for MPEG-2 in VLC.

720p uses 50% CPU in VLC/MPlayer on a 3.06 GHz MBP, for example.
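To put the "8 series onward" claim in testable terms: on Linux, a player can ask the VDPAU driver directly which profiles the GPU will decode. A hedged sketch, with device setup as in the earlier snippet and error checks trimmed:

```c
/* Query a VDPAU driver for hardware decode support per codec profile.
 * Sketch only. Build assumption as before:
 * cc vdpau_query.c -lvdpau -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    VdpDevice dev;
    VdpGetProcAddress *gpa;
    if (!dpy || vdp_device_create_x11(dpy, DefaultScreen(dpy),
                                      &dev, &gpa) != VDP_STATUS_OK)
        return 1;

    /* Fetch the capability-query entry point from the driver. */
    VdpDecoderQueryCapabilities *query;
    gpa(dev, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query);

    struct { VdpDecoderProfile profile; const char *name; } codecs[] = {
        { VDP_DECODER_PROFILE_MPEG2_MAIN, "MPEG-2 (DVD)"    },
        { VDP_DECODER_PROFILE_H264_HIGH,  "H.264 (Blu-ray)" },
    };
    for (int i = 0; i < 2; i++) {
        VdpBool supported;
        uint32_t max_level, max_mbs, max_w, max_h;
        query(dev, codecs[i].profile, &supported,
              &max_level, &max_mbs, &max_w, &max_h);
        printf("%-16s hardware decode: %s (up to %ux%u)\n",
               codecs[i].name, supported ? "yes" : "no", max_w, max_h);
    }
    return 0;
}
```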
 
Maybe they are lazy. They sure charge a buttload for their software!! Plus, Adobe should thank Apple. The Macintosh is what put Adobe on the map!

Don't get it twisted. Adobe needed Apple to sell its wares, but Adobe churned out killer apps for Apple, which helped adoption. You think people would've bought Macs if they couldn't do anything with them?
 
LOL at the fanbois trying to spin this. Apple blaming Adobe for being lazy? How about the fact that QuickTime chokes on media that VLC can play? Or how about the fact that the Nvidia chip in my 2-year-old MBP can do hardware acceleration, but Apple won't support it? Talk about lazy programmers; Apple should look in the mirror. And talk about a sleazy company that drops support on a 2-year-old product in order to drive hardware sales.

Good thing I can just install Windows 7 to get hardware acceleration on my early '08 MBP. At least Microsoft supports its users beyond 2 years. Good job, Apple. You won't be selling hardware to me anymore with these kinds of stunts.
 
EVERYTHING I've ever read suggests that it is indeed Apple that writes the drivers for OS X, not Nvidia and not ATI. Apple has a developer agreement with these companies that gives it access to their driver code (more to the point, it is Apple that will not let Nvidia or ATI have OS X code, which is apparently top secret!).

Even on the PC side, a lot of laptop drivers are developed by the manufacturer and not by the chip maker like Nvidia or ATI. What the manufacturers get from the chip makers are reference drivers, but it's up to the manufacturers to really make the most of them. I honestly don't believe that, out of all companies, Apple would contract the chipset manufacturer to develop the release drivers.
 
LOL at the fanbois trying to spin this. Apple blaming Adobe for being lazy? How about the fact that QuickTime chokes on media that VLC can play? Or how about the fact that the Nvidia chip in my 2-year-old MBP can do hardware acceleration, but Apple won't support it? Talk about lazy programmers; Apple should look in the mirror. And talk about a sleazy company that drops support on a 2-year-old product in order to drive hardware sales.

Good thing I can just install Windows 7 to get hardware acceleration on my early '08 MBP. At least Microsoft supports its users beyond 2 years. Good job, Apple. You won't be selling hardware to me anymore with these kinds of stunts.

I agree. Nice post and very true. Well done.
 
LOL at the fanbois trying to spin this. Apple blaming Adobe for being lazy? How about the fact that QuickTime chokes on media that VLC can play? Or how about the fact that the Nvidia chip in my 2-year-old MBP can do hardware acceleration, but Apple won't support it? Talk about lazy programmers; Apple should look in the mirror. And talk about a sleazy company that drops support on a 2-year-old product in order to drive hardware sales.

Good thing I can just install Windows 7 to get hardware acceleration on my early '08 MBP. At least Microsoft supports its users beyond 2 years. Good job, Apple. You won't be selling hardware to me anymore with these kinds of stunts.

I agree with all of it, except that QuickTime X does not choke on video that VLC can play. On the contrary, QuickTime X uses multiple cores where VLC can't, so there are certain videos I can't play in VLC that play perfectly fine in QuickTime X.

But yes, Apple has been quite late to incorporate hardware acceleration for H.264.

Then again, you don't really need hardware acceleration on any of the Macs sold in the last 2 years, since all of them are perfectly capable of playing 1080p movies at 30 fps on the CPU alone. So other than battery life on the road, you don't lose much.

Any Core 2 Duo CPU is capable of playing 1080p on the CPU alone.
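The multi-core difference is, at least in principle, just a decoder setting. A sketch using libavcodec's frame threading, which spreads H.264 decoding across cores; the helper name and values are illustrative, not VLC's or QuickTime X's actual code:

```c
/* Frame-level threaded software decode with libavcodec: whole frames
 * are decoded in parallel across CPU cores. Decoders of that era which
 * left this off were pinned to a single core, whatever the CPU.
 * The context must be configured before avcodec_open2(). */
#include <libavcodec/avcodec.h>

AVCodecContext *open_threaded_decoder(const AVCodec *codec, int ncores)
{
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    if (!ctx)
        return NULL;
    ctx->thread_count = ncores;           /* e.g. 2 on a Core 2 Duo */
    ctx->thread_type  = FF_THREAD_FRAME;  /* decode frames in parallel */
    if (avcodec_open2(ctx, codec, NULL) < 0) {
        avcodec_free_context(&ctx);
        return NULL;
    }
    return ctx;
}
```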
 
So... hardware decoding is supported for the 9400 but not the 8600 GT or the 9600 GT? That is ridiculous, as in Windows my 8600 GT has hardware decoding in Flash just fine.

This is my biggest concern as well. I mean, it's not like the "older" notebook GPUs are that much slower than the latest notebook GPUs.
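The whitelist behavior is visible in the API Apple eventually shipped for this, the Video Decode Acceleration framework in OS X 10.6.3: you ask for a hardware H.264 decoder, and on a GPU that isn't on Apple's list the call simply fails. A hedged sketch loosely following Apple's Technical Note TN2267; the dimensions and avcC bytes below are placeholders, not working values, and the build flags are assumptions:

```c
/* Probe OS X's Video Decode Acceleration framework (10.6.3+).
 * On a whitelisted GPU (e.g. GeForce 9400M) this yields a decoder;
 * on others it fails with kVDADecoderHardwareNotSupportedErr.
 * Build sketch (assumption):
 * cc vda.c -framework CoreFoundation -framework VideoDecodeAcceleration */
#include <stdio.h>
#include <CoreFoundation/CoreFoundation.h>
#include <VideoDecodeAcceleration/VDADecoder.h>

static void output_cb(void *refcon, CFDictionaryRef frameInfo,
                      OSStatus status, uint32_t infoFlags,
                      CVImageBufferRef imageBuffer)
{
    /* decoded frames would arrive here */
}

int main(void)
{
    SInt32 width = 1280, height = 720;
    SInt32 fmt = 'avc1';                 /* H.264 source format */
    UInt8 avcc_dummy[] = { 0x01 };       /* real avcC header required */

    CFNumberRef w  = CFNumberCreate(NULL, kCFNumberSInt32Type, &width);
    CFNumberRef h  = CFNumberCreate(NULL, kCFNumberSInt32Type, &height);
    CFNumberRef sf = CFNumberCreate(NULL, kCFNumberSInt32Type, &fmt);
    CFDataRef avcc = CFDataCreate(NULL, avcc_dummy, sizeof avcc_dummy);

    const void *keys[] = { kVDADecoderConfiguration_Width,
                           kVDADecoderConfiguration_Height,
                           kVDADecoderConfiguration_SourceFormat,
                           kVDADecoderConfiguration_avcCData };
    const void *vals[] = { w, h, sf, avcc };
    CFDictionaryRef config =
        CFDictionaryCreate(NULL, keys, vals, 4,
                           &kCFTypeDictionaryKeyCallBacks,
                           &kCFTypeDictionaryValueCallBacks);

    VDADecoder decoder = NULL;
    OSStatus err = VDADecoderCreate(config, NULL,
                                    (VDADecoderOutputCallback *)output_cb,
                                    NULL, &decoder);
    if (err == kVDADecoderHardwareNotSupportedErr)
        printf("this GPU is not on Apple's hardware decode list\n");
    else if (err == noErr)
        printf("hardware H.264 decoder created\n");
    return 0;
}
```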

ADOBE IS DEAD.

You say this in every post you make: "(insert company) IS DEAD". No they're not, nor will they be for a long time. Microsoft is not dead. Apple is not dead. Adobe is not dead. Grow up.
 
Even on the PC side, a lot of laptop drivers are developed by the manufacturer and not by the chip maker like Nvidia or ATI. What the manufacturers get from the chip makers are reference drivers, but it's up to the manufacturers to really make the most of them. I honestly don't believe that, out of all companies, Apple would contract the chipset manufacturer to develop the release drivers.
Nvidia and ATI both offer driver packages for their mobile parts.
 
I have this nagging suspicion that the Apple TV could handle 1080p, even with its underpowered current hardware, if only the GPU could be utilized to give it a bit of a boost.

The Apple TV should be able to play 1080p, because it should already have GPU acceleration: it only plays QuickTime container files, and Apple already has GPU acceleration for QuickTime.
 
Great. I am glad to see you admit that until now it was Apple preventing Flash from running more efficiently on OS X.

Why does this tosh keep spouting from your mouth? The 10.1 beta addresses many of the historic issues Flash has had on OS X, with no hardware acceleration. So why has Flash been so bad on OS X for years? It does not take an Adobe or Apple software engineer to work this out.

Why is Flash on Linux so bad? Adobe even has access to the source, yet it isn't even on par with Flash on Windows.
 
I hope the fanboys take note of this. Flash sucks on OS X because of Apple. I've used it on Windows for years, and it runs perfectly. Hopefully this will help alleviate some of the differences.

You do realize this only applies to video? Flash still sucks on the Mac.
 
With all prior versions, performance improvements were so minuscule you needed a microscope to notice them, so 10.1 is quite the paradigm shift.

Too bad Adobe waited until now to move in that direction, and you know why? Now that a big-time company is flat out rejecting them, they have to make a case for their technology rather than just sit on their collective rears and soak up the stagnation that comes from unchallenged ubiquity.

I'm not totally taking Apple's side on this, but I hate Adobe and hated them long before I was using Apple's computers. (They were unkind to us Linux types.)
 
I thought The Amazing Criswell (known from the Ed Wood movies and the Jack Paar Show for his wildly inaccurate predictions) was the worst fortune teller ever, but with your track record of announcing the impending death of Microsoft, Adobe, or any company that isn't Apple in all caps in 90% of your posts, you have him beat.

Here's what would've been dead without Adobe: Apple, which survived the mid-'90s only thanks to the loyal support of creative professionals who kept buying its computers when nobody else did. Had there been no Photoshop for the Mac during those years, "Apple, Inc." would've been up for sale on Craigslist.


Anyone who says Adobe will ever die is clearly an idiot. What will and should die is Adobe's dominance in web content, i.e. the Flash Player, but we are probably talking 10 years from now at least. The Flash development environment is a fantastic toolset; nobody can deny that.

Whatever Adobe says, Flash is not an "open" platform, and it probably never will be. It is closed source and maintained by one company, which is not good for the end user, no matter how much of the current Internet depends on it.
 