HTML will be and is a standard no matter what number comes after it.
LOL@YRSLF
This is pure idiocy and you know it.
Why do people running old Macs or low-spec ones keep moaning?
If you had a low-spec machine and it couldn't run something, or it wasn't supported by the "latest thing," well, you'd think about upgrading.
Can't go on forever with old tech.
In case you didn't know, the latest Flash plugin is Cocoa and takes advantage of Core Animation in Safari 4.0 and Snow Leopard.
Yes, Flash Player 10.1 has been partially updated, six years after its codebase was deprecated and final warnings were issued. But more to the point, the Cocoa frameworks are a porting layer only, and work only in Safari, only if you're using the nightlies (though this may have been fixed recently), only on Intel, and only when the SWF object is in the normal wmode--they didn't rewrite Flash completely top to bottom as a true Cocoa-native plugin, like they should have started years ago--it still has Carbon dependencies in most situations.
The previous hold up appears to be again Apple, since Safari 3.x and Leopard did not expose Core Animation to browser plugins even for Cocoa applications.
Flash was a mess four years before Core Animation existed at all. Until about two months ago, the whole thing relied on QuickDraw, for crying out loud.
Personally, I think it's everyone involved's fault. Apple for not making the APIs
The availability of the h.264 API does nothing to address Flash's performance problems.
Just because you keep calling HTML5 a "standard" in every other post doesn't make it so. It's a proposed standard that's not even close to being nailed down.
Draft standards have always been adopted before finalization--HTML4 was the same way, just like the lack of final 802.11n ratification didn't stop manufacturers from selling draft-compliant hardware for about two years. Calling it a "proposed standard" like it's an idle idea for future development is a little ridiculous. Some parts of it are finished, like the basic support for the new video objects.
This looks like good news for many of us, but it seems like only the 9400 and above are supported, which leaves out those of us with older C2D and CD MacBook Pro and iMac models that have either an older nVidia GPU or an ATI-based GPU.
http://www.engadget.com/2010/04/28/flash-player-gala-brings-hardware-decoding-support-to-mac-os-x/
Frankly, I think it's utterly ridiculous that the discrete chips on the same notebooks cannot use hardware decoding, but the cheap/slow integrated ones CAN. My 8600M GT is fully capable of hardware H264 decoding, but Apple doesn't support it. When even the Mac Pro gets no hardware decoding support, you know something isn't right. Apple doesn't even offer an excuse for that because there is none.
That's quite the rant. Apple implemented hardware acceleration of VP3-compatible cards, of which the MCP79 and GT215/6 are the only models Apple has shipped.
They're lazy, greedy and don't want to hire any new programmers to keep OSX up-to-date despite $40+ Billion in cash reserves.
They implement a feature set, and then make it available to the hardware that supports it.
You can call it lazy not to write a crippled or transitional "lite" version for dead-end hardware features if you like, but you're going to have to paint everyone with that brush.
What are you talking about? Outdated and dead-end only because Apple makes it so. Throw Windows 7 on any 8600M GT equipped Mac and it will do h.264 acceleration all day.
This is pure idiocy and you know it.
The HTML5 Canvas portions that are still being worked out, whether the video object tag officially supports any particular formats, and the other areas still unresolved have no bearing here. What already exists will remain in the final, ratified version. The only question is how much more will be added to HTML5 for its ratification.
So, to make sure one is getting hardware decoding, one has to first manually start an app that turns on the 330M, and then load the page with the Flash video. Any inputs on this?
How about on HTML5 video? If you switch to the HTML5 beta on YouTube, does the 330M get switched on when you view HD content that is using HTML5 rather than Flash? This should be nearly the same case, since only the 330M will do Apple's hardware decode acceleration for H.264. You can switch YouTube to the HTML5 beta using this page on YouTube:...
That's quite the rant. Apple implemented hardware acceleration of VP3-compatible cards, of which the MCP79 and GT215/6 are the only models Apple has shipped.
That excludes outdated G92-based cards like the 8600M. The Mac Pro argument is a red herring. It's not about how powerful the system or the graphics hardware is, but about the technology it implements--and for that matter, the more processing muscle the computer has, the less important dedicated acceleration is. nVidia's more powerful GPUs, used in discrete solutions, are based on older cores, with older implementations of hardware acceleration components.
It's the same story elsewhere: OpenCL only supported the then-current generation of GPGPU hardware features, just as Core Image required D3D9-class programmable shaders. As with new versions of DirectX, older but more powerful cards are simply not technologically capable of the new features, even though they may continue to outperform newer, lesser models.
They implement a feature set, and then make it available to the hardware that supports it.
You can call it lazy not to write a crippled or transitional "lite" version for dead-end hardware features if you like, but you're going to have to paint everyone with that brush. Microsoft writes D3D around a hardware set, and cards that don't have that hardware are just left out. nVidia and AMD/ATI could update their cores annually so that all GPU models from a given year have the same hardware features, but that would raise prices and vastly shorten design cycles, holding up evolution. Apple could have gotten the ball rolling faster and supported hardware acceleration a core generation earlier. Intel could have finished up the 32nm process sooner and given us lower-power silicon, etc.
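To make the OpenCL point above concrete, here is a minimal sketch in plain C against the standard OpenCL headers (nothing from this thread, just an illustration): it asks the runtime which GPUs it will expose. A card whose silicon lacks the required GPGPU feature set simply never shows up in the list, no matter how fast it is at everything else.

/* list_opencl_gpus.c
 * Build on a Mac:  cc list_opencl_gpus.c -framework OpenCL
 * Build elsewhere: cc list_opencl_gpus.c -lOpenCL
 */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platforms[4];
    cl_uint num_platforms = 0;
    if (clGetPlatformIDs(4, platforms, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
        printf("No OpenCL platforms found.\n");
        return 1;
    }
    for (cl_uint p = 0; p < num_platforms; ++p) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        /* Ask only for GPU devices: hardware without the required
         * feature set is never enumerated here. */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices, &num_devices) != CL_SUCCESS)
            continue;
        for (cl_uint d = 0; d < num_devices; ++d) {
            char name[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("OpenCL-capable GPU: %s\n", name);
        }
    }
    return 0;
}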
Do any machines have JUST the 9600M GT? Because even when using the 9600M for graphics, the 9400M is still powered and available for use with OpenCL - and presumably also available for H264 decoding. Thus 9600M drivers aren't required.
No, most 15" MacBook Pros have the 9600M, except a few of the lowest-end models that have only the 9400M, but it is guaranteed that if you have the 9600M you also have the 9400M. To see if you have the 9600M, go to System Preferences -> Energy Saver; if you have the two GPUs, at the top there should be an option for Higher Performance and Better Battery Life. But the fact is that when the 9600M is powered on (when you're on Higher Performance), your 9400M is powered off.
I just booted into "Higher Performance"/9600M GT mode on my late-2009 15-inch MBP, and when I view YouTube videos and the like I AM getting the small white square in the upper left corner which signifies that hardware acceleration is on. Since the 9600M GT is not one of the supported graphics cards, I can only assume that it's using the hardware acceleration of my 9400M even though I'm in "Higher Performance" mode.
How else would you explain the small white square?
9600M GT won't work? The most idiotic move ever.
Beyond that you just seem to be utterly CLUELESS about what features various GPUs support. The method of acceleration is beside the point. "Outdated" cards like the 8600M GT provide H264 hardware acceleration PERIOD and Apple is not using it.
Clueless indeed. The G92 core provides H.264 acceleration that is a generation old. Apple wrote its acceleration to take advantage of the current generation of acceleration hardware and made a strategic decision not to invest in implementing it on hardware that has already been superseded.
Here you paint just the opposite story for the "outdated" 8600M GT. It does have OpenCL support in OSX.
Because its GPGPU implementation was current when OpenCL was finalized. It's not a difficult concept to grasp.
The implication once again that various older cards do not have hardware acceleration for H264 or other video standards is just plain wrong, or at best you are being deceitful in your portrayal of the situation.
There is no such implication. Your response is no less a full-on rant than your first post. Can the drama.
It is the video card makers that offer driver level support for the features of their video cards in Windows.
They provide a generic API. The details of the implementation are left to the software stack. nVidia says so in as many words in their technical documentation.
Thus any feature they support will have driver support.
Amidst your scrambling to use the word "ignorant" as many times as possible, perhaps you overlooked the fact that driver support isn't really the issue.
This is why in fact the "outdated" 8600M GT in FACT *HAS* hardware H264 support in Windows.
Because H.264 acceleration was implemented first, when v2 hardware was still current. Really now.
What are you talking about? Outdated and dead-end only because Apple makes it so.
Outdated and dead-end because nVidia and ATI make it so. The hardware those products used is a generation out of date (literally outdated) and all future products will use the current or a future generation of the hardware (hence dead-end).
Throw Windows 7 on any 8600M GT equipped Mac and it will do h.264 acceleration all day.
And?
The fact of the matter is, my 8600GT can do hardware acceleration in Vista/Boot Camp in Flash 10.1 with no issues.
Of course it can. Windows hardware-accelerated H.264 is an older implementation. If Apple had developed access to the functionality a year earlier, or Windows support had come a year later, they'd be in the same boat.
It is capable of doing it.
No one denies it.
OSX, for whatever reason, does not allow Flash to do this. Hence, a diminished user experience for the customer.
It's got nothing to do with Flash. Apple's hardware-accelerated graphics-layer API for H.264 was written for current-generation, VP3-capable hardware. They chose not to write it for prior generations of hardware features. You don't have to like that fact, but although some crybabies might claim that there is no reason and no difference, the reality is that the GPU cores in question have different hardware. Support wasn't decided with a Ouija board. All products running cores with the current-generation hardware are supported by Apple. First and second-generation hardware support for these functions wasn't written.
Even on Windows, the G92 core barely made the cut. Even though the GeForce 6 and 7 cores also had hardware video acceleration PERIOD, as some might say, they're not supported, because they were outdated, dead-end products when Windows acceleration was implemented. Only select GeForce 8 and 9 models were supported. Had Windows acceleration support come a few months later, the list would probably be the same as Apple's.
I find that statement odd. GF6 never had h.264 acceleration of any kind. I think it had some VC1 acceleration though. But using DXVA, if your card can accelerate h.264 it will. That includes ATI cards as well.
Apple's work appears to be using the VDPAU libraries. Their support isn't limited to VP3 hardware.
Did Apple just not write a hardware agnostic acceleration layer? Or are they not done adding support for other cards (ATI)?
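Since an earlier post claims Apple's work "appears to be using the VDPAU libraries": VDPAU is nVidia's generic decode API on Linux/X11, and the sketch below (plain C against the stock libvdpau headers) is only an illustration of the "generic API, implementation left to the driver" point, not Apple's actual plumbing. It asks whichever VDPAU driver is installed whether it advertises hardware H.264 decoding and at what limits; a GPU whose driver doesn't implement the profile simply reports it as unsupported.

/* vdpau_h264_check.c -- Linux/X11 only.
 * Build: cc vdpau_h264_check.c -lvdpau -lX11
 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

int main(void) {
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

    VdpDevice device;
    VdpGetProcAddress *get_proc_address;
    if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &device,
                              &get_proc_address) != VDP_STATUS_OK) {
        fprintf(stderr, "VDPAU device creation failed (no capable driver?)\n");
        return 1;
    }

    /* VDPAU hands out its entry points through get_proc_address. */
    VdpDecoderQueryCapabilities *query;
    if (get_proc_address(device, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES,
                         (void **)&query) != VDP_STATUS_OK) {
        fprintf(stderr, "driver does not expose decoder capability queries\n");
        return 1;
    }

    VdpBool supported = VDP_FALSE;
    uint32_t max_level, max_macroblocks, max_width, max_height;
    if (query(device, VDP_DECODER_PROFILE_H264_HIGH, &supported, &max_level,
              &max_macroblocks, &max_width, &max_height) == VDP_STATUS_OK &&
        supported) {
        printf("H.264 High profile hardware decode: up to %ux%u\n",
               max_width, max_height);
    } else {
        printf("This GPU/driver combination reports no H.264 High profile decode.\n");
    }
    return 0;
}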