Why do people running old macs or low spec ones keep moaning?

If you had a low spec machine and it was not able to run something or not supported by the "latest thing" well, you'd think about upgrading.

Can't go on forever with old tech.

I have a 27" iMac with an i7. It's brand new, yet it has an ATI 4850. I'm moaning because it's brand new and it isn't supported. How is that an old machine or low spec?

Why did Apple decide not to include their latest model iMac?

Has anyone tried this with an ATI 4850 to see if it works?
 
In case you didn't know, the latest Flash plugin is Cocoa and takes advantage of Core Animation in Safari 4.0 and Snow Leopard.
Yes, Flash Player 10.1 has been partially updated, six years after its codebase was deprecated and final warnings were issued. But more to the point, the Cocoa frameworks are a porting layer only: they work only in Safari, only if you're using the nightlies (though this may have been fixed recently), only on Intel, and only when the SWF object is in the normal wmode. Adobe didn't rewrite Flash completely top to bottom as a true Cocoa-native plugin, as they should have started doing years ago--it still has Carbon dependencies in most situations.

Apple certainly wasn't and isn't holding them up with their legacy code--they've been trying to get Adobe to release Cocoa rewrites since at least 2002 and gave out stern warnings in 2004. It's 2010 and Adobe is just getting started. If there was any technical holdup (a dubious argument to begin with), it was in browser support, but Firefox has supported Quartz plugins for at least about three years now, if not longer, and Safari always has.
The previous holdup appears to be Apple again, since Safari 3.x and Leopard did not expose Core Animation to browser plugins, even for Cocoa applications.
Flash was a mess four years before Core Animation existed at all. Until about two months ago, the whole thing relied on QuickDraw, for crying out loud.
Personally, I think it's the fault of everyone involved. Apple for not making the APIs available sooner.
The availability of the h.264 API does nothing to address Flash's performance problems.

Silverlight manages to have a functioning, efficient plugin across all supported browsers, playing h.264 at lower CPU usage than Flash even before this API was available. VLC decodes and plays locally-stored Flash video more efficiently on OS X than Adobe's own Flash Player. Neither of these products relied on a spurious argument about access to video hardware acceleration to distract from the runtime's overall performance. Flash content performance sucks even on SWF objects that don't have a single frame of h.264.

...even on Windows, which is miles ahead of the Linux and Mac versions.
Just because you keep calling HTML5 a "standard" in every other post doesn't make it so. It's a proposed standard that's not even close to being nailed down.
Draft standards have always been adopted before finalization--HTML4 was the same way, just like the lack of final 802.11n ratification didn't stop manufacturers from selling draft-compliant hardware for about two years. Calling it a "proposed standard" like it's an idle idea for future development is a little ridiculous. Some parts of it are finished, like the basic support for the new video objects.

The HTML5 Canvas portions that are still being worked out, the question of whether the video tag officially supports any particular formats, and other unresolved areas have no bearing here. What already exists will remain in the final, ratified version. The only question is how much more will be added to HTML5 before its ratification.
 
This looks like some good news for many of us, but it seems only the 9400M and above are supported, which leaves out those of us with older C2D and Core Duo MacBook Pro and iMac models that have either an older NVIDIA GPU or an ATI-based one.
http://www.engadget.com/2010/04/28/flash-player-gala-brings-hardware-decoding-support-to-mac-os-x/

Start writing your complaints to Apple. They only respond when they get a giant flood of complaints about something. Otherwise, they sweep it under the rug and stick their fingers in their ears.

Frankly, I think it's utterly ridiculous that the discrete chips on the same notebooks cannot use hardware decoding, but the cheap/slow integrated ones CAN. My 8600M GT is fully capable of hardware H.264 decoding, but Apple doesn't support it. When even the Mac Pro gets no hardware decoding support, you know something isn't right. Apple doesn't even offer an excuse for that, because there is none. They're lazy, greedy, and don't want to hire any new programmers to keep OS X up to date despite $40+ billion in cash reserves. OS X 10.7 will get delayed due to the iPhone/iPad (yet again, just like Leopard) and there is simply NO EXCUSE for it given the amount of money Apple has. This planet needs jobs (not "Jobs") and Apple is being as stingy as they can possibly be. :mad:
 
I don't see much improvement over the last version. I have a 9400M. The problem is that the plugin easily eats up more than 2 GB of memory just from watching YouTube videos, and eventually causes Safari to stop responding. I have to kill the Flash plugin in Activity Monitor and reload the pages.
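For what it's worth, you don't need Activity Monitor to spot this; the plugin's resident size shows up in `ps aux` output too. Here's a rough sketch of checking it programmatically--the process name and the numbers in the sample are made up for illustration, so adjust for your own system:

```python
def flag_heavy(ps_output, name, limit_mb):
    """Return the resident size (in MB) of each process in `ps aux`-style
    output whose command matches `name` and exceeds `limit_mb`."""
    heavy = []
    for line in ps_output.splitlines()[1:]:   # skip the header row
        cols = line.split(None, 10)           # 11th column is the command
        if len(cols) < 11 or name not in cols[10]:
            continue
        rss_mb = int(cols[5]) / 1024          # ps reports RSS in KB
        if rss_mb > limit_mb:
            heavy.append(rss_mb)
    return heavy

# Canned sample output (values invented for illustration):
sample = """USER  PID %CPU %MEM    VSZ     RSS  TT  STAT STARTED TIME COMMAND
me    123  45.0 25.0 300000 2200000  ??  R    9:00AM  1:23 Flash Player (Safari Internet plug-in)
me    456   1.0  2.0 100000   80000  ??  S    9:00AM  0:03 Safari"""
print(flag_heavy(sample, "Flash Player", 2048))  # flags the plugin past 2 GB
```

In practice you'd feed it the real output of `ps aux` and then kill the offending PID, which is exactly what Activity Monitor does for you.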
 
Frankly, I think it's utterly ridiculous that the discrete chips on the same notebooks cannot use hardware decoding, but the cheap/slow integrated ones CAN. My 8600M GT is fully capable of hardware H.264 decoding, but Apple doesn't support it. When even the Mac Pro gets no hardware decoding support, you know something isn't right. Apple doesn't even offer an excuse for that, because there is none.
That's quite the rant. Apple implemented hardware acceleration for VP3-compatible cards, of which the MCP79 and GT215/6 are the only models Apple has shipped.

That excludes outdated G92-based cards like the 8600M. The Mac Pro argument is a red herring. It's not about how powerful the system or the graphics hardware is, but about the technology it implements--and for that matter, the more processing muscle the computer has, the less important dedicated acceleration is. nVidia's more powerful GPUs, used in discrete solutions, are based on older cores, with older implementations of hardware acceleration components.

It's the same story elsewhere: OpenCL only supported the then-current generation of GPGPU hardware features, just as Core Image required D3D9-class programmable shaders. As with new versions of DirectX, older but more powerful cards are simply not technologically capable of new features, even though they may continue to outperform newer, lesser models.
They're lazy, greedy and don't want to hire any new programmers to keep OSX up-to-date despite $40+ Billion in cash reserves.
They implement a feature set, and then make it available to the hardware that supports it.

You can call it lazy not to write a crippled or transitional "lite" version for dead-end hardware features if you like, but you're going to have to paint everyone with that brush. Microsoft writes D3D around a hardware set, and cards that don't have that hardware are just left out. nVidia and AMD/ATI could update their cores annually so that all GPU models from a given year have the same hardware features, but that would raise prices and vastly shorten design cycles, holding up evolution. Apple could have gotten the ball rolling faster and supported hardware acceleration a core generation earlier. Intel could have finished up the 32nm process sooner and given us lower-power silicon, etc.
 
That's quite the rant. Apple implemented hardware acceleration for VP3-compatible cards, of which the MCP79 and GT215/6 are the only models Apple has shipped.

That excludes outdated G92-based cards like the 8600M.

You can call it lazy not to write a crippled or transitional "lite" version for dead-end hardware features if you like, but you're going to have to paint everyone with that brush.
What are you talking about? Outdated and dead-end only because Apple makes it so. Throw Windows 7 on any 8600M GT-equipped Mac and it will do H.264 acceleration all day.
 
I briefly tried the Gala flash player on my 2.0GHz unibody MacBook with NVIDIA's 9400M and it does make a difference in CPU utilization when viewing content on Hulu. Below are the typical CPU/core utilization results for Safari v4.0.5 when watching the "Secretary's Day" episode of "The Office" at 480p:

v10.0r45 release: 83%
v10.1r53 release candidate: 64%
v10.1.81.3 Gala preview: 50%

These numbers relate to a single core, so total CPU utilization was about one half of those numbers (42%, 32%, and 25% respectively on my 2.0GHz Core 2 Duo). Thus, although the Gala preview is better I wouldn't call it a dramatic improvement over the existing 10.1 release candidate (it is, however, notably better than the 10.0r45 player).
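A quick sanity check on that per-core-to-total arithmetic (this is just the division described above, nothing more):

```python
def total_utilization(per_core_pct, n_cores):
    """Convert a per-core utilization figure (the way Activity Monitor
    reports it) into utilization of the whole machine."""
    return per_core_pct / n_cores

# The three Flash builds measured above, on a 2-core Core 2 Duo:
for build, pct in [("10.0r45", 83), ("10.1r53 RC", 64), ("Gala preview", 50)]:
    print(build, round(total_utilization(pct, 2)))  # 42, 32, 25 respectively
```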
 
This is pure idiocy and you know it.

Lol, whatever you say. ;)

Draft standards have always been adopted before finalization--HTML4 was the same way, just like the lack of final 802.11n ratification didn't stop manufacturers from selling draft-compliant hardware for about two years. Calling it a "proposed standard" like it's an idle idea for future development is a little ridiculous. Some parts of it are finished, like the basic support for the new video objects.

The HTML5 Canvas portions that are still being worked out, the question of whether the video tag officially supports any particular formats, and other unresolved areas have no bearing here. What already exists will remain in the final, ratified version. The only question is how much more will be added to HTML5 before its ratification.
 
The 330M needs to be started manually to use it

On the new MacBook Pros, if you simply start Safari and view an HD Flash video, hardware decoding is not performed (assuming the Intel graphics are active by default). I have to manually start an application that kicks in the 330M, and then when I view the video, hardware decoding takes place.

This kind of defeats the purpose: I have to manually monitor which card I am on. Switching to full-screen Flash video turns on the NVIDIA card (verified with the gfxcardstatus app), but the white square does not come on. So even though the card switches from Intel to NVIDIA, hardware decoding is not switched on after the video has already started.

So, to make sure one is getting hardware decoding, one has to first manually start an app that turns on the 330m. And then load the page with the flash video. Any inputs on this?
 
...So, to make sure one is getting hardware decoding, one has to first manually start an app that turns on the 330m. And then load the page with the flash video. Any inputs on this?
How about on HTML5 video? If you switch to the HTML5 beta on YouTube does the 330M get switched on when you view HD content that is using HTML5 rather than Flash? This should be nearly the same case, since only the 330M will do Apple's hardware decode acceleration for H.264. You can switch YouTube to the HTML5 beta using this page on YouTube:

http://www.youtube.com/html5

You can identify the HTML5 video by watching carefully for an HTML5 progress/waiting "clock" as the video begins to load prior to playback (rather than the standard YouTube waiting "clock" for Flash content). You need to check this since even after you enable the HTML5 beta player much of the HD content will still be played using Flash.
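If squinting at the loading clock gets old, a cruder check is to look at the page source for an HTML5 video element. This is a simplified sketch--real YouTube markup is more involved and partly injected by script, so treat it as illustrative only:

```python
from html.parser import HTMLParser

class VideoTagFinder(HTMLParser):
    """Detect whether a page's markup contains an HTML5 <video> element."""
    def __init__(self):
        super().__init__()
        self.found = False

    def handle_starttag(self, tag, attrs):
        if tag == "video":
            self.found = True

def uses_html5_video(page_html):
    finder = VideoTagFinder()
    finder.feed(page_html)
    return finder.found

# Toy markup, not real YouTube output:
print(uses_html5_video('<div><video src="clip.mp4"></video></div>'))        # True
print(uses_html5_video('<object type="application/x-shockwave-flash"></object>'))  # False
```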
 
That's quite the rant. Apple implemented hardware acceleration for VP3-compatible cards, of which the MCP79 and GT215/6 are the only models Apple has shipped.

That excludes outdated G92-based cards like the 8600M. The Mac Pro argument is a red herring. It's not about how powerful the system or the graphics hardware is, but about the technology it implements--and for that matter, the more processing muscle the computer has, the less important dedicated acceleration is. nVidia's more powerful GPUs, used in discrete solutions, are based on older cores, with older implementations of hardware acceleration components.

It's the same story elsewhere: OpenCL only supported the then-current generation of GPGPU hardware features, just as Core Image required D3D9-class programmable shaders. As with new versions of DirectX, older but more powerful cards are simply not technologically capable of new features, even though they may continue to outperform newer, lesser models.

They implement a feature set, and then make it available to the hardware that supports it.

You can call it lazy not to write a crippled or transitional "lite" version for dead-end hardware features if you like, but you're going to have to paint everyone with that brush. Microsoft writes D3D around a hardware set, and cards that don't have that hardware are just left out. nVidia and AMD/ATI could update their cores annually so that all GPU models from a given year have the same hardware features, but that would raise prices and vastly shorten design cycles, holding up evolution. Apple could have gotten the ball rolling faster and supported hardware acceleration a core generation earlier. Intel could have finished up the 32nm process sooner and given us lower-power silicon, etc.

The fact of the matter is, my 8600M GT can do hardware acceleration in Vista/Boot Camp in Flash 10.1 with no issues. It is capable of doing it. OS X, for whatever reason, does not allow Flash to do this. Hence, a diminished user experience for the customer.
 
Do any machines have JUST the 9600mGT? Because even when using the 9600 for graphics, the 9400 is still powered and available for use with OpenCL - and presumably also available for H264 decoding. Thus 9600 drivers aren't required.

No, most 15" MacBook Pros have the 9600M, except a few of the lowest-end models that have only the 9400M; but it's guaranteed that if you have the 9600M, you also have the 9400M. To see if you have the 9600M, go to System Preferences -> Energy Saver; if you have the two GPUs, at the top there should be options for Higher Performance and Better Battery Life. But the fact is that when the 9600M is powered on (when you're on Higher Performance), your 9400M is powered off.

I just booted into "Higher Performance"/9600M GT mode using my late 2009 model 15 inch MBP and when I view Youtube videos and the like I AM getting the small white square in the upper left corner which signifies that the hardware acceleration is on. Since the 9600M GT is not one of the graphics cards supported, I can only assume that it's using the hardware acceleration of my 9400M even though I'm in "Higher Performance" mode.

How else would you explain the small white square?
 
I just booted into "Higher Performance"/9600M GT mode using my late 2009 model 15 inch MBP and when I view Youtube videos and the like I AM getting the small white square in the upper left corner which signifies that the hardware acceleration is on. Since the 9600M GT is not one of the graphics cards supported, I can only assume that it's using the hardware acceleration of my 9400M even though I'm in "Higher Performance" mode.

How else would you explain the small white square?

It's probably worth pointing out that playing the same HD Youtube video in "Higher Performance" mode made my CPU temp shoot up to 80C. The same vid in 9400M only mode only caused the temp to rise to 60C.
 
What are the chances of the ATI 4850 being supported in the future? I want to know if Apple has already made my brand new flagship iMac obsolete. It's silly that the Mac Mini is supported over my machine. I'll return this computer and wait for a GPU upgrade on the next refresh if the 4850 has no chance of being supported.
 
That's quite the rant. Apple implemented hardware acceleration for VP3-compatible cards, of which the MCP79 and GT215/6 are the only models Apple has shipped.

That excludes outdated G92-based cards like the 8600M. The Mac Pro argument is a red herring. It's not about how powerful the system or the graphics hardware is, but about the technology it implements--and for that matter, the more processing muscle the computer has, the less important dedicated acceleration is. nVidia's more powerful GPUs, used in discrete solutions, are based on older cores, with older implementations of hardware acceleration components.

Your entire post is laughable. First you start using terms like "outdated" to provoke an emotional reaction, and then downplay hardware acceleration for computers like the Mac Pro because they supposedly don't need it (that's not the point, and it's still a waste of CPU power, I don't care how powerful the system is). Beyond that you just seem to be utterly CLUELESS about what features various GPUs support. The method of acceleration is beside the point. "Outdated" cards like the 8600M GT provide H.264 hardware acceleration PERIOD and Apple is not using it. They should have had drivers with this support when the computer was released, and there is still NO EXCUSE for not adding it.

It's the same story elsewhere: OpenCL only supported the then-current generation of GPGPU hardware features, just as Core Image required D3D9-class programmable shaders. As with new versions of DirectX, older but more powerful cards are simply not technologically capable of new features, even though they may continue to outperform newer, lesser models.

They implement a feature set, and then make it available to the hardware that supports it.

Here you paint just the opposite story for the "outdated" 8600M GT. It does have OpenCL support in OSX. The implication once again that various older cards do not have hardware acceleration for H264 or other video standards is just plain wrong or at best you are being deceitful in your portrayal of the situation. Well, either that or you are just plain ignorant. Take your pick.

You can call it lazy not to write a crippled or transitional "lite" version for dead-end hardware features if you like, but you're going to have to paint everyone with that brush. Microsoft writes D3D around a hardware set, and cards that don't have that hardware are just left out. nVidia and AMD/ATI could update their cores annually so that all GPU models from a given year

Once again you are posting red herrings galore, and they just make your post look ignorant. It is the video card makers that offer driver-level support for the features of their video cards in Windows. Thus any feature they support will have driver support. This is why the "outdated" 8600M GT in FACT *HAS* hardware H.264 support in Windows.

DirectX is moot, regardless. It is well known that Microsoft PURPOSELY crippled it for Windows XP. It is not the hardware vendors or their drivers that limit the feature support for a given DirectX call, but Microsoft. Those same cards have the DirectX 10 and 11 features available from OpenGL, but Microsoft wants to force users to upgrade from XP to Windows 7 (and originally Vista). In short, you not only leave the relevant facts in La-La land, but ignore other behaviors of Microsoft that have nothing to do with feature support, but with marketing. And it is this latter thing that drives Apple not to support the older hardware features they've been missing all along. Why offer features for older computers if they can "encourage" you to buy a new one?
 
The newer version, released today, lets me take advantage of hardware accelerated Flash on my MacBook Air (w/ 9400m). With the previous beta I never got that little white block, plus it was pretty obvious visually there was no hardware acceleration going on. :D

It may just be that, with the beta, things still aren't well optimized yet; but so far, hardware-accelerated Flash on this computer still occasionally can't keep up (not too bad with 720p, but 1080p doesn't really work well). Watching the same-resolution H.264 videos in QuickTime works just fine. I realize there are potentially other variables that could play into this, but it does still feed the impression that, even with hardware acceleration, Flash is (speaking euphemistically) less than optimally designed.
 
Beyond that you just seem to be utterly CLUELESS about what features various GPUs support. The method of acceleration is beside the point. "Outdated" cards like the 8600M GT provide H264 hardware acceleration PERIOD and Apple is not using it.
Clueless indeed. The G92 core provides H.264 acceleration that is a generation old. Apple wrote its acceleration to take advantage of the current generation of acceleration hardware and made a strategic decision not to invest in implementing it on hardware that has already been superseded.

Windows support for some select GF8 cores (including the G92 in the 8600) is a consequence of timing. Hardware-accelerated video has been a feature since the GeForce 6 series, but support on Windows is limited to a subset of 8-series and newer models--the older cards have hardware acceleration PERIOD, too, but can't use it because it's outdated.

All you're really saying is that you're bent out of shape that your computer didn't make the cut. The asinine claim that retroactive support of last-generation hardware is an expected part of introducing new graphics features doesn't hold up.
Here you paint just the opposite story for the "outdated" 8600M GT. It does have OpenCL support in OSX.
Because its GPGPU implementation was current when OpenCL was finalized. It's not a difficult concept to grasp.

Hardware support for current and future technology platforms is written at introduction. If older designs don't have the necessary level of hardware support to take advantage of the new feature as written, they don't get to use the feature, even if they have some older equivalent.

Programmable vertex shaders, fragment processing, and vector units were not totally new when GPGPU technologies started taking off; the right kind of sufficiently modern programmable vertex shaders and vector units are needed. Older versions of the same technology lack the capabilities for the newer technology; usually that's why the old products are being replaced.
The implication once again that various older cards do not have hardware acceleration for H264 or other video standards is just plain wrong or at best you are being deceitful in your portrayal of the situation.
There is no such implication. Your response is no less a full-on rant than your first post. Can the drama.
It is the video card makers that offer driver level support for the features of their video cards in Windows.
They provide a generic API. The details of the implementation are left to the software stack. nVidia says so in as many words in their technical documentation.
Thus any feature they support will have driver support.
Amidst your scrambling to use the word "ignorant" as many times as possible, perhaps you overlooked the fact that driver support isn't really the issue.
This is why in fact the "outdated" 8600M GT in FACT *HAS* hardware H264 support in Windows.
Because H.264 acceleration was implemented first, when v2 hardware was still current. Really now.
What are you talking about? Outdated and dead-end only because Apple makes it so.
Outdated and dead-end because nVidia and ATI make it so. The hardware those products used is a generation out of date (literally outdated) and all future products will use the current or a future generation of the hardware (hence dead-end).
Throw Windows 7 on any 8600M GT equipped Mac and it will do h.264 acceleration all day.
And?
The fact of the matter is, my 8600M GT can do hardware acceleration in Vista/Boot Camp in Flash 10.1 with no issues.
Of course it can. Windows hardware accelerated H.264 is an older implementation. If Apple had developed the access to the functionality a year earlier or Windows support had come a year later, they'd be in the same boat.
It is capable of doing it.
No one denies it.
OSX, for whatever reason, does not allow flash to do this. Hence, a diminished user experience for the customer.
It's got nothing to do with Flash. Apple's hardware-accelerated graphics layer API for H.264 was written for current-generation, VP3-capable hardware. They chose not to write it for prior generations of hardware features. You don't have to like that fact, but although some crybabies might claim that there is no reason and no difference, the reality is that the GPU cores in question have different hardware. Support wasn't decided with a Ouija board. All products running cores with the current generation of hardware are supported by Apple. First- and second-generation hardware support for these functions wasn't written.

Like any other business decision, the simple reality was that there were insufficient long-term gains to be made by it to justify dedicating resources to developing it. Like CoreImage and OpenCL, future products will benefit from it, and some current products that don't have the necessary hardware get left behind.

Even on Windows, the G92 core barely made the cut. Even though the GeForce 6 and 7 cores also had hardware video acceleration PERIOD, as some might say, they're not supported because they were outdated, dead-end products when Windows acceleration was implemented. Only select GeForce 8 and 9 models were supported. Had Windows acceleration support come a few months later, the list would probably be the same as Apple's.
 
Is it just me, or is it not that great? It's very picky about which videos it chooses to accelerate. I've tried five YouTube HD vids (three 1080p, two 720p) and only ONE (720p) was accelerated!

Also HTML5 isn't that much better...HD HTML5 vids still push my CPU usage to 50% or more.

They all suck!

Fun fact: a 1080p Flash video will take 80-85% of the CPU; 720p, 65-75%!
 
Even on Windows, the G92 core barely made the cut. Even though the GeForce 6 and 7 cores also had hardware video acceleration PERIOD, as some might say, they're not supported because they were outdated, dead-end products when Windows acceleration was implemented. Only select GeForce 8 and 9 models were supported. Had Windows acceleration support come a few months later, the list would probably be the same as Apple's.

I find that statement odd. The GF6 never had H.264 acceleration of any kind; I think it had some VC-1 acceleration, though. But using DXVA, if your card can accelerate H.264, it will. That includes ATI cards as well.

Did Apple just not write a hardware agnostic acceleration layer? Or are they not done adding support for other cards (ATI)?
 
Clueless indeed.

So your MEGA RANT boils down to the fact you think a 1.5 year old computer is apparently ANCIENT and deserves no support from Apple. A 1.5 year old computer! No support. No features. Don't implement because it's "outdated" (by your own admitted asinine logic)....I take it back. You're not clueless. You're a total TROLL. Why don't you go troll somewhere else? You should be deleted from here.

Proof:
1> You think faster better computers like the Mac Pro should apparently not get newer features like GPU hardware decoding because they don't NEED it??? Completely and utterly ludicrous logic. I cannot begin to state how this simply destroys ANY credibility to your character because it's obvious trolling.

2> You call computers sold by Apple less than two years ago "outdated" and therefore conclude the hardware features their chipsets contain should not be supported. I call this laziness on Apple's part and you insist that it's NVIDIA and ATI's doing? WTF are you smoking? Apple writes the drivers for OS X, and they are the ones that should be supporting the features NVIDIA provided to them in hardware. You cannot accept the opinion that Apple should support hardware that is only a year and a half old that they didn't bother to support at the time, and instead RANT about how old my computer is while simultaneously ignoring the FACT that even newer/better hardware like the Mac Pro gets NO SUPPORT EITHER. Again, that doesn't make the slightest bit of sense whatsoever. So again, that makes you a TROLL, because the ONLY point of your rant is to get an emotional rise. Meanwhile, Adobe is adding hardware-assisted decoding for even total crap chipsets like the Intel GMA 500 series, which can now run 720p with little CPU use on Windows, while Apple cannot decode HD video on even their flagship Mac Pro with a top-of-the-line video card, but that's not Apple's fault.... :rolleyes:
 
I find that statement odd. GF6 never had h.264 acceleration of any kind. I think it had some VC1 acceleration though. But using DXVA, if your card can accelerate h.264 it will. That includes ATI cards as well.

Did Apple just not write a hardware agnostic acceleration layer? Or are they not done adding support for other cards (ATI)?
Apple's work appears to use the VDPAU libraries, whose support isn't limited to VP3 hardware.
 