Apple is the reason. They don't support their products. Not likely to change, because people keep buying their stuff anyway.

So it's essentially a graphics driver issue that could be remedied by Apple updating the drivers?
I own a 27" iMac and don't want to believe that a machine that is only a few months old isn't eligible for important updates.
 
So it's essentially a graphics driver issue that could be remedied by Apple updating the drivers?
I own a 27" iMac and don't want to believe that a machine that is only a few months old isn't eligible for important updates.

Welcome to Apple's hardware support. It would be awesome if they added ATi support in an update, though. Not holding my breath.
 
I have an old G5 iMac (pre-iSight) and Flash videos just about kill it. Sites like iPlayer run slowly, video keeps stopping, and CPU usage goes mad when Flash kicks in :mad:

Can any of you recommend a way around this? ClickToFlash is great for stopping adverts etc., but it does not help my problem.
 
So it's essentially a graphics driver issue that could be remedied by Apple updating the drivers?
I own a 27" iMac and don't want to believe that a machine that is only a few months old isn't eligible for important updates.

You're an Apple owner; believing whatever you want is part of the job. ;)

Just wait until you try putting a Blu-ray disc in your 'HD' iMac.

penlost said:
I have an old G5 iMac (pre-iSight) and Flash videos just about kill it. Sites like iPlayer run slowly, video keeps stopping, and CPU usage goes mad when Flash kicks in

Can any of you recommend a way around this? ClickToFlash is great for stopping adverts etc., but it does not help my problem.
Try Flash 10.1. If you're already on it, what browser and OS are you running? 10.5 plus Safari 4 or 5 will run better than 10.4 because of the Core Animation support. You might also check System Preferences > Energy Saver > Options and make sure Processor Performance is set to Highest or Automatic.
 
Rather than completely disagree with you (which I actually do, on the 'much faster' part...), let's just say that the rate of increase of CPU speeds dramatically slowed after 2004. DRAMATICALLY. I could buy a 2GHz Pentium in 2001... but I can't buy a 10GHz chip today. This link shows why you're wrong on the instructions per clock. (Look at the purple line in the graph.)

That trend would have bailed out Adobe/Flash in the past. This time, it might bury them.

Those graphs are extremely misleading. The article talks about clock speeds hitting a wall around the time it was written, yet the lines drawn in the graph use the earlier, pre-wall trends to draw a flat line right through the current period. Worse yet, there is only ONE (yes, just ONE!) data point from 2004 (when the article was written) until 2009 on that purple clock-efficiency line, and if you look at ONLY the period where clock-speed growth ends and efficiency gains begin, there is a dramatic CLIMB. The graph ignores that completely and draws a FLAT line through it by using all of the previous data points.

It further achieves this by actually showing "less efficiency" from about 1996 onward (the same period when HUGE increases were occurring in clock speed, so efficiency per cycle was not really needed or emphasized). Because the data is split into two separate lines (instead of ONE overall performance line), you get two similar-looking "flat" pictures, when in fact the efficiency data has been climbing steadily since 2004; with only one data point in that span and the line averaged across the entire graph back into the 1980s, it comes out flat anyway! There is also no indication of how cycle efficiency correlates to raw speed, so you have NO clue from the graph how much faster today's (or rather 2009's) CPUs are compared to 2004's.

The graph makes it look like current CPUs are no faster than the ones from 2004. LOL. My PowerMac is 1/4 the speed of my Core 2 Duo per core (at 1.8GHz versus 2.4GHz), and thus a 2009 CPU is roughly 4x faster than a 2004 G4 (which was less efficient than a G5, but more efficient than the Pentium 4s of the day). The current i-series of Intel CPUs is probably closer to 6x faster per core. So in six years, reality shows a 6x increase in per-core CPU power in consumer processors, while the graph makes it look like they are FLAT and the same.


Global warming graphs do the same thing in reverse: they draw straight lines through specific windows of time that ignore both local and long-term trends, showing only a period from roughly 1980 until 2000 while leaving out the big cooling period of the 1960s and 1970s (when many scientists claimed we were heading for the next ice age) and the ice-core data suggesting the Middle Ages were MUCH warmer than today with no human industrialization present (i.e. the Earth has its own peaks and dips that are not necessarily correlated to mankind's activities).

Temperatures have also fallen since the turn of the century, but that data gets averaged out by fitting a curve across a 100-year period, which minimizes the recent drops at the end of the graph and "averages" right through the 20-year cooling trend by drawing a straight line up into the mid-1980s spike (during the largest sunspot activity in recorded history, lasting until around 2000), thus hiding the drops that went below 1880s pre-industrialization temperatures during the period when sunspot activity fell. Ironically, if you compare sunspot graphs with global temperatures you get almost an exact match from the late 19th century until today, while carbon output shows HUGE spikes in the same periods (the '60s and '70s) when temperatures were dropping. Such contradictory data gets dismissed as temporary anomalies, while data from the Middle Ages showing cyclic temperature swings without industrialization is ignored (there's a reason we had ice ages and warming periods without technology, and it all gets blamed on man anyway).

Here, we have the same thing at work in reverse. Performance is compartmentalized away from overall performance (which would show the truth) into sub-categories of how performance is achieved, and then long-term trends are used to draw flat lines where the data actually shows significant increases, leaning on the past trend (clock-speed increases) that is plotted separately, in order to make it look like CPUs are dead in the water, when in fact the i7 is significantly faster than even the Core 2 Duo on a per-core basis despite similar clock speeds. How can something be dead in the water (FLAT according to the line on the graph) while in reality CPUs still get notably faster each cycle?

Like I said, graphs can be completely MISLEADING. In both cases, the lay person looks at the graph and accepts the "line" he sees without investigating the actual data: where it came from, what's being represented, what's being left out, and the scale of the data relative to prior or later periods of the graph (zoom in or out on something closely enough and molehills look like mountains, and vice versa).

And none of that takes into account the multi-core parallel-computing trend that is also taking place (Grand Central Dispatch is one example of optimizing toward those ends; see the sketch below). Most of today's "supercomputers" achieve their power through parallel computing, not some insanely high CPU clock. Programs simply need to make use of the type of CPU power that is available. There is also the possibility of paradigm shifts in technology (e.g. 8-state devices instead of binary transistors being used to create multi-dimensional CPUs, which is how the human brain's neurons work) and things like holographic memory storage (again, how the brain encodes). (http://www.popsci.com/technology/ar...aim-first-transistor-mimics-brain-connections)
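To make that concrete, here's a minimal sketch (Swift used purely for illustration; nothing here is from Adobe or Apple) of the data-parallel style Grand Central Dispatch encourages: split independent work across the cores you have instead of waiting for one faster core.

[CODE]
import Dispatch
import Foundation

// Minimal GCD sketch: sum 1,000,000 values in 8 independent chunks.
let chunkCount = 8
let chunkSize = 125_000
let lock = NSLock()
var total = 0.0

DispatchQueue.concurrentPerform(iterations: chunkCount) { chunk in
    let range = (chunk * chunkSize)..<((chunk + 1) * chunkSize)
    let partial = range.reduce(0.0) { $0 + sin(Double($1)) }  // independent work per chunk
    lock.lock(); total += partial; lock.unlock()              // combine results safely
}

print("total =", total)
[/CODE]

Same total CPU work, but it finishes roughly N times sooner on an N-core machine, which is exactly the kind of gain a single-thread trend line can't show.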

Overall, the point is that the drop-off in CPU speed gains is not as dramatic (or should I say melodramatic?) as you seem to indicate. One could also question how much the average person "needs" more CPU power to do things that worked fine on a C64 (e.g. word processing). Programming inefficiencies are erasing a lot of the gains that have been made; chalk it up to lazy programmers compared to years past.
 
what a joke
You really think Flash is going anywhere? What a joke. Just because S.J. says some things about Flash, people think it's going to die off... not anytime soon, my friend. 95% of the web is still using Flash... Flash has been around for years and will remain so.

Steve Jobs fanboys agree...
I so agree with you; it's such a joke to see these people on the Steve Jobs bandwagon... I work with Flash every day, and it's not going anywhere anytime soon. There is a reason Flash has been around as long as it has... P.S. Steve, Flash can do touch gestures. I love Macs and I like Flash, there I said it... People, make up your own minds and look at the facts: Flash is alive and well : )

dum dum
Flash is not going anywhere! Stop kissing Steve's ass. If you want to run your computer like a fart-box Tandy 2000 HX from 1980, that's your choice; get all the silly Flash blockers you want on your Mac. 95% of the world's web is still using Flash.

Yawn...
another disgruntled Flash developer (and/or amateur troll) living in denial. Sorry, but that dinosaur relic of the desktop age is dying fast, now that mobile platforms are crawling the web in increasing numbers. [95%? Gimme a break. Small hand-held mobile devices (totally reliant on battery power) are hardly using Flash at all.]

And again: what's with all the Steve Jobs infatuation? Have you never heard of the Free Software Foundation? Do you actually believe companies (or organizations) like Opera (or Mozilla) are enamored of Flash? Nope, quite the opposite. They only "support" it simply because they have little (or no) choice. Apple is in a unique position to take names and kick tail, and that's the difference that matters. [Think different.] All your anti-Jobs rhetoric and fanboy taunts accomplish absolutely nothing... other than exhibit your extreme naïveté.

Oh, and let's not forget Linux users.
Plenty of folks have no stomach for Flash.
Even on my new MBP... ClickToFlash FTW.
 

Yawn...
another disgruntled Flash developer (and/or amateur troll) living in denial. Sorry, but that dinosaur relic of the desktop age is dying fast, now that mobile platforms are crawling the web in increasing numbers. [95%? Gimme a break. Small hand-held mobile devices (totally reliant on battery power) are hardly using Flash at all.]

And again: what's with all the Steve Jobs infatuation? Have you never heard of the Free Software Foundation? Do you actually believe companies (or organizations) like Opera (or Mozilla) are enamored of Flash? Nope, quite the opposite. They only "support" it simply because they have little (or no) choice. Apple is in a unique position to take names and kick tail, and that's the difference that matters. [Think different.] All your anti-Jobs rhetoric and fanboy taunts accomplish absolutely nothing... other than exhibit your extreme naïveté.

Oh, and let's not forget Linux users.
Plenty of folks have no stomach for Flash.
Even on my new MBP... ClickToFlash FTW.

The official Flash 10.1 was just released a few days ago. It includes several new APIs, like multitouch.

Since you're so interested in open source and the Mozilla Foundation, you may be interested in Adobe's open-source projects.

Finally, why do all you Flash deniers insist on using ClickToFlash? Do you not know how to uninstall a plug-in? Honestly.
 
The official Flash 10.1 was just released a few days ago. It includes several new APIs, like multitouch.

Since you're so interested in open source and the Mozilla Foundation, you may be interested in Adobe's open-source projects.

Finally, why do all you Flash deniers insist on using ClickToFlash? Do you not know how to uninstall a plug-in? Honestly.

Because for all their griping, the fact that they don't uninstall it means they actually *do* want to use it at least occasionally.

And yes, I do wish Flash could play Hulu full screen on my netbook as well as Netflix does via Silverlight. But I'd rather stay small-screen than not have it at all, like on the iPad (until the much-anticipated app finally arrives).
 
So what's holding other GPUs back? Is it a matter of drivers (programmed and/or crippled by Apple)?

Ignoring Apple's policy of dropping support quickly, it could be a driver thing, like you said. (ATI vs NVIDIA). I think I saw some posts that said that Apple's new API is very similar to NVIDIA's API. You can speculate that this means the ATI cards/drivers will need porting to the different API model. New ATI products might see support, old ones might too. But Apple typically doesn't help out old products at all, even if it costs them nothing.

The good news is that hardware accel is mostly irrelevant on the desktop, as another poster pointed out. I would be concerned about whether I had OpenCL support though.

MagnusVonMagnum said:
And none of that takes into account the multi-core parallel-computing trend that is also taking place (Grand Central Dispatch is one example of optimizing toward those ends). ...
It's late, so my apologies for the rough edges on the rest of this post...

I guess you've never heard of Herb Sutter, or of the column/article that graph comes from.

In any case, the graph's Y-axis is logarithmic, which may account for some of your complaints about it. The axis has to be logarithmic out of necessity; we were on a rocket ship for the last 30 years, and if that were graphed linearly, the graph would be unreadable unless it were on the side of a 30-story building. That also means the recent increases, which may seem meaningful to you, are nothing compared to what we've seen in the past. The purple line, ILP (Instruction Level Parallelism), hasn't gone anywhere.

You will find that a huge component of "increased" performance in the past five years comes from the one part of the graph that hasn't stopped: transistors. That shows up in two places: multi-core (and more of them) and cache. The 2004 Power Mac G5 had 512K of L2 cache. Most current Intel cores have 6MB of cache on chip available to a single core (12MB total in the package). That is where your speed improvement is coming from. If you disagree, you can tell Herb where he is wrong at HerbSutter.com and I'm sure he will fix the graph.
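Back-of-the-envelope illustration of the log-axis point (a quick sketch in Swift; the numbers are rough and only meant to show the shape, not measurements):

[CODE]
import Foundation

// On an axis that has to span a ~1000x historical climb in single-core
// throughput, even a generous 4x-6x per-core gain since 2004 is a small step.
let historicalClimb = 1000.0   // rough multiple for ~1980-2004 (illustrative only)
let recentGain = 6.0           // the most generous per-core estimate argued above
print("decades of axis used by the old climb:   ", log10(historicalClimb))  // ~3.0
print("decades of axis used by the recent gain: ", log10(recentGain))       // ~0.78
[/CODE]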

In the specific case of Intel, the transition to the AMD64 instruction set has also provided a way to better performance on identical hardware. In addition, Intel has dropped its old front-side-bus architecture and adopted something similar to AMD's HyperTransport. Neither of these has made the actual CPU faster; they are simply corrections of long-standing problems forced upon Intel by the brick wall it hit. Just like Flash 10.1 isn't an example of Adobe's superior programming... it is a forced correction of their cruddy programming.
 
Ignoring Apple's policy of dropping support quickly, it could be a driver thing, like you said. (ATI vs NVIDIA). I think I saw some posts that said that Apple's new API is very similar to NVIDIA's API. You can speculate that this means the ATI cards/drivers will need porting to the different API model. New ATI products might see support, old ones might too. But Apple typically doesn't help out old products at all, even if it costs them nothing.

I've given the GPU-accelerated decoding a go on my parents' PC with the latest drivers, and the total reduction in CPU utilisation is something like from 20% down to 15%, so there is no "massive" drop as some will try to make out. Right now on my Mac I can view a YouTube video with CPU utilisation fluctuating between 18-28%, so 10.1 is still a marked improvement over 10.0. The question is whether we'll see further work from Adobe or whether they'll rest on their laurels.

I understand that crappy Flash programmers do cause problems and it isn't all Adobe's fault, but I would like to see Adobe do something about those who abuse Flash - more restrictions on how certain features can be used, banning its use for advertisement banners, etc. Flash is good, but it isn't good when abused.
 
Speaking of Flash, has Apple mentioned when it will publicly release the new Gianduia development framework, which (I believe) is completely standards-based and has been around for some time? If Apple can make Gianduia work under Chrome 5.0 and soon Internet Explorer 9.0 with HTML5 support, that could potentially be a huge winner right there! :)
 
Because anyone that isn't just writing some kind of Quicktime front-end doesn't care about most of QTKit and the portion that does give you hardware accelerated decoding (which hasn't been around since 2005, no matter what matticus says).
Yes, it has. The Quicktime APIs for hardware acceleration were part of the "Core" technologies released with 10.4. CoreVideo and Quartz acceleration were essential to that massive rewrite.

All software written for OS X eventually passes through Quartz and Quicktime, because they are integral to the rendering system on OS X, which you might understand if you were familiar with the architecture-level issues.

Just as all software goes through Media Foundation on Windows (and is passed down to legacy-mode systems as necessary), everything on OS X goes through Quicktime at some point.

Quicktime player and the system Quicktime APIs are very different animals, and you are talking about "high level Quicktime" functions with regard to the player frameworks, evidently ignorant of the fact that Quicktime is part of the graphics system on OS X at the lowest levels.
It's just too high level. Things like Plex and Flash need access to the decoded frame data to do some post-processing.
This is where it gets really silly for you. Quicktime is not a high-level API. It's a peer of OpenGL and is part of CoreGraphics, the Mac equivalent of Media Foundation.

Thus most everything passes through Quicktime on its way through the rendering system. The difference is how and where it chooses to communicate, which defines the level of hardware access (just as on Windows).

Plex and Flash do not need to use stream decoders other than the system-provided ones (and indeed Plex is being rewritten to use more of the system calls, not fewer--I was under the mistaken impression that this project had been completed, but it is still underway). They choose not to use the system APIs directly (for valid reasons), but to pretend it is impossible is a total fabrication.
Can the QTMovie class (and QTKit in general) deal directly with compressed video data in its raw, uncontainerized form? Or does it depend upon that data being passed into it encapsulated within a .mov/.mp4 container file?
The former is true. QTKit will load any video format playable in Quicktime, which included direct FLV support until 2007 (and I believe again in Quicktime X).
If the former is true, then I suppose Adobe may have no excuse for not obtaining its video data in whatever container seemed most appropriate, and passing it into QTKit for decompression/rendering.
They do have an excuse: using QTKit would still require that they use their own decoder or ship an appropriate plug-in pack for the more exotic types of Flash video, and they'd have to modify their overlay code. Neither is a tremendous roadblock, but they're valid reasons not to adopt it.
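For anyone following along at home, here's roughly what the "hand the container to the system, get decoded frames back" model looks like. This is only a sketch, and it uses AVFoundation (the modern successor to QTKit; AVAssetReader did not exist in 2010), so treat it as an illustration of the idea rather than of anything Adobe could have called back then. The "movie.mov" path is just a placeholder.

[CODE]
import AVFoundation

// Open a container file and let the system decoder hand back raw frames.
let asset = AVURLAsset(url: URL(fileURLWithPath: "movie.mov"))  // placeholder path
guard let track = asset.tracks(withMediaType: .video).first,
      let reader = try? AVAssetReader(asset: asset) else { exit(1) }

// Ask for decoded BGRA pixel buffers rather than compressed samples.
let output = AVAssetReaderTrackOutput(
    track: track,
    outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
reader.add(output)
guard reader.startReading() else { exit(1) }

// Each sample buffer wraps a decoded frame that a player like Plex (or Flash)
// could post-process before drawing it.
while let sample = output.copyNextSampleBuffer() {
    if let frame = CMSampleBufferGetImageBuffer(sample) {
        print("decoded frame:", CVPixelBufferGetWidth(frame), "x", CVPixelBufferGetHeight(frame))
    }
}
[/CODE]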
It's also very new. It deprecates the older Quicktime API. Hardware acceleration for video decoding was added late in the game, and Adobe didn't even have time to implement it, even if they could have used such a high-level API in the first place. Matticus is just wrong here; he's just too proud to admit it.
You keep saying this, but you're absolutely wrong. It's not new at all; it's been part of OS X since the introduction of the Core technologies in Tiger. You need look no further than your own link:

http://developer.apple.com/mac/libr...tual/CoreVideo/CVProg_Intro/CVProg_Intro.html

Introduced in 10.4. The fact that you can't seem to separate hardware acceleration for video functions from the H.264 specialized acceleration speaks for itself.
Apple is responsible for the lack of hardware decoding this round. They dragged their feet compared to Microsoft. Adobe did ship a beta with the feature, though, barely a month or so after Apple gave them the API. That's far from what I call "lazy". :rolleyes:
This is exactly what I'm talking about. This pretense that the H.264 codec has anything to do with anything must stop.

Nobody was calling Adobe lazy for not using the H.264 API--it is new in version 10.1 for Windows, too. It didn't exist at all, anywhere, before this version, released this year.

The issue stems from their failure to update Flash player in the preceding years, allowing Mac performance to stagnate relative to Windows, and the fact that Adobe declined to update the codebase for Flash until version 10.1, which allows them to see significant performance increases because they now have access to OS X system APIs.
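Side note for anyone who wants to see the distinction in code: the check below uses today's VideoToolbox, not the 2010-era Video Decode Acceleration API that Apple actually handed Adobe, so it's purely illustrative of the gating question this whole H.264 tangent hangs on: can the machine decode H.264 in hardware at all?

[CODE]
import VideoToolbox

// Illustrative only: ask whether the GPU/driver on this machine can decode
// H.264 in hardware (modern API; not what existed when Flash 10.1 shipped).
if VTIsHardwareDecodeSupported(kCMVideoCodecType_H264) {
    print("H.264 hardware decode available")
} else {
    print("No H.264 hardware decode; playback falls back to the CPU")
}
[/CODE]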
 
Rather than completely disagree with you (which I actually do, on the 'much faster' part...), let's just say that the rate of increase of CPU speeds dramatically slowed after 2004. DRAMATICALLY. I could buy a 2GHz Pentium in 2001... but I can't buy a 10GHz chip today. This link shows why you're wrong on the instructions per clock. (Look at the purple line in the graph.)

Again with the GHz. Seriously. Clock. Speed. Doesn't. Matter.

And your graph is grossly outdated and misleading. Where's the Core Duo? The Core Solo? There is no data point past the Pentium 4, which was actually slower clock-for-clock than the Pentium 3 architecture, because Intel did exactly what you say is wrong today: they optimized for clock frequency instead of instructions per clock.

The new Core Solo/Duo architecture went back to the Pentium 3 design, dropping the Pentium 4 stuff. There's a reason for that. Intel realised:

Clock. Speed. Doesn't. Matter.
 
Yes, it has. The Quicktime APIs for hardware acceleration were part of the "Core" technologies released with 10.4. CoreVideo and Quartz acceleration were essential to that massive rewrite.

All software written for OS X eventually passes through Quartz and Quicktime, because they are integral to the rendering system on OS X, which you might understand if you were familiar with the architecture-level issues.

Ok, I stopped here. It puzzles me how, after just two paragraphs of your long rant, you managed to go completely off topic. Seriously, Quartz? Do you even know what you're talking about now? Hardware-accelerated compositing for the GUI of the OS is not the same thing as hardware-accelerated video decoding.

You do understand that many things can be hardware accelerated without being related to each other, right? Gah.

Please, at least get your arguments straight. People are banging into your head about H.264 hardware decoding, and you're saying they're wrong because OS X has used the GPU to accelerate compositing on the desktop... Not the same thing at all.

Have you been arguing about OS X using hardware-accelerated compositing all this time without understanding the actual conversation taking place? Wow.

And now QTKit is a display technology? QTMovie is a 3D vertex-array type of object, according to you? QTKit is a high-level API for movie playback. It includes an Interface Builder-usable object, QTMovieView, and underlying APIs for video playback/capture. Do you even know what it is you're talking about here? Have you ever launched Xcode to do something other than compile a template? It doesn't matter that the resulting video is composited through hardware to get displayed on screen; that isn't what is being discussed.
 
Ok, I stopped here. It puzzles me how, after just two paragraphs of your long rant, you managed to go completely off topic. Seriously, Quartz? Do you even know what you're talking about now?
It's not off topic at all. You're treating Quicktime like it exists as some application floating off in space. It's not. It's a peer of Quartz and the Core technologies, all of which are packaged together in CoreGraphics, which is just about as low-level as you can get when developing applications for OS X.
Please, at least get your arguments straight. People are banging into your head about H.264 hardware decoding, and you're saying they're wrong because OS X has used the GPU to accelerate compositing on the desktop... Not the same thing at all.
That's not even remotely close to what I'm saying.

OS X has used the GPU to accelerate video functions since 10.4. That's what CoreVideo is. You're acting like Flash performance has sucked for years because of H.264, just because the specialized H.264 acceleration is relatively new. Again, H.264 acceleration is brand-new in 10.1 for Windows too, so it can't be the reason.

For the 10,000th time, the H.264 API has nothing to do with Flash performance relative to Windows through version 10.0. Flash 10.1's improvements for the Mac are based on the fact that they finally rewrote Flash to use modern, OS X native APIs.
And now QTKit is a display technology? QTMovie is a 3D vertex-array type of object, according to you?
No. What are you talking about?
Have you been arguing about OS X using hardware-accelerated compositing all this time without understanding the actual conversation taking place? Wow.
No, but it's fairly clear that you have no idea what's going on.
 
No, but it's fairly clear that you have no idea what's going on.

Where's facepalm guy...

The discussion centers around the fact that Adobe did not ship hardware-accelerated video decoding in Flash 10.1 on OS X because Apple didn't provide the proper API early enough. You said that's wrong and that they should have used QTKit, which has been available for a long time. We tell you QTKit is not a proper API for this, that it's too high level, and now you're discussing hardware compositing through Quartz Extreme and hardware decoding of "other codecs", of which you fail to mention any (Sorenson video?).

I think it's fairly obvious who has no idea what is going on and is just trying to get the last word in.
 
The discussion centers around the fact that Adobe did not ship hardware-accelerated video decoding in Flash 10.1 on OS X because Apple didn't provide the proper API early enough.
No, the discussion is about the release of Flash 10.1. Only one aspect of that is the fact that beta H.264 acceleration did not make it into the final release. I wasn't expecting it to--the Windows API has been out for two years and Adobe just now did it; it's only fair to give them the same timeframe for the Mac version.

The far bigger news is that 10.1 is a Cocoa application, and has gained significant performance improvements as a result of being able to leverage CoreVideo and CoreAnimation and a laundry list of Cocoa-only features and APIs.
You said that's wrong and that they should have used QTKit, which has been available for a long time.
No, I said that Flash should have been rewritten years ago so this performance gap would not have been so bad. As far as H.264 is concerned, Adobe first had to rewrite Flash in Cocoa before it could use any API, something they just completed for this version.

Nobody called Adobe lazy for not implementing H.264 hardware acceleration. They were called lazy for the years of stagnation of the Mac platform software and the failure to use CoreVideo, CoreAnimation, and all the other Cocoa-only technologies they could have been using years ago. When Windows massively rewrote its graphics layer, Flash was updated in less than a year.
and now you're discussing hardware compositing through Quartz Extreme and hardware decoding of "other codecs"
No, I'm not.

I'm talking about hardware accelerated functions through CoreVideo and CoreAnimation, which are part of CoreGraphics, and frameworks and APIs that Adobe can now use thanks to the Cocoa rewrite. I don't know what you're on about.
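To make "CoreVideo facility" concrete, here is the kind of thing I mean: a CVDisplayLink sketch in Swift, purely illustrative and not a claim about what Adobe calls internally, giving a display-synchronized callback a plug-in can use to pace frame delivery instead of polling a timer.

[CODE]
import CoreVideo
import Foundation

// Create a display link that fires once per screen refresh.
var link: CVDisplayLink?
CVDisplayLinkCreateWithActiveCGDisplays(&link)

if let link = link {
    CVDisplayLinkSetOutputCallback(link, { _, _, _, _, _, _ in
        // One callback per refresh; a real player would push a frame here.
        return kCVReturnSuccess
    }, nil)
    CVDisplayLinkStart(link)
    RunLoop.main.run(until: Date(timeIntervalSinceNow: 1))  // let a few refreshes fire
    CVDisplayLinkStop(link)
}
[/CODE]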
 
I'm talking about hardware accelerated functions through CoreVideo and CoreAnimation, which are part of CoreGraphics, and frameworks and APIs that Adobe can now use thanks to the Cocoa rewrite. I don't know what you're on about.

Everyone you've been replying to has been discussing the hardware-accelerated H.264 bit when they talk about hardware acceleration and APIs. You've been talking about hardware-accelerated display of the pixels on the screen all this time? No wonder you've been wrong.

Flash already has hardware acceleration for the on-screen display of graphics. Maybe next you'd like to tell us Valve is lazy because Portal's frame rate isn't up to par on OS X versus Windows on the same machine under Boot Camp, since it sure isn't Apple's fault, or nVidia's fault, or the fault of anyone else who isn't providing proper optimization?

The reality is Adobe got Flash in 2005 from Macromedia. The Mac is barely 5% of their installed base.
 
Everyone you've been replying to has been discussing the hardware-accelerated H.264 bit when they talk about hardware acceleration and APIs.
Yes, and?

You've conveniently ignored the fact that I have never faulted Adobe for not implementing H.264 acceleration, nor have I ever faulted them for choosing not to use system APIs. The only problem is claiming they don't exist.
You've been talking about hardware-accelerated display of the pixels on the screen all this time? No wonder you've been wrong.
NO. What part of CoreVideo, CoreAnimation, and the other Core technologies are you not getting? You're talking about compositing completely out of the blue.

Quartz != Quartz Compositor. Not once did I mention Quartz Compositor. I really don't know what you're on about. You've been told this several times, but you keep looping back to it in a bizarre Sarah Palin-esque fashion.
Flash already has hardware acceleration for the on-screen display of graphics.
Yes, as of 10.1, when they stopped using QuickDraw (except, evidently, for Opera).
 