I program websites in ActionScript myself. Yes, garbage collection is important, but if you remember to remove unused event listeners and children, the site won't use over 10% of the CPU even when the entire site is loaded, the images are cached, and no animations are running...

Well, ActionScript does a fair amount of hand-holding on the garbage collection side of things. Not a bad thing per se; C# does the same. But many languages don't, and stopping memory leaks in them is a tedious PITA.

Also, some programs are just too large to be absolutely sure you know exactly what is going on at all times. I don't think Flash Player should fall into this category, but I don't really know.

Still, my point is that a piece of software that is doing absolutely nothing constructive can still be eating your CPU cycles. And not only is it possible, it's something that is (or should be) on developers' minds. It should certainly be on the minds of people who create a program that chokes while standing on top of a Core 2 Duo :rolleyes:
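The listener-removal discipline described above can be sketched in a few lines. This is a hypothetical JavaScript analog (the stub event target stands in for a DOM node or an ActionScript DisplayObject, and is not anyone's real code): as long as a handler stays registered, the runtime holds a reference to it and everything it closes over, so tearing listeners down on cleanup is what lets the garbage collector do its job.

```javascript
// Minimal stand-in event target so the sketch runs anywhere. In a browser or
// in Flash, a real DOM node / DisplayObject plays this role; this stub is
// purely illustrative.
function makeTarget() {
  var listeners = [];
  return {
    addEventListener: function (type, fn) { listeners.push({ type: type, fn: fn }); },
    removeEventListener: function (type, fn) {
      listeners = listeners.filter(function (l) { return l.fn !== fn; });
    },
    dispatch: function (type) {
      listeners.slice().forEach(function (l) { if (l.type === type) l.fn(); });
    },
    listenerCount: function () { return listeners.length; }
  };
}

// The pattern the post describes: keep a reference to the handler so it can
// be removed when the view goes away, letting the garbage collector reclaim
// the handler and everything it closes over.
var clicks = 0;
var target = makeTarget();
function onClick() { clicks += 1; }

target.addEventListener('click', onClick);
target.dispatch('click');                      // handler runs while the view is live
target.removeEventListener('click', onClick);  // tear-down: no dangling reference
target.dispatch('click');                      // nothing fires after removal
```

The key detail (true in both the DOM and ActionScript) is that removal requires the same function reference that was added, which is why anonymous inline handlers are a common source of leaks.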
 
Further, with all the "100% CPU" whining, I fired up Activity Monitor in my (and your) Utilities folder and couldn't ever get Flash to hit 100% CPU usage (even on that 1GHz G4 with a 7+ year old graphics card). It did hit 65%-85%. But then I went to Apple's website, to the QuickTime trailers, and chose the Sea Rex trailer at 480p to get close to the same playback width & height (though not quite as big a playback screen). I watched the CPU usage in Activity Monitor again and it never fell below 85%, often showing 90%-100% (very often 99%).

It still shouldn't hit that mark. Not by a mile.
Not even 50% usage.
Come on.. a little Flash App?
Sloppy archaic coding. Flash needs.. NEEDS a rewrite.
 
Or maybe everyone else is talking about a version of Flash that is in use - you know, released by Adobe rather than a beta.

If Adobe fixes the problems, that will be great. But it's foolish to pretend that the problems never existed.
I already said the new Mac version "won't be as crummy as the last one". Does referring to the previous version as "crummy" suggest to you that I'm in denial about its issues?

just because you're claiming to be using some beta
You make it sound like I have some secret vaporware on my machine. Why not download the public release candidate of 10.1 like I just did this morning?

that may or may not ever be released.
Sure. 10.1 (now up to release candidate 2) will be pulled and never spoken of again, much to the surprise of the cellphone OS manufacturers who have been working with Adobe on this one to get Flash content onto their devices by this summer.

Or not. The release of 10.1 is imminent and it's likely to be launched along with CS5 in mid-May. If you prefer to dwell on the past, that's your call. As for myself, I'm thinking that if time is a distance between point A and point B, where point A is 10.0 and point B is 10.1, and we've traversed something like 99.98% of the distance with .02% left, we're a lot closer to arrival than departure. Take the current iPhone. I think it's ugly, cheap looking and overpriced. But I do like the look (and features) of the next iPhone, which is 8 weeks away from launch. So, my options are to 1) dismiss the iPhone forever because I don't like the current and soon-to-be-retired version, and go on and on about it, or 2) realize that 8 weeks will fly by quickly, as they always do, and consider the current iPhone history and the new one present and future.

Option 2 sounds like the more rational and sane plan, unless of course whining for the heck of it is really important to me, in which case I'll simply cling to the past for dear life, do the tell-it-to-the-hand whenever someone reminds me there's a new iPhone coming any day, and remain living in my own comfy reality distortion field where the iPhone 3GS is the only version that will ever exist.
 
I'm not arguing that Flash doesn't need a rewrite. It does. I am pointing out falsehoods posted as facts that don't seem to stand up in real testing, apparently driven by a desire to blindly agree with Apple no matter what Apple says.

And even still, as a coder capable of either (and one who works in both), that Mb/minute thing really matters if we care about "buffering delays", and Flash's widespread playback on 97% of the world's computers really matters compared to the tiny fraction of computers capable of that kind of playback via HTML5 + H.264 + JavaScript solutions.

In a few years, things may be different. But it's a tough pill to swallow NOW, mostly because Apple is choosing NOT to allow the option of Flash on select devices capable of running it. I still sit in the camp that Apple should allow it to run as an option, even popping up an intermediate warning screen like: "You are about to run a Flash-based application, which is likely to use battery power at a faster pace than you may be accustomed to. Proceed anyway?" That way Apple could still get its view across, but without limiting select users of Apple devices to an "ultimate" web experience absent something as widespread on the web as Flash.

Nobody loses much with the option. But every single iPhone/iPad/Touch owner loses by not being able to access ALL of the Internet should they want to, on devices well capable of doing so... if Apple only allowed it.
 
Still makes sense for Adobe to support it, as there are millions of Macs out there. I would certainly hope that they will fix the current implementation of Flash, as it is ridiculous what it does on occasion to my CPU utilization and fan speed!
 
I don't know what kind of computers you people are running, but none of my current or past 5 Macs' fans have ever sped up while visiting a full-screen Flash RIA, and certainly not while watching YouTube.
 
I ran that same larger-screen Flash presentation on both an old and a new Apple laptop, including a PowerBook G4 1.5GHz for testing purposes. It does seem to heat up (as it does when I run H.264 too), but the fans don't roar and Safari doesn't crash.

I know the Apple Kool-Aid tastes good. But do the tests yourselves and see the difference. I AM testing on both old and new Apple hardware, and I can't replicate the more extreme "Flash is bad" comments posted by the Apple faithful. Yes, Flash does demand more (but not 100%) of the CPU, but so did QuickTime 7 in Leopard and earlier when playing H.264 video.

Yes, Adobe has plenty of room to make Flash better. But that could be said about just about every piece of software in the world.

I don't join in the Flash-bashing just because The Steve has decided to take a stance against Flash. Limiting your customers to less of an Internet experience for no great customer-centered reason, other than because you want to, is not a stellar Apple decision... unless you take that stance for OTHER reasons, such as those that make stockholders happy even at the expense of customer benefit.
 
I don't know what kind of computers you people are running, but none of my current or past 5 Macs' fans have ever sped up while visiting a full-screen Flash RIA, and certainly not while watching YouTube.

Let's see - virtually EVERYONE who uses Flash on a Mac says it's a CPU hog and drives their fans crazy. A large percentage of people say the same thing about the Windows version.

Heck, even Adobe admits that Flash is a pig. They claim that 10.1 will reduce CPU usage by 86%. How do you do that if it wasn't using too much CPU in the first place?

And ONE person claims that he doesn't have the problem - but there's no way of identifying him or the sites he goes to.

Who to believe?
 
Let's see - virtually EVERYONE who uses Flash on a Mac says it's a CPU hog and drives their fans crazy. A large percentage of people say the same thing about the Windows version.

Heck, even Adobe admits that Flash is a pig. They claim that 10.1 will reduce CPU usage by 86%. How do you do that if it wasn't using too much CPU in the first place?

And ONE person claims that he doesn't have the problem - but there's no way of identifying him or the sites he goes to.

Who to believe?

Is this the basis for YOUR logic?

So if we go into a Satan-worshipping forum where everyone bashes God, but a few guys say they have a great relationship with God, clearly we should believe Satan > God???

If we go into a gay men forum with a lot of woman bashing, but a few guys say they like women, clearly we should believe that loving men is better than loving women???

If we go into a republican forum with lots of dem bashing...

If we go into a <race> forum with a lot of <other race> bashing...

If we go into a Microsoft forum with a lot of Apple bashing...

Etc.

The complainers are always "louder" than those reasonably satisfied with anything. If Adobe crashed and Apple bought them, rebranded Adobe Flash as Apple Flash and Steve rolled it out as "the best thing ever", I wonder how many in this forum would jump on the "Flash is great" bandwagon.
 
In the past 6 months I have had:

2.2GHz MacBook ('09)
2.2GHz MacBook Pro ('09)
2.2GHz Mac mini ('09)

and now a 2.0GHz MacBook, the 2008 aluminum model.

Every single one had the fans crank up within 30 seconds of starting a Flash app. Well, I take that back... little apps, no, but a medium to large app or game would be the breaking point. They'd stutter and just wouldn't run great.

But on my wife's Compaq 610, Flash runs smoothly.

Just my experience.
 
Is this the basis for YOUR logic?

So if we go into a Satan-worshipping forum where everyone bashes God, but a few guys say they have a great relationship with God, clearly we should believe Satan > God???

I certainly wouldn't believe the few guys who say they have a great relationship with God.

If we go into a gay men forum with a lot of woman bashing, but a few guys say they like women, clearly we should believe that loving men is better than loving women???

In your world gay men bash women?
 
OK, so "your" experience is not necessarily representative of "everyone's" experience. Darkroom says he doesn't experience that on his 5 Macs. I don't notice fan changes like that even on an ancient PowerBook G4 at just 1.5GHz.

But then again, maybe you are choosing a particularly demanding game or Flash app, while we are commenting on general Flash usage across a variety of Macs. I agree with you that Flash generally seems to run faster on Windows computers, but it's not as bad on Macs as some would have us believe.

My Safari rarely crashes (even on the old Macs that I own), and I do a good deal of work- and thus spend a good deal of time testing- Flash content through Safari.

My fans don't seem to crank up, and as I posted in a real test just a page or two back, I can't even get a fairly demanding Flash creation I made to push my CPUs to 100% as so often thrown out in these forums.

Yes, Adobe could make Flash more efficient. But as I also shared in that same post, testing QuickTime 7 in Leopard on those older machines playing an H.264 video from Apple's own site generally registered at 99% CPU use. So by that (admittedly unfair) logic, QuickTime 7 is worse than Flash. Shouldn't we all be bashing it even harder as a CPU hog?

No wait, Steve currently likes QuickTime, so it must be THE way.
 
I certainly wouldn't believe the few guys who say they have a great relationship with God.

Then his logic apparently works for you as well?

So, if we hop into a Microsoft fan forum, where we can read anti-Apple comments from the far larger share of the world's computer users for whom Microsoft is king, then we can conclude that Apple is bad, simply because (apparently?) a biased crowd can fairly judge any issue.
 
Then his logic apparently works for you as well?

So, if we hop into a Microsoft fan forum, where we can read anti-Apple comments from the far larger share of the world's computer users for whom Microsoft is king, then we can conclude that Apple is bad, simply because (apparently?) a biased crowd can fairly judge any issue.

His logic doesn't work, but your choice of illustrations is poor. In your particular example, both sides of that debate are nuts. You might as well have said "if everyone in a tooth fairy forum says the tooth fairy is better at checkers than the Easter Bunny, should we believe them?"
 
Fine. Then is Flash bad because Steve decided to proclaim it as such? Or is Flash genuinely bad?

I notice that every thread referencing Flash since Steve made his proclamation brings out the bashing, and it seems popular with this crowd to chime in with individual experience as if it were everyone's experience, etc.

Funny though: go back to the time before the proclamation and there were scant Flash-bashing comments along these lines. When Apple treated Adobe like a friend, the crowd here seemed to passionately bash others- usually Microsoft- as the enemy.

Up until Grand Central and the latest hardware, H.264 playback in QuickTime was as demanding on CPUs (if not more so) than Flash playback, as I just demonstrated a few posts back. But there are not tons of past threads bashing QuickTime for its gross inefficiencies in CPU utilization. Why is that?

We all live in the world where about 9 out of 10 computer users use Windows over Apple, which implies that Windows is much more right than Apple based on what buyers choose to buy and/or use. But such logic doesn't apply when it suggests Apple is not the best. On the other hand, anything- even rumor or personal experience- that supports the contrary is so often thrown out as fact here.

So, are both sides nuts here too then?
 
And even still, as a coder capable of either (and one who works in both), that Mb/minute thing really matters if we care about "buffering delays", and Flash's widespread playback on 97% of the world's computers really matters compared to the tiny fraction of computers capable of that kind of playback via HTML5 + H.264 + JavaScript solutions.

That's a great point; however, the nature of HTML is to degrade gracefully. I'm using HTML5 + H.264, with JavaScript replacing the video tag with a Flash player in Firefox, IE, and Opera. I only need one source file for the video content because Flash supports H.264, and my method is future-proof: once a user upgrades to IE9 when it's released, for example, they will automatically get the HTML5 version.

So it's really not a big deal... and the code to do this is light and simple. I actually helped another MacRumors member with this earlier today :)
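A minimal sketch of the single-source approach the post describes, assuming a hypothetical `player.swf` Flash video player and a `movie.mp4` file (neither is the poster's actual code): the same H.264 file is served through either an HTML5 `<video>` tag or a Flash fallback, so only one encode of the content is needed.

```javascript
// Hypothetical sketch: build either an HTML5 <video> tag or a Flash <object>
// fallback as a markup string. Because Flash plays H.264, both branches point
// at the same source file; "player.swf" is a placeholder for whatever Flash
// video player is actually deployed.
function videoMarkup(src, supportsHtml5Video) {
  if (supportsHtml5Video) {
    return '<video src="' + src + '" controls></video>';
  }
  // Fallback branch: embed a Flash player pointed at the same H.264 file.
  return '<object type="application/x-shockwave-flash" data="player.swf">' +
         '<param name="flashvars" value="file=' + src + '"></object>';
}

// In a browser, support could be sniffed with something like:
//   var supported = !!document.createElement('video').canPlayType;
console.log(videoMarkup('movie.mp4', true));
console.log(videoMarkup('movie.mp4', false));
```

The future-proofing falls out of the capability check: when a browser gains `<video>` support, the same page automatically takes the HTML5 branch with no code change.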
 
Flash on mac is genuinely bad. Hopefully this news will see it improve.

And I think Flash on the Mac is OK. It certainly offers a good way to do what it does so that the result will play on a wide variety of platforms and in a wide variety of browsers, both Mac & Windows. It is one media application development option that plays on 97% of the world's computers.

This particular news strongly suggests that Adobe could make Flash play H.264 video more efficiently, which is great. But all the rest of what Flash does is largely not impacted by this event at all. I am hopeful that Adobe is hard at work evolving Flash to be much more efficient. But I would still rather have it "as is" on Apple mobile hardware- at least as a user on/off option- than to have Apple decide for me.
 
That's a great point; however, the nature of HTML is to degrade gracefully. I'm using HTML5 + H.264, with JavaScript replacing the video tag with a Flash player in Firefox, IE, and Opera. I only need one source file for the video content because Flash supports H.264, and my method is future-proof: once a user upgrades to IE9 when it's released, for example, they will automatically get the HTML5 version.

So it's really not a big deal... and the code to do this is light and simple. I actually helped another MacRumors member with this earlier today :)

That's a great point too, but only if you see Flash as just video. What about interactive Flash functionality? That's not just one little bit of substitution code. If you want to serve Flash animations with interactivity to Flash-capable devices and HTML5 + H.264 + JavaScript to anti-Flash devices, that's the stuff that isn't easily future-proofed: it's two coding projects instead of one. IE9 upgrades won't change that coding reality.

And even depending on future upgrades doesn't work either, as there will be lots of web users still on old (pre-HTML5) browsers for a very long time. Things would work much better if Apple would bend and let its mobile product users have the OPTION of using Flash now. Then, as the web migrates to the "better" option of HTML5, etc., all of us brilliant Apple product buyers will just be well ahead of the crowd, with technology that already supports the next big thing in web media. Instead, we're denied the complete Internet now because... why exactly? I think I am smart enough to decide for myself whether I want to burn my battery reserve faster to do something I want to do. I bet there are others capable of this as well.

Before HTML5, etc is prevalent enough to be seen as a major replacement for Flash, all the current iPads, iPhones and Touches will be outdated... maybe even 2nd or 3rd generation back antiques. A few years from now, maybe things will be different. But not now. And not next year.
 
Flash on mac is genuinely bad. Hopefully this news will see it improve.
It already has improved, but this news will hopefully result in further improvement.

I'm running the 10.1 release candidate 2, released on April 8, available for download here: http://labs.adobe.com/downloads/flashplayer10.html

So far I've noticed pretty dramatically decreased CPU utilization, but the best part is that I no longer have to listen to the damn fans spinning up. I went to YouTube and tried the Toy Story 3 trailer in 1080p HD, hit the fullscreen button and played it on my MBP 17" (1920x1200). It's pretty short so I repeated it 3 times and... no fan noise. That's with the 9400M integrated GPU, I have lots of stuff open so I can't be arsed to log out and try the 9600.

This corresponds to earlier reports from people who said 10.1 allowed them to play fullscreen Hulu content without framerate issues or CPU hogging. Mac users don't get the hardware acceleration, but Adobe has done a lot of optimization aside from that.
 
That sounds weird. CUDA is already used to accelerate H.264 encoding by up to 10x, but it can't be used for decoding?

Are you saying that an Nvidia Fermi supercomputer cannot play an H.264 video without specific hardware in its GPUs just to decode H.264?

I said it is pointless for decoding. H.264 decoding involves lots and lots of little operations, all very much designed not to fit into CUDA. Note that when you decode there is _one_ motion prediction to compute, and that single operation doesn't parallelize well. And you have to do the decoding _exactly_ according to the H.264 rules. When you encode, you want to try out _one gazillion_ of different motion predictions to see which one is best, and CUDA is reasonably good at trying thousands of very similar motion predictions. A software encoder might try 20 different predictions because trying 200 would take ten times longer. With CUDA support you can try 100 times more without too much penalty. But the _one_ operation that you need to do for decoding doesn't get sped up.

I also very much doubt that a _good_ encoder gets a 10x speedup. You get the 10x speedup if you take HandBrake, change it from 20 different predictions to 2000 (making it 50 times slower), then use CUDA to make it only 5 times slower.

Google CABAC. Learn what it does. Then come back here and tell us how you would use CUDA or OpenCL to do CABAC decoding.
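For readers who don't want to Google it: CABAC is H.264's adaptive binary arithmetic coder, and the reason it resists GPU parallelism is the bit-to-bit state dependency. The toy below is NOT real CABAC, just a hedged model of its data flow: each decoded bit feeds a state that changes how the next bit is read, so bit i cannot start until bit i-1 is finished, no matter how many GPU threads are available.

```javascript
// Toy illustration of the serial dependency in adaptive entropy decoding.
// This is not real CABAC; the point is the data flow: the interpretation of
// each coded bit depends on a state updated by every previous bit, which is
// exactly the chain that blocks the thread-level parallelism CUDA needs.
function toyAdaptiveDecode(codedBits) {
  var state = 0;                             // stands in for range/offset + context model
  var out = [];
  for (var i = 0; i < codedBits.length; i++) {
    var bit = codedBits[i] ^ (state & 1);    // how this bit is read depends on state
    out.push(bit);
    state = ((state << 1) | bit) & 0xff;     // state update: the unbreakable serial chain
  }
  return out;
}
```

An encoder, by contrast, can evaluate thousands of independent candidate motion predictions in parallel and keep the best, which is why CUDA helps on the encode side but not here.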
 