Back in the Flash MX/MX 2004 days I was developing on a Windows PC and Flash ran great. For three months I worked on a dual 1GHz G4, and Flash was atrocious there.
Things picked up when I switched full-time to G5s: a dual-core 2.0GHz was fine and my own quad 2.5GHz had no issues, until Flash Player 9, which ran fine as a standalone player but dropped frames inside Safari... on a vector animation. It was painful.
Now I'm using the 10.1 beta on the G5 and it seems fine, but the best thing that happened was ClickToFlash. While Flash never brought my G5 to its knees the way Intel Macs seem to suffer, it did make the room rather hot once it grabbed 120% of the 400% CPU available.
So, if it's all Apple's fault that Flash sucks, why has it sucked for the last eight years, even at basic things like animating vectors (which is what it was designed for)? I think, if the blame is to be split, it's at least 70% Adobe's fault.