Wasn't interested? Or wasn't able? Do we really know?
I don't see why they wouldn't have been able to.
The history of PowerPC, though, is that IBM wanted it in part to get more volume on the POWER architecture (for servers), but also in part for their desktops. That latter part crumbled rather quickly: OS/2 was dying, and running Windows on PowerPC was only briefly even possible. It never caught on. Motorola wanted it in part as a 68K replacement (that worked, but there were far fewer 68K customers left by then; Amiga and Atari, for example, were dying platforms, and NeXT moved from 68K to x86 instead of to PowerPC), in part to build their own computers (that worked especially well once Apple allowed clones), and in part for embedded. The embedded stuff lived on at Freescale and eventually NXP, but hasn't seen much love since either.
But if things had happened as originally hoped, all three would've made PowerPC "PCs" throughout the 1990s and 2000s, and others would've licensed PowerPC as well. That would've provided enough volume to give Motorola and/or IBM an incentive to make an efficient fifth-generation PowerPC laptop chip. Instead, Motorola's G5 project (the 8x00) became embedded-only, and IBM's G5 project (the 970) was mostly focused on workstations.
I don't personally believe that lack of interest in a particular market segment was the sole reason for the PowerPC decline. I think they had some design/engineering issues that ultimately made the PowerPC architecture less suitable for desktop/laptop in the long run.
I'm not a chip engineer, but my understanding is that, on paper, PowerPC is far better suited to desktops and laptops than x86, which at this point is a pile of hacks layered on top of one another. Intel itself wanted to replace x86 long ago, e.g. with the i860 and later the Itanium.
I haven't found anyone who can explain to me why ARM would be inherently better suited to what Apple is doing than PowerPC. Instead, I imagine it's purely an accident of history: Apple was one of ARM's founding partners anyway, so they had an architecture license; on top of that, producing ARM chips at scale was already very common in the early 2000s, so they could easily ask Samsung to make the original iPhone SoC. Everything else is just history. Apple made its own ARM-compatible design with the A6 (the A4 and A5 were largely still Samsung-designed), and bit by bit made it more to their liking. They could've done the same with PowerPC, but it had since fallen out of favor. (And I imagine their circa-2005 relations with Freescale weren't rosy, given how unhappy they were with the later G4 revisions.)
Switching to Intel was an acknowledgement of that fact. Intel might have grown complacent in the past few years, but that's a solvable problem. Architecture design problems are much harder to solve.
Yes, but I don't think there's anyone who would say that x86 is a particularly good architecture design.
That's one possibility. There's also the possibility that someone else simply builds a better chip.
Two sides of the same coin, but yes.
Given how much money and long-term investment you need, I think that version is unlikely. They'd have to outspend Apple and have a lot of patience, and that raises the question: why? This also immediately rules out Qualcomm, because they mostly sell to low-margin customers.
Agreed. And we're in a very different era of computing. Back in the PowerPC days, there were a lot of custom architectures on the fringes, like SPARC and MIPS, whereas mainstream consumer devices were all powered by general-purpose, third-party CPUs.
Right.
That has changed. Even Microsoft is designing CPUs now. Who would have imagined that even 10 years ago?
Well, Microsoft is "designing" them about as much as Apple designed the A4. The SQ1 and SQ2 are very similar to Qualcomm's 8cx, which in turn just uses ARM Cortex cores. Apple, meanwhile, doesn't use Cortex at all, and hasn't since the A6.
Qualcomm, Nvidia, and Samsung have at times toyed with their own core designs, but have mostly gone back to stock ARM cores. Apple is largely the odd one out. Why? Because they can afford to, and it benefits them.
Code is also a lot more portable these days, making the underlying architecture less relevant.
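To illustrate with a minimal sketch (C here, using the standard GCC/Clang predefined macros __aarch64__ and __x86_64__): the only part of a typical program that even notices the architecture is a deliberate ifdef like this one; everything else just recompiles.

    /* portable.c -- the identical source builds on x86-64 and AArch64;
       nothing outside this optional ifdef knows which one it is. */
    #include <stdio.h>

    int main(void) {
    #if defined(__aarch64__)
        puts("compiled for ARM64");
    #elif defined(__x86_64__)
        puts("compiled for x86-64");
    #else
        puts("compiled for some other architecture");
    #endif
        return 0;
    }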
Yep.
Apple has always wanted to build the whole widget, and they've had an impressive run to date, but I certainly wouldn't dismiss Intel and AMD.
Neither would I. But take any marketing claim like this with a bucket of salt. Really, what they're saying is "if you make the laptop thicker and add a louder fan, we're faster." To which Apple probably responds internally: "yeah, we could do the same, but we really don't want to."