"Deep and wide" refers to penetration of the market, not decoding capabilities or branch prediction.
Lost as in market share.
Oh... 😄 No argument there. Intel dominated just about every level of the CPU market for the past couple of decades. I'd argue their powerhouse status began even earlier, with their innovations in microprocessor design, but they grew to pretty much own the general computing market.
That status doesn’t have to last forever though. We’ve seen plenty of other dominant players fall into obscurity in the blink of an eye.
The business-school case study in all of this will be that the one part of the market Intel didn't dominate, cheap low-power toys and gadgets, is where the seed was planted that grew to threaten the entire Intel empire. Intel dominated market share for general computing, not for the low-power gadgets that evolved into smartphones. I'd have to dig up the numbers again, but I'm pretty sure Apple ships more processors, at more profit, than Intel does.
That's where the tipping point is. Intel managed to keep x86 dominant because they had the money to do it: the best designers, the best process engineers, and an industry focused on optimizing for their architecture.
Apple, it turns out, has been using their massive market share, profits, and supplier relationships from mobile devices to match Intel on each of those fronts. And since they started from a cleaner CPU architecture with a more modern whole-system view, they've reached the point where they can beat Intel at their own game: they can just about run x86-targeted code through translation faster than Intel can run it natively.
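As a tangent, you can actually see that translation layer at work: Apple documents a sysctl key, sysctl.proc_translated, that tells a process whether it's running under Rosetta. A minimal C sketch (the key is Apple's documented one; the wrapper function and its name are mine):

    #include <errno.h>
    #include <stdio.h>
    #include <sys/sysctl.h>

    /* Returns 1 if this process is running translated (under Rosetta),
       0 if running natively, -1 on error. */
    static int process_is_translated(void) {
        int ret = 0;
        size_t size = sizeof(ret);
        if (sysctlbyname("sysctl.proc_translated", &ret, &size, NULL, 0) == -1) {
            /* The key doesn't exist on systems with no translation support. */
            return (errno == ENOENT) ? 0 : -1;
        }
        return ret;
    }

    int main(void) {
        printf("translated: %d\n", process_is_translated());
        return 0;
    }

Part of why it gets so close to native speed is that Rosetta 2 translates the binary ahead of time rather than interpreting it instruction by instruction.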
When the money spigot to Intel starts to close, they'll struggle more and more to keep pace with the competition on ever-shrinking resources.
You originally phrased your statement as RISC vs. CISC, saying CISC won. That seems like the wrong way to look at it: CISC as a philosophy didn't win anything; x86 dominated the market for a while. As a design philosophy, CISC is dead. (Actually, as a design philosophy, CISC never existed; the term was coined in retrospect to contrast with RISC, which was a design philosophy.) Can you point me to a single new CISC design in the past 30 years?
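To make the contrast concrete, here's a trivial C read-modify-write with, in comments, roughly what each style of ISA needs for it (hand-written, illustrative sequences, not actual compiler output):

    /* bump: the classic read-modify-write, *p += n */
    void bump(int *p, int n) {
        /* x86-64 (CISC heritage): one variable-length instruction can
           load from memory, add, and store back:
               add dword ptr [rdi], esi
           AArch64 (RISC): a fixed-width load/store architecture, so the
           same operation takes three instructions:
               ldr w8, [x0]
               add w8, w8, w1
               str w8, [x0]
        */
        *p += n;
    }

And even then, a modern x86 core cracks that single instruction into RISC-like micro-ops internally, which is a big part of why the philosophical distinction stopped mattering.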
Apple has a big enough hammer to force a square peg into a round hole. The headphone jack, the power brick, 32-bit apps: and that's just the iPhone. They've forced a number of other changes in the Mac lineup over the years. If Intel did the same thing, they would be brought up on antitrust charges.
Intel wouldn't be brought up on antitrust charges for dropping peripheral support... Intel picks and chooses the technologies they support directly (USB vs. FireWire?). As far as I'm aware, they were never sued for dropping support for the ISA bus, the EISA bus, or anything else. They wouldn't get called in by the DOJ for dropping support for segmented memory.
Intel silicon is laden with legacy functionality; if they could shed it, they might be able to beat everyone else at this game.
But that's not happening. Intel doesn't produce niche chips for the general-purpose computer market (i.e., everything that isn't a mobile phone), and they have to support what's already out there.
And this is the albatross around their neck. If they keep thinking that their current "one architecture to support every possible use case" approach is the way to go, they'll find that x86 is a niche chip used only by legacy applications.