I wouldn't count Intel out yet. People are talking like Intel is already a has-been. Legacy applications are a huge market for Intel, and companies won't be scrambling (at least imo) to redevelop entire working infrastructures at a cost of potentially billions of dollars for the newest chip on the block. Similar things were said over the years about Windows and Windows infrastructure, and look at Microsoft today.

Oh... 😄 No argument there. Intel dominated just about every level of the CPU market over the past couple of decades. I'd argue their powerhouse status began even before then, with their early innovations in microprocessor design, but they grew to pretty much own the general computing market.
That status doesn’t have to last forever though. We’ve seen plenty of other dominant players fall into obscurity in the blink of an eye.
The business school case study in all of this will be that the one part of the market Intel didn't dominate, cheap, low-power toys and gadgets, is where the seed was planted that grew to threaten the entire Intel empire. Intel dominated market share for general computing, not for the low-power gadgets that evolved into smartphones. I'd have to dig up the numbers again, but I'm pretty sure Apple now ships more processors, at higher profit, than Intel does.
That's where the tipping point is. Intel managed to keep x86 dominant because they had the money to do it: the best designers, the best process engineers, and an industry focused on optimizing for their architecture.
Apple, it turns out, has been using their massive market share, profits, and supplier relationships from mobile devices to match Intel on every one of those fronts. And since they started from a better CPU architecture and take a more modern view of the whole system, they've reached the point where they can beat Intel at their own game: they can just about run x86-targeted code through translation faster than Intel can run it natively.
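For anyone who hasn't looked at how that translation trick works: conceptually, a translator like Rosetta 2 decodes the x86 machine code and emits equivalent ARM64 sequences ahead of time, so the translated binary runs at near-native speed instead of being interpreted. Here's a deliberately toy sketch of the idea in Python; the register mapping and rewrite rules are invented for illustration, and a real translator operates on decoded machine code and has to model flags, memory ordering, and much more.

```python
# Toy "binary translator": maps a few textual x86 instructions onto ARM64
# equivalents. Purely illustrative; register assignments are made up.
REG_MAP = {"eax": "w0", "ebx": "w1", "ecx": "w2"}

RULES = {
    "mov": lambda d, s: [f"mov {d}, {s}"],
    "add": lambda d, s: [f"add {d}, {d}, {s}"],  # ARM64 ALU ops are 3-operand
    "sub": lambda d, s: [f"sub {d}, {d}, {s}"],
}

def translate(x86_line):
    """Translate one toy x86 instruction into an ARM64 sequence."""
    op, dst, src = x86_line.replace(",", " ").split()
    dst, src = REG_MAP.get(dst, dst), REG_MAP.get(src, src)
    return RULES[op](dst, src)

for insn in ["mov eax, 7", "add eax, ebx", "sub eax, ecx"]:
    print(f"{insn:15} ->  " + " ; ".join(translate(insn)))
```

Do the translation once, cache the result, and you only pay that cost the first time through, which is roughly why the performance hit ends up so small.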
When the money spigot to Intel starts to close, they'll struggle more and more to keep pace with the competition on increasingly limited resources.
You originally phrased your statement as RISC vs. CISC, saying CISC won. That seems like the wrong way to look at it. CISC as a philosophy didn't win anything; x86 dominated the market for a while. As a design philosophy, CISC is dead. (Actually, as a design philosophy, CISC never existed; the term was only coined in retrospect as a contrast to RISC, which was a design philosophy.) Can you point me to a single new CISC design in the past 30 years?
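Since it came up, here's the classic illustration of the split, hand-written rather than compiler output, so treat the exact syntax as approximate: a CISC ISA like x86 will happily read, modify, and write memory in a single instruction, while a load/store RISC like ARM64 makes every memory access explicit.

```python
# Same statement, "counter += 1", in the two styles. The assembly is shown
# as strings purely for side-by-side comparison.
examples = {
    "x86 (CISC style)": [
        "add dword ptr [counter], 1   ; read, add, write back in one instruction",
    ],
    "ARM64 (RISC style)": [
        "adrp x0, counter              // form the address",
        "ldr  w1, [x0, :lo12:counter]  // explicit load",
        "add  w1, w1, #1               // register-only ALU op",
        "str  w1, [x0, :lo12:counter]  // explicit store",
    ],
}

for style, insns in examples.items():
    print(style)
    for insn in insns:
        print("    " + insn)
```

The punchline, of course, is that modern x86 chips crack those CISC instructions into RISC-like micro-ops internally anyway.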
Intel wouldn't be brought up on anti-trust charges for dropping peripheral support. Intel picks and chooses which technologies they support directly (USB vs. FireWire, anyone?). As far as I'm aware, they were never sued for dropping support for the ISA bus, or the EISA bus, or anything else, and they wouldn't get called in by the DOJ for dropping support for segmented memory.
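(For the curious: segmented memory is the old 8086 scheme where a 16-bit segment and a 16-bit offset combine into a 20-bit address. The numbers below are mine, just to illustrate the kind of legacy baggage being talked about.)

```python
def real_mode_linear(segment: int, offset: int) -> int:
    """Classic 8086 real-mode addressing: linear = segment * 16 + offset."""
    return ((segment << 4) + offset) & 0xFFFFF  # 20-bit address, wraps at 1 MiB

print(hex(real_mode_linear(0xF000, 0xFFF0)))  # 0xffff0
print(hex(real_mode_linear(0xFFFF, 0x0000)))  # 0xffff0 again: two different
                                              # segment:offset pairs aliasing
                                              # the same byte
```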
And this is the albatross around their neck. If they keep thinking that their current "one architecture to support every possible use case" approach is the way to go, they'll find x86 reduced to a niche chip used only by legacy applications.
They (Intel) have to get their act together, though. For example, why was the Skylake architecture used across six generations of chips? Still, my money is on them: over the years they've survived threats from other chip makers that made faster chips.