It's thinking like this at Intel that led to their current predicament.
No - Intel's problem is that they have been "too big to fail", and able to succeed through politics rather than innovation, ever since IBM "made" them by picking their kludgy, stopgap 8/16-bit 8088 (a cut-down version of the 16-bit 8086) for the original IBM PC. (This was paired with Microsoft's kludgy CP/M knock-off operating system to make a deeply mediocre machine whose only notable features were the three magic letters on the front and an army of pinstripe-suited salespeople backed by an existing office-equipment and mainframe empire.) Sure, Intel arguably got the whole ball rolling in the 1970s with the 8080 - but by the 80s they'd lost ground to the Zilog Z80 and even the 6502, and several "proper" 32-bit chips were in the pipeline. Intel and Microsoft managed to lock the industry into 1970s CPU architecture for a couple of decades - when we could have had the 68k, a clean 32-bit instruction set, a Unix-based OS and high-level source compatibility.
Even the current x86-64 instruction set was designed by AMD, not Intel.
The first ARM in 1987 wiped the floor with the contemporary Intel 80286 performance-wise - but it didn't run Windows (except under emulation), so it went nowhere.
Intel's Unique Selling Point is legacy compatibility, and this is now being undermined by a perfect storm of industry changes.

There's Intel's almost total failure in the emerging mobile market, which is now dominated by ARM. The rise of Linux and web-based tech with its roots in Unix/Linux - and its culture of hardware-agnostic software design and open source (there's some advantage to running Linux on x86, but most of the key Unix/Linux software packages have supported ARM and other architectures since forever) - is giving Intel serious competition in the server space, where power consumption is also a big issue. Apple - even if they remain the underdog in the personal computer market - have demonstrated to the world that you can make high-performance laptops and low-end desktop workstations without x86, and Microsoft seem to be making a serious attempt at Windows on ARM this time.

Then, in general, as computers have got more powerful, more and more software is written in portable, high-level code and uses operating system frameworks for things like graphics, multithreading, vector processing and neural networks rather than direct hardware access. An increasing amount of stuff is written in scripting languages like Python or JavaScript. Even modern Windows ships many applications as Common Language Runtime bytecode rather than x86 or ARM binaries, Android apps typically ship as bytecode, and even the Apple App Store has the capability to distribute "compile on delivery" bytecode. Apple have also demonstrated - with Rosetta 2 - how effectively x86 binaries can be translated to ARM once you've kicked out all of the legacy stuff. I think Windows on ARM's main problem is the bits of legacy Win16, Win32, etc. binaries still floating around.
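As a minimal sketch of what that hardware-agnostic approach looks like in practice (the file name and build commands below are just illustrative assumptions, not taken from any particular project), the same C source builds unchanged for x86-64 or ARM64 - only the compiler target changes:

    /* hello_portable.c - trivial architecture-agnostic C.
       Nothing here depends on the CPU; the preprocessor checks below
       only report which target the compiler happened to build for. */
    #include <stdio.h>

    int main(void) {
    #if defined(__x86_64__)
        puts("Built for x86-64");
    #elif defined(__aarch64__)
        puts("Built for ARM64 (AArch64)");
    #else
        puts("Built for some other architecture");
    #endif
        return 0;
    }

    /* Assuming clang with the relevant sysroots installed, something like:
         clang -target x86_64-linux-gnu  hello_portable.c -o hello_x86
         clang -target aarch64-linux-gnu hello_portable.c -o hello_arm64 */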
Put simply, Intel's business model relies on people needing to run legacy code, and people ain't writing legacy code any more - so the clock is ticking.
The one area where Apple Silicon has so far failed to convince is the high-end personal workstation market - the new Mac Pro satisfies a very limited niche of people who need high PCIe bandwidth but not discrete GPUs. If you really needed the 2019 Mac Pro, an x86 tower is probably still the tool for the job. The trouble for Intel there is that the only people still buying Intel are the dogmatic "workstations need Xeon because they just do" crowd, and AMD are slaughtering them on price/performance. Plus, the whole idea of a high-powered personal workstation isn't for the ages, as the industry shifts to cloud computing (and NVIDIA have some nice ARM-based datacentre-grade iron, such as Grace/Hopper, to show you).
Whatever you do with x86 tech, it is always going to be carrying around the extra weight of supporting the complex x86 instruction set(s) that more modern RISC-like ISAs don't need. Intel's future is probably to build chips around ARM, RISC-V or some other ISA - and rely on Rosetta-esque translation for x86 support. The trouble is that they will then have to compete with multiple rivals on technical merit, which they haven't needed to do for 40 years.
While part of this is down to Apple shaking up the industry from time to time, it certainly isn't just about what Apple released recently - it goes back at least as far as the Newton (when Apple put money into ARM), plus of course the iPhone's role in promoting the modern mobile scene.