Have you ever worked in a business? And in a business that wasn't started up in the last 3 years, i.e. one that has some amount of legacy software, data, etc. sitting in systems that management spent big money on, that staff is trained for, and so on?
The idea that "x86 is obsolete technology" is laughable. The overwhelming majority of software used to make money in most industries runs only on x86/amd64. And most of the business software that doesn't run on x86 runs on, oh, I dunno, IBM z/Architecture, IBM i (formerly AS/400), etc. I'm not sure how much VMS is still out there.
I shouldn't need to tell you this, but the purpose of IT in every organization is to provide computer systems that help that business make money at whatever that business does. If you work for Boeing in IT, your goal is to provide systems that can be used to design, support, etc. commercial and military airplanes. If you work for Coca-Cola, your goal is to provide systems that can be used to manufacture soft drink syrups, oversee bottling plants, distribute finished beverages, and so on. The software that helps these businesses make money is overwhelmingly written for x86 (either Windows or, for client-server systems, some form of Linux/Unix on the backend) or IBM architectures.
I don't want to make ageist comments, but your perspective sounds like someone who has no appreciation of 'legacy' systems and their prevalence in the world around you. Go to a store and buy something using a credit card - your payment will be processed by mainframes running IBM z/Architecture. If you think x86 is obsolete technology, what does that make z/Architecture, with its 100% compatibility with software written all the way back in the mid-1960s? And flip the light switch in your room - do you think the power plant that supplies your electricity has control systems running on ARM, or anything else developed in the last two decades?!
There is a reason that x86 is popular in the business world and elsewhere. There is a reason that IT departments 'happily' spend money on expensive Windows laptops or desktops. It is not stupidity. Frankly, when you think people with decades of experience are doing things out of "stupidity", chances are, if you want to see stupidity, it is between your chair and your keyboard. Just because you have no real world experience doesn't mean that the people who see value in what you view as obsolete are wrong.
Frankly, you might as well go and call up, say, a construction company and tell them that they're stupid to buy pickup trucks when a Toyota Prius gets way better fuel economy and costs one third as much. They will laugh at you and tell you that it's impossible to carry the materials they need for their work in the back of a Prius. Just because you think a Prius is a higher-tech, more environmentally friendly vehicle does not mean that there isn't a place for one-ton pickup trucks in the construction industry. And if you are too closed-minded to see it, that's on you, not on them.
Also - I'm presuming that I'm a bit older than you, because I remember when everybody was excited about RISC this, PowerPC that, etc. Go and find a Mac magazine from Oct. or Nov. 1991 when the AIM alliance was announced, all the excitement about how amazing PowerPC was going to be, how it was the future, etc. Then realize that, 15 years later, Apple started selling systems that are functionally/architecturally IBM compatibles (I suspect, though I have never tried, at least the first generation of Intel Macs with "Boot Camp" are capable of booting MS-DOS; they can certainly BIOS boot Windows XP). And then you start to look at the world a bit differently - why did this exciting platform turn into a complete flop 15 years later, while Windows (which was a joke in 1991, two years before the first version of NT would ship) running on an architecture that everybody considered outdated went on to run the world?
And in 1991, if you were a kid like me, you encountered vague references to big systems - UNIX, VAX, AS/400, RS/6000, Alpha, Silicon Graphics, etc - in magazines. Big systems that people used for Serious Serious Work and that were talked about in Serious Places, not magazines you could buy for $4. And guess what - x86, and to a lesser extent Windows NT, basically ate all those big systems. The architecture that everyone considered a joke in 1991 (with its 640K memory contortions, etc) and two operating systems that didn't exist in 1991 (Windows NT and Linux) somehow ate all those unmentionably big systems.
If someone, in 1992, had said that they wanted to port some Big Serious Software that ran on *NIX workstations or IBM systems to Windows on x86, any 8-12 year old kid who read magazines would have been like "you're an idiot. x86 is dead. RISC FTW! And Taligent!" (Remember that Windows NT ran on a bunch of exciting RISC architectures, too... all of which promptly flopped. And as for Taligent, I don't think their operating system went much beyond the excited magazine articles.) And yet... 10 years later, essentially all of that Big Serious Software was running exclusively on NT or Linux on x86 machines.
Excitement (and excitement over perceived technical merit) does not guarantee long-term success; in fact, if you took a magazine like, say, Byte (first published in 1975), went back over everything they wrote about, and looked at how those technologies ended up doing, I would guess that most of the things they were excited about ended up flopping. And meanwhile, x86, Windows, etc. just kept going and going. If you had told people in 1981, reviewing the original IBM PC with its 8088, that descendants of that architecture would not only dominate the nascent microcomputer market, but would swallow the minicomputer market and make a serious dent in mainframes, they would have laughed at you.
In its list of the 10 best car engines of the 20th century (https://www.wardsauto.com/news-analysis/10-best-engines-20th-century), Ward's Auto described the GM/Buick 3.8L engine as "the poster child of a bad idea turned good through fastidious refinement." That, in my mind, describes the x86/IBM PC architecture as well - it was a bad idea originally, it remained a bad idea, it's still considered a bad idea 40 years later, yet somehow, through fastidious refinement (largely by Intel and Microsoft, with some contributions from Compaq, AMD, Linus Torvalds, and others along the way), it has basically dominated everything else and delivered absolutely unbeatable performance-per-dollar. In fact, in 40 years of it being considered a bad idea, it is only in the last, oh, 3 years - and only after Intel lost their manufacturing edge to TSMC - that any alternative has shown ANY prospect of supplanting it, and so far primarily in laptops. I'm not sure where ARM-for-servers efforts are at; x86 and its companion GPUs from NVIDIA/AMD are still holding their own on the desktop. Dismissing x86 now, only a few years after the first serious alternative showed up, seems rather hasty to me.
Honestly, go into a business, any business of more than, I dunno, 50 employees, rip out every x86 system, rip out every IBM legacy system, etc. (That includes, naturally, cancelling or replacing every cloud service powered by x86 systems as well - if x86 is obsolete, surely having x86 software running in Microsoft's datacenter instead of yours is not acceptable.) Your boss had better have the bankruptcy lawyer's number on speed dial, because he/she will need it once you have destroyed the business' ability to make money, and/or sent it back to the 1930s, and/or spent 20 times the business' annual profits on new systems and run out of money before those new systems are ready.
One other point, which again reveals your lack of real-world experience: you don't appreciate the benefits of sticking with things that work. If your company has been using software X for 20 years, your staff is trained on how to use software X to do their jobs, and your IT department knows how to support software X and how to assist staff with the regular problems seen with it, then who cares if software X runs on an OS that arguably isn't the best, on a processor architecture that arguably isn't the best? Or even if software Y might be a little bit better in the abstract, and might be the choice you would make if you started the company from scratch today? Do you really think that any reasonable boss will be like "oh, yes, let's rip out software X, migrate all the data to software Y, retrain all the staff on software Y, spend 6 months figuring out the quirks of software Y when we already know the quirks of software X, throw out our Windows hardware... and all this because ARM Macs are the best CPU architecture, x86 is obsolete, and Lenovo is 'cheap Chinese junk'"?! No - the reasonable boss will rip out the idiot who thinks this is a good business decision, i.e. you, and keep running software X on Windows/x86, ideally until his/her retirement and long after.
And that's what you don't seem to understand when talking about things like heat and battery life: the fact that a Lenovo laptop can run 8 hours on battery while an Apple silicon laptop can run 18 hours is irrelevant in the business world. The fact that it heats your lap more is irrelevant. The fact that the Lenovo laptop is compatible with your already-paid-for management systems (because, in a business, centralized management of computers is critical), that it runs the software your business is based on, and that your staff is trained to use that software on that OS on that machine is highly relevant - because, frankly, changing any of those three things will cost tens, if not hundreds, of thousands of dollars (or more) and introduce all kinds of risks. (If you've ever been in a business during the first week a new technology is rolled out, you will know what I mean...) And for what...? Double the battery life? Lower temperatures? Nicer screens? If you tell a boss that you want to rebuild half the company's IT infrastructure and retrain all the staff to double the battery life on laptops that run on battery 5% of the time, any reasonable boss will ask you what an external battery pack for the Lenovo laptops costs and tell you to buy one for everybody who complains about battery life. And if you don't understand why that boss is right and why you are wrong, well... you have a lot to learn about the corporate world. You can start learning it today, or you can learn it from your first boss or two.
Always, always, always remember one thing if you intend to make a career in IT - the purpose of IT is to support the company/organization's operations and to do so as efficiently and invisibly as possible. Not to build some kind of abstract showcase of the best/trendiest/etc technology.