To me, it looks like what other chipmakers are working on is more powerful processors, though that comes at the expense of efficiency. It's the classic chipmaker's methodology: just increase the frequency and the transistor count, power draw be damned, to make the chip as powerful as possible. But that added power means more heat and more energy draw, which in turn requires more cooling, larger devices to accommodate said cooling, and, almost always, a constant plugged-in power supply.
This is now a big issue in PC-land, for both CPUs and GPUs. You can see it directly in CPU and GPU reviews today, especially launch-day coverage and the detailed technical reviews on YouTube.
Watts and efficiency used to be an afterthought. No longer. Reviewers now spend far more time on watts, temperatures, thermals, performance-per-watt, and detailed efficiency comparisons between CPUs and GPUs. They include detailed graphs and additional testing to push these components and to challenge chipmakers' claims.
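If it helps to make the performance-per-watt metric concrete, here's a minimal sketch of how reviewers typically compute it: a benchmark score divided by average power draw. The chip names, scores, and wattages below are made-up placeholders to show the arithmetic, not measured results.

```python
# Minimal sketch: performance-per-watt = benchmark score / average power draw (watts).
# All names and numbers below are illustrative placeholders, not real measurements.

def perf_per_watt(score: float, avg_watts: float) -> float:
    """Higher is better: how much benchmark performance you get per watt consumed."""
    return score / avg_watts

# Hypothetical chips with made-up scores and average power figures.
chips = {
    "Chip A (high clocks, high power)": (40_000, 320),   # (score, avg watts)
    "Chip B (lower clocks, efficient)": (32_000, 110),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {perf_per_watt(score, watts):.1f} points per watt")
```

A chip that loses the raw benchmark can still win handily on this metric, which is exactly the comparison reviewers are now making.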
It's become a big issue for three reasons: the latest hulking GPU cards from Nvidia, especially the 4090; the very hot top-end chips from Intel; and skyrocketing power costs in the EU. The latest top-tier stuff in PC-land simply devours watts. All of it is very inefficient, hence 750-watt, 1,000-watt, and even 1,200-watt power supplies are becoming more common in some of these builds. And these systems run hot, so now there is also a lot more coverage of air and water cooling, too.
Apple is doing so much better in this regard. For example, the latest Mac Studio is rated at 10 watts idle and 295 watts max with the M2 Ultra. You couldn't power the top PC GPU cards with that, let alone an entire computer.
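To put rough numbers on that comparison: Apple's published figures for the M2 Ultra Mac Studio are 10 W idle and 295 W max for the whole machine, while Nvidia rates the RTX 4090 at 450 W of board power and Intel rates the i9-13900K at 253 W maximum turbo power. The snippet below just does the arithmetic; these are vendor ratings, not measurements of mine, and real-world draw varies by workload.

```python
# Vendor-rated power figures (watts); published ratings, not my own measurements.
mac_studio_m2_ultra_max = 295   # Apple's stated max power for the entire Mac Studio
rtx_4090_board_power = 450      # Nvidia's rated total board power for the GPU alone
i9_13900k_max_turbo = 253       # Intel's rated maximum turbo power for the CPU alone

print(f"RTX 4090 alone vs. entire Mac Studio at max: "
      f"{rtx_4090_board_power / mac_studio_m2_ultra_max:.2f}x")
print(f"4090 + 13900K vs. entire Mac Studio at max: "
      f"{(rtx_4090_board_power + i9_13900k_max_turbo) / mac_studio_m2_ultra_max:.2f}x")
```

The GPU card by itself is rated at roughly 1.5x the whole Mac Studio's maximum, and GPU plus CPU together come to well over 2x, before counting the rest of the PC.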