So a desktop CPU that idles at ~70W manages to outperform a mobile chip that tops out at around 100W with both its CPU and GPU cores pushed to their limits?
Good job, Intel, good job.
 
Desktop CPU vs. mobile CPU

Let's start comparing apples to apples when the iMac Pro/Mac Pro launches next year with an M1-based desktop CPU.
Price, please? How does a computer with an Intel Core i9-12900K (plus an RTX 3080) compare in price to a Mac Pro? Not a good comparison.
 
I think Apple is still going to focus on cool and quiet. I seriously doubt they'll just throw more power at it.

Although I am not and never have been in the market for a Mac Pro, it’s the machine I am most curious to see in terms of what they’re doing with Apple Silicon.
Apple boasted that the 2019 Mac Pro has a 1.4kW power supply. They could lower it to 800 watts, brag about using less power, and still give Apple Silicon some serious headroom.
 
Price, please? How does a computer with an Intel Core i9-12900K (plus an RTX 3080) compare in price to a Mac Pro? Not a good comparison.
You're going to want a shiny new NVMe drive running on PCIe 4.0, DDR5 RAM, at least a 750W PSU, very active cooling, a larger tower, a decent mobo, and various sundries to bring it all together. Cha-ching!
 
Comparing a desktop processor that's hooked into a hydroelectric dam to a portable laptop attached to a 100 Wh battery that runs for 20 hours and can be operated while the battery recharges off a 140W charger.

Intel is in panic mode.
 
I read many comments pointing at the fact that Apple is yet to release a desktop-class M chip.
But actually it already has. The M1 was put in both the Mac Mini and the iMac. Surely not "pro", but definitely desktops.
I think Apple will draw little to no distinction between desktop and laptop chips from now on. The main distinction will be between the "regular" and "pro" versions. Perhaps the only exception will be the Mac Pro; for all the others, the same SoCs will be used in both laptops and desktops.
It's already like that.

Possible.

One benefit of the M1 chip is that it allowed the iMac to be as thin and silent as it is, though this may not be a benefit that everyone appreciates.

The next iMac Pro may not be able to beat Intel in raw performance, but the consumer would still benefit from a lower power draw and a smaller overall form factor, since you don't need as much space for cooling.

Alder Lake looks impressive on paper, but it doesn’t seem very practical outside of a few very specific scenarios.
 
The question is for how much longer.

M1 is not an Intel competitor and never will be, but it chums the water for everyone else who wants to take a bite out of Intel's market share. I'm guessing we're a couple of years from an entirely different PC landscape where we see multiple processor vendors peddling non-x86 architectures. Samsung and Qualcomm are often mentioned. I would not be surprised to see AMD also invest some of that new Ryzen cash into ARM or RISC-V architectures. I believe Intel has shown that it's culturally incapable of developing an alternative to x86.

Maybe the first steps will be in the Linux world but Microsoft isn't stupid. They'll say they aren't supporting ARM generally until the moment they do.

I don't expect that to happen until the more traditional PC makers have ARM-based systems to run it on, though. The failure of Intel is a threat to PC manufacturers too. There are a lot of people who won't move to Apple Silicon because they need Windows. If MS allowed Windows to run natively on AS without at least some caveats, Windows customers would be tempted to move to Apple as a Windows platform and away from HP, Dell, ASUS, etc. Those other players can't respond yet, and it would be bad business for Microsoft to undermine the bulk of its customer base (HP is a Microsoft customer; you and I are not).

I'd like to see more of these benchmarks explicitly compare performance under macOS on M1 versus Windows on Intel; right now all the benchmarks put the pressure on Intel only. A few headlines highlighting that Windows is also lagging because of its Intel dependence might help push the transition along.
Maybe, but just as Apple has been developing these chips for years, Intel needs a new roadmap. These new CPUs have been in development for some time, and going back to the drawing board will take a while.
In a few years, Apple can always just toggle back to Intel or AMD. They are good at doing these changes.
Extremely unlikely. Apple's roadmap for these chips is longer than a few years. The only scenario would be if this transition fails miserably in a year or two, which is probably not going to happen.
 
The crypto miners figured this out long ago, and it's where Intel needs to focus. Compute per watt. It isn't just about how much work can be done by the chip in the shortest time anymore. It's about how much the chip can do at the lowest energy cost.

The primary reason is that most compute is done in the cloud now. It's easy to string chips together there. Need a more powerful chip than the M1 Max? How about I just use two of them. And suddenly I've got more power than Intel's proc and it costs me less in electricity.

The secondary reason is that all of us consumers want battery life. Compute per watt is key to making our devices do more while lasting between charges. Why don't we see Intel chips in mobile phones? Because they haven't been competing in the right way.
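The compute-per-watt argument above can be sketched in a few lines of arithmetic. All scores and wattages below are illustrative assumptions chosen to mirror the thread's claims, not measured benchmark results:

```python
def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark points delivered per watt of package power."""
    return score / watts

# Illustrative figures only, not real measurements.
desktop = {"score": 18000, "watts": 240}  # big, hot desktop chip
mobile = {"score": 12000, "watts": 100}   # efficient mobile chip

# The mobile chip loses on raw score but wins on efficiency...
assert perf_per_watt(**mobile) > perf_per_watt(**desktop)

# ...so in the cloud you gang two of them together: double the score at
# double the power. Efficiency is unchanged, and now the pair beats the
# desktop chip on raw throughput while still drawing less total power.
pair_score = 2 * mobile["score"]  # 24000 > 18000
pair_watts = 2 * mobile["watts"]  # 200 W < 240 W
```

The exact numbers don't matter; as long as the efficient chip scales roughly linearly when clustered, it wins the data-center trade-off.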
 
M1 Max: 10 cores, extremely fast, quiet, cool, efficient.
Intel 12th gen: 16 cores, slightly faster, one heck of a lot hotter, an absolute power hog.
I know what I find more impressive, and it’s not something you need geekbench to notice.
They are a lot more than "slightly faster": 1.5x, as per the article. Plus they're competing with AMD Ryzen on the desktop in the PC space, which they crush at the same price points.

As a PC Ryzen 5900X owner, I admire what Intel has done here. It's a beast, and for desktop usage power draw isn't an issue. Light tasks will mostly use the efficiency cores anyway and then scale up to the big performance cores if needed, so it's hardly a power hog.

I'm sure Apple will produce a more capable M1-family chip soon enough to compete with it in the Mac Pro (when they finally update it), but it will likely cost an absolute fortune. Intel's 12700K, which also beats the M1 Max, costs just £279 in the UK. You can build a high-end compute PC that outclasses the M1 Max for peanuts in comparison.

But to be fair, Intel has nothing that competes with the M1 Pro or Max for laptop use at this point.
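To put the "slightly faster vs. power hog" trade-off in rough numbers: using the article's 1.5x multi-core figure and assumed sustained package powers of about 240W for the i9-12900K versus 100W for the M1 Max (both assumptions, not measurements), a quick sketch:

```python
# Assumed figures: 1.5x speedup (from the article) and rough sustained
# package powers of 240W vs 100W. Only the ratios matter here.
speedup = 1.5
power_ratio = 240 / 100  # the desktop chip draws ~2.4x the power...

# ...to do 1.5x the work, so per joule it does only 62.5% as much.
efficiency_ratio = speedup / power_ratio
print(f"i9 work per joule vs M1 Max: {efficiency_ratio:.1%}")  # 62.5%
```

Under those assumptions, "faster in absolute terms" and "far less efficient" are both true at once, which is roughly what this thread is arguing about.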
 
Never say never ... if and when Intel outperforms ASi consistently in all aspects, it becomes a business decision. IF and when.
If Apple had made a wholesale change from Intel, then maybe there would be some merit to that line of thought. But Apple has been making its own CPUs for the iPhone ever since Intel completely blew that opportunity, which means Apple can better allocate resources to Apple Silicon CPUs for its own iPads and Macs, since it already makes a new one every year for the iPhone. This is not some short-term thing until Intel does better and gets back into Apple's good graces. Intel is done, and AMD was never in the running. Neither can reach the performance per watt that Apple wants, so they're both no-gos. Sorry, I'm saying Never Ever.
 
I think it's good for Intel; not that much, but good. Can't imagine how much power they'll pull for the laptop version. Apple is ahead.
 
If Intel can compete in terms of speed, then all the better. But Apple haven't unveiled the desktop machines yet, and by the time Alder Lake surfaces we'll be talking M2, M2 Pro, M2 Max.
 
So a desktop CPU that idles at ~70W manages to outperform a mobile chip that tops out at around 100W with both its CPU and GPU cores pushed to their limits?
Good job, Intel, good job.
The CPU doesn't consume anywhere near 70W at idle. A whole system including CPU, motherboard, high-end Nvidia 3080 card, storage and DDR5-RAM uses less than 60W idle.

 
If Apple had made a wholesale change from Intel, then maybe there would be some merit to that line of thought. But Apple has been making its own CPUs for the iPhone ever since Intel completely blew that opportunity, which means Apple can better allocate resources to Apple Silicon CPUs for its own iPads and Macs, since it already makes a new one every year for the iPhone. This is not some short-term thing until Intel does better and gets back into Apple's good graces. Intel is done, and AMD was never in the running. Neither can reach the performance per watt that Apple wants, so they're both no-gos. Sorry, I'm saying Never Ever.
I follow your logic and I have my own doubts, which is why i bolded the if/when.
But Arm is rising, others will follow Apple's lead, and RISC-V is here, so the computing landscape as we know it is changing. Note that I didn't say x86. Who knows what we'll have in 10 years? That's why I say never say never ...
 
So... what are those "efficiency" cores for, if it still uses 125W at base frequency?
 
The crypto miners figured this out long ago, and it's where Intel needs to focus. Compute per watt. It isn't just about how much work can be done by the chip in the shortest time anymore. It's about how much the chip can do at the lowest energy cost.
I'm sure they are well aware of it. The problem for them is that they are still lagging behind TSMC's manufacturing process, which is the main driver behind Apple's power efficiency. If they manage to catch up to TSMC (which will take 1-2 years assuming everything goes as planned for them) they will become competitive in terms of efficiency.
 
Your argument is either that the 5700G scores close to the M1 Max (while boosted an entire GHz above its base) or that it draws 65W. Pick one, because there's no way it draws just 65W while clocked at 4.8 GHz instead of 3.8.
Well you'd be wrong then and better start believing, as unlike Intel, AMD actually calculates its rated TDP values WITH BOOST FULLY ENABLED!

This means the Ryzen 7 5700G does in fact pull almost exactly 65W when under a full 100% load (specifically "71W" in this particular test) with the CPU boosting as high as it possibly can.

tomshardware.com/reviews/amd-ryzen-7-5700g-review/2

Yes, AMD's x86 "Zen" CPU architecture really is just THAT insanely power efficient!

So much so that the 7nm R7 5700G in laptop form (the 15W but still 8c/16t Ryzen 7 5800U) is actually in the same efficiency ballpark as Apple's first-gen 5nm M1 silicon!
AND AMD's "Cézanne" (7nm Zen 3 & Vega APU) is also a complete SoC (i.e. CPU+GPU+I/O all on the same chip), so you can't even make the "but it's a CPU vs a full SoC!" argument either.

Hell, AMD's even an ENTIRE fab node behind Apple (TSMC 7nm vs 5nm) & yet they are STILL this crazy close in raw efficiency! Thus there's a VERY good chance that AMD's next-gen Ryzen Mobile 6000 series "Rembrandt" APU (6nm Zen 3+ & RDNA 2) launching at CES in January can definitively snatch back the "perf per watt" crown (only for M2 to steal it back later in the year ofc, but that's just how this stuff goes).

TL;DR - AMD has been right on Apple's efficiency tail this ENTIRE time (the Zen 3 APU & M1 released around the same time last year), and all while still being a full fab node behind! x86 is far, FAAAAAAR from dead. Just because Intel is struggling doesn't mean "x86 is done for in the near future" like many of y'all seem to believe...

(And Intel really isn't even struggling anymore, tbh... Unlike the thermonuclear i9, the i7-12700K is seriously efficient for such a high-performance desktop chip, & mobile-specific Alder Lake should look even better. Only the i9 completely threw efficiency out the window for that last bit of possible clock speed/performance, power & thermals be damned, FX-9590 style.)
 
The CPU doesn't consume anywhere near 70W at idle. A whole system including CPU, motherboard, high-end Nvidia 3080 card, storage and DDR5-RAM uses less than 60W idle.

I took the result from https://www.trustedreviews.com/reviews/intel-core-i9-12900k and they claimed that figure for the chip, not the whole system... but OK, I'll trust the TechPowerUp guys on this one.
Still, Intel doesn't look good. Imagine the beating this is going to take when Apple releases proper desktop designs with higher TDP limits.
 
Possible.

One benefit of the M1 chip is that it allowed the iMac to be as thin and silent as it is, though this may not be a benefit that everyone appreciates.

The next iMac Pro may not be able to beat Intel in raw performance, but the consumer would still benefit from a lower power draw and a smaller overall form factor, since you don't need as much space for cooling.

Alder Lake looks impressive on paper, but it doesn’t seem very practical outside of a few very specific scenarios.
But it will be the absolute best for gamers, right?
 