You don't hear the fan on the 2021 MacBook Pro with M1 Max because it prefers to cook itself internally at 100C+. Why not try turning off Turbo Boost on the Intel MacBook Pro? Even on AMD I turn off turbo boost, since it's overkill and power consumption scales up far faster than performance, so it's not even worth it.
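The "power scales faster than performance" claim can be sketched with the classic CMOS dynamic-power model, P ≈ C·V²·f: reaching boost clocks also requires raising the core voltage, so power grows roughly with the cube of frequency. A minimal sketch in Python, where the capacitance constant and the voltage/frequency points are purely illustrative, not measured values for any real chip:

```python
# Back-of-envelope CMOS dynamic power model: P ≈ C * V^2 * f.
# Voltage must rise with frequency in the boost range, so power grows
# much faster than clock speed. All numbers below are illustrative.

def dynamic_power(capacitance, voltage, freq_ghz):
    """Dynamic switching power in watts (toy model)."""
    return capacitance * voltage ** 2 * freq_ghz

# Hypothetical voltage/frequency operating points for a laptop CPU.
points = [(3.0, 0.9), (4.0, 1.1), (5.0, 1.3)]  # (GHz, volts)
C = 10.0  # arbitrary effective-capacitance constant

for freq, volts in points:
    p = dynamic_power(C, volts, freq)
    print(f"{freq:.1f} GHz @ {volts:.2f} V -> {p:5.1f} W, "
          f"perf/W = {freq / p:.3f}")
```

On these toy numbers, going from 3.0 GHz to 5.0 GHz buys about 1.7x the throughput for about 3.5x the power, which is why disabling boost costs relatively little sustained performance per watt.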
Reference posts #9 and #14 from the "M1 Max running HOT" thread, both by cmaier.
 
Apple wants anything with better than a basic GPU to be for "power users" willing to spend $3-4K for a workstation class machine - that's just not reasonable for a gaming system.
I see your point and thank you for sharing it.
The part I quoted is what I'd like to comment on: Apple does not ship a basic GPU with any of the M1 Macs.
Even the low-end M1s get really good graphics performance in games and productivity work.

The base M1s can game very well, though you're better off getting the MBP over the MBA. Even the low-end MBP can play a lot of games really well, and that's through Rosetta 2.
 
The M1 likely can't scale to this kind of clock rate, even if Apple wanted to. Perhaps we'll learn more about the limits if the rumored Mac Pro with M-CPU is real.
The whole point is that the M1 does not need to ramp up to that higher clock; it gets all the performance it needs at lower clocks.
 
Lol, show me a Windows machine with more innovation than the latest MacBook Pro.

If you think Intel's "roadmap" (if they can even keep it on track and meet it) can compete with Apple's silicon roadmap, then you haven't been paying attention to history. If they had shown they had that capability, Apple never would have left Intel in the first place.
I guess people think Apple went 100% all out with the M1 and will be stuck with those cores, clocks, and everything else, making the M2/M3 and beyond useless? Essentially painting themselves into a corner with their first processor?
 
Some people have difficulty conceiving of a use for computers outside gaming
Is there that much of a need for a computer beyond gaming? Unless you are the 1% buying a computer as the tool of your trade (and you work for yourself), everything else a computer used to do can be done on a free android phone everyone has in their pocket / on their hip.

As far as gaming goes, computer gaming is vastly different from phone gaming or even console gaming, and has been since the advent of home computers. You needed a computer like an Apple II or Atari ST to run a game like Zork, because it wasn't built for an Atari 2600.
 
Or you could simply acknowledge that increasing IPC and raising clock rates are both perfectly valid ways to improve performance, each with its own tradeoffs.
Raising clock rates and increasing power consumption are perfectly valid ways to improve your performance if you’re unable to increase IPC, yes. If, next year, Apple releases an M2 that’s NOTHING but a bump in speed and an increase in power consumption, it would be because they could not achieve an increase in IPC.
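Single-thread performance is roughly IPC × clock, so the two levers are interchangeable on paper. A minimal sketch, with entirely hypothetical figures, showing how a wide, lower-clocked core and a narrower, higher-clocked core can land in roughly the same place:

```python
# Two valid levers on single-thread performance: instructions per
# cycle (IPC) and clock frequency. All figures below are hypothetical.

def perf(ipc, clock_ghz):
    """Billions of instructions per second (toy model)."""
    return ipc * clock_ghz

wide_low_clock = perf(ipc=8.0, clock_ghz=3.2)     # wide core, modest clock
narrow_high_clock = perf(ipc=5.2, clock_ghz=5.0)  # narrower core, high clock

print(wide_low_clock, narrow_high_clock)  # roughly comparable throughput
```

The tradeoff is that the high-clock design pays the voltage/power cost described earlier, while the wide design pays in die area and design complexity.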
 
Intel has literally smeared Apple with that graph.
If by "smeared", you mean something like "slandered" or "defamed", yeah, that is about right. Looking at anandtech's real-world test results from October, the 12900 is nowhere near as close as they suggest (and M1 Max will draw more power than 35W – almost 3x that at peak load, and make very good use of it). Real world, practical numbers do not look so good for Intel.
 
Is there that much of a need for a computer beyond gaming? Unless you are the 1% buying a computer as the tool of your trade (and you work for yourself), everything else a computer used to do can be done on a free android phone everyone has in their pocket / on their hip.
I found a number that said there were 2 billion computers in the world in 2019. Another statistic said there were 3.1 billion gamers in the world, and that around half gamed on PCs. Well, that's roughly 1.55 billion against the 2 billion computers, so... I'm sure someone's done a better job of the numbers than me, but unless there are a LOT of folks sharing a LOT of computers, that's well over half of the 2 billion computers being used for gaming.
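For what it's worth, the arithmetic holds up; a quick sanity check of the quoted figures (2 billion computers, 3.1 billion gamers, about half gaming on PC):

```python
# Sanity-checking the quoted statistics: ~2 billion computers (2019),
# ~3.1 billion gamers, roughly half of whom game on PC.
computers = 2.0e9
gamers = 3.1e9
pc_gamers = gamers * 0.5       # about 1.55 billion
share = pc_gamers / computers  # PC gamers per computer in existence

print(f"{pc_gamers / 1e9:.2f} billion PC gamers, "
      f"{share:.0%} of all computers")
```

Even allowing for shared machines, the share comes out well above half, which supports the point being made.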
 
Told you Intel would catch up. And BLOW BY.

Everybody HERE said INTEL was DEAD, INTEL was history.

Meanwhile ALL MACS, whether MacBook Pro, iMac, or Mac mini, are stuck on the M1 chip for the next 18 months.

INTEL has plenty of die shrinking left to do, while APPLE is already approaching 3nm and then will probably struggle.
I think the main advantage is Apple's: they no longer have to wait for a new Intel processor to introduce a new laptop. They can now introduce new products with the chips they want, when they want, because they make them now (and at LOW wattage at that). I left PCs when Apple moved to Intel, and I'm staying with them and their own chips now that they've dumped Intel. TSMC is going to bury INTEL; the best is yet to come!
 
Battery life is only half the problem. Laptop thermal throttling is an underrated issue. It doesn't matter what the peak performance is when it can only be sustained for 30 seconds.
 
Is there that much of a need for a computer beyond gaming? Unless you are the 1% buying a computer as the tool of your trade (and you work for yourself), everything else a computer used to do can be done on a free android phone everyone has in their pocket / on their hip.
You can technically do everything on the Android phone, but do you want to? In the US, it seems like everyone uses a computer for work, certainly more than 1%. I'd even want one just for non-work stuff like heavy messaging or watching movies together over Zoom.
 
Forgot to mention that this is Apple's first attempt at a desktop-class processor.
And of course Intel's chips will probably be 4-10 months behind schedule. Apple is glad it ditched Intel, in part because it repeatedly planned computer refreshes but had to shelve them because Intel didn't have an upgraded chip available on Apple's timeframe.
 
So it turns out that Intel used ICC for this "benchmark" and LLVM on the M1 Max. As was observed elsewhere, it isn't the first time shenanigans have been used to skew results: ICC was once instructed to run a deliberately slower code path on AMD processors, which could equally have benefited from ICC's compilation optimizations.

With the 228W TDP desktop ADL parts under boost, compared with 40-50 watts for the full M1 Max system (GPU included), I'm thinking that Apple should bring back a modern revision of this advert…

For context… this was Intel's advert for the Pentium II with MMX:


This was Apple's rebuttal:

and an even older personal favorite that still makes me chuckle!
 
The lifestyle company is beating the pros! I guess Intel doesn't want to end up like Microsoft after what it said about the iPhone.
 
Maybe for operating temperature, but for longevity, from experience, I like to target 70C and definitely no higher than an 80C peak.
Not really. 100 degrees is comfortably within the safe operating range.

From experience: I've never had a CPU die. The idea that CPU longevity is hampered by temps is an urban myth; you won't notice a (theoretically) decreased CPU life.
 
If by "smeared", you mean something like "slandered" or "defamed", yeah, that is about right. Looking at anandtech's real-world test results from October, the 12900 is nowhere near as close as they suggest (and M1 Max will draw more power than 35W – almost 3x that at peak load, and make very good use of it). Real world, practical numbers do not look so good for Intel.
Re: the M1 Max, the peak draw of 3x35W that you mention is with the GPU fully loaded as well. The CPU itself tops out somewhere in the 30-40W range. But that just makes the M1 Pro/Max all the more impressive from a perf-per-watt perspective.
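As a rough illustration of that perf-per-watt point: the benchmark scores and the Intel sustained-power figure below are hypothetical placeholders; only the ~35W CPU-only figure comes from the discussion above.

```python
# Toy perf-per-watt comparison. The scores and the 110 W figure are
# hypothetical; the ~35 W CPU-only draw is the figure from the thread.
chips = {
    "M1 Max (CPU only)": {"score": 12000, "watts": 35},
    "i9-12900HK": {"score": 13000, "watts": 110},
}

for name, c in chips.items():
    print(f"{name}: {c['score'] / c['watts']:.0f} points per watt")
```

Even if the higher-power chip wins on raw score, the efficiency gap is what matters in a laptop that has to sustain the load on battery and within a thermal envelope.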
 
Just throwing a few random things out there other than Office. Certainly niche circumstance, and Parallels or RDP would likely work for most.

But, there are always those outliers. Not everyone has VPN or VM or RDP access depending where they are on a job for example. We've got sales guys doing demos in a room without an ability to use any internet or cellular access while at a potential client site. They need locally run SQL Server, Access, ArcGIS Server, and a whole suite of tools that chug along painfully on a MS Surface.

Sales demos... isn't that what PowerPoint was made for? ;)
Painfully true jokes aside, I do know some devs and salespeople who run demos the same way and really only have a Mac laptop because they like the shiny hardware. But it has been quite a while since I've had anyone without cellular access whom I couldn't hook up with network access. The main vendor I work with always has something that's "not in this build, but I can show it to you in the dev build we're working on for the release of version X.x if you give me a sec to connect to the right VM".
 
If by "smeared", you mean something like "slandered" or "defamed", yeah, that is about right. Looking at anandtech's real-world test results from October, the 12900 is nowhere near as close as they suggest (and M1 Max will draw more power than 35W – almost 3x that at peak load, and make very good use of it). Real world, practical numbers do not look so good for Intel.

Yes. It was a double entendre. The Intel curve is also a blurry smear.

No kidding on that anandtech data!
 
“Faster performance per watt than Apple.” Uh, yeah. A $5,000 laptop that won't be available for a year is faster than the one I can buy tomorrow for $3,500. Did Intel hire Wimpy from Popeye to work for them?
 