Pretty nice. Apple silicon finally beats the 9900K at multi-core.

Is there a Metal benchmark yet?

I'm not a laptop guy but nice to see the card reader back

could almost see myself going back to a real mac in a couple years once my current hackintosh retires/gets demoted. depending on prices, ports and storage options of course
 
I did some Octane for Lightwave3D tests in prep for these new M1 Max units. I’ll update once I get mine. (November 5th currently)

Here are my current scores. Keep in mind Lightwave3D requires Rosetta 2, although I've only seen its CPU scores run even faster this way than on the previous i9 in native x86 mode!

GPU RENDERING

Octane PR11, M1 MacBook Air, Metal: 112 seconds (1:52)
Octane 2020.2, TR24, RTX 3090 (CUDA): 7 seconds

All I wanna know is if I can sell my gaming laptop and just use a MacBook for ALL my needs instead of having to hop back and forth. Infrequent gaming is the only reason I still have a PC in my house.
 
1834 ST / 17370 MT (i.e. faster in both).


That's the top-of-the-line mainstream desktop part, with 16 cores and 24 threads.

The M1 Pro/Max will probably still be better than the mobile Gen 12 (Alder Lake) versions.

If Intel gets to TSMC N3 at about the same time Apple does with some mobile product (at least for the GPU part), they should be closer than they are now. Intel isn't going to be able to jump back in one quick leap; they need to execute consistently for 3-4 years to be back on really solid ground again.

However, this result also omits the most interesting point!
Apple has more experience than Intel (or anyone else) at the use of small vs large cores on mobile, tablets, and "PC"s.
And they concluded that, tiny as their small cores are, it wasn't worth putting even 4 of them on a "high end desktop" class machine. Which suggests that for the sort of work these machines are expected to perform, small cores just aren't worth much. One or two are valuable to save power when reading email and watching movies, but they can't do much for heavy lifting. I'm honestly surprised by this -- I expected Apple to take by far the easier path of just cut-and-pasting the 4-core E-cluster. And yet they did not -- and they have far more real world knowledge than us.

So Alder Lake may look good in raw thread count, but in real life is the config above realistically going to be essentially an 8+2 machine, with the other 6 small cores irrelevant to anything?
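Incidentally, you can check how macOS itself describes that P/E split to software. A minimal Swift sketch, assuming the hw.perflevel* sysctl keys exposed on Apple silicon under macOS 12 (the key names and availability are my assumption, not something from this thread):

```swift
import Darwin

// Read an integer sysctl value by name; returns nil if the key doesn't exist.
func sysctlInt(_ name: String) -> Int? {
    var value: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname(name, &value, &size, nil, 0) == 0 else { return nil }
    return Int(value)
}

// On Apple silicon under macOS 12, perflevel0 is the performance (P) cluster
// and perflevel1 the efficiency (E) cluster; older systems won't have these keys.
let pCores = sysctlInt("hw.perflevel0.physicalcpu")
let eCores = sysctlInt("hw.perflevel1.physicalcpu")
print("P cores: \(pCores ?? 0), E cores: \(eCores ?? 0)")  // expect 8 and 2 on M1 Pro/Max
```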
 
In the US it is possible. Here's what I'm getting:

[screenshot attachment]
Nice! I was taking a look at exactly that for entertainment purposes (that's how I start; a month later I'm probably buying something) and found myself wondering whether to go 2TB with 32GB RAM or 1TB with 64GB RAM… I guess the point here is that fast external SSDs of 2TB or more are cheap to come by, while that's out of the question for this System+GPU unified RAM.
I'm also wondering how the 14" vs. 16" would compare; price-wise the same config is not much more. My favorite size was the 13", but in the past it throttled considerably more.

I know teraflops don't accurately describe how powerful a GPU is,

But RTX 2080 Desktop was 10.07 TFLOPS, and the M1 Max 32-core is 10.4 TFLOPS

So yeah, I would guess it is as fast as a mobile RTX 3070, maybe even faster, but it doesn't have any hardware ray tracing.
This is truly insane when put in raw numbers like this… and this is still not a desktop-targeted chip. Regarding HW ray tracing, the developer conferences and Xcode sample projects point to so many ray-tracing talks and samples that I wonder whether Apple will sooner or later flip the switch on the software side, or somehow use what's already there to accelerate that part.
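For what it's worth, the 10.4 TFLOPS headline number falls out of simple arithmetic. A rough sketch; the 128 FP32 ALUs per GPU core and the ~1.27 GHz clock here are my assumptions for illustration, not published specs:

```swift
// Back-of-envelope FP32 throughput: cores x ALUs per core x 2 ops per FMA x clock (GHz).
// The 128 ALUs per core and ~1.27 GHz clock are assumptions used for illustration.
func tflops(cores: Int, alusPerCore: Int = 128, clockGHz: Double) -> Double {
    Double(cores * alusPerCore) * 2.0 * clockGHz / 1000.0
}

print(tflops(cores: 32, clockGHz: 1.27))  // ~10.4 for the M1 Max 32-core
print(tflops(cores: 16, clockGHz: 1.27))  // ~5.2 for the M1 Pro 16-core
```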

Well, the stagnation of single-core performance is bad. Of course, that was to be expected, since all of this is essentially glued-together M1.
The sad thing is that last year Apple had the most performant single-core CPU on the planet by far.
Now, a year later, they're 30% slower than the latest AMD/Intel chips, which clock in around 2300 single-core in Geekbench 5.

Granted, I'm comparing laptops to power-hungry, high-GHz desktop computers, but the thing is that the M1 single core is the fastest single-core experience you can get on a Mac; there are no faster single-core Macs. Desktop users will get the same M1 cores, so we'll have to wait for the M2. I call this stagnation, if not regression compared to the competition.

For Intel, that's very good news.

For the record, single-core is what matters most, because 100% of software benefits from faster single-core performance, whereas only a few applications benefit from multicore performance (and only in some dedicated subtasks).
This means that most people with an M1 won't benefit from these new Macs. It also means the PC camp is already crushing the Mac in absolute performance, and will soon equal it in laptops.

Disappointing (though expected).

GPU performance, memory bandwidth, and the ProRes stuff are impressive though, but will only benefit a few people.
I was going to say something about this, but not anymore; there's some truth to these points (except for the mega-overclocked chips that go above 2000 on GB5).

I think what I can say is this (trying to see it in an optimistic light):
If single-core is someone's main use case, getting an M1 would be enough for a very long time… I sometimes use an M1 Air and it feels snappier and more responsive than my 2020 iMac workstation, where I spend 80% of my working time. For those who need the multi-core power, it's finally there (C4D, Blender, etc.); for those who don't, the M1 is already above and beyond their needs, with infinite battery to boot and great BT/WiFi/camera/etc. connectivity that just works, without having to deal with drivers of all sorts.
 
Just leaving this here: Apple is comparing the 16" Pro with the M1 Max 32 GPU cores to the RTX 3080.
[attachment: Apple's GPU performance vs. power chart]

That's a mobile 3080 with chopped-down clocks and chopped-down memory bandwidth (and capacity; narrower memory bus). The root problem is trying to shoehorn something primarily designed for the 'no boundaries' discrete add-in card market into a mobile product. Of course the curve stinks, because that's not what it was tuned for.
 
All I wanna know is if I can sell my gaming laptop and just use a MacBook for ALL my needs instead of having to hop back and forth. Infrequent gaming is the only reason I still have a PC in my house.

if and only if the games you play are available for macOS
 
I've gotta say, I calculated 13,000 as an estimate a few months ago for a 10-core M processor.
It's a hell of a beast, don't get me wrong, especially for a laptop, but I set my own expectations too high and now I look at that and I'm like "ok".

The M1 benches 7,700 and the M1 Max benches 11,500. It's not twice as powerful.

Apple stated the 10-core CPU is 70% faster than the 8-core in the M1, so that would in fact put it around 13,000. Wait for more tests. As is typical with GB, the scores are all over the place.
 
Apple stated the 10-core CPU is 70% faster than the 8-core in the M1, so that would in fact put it around 13,000. Wait for more tests. As is typical with GB, the scores are all over the place.
Up to 70% faster, yes, but not in every sub-test. What you see here, the ~50% improvement in Geekbench, is actually the average of all the subtests.
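Putting rough numbers on that, using the scores quoted earlier in the thread (the arithmetic is the only thing added here):

```swift
// Apple's "up to 70% faster" claim vs. the leaked Geekbench 5 multi-core score.
let m1MultiCore = 7_700.0        // M1 (4P + 4E) MT score quoted above
let m1MaxMultiCore = 11_500.0    // M1 Pro/Max (8P + 2E) MT score quoted above

let bestCase = m1MultiCore * 1.7                  // ~13,090 if the 70% held everywhere
let observedGain = m1MaxMultiCore / m1MultiCore   // ~1.49, i.e. roughly +50% on average

print(bestCase, observedGain)
```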
 
That's only *slightly* higher than my middle-of-the-road i9 (10900K) desktop. I don't feel so bad for not ordering the new M1 Max MBP now. (No way am I buying a notch.)
You do realize you're comparing a desktop chip to a mobile chip? And the i9 is hardly a middle-of-the-road chip.

Also, have you read how the notch will be invisible in full-screen apps, while otherwise the space to either side of it is taken up by the menu bar? In other words, you've been given 74 pixels of extra vertical real estate that would normally be taken up by a bezel.
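That 74-pixel figure is just the difference between the panel height and a plain 16:10 content area. A quick sketch using Apple's published panel resolutions (only the calculation itself is added here):

```swift
// Extra vertical pixels above the 16:10 content area, i.e. the menu-bar strip
// that flanks the notch. Panel resolutions are Apple's published specs.
func extraMenuBarRows(width: Int, height: Int) -> Int {
    height - (width * 10 / 16)
}

print(extraMenuBarRows(width: 3456, height: 2234))  // 74 on the 16-inch
print(extraMenuBarRows(width: 3024, height: 1964))  // 74 on the 14-inch
```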
 
The real test is in specific app-related performance. Compiling code, rendering (ProRes) video, running multiple streams of 4K/8K video, playing virtual sample libraries, running AfterEffects, etc. There are a bunch of additional accelerators in the M1 Pro/Max tuned to MacOS and app optimizations that Geekbench doesn't touch.

Can't wait to see how real-world users respond to this horsepower.
 
It is not immediately obvious that the M1 Max starts at 32GB RAM. So if you were interested in the 10-core M1 Pro with 32GB, then the extra money for the M1 Max is not that bad considering it is another step up in performance.
Saw that when I ordered my 14" minutes after the keynote; I was planning on the M1 Pro (which to be honest is sufficient for me) with 32GB, but decided to go for the M1 Max instead. Delivery next week!
 
It is not immediately obvious that the M1 Max starts at 32GB RAM. So if you were interested in the 10-core M1 Pro with 32GB, then the extra money for the M1 Max is not that bad considering it is another step up in performance.
Saw that when I ordered my 14" minutes after the keynote; I was planning on the M1 Pro (which to be honest is sufficient for me) with 32GB, but decided to go for the M1 Max instead. Delivery next week!
But you're charged for it lmao.

They say it's $300 more if you want the Max, but they charge you $700 for it; no wonder it can be more powerful.
 
All the years of criticism of Tim Cook, "caring about nothing other than phones and emojis", how he has abandoned the Mac. This is as far apart from that critique as one can get. Ditching Intel and creating amazing Apple silicon: by any measure, this year has been as transformative for Apple as any milestone in their history. This is a true game-changer (for the industry, not just for Apple; sorry Intel, you may lose more than Apple when all the dust settles). Well played, Mr. Cook, well played . . .
 
That's a mobile 3080 with chopped-down clocks and chopped-down memory bandwidth (and capacity; narrower memory bus). The root problem is trying to shoehorn something primarily designed for the 'no boundaries' discrete add-in card market into a mobile product. Of course the curve stinks, because that's not what it was tuned for.

Yeah, chopped-down clocks, but it's still one of the most powerful mobile chips available right now. The Razer Blade is not bad at all, and that 3080 runs at 105W compared to the M1 Max's 55W. IF and only IF Apple's chart is true, Intel is really getting their ass kicked.
 
I'm returning my Alienware X17 and trading in my M1 MBA to get the 16" MBP with M1 Max!! With a 120Hz screen and GPU performance similar to an RTX 3080, I can finally get rid of a Windows machine for gaming. I'll miss the Alienware X17's 360Hz screen, but the M1 Max is so much better than the i7-11800H at lower power consumption.
 