I'd also like to see what the Mac Pro ends up getting, but I am pretty sure it will start around $10k and likely max out around $20k, so the maxed-out Studio for $6800 is looking good. I just want to see a few real-world, app-specific benchmarks first, versus Geekbench numbers.
Specifically Cinema 4D render times and After Effects, please!

The difference between identically configured Mac Studios with entry level configurations of M1 Max and M1 Ultra is $1,400. Assuming the Mac Pro is offered with a 40 core M1, that implies a base price of roughly $6,800.

And an insane level of performance.
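The arithmetic behind that estimate can be sketched quickly. This is a back-of-the-envelope guess, not anything official: it assumes the $1,999/$3,999 entry Mac Studio prices and treats the $1,400 "identically configured" delta as the premium for each additional Max-sized die.

```python
# Back-of-the-envelope Mac Pro price guess, per the reasoning above.
# Assumptions: entry Mac Studio prices of $1,999 (M1 Max) and $3,999
# (M1 Ultra); the $1,400 "identically configured" delta is treated as
# the premium per extra M1 Max-sized die.

ultra_studio_base = 3999   # entry M1 Ultra Mac Studio (USD)
die_premium = 1400         # per extra Max die (an Ultra is 2 Max dies)

# A hypothetical 40-core part would be 4 Max dies, i.e. 2 more than an Ultra.
mac_pro_estimate = ultra_studio_base + 2 * die_premium
print(mac_pro_estimate)  # 6799 -- roughly the $6,800 figure above
```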
 
The difference between identically configured Mac Studios with entry level configurations of M1 Max and M1 Ultra is $1,400. Assuming the Mac Pro is offered with a 40 core M1, that implies a base price of roughly $6,800.

And an insane level of performance.
I think it will have to be dual sockets, if you believe them when they say the Ultra is the final M1 design.
 
Let US, the users, do the calculations! If these claims were true, by now EVERYthing would have been realtime for years!
3.4x faster this year, 2.5x faster next year, 4.3x faster the year after that. Yet we are still watching a stupid beachball during any minor calculation or when doing updates.

Hey man. How hard do you want us devs to work? We want to work as little as possible, and every new, faster CPU just lets us ship even fatter code and leave for vacation even sooner ;)
 
I think it will have to be dual sockets, if you believe them when they say the Ultra is the final M1 design.

Yea, that makes perfect sense.

Apple doesn't want to throw a lot of resources into their lowest-volume SoC. Dual sockets should get them 85% of the bang for 10% of the bucks, and they only have to make more Ultras, which helps lower its unit cost over time. It also makes for a much more flexible Mac Pro, where pricing can start with one low-end Ultra and scale all the way up to two high-end Ultras.

And possibly the CPUs will be user upgradable.
 
Yep. I have posted in other topics, but as a game developer, macOS is not important for me. I am focusing my attention on the remaining 70-80% of the desktop market: Windows. Even though I am using a cross-platform engine and framework, and I have tested my game on macOS, I am still not going to release it for Mac. And my game can run on a potato Windows PC, so any decent recent Mac can run it just fine! Think Factorio, Stardew Valley, or Terraria system requirements. It runs on an old laptop with Intel integrated graphics at 60fps!
I appreciate the answer. I'm not a developer or programmer, but if you've tested your software on a Mac, why not release it? Wouldn't that mean a bigger market?
 
Why 5 years from now? Why not embrace the insanity of this level of performance today? I don't understand the constant 'just wait' sentiment in this group at all.
I’m not sure I understand your comment. I am embracing the technology they publicized today. Why do you think I have a bad sentiment from it?
I said that given what was announced today, I can’t even fathom what things will be like in five years timeframe.
I'm lost as to what you mean.
 
Am I the only one who is skeptical of these GPU numbers? When independent reviewers measured the performance of the M1 Max GPU it got slaughtered by notebook level cards in spite of what Apple said it would do in anything that wasn't video encoding. I'm to suddenly believe that doubling the cores now makes it faster than a desktop 3090 or 6900? Really?
 
Am I the only one who is skeptical of these GPU numbers? When independent reviewers measured the performance of the M1 Max GPU it got slaughtered by notebook level cards in spite of what Apple said it would do in anything that wasn't video encoding. I'm to suddenly believe that doubling the cores now makes it faster than a desktop 3090 or 6900? Really?
Are you talking about gaming? Of course it’s going to lose at gaming. That’s like buying a $100 screwdriver and complaining it’s terrible at driving nails.
 
Am I the only one who is skeptical of these GPU numbers? When independent reviewers measured the performance of the M1 Max GPU it got slaughtered by notebook level cards in spite of what Apple said it would do in anything that wasn't video encoding. I'm to suddenly believe that doubling the cores now makes it faster than a desktop 3090 or 6900? Really?

Source? Because the only "slaughter" of the M1 Max I've seen was when Nvidia benchmarked custom code written for its latest and best cards in specialized tests.

It wasn't apples to oranges; it wasn't even fruit to another type of fruit.
 
Why 5 years from now? Why not embrace the insanity of this level of performance today? I don't understand the constant 'just wait' sentiment in this group at all.
They never said anything about waiting..

“it will be fascinating to see where we are in five years from now.”
 
Lightroom is so slow even on my M1 Max MacBook Pro. It’s not the computer, it’s the app; Lightroom is terrible, so an Ultra wouldn’t help.
When is the last time you used Lightroom? It’s been updated recently and those performance issues seem to have been sorted out. I’ve been using Lightroom professionally for over 10 years and the latest performance update made a big difference. Update your app.
 
Yes, the M1 Ultra GPU produces about 40 teraflops, which is about the same as the Xbox Series X RDNA GPU, but you can bet that we will never see all those beautiful ray-traced 4K games on a Mac Studio, because Apple have not had the foresight to work with the leading game developers. It’s a shame because Apple have the power to do it. You can say what you want about Microsoft, but they know their stuff when it comes to gaming, I mean they just bought Activision Blizzard for 68 billion dollars!
No. Apple quotes the Ultra's GPU compute at just over 20 teraflops. Not 40. It does have very high raster rates, even compared to a 3090, but it's significantly weaker in compute. And no ray tracing, no tensor cores (the two Neural Engines are slower than Nvidia's, etc.)

But yeah. For gaming, no way. No software!
 
It won’t.
The failure of gaming on the Mac has almost nothing to do with hardware.
Apple could shove the best top-of-the-line GPU, better than anything Intel, Nvidia or AMD have ever produced, into one of their computers, and scream from the mountaintops that Macs are now gaming beasts, and that wouldn’t change a thing.
It’s all about the game developers, most of whom don’t believe developing for the Mac is that important.

It’s not the developers; Apple just doesn’t care about desktop gaming, never has and never will unless there is a drastic change in senior leadership. Their hardware, especially the M1 Ultra, can easily compete with Windows PCs at the highest level if they just focused on making the software/drivers game-friendly and invested money to get triple-A studios on board. With the money Apple has, it would be a finger snap if they wanted to go after desktop gaming.

It’s definitely a missed opportunity now that the hardware is capable. I get Apple thinking that desktop gaming profit is just a fraction and not worth the investment, but they don’t seem to understand that it’s the only reason left to get a Windows machine. By breaking this and getting triple-A game studios on the Mac, they would gain significant market share, which in turn would further secure the ecosystem as those people switch to iPhones etc.

As it is now, I bought an RTX 3070 laptop just to play games, and now I'm buying a Mac Studio for everything else… such an awkward setup…
 
Yes, the M1 Ultra GPU produces about 40 teraflops, which is about the same as the Xbox Series X RDNA GPU, but you can bet that we will never see all those beautiful ray-traced 4K games on a Mac Studio, because Apple have not had the foresight to work with the leading game developers. It’s a shame because Apple have the power to do it. You can say what you want about Microsoft, but they know their stuff when it comes to gaming, I mean they just bought Activision Blizzard for 68 billion dollars!
Not sure where you're getting that 40 TFLOPs number. The wikipedia page for the M1 Max states, "In total, the M1 Max GPU contains up to 512 Execution units or 4096 ALUs, which have a maximum floating point (FP32) performance of 10.4 TFLOPs." Likewise, Tom's Hardware just published an article stating, "When it comes to the 64-core GPU on the M1 Ultra (8,192 execution units, 21 TFLOPs), Apple says that it offers performance on par with the NVIDIA GeForce RTX 3090, while consuming 200 fewer watts."
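As a cross-check on those figures: FP32 throughput is ALU count × 2 FLOPs per clock (one fused multiply-add) × clock rate. A quick sketch, with the ~1.27 GHz clock inferred from the quoted 10.4 TFLOPs number rather than taken from any official Apple spec:

```python
# FP32 throughput = ALUs * 2 FLOPs/cycle (one FMA) * clock rate.
# Apple doesn't publish the GPU clock; ~1.27 GHz is inferred here
# from the quoted 10.4 TFLOPs figure for the M1 Max.

def tflops(alus, clock_ghz):
    return alus * 2 * clock_ghz / 1000

m1_max_alus = 4096      # 512 execution units x 8 ALUs each
m1_ultra_alus = 8192    # double the Max
clock_ghz = 1.27        # inferred, not an official number

print(round(tflops(m1_max_alus, clock_ghz), 1))    # ~10.4
print(round(tflops(m1_ultra_alus, clock_ghz), 1))  # ~20.8
```

The doubled-ALU figure lands right at the ~21 TFLOPs Tom's Hardware quotes for the Ultra.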

TBH, I'm pretty dubious about that 80% faster than the W6900X claim. According to Geekbench, the W6900X scores 170102, while the M1 Max scores 64219... Extrapolating to the M1 Ultra, the score would be 2*64219 = 128438. For reference 80% faster than the W6900X would be a score of 170102*1.8 = 306183. Maybe Geekbench is flawed for the M1 GPUs or something else is going on, but either way that's one hell of a discrepancy.
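That discrepancy is easy to reproduce. The sketch below copies the scores from the post and makes the naive assumption that a Geekbench score scales linearly with a doubled GPU:

```python
# Sanity-checking the "80% faster than W6900X" claim against Geekbench.
w6900x = 170102   # Geekbench Metal score, Radeon Pro W6900X (from the post)
m1_max = 64219    # Geekbench Metal score, M1 Max (from the post)

m1_ultra_est = 2 * m1_max      # naive linear scaling for the Ultra
claim_target = w6900x * 1.8    # score implied by "80% faster"

print(m1_ultra_est)            # 128438
print(round(claim_target))     # 306184
# The implied score is ~2.4x the extrapolated one -- a huge discrepancy.
```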
 
Am I the only one who is skeptical of these GPU numbers? When independent reviewers measured the performance of the M1 Max GPU it got slaughtered by notebook level cards in spite of what Apple said it would do in anything that wasn't video encoding. I'm to suddenly believe that doubling the cores now makes it faster than a desktop 3090 or 6900? Really?
Exactly. I have an M1 Max laptop and a 3080 Ti PC. The Max is extraordinary for video editing. The dedicated encoding and decoding blocks and the unified memory really help it there; it matches and sometimes even beats the 3080 Ti (basically a 3090) at like 1/3 the power. On a laptop. On battery. But if I’m rendering anything 3D, the 3080 Ti in my PC is 2-3 times faster.

For a laptop it’s still extraordinary. It’s about the same in raw 3D as the old Vega 64 in my old iMac Pro (10.5 teraflops). But it’s nothing like modern GPUs (it’s roughly a desktop 3060 Ti; even Apple admits this today).

I do believe the 64-core Ultra will be as fast as, if not faster than, a 3090 for raster; the texel and pixel fill rates are significantly higher. But it’s a lot slower in compute (20 TFLOPs vs ~34?), never mind the lack of ray tracing and tensor blocks (the Neural Engine helps a lot here, but Nvidia is still faster and better supported in graphics).

Ultra is a VERY fast GPU and certainly somewhat competitive, but a 3080 will still match or beat it.

And Nvidia's 4000-series Lovelace chips late this year will double that 3090 speed, though I suspect Apple has one more trick up their sleeve for the Mac Pro.
 
Ah, found it at 25:20 into the event, "We're adding one last chip to the M1 family..." Can't believe I'd missed that.
Yeah, that, and the “we’re done except for Mac Pro,” were two very uncharacteristic revelations today. Until now, nobody knew for sure that a Mac Pro with M1 was actually coming, and nobody knew for sure that there wasn’t going to be an Ultra Duo.

Given these two data points, seems to me we’re most likely not going to see an M1 Mac Pro, and instead it will be an M2 variation (so they can offer a 40+ core variant). It’s not possible to do an M1 Ultra x2 unless you (1) build some sort of smart interposer, which is a bunch of work that has little payoff for apple or (2) use dual sockets. They made a big deal today about the programming model, and why two sockets is bad, so I don’t think they’ll do that.

I suspect that M2 Max will have a fancier fusion bus that allows each die to talk to 2 neighbors instead of 1, and may support up to 8 M2 Max’s tiled.
 
can you add RAM after purchase or is it on the MB?
The RAM is on the System-on-a-Chip (SoC) package, so it's effectively "part" of the CPU "chip". The whole black square below is the SoC, showing 4 RAM modules on each side (8 × 16GB = 128GB max).

[Attached image: the M1 Ultra SoC package with its RAM modules]
 
Yeah, the M1 Max can already edit 8K video fine; it’s mad how people just want more power for the sake of it.
Which people? You don't know what other people are using their machines for, and what they need or can use for their work.

For many tasks there is no such thing as "too much power" - you can use all the processing and graphical power available. There are plenty of videos out there showing the limits of the M1 Max for animation, video editing, music, software development.

Unless a machine can complete a task essentially instantaneously (by human perception), then there is some benefit to it being faster.
 
Yeah, that, and the “we’re done except for Mac Pro,” were two very uncharacteristic revelations today. Until now, nobody knew for sure that a Mac Pro with M1 was actually coming, and nobody knew for sure that there wasn’t going to be an Ultra Duo.

Given these two data points, seems to me we’re most likely not going to see an M1 Mac Pro, and instead it will be an M2 variation (so they can offer a 40+ core variant). It’s not possible to do an M1 Ultra x2 unless you (1) build some sort of smart interposer, which is a bunch of work that has little payoff for apple or (2) use dual sockets. They made a big deal today about the programming model, and why two sockets is bad, so I don’t think they’ll do that.

I suspect that M2 Max will have a fancier fusion bus that allows each die to talk to 2 neighbors instead of 1, and may support up to 8 M2 Max’s tiled.
Nah, I’m sure they will do a dual core (or more) for the Mac Pro. That’s the appropriate place to build such a beast.
 
Nah, I’m sure they will do a dual core (or more) for the Mac Pro. That’s the appropriate place to build such a beast.
They already have a dual core. I’m not sure what you’re talking about, unless you mean a dual Ultra. And if that’s what you mean, they flat out said today that the last M1 chip is the Ultra.

Or did you mean dual sockets? I doubt they will do dual sockets.
 