Unsurprising, but still a bit disappointing that Apple felt they had to pit their device against a 3090. They could have chosen a 3080 and it still would have been impressive.

That being said, I'm more interested in performance in real-world workloads. I hope we see a bit more of that in the coming weeks.
 
As much as Apple wants to sell it that way, there are a lot of scenarios where the Ultra is not really 2x the Max. Either way, it's a great start. M2, M3..... will be insane :)
Maybe, but Intel's and Nvidia's products will be even more insane if you are looking at raw power. Not every buyer is that bothered by energy constraints. Energy is money, but so is time.
 
I suppose it is possible that in the chart displayed in this article, Apple is saying an M1 Ultra at ~100 watts matches a 3090 at ~300 watts. However, pushing the 3090 to its near-500W maximum would allow it to pull significantly ahead, as the benchmarks run by The Verge showed.
The issue is that in the chart the line stops at about 300W despite the axis continuing, suggesting that's where the 3090 caps out. I agree that Apple is trying to show how the M1 Ultra's maximum wattage compares with a leading dGPU at the same wattage, but in doing so they made implications about performance limitations.

I was initially confused by this chart, thinking that Apple was comparing it to the 3080 FE. They didn't actually name the card, and I thought maybe they were able to classify the 3090 differently to exclude it. If they really did mean to compare the Ultra with the 3090, this is misleading at best. Power limits are low on the list of concerns for high-end workstations.
 
So what is the correct benchmark?
Maybe the 3090 should run Metal? Or maybe the benchmark should be a video render or something similar, using software optimized for each platform? It is likely there is no single general-purpose benchmark that actually produces results optimized for each individual platform.
 
Despite Apple's claims and charts, the new M1 Ultra chip is not able to outperform Nvidia's RTX 3090 in terms of raw GPU performance, according to benchmark testing performed by The Verge.

Did Apple make the claim that it would outperform Nvidia's RTX 3090 in terms of raw GPU performance?

That's not what I'm seeing in the graphic at the top of the story.
 
I think this is why the Mac Pro will stay Intel a lot longer than people realize. Apple cannot compete with 1.5 TB RAM options and dedicated graphics cards.
You beat me to it.

I wouldn't be surprised if they decided to do a refresh one last time this year. If their best chip now, i.e., M1 Ultra, isn't even on par with the best graphics card, how do they justify a new ASi Mac Pro whose graphics performance is only slightly better nine months from now?
 
I suppose it is possible that in the chart displayed in this article, Apple is saying an M1 Ultra at ~100 watts matches a 3090 at ~300 watts. However, pushing the 3090 to its near-500W maximum would allow it to pull significantly ahead, as the benchmarks run by The Verge showed.
An interesting takeaway, from the green standpoint, is that Apple is achieving a lot with far less electricity. Yes, the 3090 can scale a lot higher, but all you would need is the equivalent of two M1 Ultras using 200 watts to roughly compare with a single 3090 using 300 watts. If that were doable, you would also be getting a 40-core CPU alongside the GPU, and it becomes even more lopsided when thinking of enterprise expenses and power-grid usage.
 
RTX3090 energy cost per year including host machine: £981/year

M1 Ultra energy cost: £148/year

Even if it’s half the speed that’s a win.
It is not. Time-consuming tasks like machine learning or rendering take a long time. Software specialists cost somewhere between €100 and €200 an hour. AI specialists, or people working at Apple or Google, earn somewhere around $200K (it can also be $100K or $300K).
So we don't care about £981/year when a task takes 24 hours instead of 12 hours to complete.
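As a rough sanity check on that trade-off, here is a tiny sketch using the energy figures quoted above (£981 vs £148 per year); the £150/hour labor rate is my own illustrative assumption, not a figure from either post:

```python
# Break-even sketch: how many hours of extra waiting per year
# would cancel out the M1 Ultra's energy saving?
rtx3090_energy_cost = 981   # £/year, figure quoted above
m1_ultra_energy_cost = 148  # £/year, figure quoted above
hourly_rate = 150           # £/hour, assumed specialist rate

energy_saving_per_year = rtx3090_energy_cost - m1_ultra_energy_cost
break_even_hours = energy_saving_per_year / hourly_rate

print(f"Energy saving: £{energy_saving_per_year}/year")
print(f"Break-even: {break_even_hours:.1f} hours of lost time per year")
```

On those numbers the saving evaporates after only a handful of lost hours per year, which is the point being made about time costing more than electricity.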
 
You missed the point. Apple could risk hurting their integrity if its core audience found out that the chart was off. When in fact they don't need to display such a chart in the first place given what they have already accomplished.
I don't think the charts are going to make people who are already pre-disposed to Apple products think Apple has lost integrity.
 
I think Apple has to try to compare themselves favorably to the RTX 3090, because anyone trying to build a comparative powerhouse to the M1 Ultra Studio is probably going to be at least considering the RTX 3090. So, a little tomfoolery in the graphics (not new for them). Once you take a closer look, you find you can probably do better on your own, even with an RTX 3090, taking power concerns into account of course. Also not a new concept: you could always do better on your own, and most of the time for less money.

Yet the whole thing is moot because people will generally stick with their platform of choice. In this way Apple really only competes against themselves. If you are a Mac Pro person and always have been Apple would have to be particularly careless to lose you.
 
I’ve always felt apple’s charts and tests were bullcrap. Now it just appears they’re flat out lying.
They're not lying; Apple just cut off the top half of the chart where the 3090 keeps going. The M1 Ultra does beat it at lower wattage. So, misleading, yes; lying, no. Technically it does beat it on performance per watt.
 
I doubt Intel will. Nvidia most likely, but it all depends on how they all adapt, as x86 seems to be reaching the end of the road here.
ARM is the future, so let's find out in a few years' time.

Looking at iPhone and iPad vs competition I think Apple will overtake Nvidia eventually.

One thing is sure - the future is exciting for us customers.

 
Not sure how some people expected the M1 Ultra to have exactly double the performance of the M1 Max while there were tests last year showing that M1 Max with 32 GPU cores was not exactly twice as powerful as the M1 Pro with 16 GPU cores.
 
My bet: an engineer was trying to make a point about performance per watt, and a marketing drone ran way the hell the wrong way with it.
 
An interesting takeaway, from the green standpoint, is that Apple is achieving a lot with far less electricity. Yes, the 3090 can scale a lot higher, but all you would need is the equivalent of two M1 Ultras using 200 watts compared to a single 3090 using 300 watts. Considering that you would also be getting a 40-core CPU alongside the GPU, it becomes even more lopsided when thinking of enterprise expenses and power-grid usage.
Historically, linked GPUs saw a 50% drop in the performance contributed by each GPU added, such that if GPU1 is 100, then GPU1 + GPU2 is 150, and GPU1 + GPU2 + GPU3 is 175. Apple's complex bridging appears to let them get near 100% of the added performance, but linking two M1 Ultras would still only see 75% of the performance of a 3090. Three M1 Ultras, at 300W, would see 82.5% of a 3090. Normalized for performance rather than power, the M1 Ultra's efficiency advantage just keeps growing.
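For what it's worth, that historical diminishing-returns pattern is easy to model: each added GPU contributes half of what the previous one did. A minimal sketch (the 50% falloff and base of 100 are the figures from the post above, not measured data):

```python
def linked_gpu_performance(n, base=100.0, falloff=0.5):
    """Total performance of n linked GPUs when each added GPU
    contributes `falloff` times the previous GPU's contribution."""
    total, contribution = 0.0, base
    for _ in range(n):
        total += contribution
        contribution *= falloff
    return total

# Reproduces the 100 / 150 / 175 progression described above:
for n in (1, 2, 3):
    print(n, linked_gpu_performance(n))
```

Note the total approaches a hard ceiling of 2x a single GPU as more are added, which is why near-100% bridged scaling (if Apple really achieves it) would be a genuinely different regime.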
 
People should not be discounting Apple's chart based on a couple of benchmarks by one online site. We don't have the data on which Apple's chart is based. Once we have more benchmarks, including what goes into the chart, we can decide if Apple was being untruthful or not. They might be but they might not. Jumping in and commenting, "Apple was lying" is making a lot of assumptions.
 
They're not lying; Apple just cut off the top half of the chart where the 3090 keeps going. The M1 Ultra does beat it at lower wattage. So, misleading, yes; lying, no. Technically it does beat it on performance per watt.
But at 500W the 3090 performed twice as well as the M1 Ultra. That can't be extrapolated from the graph, because the graph shows the 3090's performance gains per watt decreasing at 300W. Are we to believe its performance per watt suddenly follows a cubic curve?
 
The only thing funny about this is how people are discounting Apple's chart based on a couple of benchmarks by one online site. Further, we don't have the data on which Apple's chart is based. Once we have more benchmarks, including what goes into the chart, we can decide if Apple was being untruthful or not.

Spot-on. People here are so quick to hurl a "gotcha" before even reading and soaking in what's being presented and compared. Another bucket-of-chum-in-the-water-fest.
 