M1 Ultra Doesn't Beat Out Nvidia's RTX 3090 GPU Despite Apple's Charts

OK BUSTED! The graphic above does not show the actual caption used in the Verge article. If you read the article, the 3090 is compared to the M1 Ultra running OpenCL (software that has been deprecated on Macs for years). Not surprising at all that software which is obsolete on Macs would not perform as well as it does on the Nvidia card. Also not surprising that people at the Verge couldn't figure that out. A bit amazing that the author here did not point that out.
When you are pushing a story designed to generate controversy within the community, any facts that go against it are ignored.

This article is about pushing a narrative, not pushing all of the facts.
 
The M1 Max in the MBP is already faster than an RTX 3090 in a number of the areas the Mac Studio is marketed for. The disparity will be even greater with the Ultra.



Professionals don’t sit and run Geekbench all day long.

After seeing this, it's clear that a lot of the debate is moot. Even the M1 Max in an MBP can smoke an RTX 3090 in practical day-to-day workflows because of the total package Apple puts together.

Ultimately that’s what matters to me personally, even though I was a tad disappointed at the original story here.
 
This is absolutely the truth, and it beggars belief that an outlet with the Verge's resources couldn't be bothered to do the minimum amount of research to figure this out. That quote is gonna reach a lot of Mac haters and influence their perception of the Mac Studio as a performance machine. Unfortunately we'll have to wait another five years to look back at this review as the trash it is. Same as the Zune "iPod killer" reviews that were all the rage back in the day.
The Mac haters do not care. They will just cherry pick articles, regardless of the misinformation contained within. Just to hate on Apple even more.
 
Well, if we are talking about what could be, remember that the RTX 3xxx series is made on Samsung's 8nm process. The upcoming RTX 4xxx will be made on TSMC's 5nm.
Apple still needs a couple more years to really solidify its performance dominance in the chip game.
 
Yeah, that's not even a consideration if you are using your machine for work.
Now that you mention this, it makes the argument that the Studio Display is "too expensive" carry a lot less weight…
Paying $1K a year for electricity (and maybe more in an expensive or colder-than-usual year) isn't that far from paying $1.5K for a screen that will last (at least in my case, my older Thunderbolt Displays have been going for 10 years).

Assuming those electricity costs are right, the Mac Studio will pay for the screen itself.
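For anyone who wants to sanity-check that kind of claim, here is a minimal back-of-the-envelope sketch. Every number in it (power draws, hours, electricity price) is an assumption for illustration, not a measurement, and the result swings a lot depending on what you plug in:

```python
# Back-of-the-envelope electricity comparison. All inputs are illustrative
# assumptions; adjust them to your own hardware, usage, and local rates.
pc_watts = 600        # assumed wall draw of a 3090-class workstation under load
studio_watts = 200    # assumed wall draw of a Mac Studio under load
hours_per_day = 8
days_per_year = 250
usd_per_kwh = 0.30    # assumed electricity price

def annual_cost(watts: float) -> float:
    """Annual electricity cost in USD for a given sustained wall draw."""
    kwh_per_year = watts / 1000 * hours_per_day * days_per_year
    return kwh_per_year * usd_per_kwh

savings = annual_cost(pc_watts) - annual_cost(studio_watts)
print(f"PC: ${annual_cost(pc_watts):.0f}/yr, Studio: ${annual_cost(studio_watts):.0f}/yr, "
      f"difference: ${savings:.0f}/yr")
# With these particular assumptions the gap is roughly $240/yr, so whether it
# ever approaches the cost of a display depends heavily on the inputs.
```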
 
Too bad you didn't show their Shadow of the Tomb Raider numbers, as I thought they were interesting. The Mac Studio had 96 FPS at 1440p versus 114 FPS for the 3090. So doesn't that mean the M1 Ultra was getting 84% of the FPS of the 3090 while running a non-native game written for Intel Macs? That's crazy. That is way better than I would have expected, or has Tomb Raider actually been ported to the M1? Too bad they didn't test the 3090 at 4K, since the Ultra also did pretty well there at 60 FPS, and I'd be curious to see where the 3090 lands in that benchmark.
 
Apple is making its own dGPUs for future Mac Pros. Code name "Lifuka". At every step there are naysayers saying "it's impossible", yet Apple continues to redefine what's possible. Just 18 months ago, the naysayers were proclaiming "there's no way a phone SoC can handle desktop workloads"… yet here we are.
That rumour appeared more than a year ago and has never been corroborated. It's much more likely to be another class of chips rather than a GPU.

Saying an SoC can't handle desktop workloads isn't the same as saying it's nearly impossible for an SoC/SiP to work with a dGPU. Why? Because the SoC/SiP architecture is nothing like your traditional CPU or CPU cores. They're essentially designed to avoid needing a dGPU. Adding a dGPU or even an eGPU takes away a lot of the advantages of that chip design, e.g. unified memory, thermals, and compactness. It basically makes them more like Intel x86 chips. It's a regression in many ways.
 
Just watched the video and I have to say it was poorly presented. They kept talking about how fast the device is for photo/video work as well as After Effects; however, when it came to the M1 Ultra vs RTX 3090 comparison, they suddenly switched to Geekbench compute and Shadow of the Tomb Raider? Seriously? Even the M1 Max was competitive against the likes of the RTX 3070 or RTX 3080 at certain workloads and not at others. Why is it so difficult to show what a chip is good at and what it is not?

I have been using an M1 Max MacBook Pro and I also have a mini-ITX tower PC (Ryzen 5900X & RTX 3060 Ti). The MacBook is noticeably faster during timeline scrubbing, adding effects, and exporting in DaVinci Resolve, but the RTX 3060 Ti is miles ahead in Blender due to OptiX. Considering that the Mac Studio is a few times smaller than even my mini-ITX system, it is obvious that even the base M1 Max version is a device of its own kind.
 
You clearly do not understand Apple's business model. High-end Macs are used to create content. And part of that content is what people consume on Apple's real money makers: the iPhone and iPhone peripherals. High-end Macs thus serve to supply the needs of the cash cow.
Gaming on high-end Macs is not part of that model and is thus irrelevant to Apple. It is not a profitable venture.
Look at the name of the latest Mac. In a studio you produce, you work, and gaming is not part of that environment!
I don't follow what you're saying. Why do you restrict gaming to "high-end Macs" when Apple touts their M-series chips as state of the art and ahead of most Intel chips on the market today?

Apple doesn't have just one business model. It has many. Their second most profitable revenue stream is services and it doesn't fit into what you're claiming.

And you do realize Apple has their own gaming service called Apple Arcade? So how's gaming irrelevant to Apple? How is gaming not profitable? Do you even know what you're talking about?

Lastly, ever heard of a gaming studio?
 
The M1 Ultra is more powerful in GFXBench, Purebenchmark, Adobe Premiere… Not in Geekbench, where the M1 does not have enough time to ramp up frequency…
 
Too bad you didn't show their Shadow of the Tomb Raider numbers, as I thought they were interesting. The Mac Studio had 96 FPS at 1440p versus 114 FPS for the 3090. So doesn't that mean the M1 Ultra was getting 84% of the FPS of the 3090 while running a non-native game written for Intel Macs? That's crazy. That is way better than I would have expected, or has Tomb Raider actually been ported to the M1? Too bad they didn't test the 3090 at 4K, since the Ultra also did pretty well there at 60 FPS, and I'd be curious to see where the 3090 lands in that benchmark.
I think the benchmark comparison was flawed. My guess is they were running the M1 Ultra at different settings, since it can't do ray tracing well.
With settings beyond the highest preset in Tomb Raider, no DLSS or ray tracing, my RTX 3090 gets about 100 FPS average in the 4K benchmark.
There is no way an M1 Ultra comes close to a 3090 in games, and it probably falls behind a 3060.
But the M1 Ultra's strength is not gaming, and its power efficiency is outstanding for what it is.

If you want high-end gaming performance, you get a normal AMD or Nvidia video card.
 
That rumour appeared more than a year ago and has never been corroborated. It's much more likely to be another class of chips rather than a GPU.

Saying an SoC can't handle desktop workloads isn't the same as saying it's nearly impossible for an SoC/SiP to work with a dGPU. Why? Because the SoC/SiP architecture is nothing like your traditional CPU or CPU cores. They're essentially designed to avoid needing a dGPU. Adding a dGPU or even an eGPU takes away a lot of the advantages of that chip design, e.g. unified memory, thermals, and compactness. It basically makes them more like Intel x86 chips. It's a regression in many ways.

You should maybe have a look at a video I have already linked:

 
I don't follow what you're saying. Why do you restrict gaming to "high-end Macs" when Apple touts their M-series chips as state of the art and ahead of most Intel chips on the market today?

Apple doesn't have just one business model. It has many. Their second most profitable revenue stream is services and it doesn't fit into what you're claiming.

And you do realize Apple has their own gaming service called Apple Arcade? So how's gaming irrelevant to Apple? How is gaming not profitable? Do you even know what you're talking about?

Lastly, ever heard of a gaming studio?

Apple already makes more money from gaming than Sony, MS, Activision and Nintendo combined. Mobile gaming is where the growth is, recording 9% growth YoY, whereas desktop gaming is shrinking at -4.5% YoY. Desktop gaming is already smaller than Apple's mobile gaming market share, and high-end gaming is even smaller… the most common graphics card on Steam is a 1080 Ti. This was all revealed in the Epic v. Apple court case. Apple has more than 1 billion iOS devices already in circulation vs a tiny fraction of Macs, with an even smaller percentage of those able to game and an even smaller share of those whose users want to game. Apple is heading where the market is going, not where it has been. AR/VR experiences are the next big market and that's where Apple will be focussing next.

Mac users don't generally game, and those that do have the cash to buy either a console or a dedicated gaming PC as well.
 
The Mac haters do not care. They will just cherry pick articles, regardless of the misinformation contained within. Just to hate on Apple even more.


Here are some comparisons from YouTube:

[attached benchmark screenshots]


Apple didn't cheat or present false comparison data.
 
I suppose it is possible that in the chart displayed in this article, Apple is saying an M1 Ultra at ~100 watts matches a 3090 at ~300 watts. However, pushing the 3090 to its near-500W maximum would allow it to pull significantly ahead, as the benchmarks run by The Verge showed.
Which is of course what happens when you want industry-leading performance.
 
Here are some comparisons from YouTube:

[attached benchmark screenshots]

Apple didn't cheat or present false comparison data.
The GPU benchmark for Tomb Raider is deceptive. At 1440p high settings the 3090 is bottlenecked by the CPU, so it is not being fully utilized.
Double the resolution from 1440p to 4K and the frame rate only drops about 33%, from 153 to 100.
It would be more accurate to compare the M1 Ultra at 4K against the 3090 at 4K.
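A rough sanity check of that bottleneck argument (the 153 and 100 FPS figures are the 3090 numbers from the chart attached below; the assumption that GPU-bound frame rates scale roughly inversely with pixel count is only an approximation):

```python
# Rough check of the CPU-bottleneck argument at 1440p.
pixels_1440p = 2560 * 1440
pixels_4k    = 3840 * 2160
pixel_ratio  = pixels_4k / pixels_1440p       # 2.25x the pixels at 4K

fps_1440p, fps_4k = 153, 100                  # 3090 figures from the attached chart
observed_drop = 1 - fps_4k / fps_1440p        # only ~35% slower at 4K

# If frame rate scaled inversely with pixel count, the 4K result implies the
# GPU could push roughly 100 * 2.25 = 225 FPS at 1440p; since it only reaches
# 153, something other than the GPU (the CPU) is the limit at 1440p.
implied_gpu_limit_1440p = fps_4k * pixel_ratio

print(f"{pixel_ratio:.2f}x pixels, {observed_drop:.0%} observed drop, "
      f"~{implied_gpu_limit_1440p:.0f} FPS implied GPU ceiling at 1440p")
```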

[attached chart: Shadow of the Tomb Raider FPS, 1440p vs 4K]


Other than gaming, though, the M1 Ultra is a great performer, especially with how little power it uses.
 
Apple already makes more money from gaming than Sony, MS, Activision and Nintendo combined. Mobile gaming is where the growth is, recording 9% growth YoY, whereas desktop gaming is shrinking at -4.5% YoY. Desktop gaming is already smaller than Apple's mobile gaming market share, and high-end gaming is even smaller… the most common graphics card on Steam is a 1080 Ti. This was all revealed in the Epic v. Apple court case. Apple has more than 1 billion iOS devices already in circulation vs a tiny fraction of Macs, with an even smaller percentage of those able to game and an even smaller share of those whose users want to game. Apple is heading where the market is going, not where it has been. AR/VR experiences are the next big market and that's where Apple will be focussing next.

Mac users don't generally game, and those that do have the cash to buy either a console or a dedicated gaming PC as well.
I love how you're trying to cover all the bases against a counter-argument and end up diluting or even contradicting what you were trying to say in the first place.

Nobody is denying the money is on mobile gaming. Have I ever said anywhere that that's not the case?

How is that even remotely related to your initial claim though? The one about Apple's supposed business model? Yeah, sure, Apple follows the money, just like any other company. I'm glad you worked that one out.

You never explained why gaming isn't a profitable venture. You actually contradicted yourself by citing the case of mobile gaming. So gaming really IS profitable.

Note I never said "desktop gaming." I said gaming, period.

Your point about AR/VR being the next big thing really defeats what you said before. First of all, read the thread: I echoed the same sentiment. Second of all, AR/VR is really device-agnostic. You can do AR/VR on a handheld, a tablet, a laptop, or a desktop. You really ended up agreeing with me, because that's what I've been saying all along in this thread (and elsewhere): that Apple should take gaming seriously by getting the big studios involved, while also demonstrating the full potential of their M-series chips by commissioning a short, graphics-intensive game.
 
You should maybe have a look at a video I have already linked:

You should go through his videos and count how many predictions he (and his brother) got wrong. His review videos are informative, but I wouldn't rely on Max Tech to predict tech trends. Here is one of his earlier videos claiming ASi won't have any dGPUs. Well, he has to pick a side, doesn't he? He can't just keep contradicting his earlier predictions/positions.

Lastly, mate, he mentioned the exact rumour article I told you about. The one from 2020. Again, it's uncorroborated and he was wise to spend no longer than 10 seconds on it.

You have to dig down to the science. It's nigh impossible to have a dGPU with ASi's architecture. Doing so requires a new class of chips and it's not likely to be the M2. And it's not likely to be in Sept this year.

I'd love to be proven wrong. With the blackouts in Taiwan, two-plus years of a pandemic, a worldwide labour shortage, ever-increasing transport costs, and a war in Europe, I'd be extremely impressed if Apple could manage to get a new class of chips like the rumoured M2 out on the market, and put them in a Mac Pro no less. And an M2 with a dGPU?

If Apple can manage to do all this in 2022, nobody will be able to catch them for the next 10 years. But again, it won't happen.
 
Apple's GPU boasting for the M1 chips has been their most misleading PR in many years, IMHO.
This has been the best marketing move in Apple history. 14 pages and counting about the M1 Ultra GPU on this thread alone, and there are inconclusive conclusions from every data point presented. Every single article that pits the top-of-the-line RTX 3090 GPU against this chip puts the two in the same boxing ring as competitors.

You might think that because the 3090 is more powerful in terms of raw performance it diminishes the Ultra… but that would be the real trap. The aim of the charts Apple presents is to get the tech media/haters to do the comparisons. The trap gets the TARGET audience to assess the totality of both products.

That also means the target audience gets to ask some basic value-proposition questions.

Yeah… the Ultra is a bit/lot slower than that beastly 3090… but it is a whole computer.

That CPU stomps! Oooh, it's really not that big… I can't hear it… is it on…?!?

I can build something more powerful than that for cheaper! I’ll try and sell these old parts to pay for it...what do you mean this used GPU is a dime a dozen now on eBay?

and on and on…

Success!!!!
 
I believe Nvidia and Intel really need to be worried about the benefits and merits of a unified architecture. The first batch is already impressive. The future surely seems bright on the Mac side. Can't wait to see the Mac Pro.
 