I suppose it is possible that in the chart displayed in this article, Apple is saying an M1 Ultra at ~100 watts matches a 3090 at ~300 watts. However, pushing the 3090 to its near-500W maximum would allow it to pull significantly ahead, as the benchmarks run by The Verge showed.
That’s exactly the takeaway I had when I watched the presentation live.

Apple clearly pulled some sleight of hand, knowing they’d generate headlines with the claim that their CPU/GPU outpaces a discrete $2K GPU. Except it doesn’t: the Nvidia part gets much more performance by drawing far more power.

Apple did stress multiple times that their part does more work at fewer watts, and was keen to highlight the workloads their part can handle: 16 simultaneous 8K streams, etc.

Two things are crystal clear:

1: Apple is benchmarking its integrated GPU against the discrete market.
2: Apple has stated the Mac Pro is still to come.
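Taking the wattage figures quoted above at face value, here is a rough sketch of what "matching at a third of the power" means in performance-per-watt terms. All numbers are illustrative placeholders, and the 1.5x uncapped scaling is an assumption, not a measurement:

```python
# Rough performance-per-watt comparison. The relative_perf and wattage
# values are illustrative placeholders based on the figures quoted in
# the post above, not measured benchmark results.
configs = {
    "M1 Ultra":              {"relative_perf": 1.0, "watts": 100},
    "RTX 3090 (~300 W cap)": {"relative_perf": 1.0, "watts": 300},
    "RTX 3090 (uncapped)":   {"relative_perf": 1.5, "watts": 480},  # assumed scaling
}

for name, cfg in configs.items():
    # Efficiency: performance units delivered per watt consumed.
    perf_per_watt = cfg["relative_perf"] / cfg["watts"]
    print(f"{name}: {perf_per_watt:.4f} perf/W")
```

Even if the uncapped 3090 takes a large absolute lead, the efficiency ratio stays heavily in the M1 Ultra's favour, which is presumably the story Apple's chart was telling.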
 
Unsurprising, but still a bit disappointing that Apple felt they had to pit their device against a 3090. They could have chosen a 3080 and it still would be impressive.

That being said, I'm more interested in performance in real-world workloads. I hope we see a bit more of that in the coming weeks.
Agree, and I will add "sustained performance".
 
As usual, marketing was making use of best-case scenarios and exaggerating where possible.
Nothing new.
Especially for Apple.

Graphics has always been a performance Achilles' heel for Apple computers. The highest-end AMD/Nvidia graphics cards were never available for the Power Mac or Mac Pro. Some decent mid-to-high-range versions were, but often the drivers were not optimal.

The M1 (Pro / Max) are superb, superb mobile CPU / GPUs.
For desktops, the M1 Ultra CPU is "excellent" and the GPU "very good", so the combo is not insanely great, but simply very, very good.

But the whole package (the hardware, aesthetics, noise, macOS, power consumption, etc.) makes the Mac Studio (Max or Ultra) a very attractive Prosumer desktop computer IMHO.

Do you want the highest GPU performance? You simply need to get a high-end PC/workstation with a Windows-only high-end graphics card. But then you're stuck with a big, energy-hungry box running Windows.
This is the exact conclusion I came to today.
 
This is the exact conclusion I came to today.

Actually disagree with this conclusion, to the point of thinking it’s silly.

The Studio will be used by professionals across all industries for all levels of work. This 'prosumer' thing is lazy shorthand marketing nonsense. Always has been. You can't deride Apple for their marketing, then write posts which are even more lamentable and expect to be taken seriously.
 
Here are some comparisons from YouTube:

[attached benchmark comparison screenshots]

Apple didn't cheat or give false comparison data.
Another datapoint from dpreview.


The M1 Max generally destroys the RTX 3080 mobile in real-world application benchmarks, despite having much lower Geekbench compute scores.
 
Another datapoint from dpreview.


The M1 Max generally destroys the RTX 3080 mobile in real-world application benchmarks, despite having much lower Geekbench compute scores.

And there are videos showing the same thing, even on the M1 Max MacBook Pro.
 
And there are videos showing the same thing, even on the M1 Max MacBook Pro.
That’s why I’m sure Apple is comparing real-world applications in their comparison to the RTX 3090, rather than Geekbench compute scores.
 
The Mac haters do not care. They will just cherry-pick articles, regardless of the misinformation contained within, just to hate on Apple even more.
That is something both haters and fanboys are guilty of, Apple fanboys included.

Companies in general (including Apple) should rely on the strengths of their products and not resort to:
- ads that make the competition look bad (ahem, Microsoft and Samsung)
- misleading graphs that cherry-pick and don't tell the whole story

Fact is, performance per watt is relevant, but definitely less relevant in a desktop machine (for most consumers; everyone has different priorities).

I just feel it's better to tell the full story instead of treating customers like ignorant idiots who don't know any better.
 
Forget the actual results. The worrisome part for Nvidia is that Apple has a chip that is quite good and might overtake them at some point, even though it is not a discrete graphics card. I mean, this stuff is built into the chip. Everyone used to laugh at built-in graphics. Not so much anymore.
Hehe, no.

Apple could do a great discrete chip if they wanted to, though; that could be interesting.
 
That is something both haters and fanboys are guilty of, Apple fanboys included.

Companies in general (including Apple) should rely on the strengths of their products and not resort to:
- ads that make the competition look bad (ahem, Microsoft and Samsung)
- misleading graphs that cherry-pick and don't tell the whole story

Fact is, performance per watt is relevant, but definitely less relevant in a desktop machine (for most consumers; everyone has different priorities).

I just feel it's better to tell the full story instead of treating customers like ignorant idiots who don't know any better.
There’s a difference here… Apple is very particular about its ads. The Apple ad says "Think Different".
 
The Ultra is a joke; it's not needed. It exists just so they can say it's faster and charge double. The M1 Max is enough for anything Macs can do. The only reason a 3090 is needed is gaming, and Macs can't game.
Put sufficient performance in computers that most people can afford, and this will no longer be the case. Creating games for the Mac is a matter of supply and demand, not some inherent inability of Macs to run games.

Once Apple has rolled out the entire M1 lineup, they need to start incentivising the gaming industry: spend a bit of money getting some hotly anticipated AAA games onto Apple Arcade, and highlight what performance different Macs across the range offer, at what cost, and how that compares with PCs.
 
Fact is performance per watt is relevant but definitely less relevant in a desktop machine (for most consumers - everyone has different priorities)

This narrative is long overdue for a change. Energy consumption has to be marketed heavily going forward, with all other considerations secondary. The recent pushback on cryptocurrency should have woken everyone up.

How anyone can exist in 2022 and think otherwise is appalling. Alongside FPS overlays, we should have wattage and performance-per-watt overlays.
 
The Nvidia card we are comparing is £2,000 on its own!
lol...
I don't think any integrated small-box machine, whether Windows or Mac, is going to compete with that. It's physically not possible, I'd imagine.

What I find interesting about all these debates is that I buy Apple for macOS and iOS. I buy it for the heart of the system, and unless performance is absolutely dire versus Wintel, I would always choose Mac.

Reviewers like The Verge make it seem like the OS is an inconsequential choice!
They compare Dell laptops to MacBooks and I'm like, why are you doing that? Who does that?
How can you spend 10 years of your life basing your workflow around macOS and then, because a PC is 20% faster at graphics, buy a PC?
That's a very twisted mentality. (The same is true going from PC to Mac for relatively small gains.)

OS over everything!
 
The Nvidia card we are comparing is £2,000 on its own!
lol...
I don't think any integrated small-box machine, whether Windows or Mac, is going to compete with that. It's physically not possible, I'd imagine.

What I find interesting about all these debates is that I buy Apple for macOS and iOS. I buy it for the heart of the system, and unless performance is absolutely dire versus Wintel, I would always choose Mac.

Reviewers like The Verge make it seem like the OS is an inconsequential choice!
They compare Dell laptops to MacBooks and I'm like, why are you doing that? Who does that?
How can you spend 10 years of your life basing your workflow around macOS and then, because a PC is 20% faster at graphics, buy a PC?
That's a very twisted mentality. (The same is true going from PC to Mac for relatively small gains.)

OS over everything!
Erm, what? I am a director of a software and design house, and we make power/performance considerations with our devices all the time.

Most of our developers are platform-agnostic; I myself have moved to an MBP 14" with M1 Pro after using Windows almost exclusively for more than a decade, because of how much faster it can simulate and run large code projects, with the added benefit of not being plugged into the wall for much of the work day. It is saving us significant amounts of money on monthly energy bills. Having said that, I prefer Windows as an OS because of my familiarity with it, but I'm happy to use macOS given how much it has sped up my workflow. The trade-off is worth it to me.

Seeing a comparison between the latest similar Windows machine and the latest macOS machine is a very useful process for our business decisions, and to suggest otherwise is nonsense. The more information a buyer has to make their decision, the better.
 
This narrative is long overdue for a change. Energy consumption has to be marketed heavily going forward, with all other considerations secondary. The recent pushback on cryptocurrency should have woken everyone up.

How anyone can exist in 2022 and think otherwise is appalling. Alongside FPS overlays, we should have wattage and performance-per-watt overlays.
I agree. Apple could actually help with this by emphasising it more, but they chose to remain vague.
 
Actually disagree with this conclusion, to the point of thinking it’s silly.

The Studio will be used by professionals across all industries for all levels of work. This 'prosumer' thing is lazy shorthand marketing nonsense. Always has been. You can't deride Apple for their marketing, then write posts which are even more lamentable and expect to be taken seriously.
That’s fine, you’re free to disagree.
My studio work leans heavily on the GPU, and I now feel I was conned by the graphs. It is not a pro machine for me if it cannot get close to a 3090.
 
The problem with the constant comparisons of Intel CPUs versus Apple's M-series chips, and Nvidia GPUs versus Apple's, is that the result is totally dependent on the wattage each chip uses to reach a given performance point. Others have pointed out that this benchmarking approach resembles comparing an electric car to a sports car, or a Formula 1 car to a fighter jet: one will always outpace the other in the first few seconds off the start, then they come neck and neck, and then one kicks in the power and off it goes.

Apple's M-series chips are designed for lower power use; that is their design purpose, a consequence of the ARM technology they are built on. ARM is all about performance per watt. The M-series chips will always beat their competitors on efficiency, that's a given, but when Intel and Nvidia chips kick into high power, they leave the M-series behind. That is to be expected, because that is not what the M-series chips are designed for.
 
Erm, what? I am a director of a software and design house, and we make power/performance considerations with our devices all the time.

Most of our developers are platform-agnostic; I myself have moved to an MBP 14" with M1 Pro after using Windows almost exclusively for more than a decade, because of how much faster it can simulate and run large code projects, with the added benefit of not being plugged into the wall for much of the work day. It is saving us significant amounts of money on monthly energy bills. Having said that, I prefer Windows as an OS because of my familiarity with it, but I'm happy to use macOS given how much it has sped up my workflow. The trade-off is worth it to me.

Seeing a comparison between the latest similar Windows machine and the latest macOS machine is a very useful process for our business decisions, and to suggest otherwise is nonsense. The more information a buyer has to make their decision, the better.

By being platform-agnostic, you are working at the lowest common denominator all the time. You aren't making use of either system to its fullest extent.

I understand that as a software house, that makes sense to you in terms of total cost savings, or additional performance boosts in particular areas per developer seat. However, in my opinion, people who have invested years in fully using the features each OS provides will find the speed and performance differences too negligible to warrant switching platforms back and forth.

For example, a typical user may have external drives formatted for macOS or Windows, photos on iCloud, tooling like Final Cut Pro or Logic, etc.

Yes, using a C++ compiler and a few cross-platform tools may let you avoid the benefits of each OS, but on the whole that is not how these systems are used by the majority of users.
 
If these keynotes were classified as adverts in the UK (and they are, by extension, long-form marketing tools), the ASA would be coming down hard on Apple right about now.
 
That’s fine, you’re free to disagree.
My studio work leans heavily on the GPU, and I now feel I was conned by the graphs. It is not a pro machine for me if it cannot get close to a 3090.
Anyone who knows a thing or two about computer hardware would know the M1 Ultra had no chance of matching a 3090, or even a mid-range 3070.
The power budget just isn't there.
It's like expecting a game console with a 200-watt budget to match a top-end GPU.

Never trust benchmarks before hardware is released.
Rumors are also a dime a dozen, just like the rumors saying the 4090 will consume 800 watts, which is BS.
The more you bump the wattage, the harder it is to cool the chip, and the more problems you run into with power delivery through the VRMs.
 
This has been the best marketing move in Apple history. 14 pages and counting about the M1 Ultra GPU just on this thread. And there are inconclusive conclusions with every data point presented. Every single article that pits the top-of-the-line RTX 3090 against this chip puts it in the same boxing ring as the Ultra, as competitors.
And every article will show that Apple was lying, and that their PR is nowhere near the truth. This paints a really false image of Apple, which is not good long term. People will quickly forget that the Ultra was compared to a 3090, but they will remember that Apple was lying and that the Ultra is slow (in comparison).

You might think that because the 3090 is more powerful in terms of raw performance, it diminishes the Ultra, but that would be the real trap. The aim of the charts Apple presents is to get the tech media and haters to do the comparisons. The trap gets the target audience to assess the totality of both products.
They would have done those things anyway, as they always do.

Yeah, the Ultra is a bit (or a lot) slower than that beastly 3090, but it is a whole computer.
Well, the 3090 is not just lying on the floor doing these things by itself. :p

There's no doubt that the Ultra beats the crap out of the 3090 power-wise, but that's not the take-home message people get. They see that Apple was completely wrong and lied.
 
If these keynotes were classified as adverts in the UK (and they are, by extension, long-form marketing tools), the ASA would be coming down hard on Apple right about now.

Why, because of 'The Verge'? Be serious. The ASA definitely wouldn't have any issue with this.
 
I don't get it. Apple should just be honest instead of creating bad PR with a fake chart. Even at half the speed of the 3090, it's still a great GPU, and the workstation is otherwise fantastic.
 