Along the way, these 'experts' at The Verge forgot how to read graphs. Apple's chart shows performance per watt (power efficiency). Their marketing from the beginning has always been about power efficiency. This article and The Verge's 'review' of 'performance' should be, ahem... reviewed.
Yeah, I dunno why MacHumors chose that graphic for this article. It has nothing to do with anything.
 
I knew the results were going to come with certain stipulations to match what they advertised, but the highlight was definitely how much less power it uses than other chips. So the graph is showing its peak performance, matching the competitors at the same performance level, and how much power it uses at that level.
 
This is just a reminder that there are lies, damned lies, and statistics... lol

It's also a reminder that you always have to read the fine print or be a victim of fake news.

I'm glad the benchmark tests exist so apples can be compared to apples, no pun intended, in the real world, and we don't have to believe the spin of the marketing department.
 
The main advantage of the M1 GPU is going to be how it leverages the shared RAM. Very few cards have as much available VRAM as the M1 Ultra can provide, and if software can leverage that, it will have very good performance. I'm guessing it's weaker than Nvidia cards in a lot of the traditional benchmarks because it's not really leveraging its natural advantage, but once programs like Resolve and Final Cut integrate it better, it will be able to compete. Hard to say if the GPUs will ever be as competitive, because a lot is determined by how that VRAM is used.

As someone who works in video, this system is going to be perfect for me, because it eats ProRes like it's nothing, which is possibly even more valuable than all the extra GPU cores.
 
It's been mentioned that Geekbench 5 Compute isn't a valid test for the M-series past the M1 Pro. The sub-tests are run back to back so quickly that the clock speeds can't ramp up; the chip finishes each task before the frequency reaches its maximum. Tells me a lot about "how good" The Verge's investigative abilities are.
This is absolutely the truth, and it beggars belief that an outlet with The Verge's resources couldn't be bothered to do the minimum amount of research to figure this out. That quote is going to reach a lot of Mac haters and influence their perception of the Mac Studio as a performance machine. Unfortunately we'll have to wait another five years to look back at this review as the trash it is. Same as the 'Zune iPod killer' reviews that were all the rage back in the day.
 
RTX3090 energy cost per year including host machine: £981/year

M1 Ultra energy cost: £148/year

Even if it’s half the speed that’s a win.
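The running-cost comparison above is just arithmetic on average draw, hours of use, and the electricity tariff. A minimal sketch of that calculation — every number here (the 0.34 £/kWh rate, 8 h/day usage, and the 550 W vs. 90 W average draws) is an illustrative assumption, not a measured figure or the poster's actual inputs:

```python
# Hypothetical annual electricity cost comparison.
# All constants below are assumptions for illustration only.
RATE_GBP_PER_KWH = 0.34   # assumed UK-style tariff in £/kWh
HOURS_PER_DAY = 8         # assumed daily usage
DAYS_PER_YEAR = 365

def annual_cost(avg_watts: float) -> float:
    """Annual electricity cost in £ for a machine drawing avg_watts on average."""
    kwh_per_year = avg_watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh_per_year * RATE_GBP_PER_KWH

# Assumed average system draws: a 3090 workstation vs. an M1 Ultra Mac Studio.
for name, watts in [("RTX 3090 + host", 550), ("M1 Ultra", 90)]:
    print(f"{name}: £{annual_cost(watts):.0f}/year")
```

Plug in your own tariff and usage hours; the gap scales linearly with both, which is why the ratio between the two machines matters more than the exact figures.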
If you choose to look at it in the same way Apple does.
Whilst power is a valid concern, the thing I constantly hear from people who are really crunching numbers (always regarding RAM and GPU) is this:
I’ll take everything you can throw at it.

Apple marketing is dishonest, and people like you enable it.
 
We really need to see native M1 triple A games running on Metal to see what this is truly capable of.
Which is never going to happen unless Apple offers financial incentives or is willing to work directly with the publishers. Which TC has never shown much interest in. He likes his iOS games.

I play FFXIV a bit, and someone made a better wrapper than Squeenix's, so my M1 Max can actually play it decently. Nowhere near as smooth as my Legion 7 5900HX/3080, though, but that's Squeenix's fault, as Blizzard at least made an M1-native WoW port.
 
At this point, Apple Silicon is powerful only for limited tasks, and it's a joke. I don't think Apple can compete with Nvidia in the GPU market in terms of performance.

I assume that UltraFusion isn't working well.
 
Some people care. I can afford a Ferrari, but I drive a Toyota because I care more about efficiency than speed.
Let's be honest. You drive a Toyota because you don't need to beat lap times on a race track. When I buy a computer for demanding tasks, like a Ferrari for racing, I don't need an "eco-drive" mode. It's not a Tesla or a battery-powered computer. I just don't like that Apple is advertising the most powerful processor they've ever made with power-efficiency slides for a high-end desktop computer. Apparently raw performance isn't as interesting as releasing less heat into the atmosphere...
 
.....

Bonus for the "working Mac" crowd: now that Boot Camp seems doomed on Apple Silicon, those of us who must be able to run Windows too face the reality of probably needing to buy a PC when our Intel Macs conk out. PC hardware makers trying to "beat Apple" means we can get a fantastic Windows machine to revive Boot Camp the old-fashioned way.

Personally, I bought a lifetime license for CrossOver. All of my games and other Windows apps work fine. Seems to me this would be the way to go instead of having to boot into a different OS. I have also noticed that my software runs faster than under Windows native.
 
Personally, I bought a lifetime license for CrossOver. All of my games and other Windows apps work fine. Seems to me this would be the way to go instead of having to boot into a different OS. I have also noticed that my software runs faster than under Windows native.
I wish CrossOver worked for the apps I use under Windows.
 
The Ultra should get closer to twice the Max. Something must be wrong with these benchmarks. It will obviously be below a 3090, but it should have a better showing than this.
 
We use gaming cards for pro apps, as the apps are built using gaming engines [Unreal Engine specifically]. A gaming GPU performs far better than a Quadro for this.

We also use CAD software that would benefit from a Quadro, but the reality is the RTX cards perform very well in this area too, hence always using RTX cards in our PCs [3080 Ti at present].

We have a Mac Studio Ultra on order. If the GPU performance is as The Verge has suggested, it will be boxed back up for return within a day. It won't take me long to test out a few designs to see where it stands.

I am actually now thinking, however, that this is the end of Macs in our studio. A bit ironic that the Mac Studio killed the Macs in the studio...
Which CAD software do you use? Is it for mechanical design? I use Onshape, and just bought the base M1 Ultra to run it.
 
Compete with what and, more importantly, for whom?

The 25.6 non-miners that got a 3090?

Most PC users neither need nor can afford that kind of power and would be quite happy with the power of the two base configs. Try to match them (12th-gen Intel, DDR5, and a mid- to upper-tier GPU) and actual retail prices won't be that far off from what Apple asks.

I also don't think "no one cares about power consumption" is true. Electricity rates can be pretty high in some places, and when you're in an area where you need to run AC against that heat output, it adds up fast even on low rates.
Also, heat == noise, and a quiet computer is a value of its own.

So yeah, Apple can truly compete TODAY. Whether they will be able/willing to scale up to the highest tier remains to be seen.
I get really tired of the "no 3080 or 3090 or NVIDIA in general is the ONLY reason gaming doesn't exist on Mac" talk. I said it before, but guess what? A brand-new game in 2022 (Elden Ring) runs VERY VERY well on my GTX 1080. I think the M1 Ultra is at least better than that, no? The PC Master Race just goes to extremes, though. You don't need a 3090.

 
It should be possible now that studios are often standardizing on Unreal Engine.
I'm not sure if you are understanding this issue.

It doesn't matter whether the Mac is powerful or not. The macOS platform itself has zero popularity in the gaming market, and there aren't many Mac gamers; therefore it's not profitable, considering that you need to spend more time and money. The PC is still king for PC/console gaming.

Fewer than 3% of gamers on Steam alone are on Mac. And most games don't even support Apple Silicon natively. Who would even wish to support Apple Silicon Macs when the macOS platform isn't profitable?
 
Is everyone missing the fact that Apple's chart displays "Relative" performance? The RTX was not pushed to its own maximum, but rather the M1 Ultra's maximum, at which point you can see that the M1 Ultra is providing much better performance-per-watt.

Now, push the RTX to its maximum, and you have a different story. That wasn't what Apple was highlighting, though. It was a relative comparison, not absolute.

They didn't lie.
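The distinction between matched-performance efficiency and absolute peak can be sketched numerically. A minimal sketch with made-up figures — none of these scores or wattages are Apple's or Nvidia's real data points, they just reproduce the shape of the argument:

```python
# Illustrative only: invented scores and wattages, not real chart data.
def perf_per_watt(score: float, watts: float) -> float:
    """Efficiency metric: benchmark score divided by power draw."""
    return score / watts

# Hypothetical operating points.
m1_ultra = {"score": 100.0, "watts": 110.0}          # at its own max
rtx_3090_matched = {"score": 100.0, "watts": 310.0}  # same score, ~200 W more
rtx_3090_peak = {"score": 160.0, "watts": 450.0}     # unconstrained peak

# At matched performance, the efficiency gap is what the chart highlights...
ratio = perf_per_watt(**m1_ultra) / perf_per_watt(**rtx_3090_matched)
print(f"efficiency advantage at matched performance: {ratio:.2f}x")

# ...but pushed to its own peak, the 3090 is simply faster in absolute terms.
print(rtx_3090_peak["score"] > m1_ultra["score"])
```

Both statements can be true at once: the relative chart compares efficiency at the smaller chip's ceiling, while an absolute benchmark compares ceilings.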
 
The Ultra is a joke; it's not needed. It's just so they can say it's faster and charge double. The M1 Max is enough for anything Macs can do. The only reason a 3090 is needed is for gaming, and Macs can't game.
Huh, that's funny... I game on my M1 Mini every day. The only thing I cannot run at a decent framerate is Star Citizen, but that is really no shocker there. I would bet if I got even the Pro or Max it would run without issue. The Ultra would be overkill for me... but I do love me some overkill once in a while too... :D
 
I don't think the charts are going to make people who are already pre-disposed to Apple products think Apple has lost integrity.
That is why they put footnotes and legalese in to cover themselves. I honestly don't think a presentation where you introduce a NEW CHIP (which NEEDS to be discussed from a performance perspective) wants 15 minutes dedicated to "Okay, so here is our test setup: running Monterey XYZ, at X% load, running Software Y, for Z amount of time, in a room of X size, with AC cooling set to Y degrees..."

Have you watched GamersNexus? They go into so much detail on how they test things. Devoting keynote time to all that discussion is not beneficial.
 
The title of the chart states the comparison they are focused on. The trick is that they didn't show all of the data beyond those power-consumption data points. "200 W less power" is the point they were clearly making. This is great click-bait writing. Well done. And also well done by the Apple marketing team for finding the part of the data story where they can talk about what matters to them. I remember them also saying the average home "may save up to $50 a year," and that's a lot of power. More than enough for me to respond to anyone who doesn't like me coming to the defense of product marketing from a company that, if you're here, you likely have some respect for. Or you like to be a troll, and that's fair too.
Exactly. Apple needs a baseline. Add enough water cooling and overclocking and a 3090 performs even better than someone else's stock 3090. So that can be seen as "unlocking" the 3090's true potential by overclocking.
 
And this is why I was yawning... it's just two M1 Maxes tied together, and they're fused EXTERNALLY (as there is a connecting header on each M1 Max). "Siamese twins" (connected at the head) might be a way of putting it. As good as the linkage may be, it's not the same as if the M1 Ultra were a singular organism, designed that way from the onset.
So.... just like the two halves of the human brain work...
 
Forget the actual results. The worrisome part for Nvidia is that Apple has a chip that is quite good and might overtake them at some point, even though it is not a discrete graphics card. I mean, this stuff is built into the chip. Everyone used to laugh at built-in graphics. Not so much anymore.
That is a good point.
 
Totally impossible. There aren't many Mac gamers and that's the biggest problem.
The biggest problem for????
There are use cases other than gaming that require high-performance computers; that's Apple's target. If gaming is the one thing you care about, don't get a Mac.
 