Actually the point, whether valid or not, is the performance you can get UP TO a certain amount of power usage. In that range the Ultra chip has much more performance. So it's more like saying that from 0 to 40 mph an electric car will beat a McLaren, but obviously above 40 the McLaren can blow it away. That doesn't mean that in "speed restricted" uses an electric car isn't better.

So it's misleading but not inaccurate. It's even titled "Performance vs Power."
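To make that "performance up to a power budget" reading concrete, here is a tiny sketch with made-up numbers; nothing below is from Apple's chart or any real benchmark, it only shows how the same curve reads with and without a power cap:

```swift
// Hypothetical (watts, relative score) samples for two parts.
// The only point: who "wins" depends on whether a power cap applies.
let chipA: [(watts: Double, score: Double)] = [(20, 40), (40, 70), (60, 85), (80, 90)]
let chipB: [(watts: Double, score: Double)] = [(60, 50), (120, 90), (200, 130), (320, 160)]

/// Best score a part reaches without exceeding a given power budget.
func bestScore(_ curve: [(watts: Double, score: Double)], under budget: Double) -> Double {
    curve.filter { $0.watts <= budget }.map { $0.score }.max() ?? 0
}

let cap = 80.0
print("Capped at \(cap) W:", bestScore(chipA, under: cap), "vs", bestScore(chipB, under: cap))   // A ahead
print("Uncapped:", bestScore(chipA, under: .infinity), "vs", bestScore(chipB, under: .infinity)) // B ahead
```

Read under the cap, chip A comes out ahead; read uncapped, chip B does, which is the electric-car-versus-McLaren point.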
The problem here is that if you take this method of benchmark testing as valid and useful, then the single-purpose microprocessors out there in dedicated devices will absolutely crush any M1-series processor.

You don't buy a 100-watt lightbulb and then talk about how efficient its light output is at 40 watts...
 
The Ultra is a joke; it's not needed. It's just there so they can say it's faster and charge double. The M1 Max is enough for anything Macs can do; the only reason a 3090 is needed is for gaming, and Macs can't game.

The Verge did a comparison with "Shadow of the Tomb Raider" and, even though hobbled by having to run under Rosetta 2 emulation, the Ultra was within 25 fps of the 3090.
 
The Verge is not very good at benchmarks, and they only talk about that specific Geekbench 5 Compute result. @MacRumors, I would wait for the Anandtech review with multiple tests and data points before assessing how the M1 Ultra and the RTX 3090 stand relative to each other. As someone mentioned, in WoW the M1 Ultra beats the RTX 3090 by a few fps, WoW being actually optimized for AS...

Of course, Apple should've specified which benchmark they used for that chart, as I'm sure they weren't lying per se.
Agreed. Let's wait for a more reputable source before jumping to conclusions.
 
Apple does list the models they compete against in the fine print at the bottom of the charts. In the case of that chart, the competition was a PC with Intel Core i9-12900K using DDR5 memory and equipped with a 3090.
What’s your point? They clearly missed the mark here by a large margin. So much so that it brings ethics into question. Stop being an Apple apologist. This is coming from an avid Apple supporter.
 
Forget the actual results. The worrisome part for Nvidia is that Apple has a chip that is quite good and might overtake them at some point, even though it is not a discrete graphics card. I mean, this stuff is built into the chip. Everyone used to laugh at built-in graphics. Not so much anymore.
I completely agree. I use a desktop with 2 x 3090s and it's a beast, but it's impressive for Apple to be this close with an integrated GPU. Having said that, the 4090 is rumoured to be twice as powerful as the 3090 (and to consume almost twice as much power), so maybe the gap between the two will only increase.

Personally, as a macOS loving Windows user, it’s great to see Apple pack this much GPU power into a single chip. Apple have come a long way.

Excited to see Redshift benchmarks for the Ultra too.
 
And what sort of things would a person using an Nvidia RTX 3090 GPU do, where it mattered?

Play a game?
 
Yikes! Yeah no kidding. This doesn't look good for the M1 Ultra at all.
And this is why I was yawning... it's just two M1 Maxes tied together... and they're fused EXTERNALLY (as there is a connecting header on each M1 Max). "Siamese twins" (connected at the head) might be a way of putting it. As good as the linkage may be, it's not the same as if the M1 Ultra were a singular organism, designed that way from the onset.
 
This is incredibly disheartening. I was really looking forward to Apple finally toppling the powerhouse, and it looks like they fell short.

Yes, watt-efficient computing is cool and cost-effective. But when you need performance, you don't care that your system is drawing more power. You just get the task done.

Right. I can make a gallon of gas get me much further on a moped, but I choose to drive a car. Sometimes I need to be able to haul more things or people than I can balance on a much more power-efficient moped.

Yes, I could get in a forum and brag about miles per gallon until I'm blue in the face... but sometimes real power does matter more.

I think what we have here is not a need for even more cores but ways to make cores run faster. Historically, that is done with more power. Perhaps Apple ultimately has to jack up the power per core to get Apple Silicon to actually outperform Wintel hardware along these lines? Or perhaps Apple finds another way, and Wintel builders then feel the pressure to cut power while holding to their performance objectives?

Again, BOTH sides win that kind of competition:
  • Go PC makers go. Motivate Apple to try harder.
  • Go Apple go. Motivate PC makers to build more efficient hardware.
Win consumers win... no matter what kind of computer you want/need to buy.

Personally, I look forward to my Studio Ultra arriving in the next few days... even if select PC hardware proves to be more robust at some heavy-lifting tasks. With Boot Camp's deprecation, I'll need to add a new PC to the mix soon too... so perhaps I can make the most of BOTH worlds... because both sides feel some real "heat."
 
This is incredibly disheartening. I was really looking forward to Apple finally toppling the powerhouse, and it looks like they fell short.

Yes, watt-efficient computing is cool and cost-effective. But when you need performance, you don't care that your system is drawing more power. You just get the task done.
Actually you do care about power. Power is how you’re charged in data centres. The computer is the really cheap bit these days.

Also if you want compute like this, you rent it. Because like hell anyone wants to spend three months a year with a 100% duty cycle 3090 being RMA’ed because something exploded.

(I know someone who runs GPU miners and it’s a **** show)
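A rough back-of-the-envelope on that point; the board power, duty cycle, and electricity price below are my own placeholder assumptions, not quoted data-centre figures:

```swift
// Annual electricity cost of one GPU run flat out; all inputs are assumed for illustration.
let boardPowerWatts = 450.0       // assumed sustained draw for a 3090-class card
let hoursPerYear = 24.0 * 365.0   // 100% duty cycle
let pricePerKWh = 0.15            // assumed $/kWh; real data-centre rates vary widely
let kWhPerYear = boardPowerWatts / 1000.0 * hoursPerYear
print("\(Int(kWhPerYear)) kWh/year, roughly $\(Int(kWhPerYear * pricePerKWh)) in electricity per card")
```

Multiply that by a rack of cards and the power bill quickly dominates, which is why the computer itself is the cheap bit.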
 
Still just one hand-picked benchmark. Let's wait to see all the benchmarks, or better yet, let's see how it performs in the real world at what the Studio is made for, before everyone makes up their mind.
 
People want a CPU + GPU chip to compete against a fully dedicated GPU. Not sure what Apple's marketing team was smoking when they made that chart, but people actually believing it would be true is even funnier.
Apple's charts have always been tricky, and who knows in what scenario they showed the M1 Ultra beating the 3090. Never trust those charts in the Apple events. Wait for real benchmarks.
 
Yikes! Yeah no kidding. This doesn't look good for the M1 Ultra at all.
So, if Apple decided to actually let the chip rip at even 200 watts of power draw, how do you think they would compare? The other chip would set the Studio on fire or have the fans blowing so loud you couldn't have a conversation, let alone record audio.
 
The Ultra is a joke; it's not needed. It's just there so they can say it's faster and charge double. The M1 Max is enough for anything Macs can do; the only reason a 3090 is needed is for gaming, and Macs can't game.
This is the second thread where I've seen you say that this sort of power isn't needed. You've clearly never worked in the 3D, video, VFX or science industries, where software and simulations will use all the power you can throw at them. If the 3090 is just for gaming, why do professional 3D artists fill their machines with them? Octane, Redshift etc. will use as many cards as you give them.

You may not need the power, but industry professionals definitely do.
 
This is particularly relevant for desktop-class computing. Most people are not concerned with power consumption for non-portable devices. As expected, it is going to take YEARS for Apple to truly compete. And even then, I won't be surprised if Apple eventually caves and offers dedicated graphics options.
It's difficult for an SoC design to maintain unified memory (CPU cores and GPU accessing the same memory) alongside a dGPU; it's nigh impossible, in fact. Apple must live or die by its own chips now.
 
Geekbench Compute seems to be poorly documented as to what it actually does and what software it uses to get there. Now, I am not saying it is invalid, because I don't know, and neither does anyone at The Verge. There seems to be a reason that Apple uses Metal (though I can't find the 3090 in the charts). So my broad question to The Verge is: is the compute benchmark actually optimized for Apple silicon, or simply ported over to run using perhaps deprecated software? The only valid test would be to run something that does exactly the same thing, but in software optimized for the platform. For example, one clown recently said the M1 is not as fast as (insert chip here) because Blender optimized for that chip runs faster on it than on the M1, not taking into account, of course, that Blender actually has a version that uses Metal and is therefore supposedly optimized for the M1. A valid test would be performing the same task on both optimized platforms.

Why is this hard?
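For context on what "optimized for Apple silicon" means on the GPU side: it generally means a Metal-native compute path rather than code ported through a generic or deprecated layer. Below is a minimal sketch of such a path; the saxpy kernel and buffer sizes are my own illustration, not anything Geekbench or Blender actually runs.

```swift
import Metal

// A trivial Metal compute kernel (saxpy), purely to illustrate a Metal-native path.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void saxpy(device float *y       [[buffer(0)]],
                  device const float *x [[buffer(1)]],
                  constant float &a     [[buffer(2)]],
                  uint i [[thread_position_in_grid]]) {
    y[i] = a * x[i] + y[i];
}
"""

let device = MTLCreateSystemDefaultDevice()!   // the on-die GPU
let queue = device.makeCommandQueue()!
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "saxpy")!)

let n = 1 << 20
var x = [Float](repeating: 1, count: n)
var y = [Float](repeating: 3, count: n)
var a: Float = 2

// Unified memory: shared buffers are visible to both CPU and GPU, no PCIe copies.
let xBuf = device.makeBuffer(bytes: &x, length: n * MemoryLayout<Float>.stride, options: .storageModeShared)!
let yBuf = device.makeBuffer(bytes: &y, length: n * MemoryLayout<Float>.stride, options: .storageModeShared)!

let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(yBuf, offset: 0, index: 0)
enc.setBuffer(xBuf, offset: 0, index: 1)
enc.setBytes(&a, length: MemoryLayout<Float>.stride, index: 2)
enc.dispatchThreads(MTLSize(width: n, height: 1, depth: 1),
                    threadsPerThreadgroup: MTLSize(width: pipeline.threadExecutionWidth, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()
```

A Metal-native renderer like the Blender build mentioned above goes through this kind of path; a port that only runs through an older or translated layer isn't an apples-to-apples comparison.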
 
The Ultra is a joke; it's not needed. It's just there so they can say it's faster and charge double. The M1 Max is enough for anything Macs can do; the only reason a 3090 is needed is for gaming, and Macs can't game.
I need my Mac for visual effects work, so yes, it's VERY needed. If we had anything 10x faster than what we currently have, it would be VERY welcome. I want to work faster than my computer can compute.
 