There are other things going on in the world besides games. Personally, I never played and never want to waste my time playing, but for those of you for whom it is a hobby, great. But I have never once heard anyone say they wanted to buy a Mac to play games. Seriously, no one, ever.
Google "circular argument".

People don't say that because there are few great AAA games on Macs.
 
  • Like
Reactions: bilbo--baggins
It is not. Time-consuming tasks like machine learning or rendering take a long time. Software specialists cost somewhere between €100 and €200 an hour. AI specialists or people working at Apple or Google earn somewhere around $200K (it can also be $100K or $300K).
So we don't care about £981/year when a task takes 24 hours instead of 12 hours to complete.
Yeah, I know; I hire them. It's still a major expense. We don't just run one card in one computer. By reducing the compute cost, we can hire three more engineers within the same budget.
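A rough back-of-the-envelope sketch of that trade-off, using the figures quoted above (the hourly rate, the number of runs per year, and treating the saved hours as billable engineer time are all illustrative assumptions):

```python
# Back-of-the-envelope: does faster compute pay for itself in engineer time?
# All inputs are illustrative assumptions based on the figures quoted above.

ENGINEER_RATE_PER_HOUR = 150          # midpoint of the €100-€200/hour range
EXTRA_HARDWARE_COST_PER_YEAR = 981    # the ~981/year figure mentioned above
HOURS_SAVED_PER_RUN = 12              # a 24-hour job finishing in 12 hours
RUNS_PER_YEAR = 50                    # assumed number of long jobs per year

time_value_saved = ENGINEER_RATE_PER_HOUR * HOURS_SAVED_PER_RUN * RUNS_PER_YEAR
breakeven_runs = EXTRA_HARDWARE_COST_PER_YEAR / (ENGINEER_RATE_PER_HOUR * HOURS_SAVED_PER_RUN)

print(f"Value of saved hours per year: ~€{time_value_saved:,.0f}")
print(f"Extra hardware cost per year:  ~€{EXTRA_HARDWARE_COST_PER_YEAR:,.0f}")
print(f"Hardware pays for itself after ~{breakeven_runs:.1f} long runs")
```

The real sensitivity is in how many long runs you actually do, but the point stands: the hardware premium is small next to the labor cost.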
 
  • Like
Reactions: LoggerMN
Of course, the problem is that you can't use Metal on a non-Apple Silicon system.
Exactly. So he just doesn't know what he's talking about.

The Verge includes Metal benchmarks in their comparison, which is fair, since on the Mac side the best graphics performance comes from software optimized for Metal, and on the PC side you use OpenCL. I don't see what the problem is. If you do, please enlighten us.
 
Exactly. So he just doesn't know what he's talking about.

The Verge includes Metal benchmarks in their comparison, which is fair, since on the Mac side the best graphics performance comes from software optimized for Metal, and on the PC side you use OpenCL. I don't see what the problem is.

I think these benchmarks are a pain in the backside
 
  • Like
Reactions: hagjohn
Even if the Geekbench compute benchmark is a load of crap, all I know is that the W6800X Duo in my Mac Pro is a compute monster for things like Octane.

A single GPU of the W6800X Duo gets 151,667 on the Geekbench Metal compute test. For apps that can use both GPUs, that is over 300,000; Octane loves this card.

With all that said, the fact that the M1 Ultra can eke out ~84,000 for an integrated GPU is pretty damn impressive.
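A quick sketch of how those quoted scores stack up (the 2x figure for the Duo assumes near-perfect scaling across both GPUs, which is an assumption, not a measurement):

```python
# Relative Geekbench 5 Metal compute scores quoted above (illustrative only).
scores = {
    "W6800X (one GPU)":       151_667,
    "W6800X Duo (both GPUs)": 2 * 151_667,  # assumes apps like Octane scale ~2x
    "M1 Ultra":               84_000,       # approximate score from the article
}

baseline = scores["M1 Ultra"]
for name, score in scores.items():
    print(f"{name:24s} {score:>8,}  ({score / baseline:.2f}x the M1 Ultra)")
```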

But I am going to wait and see how actual applications like Octane and Redshift perform. Video editing performance is going to crush a stock Mac Pro -- that is kind of obvious due to the dedicated hardware for H.264/HEVC and ProRes. A better test would be a Mac Pro with an Afterburner card (sure, it ups the price, but it makes for a better comparison).

The W6xxx series GPUs have already been benched to death on these forums; do a search if you want more info. I just do not see the M1 Ultra beating the W6800X, W6900X, or Nvidia RTX 3090. The people who need this type of compute power do not care about the wattage required to get it.
 
“Relative performance” on the y-axis. Those words were carefully chosen.
 
It's been mentioned that Geekbench 5 Compute isn't a valid test for the M-series beyond the M1 Pro. The sub-tests run back to back so quickly that the GPU clocks never get a chance to ramp up; each task finishes before the frequency reaches its maximum. That tells me a lot about "how good" The Verge's investigative abilities are.
I believe Andrei Frumusanu (formerly of AnandTech) determined this as well and has tweeted about it. Honestly, he knows his stuff so well and goes so in-depth on things like this that I'd trust him long before I'd trust a single benchmark from The Verge.

Edit: Found it, wasn't a tweet, was on Anandtech itself
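To illustrate the clock-ramp argument with a toy model (the idle/max clocks, ramp time, and workload sizes below are made-up numbers purely for illustration, not anything Geekbench or Apple actually uses):

```python
# Toy model: a GPU whose clock ramps linearly from an idle frequency to its
# maximum over a fixed ramp time. Short workloads finish mostly at low clocks,
# so the measured "effective clock" understates the sustained maximum.
# All numbers are illustrative assumptions.

IDLE_CLOCK_GHZ = 0.4
MAX_CLOCK_GHZ = 1.3
RAMP_TIME_S = 2.0       # assumed time for the clock to reach its maximum
TIME_STEP_S = 0.001

def time_to_finish(work_gcycles: float) -> float:
    """Integrate the ramping clock until the workload's cycles are consumed."""
    t, done = 0.0, 0.0
    while done < work_gcycles:
        ramp = min(t / RAMP_TIME_S, 1.0)
        clock_ghz = IDLE_CLOCK_GHZ + ramp * (MAX_CLOCK_GHZ - IDLE_CLOCK_GHZ)
        done += clock_ghz * TIME_STEP_S   # GHz * s = gigacycles of work done
        t += TIME_STEP_S
    return t

for work in (0.5, 50.0):  # a short sub-test vs. a long sustained workload (gigacycles)
    elapsed = time_to_finish(work)
    effective = work / elapsed
    print(f"{work:5.1f} Gcycles: {elapsed:6.2f} s, effective clock {effective:.2f} GHz "
          f"({effective / MAX_CLOCK_GHZ:.0%} of max)")
```

In this toy model the short task averages well under half the maximum clock, while the long one runs at nearly full speed, which is the shape of the argument above.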
 
Last edited:
There are other things going on in the world besides games. Personally, I never played and never want to waste my time playing, but for those of you for whom it is a hobby, great. But I have never once heard anyone say they wanted to buy a Mac to play games. Seriously, no one, ever.
This is true. However, billions upon billions of dollars are "wasted" playing games on iPhones (and other computers, of course!). It's not your gig, and that's fine, but believe it or not, you represent the minority.

And you realize that the reason "no one ever" says that is that Apple has always been ridiculously poor in this market. It's about as laughable as McDonald's being considered to have awesome steak... it's a joke. Meanwhile, within the computer industry, gaming is a massive market. If Apple made a push in this space it would be massive... but they just don't.

Also, speaking of McDonald's, Apple has always been about catering to kids to build market goodwill at a young age, which is why they saturate schools, etc. If they got gaming too, that would be a lock-in. I've never quite understood why they don't.
 
  • Like
Reactions: lysingur
We have been fooled. At first I deluded myself into thinking it was about OpenCL (Metal) performance, but even in that environment the M1 Ultra doesn't measure up to the RTX 3090. That's still impressive performance for a chip this size (it's maybe 20% slower than two RTX 2080 Tis?!), but I was expecting something more. Though it's too early to pass judgement; you have to test this hardware in real-world applications. Personally, I can't wait to see the performance results in Octane Render and After Effects.
 
  • Like
Reactions: amartinez1660
I wouldn't hold your breath on After Effects or anything Adobe, really. Adobe needs to get their act together and start supporting multiple GPUs, and let you choose which GPU handles the compute tasks. I would love to throw my 580X back in just to run monitors.

The only thing that is going to help the M1 is the dedicated encode/decode hardware, which is going to be a major gain, but almost nobody uses only the stock effects, which is what most tests rely on.
 
This is true. However, billions upon billions of dollars are "wasted" playing games on iPhones (and other computers, of course!). It's not your gig, and that's fine, but believe it or not, you represent the minority.

And you realize that the reason "no one ever" says that is that Apple has always been ridiculously poor in this market. It's about as laughable as McDonald's being considered to have awesome steak... it's a joke. Meanwhile, within the computer industry, gaming is a massive market. If Apple made a push in this space it would be massive... but they just don't.

Also, speaking of McDonald's, Apple has always been about catering to kids to build market goodwill at a young age, which is why they saturate schools, etc. If they got gaming too, that would be a lock-in. I've never quite understood why they don't.


Apple can supply all the power necessary for gaming and the situation of gaming on the Mac still won't change, because the tools used to make the games themselves are heavily intertwined with Windows, and most creators are on Windows. A large percentage of PC games are actually not very hardware intensive. Games like Risk of Rain 2 could run on every current Mac, and with a few button presses here and there the developers could easily have spit out a Mac version, since it is made in Unity (which was originally developed for the Mac), but they don't. It doesn't matter whether Macs are powerful enough; the developers and the gaming culture don't care about the Mac, and that is partly Apple's fault.

How does Unity (originally an Apple exclusive) end up with C# (a Microsoft language) as its scripting language? And what is Apple doing about it? Nothing. They don't support anything outside of the Mac, therefore no one outside of the Mac supports it. Meanwhile, Microsoft supplies frameworks and tools for making apps and games for the Mac, and that's why game devs will remain on PC.
 
  • Wow
  • Like
Reactions: Mr.PT and diamond.g
The graph actually shows that at 100 watts, the M1 Ultra is more powerful than the 3090 is at 320 watts.
Who the **** cares about comparing 100 watts to 320, and why should that actually make people happy? Does that mean my Renault Clio is as fast as a Lamborghini Aventador at 60 mph, or what? :D
 
Last edited by a moderator:
All I'm getting from this is that The Verge doesn't know how to read the text next to a graph's axes.

Is it stupid and misleading to have the y-axis be a nebulous "relative performance"? Yes.
Does that mean Apple was claiming the M1 Ultra beats the 3090 in raw performance? No.

I don't blame the average person for getting confused by the graph (hell, even I initially took away "this is more powerful than a 3090 and draws less power"), but I expect journalists not to.
Also, am I crazy, or did this exact same thing happen when the M1 Pro and Max were unveiled? lol

Also, if you know anything about physics, you know both how impossible it would be for this kind of system to outperform a triple-slot GPU and how impressive it is that they "barely got to half its score".
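Taking the thread's rough numbers at face value (about half the 3090's score at roughly 100 W versus 320 W; both figures are approximations pulled from this discussion, not measurements), the perf-per-watt arithmetic looks like this:

```python
# Rough performance-per-watt comparison using the approximate figures in this thread.
RTX_3090_SCORE = 1.0     # normalize the 3090's benchmark score to 1.0
RTX_3090_WATTS = 320
M1_ULTRA_SCORE = 0.5     # "barely got to half its score"
M1_ULTRA_WATTS = 100     # roughly where Apple's chart tops out for the Ultra

ppw_3090 = RTX_3090_SCORE / RTX_3090_WATTS
ppw_ultra = M1_ULTRA_SCORE / M1_ULTRA_WATTS

print(f"M1 Ultra perf/W: ~{ppw_ultra / ppw_3090:.1f}x the RTX 3090")
print(f"RTX 3090 absolute performance: ~{RTX_3090_SCORE / M1_ULTRA_SCORE:.1f}x the M1 Ultra")
```

So both readings of the chart can be true at once: better efficiency, lower peak performance.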
 
  • Like
Reactions: Mr.PT
Who the **** cares about comparing 100 watts to 320, and why should that actually make people happy? Does that mean my Renault Clio is as fast as a Lamborghini Aventador at 60 mph, or what? :D

Some people care. I can afford a Ferrari, but I drive a Toyota because I care more about efficiency than speed.
 
Last edited by a moderator:
I've always felt Apple's charts and tests were bullcrap. Now it just appears they're flat-out lying.
No, they are not; the evidence is GFXBench:

M1 Ultra:
[attached screenshot: GFXBench results for the M1 Ultra]

RTX 3090:
[attached screenshot: GFXBench results for the RTX 3090]

Although the M1 Ultra lost both Aztec Ruins tests, it won the T-Rex and Manhattan tests, and I am not cherry-picking; this is real data from GFXBench.
 
It's been mentioned that Geekbench 5 Compute isn't a valid test for the M-series beyond the M1 Pro. The sub-tests run back to back so quickly that the GPU clocks never get a chance to ramp up; each task finishes before the frequency reaches its maximum. That tells me a lot about "how good" The Verge's investigative abilities are.
This would help explain why the Max vs. Ultra comparison doesn't show around a 45% differential. I think we'll see some analysis from AnandTech soon enough.
 
I don't think Apple will have anything substantial until we hit the M3. Most likely they will stick to a predefined path, like the tick-tock cadence Intel had :)

I'm glad you said "ARM" is the future rather than Apple Silicon.

People who use graphics cards like Nvidia RTX 3090 don't really care about performance per watt. They care about performance, period.

Apple Silicon shines when performance per watt matters, e.g., mobile devices & laptops, but it's not significantly better on a desktop all things considered, at least for now.

A dGPU still has its advantages. Apple Silicon is still one to two years behind the highest-end dGPUs in graphics performance. This MR news article shows that the surge in Apple Silicon's performance is starting to level off... well, unless Apple has something up its sleeve in the M2 that we don't know about. ;)
 
No, they are not; the evidence is GFXBench:

M1 Ultra:
[attached screenshot: GFXBench results for the M1 Ultra]

RTX 3090:
[attached screenshot: GFXBench results for the RTX 3090]
Although the M1 Ultra lost both Aztec Ruins tests, it won the T-Rex and Manhattan tests, and I am not cherry-picking; this is real data from GFXBench.
Yep, as usual it's going to differ from workload to workload. It's going to lose some, get close on some, and win on others. Nobody should be surprised by any of this, really.
 
  • Like
Reactions: amartinez1660