Thank you for helping explain this. That was my point all along: since Elden Ring performs well (while not 4K Ultra!!!!) on a GTX 1080, it should be fine on a Mac if it were ported (and ported well). That helps the whole "gaming on Mac" argument settle down a bit by not gating it on "needs NVIDIA GPUs" or "needs the equivalent of 3080 performance".
I bet if Apple had had its own chips ages ago, this would be more likely. I especially wonder about it from a programmer's standpoint.
I'm not confusing 1080p with GeForce GTX 1080. LOL

My points are:
1. What constitutes "reasonably well" varies by person.
2. You can't extrapolate PC gaming experience with the same title to the Mac.

I'm not let down by anything related to gaming on a Mac. I use Boot Camp and I play RDR2 on high. I'm good. ;)

I just find it hard to understand why Apple won't take gaming seriously when it's clearly a profitable venture.

Okay, I just wanted to check. It seemed the argument was about one person meaning 1080p, but they never explicitly said that.

1. I do agree, "reasonably well" varies from person to person. Even these days I think my own expectations are too high. I was more than happy playing N64 games at sub-480p and abysmal frame rates. They were fun, and that was good enough for a kid. Now, anything under 60 FPS frustrates me.

2. I don't think you can stop people from comparing systems, especially with the assumption that you would be playing the same games. It is inevitably going to be treated as an apples-to-apples comparison even if the architecture of the chipset is different. While we hope that, coded properly, the efficiency of macOS and the standardization of M1 would provide an advantage, there are still small things to keep in mind. For example, even the base M1 has multiple variants, let alone the Pro/Max/future chips. Would some games run sub-optimally on the 7-GPU-core variant, or would developers still need to target it?

This is the part where the comparison changes. As a PC game developer, you build a game with an assumption about what customers will be using in a couple of years when your game is released. If you overshoot and the average customer doesn't have a powerful enough system yet, you either delay and/or optimize the code a bit, or hope you can push your customers to upgrade something because your game is worth it. What do you target as a Mac developer? Your customers can't upgrade anything internally; they can only swap out the whole machine. Plus, who the hell knows what will come out of Apple at any given time... M1 Pro Plus Max... M2 Gamers Edition with half the CPU cores and double the GPU cores!? If Apple shows some fairly reliable, trackable year-to-year changes with M1/M2/etc., the games will come. But right now, PCs and consoles are so similar and work with developers years in advance. Targeting that market is more logical.

Relating to money, we can (and rightfully will) criticize Mr. Cook for many things. But the man definitely knows how to make the calls that bring in profit. I assume at this point they have run the numbers and figured that casual gamers in their ecosystem are happy enough with iOS games, and that hard-core gamers will play elsewhere or use the relevant software to get things up and running. I can't think of anyone close to me who is talking about Elden Ring yet has only an iMac at home.
 
Then which benchmark should be used to test the GPU? Other GPUs are also tested over short runs, so I don't see anything wrong with it.
GFXBench and 3DMark Wild Life Extreme are better. 3DMark scales about 1.8× with GPU count and has a 10-minute stress test.
 
You literally said "Apple really needs to get into gaming. Perhaps commissioning a game studio to make a short but graphics-intensive game for them to demonstrate the power of their chips and to shut critics like me up." in post 225.
How is this the same as "Apple needed to create a few AAA games"? Like, seriously? A commissioned short graphics-intensive game vs. a few AAA games created by Apple.
 
How is this the same as "Apple needed to create a few AAA games"? Like, seriously? A commissioned short graphics-intensive game vs. a few AAA games created by Apple.
Because developers are going to want to see a show of commitment from Apple. That's why Stadia failed. Google had the tech, but lacked the dedication (they didn't seem to realise how expensive it would be to publish a game title), and it was this lack of apparent commitment that led to other gaming developers not being willing to embrace the platform themselves.

Sorta like a "You want me to jump? Then you jump first" mentality. Apple has made it painfully clear that they have little interest in supporting gaming on the Mac, and lip service in the form of a glorified demo isn't going to be enough to sway the detractors. Even if the hardware is technically there.
 
What are the test results for both the 3090 and the M1 Ultra?
I posted some GFXBench results earlier. The Ultra is very close to the 3090: Ultra 484 fps, 3090 505 fps.

As for Wild Life Extreme, the M1 Max gets about 20,000 and the Ultra about 35,000. That's roughly on par with a 3080 according to https://www.3dmark.com/.
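As a rough sanity check on the "scales about 1.8× with GPU count" claim above, here is a minimal sketch using just the two quoted scores; the core counts (32 for the Max, 64 for the Ultra) are the published configurations, and all numbers are approximate.

```python
# Rough check of Wild Life Extreme scaling from M1 Max to M1 Ultra.
# Scores are the approximate figures quoted above; treat them as illustrative.
m1_max_score = 20_000    # ~32 GPU cores
m1_ultra_score = 35_000  # ~64 GPU cores (2x the Max)

scaling = m1_ultra_score / m1_max_score
print(f"Ultra/Max scaling: {scaling:.2f}x for 2x the GPU cores")
# -> ~1.75x, close to the "about 1.8x with GPU count" estimate
```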
 
I bet if Apple had had its own chips ages ago, this would be more likely. I especially wonder about it from a programmer's standpoint.


Okay, I just wanted to check. It seemed the argument was about one person meaning 1080p, but they never explicitly said that.

1. I do agree, "reasonably well" varies from person to person. Even these days I think my own expectations are too high. I was more than happy playing N64 games at sub-480p and abysmal frame rates. They were fun, and that was good enough for a kid. Now, anything under 60 FPS frustrates me.

2. I don't think you can stop people from comparing systems, especially with the assumption that you would be playing the same games. It is inevitably going to be treated as an apples-to-apples comparison even if the architecture of the chipset is different. While we hope that, coded properly, the efficiency of macOS and the standardization of M1 would provide an advantage, there are still small things to keep in mind. For example, even the base M1 has multiple variants, let alone the Pro/Max/future chips. Would some games run sub-optimally on the 7-GPU-core variant, or would developers still need to target it?

This is the part where the comparison changes. As a PC game developer, you build a game with an assumption about what customers will be using in a couple of years when your game is released. If you overshoot and the average customer doesn't have a powerful enough system yet, you either delay and/or optimize the code a bit, or hope you can push your customers to upgrade something because your game is worth it. What do you target as a Mac developer? Your customers can't upgrade anything internally; they can only swap out the whole machine. Plus, who the hell knows what will come out of Apple at any given time... M1 Pro Plus Max... M2 Gamers Edition with half the CPU cores and double the GPU cores!? If Apple shows some fairly reliable, trackable year-to-year changes with M1/M2/etc., the games will come. But right now, PCs and consoles are so similar and work with developers years in advance. Targeting that market is more logical.

Relating to money, we can (and rightfully will) criticize Mr. Cook for many things. But the man definitely knows how to make the calls that bring in profit. I assume at this point they have run the numbers and figured that casual gamers in their ecosystem are happy enough with iOS games, and that hard-core gamers will play elsewhere or use the relevant software to get things up and running. I can't think of anyone close to me who is talking about Elden Ring yet has only an iMac at home.
Tim Cook is a great CEO, except he's old. The tech industry is very ageist, in part because it's dominated by young people and is a future-oriented market. It's always about the next big thing. And older people just aren't good at sensing what young people want.

If M1 chips are as powerful as Apple claims, variants don't really matter. The variations you get on the PC side are at least 100× what you get on the Mac side. That's why games have graphics settings, to accommodate variations in specs.

Your point about upgradeability is valid. But my counterpoint would be Apple Silicon is so ahead of the curve that it shouldn't matter, especially after M2 comes out. I read somewhere that M1 is already as powerful as console chips and minus the overhead of running macOS, it should still run games at a decent frame rate. And before you say, yeah, but M2 hasn't come out yet. True, but we aren't arguing about what is, we, i.e., I, are arguing about what should/could be.
 
Because developers are going to want to see a show of commitment from Apple. That's why Stadia failed. Google had the tech, but lacked the dedication (they didn't seem to realise how expensive it would be to publish a game title), and it was this lack of apparent commitment that led to other gaming developers not being willing to embrace the platform themselves.

Sorta like a "You want me to jump? Then you jump first" mentality. Apple has made it painfully clear that they have little interest in supporting gaming on the Mac, and lip service in the form of a glorified demo isn't going to be enough to sway the detractors. Even if the hardware is technically there.
Nanosaur isn't Jurassic World Evolution. You misquoted me and now you're just going off on a tangent.

I was simply saying Apple should do a game like Nanosaur, which they did for the G3 iMac, to demonstrate the potential of their chips. It's not about showing game developers their commitment. It's about showing the power of Apple Silicon. How you misinterpreted what I was saying is beyond me. I'm not so dumb as to think a commissioned graphics-intensive game is enough to enthuse game developers for the Mac platform.
 
I suppose it is possible that in the chart displayed in this article, Apple is saying an M1 Ultra at ~100 watts matches a 3090 at ~300 watts. However, pushing the 3090 to its near-500W maximum would allow it to pull significantly ahead, as the benchmarks run by The Verge showed.
An Apple machine couldn't handle the thermal output of a fully powered GPU; it'd melt into a puddle. THAT'S why they can't compare directly.
 
I suppose it is possible that in the chart displayed in this article, Apple is saying an M1 Ultra at ~100 watts matches a 3090 at ~300 watts. However, pushing the 3090 to its near-500W maximum would allow it to pull significantly ahead, as the benchmarks run by The Verge showed.
Well yes, the chart shows the 3090 running at ~320 watts, but this is definitely still a bending of the data. Even at 320 watts the 3090 would still be more powerful than an M1 Ultra. I'd assume that when they say "relative performance" they took "relative" to mean ratio: since the M1 Ultra only draws about 100W and the 3090 about 320W, they scaled the 3090's real performance down to roughly a third to make it "relative" to the M1 Ultra.
Anyway, it doesn't really matter because it's just marketing; of course they are going to bend the data to make it look nice.
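To make that reading concrete, here is a minimal sketch of the arithmetic; the 100W and 320W figures come from the discussion above, while the 3090's raw performance is a purely hypothetical placeholder.

```python
# Illustrative only: one reading of Apple's "relative performance" chart is that
# the 3090's performance was scaled by the power ratio to make it "relative"
# to the M1 Ultra's ~100W draw.
m1_ultra_watts = 100
rtx_3090_watts = 320

rtx_3090_raw_perf = 1.0  # hypothetical raw performance, with the 3090 as the baseline
power_ratio = m1_ultra_watts / rtx_3090_watts  # ~0.31

# Under this reading, the chart would plot the 3090 at roughly a third of its
# real performance, i.e. performance normalized to the Ultra's power draw.
rtx_3090_chart_perf = rtx_3090_raw_perf * power_ratio
print(f"3090 'relative' performance under this reading: {rtx_3090_chart_perf:.2f}")
```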
 
It's difficult for an SoC design to maintain unified memory (CPU cores & GPU accessing the same memory) with a dGPU; it's nigh impossible, in fact. Apple must live or die by its own chips now.

Apple is making its own dGPUs for future Mac Pros. Code name "Lifuka". At every step there are naysayers saying "it's impossible", yet Apple continues to redefine what's possible. Just 18 months ago the naysayers were proclaiming "there's no way a phone SoC can handle desktop workloads"... yet here we are.
 
Apple's marketing team needs to be let go
They really are setting themselves up for L after L with some of their claims.

More detail on that graph is all that was needed. This probably won't be a huge deal, but in the long run the company's reputation is more important than a couple of weeks of hype.

To clarify: I'm not saying these chips aren't powerful or as good as they say. I'm just saying that when they push public perception one way and reality then appears to contradict that initial perception, it makes them look bad.
 
I think Apple has to try to compare themselves favorably to the RTX 3090 because anyone who is going to try to build a powerhouse comparable to the M1 Ultra Studio is probably going to be at least considering the RTX 3090. Hence a little tomfoolery in the charts (not new for them). Once you take a closer look, you find out that you can probably do better on your own, even with an RTX 3090, taking power concerns into account, of course. Also not a new concept: you could always do better on your own, and most of the time for less money.

Yet the whole thing is moot because people will generally stick with their platform of choice. In this way Apple really only competes against itself. If you are a Mac Pro person and always have been, Apple would have to be particularly careless to lose you.

The M1 Max in the MBP is already faster than an RTX 3090 in a number of the areas the Mac Studio is marketed for. The disparity will be even greater with the Ultra.



Professionals don’t sit and run Geekbench all day long.
 
I just find it hard to understand why Apple won't take gaming seriously when it's clearly a profitable venture.
You clearly do not understand Apple's business model. High-end Macs are used to create content. And part of that content is what people consume on Apple's real moneymakers: the iPhone and iPhone peripherals. The high-end Macs thus serve to supply the needs of the cash cow.
Gaming on high-end Macs is not part of that model and thus irrelevant to Apple. It is not a profitable venture.
Look at the name of the latest Mac. In a studio you produce, you work, and gaming is not part of that environment!
 
Yes, you could argue that the chart is cropped to cap out at the TDP range where the M1 Ultra excels. But they made a point of making it seem like it had higher relative performance, which is misleading at best.
Nope, it clearly showed that it matched the relative performance of an RTX 3090 at 300W while drawing 200W less. Nowhere in the graph does it show that the Ultra has greater absolute performance.
 
All of you dorks seem to forget the fact that the comparison is between an integrated-graphics-sized chip and a flagship card the size of a large brick, never mind the power consumption.

Before Apple came along, how come no one ever tried to compare integrated graphics with Nvidia's flagship desktop cards? Because you'd get laughed out of the room. Now everyone is taking this seriously and debating it; that's how insane what Apple has achieved is.
You also seem to forget that for many use cases, especially desktop computing, the size of the GPU is a non-issue.

Doesn't matter how small a GPU is if it doesn't do what people want.

Your post reveals that you are just projecting your aesthetic preferences onto other people.
 
Your point about upgradeability is valid. But my counterpoint would be Apple Silicon is so ahead of the curve that it shouldn't matter, especially after M2 comes out. I read somewhere that M1 is already as powerful as console chips and minus the overhead of running macOS, it should still run games at a decent frame rate. And before you say, yeah, but M2 hasn't come out yet. True, but we aren't arguing about what is, we, i.e., I, are arguing about what should/could be.
Well, if we are talking about what could be, remember that the RTX 3xxx cards are made on Samsung's 8nm process. The upcoming RTX 4xxx will be made on TSMC's 5nm process.
 
As a CG artist I am bummed. I was expecting more numbers.

I want to know how it performs in Blender Cycles while rendering a scene at 4096 samples, and how it performs in Houdini: Pyro FX, fluids, particle and other rigid-body / soft-body simulation, procedural work, Vellum. Or some Nuke benchmarks for the 2D & 3D viewport?

How about its rendering performance in Octane, Redshift, and V-Ray?

Is it on par with Nvidia OptiX?

What about simulations or parametric modeling in SolidWorks? How about some numbers for Rhinoceros 7.0?

How about RFO Benchmark for Revit? (Though I am not a BIM user myself).
 
You clearly do not understand Apple's business model. High-end Macs are used to create content. And part of that content is what people consume on Apple's real moneymakers: the iPhone and iPhone peripherals. The high-end Macs thus serve to supply the needs of the cash cow.
Gaming on high-end Macs is not part of that model and thus irrelevant to Apple. It is not a profitable venture.
Look at the name of the latest Mac. In a studio you produce, you work, and gaming is not part of that environment!
So making games is not producing? Streaming is not producing?
 
As usual, marketing was making use of best-case scenarios and exaggerating where possible.
Nothing new.
Especially for Apple.

Graphics has always been a performance Achilles' heel for Apple computers. The highest-end AMD / Nvidia graphics cards were never available for the Power Mac or Mac Pro.
Some decent mid-to-high-range versions were, but often the drivers were not optimal.

The M1 (Pro / Max) chips are superb, superb mobile CPUs / GPUs.
For desktops, the M1 Ultra CPU is "excellent" and the GPU "very good", so the combo is not insanely great, but simply very, very good.

But the whole package (the hardware, aesthetics, noise, macOS, power consumption, etc.) makes the Mac Studio (Max or Ultra) a very attractive prosumer desktop computer, IMHO.

Do you want the highest GPU performance? You simply need to get a high-end PC / workstation with a Windows-only high-end graphics card. But then you're stuck with an energy-hungry big box running Windows.
 