Nothing indicates that they can compete with Amazon and Google yet. I will believe it when I see it.
Apple will not "compete" with AWS and Google, in the same way that they don't compete with Intel or IBM.
They will do something different that matches their own needs, their developers' needs, and their users' needs; and that is of little interest to the enterprise market.
iCloud (remote storage) was the first step; Xcode Cloud (still in beta, as remote compute) will be the next step. But it won't end there.
 
it’s much easier to just buy a whole new machine with warranty and support.
Funny thing is, back 20 years ago when I was in support, our customers thought along the same lines about whole-unit replacements, except:

They don't buy a warranty. Ever. It's cheaper to just buy 2-3 spare units for every hundred units you order than to pay $40-$50 per box for a warranty; 2-3% is far above the expected failure rate for a Relationship-level box. (Quick back-of-the-envelope math after this post.)

They absolutely do not want support and prefer never to have to deal with a support agent; many go so far as to simply pay third parties to do it for them. The ones that do get a warranty pay for the ability to order the parts themselves and completely skip even the business- and/or server-level support agents.
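To put rough numbers on that tradeoff, here's a quick sketch with made-up figures: the $1,000 unit price is purely illustrative, and the warranty price is just the midpoint of the $40-$50 quoted above.

```python
# Rough cost comparison: per-box warranties vs. simply stocking spare units.
# The unit price is an illustrative assumption, not a figure from the post.
UNITS_BOUGHT = 100
WARRANTY_PER_BOX = 45.0   # midpoint of the $40-$50 mentioned above
UNIT_PRICE = 1000.0       # assumed price of one box
SPARE_RATIO = 0.03        # stock 3 spares per 100 units, as described

warranty_cost = UNITS_BOUGHT * WARRANTY_PER_BOX
spares_cost = UNITS_BOUGHT * SPARE_RATIO * UNIT_PRICE

print(f"Warranties on {UNITS_BOUGHT} boxes: ${warranty_cost:,.0f}")
print(f"{int(UNITS_BOUGHT * SPARE_RATIO)} spare boxes instead:  ${spares_cost:,.0f}")
```

With those made-up numbers the spares come out cheaper, and any spare that is never needed is still usable stock; the break-even obviously shifts with the real unit price and failure rate.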
 
I know it's not Apple's focus, but I'm disappointed by the gaming performance of the M1 Ultra. I just watched a YouTube video testing the M1 Ultra w/ 48-core GPU running CS:GO in macOS, and I was shocked to learn that my 2019 iMac 27" with Core i9 and Radeon Pro Vega 48 performs better in that game... like, a lot better (50%+ higher fps with the same settings).
 
Not surprising, since hardly any games have been re-released as native Apple silicon binaries. I wouldn't judge performance based on anything that hasn't been compiled specifically for Apple silicon.
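If you want to check whether a particular game (or any app) actually ships a native arm64 slice rather than running under Rosetta 2 translation, a quick check along these lines works; it just shells out to macOS's lipo tool, and the app path shown is purely hypothetical:

```python
# Check whether a macOS binary contains a native arm64 (Apple silicon) slice.
# Relies on the system `lipo` tool; the example path below is hypothetical.
import subprocess

def has_arm64_slice(binary_path: str) -> bool:
    """Return True if the Mach-O file at binary_path includes an arm64 architecture."""
    result = subprocess.run(
        ["lipo", "-archs", binary_path],
        capture_output=True, text=True, check=True,
    )
    return "arm64" in result.stdout.split()

if __name__ == "__main__":
    app_binary = "/Applications/SomeGame.app/Contents/MacOS/SomeGame"  # hypothetical
    print("native arm64" if has_arm64_slice(app_binary) else "x86_64 only (runs via Rosetta 2)")
```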
 
Even if this thing could match the 3090 (a claim no one took seriously), remember that the 3090 came out a year and a half ago, and it's a modular component!

You're not stuck with it for the entire lifecycle of your machine.

You can upgrade when the 4090 comes out (which, right now, looks like September of this year, so about six months away).

Remember, too, that AMD and Intel both have plans to release updated GPUs that are competitive and more affordable.

And for the people saying "yeah, but fan noise and heat": you do know optical Thunderbolt 3 cables exist, right?

Put the computer in another room and put a fanless dock on your desk. Problem solved.

To me it's not worth getting screwed over and being stuck with a "studio" computer that can't be repaired or upgraded, ever!
 
I can see your point. But remember that new GPUs and CPUs often require better power supplies, new motherboards, and new RAM.

But I have a more technical point of view; I am not judging it as a customer. I like the route Apple has taken in being competitive at the high end, and their way of building CPUs and GPUs seems to have more headroom to grow and to beat the old x86 architecture with all of its add-in cards, upgradeable RAM, etc.
We are not there yet, but in three years' time Apple will have all-in-one solutions that beat the high-end x86 machines with their power-hungry, hot CPUs and GPUs, standalone RAM, and PCI slots.
 
Uhh, the 3090 came out in 1985...

And yes, this is a joke on my part. But it's a joke with a point!
The point being: you do not star in anyone else's movie but your own...
Everyone lives in a different world. You live in a world where "3090" means a recent GPU; I live in one where it means an old IBM mainframe. And people living in different worlds have different needs.

Yes, a Mac Studio (and probably every Apple product ever) does not meet your needs; we get it. I don't think the above comment was required just to convey that sentiment.
 
Your solution to a bulky, noisy and hot desktop setup is to stick it in another room...?

That said, I am not really optimistic about the 4090. Looking at current trends, Intel's and Nvidia's approach to improving performance seems to be simply beefing up their cores, which in turn leads to increased power consumption and heat. Some people may still accept, or at least tolerate, this in exchange for the performance gains, but it's a no-go for laptops at the very least (where Apple is offering superior sustained performance in addition to longer battery life).

This is the opposite of what Apple has done: they created a new architecture that offers comparable performance with a fraction of the drawbacks. And I believe this is how Apple plans to keep differentiating their offerings going forward, with their processors enabling unique form factors (and, in turn, unique experiences) that the competition cannot realistically match.
 
The Verge did not do this test correctly: Apple never specified the RAM in the test PC, and The Verge also used the wrong CPU. Most importantly, Apple never said anything about Geekbench.
 

The Ultra is a joke; it's not needed. It exists just so they can say it's faster and charge double. The M1 Max is enough for anything Macs can do, and the only reason a 3090 is needed is for gaming, and Macs can't game.
This has nothing to do with the power of Macs; it has to do with game developers. Also, some popular games such as Tomb Raider, Cuphead, Borderlands, and more are on the Mac.
 
It's been over a year since the M1's announcement, with multiple products released and nary a peep from the majority of AAA developers, so I've given up on Mac gaming in general going forward. The only reason my Intel iMac and MacBook Pros were decent gaming machines was Boot Camp. I've decided to lug around a Lenovo gaming laptop to go with my loaded MBP 16 M1 Max when traveling, since my M1 Max kills my MBP 16 i9 in Final Cut with hardly any heat or noise. But the gaming support is just plain pathetic and isn't improving anytime soon.
 
I suppose it is possible that, in the chart displayed in this article, Apple is saying an M1 Ultra at ~100 watts matches a 3090 at ~300 watts. However, pushing the 3090 to its near-500W maximum would allow it to pull significantly ahead, as the benchmarks run by The Verge showed.
That is exactly what the chart says. I don't get why everyone thinks the M1 Ultra beats a 3090 at full power; it seems obvious.
 
So people are still hung up on the fact that the M1 Ultra can't really beat a 3090?
The M1 Ultra was never meant to beat the 3090. However, it can certainly keep up with the 3090, but only up to a certain wattage. Obviously the 3090 is a beast of a GPU and can pull a lot more wattage; in fact, the 3090 can pull more power on its own than the entire M1 Ultra.

The M1 Ultra, along with the entire M1 (and now M2) family, is focused on efficiency. After all, the M1 Ultra is just two M1 Max chips joined together, and the M1 Max was designed to go into a laptop, which means it had to be designed from the beginning with thermals, battery life, and size in mind. Nvidia doesn't need to worry about battery life with their GPUs; even their mobile GPUs will only give you full power if you're plugged into a wall.

Let’s wait to see what the Mac Pro comes with.
 

Actually, Nvidia GPUs are the performance-per-watt kings. A 70W mobile Nvidia 3060 is roughly 3x faster than a MacBook Pro M1 Max 32-GPU and 2x faster than a Mac Studio M1 Ultra 64-GPU at Blender rendering.

7.23s - Nvidia 3080ti (GPU OptiX Blender Linux Mint 20)
8.21s - Nvidia 3080ti (GPU OptiX Blender Windows 10)
13.13s - Nvidia 3080 mobile (GPU OptiX Blender)
16.39s - Nvidia 3060 70W mobile (GPU OptiX Blender 3.0)
20.57s - AMD 6900xt (GPU HIP Blender 3.0)
29s - 2070 Super (GPU OptiX)
30s - AMD 6800 (GPU HIP Blender 3.1)
34s - M1 Ultra 20CPU 64GPU (GPU Metal Blender 3.1)
37s - M1 Ultra 20CPU 48GPU (GPU Metal Blender 3.1)
42.79s - M1 Max 32GPU (GPU Metal Blender 3.1 alpha)
48s - M1 Max 24GPU (GPU Metal Blender 3.1 alpha + patch)
51s - Nvidia 2070 Super (GPU CUDA)
1m18.34s - M1 Pro 16GPU (GPU Metal Blender 3.1 alpha + patch)
1m35.21s - AMD 5950X (CPU Blender 3.0)
1m43s - M1 Ultra 20CPU 64GPU (CPU Blender 3.1)
1m50s - M1 Ultra 20CPU 48GPU (CPU Blender 3.1)
2m0.04s - Mac Mini M1 (GPU Metal Blender 3.1 alpha + patch)
2m48.03s - MBA M1 7GPU (GPU Metal Blender 3.1 alpha)
3m55.81s - AMD 5800H base clock no-boost and no-PBO overclock (CPU Blender 3.0)
4m11s - M1 Pro (CPU Blender 3.1 alpha)
5m51.06s - MBA M1 (CPU Blender 3.0)
 
Cherry-picking much, are we? Blender is not optimised for the M1, and Blender is an edge use case anyway. Now go and compare your dGPU running optimised apps like Resolve, or even the badly optimised Adobe apps, and things look totally different. That is the kind of work Mac users do.
It is all about productivity, not benchmark results!
 
Actually, Nvidia GPU is the performance per watt king.

lmao

70W mobile Nvidia 3060 is 3x faster than Macbook Pro M1 Max 32GPU and 2x faster than Mac Studio M1 Ultra 64GPU on Blender rendering.

The CPU + GPU + RAM of that MacBook Pro uses less power than your 70W for the GPU alone.

Nvidia is frequently performance king, but they're worse than Apple, AMD, or Intel at performance per watt.
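For what it's worth, whether the 3060 or the M1 parts come out ahead on performance per watt in those Blender numbers depends entirely on what power figure you assign the Apple GPUs, which is exactly what's in dispute here. A quick sketch of the arithmetic (render times are taken from the list earlier in the thread; the wattages are placeholder assumptions you'd replace with powermetrics / nvidia-smi readings):

```python
# Performance per watt from the Blender render times quoted earlier in the thread.
# Performance is taken as 1 / render_time for a single scene; the wattages are
# placeholder assumptions, not measurements.
results = {
    # name: (render time in seconds, assumed GPU power draw in watts)
    "RTX 3060 mobile (70 W)": (16.39, 70.0),
    "M1 Max 32-core GPU":     (42.79, 40.0),   # assumed, not measured
    "M1 Ultra 64-core GPU":   (34.00, 80.0),   # assumed, not measured
}

for name, (seconds, watts) in results.items():
    perf_per_watt = 1.0 / (seconds * watts)   # renders per second per watt
    print(f"{name:24s}  {1000 * perf_per_watt:.2f}  (renders/s/W, x1000)")
```

With those particular placeholder wattages the 3060 still comes out ahead for this one Blender scene; plug in lower figures for the Apple GPUs (or count the whole PC's power draw on the Nvidia side) and the ranking flips, which is why the two posts above reach opposite conclusions.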

 
Unfortunately, this statement just helps illustrate how pathetic the Mac gaming options really are. I'm not saying anything's wrong with these titles ... but Cuphead comes free as one of the games you can play on the touchscreen of a Tesla these days. It's not exactly a "high end" gaming title.

Borderlands was a great game, but it's also not that demanding on graphics or even raw CPU power vs the latest action games on the market. And really, most "serious gamers" already played and are done with that game because it's been out long enough on other platforms.

Tomb Raider was one of the more impressive games to get native macOS support. I think they've done at least a couple of editions of it for the Mac over the years. But it's, again, sad that you can't point to many others like it that run on the Mac.
 
Pathetic to you, maybe. That is the problem with these gaming threads. Minecraft, WoW, Stardew Valley, Factorio, Terraria, Borderlands 2: every one of these is on the Mac, and I have well over 300 hours in each. The AAA gaming industry is getting far too ridiculous for my liking, with loot boxes, season passes, microtransactions, and half-*** releases where even my 3080 Ti struggles because they don't optimize well.
 
Do you play something that actually requires a 3080 Ti?
 
Nothing requires a 3080 Ti. Even the popular Elden Ring works fine on my GTX 1080. I have never once seen a high-end 20-series GPU in the recommended specs for a game.
 
Dying Light 2 does (if you have RT enabled).

I was just wondering why you'd even upgrade your 1080 to a 3080 Ti if nothing you play is sweating your 1080.
 
As I said before, I do video editing, so a 3080 improves that performance. I played Dying Light 2 on my 5700 XT, so no, it does not require a 3080 Ti. I don't find RT to be the end-all, be-all of gaming, so I never use it.
 
Is NVENC faster than Quick Sync? (As an aside, why in the world are you not using a Mac to do video editing?) A rough way to actually measure it is sketched after this post.

Ehh, you said it wasn't in the recommended listing, not the required listing. I think it and Cyberpunk 2077 are the only two games that currently recommend high-end GPUs for any meaningful ray-tracing settings.
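Since the NVENC vs. Quick Sync question came up above: a crude way to answer it for your own footage is to time the same clip through both hardware encoders with ffmpeg. A minimal sketch, assuming an ffmpeg build with both encoders enabled and a local test file named input.mp4 (both of those are assumptions, adjust to taste):

```python
# Crude wall-clock comparison of hardware H.264 encoders via ffmpeg.
# Assumes ffmpeg is on PATH with NVENC and Quick Sync support compiled in,
# and that input.mp4 exists; bitrate and settings are deliberately minimal.
import subprocess
import time

ENCODERS = ["h264_nvenc", "h264_qsv"]  # Nvidia NVENC vs. Intel Quick Sync

for codec in ENCODERS:
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4",
         "-c:v", codec, "-b:v", "8M", f"out_{codec}.mp4"],
        check=True, capture_output=True,
    )
    print(f"{codec}: {time.perf_counter() - start:.1f} s")
```

Encode speed is only half the story, of course; quality at a given bitrate differs between the two, so a fair comparison would also look at something like VMAF scores on the outputs. (On a Mac, the equivalent hardware path would be h264_videotoolbox.)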
 