Almost every YouTube review says that if you are going to buy a Mac Studio desktop you should buy the stock 32GB model and not the Ultra models, both because of the cost of the processor upgrade and because the performance gain is small or sometimes non-existent in benchmark tests.

Reviews of the base $1900 Studio are favorable for the unit itself; it's needing to buy a $1600 monitor to get a 5K HDR screen that bothers a lot of reviewers.

And the graphics speed of the M1 Ultra is pretty good, but it's not near the performance of a Radeon card, except in having lower power usage.
I guess you don’t use Logic? In the latest benchmarks,
M1 Ultra gets 311 tracks
M1 Max gets 195 tracks

That’s a massive difference.
 
Fun part: using that Xcode benchmark project and building via a shell script takes about 93 seconds on a 16-inch MacBook Pro with the M1 Max. However, it takes 67 seconds when doing the same build inside Xcode 13.3. That would make my M1 Max as fast as the M1 Ultra, and I don't believe that is the case, because there are compilation test results from clang, and those are basically twice as fast on the Ultra, which makes sense as the Ultra has twice the CPU resources. So there seems to be some issue on the software side with the xcodebuild command and the maximum number of threads used during compilation.
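For what it's worth, one way to test the max-threads theory is to pass xcodebuild's -jobs flag explicitly (it caps the number of concurrent build operations) and time the build; if the script path is throttling concurrency, pinning it to the core count should bring the time back in line with the in-IDE build. A minimal Swift sketch, where the project and scheme names are placeholders and not from the post:

```swift
import Foundation

// Hedged sketch: force xcodebuild to use every core and time the build.
// "Benchmark.xcodeproj" and the "Benchmark" scheme are placeholders.
let jobs = ProcessInfo.processInfo.activeProcessorCount   // 10 on M1 Max, 20 on M1 Ultra

let build = Process()
build.executableURL = URL(fileURLWithPath: "/usr/bin/xcodebuild")
build.arguments = [
    "-project", "Benchmark.xcodeproj",
    "-scheme", "Benchmark",
    "-jobs", String(jobs),    // maximum number of concurrent build operations
    "build"
]

do {
    let start = Date()
    try build.run()
    build.waitUntilExit()
    print("xcodebuild finished in \(Int(Date().timeIntervalSince(start))) s with \(jobs) jobs")
} catch {
    print("Failed to launch xcodebuild: \(error)")
}
```

If the scripted build still takes ~93 seconds with -jobs pinned to the core count, the bottleneck is probably elsewhere (I/O or target dependency ordering) rather than the thread cap.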
 
Well, the forum is having its own performance issues at the moment; the same post went up twice.
 
Why does Apple refuse to put the Ultra chip in a $900 32-inch iMac and make it thinner than the 24-inch iMac? Oh, why does Apple hate their customers so much?!?!?!?

More seriously, how is Apple going to build the 40-core version coming in the new Mac Pro in June?
An M2 Ultra with 20 cores, with the option of a 40-core M2 Ultra Max ;) Purely guessing, lol!
 
I guess you don’t use Logic? In the latest benchmarks,
M1 Ultra gets 311 tracks
M1 Max gets 195 tracks

That’s a massive difference.
I use Logic, but that's not a very useful benchmark to me. What effects are on the tracks? What instruments? That would be more informative.
 
To present to any TV or projector made in the last 15 years.

Adding ports adds cost. Apple has data on how often people use ports, and that helps make the decision on whether the added cost is worthwhile.

HDMI may have been around forever like RS232, but just like RS232 it appears people aren’t using it very often on Macs anymore.
 
Adding ports adds cost. Apple has data on how often people use ports, and that helps make the decision on whether the added cost is worthwhile.

HDMI may have been around forever like RS232, but just like RS232 it appears people aren’t using it very often on Macs anymore.
That’s my point. People ARE using it. HDMI allows presentation to nearly any HDTV or monitor made in the last 15 years. And the few high-end Thunderbolt-only displays can also be connected.

I am happy my 16-inch M1 Pro has HDMI. My 2.5K monitor has HDMI and DisplayPort; the DisplayPort stopped working two years ago, but the HDMI works with anything I plug into it.
 
Adding ports adds cost. Apple has data on how often people use ports, and that helps make the decision on whether the added cost is worthwhile.

HDMI may have been around forever like RS232, but just like RS232 it appears people aren’t using it very often on Macs anymore.
HDMI is plugged into my MacBook right now. It's probably the only port I use besides MagSafe. My projector and TV don't take USB-C, nor do any others. My screens at work have USB-C but, ironically, only work with HDMI from a Mac.
 
Are there any benchmarks on Neural Engine performance?

A lot of the benefits of the M1 architecture are the heterogeneous computing modules (neural engines, media engines, etc.), but all the benchmarking I see is plain-vanilla CPU/GPU testing.

I’m not aware of any, and I'm not sure what you would compare it to if one existed. It can do 22 trillion operations per second with 32 cores. Interestingly, the M1 Ultra actually has 64 Neural Engine cores, but half of them are disabled right after the operating system boots. There is a theory that they might be enabled later to facilitate ray tracing acceleration; if that happens in a future release of macOS, one could look at ray tracing acceleration benchmarks.
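There's no standard cross-vendor ANE benchmark that I know of, but you can get a rough feel for it yourself with Core ML by timing the same model under different compute-unit settings (note that .all only makes the Neural Engine eligible; Core ML decides per layer where work actually runs). A minimal Swift sketch, where the compiled model path and the input provider are placeholders:

```swift
import CoreML
import Foundation

// Rough sketch, not an official benchmark: time repeated predictions with
// different MLComputeUnits settings. The model URL and input are placeholders
// for whatever Core ML model you actually care about.
func timePredictions(_ units: MLComputeUnits, modelURL: URL,
                     input: MLFeatureProvider, runs: Int = 100) throws -> TimeInterval {
    let config = MLModelConfiguration()
    config.computeUnits = units                 // .all lets Core ML schedule work on the ANE
    let model = try MLModel(contentsOf: modelURL, configuration: config)

    let start = Date()
    for _ in 0..<runs {
        _ = try model.prediction(from: input)   // same input each run; we only care about timing
    }
    return Date().timeIntervalSince(start)
}

// Usage (placeholders): compare ANE-eligible vs CPU-only timings.
// let url = URL(fileURLWithPath: "Model.mlmodelc")
// let withANE = try timePredictions(.all, modelURL: url, input: someInput)
// let cpuOnly = try timePredictions(.cpuOnly, modelURL: url, input: someInput)
// print("ANE-eligible: \(withANE) s, CPU-only: \(cpuOnly) s")
```

The gap between the two timings only tells you how much a particular model benefits from the ANE on your machine, not how the chip compares to anything else, which is the original problem with benchmarking it.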
 
HDMI is plugged into my MacBook right now. It's probably the only port I use besides MagSafe. My projector and TV don't take USB-C, nor do any others. My screens at work have USB-C but, ironically, only work with HDMI from a Mac.

I put pepper on my cereal, but I don't expect Kellogg's to do it in the box.

Just because you use HDMI doesn't mean it's popular among Mac users anymore. There are plenty of Thunderbolt-to-HDMI adapters; there's no need to add it to Macs and jack up the price.
 
What you really need to know is this: there's a chip called M1 Max, and there's a chip in the same family that is better than that one.

In other words, Apple still can't get its branding perfectly in order, even for a brand new product line.

For this reason I thought the Ultra should have been named M1 Evo to indicate that it's a whole new evolution of the M1 family. That would have allowed the Max to legitimately retain its maximum status in the lower tier. As it stands, they just made the entire naming strategy a total mess with one stroke of a pen.
 
Besides common 4K computer displays, it's used with VTC/projector setups at work or in one's home theater as an additional content source. Definitely useful.

Exactly. External displays on the desk, the TV, and projectors at presentations.

Like, the odds that you're at a client and they have a monitor that takes DisplayPort are low. The odds that it takes USB-C are near-zero. The odds that it takes HDMI? Very, very high.
 
But, until now, you could do all of it with a Mac. That’s the reason you could see Macs outside design studios.

That’s the whole point. A Mac’s utility for the overall professional market went downhill with this transition, so Apple could increase their margins.

I get that; it's their job. I complain because it affects me negatively.
Just ‘so Apple could increase their margins’ is obviously a narrow look at the big picture.
First, obviously that's a goal.
Secondly, though, and far more importantly, they have created a product which blows away much of the competition in the fields Apple sees as the Mac's core market: the creative arts, specifically video, photo, DTP, music, and illustration.

A great deal of work has gone into specific parts and capabilities, beyond just doing it to try and increase their margins (and I suspect they're spending a substantial sum on development to get to these better margins).

It's true, we have lost Windows compatibility, which was needed to run various software that doesn't support the Mac or is less well supported on the Mac. But at that point, aren't you a Windows user, if your key software is Windows-based?

Windows compatibility was a selling point of the Mac before Apple turned into what they are now. They believe they no longer need this pillar of support to move forward with their own agenda.
 
For this reason I thought the Ultra should have been named M1 Evo to indicate that it's a whole new evolution of the M1 family.

It isn't, though. It's really just an M1 Max Duo.

(My main quibble is that mentally sorting them is hard. The M1 is the default, OK. But then the Pro is the high-end, right? No, the Max is even higher. But the Ultra is even higher than the Max. So the Max isn't actually… max?)

 
It isn't, though. It's really just an M1 Max Duo.

(My main quibble is that mentally sorting them is hard. The M1 is the default, OK. But then the Pro is the high-end, right? No, the Max is even higher. But the Ultra is even higher than the Max. So the Max isn't actually… max?)
Any suggestions as to how Apple could label them? Apple likes name monikers; it would be easier with something like M1-3, M1-5, M1-7, M1-9, but they probably don't like the Intel/AMD association. I don't mind M1, Pro, Max, Ultra as such, but as you say, they make no sense. In the old days, they used "Good, Better, Best" on the Apple Store, and they would need an equivalent of that. M1, Enthusiast, Prosumer, Professional, perhaps. (Yes, also awful!)
 
Any suggestions as to how Apple could label them? Apple likes name monikers; it would be easier with something like M1-3, M1-5, M1-7, M1-9, but they probably don't like the Intel/AMD association. I don't mind M1, Pro, Max, Ultra as such, but as you say, they make no sense. In the old days, they used "Good, Better, Best" on the Apple Store, and they would need an equivalent of that. M1, Enthusiast, Prosumer, Professional, perhaps. (Yes, also awful!)
I think it makes sense when you consider it as a whole. The M1 Pro is indeed a powerful chip, suitable for plenty of professional workloads. The Max is, as it says, the absolute ultimate that they offer. As the Ultra is just 2x the Max, it's not really a new SoC; as others have stated, perhaps something like Max Dual or Max Duo might have been clearer. But I also think they wouldn't want the association with past naming schemes, and it's not quite accurate to call it dual or duo anyway.
 
Adding ports adds cost. Apple has data on how often people use ports, and that helps make the decision on whether the added cost is worthwhile.

HDMI may have been around forever like RS232, but just like RS232 it appears people aren’t using it very often on Macs anymore.
Literally millions of people are still using HDMI. It is not an antiquated legacy port. As with the guy who complained last October about the new MBPs coming with so many "unnecessary" ports, I don't know what world you're living in where HDMI is not still current and relevant, but it's certainly not mine. Millions of display devices connect via HDMI. It's the #1 method worldwide of connecting a display or projector to a video source.
 
Any suggestions as to how Apple could label them? Apple likes name monikers; it would be easier with something like M1-3, M1-5, M1-7, M1-9, but they probably don't like the Intel/AMD association. I don't mind M1, Pro, Max, Ultra as such, but as you say, they make no sense. In the old days, they used "Good, Better, Best" on the Apple Store, and they would need an equivalent of that. M1, Enthusiast, Prosumer, Professional, perhaps. (Yes, also awful!)

It's tricky.

They could do characters. M1 becomes M1A, M1 Pro becomes M1B, M1 Max becomes M1C and M1 Ultra becomes M1D. Makes it very clear which one is the beefiest. Kind of nerdy, though.
 
Nothing disappointing about compile times:

Mac Pro 2012: 230 seconds
27-inch iMac (i9 3.6 GHz 8-core): 167 seconds
M1 8-core: 130 seconds
M1 Pro 8-core: 110 seconds
M1 Pro 10-core: 98 seconds
M1 Max 10-core: 90 seconds
M1 Ultra 20-core: 67 seconds


Obviously I/O and the maximum number of cores Xcode can utilize impact this benchmark, but it's still a monster. Wondering if my 16-inch MacBook Pro M1 Max 10-core was a mistake?
The M1 Max in your MBP wasn’t a mistake; do you mean your purchase may have been?
 
Literally millions of people are still using HDMI. It is not an antiquated legacy port. As with the guy who complained last October about the new MBPs coming with so many "unnecessary" ports, I don't know what world you're living in where HDMI is not still current and relevant, but it's certainly not mine. Millions of display devices connect via HDMI. It's the #1 method worldwide of connecting a display or projector to a video source.
Agreed.

It's also not going anywhere; every new TV and nearly every new non-Apple monitor has an HDMI input, and there is no replacement even vaguely on the horizon.
 
I guess you don’t use Logic? In the latest benchmarks,
M1 Ultra gets 311 tracks
M1 Max gets 195 tracks

That’s a massive difference.
I’ve seen the numbers. What I have also seen is a lot of photography and video YouTubers showing how long it takes their M1 Ultra to process a 4K video with color corrections and graphic overlays, and instead of a huge time reduction (remember, this is compared to a Studio Max, not an Intel-based computer of any kind), the time saving they are getting is a few minutes at best instead of a 2x or 3x speed increase. And the Ultra has 64 GB of memory and literally twice the number of CPU and graphics cores of a Studio Max. They were expecting a much bigger difference for something that costs at least $2000 more than the Studio Max.

The Studio Max showed a huge improvement over previous-generation Mac desktop computers, but not over the current MacBook Pro laptops with the same amount of memory; the performance was about equal there.

Once more, I am not the one making these claims; they come from YouTubers with regular channels whose emphasis is on higher-end video and photography. Do a search for Studio Max and you should find lots of them, and if you think they are wrong, argue with them. Personally, I am looking for something for my amateur photography, and my 2015 iMac (actually purchased in early 2016) gets incredibly slow if I have 6-10 pictures being edited or corrected in either Affinity Photo or Lightroom. That's why I'm looking. Spending $2000, not locking up my computer, and finishing work in 20 minutes instead of close to an hour is a better deal than spending $4000 and finishing in 18 minutes. The two-minute improvement isn't worth two grand to me.

And because I still need to buy a monitor and probably an external drive, I may not be buying a new Apple computer at all.
 