
obviouslogic

macrumors 6502
Mar 23, 2022
277
437
Still don't understand why they didn't save the Ultra for the Mac Pro, because the Studio is a great deal compared to the new MP.

Not if you need faster storage and I/O solutions.

Thunderbolt has a maximum throughput of 5GB/s.
A single x16 PCIe Gen 4 slot is over 30GB/s.
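For a rough sense of that gap, here is a back-of-the-envelope sketch (assumptions: Thunderbolt 3/4's 40Gb/s link rate, PCIe 4.0 at 16GT/s per lane with 128b/130b encoding, per-direction figures; real-world payload throughput is lower on both):

```swift
import Foundation

// Back-of-the-envelope link-rate comparison. These are raw signaling rates per
// direction and ignore protocol overhead, so actual payload rates are lower.
let thunderboltGBps = 40.0 / 8.0                          // Thunderbolt 3/4: ~5 GB/s raw
let pcie4LaneGBps = 16.0 * (128.0 / 130.0) / 8.0          // PCIe 4.0: ~1.97 GB/s per lane
let pcie4x16GBps = pcie4LaneGBps * 16.0                   // x16 slot: ~31.5 GB/s

print(String(format: "Thunderbolt: ~%.1f GB/s | PCIe 4.0 x16: ~%.1f GB/s",
             thunderboltGBps, pcie4x16GBps))
```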
 
Last edited:

Scoob Redux

macrumors 6502a
Sep 25, 2020
585
894
UPGRADING from the M1 Studio??? How ludicrous. Outside of a tiny number of business motivations for tax purposes, literally no one is looking to upgrade a 1-year-old high-end desktop.
 
  • Like
Reactions: User 6502

obviouslogic

macrumors 6502
Mar 23, 2022
277
437
I'm waiting to buy the M3 Ultra Mac Studio.

I think Apple won't be able to shrink the die past 3nm, so new Mac products will hit a major slowdown after 3nm.

Not necessarily. More and more specialized co-processors can be added to take the burden off the CPU. So while testing just the CPU may not show huge gains, in real-world use there could be significant gains in areas where Apple feels performance would benefit.

An example of this (on the GPU side) would be adding hardware ray-tracing support, which would dramatically speed up 3D rendering. Metal already supports ray tracing; the hardware acceleration just needs to be added to the silicon.
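For anyone curious what that API looks like today, here is a minimal host-side sketch using Metal's existing ray-tracing support (assumptions: a Metal 2.3+ device that reports supportsRaytracing; the single hard-coded triangle is purely a placeholder, and intersection queries would then run in a shader):

```swift
import Metal
import simd

// Build an acceleration structure for one placeholder triangle on the GPU.
// On M1/M2-class GPUs this ray tracing runs on the compute units; dedicated
// RT hardware would accelerate the same API.
guard let device = MTLCreateSystemDefaultDevice(),
      device.supportsRaytracing,
      let queue = device.makeCommandQueue() else {
    fatalError("No ray-tracing capable Metal device")
}

var vertices: [SIMD3<Float>] = [SIMD3<Float>(0, 0, 0),
                                SIMD3<Float>(1, 0, 0),
                                SIMD3<Float>(0, 1, 0)]
let vertexBuffer = device.makeBuffer(bytes: &vertices,
                                     length: MemoryLayout<SIMD3<Float>>.stride * vertices.count)!

let geometry = MTLAccelerationStructureTriangleGeometryDescriptor()
geometry.vertexBuffer = vertexBuffer
geometry.vertexStride = MemoryLayout<SIMD3<Float>>.stride
geometry.triangleCount = 1

let descriptor = MTLPrimitiveAccelerationStructureDescriptor()
descriptor.geometryDescriptors = [geometry]

// Allocate the acceleration structure and scratch space, then build it on the GPU.
let sizes = device.accelerationStructureSizes(descriptor: descriptor)
let accel = device.makeAccelerationStructure(size: sizes.accelerationStructureSize)!
let scratch = device.makeBuffer(length: sizes.buildScratchBufferSize,
                                options: .storageModePrivate)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeAccelerationStructureCommandEncoder()!
encoder.build(accelerationStructure: accel, descriptor: descriptor,
              scratchBuffer: scratch, scratchBufferOffset: 0)
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

print("Built acceleration structure (\(sizes.accelerationStructureSize) bytes)")
```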

But yeah, I'm really curious what the M3 will bring to the table. I'll probably upgrade my M1 mini to an M3 Pro mini.
 

Mr. Dee

macrumors 603
Dec 4, 2003
5,990
12,833
Jamaica
Or perhaps not a single person bought it.
I thought Apple would have sent out review units. Snazzy Labs said theirs would arrive this week. It feels like the Mac Studio is the future of the Mac Pro. The 2013 Mac Pro is what's casting a shadow over the Studio; the idea of a compact pro workstation just seems too out there.
 

meboy

macrumors 6502
Apr 12, 2012
400
316
The M1 Ultra was at best 40% faster than the Max, and only in a few very specific situations. Otherwise the gains were single-digit in most other things.

Clearly there were issues with software not utilizing it well.

Almost everyone is better off getting the Max.

I'm not getting my hopes up that this has been rectified yet. You have to test it in many different real-life ways to really see its issues.
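A quick way to see why "twice the chip" doesn't mean twice the speed is Amdahl's law. The parallel fractions below are hypothetical, picked only to illustrate how a workload that doesn't scale across both dies ends up around that "40% at best" mark:

```swift
import Foundation

// Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of the
// work that actually scales across the extra resources and n is the resource ratio.
func speedup(parallelFraction p: Double, resources n: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

// Hypothetical parallel fractions; ~0.57 reproduces a ~1.4x (40%) gain with 2x the cores.
for p in [0.25, 0.50, 0.57, 0.90, 1.00] {
    let s = speedup(parallelFraction: p, resources: 2.0)
    print(String(format: "parallel fraction %.2f -> %.2fx with double the cores", p, s))
}
```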
 

k2director

macrumors regular
Jan 2, 2006
144
260
It's easy to get huge improvements when the starting point wasn't great. The M1 GPU really wasn't that good: good enough for general use, but nowhere near what professionals who need GPU power for their workflows need, especially at that price. The M2 still isn't adequate in that sense at its price point, and it's just insane that in the Mac Pro one can't install external GPUs.
When Apple makes changes, the reasons are often to support some long-term benefit that Apple is working towards. But the first year or two after the change often introduces hassles and inconveniences to certain classes of users. That’s the case with the new Mac Pro. It’s a great computer for people doing video production and audio who also need some PCIe slots. But it’s a step back for people doing 3D work, who need to install a big beefy 3D card. Now they can’t.

However, I guarantee you that Apple is working on another M-variant or something else with baked-in GPUs that are MUCH better at 3D than the current crop of Apple Silicon. When those chips start to arrive next year or whenever else, the performance will make many 3D users not care so much about adding a massive 3D card to their machine. Apple's new chips may not quite match the latest CUDA cards, but they will get close enough for a lot of people, especially given that Apple's performance comes in a small, quiet, power-sipping package that also runs macOS and plays well with Apple's growing universe of devices ... and no massive Mac Pro necessary!

The Mac Pro has become very much a specialty unit, and people wanting to do robust 3D work will still have to wait a little longer before Apple's long-term strategy fully plays out. But Apple is setting up a very compelling position for itself that does not involve having to buy a $7K machine and adding an expensive, clunky 3D card.
 
How many M2s are we going to get?

M2, M2 PRO, M2 MAX, M2 ULTRA, M2 HYPER!!!
I got something better and useful for you.

 

CarAnalogy

macrumors 601
Jun 9, 2021
4,403
8,054
I have to assume the same: Apple knows what it is doing. Again, it would have been far easier to simply discontinue Mac Pro in a PR release or quick verbal statement... perhaps as they rolled out M2 Studio as the new king of this hill.

Yes but I get the feeling there were more politics than usual involved here. Just a hunch, of course I don’t know for sure.

Remember there were buyers for the $1X,000 Apple Watch Edition too... the $1,000 monitor stand, the $700 wheels, the $20 handkerchief, socks, etc. They didn't make that stuff to not sell any of it. They generally know they have buyers. And the closer you get to their bread & butter products, the more likely they have some confidence that whatever they package will sell.

Definitely true but the Mac Pro is on the very periphery of core products.

I'm convinced Apple could box ANYTHING, price it (too) high and there would be buyers for it. Anything. Dirt, poop, air, water, pop tarts, shoehorns, salsa, etc. ;) Furthermore, I'm convinced if they boxed something like air, a segment of those buyers would eventually smother by running out of Apple air and refusing to breathe inferior, non-Apple air. ;);)

Steve definitely could. And with the way some people act about iMessage I think you’re right about the air.
 

eifelbube

macrumors 6502
May 15, 2020
432
369


Apple at WWDC surprised us with the M2 Ultra chip, which is the key feature in the new second-generation Mac Studio. We thought we'd compare it to the first-generation Mac Studio with M1 Ultra chip to see whether it's worth upgrading.


Compared to the base M1 Ultra, the base M2 Ultra has a 24-core CPU (up from 20 cores) and a 60-core GPU (up from 48 cores). It has the same 64GB unified memory and 32-core Neural Engine.

In a real world Final Cut Pro export test with 4K footage, a 13 minute clip exported in five and a half minutes on the M1 Ultra machine, and just over three minutes on the M2 Ultra. An export of an hour-long video took almost seventeen minutes on the M1 Ultra Mac Studio, and just eight and a half minutes on the M2 Ultra.

Over time, the M2 Ultra is going to save a lot of exporting time for video editing, and other GPU-intensive tasks will likely see the same time savings.

The Mac Studio with M2 Ultra we tested is equipped with an upgraded 2TB SSD, and in the Blackmagic Disk Speed Test we got read speeds of 5,455MB/s and write speeds of 5,100MB/s. With a 1TB SSD on the M1 Ultra machine, we were seeing read speeds of 1,853MB/s and write speeds of 5,092MB/s, so SSD speeds are up on the higher-end configurations at a minimum.

If you've already got an M1 Ultra Mac Studio, it may be worth upgrading if you want to save time on video exports and similar tasks, but if you're not going to take advantage of the increased GPU speeds, it won't be worth the $3,999 starting price for the entry-level M2 Ultra Mac Studio. The M2 Ultra Mac Studio is unquestionably the best desktop machine that Apple sells in terms of price and performance for professionals.

There is also an M2 Ultra Mac Pro, but it is priced starting at $6,999 for the same performance, with the addition of PCIe slots. What do you think of the M2 Ultra chip? Let us know in the comments below.

Article Link: Hands-On With the M2 Ultra Mac Studio
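As a rough sanity check on the export times quoted above (minutes are approximated from the article's wording, not re-measured), the implied speedups work out to roughly 1.8x and 2x:

```swift
import Foundation

// Approximate export times quoted in the article, in minutes.
let tests: [(name: String, m1Ultra: Double, m2Ultra: Double)] = [
    ("13-minute 4K clip", 5.5, 3.1),   // "just over three minutes" approximated as 3.1
    ("Hour-long video",   17.0, 8.5),
]

for t in tests {
    let speedup = t.m1Ultra / t.m2Ultra
    print("\(t.name): \(t.m1Ultra) min -> \(t.m2Ultra) min "
          + "(\(String(format: "%.1f", speedup))x faster)")
}
```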
Would be nice if we had a Studio M2 Max and M2 Ultra comparison on the basis of the FCP exports …
 

the_original_markoconnell

macrumors newbie
Mar 8, 2022
8
6



Article Link: Hands-On With the M2 Ultra Mac Studio
This thing is wicked fast with AE and Premiere. No regrets.
 
  • Like
Reactions: Apple Fan 2008

rp2011

macrumors 68020
Oct 12, 2010
2,406
2,774
Worry about Intel? Yes. Worry about Nvidia? Not a chance; take a look at their datacenter products.
Nvidia is a dead company walking. They are where Intel was five years ago. Qualcomm will take all that datacenter business with the chips developed from their Nuvia acquisition. That, along with the death of Intel and the crypto-mining scam, will end a chapter in computing. Goofy water-cooled power hogs are the bell-bottom pants of computing.
 
Last edited:

heretiq

Contributor
Jan 31, 2014
870
1,388
Denver, CO
It's easy to get huge improvements when the starting point wasn't great. The M1 GPU really wasn't that good: good enough for general use, but nowhere near what professionals who need GPU power for their workflows need, especially at that price. The M2 still isn't adequate in that sense at its price point, and it's just insane that in the Mac Pro one can't install external GPUs.
Hmmm .. @User 6502 🤔 Which "professionals" are you referring to? I don't recall protests from professionals disappointed with the performance of the M1 GPU in the devices they typically purchased -- i.e., MacBook Pros with M1 Pro/Max SOCs.

In fact, I recall the opposite: many reviews from professional photographers and videographers endorsing the M1 and extolling the speed (processing, rendering and I/O), efficiency and other virtues of Apple Silicon and hailing the return of the MacBook Pro.

Net-net: The overall conclusion I recall from "professionals" was that the M1 raised the bar for performance and efficiency substantially (so much that it triggered speculation about the potential beginning of the end for Intel). Are you referring to some other type of "professional"?
 

ChrisA

macrumors G5
Jan 5, 2006
12,649
1,795
Redondo Beach, California
What I really care about is how much of my time is saved. Video rendering is unimportant because I can do something else while the video renders. What I care about more is how long things take when I can't do something else. Typically, these are those little 2-to-10-second "lags" that are annoying but still so short that I can't go off and do something else.

I just bought an M2-Pro mini. It is much slower than the M2 Mac Studio but really, for most media edit jobs, the computer is waiting for me, not me waiting for the computer. For example, I am now typing a post to Mac Rumors. If my computer were 10 times faster, I would not be finished in one tenth the time.

What I'd like to see are "un-benchmarks" where an observer times a professional on everyday tasks and sees where they spend time. If they watched me working on 3D CAD design, they would see that most of the time it is me thinking about how to improve the design while turning the part around in 3D space looking at how it could be made lighter or stronger and easier to manufacture. The computer is now fast enough that my brain is the bottleneck.

It was not always this way. My first "real" computer was an IBM PC bought from IBM in 1982. It was slow. We waited for the screen to update, we waited for apps to load from floppy disks and when I compiled some code, I'd get up and take a break and drink some coffee and come back after 5 minutes to see how it was going. But those days are gone. My new M2 seems to wait for me.

How much of my time would be saved with an M2-Ultra, vs my M2-Pro? I do mostly software development, mechanical and electrical engineering (and of course web surfing and email.)
 
  • Like
Reactions: heretiq

ChrisA

macrumors G5
Jan 5, 2006
12,649
1,795
Redondo Beach, California
Because then they would be gatekeeping the chip behind a huge purchase price that far fewer people would want to pay. The Mac Pro solves a few specific needs and is obviously a bit of a placeholder for a higher class of computer than the Studio, but Apple can't actually make that computer yet. Or might have little interest in ever doing so.

Side note, we are living in the "best of times" for the much coveted "headless Mac". The mini is good enough for a huge group of folks, the Studio is good enough for a huge group of Pros, and the Mac Pro is good enough for some specific niches that the Studio doesn't satisfy. We still have a hole for true enterprise GPU performance or vast RAM needs, but Apple is still in a very good place right now.


Do you guys know what a "real" enterprise-class Nvidia GPU costs? Not an RTX 4090; the 4090 is not an enterprise-class GPU. It is a toy for rich gamers. People spend $6K to $12K on high-end GPUs. Some computers have two or even four of them installed. The cost of the Mac Pro is pocket change compared to the cost of the GPU.

If Apple released a "professional" computer that only supported gamer GPUs, people would say, "What are they doing? Who is this for?"

The Nvidia A100 is pretty much the industry standard for heavy compute tasks like Machine Learning. If you are just editing photos or even 8K video, the Apple M2 Max/Ultra series does that just fine. But editing no longer stresses consumer level GPUs. ML is the new thing and the A100 is what they all use.

Look at the high-end GPUs. One thing they lack is fans. They depend on the chassis for airflow. If you want to support them, you need a massive power supply and a massive and VERY loud fan.

You can buy them on Amazon.
amazon.com/NVIDIA-Tesla-A100...

People are actually buying HUNDREDS of these GPUs to run big jobs in reasonable time. Apple is simply not selling to that market. The built-in Apple Silicon GPU is fast enough for Apple's actual customers.
 

Sippincider

macrumors regular
Apr 25, 2020
174
415
I don’t understand why none of the reviewers seem to be addressing the issue directly. Haven’t watched the video yet but I see no mention of it in the text and people still clearly have questions about it.

Reviewer: This is hands-down the best Apple product we've ever tested!
Us: What about the whine?
Reviewer: We don't talk about that.
 

ElectricPotato

macrumors 6502a
Dec 13, 2018
756
2,077
Seattle
Because it seems like they don't actually want you to buy the Mac Pro; they are trying to phase that form factor out of existence.

If that was the goal, they would simply pull it. Problem solved, migration complete.

A plausible explanation is that it's a placeholder hardware developers can use to write macOS drivers for PCIe hardware. It would serve the same purpose as the early "developer" Apple silicon minis. When the real thing is released with an M3 SuperDuper, the hardware will be ready and tested for the platform.
 

TallManNY

macrumors 601
Nov 5, 2007
4,753
1,602
Do you guys know what a "real" enterprise-class Nvidia GPU costs? Not an RTX 4090; the 4090 is not an enterprise-class GPU. It is a toy for rich gamers. People spend $6K to $12K on high-end GPUs. Some computers have two or even four of them installed. The cost of the Mac Pro is pocket change compared to the cost of the GPU.

If Apple released a "professional" computer that only supported gamer GPUs, people would say, "What are they doing? Who is this for?"

The Nvidia A100 is pretty much the industry standard for heavy compute tasks like Machine Learning. If you are just editing photos or even 8K video, the Apple M2 Max/Ultra series does that just fine. But editing no longer stresses consumer level GPUs. ML is the new thing and the A100 is what they all use.

Look at the high-end GPUs. One thing they lack is fans. They depend on the chassis for airflow. If you want to support them, you need a massive power supply and a massive and VERY loud fan.

You can buy them on Amazon.
amazon.com/NVIDIA-Tesla-A100...

People are actually buying HUNDREDS of these GPUs to run big jobs in reasonable time. Apple is simply not selling to that market. The built-in Apple Silicon GPU is fast enough for Apple's actual customers.
Excellent analysis. I don't really know about that level of GPU usage, but it is what I meant: the Mac Pro doesn't really serve folks whose needs can only be satisfied with a truly powerful GPU. The current Mac Pro solves a few issues that the Studio does not, but you have to really need that stuff to pay another $3,000 over the Studio. I'm glad the Mac Pro exists for now, but it doesn't seem set up for folks who need 1TB of RAM or the type of GPUs you are talking about. It also doesn't meet the needs of high-end gamers, but that story is no different from any other point in Apple's history.
 
  • Like
Reactions: spaz8

chucker23n1

macrumors G3
Dec 7, 2014
8,612
11,424
I don’t understand why none of the reviewers seem to be addressing the issue directly.

Jason Snell did.

I’m happy to report that Apple has rejiggered the cooling system in the Mac Studio. I could only hear the fan blowing when I turned the Mac Studio around so that its vents were pointing right at me, and even then, it was pretty quiet. When I properly oriented the computer on my desk, I couldn’t hear the fan. I placed my M1 Mac Studio on a nearby table and could still hear it blowing, in fact.

I wouldn’t call the M2 Mac Studio silent, but it’s noticeably quieter than the M1 model, and if you were to keep it on top of your desk, you probably wouldn’t hear it.

And Apple’s specs say the noise has gone from 15dB on the M1 to 6dB on the M2. That’s quite a difference.
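Taking those quoted figures at face value, a drop from 15dB to 6dB is larger than it may sound. Treating the numbers as sound power levels, a minimal sketch of the linear ratio:

```swift
import Foundation

// Decibels are logarithmic: a power ratio is 10^(delta/10). Using the figures
// quoted above (15 dB for the M1 Studio, 6 dB for the M2 Studio).
let before = 15.0
let after = 6.0
let powerRatio = pow(10.0, (before - after) / 10.0)   // ~7.9x less sound power

print(String(format: "%.0f dB -> %.0f dB is roughly a %.1fx reduction in sound power",
             before, after, powerRatio))
```

As a rule of thumb, a 9dB reduction also sounds close to half as loud to the ear, which lines up with the "noticeably quieter" description.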
 

name99

macrumors 68020
Jun 21, 2004
2,313
2,175
No conspiracy theories. At 2nm, or perhaps 1nm or somewhere thereabouts, the laws of physics will end whole-number steps (as it is now, 5nm is not actually 5nm and 3nm will not actually be 3nm, but the game of measuring things differently from reality is on and is not easily altered now). So in the little room, real or spun, that will be left, TSMC or their partners will jump to angstroms or something else to create a lot of whole numbers to resume this "meaningful leap" game again.

Yes, then we'll be working in fractions of whole numbers but by a simple word change, the whole numbers thing works again for some number of years. Yes, the benchmark gains won't be as tangibly dramatic as now but when measured against the prior "leap" (of this fraction of a whole) they will still show as significant percentage gains in generation vs. generation (at this measuring stick level).

And then we'll have 10 or 20 years before we run out of that split hair gauge and if we ever manage to reach some absolute (actual) minimum, TSMC and/or their partners will just move on to some other "very important" measure of computing power gains and then move us consumers to shift to viewing that new measure as some gauge of ever-improving computers.
Spoken like someone who doesn't have a clue about the actual engineering, and who thinks words matter more than the engineering.

No one in this space GIVES A FSCK about the words that are used. What matters is the upcoming technical details: first GAA, then BSPDN, then ForkFET, then clock distribution on the backside, then CFET. Every one of these is a huge, important, and difficult step forward. And they will be marked by a succession of names like A24, A20, A18 or whatever.

There is NOTHING stopping you from learning enough about the technology to be part of this great adventure, to understand that the significant step behind each of these new names is whatever it is: a new type of transistor, a new way to deliver power, or whatever. But you don't want to do that, do you? You'd rather sneer about "marketing" and how you're not smart enough to be "fooled" (like TSMC cares what you think? Are you buying space in TSMC fabs?) than spend a day educating yourself about one of the most interesting adventures of our time.
 
  • Like
Reactions: dgdosen