I think over the past decade Apple’s attention and support of their “Pro” customers has been somewhat chaotic.

That's fair.

For example, their decision to give all MacBooks the butterfly keyboard contributed to me feeling unable to buy any MacBook Pro for years. (Other factors include poor port selection and poor Intel CPU offerings.) Could you buy a thicker Apple laptop with a better keyboard and better port selection? Nope. Take it or leave it.

But, they corrected that. Since the M1 series of Macs, I've been agreeing with most of their decisions.

Currently I see the problem as having more to do with the transition to Apple Silicon and a huge misstep in their plans for it; that being the use of UltraFusion to create a 4x SoC variant. The rumors and evidence were there for it being real, but I think they learned the hard way that it just wasn't going to be feasible, either due to cost or performance issues.

Sure, but this is only relevant for probably far below 1% of Mac customers. Granted, that's partially because many of them have already left in the past, but nevertheless.

Which means 3 years ago they had to start over. This left them shoving an Ultra 2 into the Mac Pro and calling it a day. I seriously doubt this was Apple’s game plan for the Pro.

Yep. I don't think they're happy with the 2023 Pro.

 
The Mac community, besides being full of the usual toxic-fandom, has some silly notions.
It's toxic at times, I agree. But I wouldn't call it "fandom". Actual fandom would look different.
In the PC world, archaic as it can be, no one whines about "mismatched" Lenovos because Lenovo offers PCs with either AMD, Intel, or Qualcomm processors. Even different generations from the same manufacturer.
Hah, great point!
Jury's out for M4 Ultra chips on Apple Watch 11 though.
No, but look out Watch Ultra 3!
But I also don't know anything about running a company with a 5 trillion dollar market cap.
Market cap isn't all that and a bag of chips. It's only a snapshot of a moment in time.

Case in point: AAPL's market cap today is 3.3 trillion.
"Who will ever need more than 640 KB RAM?"
I came here to say that for the same reason, but you beat me to it!
 
The lifecycle of a pro machine is likely longer, but sales lower.

With that in mind it might make more sense to release Ultras every other year:
- less development cost
- a bigger performance boost, highlighting the value of an upgrade (for a company this is maybe not so emotional but an actual calculation)
 
That would be a shame.
It would definitely be, but they neglected the pro market for a very long time before, and most people I know switched to HP workstations, which were heavily promoted by Adobe back then.

The cost/performance difference between a Mac Studio and a Mac Pro is quite steep. And as far as I know you can’t get a new motherboard into the Mac Pro to replace it with faster chips when they come out.

I don't know anybody who's bought a Mac Pro recently, while I knew quite a few in the past.
 
...Apple is reluctant to develop an M4 Ultra chip from scratch due to production challenges, costs, and the relatively small sales volume of its desktop computers, like the Mac Studio...
The irony... complaining about low sales volume while ripping off customers with highway-robbery prices for consumables like memory and storage, and then wondering why no one is buying the machines anymore. Outrageous prices for consumables aren't new with Apple; what's new since the M generation is that there is no longer a way to get these consumables from third parties, effectively forcing customers to swallow Apple's prices or go somewhere else entirely. And that's what's happening with the Pros. They would rather get a PC they can customize to their heart's content, without paying inflated prices. After all, they have a business to run and price/performance matters.

Apple has a business to run too, but they choose to run it to the maximum benefit of the shareholders, not their customers. With a bean counter at the helm, this isn't surprising.
 
The best match for productivity and hobby use would probably be one of the new Mac Minis with a third party screen. I've got a 40 inch ultrawide screen (Thinkvision P40w-20), and for my use it's better than anything Apple has made. As a bonus, I can just plug in my customer provided laptop when I need more than just Office 365 there.
He's leaning towards a Mac Mini Pro with a 3rd party screen.
 
The counterargument is maybe Mac Pro sales are so rare because it’s not a good value compared to the Studio.

I agree with this sentiment. IMO Mac Pro sales are so low because Apple made them that way. The 1,1 - 5,1 and 7,1 Mac Pros weren't just about absolute power. They were also about expandability / configurability post purchase.

There certainly must be some sort of mount to slap a Mini onto the bottom or back of a large display, creating a sort of backdoor, aesthetically questionable AIO.

Googling turned up this: https://www.etsy.com/listing/1809425516/apple-studio-display-mac-mini-m4-mount
Uh, no :)
 
It's true that you don't know if the rumors are correct. I have a feeling many of these people are just guessing, or they get bad information. It's possible Apple could have an iMac in some test lab, and leakers take that as a sign it's going to market when it's really not.

I mentioned there are rumors of a larger iMac. However those rumors appear to be more guessing / wishful thinking than something I would recommend he hold out for. That said he's not in a rush so waiting a few months wouldn't be a problem for him.

I don't see how anyone would be stranded. Granted, you might not get exactly what you want, but Apple does make the Studio Display, which is compatible with both the Mini and the Studio. While there may be some aesthetic differences, the functionality is the same. There is a big argument against all-in-one computers: once the computer is out of date, the monitor goes in the trash. This isn't exactly ideal for the environment, or for consumers having to buy both items combined. I agree it looks better, though.
Agreed about the screen being a part of the computer. Ironically the hardware is sufficient for his needs, the problem is lack of OS support which is starting to drive lack of application support. He has considered the Studio Display but I think he's leaning towards third party monitors.
 
I mentioned there are rumors of a larger iMac. However those rumors appear to be more guessing / wishful thinking than something I would recommend he hold out for. That said he's not in a rush so waiting a few months wouldn't be a problem for him.
Yeah, you never know but there’s no concrete rumors that I’ve seen that had designs and stuff like that. Usually, we see more information if it’s coming out soon.

Agreed about the screen being a part of the computer. Ironically the hardware is sufficient for his needs, the problem is lack of OS support which is starting to drive lack of application support. He has considered the Studio Display but I think he's leaning towards third party monitors.
I think someone mentioned a boot loader that would allow running the current version of macOS, but I've never done that myself, so I can't speak to it.

If he goes with a third-party monitor, there's a big downgrade in how it looks. That may not bother some people, though. It kind of sucks that I bought an iMac, so I know what it can look like. A Mac mini is really cheap when you combine it with an affordable monitor; add the Studio Display and it's not so cheap.
 
Not sure how you arrived at that conclusion. I’m pretty sure that SoC is the whole reason why Apple Silicon performs so well. Do you mean, just for the extreme high end?

This isn't really a fault of architecture or manufacturing difficulty; it's a product of having a great design that eventually outmodes the initial plan. They were expecting to need all these tiers of performance, such that they made space for the Ultra architecture, but didn't foresee how well-suited "Max" and even "Pro" chips would be for the vast majority of workloads in the pro segment.

So now, any delay in the release of new variants that fit the original plan presents as an engineering problem.

I’d be pretty willing to bet that sales of the “studio” are low, because there just isn’t that high demand for the performance.
It doesn't matter whether it's an SoC or not. It's just how you design the chip. But SoCs are meant for mobile-sized chips, not desktop-grade chips.
 
Pro machines tend to stay in service for years, not always ride the bleeding edge.
Why talk about the 'pro' market and then complain it isn't behaving like high end gamers want?

The M2 Ultra still outperforms the M4 Max in various benchmarks. I think we can safely assume the M3 Ultra will beat that. It will also bring other advances, like the hardware ray tracing I have been hot for.

And my M1 Ultra (soon to be for sale) still behaves quite nicely for many tasks.
 
The Mac Pro has few reasons to exist, as I see it.

On tower workstations, the space is used for powerful GPUs (multiple) and lots and lots of memory: you can get them with terabytes of memory(!). Neither of those is possible on the Mac Pro.

You can add some special connectivity cards if Thunderbolt 5 and 10 GbE is insufficient, but that's a tiny fraction of a tiny segment.

The Studio allows one, and only one, internal storage drive. There are folks who need more than 8 TB of internal storage; that is more than a 'tiny fraction'. Even if Apple comes up with an even larger, more expensive 16 TB option (a deeper round peg for the same square hole), it is still one drive (and a single point of failure).

Apple acknowledged that having only one internal drive was an issue back in April 2017. That issue didn't go away, and TBv5 doesn't make it go away. (e.g., PCIe x4 SSDs have only gotten substantially faster since 2017; TBv5 is mainly just 'treading water' against that bottleneck.)
 
i think maybe something we’re forgetting about here is that engineering resources have been diverted to server chips in the wake of the AI boom.

That really shouldn't be having a deep impact. Chip design has a 3-4 year pipeline. The "AI hype fad" of 2023 would lead to a chip coming in 2026-27... which is...

" ... At least a trio of companies are believed to be involved with the chip. Apple is said to be handling the overall design of the chip, while Broadcom is said to be providing some networking technology for it. TSMC is expected to begin mass production of the chip in 2026, using its third-generation 3nm process, known as N3P ..."


Whatever 'oversubscription' of design resources happened 3-4 years ago would have been the R1, not the server chip.

There is little indication that what Apple is deploying in the near term is anything more than different logic boards built around the mainstream Mac SoCs for these server deployments.

The core issue that the "lack of time" framing overlooks is that Apple had to make bets on the M3 before the M1 ever shipped. The M3 was in flight without knowing at all how the transition was going. (Made even more opaque by layering the pandemic on top of the transition, and by TSMC N3 being a substantive leap forward in fab-process risk.)


They’re not going to divert from the flagship line (which is really only the pro and max), so the low-sales specialty tier (Ultra) is what suffers.

Again, that's not quite what is occurring at the tactical level now. For the M1 and M2, the Pro shared a very high degree of layout/floorplan with the Max (same CPU count, same NPU, same top 'half' of the GPU, same top 'half' of the memory channels).

With the M3 that changed. The M3 Pro has six E cores; the M3 Max has four E cores. The Max has more P cores and generally just substantially more silicon thrown at it.

What's missing is that before the Mac transition, Apple had basically two A-series chip dies. The A__X designs would periodically skip a generation.

The plain Mn die basically takes the place of the A__X (iPad Pro and entry Mac use the same SoC). So the Pro and Max were additional work overhead. As noted before, the M1 and M2 maximized that shared R&D for those two.

With the M1 Ultra, Apple was likely shooting to get folks to understand the move from the large-screen iMac to the Studio. The M2 generation saw the Mini get a Pro (moving volume to support a future diverging Pro design) and the M2 Mac Pro (which was in class to supersede the common Intel 16-core + W5700X configuration).

The M1/M2 Max being shared with the laptops is also risk mitigation. If the 14"/16" Max didn't sell well, the Max SoC could be consumed by the Studio (and later the Mac Pro in the M2 generation). If the 14"/16" Maxes sold extremely well, they could still supply the Studio, and even more of the overhead cost of UltraFusion would be paid down by customers who couldn't possibly use it. (So Studio Ultra sales landing below or above expectations would matter less.)

“Not every generation will have an Ultra” then reads a bit like revisionist commentary on product strategy. AKA, it’s a convenient explanation that makes what is happening seem like it was the plan all along.

Probably not. There's a very good chance the M2 Ultra (which was necessary to launch a credible Mac Pro) wasn't targeted until the end of 2022. Remember, Apple said a two-year transition (so either June 2022 or December 2022), and the M2 Ultra needed to make that. An M3 Ultra was very unlikely. If N3 had gone perfectly to plan in 2H 2022, they would only have gotten the M3, Pro, and Max out the door in time for that deadline; an Ultra wouldn't have made it at all. Given that N3B had even more hiccups than expected, not a chance.

By the M3 generation it would have been time to see if the desktop Max/Ultras could pull their own weight without subsidies from users who cannot possibly use the UltraFusion connector. Subsidizing during a risky transition phase with radical 'change' products (no screen in the large-iMac performance zone, no third-party GPU cards) makes lots of sense. Subsidizing forever, on every generation... not so much.


I’d bet that, under the hood of top-level marketing and strategy goings-on, in the wake of making a killer product that redefined the boundaries of necessity, they’re trying to adjust what all the performance breakpoints should be, as well as the corresponding naming.

I wouldn't take that bet. I think the primary objective was to make the Max an Ultra 'killer' all along. Putting a "Mac Pro 2019" into a 16" laptop was always the lead goal. The M4 Max is largely that (minus some fringe corner cases at the top end of the price scale, where the MBP is relatively far cheaper).

The AI server thing is likely going to look somewhat like a Google Tensor chip that Broadcom helps them with.

Cloud Tensor Units.

These "server chips" have HBM (not RAM DIMMs) and high connectivity built in. It isn't trying to mimic AMD EPYC or Intel Xeon 6 chips. More likely, the Broadcom connectivity will displace all of the Thunderbolt stuff that Apple puts into the laptop/desktop units. It will look more like a data center GPU (next to no, or zero, display out)... which really won't be viable for a Mac Pro.

Taking the display controllers and Thunderbolt out of a Max/Pro die really wouldn't be too hard. There are options for attaching the Broadcom stuff, e.g. via UCIe (a standard interconnect that will be quite viable in 2026). Apple could reuse their "poor man's HBM" memory with LPDDRx if they want to shave costs there too. Changing the ratio of die space for NPUs versus CPUs shouldn't be a huge stretch.


So what we get in the M3 Ultra Studio is just something easy, based on an "older" architecture, to satisfy that high-end specialty segment. At the end of the day, an M3 Ultra is still going to belch out those exports faster than an M4 Max, and run large language models that an M4 Max can't.

There was never any economic sense in Apple throwing Ultra-sized chips in the trash can every 12 months. The plain Mn chip doesn't disappear every 12 months; it gets 'handed down' to the iPad Air. The A-series gets handed down to iPads/Apple TV/etc. The Apple Watch SoC gets handed down to the HomePods. The vast majority of Apple's silicon strategy is about using the SoC in multiple products. Throwing away the Ultra every year doesn't do that.

[ The Mn Pro is on a slippery slope also. That's somewhat likely why they shrank it a bit by leaning more on E cores than P cores. Apple is probably looking for something else to toss it into, or may fall back to staggering the Mini updates to extend its lifetime. ]

The M3 Ultra is likely only coming to the Studio now because Apple needed more time to get returns back on the M2 Max after it was ejected from the MBP 14"/16" in 12 months. The rapid pace of ejection from the MBP 14"/16" is likely contributing to the delay in updating the Studio and Mac Pro.

The M4 Max Studio has to compete with the M4 Pro Mini. It really can't wait too long, because the GPU upgrades Apple made in M3 and M4 have been substantive, and those gains are mainly there to enhance the laptop segment.
 
The Mac Pro has few reasons to exist, as I see it.

On tower workstations, the space is used for powerful GPUs (multiple) and lots and lots of memory: you can get them with terabytes of memory(!). Neither of those is possible on the Mac Pro.

You can add some special connectivity cards if Thunderbolt 5 and 10 GbE is insufficient, but that's a tiny fraction of a tiny segment.
The notion that PCIE slots are only for GPUs is the province of gamers and outdated crypto miners.
There are lots of industrial (you know, professional) cards that analyze and control equipment. I have 3 tools on the manufacturing floor, each of which has 18 mac computers in them (older ones, but apple logos all around), and that is just 3 examples in a specialized area of a highly specialized field.

Dedicated, special cards in each of them are used to help us work.

High precision robot controllers, specialty sensors and IO are all over manufacturing. Arduino and Raspberry Pi aren't good enough for all applications. Think about what kind of processing power chews up 18 Mac Pros.

No, it isn't to run Crysis.
 
The Studio allows one, and only one, internal storage drive. There are folks who need more than 8 TB of internal storage; that is more than a 'tiny fraction'. Even if Apple comes up with an even larger, more expensive 16 TB option (a deeper round peg for the same square hole), it is still one drive (and a single point of failure).

Apple acknowledged that having only one internal drive was an issue back in April 2017. That issue didn't go away, and TBv5 doesn't make it go away. (e.g., PCIe x4 SSDs have only gotten substantially faster since 2017; TBv5 is mainly just 'treading water' against that bottleneck.)
Isn't it technically two storage drives mapped as one? And as long as TB5 is faster than NVMe, it isn't treading water. I would prefer internal drives, but will settle for really fast ones outside.
Even my M1 Studio passes the BlackMagic disk speed test fast enough to handle 8K video on an external drive. I could supposedly even look at 12K if I were careful with the format.
 
The notion that PCIE slots are only for GPUs is the province of gamers and outdated crypto miners.
There are lots of industrial (you know, professional) cards that analyze and control equipment. I have 3 tools on the manufacturing floor, each of which has 18 mac computers in them (older ones, but apple logos all around), and that is just 3 examples in a specialized area of a highly specialized field.

Dedicated, special cards in each of them are used to help us work.

High precision robot controllers, specialty sensors and IO are all over manufacturing. Arduino and Raspberry Pi aren't good enough for all applications. Think about what kind of processing power chews up 18 Mac Pros.

No, it isn't to run Crysis.
These days the GPUs are mostly used for AI in such a workstation rather than graphics. Although some BIM/CAD models might need it, these cards are more likely to be used by CUDA than for displaying demanding graphics.
 
*noticeable progress. The scaling and performance gains still probably look relatively the same when averaged out over time.
We already pretty much hit physical limits when it comes to clock speed, so the strategy switched to parallelizing as many tasks as possible and using a higher number of cores. Those cores also have physical limits, though: the denser you build them, the higher the error rate gets.

Single-core speed stopped growing exponentially a long time ago. I still remember predictions in the '90s that the first CPU with a clock speed of 10 GHz would be available before 2010, based on the speed improvements of the past. 10 GHz was never reached, though.
 
"Who will ever need more than 640 KB RAM?"
That is not what I mean by physical limits. It is not about what is needed, but what is possible. RAM might still have room to grow, but single-core clock speeds are limited by the speed of light, for example. If a signal can only travel 5 centimetres within a clock cycle, that severely limits the size of a CPU.
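As a rough sanity check on that limit, here is a quick sketch. It assumes signals travel at the vacuum speed of light, which is an optimistic upper bound; real on-chip signals are considerably slower.

```python
# How far can a signal travel during one clock cycle?
# Upper bound: the speed of light in a vacuum.

C = 299_792_458  # speed of light, m/s

def distance_per_cycle_cm(clock_hz: float) -> float:
    """Maximum distance (cm) a light-speed signal covers in one cycle."""
    return C / clock_hz * 100

for ghz in (1, 5, 10):
    d = distance_per_cycle_cm(ghz * 1e9)
    print(f"{ghz:>2} GHz -> {d:.1f} cm per cycle")
```

At 10 GHz even light only covers about 3 cm per cycle, which is why clock speed and chip size trade off against each other.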
 
That is not what I mean by physical limits. It is not about what is needed, but what is possible. RAM might still have room to grow, but single-core clock speeds are limited by the speed of light, for example. If a signal can only travel 5 centimetres within a clock cycle, that severely limits the size of a CPU.

Ah, gotcha. I thought by "no meaningful difference", you meant that users weren't going to be able to perceive it.

There are still several angles from which performance can improve. 2 nm and 1 nm (or 20A and 10A) processes are being researched. Caches can be made bigger. New branch prediction techniques can be used. More code can move from the CPU to the GPU or NPU.
 
I still remember the prediction in the 90s that the first CPU with a clock speed of 10 GHz would be available before 2010 based on the speed improvements of the past. 10 GHz were never reached though.

Yep, Intel made a bad bet ca. 2000, thinking they could reach 10 GHz within a decade. Instead, they had to course-correct towards multi-core, which creates a problem because tons of code, even today, doesn't parallelize well at all.
 
The Pro market changed. I'm the Pro market. I shoot and edit film. And use an M1 Max MBP. It's enough for me, and I shoot 8K. The people who definitely need a Mac Pro are so few these days, I can kinda understand Apple's position on this.
I've got an M1 Max MBP, but it's simply not enough when you start using noise reduction, motion blur, etc.
If only Apple had written drivers for dedicated GPUs, I'd have purchased either a Mac Pro or gone the eGPU route (especially now with the ridiculous speed of TB5!).
As it stands, Macs are only great up to a certain extent with video editing and motion graphics, but start stalling pretty quick when you have to go deeper.
My main PC probably cost me $7K+ to build, but it can handle a lot more than a Mac, and will again become more powerful if I can ever get a hold of a 5090. Macs just don't make too much sense for certain pros in the editing space right now.
And this is coming from someone that adores his MacBook Pro, and recommends Macs to most enthusiasts that ask me for purchase advice.
 
Yep, Intel made a bad bet ca. 2000, thinking they could reach 10 GHz within a decade. Instead, they had to course-correct towards multi-core, which creates a problem because tons of code, even today, doesn't parallelize well at all.
Yes, I have that problem. I have a skyscraper website and I render it via PHP and that relies on a single core. I also use some old software that can't even use more than one core. So I really hope that in future we will see really fast single cores instead of just more and more cores.
 
The reason is simple: the upcoming M5 ditches the UltraFusion connector. Thanks to the new TSMC fabrication process used for M5, the M5 Max and M5 Ultra will each be on a single die for absolute maximum performance.
 
The reason is simple: the upcoming M5 ditches the UltraFusion connector. Thanks to the new TSMC fabrication process used for M5, the M5 Max and M5 Ultra will each be on a single die for absolute maximum performance.
It’s less about TSMC’s N3P process being used for A19/M5 as a whole, and more about TSMC’s SoIC (as opposed to SoC) being used for M5 Pro/Max/Ultra. See here for a recent press release about SoIC:


One of the first points TSMC makes about SoIC is always that it is compatible with both InFO (= UltraFusion) and CoWoS advanced packaging. So it could be that Apple will not only shift to an integrated “chiplet” (SoIC) design for M5 Pro/Max, but it could also retain the 2x Max = Ultra formula. One doesn’t preclude the other.
 
Isn't it technically two storage drives mapped as one?

No.
The NAND chips are spread over two daughter boards, but there is metadata for just one drive. Usage of the drive volume is spread out, but if either of those boards fails, the whole thing fails. Conceptually it is RAID-0, though all SSDs where write and read speeds are anywhere near equal are doing that general concept.

There is no SSD controller on those daughter cards. They are not 'drives'; each is a 'brainless' subcomponent.
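To make the failure mode concrete, here is a toy sketch of RAID-0-style striping as described above (hypothetical block names; nothing Apple-specific):

```python
# Toy illustration of RAID-0-style striping: blocks are interleaved
# across devices, so losing any one device loses the whole volume.

def stripe(blocks, n_devices):
    """Distribute blocks round-robin across n_devices."""
    return [blocks[i::n_devices] for i in range(n_devices)]

def reassemble(stripes):
    """Rebuild the original block order; needs every stripe intact."""
    if any(s is None for s in stripes):
        raise IOError("a device failed -- the whole volume is lost")
    blocks = []
    for i in range(max(len(s) for s in stripes)):
        for s in stripes:
            if i < len(s):
                blocks.append(s[i])
    return blocks

volume = ["blk0", "blk1", "blk2", "blk3", "blk4"]
devices = stripe(volume, 2)           # two NAND boards, one logical drive
assert reassemble(devices) == volume  # both intact: reads succeed

devices[1] = None                     # one board fails...
try:
    reassemble(devices)
except IOError:
    pass                              # ...and the entire volume is gone
```

The point: striping doubles throughput but also doubles the number of components whose failure kills the volume.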



And as long as TB5 is faster than NVMe, it isn't treading water. I would prefer internal drives, but will settle for really fast ones outside.

But it isn't.

back in 2023

Generally, PCIe 5.0 x4 is a 128 Gb/s conduit, not 80 Gb/s.

2025
"... The P510's performance does not surpass that of the T705, which provides sequential read speeds of 14,500 MB/s and write speeds of 12,700 MB/s. The drive falls slightly short of the T700, Crucial's first PCIe 5.0 drive, which offered read speeds of 11,700 MB/s and write speeds of 9,500 MB/s. ..."

14.5 GB/s ==> 116 Gb/s, not 80 Gb/s


2025-26

(8 * 27 = 216, not 80 Gb/s)
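The GB/s-to-Gb/s arithmetic above is easy to verify (drive figures taken from the quoted Crucial T700/T705 review; 1 byte = 8 bits):

```python
# Compare drive sequential-read throughput (GB/s) against
# link capacity (Gb/s).

TB5_GBPS = 80  # Thunderbolt 5 data rate, Gb/s

def gbytes_to_gbits(gb_per_s: float) -> float:
    """Convert gigabytes/s to gigabits/s."""
    return gb_per_s * 8

drives = {
    "Crucial T700 (PCIe 5.0)": 11.7,  # GB/s sequential read
    "Crucial T705 (PCIe 5.0)": 14.5,
}

for name, gbs in drives.items():
    gbits = gbytes_to_gbits(gbs)
    verdict = "exceeds" if gbits > TB5_GBPS else "fits within"
    print(f"{name}: {gbs} GB/s = {gbits:.0f} Gb/s -> {verdict} TB5's {TB5_GBPS} Gb/s")
```

Both current PCIe 5.0 drives already exceed the Thunderbolt 5 link rate on their own.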

Even my M1 studio passes BlackMagic disk speed test fast enough to handle 8K video on the external drive. I could supposedly even look at 12K if I was careful with the format.

Is TBv5 "fast enough" for most general workloads and general users? Probably. However, that doesn't mean TBv5 is outpacing modern, leading-edge NVMe SSDs. It is not.
 