Kind of interesting if the Mac Studio is just like the iMac Pro: a stopgap using an existing form factor (in this case a Mac mini with an enormous fan on top) until Apple's pro team can pull its finger out.
Except that’s not at all what the Mac Studio is. Have you seen tear downs of that thing? It’s all bespoke… You can make that argument for the DTK, the M1 Mini and even, to some extent, the M2 machines vis-à-vis the last Intel models (including the “pro”, space gray SKU), but the Studio has a completely different construction (for starters, it’s much more similar to the Cube in that you unscrew the bottom lid and drop down its innards, instead of sliding them out the back, but the power supply, logic board, port and antenna configuration are also completely different). If anything, the Studio is more of a G4/first-gen Mini reboot in spirit and design, it’s a completely different machine.
 
Link pls? I've seen those ads of cheap Office and Adobe perpetual licenses on Facebook ads. The image looked like it was produced by a student on MS Paint.
The fact that you mentioned Facebook is quite telling, as the majority of Facebook and its affiliates' ads are scams. I don't use FB, so the link is not from there. Adobe would never provide a perpetual license, and quite frankly there are better options that offer better value. Sure, Adobe may do some things better, but that's the beauty of challenging oneself to figure out a method or workaround with the competition. No, this is not something produced by a student in MS Paint. If you are unable to shop around or do a web search, and MR no longer provides software deals to its community as it did in the past, then it's up to the individual or a willing forum member to post a thread. If you are interested, send me a PM; no, I am not an affiliate, and you can confirm this via the URL.
 
Yeah, I agree. In my opinion the mini and the studio should be merged. Just make the mini a little bit taller and allow it to get the Max as a processor option as well.
...and maybe add some USB ports on the front and call it "Mac Studio" :)

Here's one that will never happen (and in fairness there are technical hurdles here, but I can dream), but it would really nail down the "Pro" in Mac Pro: dual processors, with a twist. One is Apple Silicon (and this runs the OS and most things), and the second is an Intel or AMD x86 chip.
A computer with ARM and Intel? Completely and utterly ridiculous....


...although pretty much every serious non-PC computer at the time had an add-on which was basically a headless PC on a card, the Acorn solution reduced it to an x86 clone CPU and a 'glue' logic chip on a card that shared RAM and everything else with the ARM-based host system.

The two drawbacks are cost and compatibility: Compatibility was really pretty good but not 100%, and while there was a basic, affordable 486SX that was probably subsidised by Acorn to tick the "PC compatibility" box, the more powerful processors (I think they got up to "586", i.e. 3rd party Pentium 1 clones) cost nearly as much as a "real" PC system. Economies of scale are important...

These days... well, it's still likely to add most of the cost of a mass-market Intel/AMD PC to the price. I'm not sure how sharing PCIe and Thunderbolt would work - obviously, sharing unified RAM would be problematic - and if the sharing is less than 100% efficient it's going to have Rosetta 2 snapping at its heels, especially when the performance bottleneck lies in the GPU or exploitation of neural & media engine.

The "Mac Pro Problem" is mostly about RAM, PCIe and GPU support - ARM ISA vs. x86 is a way down the list.
 
The fact that you mentioned Facebook is quite telling, as the majority of Facebook and its affiliates' ads are scams. I don't use FB, so the link is not from there. Adobe would never provide a perpetual license, and quite frankly there are better options that offer better value. Sure, Adobe may do some things better, but that's the beauty of challenging oneself to figure out a method or workaround with the competition. No, this is not something produced by a student in MS Paint. If you are unable to shop around or do a web search, and MR no longer provides software deals to its community as it did in the past, then it's up to the individual or a willing forum member to post a thread. If you are interested, send me a PM; no, I am not an affiliate, and you can confirm this via the URL.
I'm not looking. I am pointing out that those cheap perpetual licenses are mis-licensed from bulk corporate/organization licenses.

I talked to an MS country manager about this and they confirmed it's mis-licensing.
 
These days... well, it's still likely to add most of the cost of a mass-market Intel/AMD PC to the price. I'm not sure how sharing PCIe and Thunderbolt would work - obviously, sharing unified RAM would be problematic - and if the sharing is less than 100% efficient it's going to have Rosetta 2 snapping at its heels, especially when the performance bottleneck lies in the GPU or exploitation of neural & media engine.

If you REALLY want to, you can control an x86 processor remotely with Moonlight/RDP. The difference with an Apple solution is that it would instead be integrated into the OS, so it'd be much easier.
 
The Apple Silicon chips are revolutionary and the computers Apple is coming out with are awesome and very good values. Their laptops are the best at actually being portable computers. The Mac mini m1 and now M2 are the best small desktops. Go find some benchmark tests of Apple's current computers and run them against the Intel based computers they were making five years ago. I just don't think you realize how much has been changed.
While I like the new chips overall, I have to disagree with "revolutionary." The new SoCs mainly created competition (which is good), and now Intel and AMD are catching up to and exceeding Apple, so not revolutionary. An SoC has its thermal limits regardless of efficiency, and for pros, we want options. Apple has also artificially limited its system in so many ways, stifling possibilities and third-party innovation.
Grading everything in comparison to the iPhone is just setting the bar way too high. If you use that as your standard, then I'm not sure ANY piece of consumer electronics except the iPhone would count.
"New" is part of "innovation," and I am referring to actually new functionality and purpose. Look at Apple's history: it was different back then. Other startups are actually inventing; even larger companies are doing better. Examples are in the fields of VR, AR, 3D printing, AI, robotics, and other things that could actually improve lives. Looking back at Apple's financial bandwidth when they invented revolutionary computers, music devices, and the iPhone, that was incredible! Imagine if they actually used their billions (trillions?) to invent actual new things instead of just bumping up the next iPad or MacBook Pro. What are they doing anyway?? Apple Car, Apple AR, and Apple AI devices have all been resurrected and crushed. Why? Because someone says they won't sell. Money, not innovation, is what drives them now. I am a traditional Apple fan and I hope they eventually go back to their roots.

Apple, can you at least make one computer that gives US the option to innovate (the 2019 Mac Pro almost worked)? I still think my 2010-12 Mac Pro is the pinnacle of their computing: full upgradeability with tons of options for add-ons. Please.

Rant over. :cool:
 
Catching up and exceeding…with some serious caveats.

I don’t understand how a laptop that needs to be plugged in to “exceed” a battery powered one is anything other than cranking voltages (and increasing heat) 🤷‍♂️
 
Can someone please tell me what the point of an Apple Silicon Mac Pro would be, seeing as it would be fundamentally incompatible with any and all third-party GPU/PCIe cards?

It made sense when Apple was running x86, had RAM slots, and had a functional relationship with Nvidia, but in 2023 a Mac Pro makes about as much sense as a chocolate frying pan.
What is the functionality that an "external" graphics card has that an integrated one cannot have?
 
No way!

My Mac Studio Max has 64GB of RAM and a 32-core GPU; it handles memory-intensive tasks and is much faster at anything that benefits from GPU power.

I actually regret not getting the Ultra model: a 10-core CPU with 64GB of RAM is not enough for even a single heavy-duty VM.
And this is why Apple Silicon is a dead end for high-performance uses. The King-Kong-glued-to-Godzilla-sized chip that is the Ultra still can't handle workloads that a higher-end consumer-socket x86 CPU can, let alone HEDT or server parts, and that's all before we touch the abysmal GPU/compute performance.

Unless Apple invests in dies that meet higher end needs, they have ceded every piece of the computing market more powerful than a decent “creator laptop” workflow. Which is fine, they make more money than god doing so, and would likely make less at least for a good while if they chased these other markets (can you imagine how hard it would be to break into server/data center with a custom ARM based architecture where many customers need to support 20+ year old code that is nowhere near being sunsetted?). But it’s still disappointing and sad to watch the final death knells of the Mac as a professional tool. From desktop publishing in the 80s until recently, there were large, high performance computing industries that relied on the Mac and its unique hardware/software combination. Going forward, the “pro” Apple cares about is a YouTuber who wants to be able to answer iMessages while using Final Cut (which still has zero collaboration features, in 2023! So can’t be used by even a team of 2, yikes!)

Science? Not enough memory and no access to CUDA mean many use cases are impossible for Mac.

3D work? Laughably weak GPUs with terrible software support.

Gamers? No game support, and crap GPUs even when supported.

Serious video editors — they need things like collaboration, and they build skills on the software the bigger players use — so no Mac support.

Developers/programmers — unless programming for MacOS/iOS, the lack of memory and threads for virtualization is a major weak point.

Serious web users or developers — Safari's insistence on being WebKit-based rather than Chromium-based has a good chance of turning it into this decade's Internet Explorer. If I'm stuck in Chrome or Firefox or whatever, why not just use them on more powerful Windows hardware?

And the list goes on forever.

Apple Silicon Macs are amazing. They really are. And for most casual, mobile users they are better in every way than what PC laptops have to offer. But if you make a living from computing (not using a computer for office tasks, but using computing power to create or discover), it's damn near impossible to recommend the Mac anymore. (Plus, have you used macOS lately? What a joke. They almost couldn't be making it worse faster if they explicitly tried!)

Three years ago, I was so excited by the potential of Apple Silicon. I was following rumors of the announcement like crazy. I was so hyped about the potential to create special, super-unique hardware for 2020s computing needs, rather than glomming more extensions onto the shambling wreck of x86 after years of Intel stagnation; the Spectre, Meltdown, and other major security issues that led to decreased speed over time; the absurd heat and power budgets that made mobile computing impractical; and more.

But instead of innovative hardware we got overgrown cellphones. Which make amazing consumer laptops, they really do. An AS MBA is a masterpiece! But they make truly crap high-performance or professional hardware.

I was hopeful for all sorts of specialized add-ins like the Afterburner card, and for complex chiplet designs that allowed serious flexibility in applications: a Mac that was more than Intel 14nm++++++ with underpowered, uncompetitive Radeon GPUs (if your model even got a GPU!) and the kludged-together T2 co-processor handling some Apple-specific tasks.

I still plan to use Apple for my mobile computing needs, but it looks like there's no reason to hold our breath for them to create professional or desktop hardware ever again. It was a good run, and I'll miss the Mac, but after 30 years of defending it, I think it's time to throw in the towel and admit the party's over.
 
Can someone please tell me what the point of an Apple Silicon Mac Pro would be, seeing as it would be fundamentally incompatible with any and all third-party GPU/PCIe cards?

It made sense when Apple was running x86, had RAM slots, and had a functional relationship with Nvidia, but in 2023 a Mac Pro makes about as much sense as a chocolate frying pan.
An Apple Silicon Mac Pro MUST support PCIe GPUs, whether designed by Apple or a third party. If they aren't creating chips that support this, then whatever they release isn't a Mac Pro. Even a wafer-scale Apple Silicon SoC would likely struggle in many workloads against a sub-$5,000-total-cost PC with a high-end consumer GPU, and the silicon alone would cost many times more than the PC.
 
Can someone please tell me what the point of an Apple Silicon Mac Pro would be, seeing as it would be fundamentally incompatible with any and all third-party GPU/PCIe cards?
PCIe isn't just for GPUs - it's for audio, video capture, extra Ethernet interfaces, storage etc. and there are already multiple PCIe cards that work with Apple Silicon (via a Thunderbolt PCIe enclosure):


AFAIK there's not even any fundamental reason why Apple couldn't support GPUs on Apple Silicon. They just don't (currently), and have so far indicated that integrated GPUs are the way forward, which makes strategic sense: they want developers to optimise for Apple Silicon graphics and make software that makes the most of MacBook Pros, Minis and Studios. Maybe they'll change their tune now that they've built a solid base of Apple Silicon Mac users.

The main problem is that none of the Apple Silicon processors we've seen so far have enough PCIe lanes to drive the sort of multi-high-end-GPU setups (16 lanes each; the Xeon supports up to 64) possible with the 2019 Mac Pro. Converting 2-4 of the 6 TB4 interfaces on an M1 Ultra (you'd need to keep some for displays, external expansion etc.) to PCIe would give 8-16 lanes: not enough for a killer multi-GPU setup, but enough for a few audio interfaces or a couple of SSDs.
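The lane arithmetic in that last bit can be sanity-checked with a quick sketch. The per-port figure is my assumption (each TB4 port tunnels the equivalent of PCIe 3.0 x4); Apple hasn't documented any such conversion:

```python
# Back-of-the-envelope PCIe lane budget if TB4 ports on an M1 Ultra
# were repurposed as raw PCIe.
# Assumption (mine, not Apple's): each TB4 port ~ PCIe 3.0 x4.
LANES_PER_TB4_PORT = 4
TOTAL_TB4_PORTS = 6

def lanes_freed(ports_converted: int) -> int:
    """PCIe lanes gained by converting TB4 ports to raw PCIe."""
    if not 0 <= ports_converted <= TOTAL_TB4_PORTS:
        raise ValueError("can't convert more ports than exist")
    return ports_converted * LANES_PER_TB4_PORT

# Converting 2-4 of the 6 ports, keeping the rest for displays etc.:
print(lanes_freed(2), lanes_freed(4))  # 8 16
```

A single high-end GPU wants x16 on its own, so even the best case here feeds exactly one card, which is why I'd call it audio-interface and SSD territory rather than multi-GPU territory.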
 
I just sold my '08 Mac Pro and my '18 Intel mini. I was waiting so long to upgrade my Intel mini to the M2, so of course they updated it three months after I bought a Studio (which is now already outdated!!). I really get tired of Apple not releasing a roadmap so people like me can make real-world decisions. It's likely I'll never need a new Mac Pro, and even more likely I'll never be able to afford one, but it would be nice to see the bridge before I try to drive over it. Apple has become a phone company first (that's where they are excelling) and a mobile computer company next; that's where they're making all the dough. The Mac Pro is a niche product and has suffered so many delays and failures that I really don't see where it sits in their lineup. They've fiddle-farted around so much with that product that they'll never be ahead of the curve, and the small market that needs the power has already jumped ship. That apple has fallen too far from the tree, time and again, with their indecision.
 
PCIe isn't just for GPUs - it's for audio, video capture, extra Ethernet interfaces, storage etc. and there are already multiple PCIe cards that work with Apple Silicon (via a Thunderbolt PCIe enclosure):


AFAIK there's not even any fundamental reason why Apple couldn't support GPUs on Apple Silicon. They just don't (currently), and have so far indicated that integrated GPUs are the way forward, which makes strategic sense: they want developers to optimise for Apple Silicon graphics and make software that makes the most of MacBook Pros, Minis and Studios. Maybe they'll change their tune now that they've built a solid base of Apple Silicon Mac users.

The main problem is that none of the Apple Silicon processors we've seen so far have enough PCIe lanes to drive the sort of multi-high-end-GPU setups (16 lanes each; the Xeon supports up to 64) possible with the 2019 Mac Pro. Converting 2-4 of the 6 TB4 interfaces on an M1 Ultra (you'd need to keep some for displays, external expansion etc.) to PCIe would give 8-16 lanes: not enough for a killer multi-GPU setup, but enough for a few audio interfaces or a couple of SSDs.
I think Apple’s entire strategy around GPUs is to eliminate the constant swapping of data back and forth between system RAM and the VRAM on GPUs. I watched one of the WWDC videos on it after the M1 launch, and it showed that they’ve eliminated a massively wasteful process and cut down dramatically on the idle time the system spends just waiting on that data to be moved.

Because of how much thought was put into this, I don’t see Apple opening this avenue of slowness again. I don’t think 3rd party graphics are ever going to be a thing again on Macs.

That may sting right now on the very high end, but it will also enable things that simply cannot be done without this new approach. I suspect gamers will be mad for the next few years, but they’re always mad anyway…


This may not be the exact video I’m remembering, but it does go into the difference between Apple’s tile-based approach and what Intel and AMD are doing: https://developer.apple.com/wwdc20/10631
 
I think Apple’s entire strategy around GPUs is to eliminate the constant swapping of data back and forth between system RAM and the VRAM on GPUs. I watched one of the WWDC videos on it after the M1 launch, and it showed that they’ve eliminated a massively wasteful process and cut down dramatically on the idle time the system spends just waiting on that data to be moved.

Because of how much thought was put into this, I don’t see Apple opening this avenue of slowness again. I don’t think 3rd party graphics are ever going to be a thing again on Macs.

That may sting right now on the very high end, but it will also enable things that simply cannot be done without this new approach. I suspect gamers will be mad for the next few years, but they’re always mad anyway…


This may not be the exact video I’m remembering, but it does go into the difference between Apple’s tile-based approach and what Intel and AMD are doing: https://developer.apple.com/wwdc20/10631
It's not just the very high end that it stings; it's everyone beyond the casual computer user. Apple makes great iGPUs, but these aren't even close to dedicated-GPU performance. The AS Max dies have been larger in area than almost any GPU ever created, and the M1 Ultra is much larger than the biggest current consumer CPU plus the biggest current consumer GPU. These chips are monstrous! And for all that expense and size, they often perform GPU compute like a mid-range last-generation GPU at best.

One example: the 38-core M2 Max is comparable to an RTX 3050 in Geekbench 5's OpenCL test. This means that even with perfect scaling, and in an ideal workload that doesn't need any of the many features Nvidia GPUs have that Apple's don't, a doubled-up M2 Ultra would perform around a 3070.

You can easily build a PC tower with a 3070, a Zen 4 CPU, 64GB of DDR5, and a 2TB Samsung 980 Pro for under $1,800. That's a CPU that performs like a 12-core M2 Max, a GPU that performs like the theoretical M2 Ultra, and RAM and storage upgrades that would cost $1,000 from Apple, all for less than the Studio's starting price.

The MacBook Air is the best laptop in the world, and the best value in computing. But the higher up the stack you get, the worse Apple Silicon looks.

I had really high hopes that Apple would create amazing machines on their custom hardware, and they did. But I really didn't expect that they'd refuse to create anything other than overgrown cellphone SoCs, limiting the appeal of those machines to casual users. It's very sad that they don't seem to be creating chiplet-based designs or workstation/server-focused chips for high-end needs.

Yes, it's amazing how much Apple can do by creating such a well integrated SoC, especially on the efficiency front (which translates to gobs of computing power on mobile devices), but not even trying to compete with high end laptops or any desktop is a steep price to pay for those low-end gains.
 
Because of how much thought was put into this, I don’t see Apple opening this avenue of slowness again. I don’t think 3rd party graphics are ever going to be a thing again on Macs.

They don't have to. Just make it like the Xeon architecture. "Look, guys, up to the RAM in your base system, you'll use UMA. If you add memory, the extra memory will run a bit slower."

The people who do need a huge amount of memory would be much happier.
 
The cost of the empty rack is $549.99, quite a lot for just a rack.

My apologies if this has already been addressed... $550 is dirt cheap if it's all a person needs to bridge between a Mac Studio and a device with expansion capabilities. The M1 Ultra is already a pro-level chip.
 
They don't have to. Just make it like the Xeon architecture. "Look, guys, up to the RAM in your base system, you'll use UMA. If you add memory, the extra memory will run a bit slower."

The people who do need a huge amount of memory would be much happier.
Exactly. There's nothing stopping a Mac Pro from maxing out at 256GB of on-package RAM while allowing 2TB of additional RAM (especially using full DDR5, where 512GB modules are coming; that could be only 4 slots for 2TB, and running it in quad channel, for the bandwidth Apple loves, still gives ~200GB/s with 6400 MT/s memory).

The same thing is true for GPUs and other accelerators: just because the SoC has an amazing iGPU doesn't mean the system can't see or use a dGPU. I loved the insane idea that the "stubs" from turning a Max die into a Pro could be activated as GPU chiplets. Even if they had to clock low, used a ton of power, or had a few dead cores, making a PCIe GPU out of eight 16-core M2 "stubs" with probably 64GB of VRAM and some sort of I/O die/interposer would be the kind of insane innovation I was hoping to see from Apple (even though just making an all-GPU die the size of the Max die with 60-80 cores and "Ultra-ing" or "duo-ing" two together is probably more realistic).

And the Afterburner card was a great addition for people who could use it, and the media engine on Apple Silicon is even better, but what about an "Afterburner 2" that's a pile of media engines designed to work in parallel and truly obliterate transcode- and render-type tasks? Now add in 40Gb or faster networking, connect the monster Afterburner 2-equipped Mac Pro (which also has terabytes of RAM, serious GPUs, and a crapload of fast onboard storage) to other Macs on the network, and finally add collaboration to FCP, and you've got the makings of a pro-level video-editing setup on the Mac again: your team can share resources and work together on a project, even from an iPad, while the heavy lifting and rendering is done on a central server or render-farm setup, likely in real time. It's a fever dream, but this is the sort of thing I expected Apple Silicon to bring: new ways to use the Mac, with software and hardware that caught up to the 2020s and looked ahead to what could be done in the future.
 
They don't have to. Just make it like the Xeon architecture. "Look, guys, up to the RAM in your base system, you'll use UMA. If you add memory, the extra memory will run a bit slower."

The people who do need a huge amount of memory would be much happier.
Just….change the architecture? Just like that, just make it like Xeon.

Can you elaborate on this super-simple concept you're after here, and on how it doesn't fundamentally abandon the benefits of the SoC that Apple has built their future on?

Can you speak to this at all, or is "do it like Xeon" the extent of your technical knowledge here? I'm really trying to figure out whether you realize how flippant a statement that was.
 
It’s another example of Apple building what they want and not fully addressing the needs of the customer.
I think Apple’s making what their customers want. Mainly because they’re inviting those customers to their campus and talking to them, getting feedback on what those folks would want next.

For the future Mac Pro, I have a STRONG feeling that… if you’ve not been invited to Apple’s campus to talk to them about what you’d like to see in the next Mac Pro, they don’t consider you a potential Mac Pro customer. One MAY buy a Mac Pro… and that will be surprising to Apple, but they wouldn’t have intended that person to. They’ll just have to make 1 extra, so not that big of a hardship.
 
Just….change the architecture? Just like that, just make it like Xeon.

Can you elaborate on this super-simple concept you're after here, and on how it doesn't fundamentally abandon the benefits of the SoC that Apple has built their future on?

Can you speak to this at all, or is "do it like Xeon" the extent of your technical knowledge here? I'm really trying to figure out whether you realize how flippant a statement that was.
He means adding DDR5 controllers on the die and allowing the system to address both the on-package LPDDR and the DDR DIMMs. Depending on how you look at it, that's basically up to a few terabytes of DDR5 acting as ~200GB/s "swap," or up to ~256GB of "highest-level cache" that runs at 800GB/s or more.

Not a complicated concept.
 
I think Apple’s making what their customers want. Mainly because they’re inviting those customers to their campus and talking to them, getting feedback on what those folks would want next.

For the future Mac Pro, I have a STRONG feeling that… if you’ve not been invited to Apple’s campus to talk to them about what you’d like to see in the next Mac Pro, they don’t consider you a potential Mac Pro customer. One MAY buy a Mac Pro… and that will be surprising to Apple, but they wouldn’t have intended that person to. They’ll just have to make 1 extra, so not that big of a hardship.
This is hilarious. Apple is a for profit company with a duty to shareholders to make as much profit as possible. They don’t give a crap what we want as long as we buy what they sell. On the “pro” side, any dialogue with customers is likely trying to answer “would you buy a super extra large cellphone for pro work, or would it be better just to finally admit the pro market hasn’t been worth it to us for a long time?”
 
He means adding DDR5 controllers on the die and allowing the system to address both the on-package LPDDR and the DDR DIMMs. Depending on how you look at it, that's basically up to a few terabytes of DDR5 acting as ~200GB/s "swap," or up to ~256GB of "highest-level cache" that runs at 800GB/s or more.

Not a complicated concept.

Exactly.
 
I would be very surprised if Apple bothered with any GPU other than the ones built into Apple Silicon chips.
4 years from now after numerous continuous years of Apple NOT shipping a discrete GPU, there will still be folks saying that Apple’s going to do it “any day now” and “there’s no reason why they can’t”. Which, ok, technically… true? :)
 
So much attention has been put on the horse race at the performance ceiling that people have failed to see the significance of how much Apple has raised the performance floor.
This is the ultimate goal. There will be a time in the not so distant future where every Mac sold will have performance across the board greater than 90% of what the competition is selling. Not because the competition can’t do the same, but because competition has to pad their bottom line with poorly performing low end solutions such that the price gradient continues to make sense. Give the folks at the bottom too much power (like making it so that ALL their solutions score within a few percentage points of their high end in single threaded workloads) and there will be less need to buy into their high end stuff. ;)
 