What does M1 Ultra performance have to do with anything? The absolute cheapest model of M1 Ultra is $4k currently.
$4k buys you a PC which SMOKES the M1 Ultra in 3D / AI / Adobe Creative Suite applications (single-core performance is the only thing that matters with Adobe, because they don't care and didn't optimise their software). The only thing the M1 Ultra *might* be faster at is 4K/8K encoding.
I think the more pertinent point is:

Does Apple care?

For the most part, Apple believes that it is competing with its own products, not Windows products. As long as Apple's performance is "close enough", I don't think it matters. Who cares if Windows is faster if the users want to own a Mac? There are tons of computer markets (at the low end and at the high end) that Apple doesn't even bother competing in.

If people on here want an Intel CPU and a 4090, go buy one. Be happy. Complaining on here isn't going to help. :) If you want a PC-shaped PC, buy a PC. Apple is never going to make one again for less than $5000. 2012 Apple is not coming back.

If Apple gets TOO far behind in performance, that could be an issue. In the markets that matter (i.e. portable phones and portable computers), Apple's performance (where performance per watt matters) is really good.
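To put "performance per watt" in concrete terms, here is a trivial back-of-the-envelope sketch - the scores and wattages below are made up purely for illustration, not real benchmark numbers:

```python
def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark points delivered per watt of power draw."""
    return score / watts

# Hypothetical numbers for illustration only.
laptop_chip = perf_per_watt(12000, 30)    # 400.0 points/W at 30 W
desktop_chip = perf_per_watt(18000, 250)  # 72.0 points/W at 250 W

# The (hypothetical) laptop chip is slower in absolute terms but does
# roughly 5.6x more work per watt -- the metric that matters on battery.
print(laptop_chip / desktop_chip)
```

Even with a large absolute deficit, the efficiency ratio is what decides battery life and thermals in a phone or laptop.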

If Apple can eventually ship a M* Quadra (even if it starts at $8000), that would also be close enough to a 4090 that most users just wouldn't care.
 
I think the more pertinent point is:

Does Apple care?

For the most part, Apple believes that it is competing with its own products, not Windows products. As long as Apple's performance is "close enough", I don't think it matters. Who cares if Windows is faster if the users want to own a Mac? There are tons of computer markets (at the low end and at the high end) that Apple doesn't even bother competing in.

Well said!

People say "Apple needs to be able to compete with eight-way Xeon/Epyc workstations and servers that host a dozen nVidia Quadro video cards" when Apple was never even remotely in those markets and lacks the fundamental software and support knowledge to even try to play in those markets. Like someone is going to haul a $100K 4U server into their local Genius Bar for support. :p

I expect the Mac Pro exists to support Final Cut Pro and Logic Pro first and, arguably, only. Everything else you can do with the thing outside of running those two apps is tangential to Apple's interests.

When it comes time for Apple to embrace AI, they're going to do so with their own dedicated hardware on-SOC and perhaps also via plug-in cards for a Mac Pro. Thinking they're going to want to, much less need to, use NVIDIA AI and NVIDIA Tensor GPUs is...well, forum rules prevent me from offering my true view on that. :p


Apple is never going to make one again for less than $5000. 2012 Apple is not coming back.

The Mac lineup is also vastly more powerful across the board than it was in the early 2000s. Apple had an incentive to keep the Mac Pro "reasonably-priced" because it was the only Mac that one could do "real" work on. Once Apple moved to Intel, they were able to bring more powerful CPUs to the MacBook Pro and iMac which allowed those machines to also start doing "real" work.

People also seem to forget that inflation exists as well as Intel's upward pricing strategy of the Xeon family as it became their primary profit center. :)
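On the inflation point, a rough sketch - assuming a $2,499 entry price for the 2012-era Mac Pro and an illustrative (not official) cumulative inflation factor:

```python
# Assumed values for illustration -- not official figures.
price_2012 = 2499        # assumed entry Mac Pro price in 2012, nominal USD
inflation_factor = 1.34  # assumed cumulative inflation, 2012 -> 2023

price_today = round(price_2012 * inflation_factor)
print(price_today)  # 3349 -- "2012 pricing" already means roughly $3,300+ today
```

And that is before Intel's Xeon price increases get layered on top.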
 
I think the more pertinent point is:

Does Apple care?
It's a big gamble.

If AI/ML turns out to be a flash in the pan, no harm done.

But if AI/ML turns out to be the future, they will be waaaaaaay behind even more.

From looking at what happened to Siri, it's going to be a very, very serious problem for Apple if the latter is true.
 
It's a big gamble.

If AI/ML turns out to be a flash in the pan, no harm done.

But if AI/ML turns out to be the future, they will be waaaaaaay behind even more.

From looking at what happened to Siri, it's going to be a very, very serious problem for Apple if the latter is true.
Aside from the well-deserved Siri criticism, Apple can still improve the software/OS side by speeding up specific conversions with hardware acceleration. I'm not saying they will equal expensive workstations, but they can close the gap a lot better than what is available now if they are motivated. There is always the case of cluster computing, too.
 
It's a big gamble.

If AI/ML turns out to be a flash in the pan, no harm done.

But if AI/ML turns out to be the future, they will be waaaaaaay behind even more.

From looking at what happened to Siri, it's going to be a very, very serious problem for Apple if the latter is true.
Siri is amazingly behind, sure; but, Apple IS working on hardware ML cores and will continue to do so. I fail to see how they are that far behind.
 
They will not waste a One More Thing on the ski goggles. It will only be used for a halo-product announcement.
Per one of your previous comments, what makes you think the next Mac Pro will ship an M3-based SoC and not an M2-based SoC?
 
You have mentioned this several times in similarly vague terms. By "smokes" do you mean smokescreen? I encourage you to provide specifics and evidence to back up those claims.
I see a lot of negative reactions to my original question, but nobody answered the question. Please be specific and back it up with evidence if you are going to make claims like this. An inconvenient truth is that you can add a 4090 GPU to any Mac that has a Thunderbolt 3/4 port. I am not sure you can actually use it, because I am not aware of any drivers being available. Is that Apple's fault? Or should nVIDIA provide those? That could be a legitimate debate. But saying you cannot interface a GPU with a Mac is factually not correct.
 
Per one of your previous comments, what makes you think the next Mac Pro will ship an M3-based SoC and not an M2-based SoC?
On TSMC's earnings call on Apr 20, the CEO revealed HPC and smartphone chips were in high volume production. We already know from previous announcements that TSMC started high volume production in Dec 2022 and Apple secured the entire production of N3 for 2023. So, A17 logically is the smartphone chip. M3 Pro Max is the HPC chip. M3 will come later as volume ramps.
 
There’s no eGPU stack on ARM Macs, so “any Mac” means “a few model years of Intel Macs”.
This one, and others, exist:

I am not aware if there are drivers for AS Macs, though. As I said earlier, who is responsible for that is a worthy debate. But you can add a GPU to an AS Mac.
 
There’s no eGPU stack on ARM Macs, so “any Mac” means “a few model years of Intel Macs”.
Sure, but that's just software.

While eGPUs for "video out" is basically dead, I see nothing stopping drivers being written to use any GPU for software acceleration (e.g. ML, etc...). Whether it happens is unknown, but it certainly could happen.
 
Sure, but that's just software.

Yes, but I think "you can add a 4090 GPU to any Mac that has a Thunderbolt 3/4 port" and "you can add a GPU to an AS Mac" are, well, rather misleading statements. Could you connect them? Yes. Could you do anything useful with this setup, today? No.

While eGPUs for "video out" is basically dead, I see nothing stopping drivers being written to use any GPU for software acceleration (e.g. ML, etc...). Whether it happens is unknown, but it certainly could happen.

Possibly. Depends on how large that slice of the market (M1 Ultra too slow, yet also 4090 fast enough, wants it on a Mac) is. I suspect many who have those needs put it on a render farm instead. Either locally somewhere, or rented out at a cloud provider.
 
Could you connect them? Yes. Could you do anything useful with this setup, today? No.
Certainly, nVIDIA bears some responsibility for this state of affairs. They could, for instance, open-source CUDA so others could port it to their GPUs, or so third parties could develop solutions. That would allow software that uses CUDA to work with a broader range of GPUs.
 
I suspect many who have those needs put it on a render farm instead. Either locally somewhere, or rented out at a cloud provider.
There is at least one benchmark where someone compared a recent version of PyTorch on an M2 with a cloud-based T4. The ANE was 6 times faster than the cloud T4. And there are ANE optimizations still in development in PyTorch, so the ANE will only get faster.
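For context, targeting Apple's GPU from PyTorch goes through the `mps` backend (the ANE path in that benchmark goes through Core ML conversion instead, which this sketch does not show). A minimal device-selection example:

```python
import torch

# Prefer Apple's Metal backend, then CUDA, then plain CPU.
if torch.backends.mps.is_available():
    device = torch.device("mps")
elif torch.cuda.is_available():
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

x = torch.randn(1024, 1024, device=device)
y = (x @ x.T).relu()  # runs on whichever accelerator was selected
print(device, y.shape)
```

The same script runs unchanged on an M-series Mac, an Nvidia box, or a CPU-only cloud instance, which is what makes these cross-platform comparisons possible in the first place.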
 
Yeah, I think the MP has gone the way of AirPower. If the entire platform is built around everything being built in to an SoC, what is there to be modular?
If the MP is going the way of the AirPower, that means Tesla will introduce the new Mac Pro. So here is the new task for all the digital kings of the hill: envision the new Mac Pro issued by Tesla. Please post your design concepts of how it would look.
 
Does it really matter whether it's June 22, 2020, or November 10, 2020?
Seeing how we're way past two years, no, it doesn't matter; Apple failed to meet its self-imposed two-year deadline. I keep hearing from posters here saying the delays were about the pandemic, the recession, the war in Europe, process issues, perhaps a comet hitting Brazil, and a host of other excuses. Thing is, I can't purchase excuses, just products.

I don't give Apple a pass on this, not in the slightest. Intel, AMD, and Nvidia all managed to ship product mostly on time, despite global events. Raptor Lake, Zen 4, RDNA 3 and Lovelace all arrived when expected. The exception was Intel Arc, and that was a driver maturity issue, while the cards were sitting in a warehouse ready to go. Yet mighty Apple couldn't finish the transition as they had claimed.

It's time for Apple to show us what they've got, ship a performance desktop, or admit defeat and get out of the way. If they can't at least announce an Apple Silicon Mac Pro by WWDC, then I don't see how that is anything less than an absolute failure.

@Mago (now) says May 2nd (or 9th) for the ASi Mac Pro...
"Signs point to no"

 
You have mentioned this several times in similarly vague terms. By "smokes" do you mean smokescreen? I encourage you to provide specifics and evidence to back up those claims.
"Specifics" off the top of my head: open Blender and do anything with the raytraced, low-quality live preview window, just like you'd normally do to model stuff in 3D with Nvidia graphics. Try modelling something and see how it goes from there. Or try rendering a final animated piece. Also try using some native and third-party GPU-heavy AE plugins such as echo/signal/tvpixel/time-blend FX. Or custom template effects from AEJuice / Motion Array. Or use stacks of blur / motion blur for 3D renders in After Effects. Or the Element 3D plugin for AE with a lot of objects generated in an array. I can keep going.

Stuff that chokes M-series silicon is still a pleasant, live-preview experience on a 4090 build. Tech youtubers doing these "stress tests" have no idea how to use the power they are testing, because they are not "pros" working in the visual industry. The only thing they can do is apply some nonsense action to a stack of 100 photos with a stopwatch :) Or just keep compressing 8K footage as if that's what pros do..

The only reason I didn't abandon the Mac is that I love macOS and hate Windows. But I feel I'm being held hostage right now by Apple. It would just be nice to at least know what their plan is: will they pivot and deliver a real pro machine, or do they want to put a Mac Studio in a bigger chassis? Because if that's what's going to happen, I'm out, and trying not to kill myself relying on Windows.
 
Seeing how we're way past two years, no, it doesn't matter; Apple failed to meet its self-imposed two-year deadline. I keep hearing from posters here saying the delays were about the pandemic, the recession, the war in Europe, process issues, perhaps a comet hitting Brazil, and a host of other excuses. Thing is, I can't purchase excuses, just products.

I don't give Apple a pass on this, not in the slightest. Intel, AMD, and Nvidia all managed to ship product mostly on time, despite global events. Raptor Lake, Zen 4, RDNA 3 and Lovelace all arrived when expected. The exception was Intel Arc, and that was a driver maturity issue, while the cards were sitting in a warehouse ready to go. Yet mighty Apple couldn't finish the transition as they had claimed.

It's time for Apple to show us what they've got, ship a performance desktop, or admit defeat and get out of the way. If they can't at least announce an Apple Silicon Mac Pro by WWDC, then I don't see how that is anything less than an absolute failure.

Engineering is hard and Apple doesn't care if anyone gives them a pass?

A Quad M* Max package is a shockingly complicated engineering challenge. I assume that an UltraFusion PCIe bridge chip they might need is also really complicated.

The plan is the plan until it isn't the plan. Apple's "2 year plan" thing is marketing, as all public statements are. For a huge number of reasons that didn't happen. If people are making personal plans based on a company saying something is going to happen in 2 years, that isn't on the company.

What does "Get out of the way" even mean? You're welcome to give up on Apple, but hanging out on here talking about Apple is your choice, not Apple's. :)
 
I fail to see how they are that far behind.

That's one thing we can agree on.

One thing is ray tracing: it should be supported in hardware. Apple may have its own vision of the future of 3D, but it's time to face reality - you need hardware support for it to use 3D software effectively. Software ray tracing is about an order of magnitude slower than dedicated hardware cores that render it.
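To give a sense of what software ray tracing has to do per ray (the inner loop that RT cores accelerate in hardware), here is a minimal ray-sphere intersection test in Python; a renderer runs something like this millions of times per frame, across bounces. The scene values below are made up purely for illustration:

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Nearest hit distance t along the ray, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic
    in t. `direction` is assumed to be a unit vector (so a == 1).
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray fired along +z into a unit sphere centered 5 units away hits at t = 4.
print(ray_sphere_t((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

RT cores run the analogous ray/box and ray/triangle tests in fixed-function silicon while the shader cores stay free, which is roughly where the order-of-magnitude gap comes from.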

Another thing is letting pros use Nvidia if they want to - because some very cool game-dev software belongs to Nvidia and you can't use it without an Nvidia GPU. And not only for game devs - Nvidia has very good software for storyboard artists, for making mockups for pitches/sketches, etc.

The third thing is their GPU performance apart from ray tracing - for better support of AI. And I'm not talking about language models and cores for neural networks - I'm talking about the math handled by GPUs that other AI tools need, tools that are 100% certainly the closest future of software development. And I am pretty sure they will leak into the consumer market very soon as well - there's so much money to be made with AI generative capabilities: think of all the AR filters, much more advanced face filters, face swaps, voice swaps, etc. In fact, this market is financially much more attractive than AI tools for pros, considering its size.
 
Another thing is letting pros use Nvidia if they want to - because some very cool game-dev software belongs to Nvidia and you can't use it without an Nvidia GPU. And not only for game devs - Nvidia has very good software for storyboard artists, for making mockups for pitches/sketches, etc.

I'm sure I'll care about ray tracing sometime in the next 5 years, but I'm in no rush. I'll certainly care more when games use full ray tracing instead of just hacking it in for light/shadows.

As for Nvidia, Apple hasn't supported them for ~15 years? Why even ask? The likelihood of it happening is somewhere less than 1% :)
 
Engineering is hard and Apple doesn't care if anyone gives them a pass?
No, but they do care if they lose sales to competitors, I would think.
A Quad M* Max package is a shockingly complicated engineering challenge.
A product that has, thus far, only publicly existed in Mark Gurman's articles.
The plan is the plan until it isn't the plan. Apple's "2 year plan" thing is marketing, as all public statements are. For a huge number of reasons that didn't happen.
Some excuses may be valid, but that doesn't matter to the market. By the time the Mac Pro ships, Zen 5 and the Raptor Lake refresh will be out, along with possible GPU refreshes from the PC companies. If Apple wants to be competitive, then they need to release something that is in the same league with what is available when the Apple Silicon Mac Pro ships.
If people are making personal plans based on a company saying something is going to happen in 2 years, that isn't on the company.
I respectfully disagree. If Apple gives a timeline, and customers plan a purchase around that timeline, then the blame falls squarely on Apple when that goal is missed. It's likely that Apple has already lost many of their pro customers to Linux and Windows PC workstations. That bloodletting will just continue as a result of this delay.
What does "Get out of the way" even mean?
It means that, if Apple can't release a performance desktop that competes with the PC companies, then they should concentrate on laptops and small form factor desktops. Otherwise, they're wasting engineering resources on something they can't accomplish.
You're welcome to give up on Apple, but hanging out on here talking about Apple is your choice, not Apple's. :)
I never said I was giving up on Apple. I'm complaining about this because I want Apple to succeed. I expect mediocrity from the PC companies; that's their modus operandi. I don't see why all of us here shouldn't hold Apple to a higher standard.
 
How about you take some of your insane focus off AR/VR and put more attention on the Mac? You ignored the Macs in 2013 with the pro segment and in 2016 with the laptop segment. Your M1 products were amazing - where did the focus go? Why won't the Mac Studio get the M2? Why is an M2 Pro Mac Mini better than an M1 Max Mac Studio? What happened to the two-year transition? Why haven't you said a word about missing it?

Because COVID happened, and the finances don't line up for upgrading so fast.

It's in Apple's interest to have users buy every iteration of every device - not every other iteration, or only after a product has matured.
 
Apple failed to meet its self-imposed two-year deadline...I don't give Apple a pass on this, not in the slightest. Intel, AMD, and Nvidia all managed to ship product mostly on time, despite global events.

They shipped a CPU or a GPU, not an entire system on a new hardware architecture. A more relevant comparison would be how long it took Intel to ship their first Titanium workstation.


It's time for Apple to show us what they've got, ship a performance desktop, or admit defeat and get out of the way.

They did so in March 2022 with the Mac Studio. It's very much a "performance" desktop for running macOS software.


Engineering is hard and Apple doesn't care if anyone gives them a pass?

A Quad M* Max package is a shockingly complicated engineering challenge. I assume that an UltraFusion PCIe bridge chip they might need is also really complicated.

*nods in agreement*

Per Gurman, Apple did make an SoC that used UltraFusion to bind four M1 MAX dies together. But it was so expensive that it would likely have been a nearly five-figure option, so it would have cost more than the base M1 Ultra configuration (much like the M1 Ultra upgrade on the Mac Studio costs almost as much as a Mac Studio with an M1 MAX, 64GB of RAM, and a 1TB SSD).

So yes, Apple could deliver a Mac Pro with an "Extreme" SoC, but you'd be paying at least $9999 for it with 128GB of RAM and a 1TB SSD. Then again, the current Intel Mac Pro will run you more than that for similar performance, so maybe it would have been a bargain. :p

Another thing is letting pros use Nvidia if they want to - because some very cool game-dev software belongs to Nvidia and you can't use it without an Nvidia GPU. And not only for game devs - Nvidia has very good software for storyboard artists, for making mockups for pitches/sketches, etc.

Would not all those devs be developing on Windows anyway, since said games would be using DirectX for their APIs?
 
They shipped a CPU or a GPU, not an entire system on a new hardware architecture. A more relevant comparison would be how long it took Intel to ship their first Titanium workstation.
Those CPUs and GPUs shipped in the PCs of that era, from the corporations that Apple is competing with today. I'm not sure what an Intel Titanium is; I'm assuming you mean Itanium. That's an example of what Apple shouldn't be doing. Intel let the PA-RISC guys make a science project, which is how not to ship a marketable product.

They did so in March 2022 with the Mac Studio. It's very much a "performance" desktop for running macOS software.
If we're talking about the Mac Studio, need I trot out that embarrassing graph comparing the RTX 3090 and the M1 Ultra? Apple was comparing their new device to the PCs of that era. They were marketing it directly against their competitors, not specifically the Apple ecosystem, and came up short.
I never moved over to M*. Giving it at least a couple more years.
I'm still sitting on a Intel Mac, as well. I'm waiting to see how the final stages of the transition unfolds before I make any decisions. I'm not locked into the Apple ecosystem, which allows for flexibility.
 