Ok, but iPads are just glorified iPhones. In some ways they're worse (iPhones got the better camera). There really isn't THAT much of a difference between the two.
Absolutely. We have such a large Apple user community that doesn't own Macs; all they can relate to is how terrible iOS/iPadOS is, and they aren't aware of macOS products. You see it all the time in these forums when they complain. If anything, this means that WWDC is a wonderful time to talk up Macs instead of talking up services for once. Yes, take the services and take a vacation. ;)
 
Maybe building the entire platform around an SoC is the wrong idea. This is a serious problem and could cost them a small but extremely important market.
One of the thoughts was that Apple had decided to use removable, swappable cards carrying the SoC/unified memory, so that the Mac Pro could be configured more easily, or even operate as a computer cluster with additional slots populated, which macOS could support. Again, this is speculation, as no details have been given.
 
That's not saying Apple won't release a Mac Pro. They might decide to burn the money anyway and do it properly. OR a Mx Ultra with PCIe slots for A/V cards and storage will tick the box for some people (but not the very high-end market the 2019 was aimed at). Or, something more adventurous with multiple Mx Max/Ultra 'compute modules' (again, unlikely to be a drop-in replacement for a 2019 MP). We'll see.

The only urgency is that the M1 Max Mac Studio is currently in collision with the M2 Pro Mac Mini which, with the best CPU/GPU option, costs about the same as the base Mac Studio, has a better CPU and is swings-and-roundabouts on the GPU (fewer cores, but individually faster - so your mileage may vary depending on the application). Must be decimating sales of the base Mac Studio (24 old GPU cores vs. 19 new cores in the top Mini Pro), but the 32-GPU-core Studio Max, not to mention the 64GB RAM option, still seems viable. However, an M2 Max Studio would restore the balance, so it seems a pity if that's being held back waiting for the Ultra option.

Trouble is, a kludgey Gurman-esque M2 Ultra Mac Pro that cannibalised a couple of TB4 ports to provide some (non-GPU capable) PCIe slots probably would clash with a new M2 Ultra Studio (c.f. a $4000 M1 Ultra Studio + $2000 Sonnet rackmount/PCIe enclosure). One possible move for Apple is to simply re-badge the M2 Studio as Mac Pro (I mean, it's more credible than the Trashcan in that respect).

You got some great points there.

I suppose it is just a question of:

- If there is enough of a market for a Mac Studio with lots of PCIe slots (running way better than a Sonnet box could) that "ticks the box" for enough people, I see nothing wrong with such a product.

- I expect few Mac Pros were actually spec'd far above what an M* Ultra would have covered. People bought those Mac Pros with 580 GPUs (shudder). The only problem I see, and Apple knows it too, is their inability to get the M* Quadra out the door. Doubling the CPU/GPU/memory above the Ultra would be enough for almost everyone. I'm sure they're working on it and haven't given up yet.

- While this potential Quad M* Max would still cut off the super high RAM users, the reality is that they are so few that Apple doesn't really care. Apple only offered that much RAM on the Mac Pro because it cost them nothing to do it. :)


I cannot wrap my head around people bothered by the M2 Pro Mac mini being too powerful. "Restore balance" makes as much sense here as it does in Star Wars (i.e. it makes zero sense). If the M2 Pro Mac mini is suddenly good enough for some people, then yay, they get to buy them. If not, the current Mac Studio will probably do for almost all users until the next one ships. Apple will always let things rot on the vine, so I see no reason to be bothered by any specific case of rotting vines. That is just how Apple works.

Make peace with your preferred products rotting on the vine sometimes, or give up on Apple. :/
 
Apple gives us TB4 throughput instead of a zillion different ports, and I approve. Dongles and docks are easy, but first one needs bandwidth.
We need competitors to Intel in the TB4 / USB4 chip space. This is why consumers are getting hosed on the prices of TB docks and devices. ASMedia received USB4 approval for their ASM2464PD in April ... so competition is coming. Though, from the looks of pricing of the ZikeDrive, I am not sure we are getting any deals yet.
 
B - make multiple sockets/slots for multiple SoCs to be populated later if the user chooses to expand, and commit to at least one next-generation upgrade's worth of support for the chassis/logic board.
This is what I believe the Mac Pro debut at WWDC 2023 will be. I expect the sockets/slots will be for "compute modules" that incorporate one or more SoCs.
 
MKBHD recently said on his Waveform podcast that he doesn't think it's ever coming. I thought maybe he is just making this stuff up, or he knows something we don't. For someone as powerful and influential as him, I think he knows the truth. And it's likely that the Mac Pro is dead, again.

For Apple, though, I don't think it's a big deal. Of course, there is the mindshare factor and how much this can have a domino effect across all product lines. But Apple probably only sells around 30,000 Mac Pros a year at most. Not a lot of people are buying tower workstations, and it's usually for specialized environments. Even in my work environment, I rarely see tower PCs, and those who have tower workstations are a tiny group.
If that's true then why bother keeping the Intel Mac pro in the lineup? Just kill it then
 
The 68k to PPC transition took 31 months (i.e. the last 68k machine was discontinued 31 months after the first PPC was released). We're at 29 months for the Intel to Arm transition, and if the Mac Pro doesn't get updated at WWDC then it's going to exceed that.

In other words, I agree with you :)
I dunno how we can reasonably say that Apple has mismanaged the transition. It is very hard to ignore the elephant in the room. A worldwide pandemic was not raging during the PPC or Intel transitions.

Nevertheless, we will see an Apple Silicon Mac Pro debut at WWDC 2023 with the M3 announcement. It will be an "and there is one more thing" moment, the first since Steve Jobs last did it.
 
Apple made the same BS argument for its lack of standard interfaces in the 90s
There is a big difference now. TB4 and USB4 with PCIe tunnelling are compatible. Soon we should have some competition in the TB4 / USB4 chip space which will moderate prices.
 
Because I am a pro user that needs to get as much as I can out of a system. With Apple's release schedule, I would more than likely be able to upgrade to an M3 Pro Mac mini and beat my M1 Ultra Mac Studio for months before they decide to update the Mac Studio to an M3 Ultra.
No, you will not. Your analysis ignores the huge contribution of Apple's Unified Memory Architecture (UMA), which changes everything. An M2 Pro Mini tops out at 32 GB RAM and 200 GB/s memory bandwidth. The M2 Max MBP I just bought has 96 GB RAM at 400 GB/s memory bandwidth (3x the RAM at twice the bandwidth), and any Mac Pro will offer more than that, even without the new M3 Mac Pro architecture that I expect.
 
considering that the last 3 Mac Pros (4 if you count the iMac Pro) all ended up being abandoned for years and then replaced with something radically different, I'm not sure what combination of misconceptions would make "real pro users" back the next Mac Pro horse.
A compelling reminder.
One possible move for Apple is to simply re-badge the M2 Studio as Mac Pro (I mean, it's more credible than the Trashcan in that respect).

I would hope they'd at least tart it up a bit first, but yeah.
 
Oh yeah, it's gonna work out real well for you in the long run. Wanna upgrade your computer? Too bad, buy a whole new machine. SSD failed? Too bad, it's soldered to the motherboard. Buy a whole new machine. Need more RAM? Too bad, buy a whole new machine.

Yup, Apple's really done us a huge favour.
Nonsense. Apple is bringing more advanced tech, a very good thing. UMA RAM baked onto the chip package is indeed a one-time purchase, but it's a hella faster architecture. And my Mac life cycle has extended over the years to 5-6 years now, from 3-5 previously.

Paying $400 to add 32 GB of UMA RAM feels like a bargain when in the past I paid $400 for 2 MB of third-party RAM...
 
The "AI revolution" will be delivered on mobile & embedded devices, for which Apple Silicon SoCs - with powerful on-die GPUs and neural engines - are not only ideal, but are already in the hands of millions of potential users.

Developing AI applications, training models, pre-rendering content and providing the server-side processing clout can happen on generic PC boxes stuffed with NVIDIA GPUs for all Apple cares. It's not like Apple services like Music/Maps/TV are being run from old Xserves or racks of Mac Minis...

The thing you quoted refers to Apple not seeing the development of AI and AGI coming and not foreseeing the possibilities and opportunity to make money.
So did Alphabet. Microsoft was fast enough to invest in OpenAI. So I'm not sure how what you said relates to what you quoted.
It's as if I said (purely hypothetically ofc): "Back in the day Apple could have created the best web search engine but Google beat them to it" and your reply is "Apple can just use Google search on their iPhones". Sure they can - but there were trillions to be made and Google made them.

The same refers to what you just said about hardware. Yes - we can use remote render machines. We can even use rented render farms if we want to be efficient. But that argument makes no sense - because the ultimate argument is "we can just buy remote access to computing power via Apple TV like device with a computing power of a mobile chip and a monitor so what is the point of owning a computer at all". It doesn't really make sense when discussing a workstation, does it? Some people will choose remote access to power, some prefer to have it at hand - for the time being, "pros" prefer to have it at hand, with the exception of the 3D render guys who use both.
Btw, Music / Maps / TV - these applications running off racks in server rooms and data centers aren't exactly "pro" applications... "Pro" applications are coding, running simulations, CAD, 3D, design, music production, motion picture production, photo editing, etc.
 
I call this BS. If you are not able to keep your products on track with your own chip improvements that is one issue. If you are worried one of your products is cannibalizing another then get your product lineup straight. Even if one of those reasons is true it does not shine a good light on Apple’s product structure.
It may not be about fear of cannibalizing, because Apple since the Jobs era has known better than that. It may be that the next MP evolution involves architectural changes that necessitate the higher transistor density of M3 chips. The Studio is the transition product just before MP, so perhaps Apple is delaying the Studio upgrade to better fit the evolving architecture at the high end.
 
the iPhone pretty much codified the term 'halo product'
I suggest the MP is the halo product and the iPhone is an accessory. The sales team for Apple accessories comprises the legions of fanatical people who use Mac computers. The "average" iPhone user could not care less who made it; they use it because someone told them to use it, and they have a go-to person who can help them if they have problems.

something more adventurous with multiple Mx Max/Ultra 'compute modules'
I agree we will see something like this debut at WWDC 2023.
 
If that's true then why bother keeping the Intel Mac pro in the lineup? Just kill it then
It's there for those who need it; it's a niche. Look at how long they kept the 2013 Mac Pro on the market even after its expiry date. Heck, they are still selling refurbished models of the 2013 on their clearance website at full price. I believe I read somewhere they moved the manufacturing of the Mac Pro itself back to Asia; it's not in Texas anymore.
 
What if they found out that it is currently not possible to make Apple Silicon modular?

I wouldn't be surprised if that is what is actually happening.

At this point, professionals should just move to Windows or Linux rather than wait for a product that may or may not materialize.
You act like "modular" is the end goal, when performance is the end goal of professionals. Those who think a furnace configuration of huge old-style RAM plus a hot GPU is the only way to go probably should indeed stick with the old Intel approach. Me, I am curious to see what Apple achieves with their new MP; Apple kicked ass at the laptop and Mini levels with M2 and I expect similar good things from M3.
 
While that comes across as "AI is here and now," it's also indicative that it isn't. It seems we are advancing the use of the term, but it's not really AI. All the "AI is the future" talk needs to be looked at as another form of data mining by Google and others. Now, as far as separate graphics cards go, the last thing I saw was Apple making its own external cards rather than partnering with Nvidia, because of how it is advancing Metal 3 as its graphics engine. Supposedly we expect macOS 14 to add to that. The comparison of what Metal 3 provides vs. what it lacks is best seen in the Resident Evil Village playback demos. Still a WIP. As far as AS Macs being the same as traditional PC workstations, that is still not known, because we haven't had any technical discussion of a Mac Pro, just extended conjecture on what might be used.

First of all, I think you are mistaking two different things: AI and AGI. An example of an AI application is the Large Language Model - things like ChatGPT, built from layers of transformer nodes that form a neural network.
Artificial General Intelligence is being developed in parallel with completely different goals - much loftier goals of creating, well, general intelligence. And maybe even consciousness.
LLMs may play a role in the development of AGI, as Ben Goertzel suggests, while some people like Sam Altman (CEO of OpenAI) think LLMs and neural networks alone may lead to AGI - although no one knows for sure.

AI is developing much faster than anyone anticipated, so you're wrong on that one - unless you know better than all the biggest AGI/AI specialists in the field ;)
LLMs are data-scraping the internet, of course, but if anything it's Microsoft, not Google - they're behind OpenAI now.
Since no one really knows what is going on inside the neural network, one can argue whether there are glimpses of actual understanding of the world in GPT-4, or whether it's a purely probabilistic machine with no understanding of the text it's creating. This is debatable. Also, no one knows if AGI can be born purely out of LLM development - though unlikely, it is possible.
As for AGI development, it's highly probable that a true AGI will exist within 5 years - which sounds crazy. And a bit scary.
I don't know for sure, but my guess is you didn't dig deep into the subject - I totally recommend it, it's pretty interesting (and, as I said, scary).

As for AI applications in software - well, there are things that require Nvidia chips, and that's it. There's even a very cool Nvidia proprietary tool for generative image creation, something like Photoshop - you take a few colors and a brush, and you generate an environment/background image in a matter of seconds. They have software like that, perfect for game devs, and these tools are being developed with Nvidia chips in mind - although they could obviously run on different hardware.
Software developers will go that route - there will be more and more applications, plug-ins and tools using AI to help with all sorts of tasks - increasing the resolution of images, motion picture tools, TONS of AI tools for 3D graphic designers. All of these run mostly on GPUs - and Apple's M chips can't compete with any modern discrete graphics card. So either Apple does something about it, or we can say goodbye to a real "pro" Apple desktop. Even without all the AI tools that will come - how can you create a "pro" workstation that can't be used for any serious 3D application? :) "Our Pro machines encode 4K and 8K in the blink of an eye, but please don't use any 3D software" :D It's pathetic.
 
The same refers to what you just said about hardware. Yes - we can use remote render machines. We can even use rented render farms if we want to be efficient. But that argument makes no sense - because the ultimate argument is "we can just buy remote access to computing power via Apple TV like device with a computing power of a mobile chip and a monitor so what is the point of owning a computer at all".

But you are using "we" to mean "developers". Only a tiny portion of Apple's income comes from selling kit to developers - they make most of their money selling iPhones and Watches and iPads and MacBook Airs to consumers, and it's no real problem for Apple if the developers use cloud services or buy Dell workstations, as long as the consumers buy their iDevices and laptops to run the developers' products (and buy them through the Apple Store).
 
It's seeming more and more true that Apple really hit a roadblock and backed themselves into a corner with making an Apple Silicon machine truly "modular" and offering something beyond what the Mac Studio offers.

What doesn't make sense to me is how they couldn't have seen this coming.

"Let's put everything onto the SOC!"
"But we're still going to have a super modular, upgradeable Mac Pro!"
The SOC will be modular! 😆 You can replace the whole SOC every year. M2 Extreme SOC, M3 Extreme SOC, each starting at $3,000!
 
Jeff Geerling's test of Ampere processors was impressive, but even he admitted that the best x86-64 workstation processors are still faster than the arm64 Ampere processor he was testing. If Apple made an Ampere-like processor to power the next Mac Pro, what would Apple have to advertise about the next Mac Pro besides it being second, third, or fourth best?
If Apple made a new class of ARM chip more like what that machine is using, I have no doubt it would destroy x86. But as long as they keep chasing the energy-efficient, integrated-GPU, mobile-type chips like the M series, they will be handicapped. That's another reason an M2 Mac Pro is insanely dumb, and I just don't believe Apple will actually do it.
 