Now you look stupid and shortsighted canceling Mac Server, Apple!

Very unlikely this processor is going to primarily target macOS. The overwhelming majority of Apple's web services infrastructure runs on Linux, just like most other major web service vendors. There is nothing 'stupid' there in the slightest.

If Apple builds something it is going to run Linux also.
 
But if they develop their own AI chips, they are going to put them in their own servers, with their own M# CPUs. If they design such a server for the dataroom, why not put it in a different case and sell it as a Mac Pro?

Because they would have to design it to be a general-purpose AI server like the commercial ones for sale now.

Instead, this server architecture would be designed specifically to make Apple's back-end AI offerings (Siri plus whatever is coming with the next set of device operating systems that leverages the cloud) more effective. Such a specialized server would be of little to no use to the general public.
 
The irony here is that the Nuvia chip team, later bought by Qualcomm, reportedly left Apple because Apple wouldn't let them work on a server chip, so they started their own company. Qualcomm is probably already a few beats ahead with server chips, and with the Nuvia-derived chips landing in Windows laptops later this year, Apple is going to have strong competition.

But as Johny Srouji likes to tell everyone in every interview, "Apple is not a chip company". Which is an odd thing to keep saying when you work for a computer company.
 
Very unlikely this processor is going to primarily target macOS. The overwhelming majority of Apple's web services infrastructure runs on Linux, just like most other major web service vendors. There is nothing 'stupid' there in the slightest.

If Apple builds something it is going to run Linux also.
There was a rumor a few years ago that iCloud ran on Microsoft’s Azure platform.
 
If Apple sold these instead of keeping them in house, catered a version for Bitcoin farming, and kept its power-efficiency standards, it would be good for humanity.

If Apple did this then they wouldn’t be able to optimise the chips to run their own code.
 
And the M series of Apple Silicon can virtualize Linux with almost no overhead, but with Apple Silicon's power efficiency. And power and cooling are very significant costs of running data centers.

In a hyperscaling environment they are likely going to run Linux on top of Linux. The hypervisor capabilities used in large-scale datacenters are years ahead in maturity of what Apple is doing. Pass-through 400GbE cards? No problem. Pass-through DPU accelerators? No problem.

There is no huge gap between Ampere Computing and Arm Neoverse versus Apple Silicon when it comes to power efficiency. Those implementations are scaling into the zone of 192 cores now (deployed in the field and running loads currently).

Apple wouldn't just be kicking sand in the face of Intel Xeon SP cores, which it has to be very significantly better than. It would be passing folks who have already been doing that for a couple of years now.

As long as Amazon, Google, Microsoft Azure, Nvidia, etc. keep pouring money into Arm's Neoverse development coffers, it isn't like Arm is going to lack the resources to keep its lead over x86 alternatives. Time will tell if Ampere Computing dumps Arm Neoverse completely; they are still selling Altra (Neoverse-based) at the same time as AmpereOne.
 
There was a rumor a few years ago that iCloud ran on Microsoft’s Azure platform.

Azure is mostly Linux. Microsoft isn't 'stupid' either. Just because it is Azure doesn't mean it is Windows.
(That is a contributing factor to why the Windows Subsystem for Linux gets substantive support: end-user developer machines run Linux jobs that get deployed after being augmented/debugged.)

Azure has a bigger fraction of "Windows servers in the cloud" than its competitors, but that isn't most of the service. To provide major competition to what AWS does, they need to provide the same foundational services. Azure growth substantially increased when they dropped the "Windows has to win" attitude for that part of the business.
 
I cannot wait to see the WWDC presentation where Johny Srouji tells us all about the new Apple ASi AI ecosystem, all the while Chuck Norris-walking thru the full basement of the Apple Mothership which is nothing but rack after rack after rack of said Apple ASi AI servers...!
  • Apple ASi Mn Extreme Mac Pro Cube
  • Apple iCloudAI subscription service
  • Apple ASi AI server farm
 
This makes SO MUCH sense. The biggest long-term cost of server farms is power and cooling, and with Apple's chip technology they have an advantage here. Margins on these things must be excellent, and a new line of Apple servers with custom chips could be a big growth area - something Tim Cook needs for the stock to continue to grow. Also, Nvidia is the only game in town at the moment, so it's not like it's a crowded field.

This is a much better idea than a Car or even VR Goggles, IMHO. (Although they should be throwing a bunch of stuff at the wall to see what sticks)
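The power-and-cooling argument can be made concrete with some back-of-the-envelope arithmetic. All the numbers below (wattages, PUE, electricity price) are hypothetical placeholders I picked for illustration, not Apple or industry figures:

```python
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_energy_cost(watts, pue=1.5, usd_per_kwh=0.10):
    """Yearly energy cost for a server drawing `watts` continuously,
    scaled by PUE (power usage effectiveness) to account for cooling
    and other facility overhead on top of the IT load itself."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * pue * usd_per_kwh

# Hypothetical comparison: a 1 kW x86 box vs. a 600 W Apple Silicon box.
x86 = annual_energy_cost(1000)
asi = annual_energy_cost(600)
savings = x86 - asi  # per server, per year
```

With these made-up inputs the efficient box saves roughly $525 per server per year; multiplied across tens of thousands of servers, that is the kind of number that makes custom silicon in the datacenter interesting.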
I think this is highly unlikely to be something Apple sells to anyone else - these are likely to be for internal use only, to help drive whatever AI plans and services Apple has. And those who think this is going to be some kind of new Xserve retail product are trippin' :)

 
As long as they didn't get the technology for the chip from this thing.🤓
6657cefe85c9001191dacb84122317f8.jpg
 
Because they would have to design it to be a general-purpose AI server like the commercial ones for sale now.

Instead, this server architecture would be designed specifically to make Apple's back-end AI offerings (Siri plus whatever is coming with the next set of device operating systems that leverages the cloud) more effective. Such a specialized server would be of little to no use to the general public.
An AI server is (usually) just Linux running on CPUs talking to GPUs -- there is nothing fundamentally different between a PC and an AI server.

Maybe they would use Linux servers like everyone else, but then why would they, actually? Everyone else does it because it's cheap (free) and all the AI libraries run on it. But macOS is also free to them, and most of the AI libraries already run on Apple Silicon now.

If they are going to make a server with a Mac CPU and a Mac AI chip, they would have to roll a Linux-compatible version. With macOS they already have it, and full control of it. I think they would also rather write the equivalent of CUDA under macOS, since they have the experience and tools and the advantage of being able to tweak the full stack (including the OS) to optimize everything. The underlying BSD Unix is so close to Linux in terms of app compatibility that they don't have to worry about future-proofing against new libraries that come along (i.e., the AI libraries transitioned to Apple Silicon pretty quickly).
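As a small illustration of that portability point, here is a hedged sketch of how cross-platform AI code typically picks its accelerator. It assumes PyTorch, whose Apple Silicon GPU backend is called MPS, and is written to stay runnable even where torch isn't installed:

```python
def pick_device():
    """Return the best available accelerator name, falling back to CPU.

    PyTorch exposes Apple Silicon GPUs through its 'mps' backend, so the
    same training/inference script can run on a Mac, an Nvidia box, or a
    plain CPU server. The try/except keeps this sketch importable anywhere.
    """
    try:
        import torch
    except ImportError:
        return "cpu"
    if getattr(torch.backends, "mps", None) and torch.backends.mps.is_available():
        return "mps"
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"

device = pick_device()  # e.g. "mps" on an M-series Mac with torch installed
```

That one-string device switch is essentially why the AI libraries moved to Apple Silicon so quickly: most model code doesn't care which backend is underneath.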

There is a lot of multi-server code for managing large Linux clusters so maybe that would be an advantage.

In terms of the end user, I could see a Mac Pro with an AI chip being useful for a lot more than data scientists. The chips are generally good not just at training but at inference, for the same basic reasons. Video and audio editing are doing all kinds of manipulation with AI models (e.g., look at the latest Adobe tools), and this could be a big advantage in a market Apple already covets with the Mac Pro.

Sorry for the long post... just thinking "out loud".
 
In terms of the end user, I could see a Mac Pro with an AI chip being useful for a lot more than data scientists. The chips are generally good not just at training but at inference, for the same basic reasons. Video and audio editing are doing all kinds of manipulation with AI models (e.g., look at the latest Adobe tools), and this could be a big advantage in a market Apple already covets with the Mac Pro.

Perchance AI could also provide similar workflow improvements to the 3D/DCC field; off-hand I am thinking of AI boosting rigging of character models, in-betweening of frames for 2D animations, etc. ...?
 
I cannot wait to see the WWDC presentation where Johny Srouji tells us all about the new Apple ASi AI ecosystem, all the while Chuck Norris-walking thru the full basement of the Apple Mothership which is nothing but rack after rack after rack of said Apple ASi AI servers...!
  • Apple ASi Mn Extreme Mac Pro Cube
  • Apple iCloudAI subscription service
  • Apple ASi AI server farm

This rumor says the AI server chip isn't coming out until 2025. WWDC 2024 is likely going to be focused on on-device AI. WWDC 2025 is likely still going to have the same primary focus, just incrementally better. It is highly unlikely Apple is going to come back in 2025 and hype up how developers are supposed to forget all that on-device stuff... all the nifty stuff is in our data center now. Apple doesn't generally flip-flop like that over relatively brief periods of time.

Apple is really not big on pictures of the insides of their data centers. There isn't a big datacenter in Cupertino.
 
An AI server is (usually) just Linux running on CPUs talking to GPUs -- there is nothing fundamentally different between a PC and an AI server.

Not necessarily.

Meta-MTIA-2-System-72x-Accelerator.jpg

https://www.servethehome.com/new-meta-ai-accelerator-mtia-2-revealed/

That chassis without the datacenter Ethernet isn't going to be as effective. It isn't just the GPU card. (That's a major reason why Nvidia bought Mellanox, and why AMD countered with a network acquisition of their own.)

A sizable number of the cards that Nvidia sells are OAM-format 'cards'. They don't even fit in normal PC slots that adhere to the old legacy PCIe slot standards.
 
I'm old enough to remember Apple using proprietary stuff, then trying to do PReP, then CHRP, and now eventually ending up back at proprietary.

Apple never really seriously tried to do CHRP. Apple always stuck with their own ROM dongles and proprietary I/O chips to some extent. Apple used the PPC chips and was open to other folks making common standard hardware if that spread the cost of PPC chips over more users. But Apple always held back some dongle part of their designs as a 'lock-in'.
 
The irony here is that the Nuvia chip team, later bought by Qualcomm, reportedly left Apple because Apple wouldn't let them work on a server chip, so they started their own company. Qualcomm is probably already a few beats ahead with server chips, and with the Nuvia-derived chips landing in Windows laptops later this year, Apple is going to have strong competition.
took the words right out of my mouth
 
But as Johny Srouji likes to tell everyone in every interview, "Apple is not a chip company". Which is an odd thing to keep saying when you work for a computer company.
They're technically not a chip company. AMD, Intel, and Qualcomm are chip companies (i.e., they sell processors to other hardware OEMs). Apple's revenue doesn't come from selling processors but from finished products.
 
That chassis without the datacenter Ethernet isn't going to be as effective. It isn't just the GPU card. (That's a major reason why Nvidia bought Mellanox, and why AMD countered with a network acquisition of their own.)

A sizable number of the cards that Nvidia sells are OAM-format 'cards'. They don't even fit in normal PC slots that adhere to the old legacy PCIe slot standards.
True, but I sort of wrapped that up with the AI chip design, since the two go hand-in-hand. I don't think Apple could use these "off-the-shelf" with their own chip. Either way, it would be a superset of a Mac Pro's functionality.
 
Apple developing a server chip sounds highly unlikely. It's not like they are going to sell it, and without external sales this would be a money-losing proposition. Hyperscalers (Amazon, Google) do it, but Apple is not a hyperscaler; they do not need enough servers to justify the R&D.
 
True, but I sort of wrapped that up with the AI chip design, since the two go hand-in-hand. I don't think Apple could use these "off-the-shelf" with their own chip. Either way, it would be a superset of a Mac Pro's functionality.

At its core, a Mac Pro is a Mac (i.e., it primarily runs macOS). The Intel era muddled that for some end users, but I don't think Apple really viewed it that way (i.e., that the Mac Pro had a major objective to run Windows as well as possible).

Being a good datacenter AI server processor has no "Mac" property at its kernel, so "superset" is not the right connotation; the core of the essential set is not 'Mac-ish'.
 