Another misstep by current Apple leadership. Hopefully, it doesn't come back to bite them.
Absolutely. It was obvious at the time that it was a three-pronged loss. Just like not spending the money on Nvidia AI chips for Apple Intelligence work when the team was begging for them, and announcing it before it was ready and making fools of themselves.

Eventually Tim and other top execs will run out of people to blame.
 
You seem to misunderstand RAID.
Apparently, so does Apple. Still plenty of bugs in their systems.

[Image: RAID slogan]
 
Those all sound cool, but they didn't mention the design of the next chip after these. Code name "Maui": it can also vaporize 🍁 and is therefore much more useful than becoming a glasshole.
 
Guess that is the reason Apple Intelligence sucks balls - chips not good enough!
I have yet to get Genmoji to actually give me either a) anything, or b) something that represents what I asked for.
Latest -- "Clock showing time at 3'oclock". What it produced was a clock with 3 hands pointing at 10,12, and 2 o'clock (plus a seconds hand!).

Most other attempts result in nothing. Another useless AI feature on my iPhone 16 PM.
 
I mean, a big chunk of the original Apple Silicon team left Apple to develop it on their own. They lost the same number of people either way. And the Nuvia team was posting about their server chips leapfrogging Apple in performance and power efficiency before being acquired by Qualcomm.

It was a huge, three-pronged, short-sighted management failure: 1. they lost a big chunk of top Apple Silicon engineers; 2. they handed the competition, Windows and Qualcomm, a nice post-Intel, post-x86 gift; and, saddest and worst of all, 3. they needed a server and AI chip development team anyway. Their short-sightedness left them behind in AI and the server space, which everyone but Apple could clearly see coming.
My point was more that it's easy to say this in hindsight, but at the time there was no way to know what the future held, and AI was not that widely popular then. And their CPUs didn't beat Apple's CPUs; their claims were based on benchmarking at full power and then measuring battery longevity at lower power. Still, their chips were excellent and really made Intel and AMD increase their efforts.
 
The team that went on to make Arm chips for Qualcomm was eager to build server chips for Apple a decade ago before being told no and leaving Apple to develop them on their own.
...but Apple don't make servers, so why would they want to build server chips?

Apple's core market is in mobile devices and ultra-portable laptops with cool designs, easy-to-use GUIs and seamless integration of apps and services (how well they're delivering on that is for another thread). Those aren't really important in the server market.

Apple Silicon still seems to be comfortably ahead of the game vs. Qualcomm Snapdragon in that respect. Even high-end personal workstations like the Mac Pro are probably a dead-end - more and more can be done with laptops and small-form-factor systems at one end of that scale, and more and more can be done in the cloud (dominated by a couple of fairly unassailable large players) at the other.

Yes, they made the Xserve. It was good for a few years. Then they dropped it because it ceased to have a reason to exist. Back when the Xserve was launched, Macs needed proprietary services for file sharing etc., and the PC industry was using its own proprietary systems like Windows Server and Novell NetWare, which cost a fortune in per-seat licensing and which Mac OS X could undercut. By the time it was dropped, everybody - including Macs - was happily using SMB and open Internet/W3C protocols, and Windows Server was facing huge competition from you-can't-beat-free Linux. Sleek design and user-friendly GUIs really don't matter for a server sitting in a rack running predominantly Unix/Linux server software (...and configuring server software via a GUI is like washing your feet with socks on). The Xserve was just another generic rack-mount x86 *nix server box - not something that could command the sort of premium that consumer Macs and iDevices do.

Today, we're not even talking about the sort of multi-ARM-core, large-I/O-bandwidth Xeon-killer "server" chip that Ampere and AWS were pushing 10 years ago. We're talking about chips for AI services - i.e. neural engines and not-for-graphics GPUs. Apple Silicon is actually pretty good as a delivery platform for AI services (even if Apple's own implementations of AI software are a bit limp) and maybe Apple Silicon-derived technology has something to offer there... esp. on power efficiency (a selling point whose value waxes and wanes with the oil price...) but it's really not the tech - or even the same service - that we were talking about a decade ago.

...also, the "so-called AI" (aka party tricks with LLMs and diffusion) industry is a massive bubble liable to burst at any moment. That doesn't mean that AI is going away (any more than the web went away after the dot com crash) - but a lot of shirts are going to be lost.
 
Servers, so:
hot-swap storage?
at least RAID 1 for main/boot storage?
IPMI, or IPMI-style management that can do DFU mode from the web?

More than 128GB max RAM?

More than one NIC?

PCIe slots for networking/storage devices?
Why not bring back Mac OS Server while they're at it?
 
Do we know anything new about M5 chips? I assume they’ve been working on M6 and M7 but, what do we know about the chips that are already being produced? No new leaks about the base M5?
 
Apple should already have a server room filled with thousands of Mac Minis running M4s crunching AI. Shame on them if they don’t.

SHAME!!
My understanding is that since Apple dropped Mac servers more than a decade ago, they rely exclusively on x86 servers running Linux and programmed in C++. Apple definitely does not eat its own dog food. And that also explains why there are so many bugs in Apple's non-C++ Arm software.
 
Do we know anything new about M5 chips? I assume they’ve been working on M6 and M7 but, what do we know about the chips that are already being produced? No new leaks about the base M5?
Not that I am aware of. I don't think there has even been a codename leak for M5. The big M5 rumor that seems solid, now from at least two different industry sources, is that the M5 Pro/Max will use SoIC instead of SoC.

So M5 and M5 Pro/Max/Ultra could be the first generation of Apple silicon with that distinction, where the M5 Pro/Max/Ultra have an integrated SoIC structure, while the A19, A19 Pro, and base M5 all have a monolithic SoC design.
 
Not that I am aware of. I don't think there has even been a codename leak for M5. The big M5 rumor that seems solid, now from at least two different industry sources, is that the M5 Pro/Max will use SoIC instead of SoC.
Yeah, I am aware of such rumors regarding the M5 Pro/Max. But apparently there are no leaks about the base M5 SoC. We don't know if they will improve the Neural Engine by adding more cores, or improve the GPU with a new architecture or more cores. Or if it will be an overclocked M4 (likely) with bigger caches (less likely) and faster RAM (even less likely).
 
The Mac Pro is primarily a single-user workstation. It pragmatically needs to be able to drive multiple displays so that the single user can see the GUI. There's a very good chance that Apple's "AI server chip" has no display function units at all. First, Apple's display processing units are abnormally big; the die-space trade-off will likely lean toward the AI compute the chip actually has to do rather than the display tasks it is absolutely never going to do in a data center. Basically the same goes for the Thunderbolt subsystem, more likely traded away for the rumored Broadcom inter-node communication system.

A Mac Pro with no display processing and no Thunderbolt won't meet the modern criteria for being a "Mac" anymore.
[Image: Nvidia DGX Station A100 intro graphic]

The ability to make the Mac Pro's chip is directly important for creating servers and supercomputers, just like Nvidia did. If you've ever checked Nvidia, they provide a 4x GPU workstation for a personal user, like the DGX Station. Don't forget that Apple also once made a server based on the Mac Pro.

Errr.. Nvidia?
[Image: Nvidia GH200 diagram, September 2024]

Apple's solution is much more rigidly "homogeneous unified memory," but others are on the MCM and/or multiple-die path as well. There are no DIMM slots for mega-modular memory there.
Then where is the Mac Pro's chip? Apple can't even make a server- and supercomputer-grade chip, unlike Nvidia. Can't and can is a huge difference.

Nvidia Superchips are a failure? Really???? The longer-term trendline toward far more power-efficient AI compute is heading directly at chiplets and/or MCM.
I said Apple Silicon, NOT Nvidia.

Apple's possible resources are not limited to that. 'AI' does not equate solely to the largest possible language models. Apple has some self-imposed constraints that go beyond available GPUs: privacy, not steamrolling copyrights, not hoovering up as much customer prompt data as possible, etc. Apple has been more focused on inference on modest resources than on powering the most monstrous MW-consuming machine possible.

Back to the Mac Pro... it has been capped at normal USA wall-socket power levels. Not 'brown out the neighborhood' power levels.

Apple's AI server chip is far more likely to be the most power-efficient (perf/W) of its immediate rivals than the "most powerful" option.
Doesn't change the fact that Apple Silicon sucks for GPU and AI resources. That's also why they are struggling with their own AI servers, of which reportedly only 50,000 were made, built out of a more-than-five-year-old GPU. They can't even make a powerful desktop chip, and they're heavily limited by the SoC approach, since it forces them to make one huge chip.

Power efficiency? Yes, it's great, but when maximum performance and expandability are IMPOSSIBLE, it's totally pointless. So far, all they can do is use as many Mac minis/Studios as possible to create a farm, but that would be VERY inefficient, especially compared to Nvidia, which holds a roughly 90% monopoly of the GPU market and is unavoidable as the only real solution for AI.

If you truly believe that Apple Silicon is the best, think again. There are reasons why Nvidia is dominating the entire AI market.
 
If you truly believe that Apple Silicon is the best, think again. There are reasons why Nvidia is dominating the entire AI market.
Yes there are, though your hating on Apple has clouded your summary.

Nvidia succeeded because:
1) Intel has been flopping around, for several years, like a dying fish on a wharf;
2) Export bans have kept cutting-edge photolithography out of the hands of the PRC, therefore Chinese startups in the IC industry are hobbled;
3) the major ARM licensees, such as Apple and Samsung, were not in that business.
 
Yes there are, though your hating on Apple has clouded your summary.

Nvidia succeeded because:
1) Intel has been flopping around, for several years, like a dying fish on a wharf;
2) Export bans have kept cutting-edge photolithography out of the hands of the PRC, therefore Chinese startups in the IC industry are hobbled;
3) the major ARM licensees, such as Apple and Samsung, were not in that business.
Being logical is hating? Wow. If you think that way, tell me whether there is any Apple Silicon-based supercomputer training and studying AI for their own company before you say anything else.
 
How do they plan to be successful without CUDA support? Except for LLMs, all the cutting-edge AI imaging and video is developed in CUDA.
 
How do they plan to be successful without CUDA support? Except for LLMs, all the cutting-edge AI imaging and video is developed in CUDA.
Apple has Metal of its own, but still, it's such a tiny market, especially since Apple lacks its own software. But once Apple can make a Mac Pro-grade Apple Silicon chip, then they can at least try.
 
As long as this "AI server chip" is 'behind the curtain' hardware solely for Apple Intelligence, it really isn't going to make much of a difference until Apple Intelligence as an overall system is competitive. Software and hardware.

There are zero indications that Apple is trying to go into the general-purpose web services business (Microsoft Azure, Amazon AWS, Google Cloud, etc.). Pretty good chance this "AI server" chip never shows up in a product that is for sale at retail.

They're not really a 'decade behind' AWS, Azure, or Google Cloud if they're simply pulling workloads in-house and/or creating new AI inference workloads that didn't exist before.
I get beefing up iCloud in order to better share and delineate the personal AI/compute from the cloud. They've struggled with that. But even that's been on the roadmap, so where's the news here? Besides, there's been a sense of doing this out of order for a few years: build the reliable stack, then scale. Apple hasn't built the first part yet, so this is all putting the cart before the horse. It's not like Apple hasn't made bad bets before (trash can Mac Pro).
 
Apple has Metal of its own, but still, it's such a tiny market, especially since Apple lacks its own software. But once Apple can make a Mac Pro-grade Apple Silicon chip, then they can at least try.
The problem is that AI models are basically implemented by researchers, and researchers chose CUDA a long time ago, and they won’t change (an unwise choice if you ask me, because it’s vendor specific, but most researchers are not smart when it comes to real implementation).
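To make the lock-in concrete, here's a minimal PyTorch-style sketch (the helper name and fallback order are my own illustration, not anything Apple or Nvidia ships): most research code simply hard-codes `.cuda()`, while portable code has to probe each vendor backend explicitly, reaching Apple Silicon through the Metal-backed MPS backend.

```python
# Minimal sketch of backend probing in PyTorch (assumes a recent
# PyTorch build). pick_device() is a made-up helper for illustration.
# Research code that just calls model.cuda() skips this step entirely
# and fails outright on non-Nvidia hardware.
import torch

def pick_device() -> torch.device:
    if torch.cuda.is_available():              # Nvidia GPUs via CUDA
        return torch.device("cuda")
    if torch.backends.mps.is_available():      # Apple Silicon via Metal (MPS)
        return torch.device("mps")
    return torch.device("cpu")                 # last-resort fallback

device = pick_device()
x = torch.randn(4, 4, device=device)           # tensor lives on whichever backend won
print(f"running on {device}: sum = {x.sum().item():.3f}")
```

And that's the easy case: anything that ships hand-written CUDA kernels, as a lot of cutting-edge imaging and video work does, has no MPS path at all without a rewrite.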
 
Right now, it seems Nvidia's CUDA and tensor cores are the star attraction for tensor operations. Even AMD's ROCm can't compete on the same playing field. I would love for Apple to break Nvidia's stranglehold, but I don't have much faith in Apple's AI prowess right now, and that's the only reason I still keep a PC kicking around.
 