

Apple used Tensor Processing Units (TPUs) developed by Google, instead of Nvidia's widely used graphics processing units (GPUs), to construct two critical components of Apple Intelligence.


The decision is detailed in a new research paper published by Apple that highlights its reliance on Google's cloud hardware (via CNBC). The paper reveals that Apple used 2,048 of Google's TPUv5p chips to train its on-device AI models and 8,192 TPUv4 processors for its server AI models. The paper does not mention Nvidia explicitly, but the omission of any reference to Nvidia hardware in the description of Apple's AI infrastructure is telling, and it suggests a deliberate choice to favor Google's technology.

The decision is noteworthy given Nvidia's dominance in the AI processor market, and because Apple very rarely discloses its hardware choices for development purposes. Nvidia's GPUs are highly sought after for AI applications due to their performance and efficiency. Unlike Nvidia, which sells its chips and systems as standalone products, Google provides access to its TPUs through cloud services. Customers using Google's TPUs have to develop their software within Google's ecosystem, which offers integrated tools and services to streamline the development and deployment of AI models.
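
To make the "ecosystem" point concrete: a typical workflow is to rent a Cloud TPU VM and program it through a framework such as JAX or TensorFlow, both of which Google supports on its TPUs. A minimal, hypothetical sketch in JAX (nothing here comes from Apple's paper; the TPU VM and the `jax[tpu]` install are assumptions):

```python
# Minimal sketch: checking which accelerators a Cloud TPU VM exposes.
# Assumes a TPU VM with JAX's TPU build installed (pip install "jax[tpu]").
import jax

devices = jax.devices()  # enumerates the TPU cores visible to this host
print(f"{len(devices)} accelerator core(s) found")
for d in devices:
    print(d.id, d.platform, d.device_kind)  # e.g. 0 tpu TPU v4
```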

In the paper, Apple's engineers explain that the TPUs allowed them to train large, sophisticated AI models efficiently. They describe how Google's TPUs are organized into large clusters, enabling the processing power necessary for training Apple's AI models. Apple has announced plans to invest over $5 billion in AI server enhancements over the next two years, which should bolster its AI capabilities and reduce its dependence on external hardware providers.
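
For a rough illustration of what cluster-scale data parallelism looks like in practice (a toy stand-in, not Apple's actual training code, which the paper does not publish), here is a single synchronized training step across local TPU cores in JAX; the model, learning rate, and shapes are made up for the example:

```python
from functools import partial

import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    # Toy linear model; Apple's foundation models are vastly larger.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@partial(jax.pmap, axis_name="cores")  # one replica per local TPU core
def train_step(params, x, y):
    grads = jax.grad(loss_fn)(params, x, y)
    # All-reduce: average gradients across cores, standard data parallelism.
    grads = jax.lax.pmean(grads, axis_name="cores")
    return jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g, params, grads)

# Replicate the parameters and shard the batch across the available cores.
n = jax.local_device_count()
params = jax.device_put_replicated(
    {"w": jnp.zeros((4, 1)), "b": jnp.zeros((1,))}, jax.local_devices()
)
x = jnp.ones((n, 8, 4))  # leading axis = number of cores
y = jnp.ones((n, 8, 1))
params = train_step(params, x, y)
```

At the scale Apple reports, the same idea spans thousands of chips, with the cluster interconnect handling the gradient all-reduce.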

In addition to detailing its use of Google's TPUs, the paper addresses ethical considerations in AI development. Apple emphasized its adherence to responsible data practices, claiming that no private user data was used in training its AI models. The company relied on a mix of publicly available, licensed, and open-source datasets for training. Apple added that this training set, which includes publicly available web data and licensed content, was curated to protect user privacy.

Article Link: Apple Used Google Tensor Chips to Develop Apple Intelligence

What an ABSOLUTE JOKE!

Tensor ... chips that are two years behind the industry standard and stuck on old Samsung Foundry manufacturing processes until next year!

This is a colossal failure on Apple's part. The ONLY reason to use Tensor chips in 2024 is code efficiency, because the SoC itself is garbage! I'm not just talking out my arse here, because even Samsung's much-hated Exynos 2400 SoC outperforms it on ALL measurements and in real-world tasks and applications as well.

Sorry, the ONLY place the Google Pixel 8 (the Tensor chip Apple supposedly used) comes out 'SLIGHTLY' ahead is video playback, by just a few minutes.
 
Responses here are hilarious.

Apple partners with both Google and Microsoft for various cloud things. What they used (the hardware) has absolutely no bearing on the model or Apple's implementation of the model, no? That's kind of like saying "Apple sucks because they use HP in the data center (vs. Dell for example)".

Apple has an initiative called ACDC (Apple Chips in Data Center), which is how they are creating an infrastructure for AI training using their own M-series silicon.

Tensor vs. Nvidia is kind of like arguing over the brand of shovel you use to dig a trench. It does not mean Apple Intelligence is suddenly a rebranded Google Gemini.
 
Apple is reportedly developing their own AI chips. To buy themselves time, it makes sense that they are going with a rented solution rather than buying a lot of hardware they'd want to scrap in 2-3 years. And maybe they evaluated both Google and Nvidia and found the former more cost-effective.
 
You have to wonder: maybe it's Nvidia that refused to supply chips to Apple.

Everybody knows the H100/H200 are the best money can buy. Apple obviously has the money to buy them and to cut in line. So there was no reason for Apple not to go with Nvidia.
 
There is a history between Apple and Nvidia. Lots of bad blood over GPUs led them to AMD prior to Apple Silicon. If senior leaders who lived through that are still around, it may have affected the choice. Speculation, obviously, but speculation supported by history.
I know right, Apple hates NVIDIA so much lol
 
Securing Nvidia chips takes time and money; Apple only had one of the two, so they settled for what they could get. I'm concerned about the precedent this sets, because it caught them flat-footed and they're in damage-control mode. They compromised on the foundational piece instead of delaying, like they say they do: going with best of breed, being best rather than first. This is a move exuding an utter lack of confidence. It's got a shiny veneer, though! It leaves me concerned for Apple Intelligence. Let's hope that doesn't become a running gag like the mention of Siri.

Nah, Apple just went out of their way to not use NVIDIA! There is bad blood between Apple and NVIDIA from the past. I'm not sure there is anything to see here beyond that!
 
So while Apple may not have used user-attributed data, Google still knows exactly what Apple is doing in AI and exactly how its models perform, and Google can compare its own AI with Apple's since it has both.

Sounds like a win-win for Google; for Apple, not so much. I wonder how much intellectual property Apple is giving up just to say "We can do that too!"

This sounds like a pretty big failure at the executive level.
I’m sure those in the know know more than MR posters’ speculations.
 
I'm sure this came down to cost. Google probably gave Apple an extremely good deal to use their TPUs in exchange for a public reference (the white paper). This public reference will serve as a very powerful marketing tool for Google as it competes against Nvidia in the AI infra space.
 
Apple was already in bed with Google, but this is a new leap that cuts against the company's privacy stance. How can you promise to maintain privacy while using Google's servers? That's idiotic! Are they giving Google all our data? Even if they remove the identifiers, it's still fingerprintable.
Building off of and using German WW2 rocket technology doesn’t mean you have to subscribe to German WW2 ideology. 🤦🏼‍♂️
 
So not only was Apple reliant on Google services, but it was also comically lagging behind Nvidia’s hardware offerings. It makes you wonder what outdated tech Siri is still running on.
Apple doesn't make enterprise or data-center-grade AI accelerators. The premise of this comment is incorrect.

Some accelerators may work better with certain foundation models, certain token-size LLMs, conversational versus generative AI tasks, and so on.

AI chips are beginning to move towards specialization, in the same way Apple's ARM chips have specialized neighborhoods. Nvidia is the new Intel. They're on a completely unsustainable and inefficient brute-force architecture.
 
What an ABSOLUTE JOKE!

Tensor ... chips that are two years behind the industry standard and stuck on old Samsung Foundry manufacturing processes until next year! [...]
They are using Tensor Processing Units (Google's data-center AI accelerators), not the mobile SoC named Tensor.
 
For those that don't know the history of Nvidia on Apple's blacklist: a lot of MacBook Pro computers shipped with Nvidia discrete graphics chips that failed, resulting in a recall by Apple. Nvidia then denied any claims and left Apple paying for all the repairs and replacement parts, even paying Nvidia for the new parts. This was before Apple became this wealthy.
In addition, Nvidia demanded core system access for their slotted GPU cards. Apple refused and never used Nvidia again.
 
So they're using Google's Cloud Platform: Google's servers, data centers Google owns, a competitor to Amazon Web Services and Microsoft Azure. What's the big deal? A company the size of Apple is going to have its own dedicated block that only feeds them. When an enterprise that large shows up, you move mountains to set things up to their spec, not yours. The term is "sticky," and these are sticky services; you work hard to add additional services to the relationship to keep it that sticky. There's no concern about leaks or feeds to Google; that would be a massive reputational risk with every enterprise client should they ever even take a peek.
 
Their consumer GPUs have had AI tensor cores for about four years now. That's why they have grown so significantly recently: they saw where things were headed. Everyone else jumped on the bandwagon, and Apple is last to get onboard.
I don't know that this last statement is entirely accurate. Apple has been shipping NPUs in its consumer SoCs since 2017 for phones, and since 2020 for the Mac. You can quibble over the pros and cons of NPUs versus tensor cores, but both give the A-series and M-series a boost for AI and machine learning tasks (as long as they're paired with enough RAM).

On the software side it's certainly fair to critique Apple for being late to the LLM and generative-AI bandwagon, but they were early adopters of machine learning and related technologies, including shipping developer-accessible APIs to leverage the hardware. So it's not like they've been standing still entirely.
 
Really surprised at the ignorance of many posts here. Samsung's Galaxy AI uses Google. If Apple's offering is the same as theirs, what's the problem? There isn't even a baseline to compare things to.
 