
MacRumors

macrumors bot
Original poster


Nvidia has become one of the world's most valuable companies thanks to strong demand for its artificial intelligence (AI) server chips from big tech companies like Amazon, Microsoft, and Google. However, one tech giant that is not a major Nvidia customer is Apple, and a new report attempts to explain why this might be.


The Information's Wayne Ma today outlined Apple's historically "bumpy relationship" with Nvidia, but much of the bad blood dates to the 2000s and early 2010s, when Steve Jobs was still CEO and Macs used Nvidia graphics. It is unclear how much these past issues matter today, if at all, and the report acknowledges that Apple's current relationship with Nvidia "isn't entirely acrimonious." Two examples of the companies getting along include Apple's recent collaboration with Nvidia on machine learning research, and Nvidia showcasing the Apple Vision Pro when it announced a new software framework earlier this year.

Apple has mostly rented access to Nvidia GPUs through the cloud from companies like Amazon and Microsoft, and it is reportedly developing its own AI server chip as a longer-term solution, but neither of those things proves that Apple still has an "allergy" to Nvidia, as the report puts it. As has been the case for more than a decade, and even more so in recent years, Apple simply wants to develop as many in-house chips and technologies as possible, for reasons including lower production costs, improved integration between hardware and software, and reduced reliance on external suppliers. So, this seems less about Apple avoiding Nvidia in particular, and more about Apple owning the whole widget in general.

This trend has been playing out for many years now. In addition to Apple long designing its own iPhone chips, the company started releasing its own Mac chips in 2020, in a transition away from Intel. Apple's long-rumored 5G modem is expected to begin rolling out in iPhones next year, in a move away from its current supplier Qualcomm. Apple is also reportedly developing its own Wi-Fi and Bluetooth chip, a move that will impact Broadcom.

So while Steve Jobs may have once pretended an Nvidia executive was no longer in the room during a meeting, as the report states, it seems most likely that Apple simply has no need to directly purchase GPUs from Nvidia. Apple is clearly fine with renting access to the GPUs from cloud providers until its in-house chip is ready.

The report is nevertheless an interesting read, and it reinforces how Jobs was very good at holding a grudge when he was unhappy with a situation.

Article Link: Apple's Historically 'Bumpy Relationship' With Nvidia Detailed in Report
 
It's clear though that Apple was pissed when it was developing Metal and NVIDIA wouldn't agree to give it the hardware access it wanted. That issue, and the whole situation surrounding NVIDIA's failing graphics chips in MacBook Pros, soured things too. Maybe time has healed old wounds, but Apple tends to be a company that has the memory of an elephant and doesn't let go easily.
 
The report is nevertheless an interesting read, and it reveals how Jobs was very good at holding a grudge when he was unhappy with a situation.
That's probably one of the biggest reasons why, even if he were still alive, he would have given up the CEO job years ago.
I mean, just imagine Steve trying to navigate today’s landscape.
Imagine if, instead of pretending an Nvidia executive wasn't in the room, it was 2017 and he was pretending a certain US president wasn't in the room.
Probably wouldn’t have ended the same way.
 
Then, Apple, sign the fscking NVIDIA web drivers and allow MILLIONS of older Macs to be spared the 'recycling' heap using OCLP! Show your users, don't just say the words.
What words do you think Apple said? You do realize this report wasn't written by Apple, right? What did Apple actually say that has you feeling slighted?

What Macs do you actually have that you want to run with an NVIDIA board? What version of macOS are you running on them? What apps? How old are those machines? Unless you can provide specifics, this sounds like a spectacular non-problem.

You may want to pick a different hill to... climb. Complain about the charging port on the Magic Mouse instead. Something else, please.
 


Well, I think it would only benefit Apple to allow Nvidia to develop drivers for the Mac Pro, on select GPU models. Who wouldn't love to have a couple of A100s or L40S cards running local LLMs on their Mac Pro? Either Apple opens up its slots to Nvidia, or it makes its own "affordable" LLM accelerator card with the chip it's planning to put in its servers (rumored to be basically an M5 with a boatload of NPUs for inference, and light on the CPU/GPU cores).
 
Who would've thought that a tantrum over an Nvidia leak during the G4 days would still be hobbling Apple today? Shame, really.
 
I'm not taking sides here. The first Mac mini shipped with an ATI Radeon 9200, or at least a 9200 with many features disabled and clocked down so it wouldn't overheat and would cost less. Of course, the firmware it shipped with had a bug that made it incapable of driving a high-resolution monitor (at the time) over DVI-D, so it had to fall back to VGA.

It was fixed in a firmware update that Apple decided never to purchase from ATI, but the whole tale underscores the point that dollar cost was the issue at Apple, and some hardware decisions mean making cuts somewhere. How many times has keeping to that thermal envelope ripened into a sour experience?
 
That is what happens when Apple engineers think the only use for Apple devices is email and social media.
 
Then, Apple, sign the fscking NVIDIA web drivers and allow MILLIONS of older Macs to be spared the 'recycling' heap using OCLP! Show your users, don't just say the words.

Exactly this. All these Apple zealots love to hug Apple from the back, but we have amazingly good, still-working computers that can't be upgraded because there's no graphics driver. To this day there is no Nvidia graphics driver for Catalina. At some point during one of the updates, Apple even disabled the last Nvidia driver in macOS Monterey: when I look at my graphics drivers now, the Nvidia one is grayed out, even though I installed it myself when I installed that OS.

I wanted to upgrade my old iMac years ago; it's still in use 13 years on. No way I'm tossing it and buying another Mac, so good luck to Apple there.

Lastly, there are already rumors that Apple's going to start doing something different with its GPUs. Thankfully I've been super tech-oriented for 30 years now, and it's not rocket science to see that their integrated approach has already started hitting walls and can't hold a candle to discrete graphics. That was obvious years ago, but the CPUs were doing such amazing work, and still are; even they can't cover for weak graphics, no matter how efficient the chip is.

I'm very, very interested to see how Apple's going to handle GPUs over the next 5 years. And many people forget that the GPU currently pulls its RAM from the overall memory pool. My laptop cost $3,400 and I still have to pay for even more overpriced RAM just to make sure I'm covered for the heavy apps I use, like After Effects. It's absolutely ridiculous.
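If you want to see what I mean, here's a rough Swift sketch using Metal; the 1 GB allocation is just an illustrative number I picked, not anything Apple specifies.

import Metal

// On Apple silicon the GPU has no dedicated VRAM. A buffer created with
// .storageModeShared lives in the same physical RAM the CPU uses, so every
// byte the GPU claims is a byte the rest of your apps no longer have.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}
print("Unified memory: \(device.hasUnifiedMemory)") // true on Apple silicon

// Allocate 1 GB that the CPU and GPU both address directly, with zero copies.
let oneGB = 1 << 30
guard let buffer = device.makeBuffer(length: oneGB, options: .storageModeShared) else {
    fatalError("Allocation failed")
}

// The CPU can write straight into the memory the GPU will read.
buffer.contents().storeBytes(of: Float(42.0), as: Float.self)

// How much of the shared pool Metal recommends the GPU use at most.
print("Max GPU working set: \(device.recommendedMaxWorkingSetSize) bytes")

Run that on an M-series Mac and watch Activity Monitor: that gigabyte comes straight out of the same pool After Effects and everything else has to share.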

But let the Apple riders ride. I'm a Mac guy through and through, but I absolutely despise how Apple treats its customers. And this is exactly why I don't subscribe to any of their services and don't use any of their apps: no Final Cut, no Keynote, no nothing. Been there, done that, and I'm not having the rug pulled out from under me again.

That's my 2c.
 
You've put a spotlight on exactly what's wrong with modern Apple silicon: the lack of discrete GPUs. Many workloads benefit from them or flat-out require them. It's silly for Apple not to make an allowance for that, but I selfishly chalk it up to spite toward gamers.
 
It's simple. Do you like buying from Nvidia? No, because they're too expensive. So you buy AMD graphics cards instead.
That's precisely why Apple, Microsoft, Amazon, and other major tech companies buy from AMD or build their own chips.
 
For the immediate future, Apple needs Nvidia chips for its AI servers, but definitely not for long. As soon as Apple's chip is ready, there will be a shift to in-house chips for its AI servers, maybe in 2 to 3 years.
 