Who would've thought that a tantrum over an Nvidia leak during the G4 days would still be hobbling Apple today? Shame really.

Doubt it’s about that. The main reasons will be:

1. AMD needed the money and so gave better discounts than Nvidia. Same as they do for console manufacturers.

2. Apple didn’t want CUDA on their platform. Otherwise every pro software maker would just use that and ignore Metal. Then Apple would be over a barrel to use Nvidia in perpetuity.
 
Because Metal is working out sooooooooooooooo well for attracting Game Developers and games.

Maybe, just maaaaybe if Apple was a bit more willing to play ball with Nvidia, they could actually have that "Mac gaming ecosystem" they have been talking about for.......20 years?

But also the reality is, like it or not, that Nvidia GPUs still take the cake for a lot of "Professional" workflows. I'm not suggesting Apple needs to ship Nvidia dGPUs, but for the love of Cthulhu, just at least sign the drivers so we can use eGPUs (or a dGPU in the Mac Pro).
 
One of the biggest reasons why, even if he were still alive, he would have probably given up the CEO job years ago.
I mean, just imagine Steve trying to navigate today’s landscape.
Imagine that instead of pretending an Nvidia executive wasn’t in the room, it was 2017 and he was pretending a certain US president wasn’t in the room.
Probably wouldn’t have ended the same way.
With people like Jobs still in the world, the narrative would be a lot different. Companies push narratives too, not just media. Apple has done everything to push an LGBT agenda, which I’m pretty sure Jobs wouldn’t have done at all, not even for a second.
 
Because Metal is working out sooooooooooooooo well for attracting Game Developers and games.

Maybe, just maaaaybe if Apple was a bit more willing to play ball with Nvidia, they could actually have that "Mac gaming ecosystem" they have been talking about for.......20 years?

Apple has never given a **** about gaming on the Mac, despite the occasional WWDC segment.

But also the reality is, like it or not, that Nvidia GPUs still take the cake for a lot of "Professional" workflows.

I agree. That’s why Apple don’t want CUDA on the Mac, for the reason I gave.

I'm not suggesting Apple needs to ship Nvidia dGPUs, but for the love of Cthulhu, just at least sign the drivers so we can use eGPUs (or a dGPU in the Mac Pro).

Apple care even less about Intel Macs than they do about gaming.
 
After seeing first-hand what AI hardware is and what it requires, I'm VERY happy Apple moved on to ARM chips for everything. 3,200 watts per PSU, and it has 6 of them (over 19 kW), to power one of those H200 server racks. Don't get me wrong, it's powerful and can do the job. It also costs a fortune, and that's if you can even get them. I'll take the future M chips over Nvidia at this point: sipping power, and performing VERY well.

What I want is for Apple to create a GPU card based on the M-series chips. Go crazy with 400 GPU cores and 1TB of RAM. They could charge $10k for it and it would be worth it.
 
Because Metal is working out sooooooooooooooo well for attracting Game Developers and games.

Maybe, just maaaaybe if Apple was a bit more willing to play ball with Nvidia, they could actually have that "Mac gaming ecosystem" they have been talking about for.......20 years?

But also the reality is, like it or not, that Nvidia GPUs still take the cake for a lot of "Professional" workflows. I'm not suggesting Apple needs to ship Nvidia dGPUs, but for the love of Cthulhu, just at least sign the drivers so we can use eGPUs (or a dGPU in the Mac Pro).
What does CUDA have to do with gaming?
 
I wasn’t implying that CUDA = Gaming.
Two separate thoughts (two separate paragraphs):

1) Nvidia GPUs are top dog for gaming.
2) Separately, but also true, Nvidia GPUs take the cake for performance in a lot of Professional workflows.
 
I was referring to using an Nvidia GPU with M-Series chips.

That’ll take a bit more than just signing drivers. There have been no Nvidia Mac drivers since about High Sierra / the GTX 980, if memory serves.

The whole AS architecture is built around unified memory, and applications expect it to be there. Apple aren’t about to provide alternative code paths for a tiny % of their users. They’d probably just prefer you move to Windows.
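To illustrate what I mean by "applications expect it to be there", here's a rough sketch of my own (assuming an Apple silicon Mac and skipping error handling) of how Mac apps lean on unified memory through Metal: the CPU writes straight into a shared buffer the GPU reads, with no staging copies. A discrete card hanging off PCIe would need managed or private buffers plus explicit blits, i.e. exactly the alternative code path Apple won't maintain.

```swift
import Metal

// Sketch only: assumes an Apple silicon Mac, no error handling.
guard let device = MTLCreateSystemDefaultDevice(),
      let buffer = device.makeBuffer(length: 1024 * MemoryLayout<Float>.stride,
                                     options: .storageModeShared) else {
    fatalError("No Metal device available")
}

// The CPU writes directly into the same memory the GPU will read -- no copy, no blit.
let values = buffer.contents().bindMemory(to: Float.self, capacity: 1024)
for i in 0..<1024 { values[i] = Float(i) }

// True on every Apple silicon GPU; a hypothetical PCIe/eGPU card would report
// false and force apps back onto managed/private buffers and explicit transfers.
print("Unified memory:", device.hasUnifiedMemory)
```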
 
In the late 2000s, Nvidia sold Apple and other computer manufacturers (including Dell and HP) GPUs that turned out to have a manufacturing defect. This issue revolved around certain Nvidia graphics chips that were prone to overheating due to flaws in the soldering material and chip packaging. The defect caused graphical glitches, system crashes, and in severe cases, complete hardware failure. Over time, the chips would overheat, leading to physical deterioration of the solder joints.

Nvidia initially set aside $200 million in 2008 to address warranty claims and repairs related to the faulty chips. However, there was widespread criticism of Nvidia for not clearly disclosing the full extent of the problem. Apple acknowledged the issue in 2008 and offered free repairs for MacBook Pro models affected by the faulty Nvidia chips. These repairs were available even for systems no longer under warranty, and Apple released support articles informing users of the defect.

The defective chips led to lawsuits against Nvidia, as well as negative press and customer dissatisfaction. While Nvidia took financial responsibility for the issue, the problem highlighted the challenges of complex hardware manufacturing and the need for rigorous quality control.
 
I believe it was called Bump-gate at the time (Sony had similar issues with the Nvidia GPU in the PS3). I assume the article’s mention of a “bumpy relationship” is a reference to that problem.
 
2024 Macs should have an M5 with an RTX 4090 option. I want to be able to game on a Mac without having to buy a garbage Windows PC.
 
You've put a spotlight on exactly what's wrong with modern Apple silicon: the lack of discrete GPUs. Many workloads benefit from, or flat-out require, one. It's silly for them not to make an allowance for it, but I selfishly chalk it up to being spiteful to gamers.
…Gamers? Really? Their hardware and high-end GPUs have far more meaningful uses than gaming. They absolutely need to offer better graphics than integrated graphics can deliver: the various creative professional use cases Apple is preferred for currently have to resort to a complementary PC rig instead.

The gamer segments that are very vocal on social media, including forums, often don't have the non-entertainment use cases that would justify consistently affording Apple hardware, which is primarily made for productivity.

4090s like mine, coveted by 4K+ gamers, as well as the A-series cards, are all primarily made for prosumer use cases beyond games, after all.

Nonetheless, professionals game too.

Ultimately, Apple isn't worried about an audience it hasn't primarily catered to for many years now. It's a low-margin segment, and creating content for that use case is something Apple has, by no coincidence, not invested much in compared to other mediums.
 
I wasn’t implying that CUDA = Gaming.
Two separate thoughts (two separate paragraphs):

1) Nvidia GPUs are top dog for gaming.
2) Separately, but also true, Nvidia GPUs take the cake for performance in a lot of Professional workflows.

You brought up Metal...."Because Metal is working out sooooooooooooooo well for attracting Game Developers and games."

What exactly would be different if there were Nvidia GPUs in, let's say, a MacBook Pro?
They'd be using Metal anyway...

In terms of PPW, Apple's own GPUs are ahead of Nvidia right now.
 
2024 Macs should have an M5 with an RTX 4090 option. I want to be able to game on a Mac without having to buy a garbage Windows PC.

There are few games on the Mac, and no support for PCIe GPUs. I’m afraid a ‘garbage PC’ (or console) is your only option. Complaints about Windows are overblown anyway, if you use Pro and set it up properly.

In terms of PPW, Apple's own GPUs are ahead of Nvidia right now.

In the same way that a 1L engine is more fuel-efficient than a 6L V8. But sometimes you just want the horsepower.
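To put some (made-up) numbers on that analogy: efficiency and absolute throughput are different axes, and a job with a deadline cares about the second one. The figures below are purely illustrative, not benchmarks of any real chip.

```swift
// Purely hypothetical numbers, just to show perf-per-watt vs. raw throughput.
struct GPU { let name: String; let tflops: Double; let watts: Double }

let efficientSoC = GPU(name: "efficient SoC GPU", tflops: 15, watts: 40)   // hypothetical
let discreteCard = GPU(name: "big discrete card", tflops: 80, watts: 450)  // hypothetical

for gpu in [efficientSoC, discreteCard] {
    let ppw = gpu.tflops / gpu.watts
    print("\(gpu.name): \(ppw) TFLOPS/W, \(gpu.tflops) TFLOPS total")
}
// The SoC wins on TFLOPS/W; the discrete card still finishes the render sooner.
```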
 
You brought up Metal...."Because Metal is working out sooooooooooooooo well for attracting Game Developers and games."

What exactly would be different if there were Nvidia GPUs in, let's say, a MacBook Pro?
They'd be using Metal anyway...

In terms of PPW, Apple's own GPUs are ahead of Nvidia right now.
As mode11 mentions above, some tasks don't care about performance per watt.

You wouldn't take a Prius to the Dragstrip to compete. Sometimes you just need more power.
 
You mean 2025 Macs should have an M5 with an RTX 5090 option? :)
If Apple were to build a dedicated GPU based on the GPU in the M4 chips now, we would never need AMD or Nvidia GPUs. Have it support external PCIe enclosures, and any M-series Mac mini, Studio, or Mac Pro tower would be able to compete with the 4090 or future GPUs. It would come at a cost up there with AMD and Nvidia, but it would most likely be air-cooled and draw around 100 watts or less. Sell three variants of it: 100-, 200-, and 400-core cards.

Heck, if they "really" wanted to disrupt the x86 world, or AMD and Nvidia (now Intel too.... LOL) in the GPU space, make it PC-compatible.............. Linus would lose his $#!T..
Push Metal on the PC side too. They could completely upend the market. Since there are no real GPUs on the Qualcomm side of ARM (they are not as good as the M chips), they "could" start it there first. OR!! OR!!!
Build a console. I know it's not happening, but I think it could work if they did.
 
I wasn’t implying that CUDA = Gaming.
Two separate thoughts (two separate paragraphs):

1) Nvidia GPUs are top dog for gaming.
2) Separately, but also true, Nvidia GPUs take the cake for performance in a lot of Professional workflows.
This opinion would certainly have been more valid... in 2021. But today? The new Studio, coming out in a couple of months, will have performance AND a price comparable to a Windows PC with Nvidia.

The M4 Max already performs like a 4070. I have an MBP with one and can confirm.

I don't know if you've purchased a PC lately, but a prebuilt 4070 system costs close to $2K, with a much slower CPU (believe me, I just bought one as a present). A base Studio with the M4 Max will also be $2K. The Ultra? It's a 4090 for 3D workflows. Nvidia is done on Apple.

Apple just needs to support or buy a game developer AND add OpenVR or some sort of native VR support. The GPU-side "catch-up" is complete.
 