
JesterJJZ

macrumors 68020
Original poster
Jul 21, 2004
Just watched the NVIDIA presentation. They're bringing some serious power with the RTX line. If Apple is really serious about professionals, these cards need to be supported in the next Mac Pro, so that means slots. This anti-NVIDIA thing needs to be squashed.
 
While it's true some people might have a need for a $10,000 card, not everyone is going to want one. The RTX line is deeply tied to ray tracing, so it's great for those who need it.

That's why the old Mac Pro form factor is important: so we can all fit in whatever we need to do our jobs.

I'm sure NVIDIA will release drivers for Mojave when it ships (whether or not they will support Turing is another question). Apple is banging hard on Metal 2 support.
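For anyone curious what "banging hard on Metal 2" actually looks like for compute work, here's a rough sketch of a trivial kernel dispatched through Metal from Swift. To be clear, this is purely my own illustration, not anything from Apple's docs or a shipping app, and the kernel name doubleValues is made up:

```swift
import Metal

// Sketch only: a throwaway compute kernel compiled and dispatched at runtime.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void doubleValues(device float *data [[buffer(0)]],
                         uint id [[thread_position_in_grid]]) {
    data[id] = data[id] * 2.0;
}
"""

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("No Metal-capable GPU available")
}

// Compile the kernel and build a compute pipeline for it.
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "doubleValues")!)

// Upload four floats, run one threadgroup over them, and read the results back.
var input: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: [])!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreadgroups(MTLSize(width: 1, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: input.count, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

let output = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print((0..<input.count).map { output[$0] })   // [2.0, 4.0, 6.0, 8.0]
```

Structurally it's the same shape as the CUDA equivalent (compile a kernel, bind buffers, dispatch, read back); the argument in this thread is really about which of the two APIs the next Mac Pro will expose.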
 
RTX is just the new nomenclature replacing 'GTX,' so the new consumer-grade cards will be the RTX 2080 and RTX 2070.

Regardless, nVidia hardware probably won't make it into Apple hardware; for whatever reason, Apple has been ignoring them for the last few years. AMD's GPU hardware is not competitive in terms of raw power, so I wonder whether AMD offers superior battery life in Apple's notebooks, or whether Apple simply doesn't want to maintain native drivers for two different brands of GPU. Either that or there is some political animosity between the two companies.
 
Rumor had it a while back that the Apple/AMD contract was an exclusive that should be ending in 2019. What that may legitimately mean for NVIDIA on the Mac is another topic.
 
Never. Going. To. Happen.

If you think Apple wants any part in supporting Nvidia's proprietary standards (CUDA, G-Sync, GameWorks, NVLink and so on), then you seriously misunderstand Apple's vision for computing.
 
And what is Apple's vision for computing in this context?

Just look at Apple's various SDKs (WebKit, CoreMotion, ARKit, HealthKit, CoreML, etc.) and software stacks: it wants as much control as possible. Most of the time, primarily on iOS, Apple has complete and unchallenged control. And particularly on the Mac, Apple takes existing open software standards and builds its own libraries on top of them.

Going forward, especially as it wants as much ubiquity as possible between iOS and macOS, that unchallenged control and that reliance on open standards will only become more important. If we just look at gaming -- never mind the other aspects of AR or GPU-assisted compute tasks such as ML -- Apple, I'm sure, wants to make iOS' vast gaming library available on macOS with as little hindrance as possible.

Nvidia have shown time and time again that they're not interested in open standards. They've been caught engaging in anti-competitive business practices in the PC and workstation market on numerous occasions. Apple wants no part of their drama.

That's not even talking about hardware features. If Apple wants to enable high-refresh-rate monitors on their Macs, do you think they want to waste their time and engineering resources paying for Nvidia's G-Sync modules? No chance. If anything they'll take FreeSync (or whatever VESA calls its official standard), absorb its features, add some modifications of their own, and then integrate that into something like a future T3 chip.


(actually, maybe an Apple T3 chip could provide a solution to integrate Thunderbolt with user-swappable GPUs...)
 
Unfortunately, this is why the Mac platform is dying a slow, painful death. Apple are too busy engineering themselves into an impossibly small niche.

Having machine learning, AR and all these cool technologies is great, but they're delivering them with B-grade hardware on an increasingly isolated platform that is less and less appealing to the very market those technologies are primarily aimed at. They don't have to adopt G-Sync or any of nVidia's software technologies, just license and use their best-in-market hardware.
 
We'll have to agree to disagree I think, and let things play out. :)

The Mac platform is hardly dying. I think we can all be accused of a bit of hyperbole on this subforum. Profits on the Mac lineup alone are equal to profits of entire companies.
The Mac's future is bright, so long as Apple delivers on its promise to release a new Mac Pro next year, updates its Mac mini, and releases updates/successors to the MacBook Air and iMac.

Those "cool technologies" are the future of computing, and will keep the Mac relevant and powerful for years to come.

With Nvidia, it's either you're all in or you're all out. There is no middle ground or compromise. I, for one, respect Apple's decision to steer clear of Nvidia and their foul play.
Sometimes you have to see the forest for the trees. While, yes, some extra gaming performance today is nice, it's not worth giving up Metal and having to submit to Nvidia's proprietary standards.

If you really want to game and use CUDA, then just buy a Windows PC/workstation and save yourself the heartache. Apple is locked into adopting AMD RTG graphics IP for the foreseeable future. We may even see that change someday to Intel graphics IP, especially since Raja Koduri (ex-Apple A-series graphics architect and ex-AMD RTG chief) and Jim Keller (ex-Apple A-series CPU architect and ex-AMD CPU lead engineer) are at the helm of Intel graphics now.

I'm sorry but with all the history behind the curtains between Apple, AMD, and Intel -- not to mention the giant cockup that was Nvidia's bumpgate -- I do not see Apple going back to Nvidia, ever.
 
I think there were a few stories about why Apple stopped working with Nvidia; the laptop GPU failures were one.
It may also be that AMD is giving Apple a really low price for parts.
 
Yeah, I think it would be a real shame if Apple didn't include these new GPUs as an option at least. For me personally, it's overkill; I'm not willing to pay $10,000 for just a GPU. But there are MANY people in my field of work who WOULD be willing to pay for that in their Mac Pros.
 
But Nvidia's "foul play" produces products that are better for users than AMD's. 1080s retail for less than the Vega 64, use less power, produce less heat, and can fit in a 20mm-thick laptop. Apps optimised for CUDA generally perform better than on AMD using OpenCL or Metal.

Being "locked in" to a better solution is hardly a cause for suffering.
 
It's not an 'either/or' situation though, is it? CUDA and Metal are not mutually exclusive, and most of the time the user can choose which they'd prefer to use for compute tasks. All of the other technologies nVidia is pushing are take-it-or-leave-it; for example, Apple wouldn't have to run G-Sync on nVidia hardware (it's insanely expensive anyway, their Retina displays are a much better fit, and macOS clearly is not a gaming platform).

When I say the Mac platform is 'dying,' I mean it's decreasing in relevance to users who need more horsepower than their notebooks can provide. Apple are providing firmware updates for 8-year-old Mac Pros right now to keep them relevant in Mojave - which is great for cMP users! But it's a bit worrying that they just don't seem to have any competent modern machine for this segment of their userbase.

I guess it just seems short-sighted to limit their choice of GPU compute power to AMD when AMD isn't even close to parity. And the argument about 'closed' and 'proprietary' technologies is a bit weak. Where's Thunderbolt on an AMD chipset? ;)
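To make the "user can choose" point concrete, here's a tiny sketch of how a pro app might expose a renderer preference and only offer the backends actually present on the machine. This is entirely my own illustration, not any real app's API, and the cuda case is left as a stub because probing it would go through NVIDIA's own runtime:

```swift
import Foundation
import Metal

// Sketch only: a made-up renderer preference, not any real app's settings API.
enum ComputeBackend: String {
    case metal, openCL, cuda
}

// Which backends this machine can actually use. Metal detection is real API;
// OpenCL/CUDA probing is stubbed out because it goes through vendor runtimes
// that this sketch doesn't link against.
func availableBackends() -> [ComputeBackend] {
    var found: [ComputeBackend] = []
    if MTLCreateSystemDefaultDevice() != nil {
        found.append(.metal)
    }
    // found.append(.cuda)   // hypothetical: only if an NVIDIA runtime/driver is installed
    return found
}

// The user's saved preference wins, but only if it's available on this machine.
let saved = UserDefaults.standard.string(forKey: "preferredRenderer") ?? ""
let preferred = ComputeBackend(rawValue: saved)
let backend = preferred.flatMap { availableBackends().contains($0) ? $0 : nil }
    ?? availableBackends().first
print("Compute backend:", backend?.rawValue ?? "CPU fallback")
```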
 
Apple has a large stable niche and they don’t need Nvidia. They could develop their own graphics accelerator without the bugs Nvidia’s drivers have. The Ax series GPU already has better performance per watt than AMD or Nvidia. They can scale it up for workstations and it would be ridiculously powerful and be upgradable.

If you want a third-party GPU, there are eGPUs and MacBook Pros for that. It's a great choice too. But Nvidia is not showing any progress with decent driver development.
 
The Ax series GPU already has better performance per watt than AMD or Nvidia. They can scale it up for workstations and it would be ridiculously powerful and be upgradable.

That's conjecture. Apple also thinks it can ship a wireless charging mat. Just because they can do one thing doesn't mean they're capable of scaling it to a different purpose.
 
You have a LOT more confidence in Apple's software engineers than most folks.
 
That did happen for Nvidia GPUs in 2008 MacBook Pros, but since then the following has also happened:
  • The 2011 MacBook Pro GPU failures that resulted in an extended warranty campaign involved AMD GPUs.
  • The 2013 Mac Pro GPU failures that resulted in an extended warranty campaign involve AMD GPUs.
So if they are really upset about a 2008 Nvidia recall, they are surprisingly fine about the 2011 and still-ongoing 2013 AMD problems.
 
Rumor had it a while back that the Apple/AMD contract was an exclusive that should be ending in 2019. What that may legitimately mean for NVIDIA on the Mac is another topic.

Although 2019 would line up nicely with the Mac Pro release, I don't think Apple not liking Nvidia and Nvidia not really liking Apple has changed at all.

That's conjecture. Apple also thinks it can ship a wireless charging mat. Just because they can do one thing doesn't mean they're capable of scaling it to a different purpose.

It's more realistic than most things you hear around here. They could actually scale up an A series GPU. The only real wildcard is that they've never shipped a discrete A series GPU. But it's a lot easier than scaling up a CPU.

You have a LOT more confidence in Apple's software engineers than most folks.

It's chipset engineering, not software or even hardware engineering. And Apple is actually pretty good at chipset engineering.

On the software side, they already have drivers for the A series. They just need to bring over the iOS drivers, which wouldn't be hard. There isn't too much software work that would need to be done.
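For what it's worth on the driver point, Metal on Mojave already reports per-GPU capabilities through feature sets, which is roughly how any A series GPU's differences would surface to Mac apps. A quick sketch; which feature sets a hypothetical scaled-up A series part would claim is pure speculation on my part:

```swift
import Metal

// Sketch: the Mojave-era capability query Metal exposes today.
// supportsFeatureSet(_:) and MTLFeatureSet are real API; the speculation is
// only about what a future Apple-designed workstation GPU would report.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal-capable GPU available")
}

print("GPU:", device.name,
      "| low power:", device.isLowPower,
      "| headless:", device.isHeadless)

let featureSets: [(String, MTLFeatureSet)] = [
    ("macOS GPUFamily1 v4", .macOS_GPUFamily1_v4),
    ("macOS GPUFamily2 v1", .macOS_GPUFamily2_v1),
]

for (label, set) in featureSets {
    print(label + ":", device.supportsFeatureSet(set) ? "supported" : "not supported")
}
```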

OpenGL would be the only loose end, as they don't have full OpenGL support on the A series GPUs. So Apple would have to announce that it's dropping OpenGL before that could happen.

Wait. Hmmmmm....
 
Although 2019 would line up nicely with the Mac Pro release, I don't think Apple not liking Nvidia and Nvidia not really liking Apple has changed at all.

I don't know where the whole "Nvidia hates Apple/Apple hates Nvidia" thing started. But may I point to something palpable, something that was actually said, that suggests "Apple might hate AMD"?

I think Apple might hate AMD because of the way they officially blamed themselves for designing themselves into a thermal corner with the trashcan Mac Pro.

So... wait, Apple blames themselves, not AMD.

I know.

But if you read into it, the fact that Apple points the finger at itself might be... ummm... a way of saying: yes, we are publicly pointing the finger at ourselves, but privately we all know why that is...

Do you know what I mean?

Or, am I reading too much into it?
 
Or, am I reading too much into it?

As has been commented by none other than Apple's current darling Matt Panzarino, the GTX 1080 was a pretty watershed product for content creation. In terms of price, performance, heat and power draw, AMD still doesn't have anything comparable. Look around; there are plenty of "I migrated to Windows to get access to GTX 1080s" stories from formerly Mac-based video producers.

That's gotta annoy Apple, when the absolute best they can get out of AMD for their most expensive desktop system is equivalent to the laptop GPUs their competitors can access.
 