
CarbonCycles

macrumors regular
Original poster
May 15, 2014
Wondering what the hive thinks of Apple using Nvidia's GPUs over AMD's (formerly ATI) GPUs. I recall Apple being a huge fan of OpenGL, but some of my work is leading me down the path of Nvidia's CUDA.

TIA
 

Personally, I believe there is not much chance. Apple and Nvidia don't seem to have a good business relationship as of late, and Apple has invested heavily in AMD GPUs across its hardware. And then, Nvidia doesn't really have anything better to offer in the bracket Apple is interested in. In the lower-TDP departments, AMD Polaris and Nvidia Pascal perform very similarly, and we will probably see a decent boost with a mobile low-TDP Vega next year...
 

Ugh, you are validating what my instincts have been dancing around. I hope this isn't the final nail in the coffin for me being an Apple customer :(
 

If that's sufficient for you, an eGPU is always a possibility. But if your work relies heavily on CUDA, then Apple doesn't offer any tools in that department.

Personally, I think that CUDA is a bad thing for the entire industry and I'd be happy to see it die. Then again, there is no question that Nvidia's marketing department was very smart in seizing the opportunity, and that there currently is no viable alternative, especially since Nvidia is actively sabotaging any other development that could make CUDA lose its ground. On Apple's platform, you can and should use Metal, which is just as user-friendly as CUDA (if not more so), but it still lacks some features.
 

I appreciate your information! I'll take a look at Metal to see what that's all about!
 
Built-in NVIDIA GPU odds = very low

I think Apple likes the deal they get from AMD as long as performance is somewhat good.

As others have mentioned, I think they'll point users who require NVIDIA toward the native eGPU support.
 
Sadly no, because Apple is Apple and they'll never admit they were wrong to cram in graphics cards with half the power of the current Surface Book 2 and charge you twice as much for it.
 
To be fair, Nvidia still doesn't have vastly more efficient GPUs than the Radeon Pro 560. The GTX 1060 Max-Q offers just about twice the performance for exactly twice the power consumption. If you need CUDA, that won't help, of course.
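Just to put numbers on that, here's a trivial back-of-the-envelope perf-per-watt calculation; the 2x performance figure and the TDPs are the rough values from this post, not measured benchmarks.

```python
# Rough perf-per-watt comparison; the 2x relative-performance figure and the
# TDPs are the approximate numbers quoted above, not measured benchmarks.
gpus = [
    {"name": "Radeon Pro 560", "tdp_watts": 35, "relative_perf": 1.0},
    {"name": "GTX 1060 Max-Q", "tdp_watts": 70, "relative_perf": 2.0},
]

for gpu in gpus:
    ppw = gpu["relative_perf"] / gpu["tdp_watts"]
    print("{}: {:.4f} relative performance per watt".format(gpu["name"], ppw))

# Both come out to ~0.0286 -- roughly equal efficiency, i.e. twice the
# performance for twice the power draw.
```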
 
I wish Apple would put a GTX 1060 in their notebooks, but I don't see that happening. I do machine learning, and right now CUDA is the only game in town for GPU acceleration. But AMD does have a new framework that is supposedly API-compatible with CUDA and open source, so maybe someday.
 
Same situation here, and tbh I don't like this one bit. Not just as an Apple user, but also because I think a situation where an entire engineering/scientific field depends on the products of a single vendor is inherently wrong. I also looked into that AMD framework you mention; hopefully it will be a serious attempt at giving Nvidia some competition (at the moment it looks very limited in terms of platform compatibility, though). But for the time being, there is unfortunately no CUDA alternative that really works.

Back on topic: very, very slim chance for Nvidia on the next MBPs. As others have mentioned, there simply aren't any Nvidia cards that can match or best AMD's offering in the power consumption bracket which fits the MBP.
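For what it's worth, the user-level code in most ML frameworks isn't CUDA-specific at all; the vendor lock-in lives in the framework's backend. Here's a minimal Keras sketch to illustrate (Keras is just one example of such a framework, and the toy data and layer sizes are made up):

```python
# Nothing in this script mentions CUDA: whether training runs on an Nvidia GPU,
# some other accelerator backend, or the plain CPU is decided by the installed
# framework build, not by this code.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Toy dataset: 1000 samples, 20 features, binary labels.
x = np.random.rand(1000, 20).astype("float32")
y = (x.sum(axis=1) > 10).astype("float32")

model = Sequential([
    Dense(64, activation="relu", input_shape=(20,)),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, batch_size=32)
```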
 
Sadly no, because Apple is Apple and they'll never admit they were wrong to cram in graphics cards with half the power of the current Surface Book 2 and charge you twice as much for it.

Not surprised there. Nvidia and Apple don't really go together.
 
Wondering what the hive thinks of Apple using Nvidia's GPUs over AMD's (formerly ATI) GPUs. I recall Apple being a huge fan of OpenGL, but some of my work is leading me down the path of Nvidia's CUDA.

Highly unlikely. I don't think the Apple + NVIDIA relationship is as strong as the Apple + AMD relationship. Apple has also had time to develop and improve its Radeon/FirePro GPU drivers for macOS, making hardware optimizations with each new iteration of AMD's GCN architecture. Apple has shown that, with proper drivers and optimizations, the GCN design is quite useful in certain applications.
 
Pretty much zero, as Nvidia is not likely to acquiesce to Apple's financial demands given Nvidia has no shortage of business. More realistically, as Apple has now effectively turned the MBP into the new MacBook Air, there is simply no possibility of the new chassis or power delivery being remotely capable of supporting even a moderately powerful GPU.

An eGPU is a partial solution, though not ideal for everyone, and disappointing as Apple's only answer, given the iMac Pro is a little bulky for the hand carry :)

Q-6
 
I wish Apple would put a GTX 1060 in their notebooks, but I don't see that happening. I do machine learning, and right now CUDA is the only game in town for GPU acceleration. But AMD does have a new framework that is supposedly API-compatible with CUDA and open source, so maybe someday.

Did you have a look at Apple's ML framework? https://developer.apple.com/machine-learning/ I don't do ML myself, so I have no idea how good it is, but it's something they use quite heavily across the OS.
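In case it helps anyone evaluating it: Apple's stack is mostly aimed at running trained models on-device (Core ML) rather than training them, and the usual workflow is to train elsewhere and convert with the coremltools Python package. A minimal sketch; the scikit-learn model, feature names, and file name are made up purely for illustration:

```python
# Train a tiny model with scikit-learn, then convert it to the Core ML format
# for on-device inference. Model, feature names and file name are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
import coremltools

x = np.random.rand(100, 3)
y = x @ np.array([1.5, -2.0, 0.5])  # synthetic target

sk_model = LinearRegression()
sk_model.fit(x, y)

# coremltools' scikit-learn converter produces an .mlmodel file that Xcode
# can bundle into an iOS/macOS app.
mlmodel = coremltools.converters.sklearn.convert(
    sk_model,
    input_features=["feat_a", "feat_b", "feat_c"],
    output_feature_names="prediction",
)
mlmodel.save("ToyRegressor.mlmodel")
```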
 
Not going to happen. I think Apple is content with the arrangements they have with AMD, and Apple has always been a company that didn't care to have the fastest GPU in their computers.
 
These responses are very disheartening. I remember, when I was very active in academia, that Apple was pushing their products at the academic/research fields, offering a user-friendly Linux kernel that was 1000x more useful in research applications than Windows would ever be.

In today's research field, a dedicated state-of-the-art GPU is almost a necessity. I'm almost afraid that Nvidia's and AMD's frameworks are going to turn this into the Betamax-versus-VHS war of years ago, where no one really wins.

I took a look at Metal 2, and it looks very promising, but it may already be behind the power curve given Nvidia's head start.

For you folks who are into ML, check out these blogs (if you haven't already):
https://machinelearning.apple.com/

Take note that they also used TensorFlow in some of their applications... does this mean they used an Nvidia card at one time?!?
 
Did you have a look at Apple's ML framework? https://developer.apple.com/machine-learning/ I don't do ML myself, so I have no idea how good it is, but it's something they use quite heavily across the OS.

The last thing anyone doing research in AI/ML wants to use is a framework tied to a particular computer company. The Nvidia situation is barely acceptable. Most people want to be able to run on any platform, which Nvidia allows, and at scale on big farms like AWS, Azure, GCE, etc.

For you folks who are into ML, check out these blogs (if you haven't already):
https://machinelearning.apple.com/

Take note that they also used TensorFlow in some of their applications... does this mean they used an Nvidia card at one time?!?

TensorFlow can take advantage of Nvidia CUDA, but CUDA is not a requirement. You can run TensorFlow just fine on a system without a dedicated GPU; it may just run 2 to 5 times slower.

But my guess is that Apple is running CUDA for their ML training. With TensorFlow they should be able to shift work off to Nvidia K40 and K80 GPU servers.
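To make that concrete, here's a minimal sketch in the TensorFlow 1.x style current as of this thread: the same graph is pinned to a GPU when one is available and quietly falls back to the CPU when it isn't.

```python
import tensorflow as tf

# Ask for the GPU; with allow_soft_placement the same graph silently runs on
# the CPU when there is no GPU (or no CUDA-enabled build) available.
with tf.device("/gpu:0"):
    a = tf.random_normal([2000, 2000])
    b = tf.random_normal([2000, 2000])
    c = tf.matmul(a, b)

config = tf.ConfigProto(allow_soft_placement=True,   # fall back to CPU if needed
                        log_device_placement=True)    # print where each op ran
with tf.Session(config=config) as sess:
    print(sess.run(c).shape)  # (2000, 2000)
```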
 
These responses are very disheartening. I remember, when I was very active in academia, that Apple was pushing their products at the academic/research fields, offering a user-friendly Linux kernel that was 1000x more useful in research applications than Windows would ever be.

I agree 100% with this. Having a Unix-based system with a polished GUI that "just worked" is what made the Mac a fantastic machine for a scientist, and despite some problems I find this is still the case today.

I had already gone through the material about Apple's ML framework but unfortunately, unless you're developing applications for their devices only, that type of solution is a no-go. It's already frustrating (and, as I said above, bad for both science and the industry) that one has to rely on the hardware of a single company for fast training. Deploying platform-locked models is literally the last thing you want.
 
These responses are very disheartening. I remember, when I was very active in academia, that Apple was pushing their products at the academic/research fields, offering a user-friendly Linux kernel that was 1000x more useful in research applications than Windows would ever be.


As I understand it, Mac OS X is BSD-based, not Linux.

Mac OS X is more directly based on NeXTSTEP, which is BSD-based and is the progenitor of Mac OS X. NeXTSTEP was developed by NeXT, which was founded by Steve Jobs. NeXT started development in 1985, with the first NeXTSTEP released in 1989. NeXT was later bought by Apple, bringing back Steve Jobs, and NeXTSTEP began the transition to Mac OS X under the code name Rhapsody.

Linux began development in 1991. The Linux kernel was created by Linus Torvalds.

OS X is more closely related to FreeBSD.
 
I think you are right. I stand corrected. Thank you.

 
It's already been stated but nobody really picked up on it, so I'll repeat: Nvidia doesn't currently make a chip suited to the design philosophy of Apple portables. Even the Max-Q 1060 requires 60-70 W versus the 35 W Radeon Pro 560. I guess you could underclock a 1050, but the closest chip for the thermal design of the MBP would be something like the MX150, which is really just a dGPU for the sake of it.

The Radeon Pro 5xx is really the best available chip for Apple portables right now.
 
This seems very timely:

http://timdettmers.com/2017/12/21/deep-learning-hardware-limbo/

Intel's new Nervana NNP chip sounds pretty amazing. What an awesome yet terrible time to be involved with all this...

A very good read and pretty much spot-on, thanks for sharing. As a machine learning and scientific computing guy, I have long been hoping to see some genuine competition in the field to combat Nvidia's price gouging, and this could be the right moment if other vendors play their cards well (pun intended).

Software support is the absolute key. Unfortunately the best hardware means nothing today if no programming/modeling framework supports it.
 

We were discussing this blog post in a meetup last week. The place has an AMD board running; they say it works pretty well, being API-compatible with CUDA. I brought up using a MacBook-level GPU, and they thought that class of processor would not be compatible.
 