This contract with AMD... how long will this last? Missing NVIDIA GPUs in the Macbook Pro.

Good luck, you’ll be waiting forever. If Apple is planning on making their own CPUs work in their laptops, I would pretty much guarantee they will eventually be doing their own GPU as well. They already use their own in their A-series chips for their phones, so why not in laptops as well?
 
The one reason I think NVIDIA has an upper hand is CUDA. None of the other GPUs have support for this.
 
Aren’t CUDA cores some marketing gimmick? Either way, good luck waiting.
CUDA is very real and widely used. In theory, AMD cards should be as fast or faster for compute. In practice, though, I think they suffer from a lack of software support, whereas for CUDA there are tons of software and libraries for everything. For apps that do tune for AMD, they seem to be quite competitive; surely we still remember cryptocurrency mining, for example. There's no particular reason why this couldn't be the case for AI and ML too. After all, it's all just lots of matrix multiplication and vector ops, isn't it? But Nvidia has the mindshare. People ask for CUDA, not for compute.
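To illustrate that last point with a toy sketch (plain NumPy, no GPU libraries involved; the sizes and names here are made up for illustration): the forward pass of a small neural network really is nothing but matrix multiplies and elementwise ops, none of which is tied to any one vendor.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer network: the entire forward pass is just
# matrix multiplication plus an elementwise nonlinearity.
x  = rng.standard_normal((4, 8))    # batch of 4 inputs, 8 features each
W1 = rng.standard_normal((8, 16))   # layer 1 weights
W2 = rng.standard_normal((16, 3))   # layer 2 weights

hidden = np.maximum(x @ W1, 0.0)    # matmul + ReLU
logits = hidden @ W2                # matmul again

print(logits.shape)  # (4, 3)
```

Whether those matmuls run on a CUDA core, an AMD shader, or a plain CPU is a software-support question, not a hardware-capability one, which is exactly the mindshare problem described above.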
 
This contract with AMD... how long will this last? Missing NVIDIA GPUs in the Macbook Pro.
For all intents and purposes, the contract lasts forever. I would not expect any Nvidia GPUs anytime in the near or far future. If it happens, that will be great, but I would not hold out hope or wait to purchase. If Nvidia is a must, move on to a different laptop that uses them.
 
This contract with AMD... how long will this last? Missing NVIDIA GPUs in the Macbook Pro.

GeForce cards don’t support 10-bit color in macOS or Windows.

Only some full-screen games in Windows can force 10-bit color (actually 8-bit + dither on GeForce).
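To unpack what "8-bit + dither" means, here's a toy sketch (plain NumPy; the specific code values are chosen purely for illustration): a 10-bit level that falls between two 8-bit codes is approximated by noisily flickering between the two neighboring codes so that the average matches the original.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 10-bit code with no exact 8-bit equivalent:
# 513 / 4 = 128.25, which sits between 8-bit codes 128 and 129.
target10 = 513

# Dither: add noise *before* quantizing, so the output flips
# between 128 and 129 instead of sticking to one band.
dithered8 = np.round((target10 + rng.uniform(-2, 2, 100_000)) / 4)

# Every output is a legal 8-bit code...
print(np.unique(dithered8))        # [128. 129.]
# ...but the average, scaled back to 10-bit units, recovers ~513.
print(dithered8.mean() * 4)
```

Over a full frame (or successive frames), the eye averages those flickering codes, which is why dithered 8-bit can pass for 10-bit in casual viewing even though the panel never displays a true intermediate level.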

Also, Nvidia GPUs don’t fully support the latest Metal spec, according to Apple’s Metal documentation.
CUDA is very real and widely used. In theory, AMD cards should be as fast or faster for compute. In practice, though, I think they suffer from a lack of software support, whereas for CUDA there are tons of software and libraries for everything. For apps that do tune for AMD, they seem to be quite competitive; surely we still remember cryptocurrency mining, for example. There's no particular reason why this couldn't be the case for AI and ML too. After all, it's all just lots of matrix multiplication and vector ops, isn't it? But Nvidia has the mindshare. People ask for CUDA, not for compute.

In compute apps, clock for clock, AMD cards tend to be faster than Nvidia cards, but Nvidia ships higher clocks.

For example, the Vega 64 competes against the 1080 in gaming but against the 1080 Ti in compute applications.
The lack of CUDA and support for CUDA Deep Neural Networks (cuDNN) greatly limits the use of the Mac in hot tech areas like machine learning and AI.

Apple is developing their own ML and AI chips; the first versions are included in the latest Bionic processors. Performance per watt is industry-leading.
 
Until Nvidia supports 10-bit colour and Metal + Metal ML, it is not a good choice for the Mac. And with games being few and far between, I don't see Nvidia really being necessary on the Mac until they can meet pro needs, or Apple shoves a Quadro RTX into the next Mac Pro as a $10,000-$20,000 BTO option.
 
With the new Vega chips in the Macbook Pro, does that make it more competitive for compute compared to Nvidia?
 
With the new Vega chips in the Macbook Pro, does that make it more competitive for compute compared to Nvidia?

Yes and no. It falls short of a mobile 1060, which is essentially the desktop equivalent of a GTX 970, meaning the 1060 is the first mobile chip to give you real desktop-class performance. There's a member here who says the new Vega 20 chip will be almost as fast as the 1060, about 17% slower, depending on your needs. It's no slouch, but it's no desktop-class performer either.
 
For all intents and purposes, the contract lasts forever. I would not expect any Nvidia GPUs anytime in the near or far future. If it happens, that will be great, but I would not hold out hope or wait to purchase. If Nvidia is a must, move on to a different laptop that uses them.

I share the same feeling. I need a gaming machine, so I just built a desktop myself and slapped a GTX 1070 in it. Works like a charm.

For people who need an Nvidia GPU in a MacBook... I don't think waiting is a smart decision. Just get something else and be done with it.
 
Apple will never sell a gaming laptop; if you want a graphics-heavy machine, you will need to look elsewhere.
 
Apple is developing their own ML and AI chips; the first versions are included in the latest Bionic processors. Performance per watt is industry-leading.

Unless Apple makes these chips and libraries available on Windows and Linux platforms, they will not catch on with the research community. Even though their code is proprietary, Nvidia became the default ML GPU because you can go down to the store, buy a GTX 1080, slap it into a box you have lying around, and create and experiment with models at home or in your university lab.
 
Yeah, Nvidia doesn't support all of Metal. They would have to rewrite their drivers for that to work.
Unless Apple makes these chips and libraries available on Windows and Linux platforms, they will not catch on with the research community. Even though their code is proprietary, Nvidia became the default ML GPU because you can go down to the store, buy a GTX 1080, slap it into a box you have lying around, and create and experiment with models at home or in your university lab.

Plenty of Macs in research facilities. In fact, in the '80s science labs had mostly Macs because they needed the GUI.
 