Where are the NVIDIA GPUs for the MacBook Pro?


Closingracer

macrumors 68040
Jul 13, 2010
3,472
1,372
This contract with AMD... how long will this last? Missing NVIDIA GPUs in the Macbook Pro.
Good luck, you'll be waiting forever. If Apple is planning on making their own CPUs work in their laptops, I would pretty much guarantee they will eventually be doing their own GPUs as well. They already use their own GPUs in their A-series chips for their phones, so why not in laptops as well?
 

Starfyre

macrumors 68030
Original poster
Nov 7, 2010
2,738
970
The one reason I think NVIDIA has an upper hand is CUDA. None of the other GPUs have support for this.
 

CodeJoy

macrumors 6502
Apr 3, 2018
400
586
Aren't CUDA cores some marketing gimmick? Either way, good luck.
CUDA is very real and widely used. In theory, AMD cards should be as fast or faster for compute. In practice, though, I think they suffer from a lack of software support, whereas for CUDA there's tons of software and libraries for everything. For apps that do tune for AMD, they seem to be quite competitive. Surely we still remember cryptocurrency mining, for example. There's no particular reason why this couldn't be the case for AI and ML too. After all, it's all just lots of matrix multiplication and vector ops, isn't it? But Nvidia has the mindshare. People ask for CUDA, not for compute.
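Just to make the "it's all matrix multiplication" point concrete, here's a minimal sketch (assuming PyTorch purely as an example framework; the sizes are made up) of how the same compute runs on whatever backend happens to be present. The math is identical; only the device string and the driver stack underneath change.

```python
# Minimal sketch (assumes PyTorch): the same matrix multiply on whatever backend exists.
import torch

# "cuda" covers Nvidia cards (and AMD cards on ROCm builds of PyTorch); otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(2048, 2048, device=device)
b = torch.randn(2048, 2048, device=device)

# The core of most ML workloads: a big matrix multiplication.
c = a @ b

print(f"Ran a {a.shape[0]}x{a.shape[1]} matmul on {device}")
```

The kernel is the same everywhere; what differs is how much of the surrounding library ecosystem is tuned for each vendor, which is exactly the mindshare problem.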
 

maflynn

Moderator
Staff member
May 3, 2009
63,843
30,362
Boston
This contract with AMD... how long will this last? Missing NVIDIA GPUs in the Macbook Pro.
For all intents and purposes, the contract lasts forever. I would not expect any Nvidia GPUs anytime in the near or far future. If it happens, that will be great, but I would not be holding out hope or waiting to purchase. If Nvidia is a need, then move on to a different laptop that uses them.
 

SoyCapitanSoyCapitan

macrumors 601
Jul 4, 2015
4,484
2,497
Paris
This contract with AMD... how long will this last? Missing NVIDIA GPUs in the Macbook Pro.
GeForce cards don't support 10-bit color in macOS or Windows.

Only some full-screen games in Windows can force 10-bit color (actually 8-bit + dither on GeForce).

Also, Nvidia GPUs don't fully support the latest Metal spec, according to Apple's Metal documentation.
CUDA is very real and widely used. In theory, AMD cards should be as fast or faster for compute. In practice, though, I think they suffer from a lack of software support, whereas for CUDA there's tons of software and libraries for everything. For apps that do tune for AMD, they seem to be quite competitive. Surely we still remember cryptocurrency mining, for example. There's no particular reason why this couldn't be the case for AI and ML too. After all, it's all just lots of matrix multiplication and vector ops, isn't it? But Nvidia has the mindshare. People ask for CUDA, not for compute.
In compute apps, clock for clock, AMD cards tend to be faster than Nvidia's, but Nvidia ships higher clocks.

For example, the Vega 64 competes against the 1080 in gaming but against the 1080 Ti in compute applications.
The lack of CUDA and support for CUDA Deep Neural Networks greatly limits the use of the Mac in hot tech areas like Machine Learning and AI.
Apple is developing their own ML and AI chips; the first versions are included in the latest Bionic processors. Performance per watt is industry-leading.
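On the cuDNN point above: most deep-learning frameworks route their convolutions through cuDNN when an Nvidia GPU is present and fall back to generic kernels otherwise, which is what the "greatly limits" argument boils down to. A minimal probe, assuming PyTorch as a stand-in (the layer sizes here are arbitrary), looks roughly like this:

```python
# Rough sketch (assumes PyTorch): probing the CUDA/cuDNN stack that ML frameworks lean on.
# On a Mac with an AMD GPU, both checks simply come back negative.
import torch

print("CUDA available: ", torch.cuda.is_available())
print("cuDNN available:", torch.backends.cudnn.is_available())
print("cuDNN version:  ", torch.backends.cudnn.version())  # None when cuDNN is absent

# A convolution like this is dispatched to cuDNN kernels on Nvidia hardware,
# and to generic CPU implementations everywhere else.
conv = torch.nn.Conv2d(3, 16, kernel_size=3)
x = torch.randn(1, 3, 224, 224)
y = conv(x)
print("Conv output shape:", tuple(y.shape))
```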
 

ruka.snow

macrumors regular
Jun 6, 2017
158
386
Until Nvidia supports 10-bit colour and Metal+MetalML, it is not a good choice for the Mac. And with games being few and far between, I don't see Nvidia really being necessary on the Mac until they can meet pro needs or Apple shoves a Quadro RTX into the next Mac Pro as a $10,000-$20,000 BTO option.
 

Starfyre

macrumors 68030
Original poster
Nov 7, 2010
2,738
970
With the new Vega chips in the Macbook Pro, does that make it more competitive for compute compared to Nvidia?
 

darksithpro

macrumors 6502a
Oct 27, 2016
582
4,491
With the new Vega chips in the Macbook Pro, does that make it more competitive for compute compared to Nvidia?
Yes and no. It falls short of a mobile 1060, which is essentially the desktop equivalent of a GTX 970, meaning the 1060 is the first mobile chip to give you real desktop-class performance. There's a member here who says the new Vega 20 chip will be almost as fast as the 1060, about 17% slower, depending on your needs. It's no slouch, but it's not a desktop-class performer either.
 

NickPhamUK

macrumors 6502
May 6, 2013
356
197
For all intents and purposes, the contract lasts forever. I would not expect any Nvidia GPUs anytime in the near or far future. If it happens, that will be great, but I would not be holding out hope or waiting to purchase. If Nvidia is a need, then move on to a different laptop that uses them.
I share the same feeling. I need a gaming machine, so I just built a desktop myself and slapped a GTX 1070 into it. Works like a charm.

For people who need an Nvidia GPU in a MacBook... I don't think waiting is a smart decision. Just get something else and be done with it.
 

MacNut

macrumors Core
Jan 4, 2002
21,542
7,801
CT
Apple will never sell a gaming laptop; if you want a graphics-heavy machine, you will need to look elsewhere.
 

jerryk

macrumors 601
Nov 3, 2011
4,841
2,397
SF Bay Area
Apple is developing their own ML and AI chips; the first versions are included in the latest Bionic processors. Performance per watt is industry-leading.
Unless Apple makes these chips and libraries available on Windows and Linux platforms, they will not catch on with the research community. Even though their code is proprietary, Nvidia became the default ML GPU because you can go down to the store, buy a GTX 1080, slap it into a box you have lying around, and create and experiment with models at home or in your university lab.
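As a rough sketch of that "slap a card in a box and experiment at home" workflow (again assuming PyTorch, with a made-up toy model and random data), moving an experiment onto whatever consumer GPU is installed is essentially one line:

```python
# Hypothetical toy experiment (assumes PyTorch): a tiny model trained on random data.
# The only lines that care about the GPU are the .to(device) call and the device= arguments.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
if device.type == "cuda":
    print("Training on:", torch.cuda.get_device_name(0))  # e.g. a store-bought GTX 1080

model = torch.nn.Sequential(
    torch.nn.Linear(64, 128),
    torch.nn.ReLU(),
    torch.nn.Linear(128, 1),
).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(512, 64, device=device)  # made-up inputs
y = torch.randn(512, 1, device=device)   # made-up targets

for step in range(100):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optimizer.step()

print("Final loss:", loss.item())
```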
 

CreeptoLoser

macrumors 6502
Jul 28, 2018
369
320
Birmingham, Alabama
Yeah, Nvidia doesn't support all of Metal. They would have to rewrite their drivers for that to work.
Unless Apple makes these chips and libraries available on Windows and Linux platforms, they will not catch on with the research community. Even though their code is proprietary, Nvidia became the default ML GPU because you can go down to the store, buy a GTX 1080, slap it into a box you have lying around, and create and experiment with models at home or in your university lab.
There are plenty of Macs in research facilities. In fact, in the '80s, science labs had mostly Macs because they needed the GUI.