MP 7,1: Four GPUs in the New Mac Pro?

Discussion in 'Mac Pro' started by JesterJJZ, Jun 6, 2019.

  1. JesterJJZ macrumors 68020

    Joined:
    Jul 21, 2004
    #1
    So, if the Vega II Duo ends up crazy expensive, like it might, I've been considering going with four AMD Radeon VII GPUs. There's room in the tower for them, but what about power? From what I can tell, there are two pairs of 8-pin connectors, with each pair supplying 300W, which is the power requirement of one Radeon VII.

    With 75W from each of four slots, plus 2×300W, plus 75W aux, that gives us 975W of usable power.
    So are we limited to 3 powered GPUs?
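    A minimal sketch of the budget arithmetic above, assuming the standard PCIe ratings of 75W per x16 slot and 150W per 8-pin aux connector (so each 8-pin pair supplies 300W). The slot and connector counts here are the poster's reading of the photos, not confirmed Mac Pro 7,1 specs:

```python
# Power-budget arithmetic from the post above.
# Counts are the poster's estimates, not confirmed 7,1 specs.

SLOT_POWER_W = 75    # PCIe spec: up to 75W drawn from a x16 slot itself
EIGHT_PIN_W = 150    # PCIe spec: up to 150W per 8-pin aux connector
AUX_W = 75           # the extra 6/8-pin header mentioned in the thread

def budget(slots=4, eight_pin_connectors=4, aux=AUX_W):
    """Total usable power for add-in cards under these assumptions."""
    return slots * SLOT_POWER_W + eight_pin_connectors * EIGHT_PIN_W + aux

def max_cards(total_w, per_card_w=300):
    """How many 300W cards (e.g. Radeon VII) fit in the budget."""
    return total_w // per_card_w

total = budget()       # 4*75 + 4*150 + 75 = 975
print(total)           # 975
print(max_cards(total))  # 975 // 300 = 3
```

    Under these assumptions the 975W figure checks out, and three fully powered 300W cards is indeed the ceiling.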
     
  2. ruka.snow macrumors regular

    Joined:
    Jun 6, 2017
  3. JesterJJZ thread starter macrumors 68020

    Joined:
    Jul 21, 2004
    #3
    That's from the MPX slot. An off-the-shelf GPU won't connect to that unless someone makes an adapter from the MPX slot to dual 8-pin or something.
     
  4. bsbeamer macrumors 68020

    Joined:
    Sep 19, 2012
    #4
    [Attached screenshot: Screen Shot 2019-06-06 at 5.18.35 PM.png]

    From the pictures it certainly SEEMS like there are four 8-pin power connectors labeled for slots 1, 2, 3, and 4, and one 6-pin or 8-pin for slots 5-8. Theoretically, this should be able to power 5+ GPUs, depending on your GPUs' power requirements. With the four straight x16 slots in use, slot 6 may be blocked.

    Not 100% clear if it's going to be limited to ONE 8-PIN PER PCIE SLOT (1, 2, 3, 4) based on the labels in the images. Also not 100% positive whether some "activation" is required to enable a connector, or whether a card must be physically present — such as a card in slot 2 in order to use the allotted slot-2 8-pin. Those are the types of details we're likely going to need to wait for answers on if you're not planning to use MPX.
     
  5. LK LAW macrumors member

    Joined:
    May 30, 2016
    #5
    They said there are 4× 8-pin PCIe power plugs available (PCIe 3.0 spec, not the higher-wattage PCIe 4.0 one).
     
  6. JesterJJZ thread starter macrumors 68020

    Joined:
    Jul 21, 2004
    #6
    Looks like two sets of 8-pin, 300W per set. Most powerful GPUs use two 8-pin connectors each.
     
  7. Slash-2CPU macrumors 6502

    Joined:
    Dec 14, 2016
    Location:
    New Orleans, USA
    #7
    That extra one at the top is maybe a 6-pin for powering the Afterburner ASIC?

    RED Rocket has a 6-pin.
     
  8. deconstruct60 macrumors 604

    Joined:
    Mar 10, 2009
    #8
    Unless Apple's film animation and pictures of the Afterburner card are deceptive, there is no aux power. It is pretty clear there is no substantive fan/blower on the card, nor does it have a "tall fin" system like the MPX modules. It is a simple, single-wide card. It only does one thing, so 75W is probably pretty good. It may be a bleeding-edge FPGA from Intel/Altera, since Apple bet on them for major portions of this system. That wouldn't be cheap, but it doesn't need tons of thermal headroom.

    For example
    ".. The FPGA is rated at 70 watts, and it is a full-blown Arria 10 GX 1150. This FPGA has its own cache memory and does not include ARM cores as etched next to the FPGA gates like some Arria and Stratix chips from the Altera line do. ..."
    https://www.nextplatform.com/2018/05/24/a-peek-inside-that-intel-xeon-fpga-hybrid-chip/

    Intel has "stuffed" an Arria 10 into a Xeon package for just another 70W. Similar thing: they can toss out the ARM (or any kind of base OS processor) and basically consume and return streams on very simple commands. (Arria 10 is a 20nm implementation. Agilex is 10nm and up to 40% less power, but not cheap. :) )


    That is a better candidate. There are probably also some other I/O cards and more modest GPUs that could need one (if both MPX bays were allocated to non-GPU hardware).
     
  9. deconstruct60 macrumors 604

    Joined:
    Mar 10, 2009
    #9
    If crazy expensive?
    The Radeon VII is based on the Instinct MI50 and it is ~$700 (and a decent chance that's close to at-cost).
    The Pro Vega II is based on the Instinct MI60 (the full 64 CUs). So the base cost is higher, and Apple (and/or AMD) is likely to put a 30% markup on it. At least $999 wouldn't be surprising. (Unless Apple is completely bending the volume cost curve for AMD, it could easily crack $1K. The whole Mac Pro system is priced with a "got plenty of money to spend" mentality.) The dual one is probably over $2K (more MI60 features turned on; can't find a public price for an MI60).


    I know "wait for Navi" is a bit thin, but there could be some cards that take just one 8-pin. That would mean four GPUs. Apple may not deliver drivers for those until 2020, but eventually.
     
  10. JesterJJZ thread starter macrumors 68020

    Joined:
    Jul 21, 2004
    #10
    Two Radeon VII now and upgrade to four Navi later would probably be ok.
     
  11. LK LAW macrumors member

    Joined:
    May 30, 2016
    #11
    Alternatively, each MPX bay can support:

    One full-length, double-wide x16 gen 3 slot and one full-length, double-wide x8 gen 3 slot (MPX bay 1)

    Or two full-length, double-wide x16 gen 3 slots (MPX bay 2)

    Up to 300W auxiliary power via two 8-pin connectors

    Guess I interpreted it the wrong way.
     
  12. vel0city macrumors member

    Joined:
    Dec 23, 2017
    #12
    Your estimate is actually quite encouraging - from the Mac Pro press release page:

    • Maxon’s Cinema 4D is seeing 20 percent faster GPU render performance when compared to a Windows workstation maxed out with three NVIDIA Quadro RTX 8000 graphics cards.

    These cards are $10k. So I was expecting the price of the GPU in the Mac Pro to be at least equal to the Quadro.
     
  13. h9826790 macrumors G5

    Joined:
    Apr 3, 2014
    Location:
    Hong Kong
    #13
    In that case, maybe installing only two dual-8-pin Navi cards makes more sense. There should be some more powerful cards that can also pull lots of power, not just same-performance cards with lower power consumption, especially since AMD is still behind in the performance game.
     
  14. beaker7 macrumors 6502a

    Joined:
    Mar 16, 2009
    #14
    Quadro RTX 8000 is around $5K
     
  15. Macpro2019 macrumors member

    Joined:
    Jun 7, 2019
    #15
    Can NVIDIA be supported if a driver is available, or are they "locked out" by Apple?
     
  16. bsbeamer macrumors 68020

    Joined:
    Sep 19, 2012
    #16
    NVIDIA "simply" needs a macOS driver for 10.14/10.15 to work with whatever GPUs or series of GPUs they determine appropriate or certified. I haven't seen any reports of the graphics API available in/for 10.15 DP1 yet. This was "supposed to" make creating certified/compatible GPU drivers for macOS a lot easier. We'll see, I guess.
     
  17. leman macrumors G3

    Joined:
    Oct 14, 2008
    #17
    The full Vega 20 chip probably has very low yields to begin with. Add 32GB HBM2 and the prices will be through the roof. I’d be surprised if a single card is under 5k.
     
  18. Macintosh IIcx macrumors 6502

    Joined:
    Jul 3, 2014
    Location:
    Denmark
    #18
    5K for the single card?! I certainly hope not! :(

    Historically, the Pro AMD Vega cards have not really been that pricey on Macs, but the Vega 20 with 32GB HBM2 will be another beast, I'll agree to that. I'm thinking $1.5K for the upgrade to a single card, $3K+ for the Duo.
    --- Post Merged, Jun 7, 2019 ---
    I'm also getting a feeling that Apple might look at that high base price as where they will earn most of their Mac Pro 2019 money, maybe less so on the CPU and GPU upgrades. Definitely looking forward to the GPU price reveal with anxiety and some stress! :eek:
     
  19. deconstruct60 macrumors 604

    Joined:
    Mar 10, 2009
    #19
    I watched the DriverKit session at WWDC 2019, and the new stuff isn't a magical "get out of jail free" card for Nvidia at all. Developing system extensions is easier now because Apple is in the process of kicking all 3rd-party code out of the kernel and into "user space" mode on the processors. So a driver can die with no kernel-panic impact on the system. The upside to being in user space is that driver developers can use normal LLDB to debug their drivers (as opposed to pragmatically needing two Macs to debug their work). So writing code in a safer manner should get easier for developers.

    Eventually all current kexts will be banned in future versions of macOS. DriverKit only really covers HID (keyboard/mouse), USB, and one other area that slips my mind at the moment. It doesn't cover IOKit's "display" class yet. What DriverKit covers here in 10.15 will get banned in the iteration after that; 10.16 will cover more legacy IOKit classes.
    Basically, over the next couple of years everyone with a kext needs to do some substantive upgrades to their code. Pragmatically, IOKit is being deprecated. (I don't think Apple is explicitly using that term at the moment, but you have to be blind not to see the freight train coming at this point.)

    The new "system extensions" will get direct access to special hardware at kernel like level of access, but it will all flow through the kernel (which only Apple is going to write).

    The Nvidia problem seems more likely to be what Nvidia wants to write being at variance with the general rules of the road that Apple is laying down. This increasingly looks like a dust-up over how to harmonize Metal and CUDA: Nvidia wanting CUDA first and Metal second, and Apple not going to accept Metal being put in second-class status. How to make the two perhaps co-equal, first-class citizens is something they haven't had to do in the past, and it still doesn't seem to be worked out. The move to System Extensions and DriverKit is an opportunity to come up with a new, better shared solution, but if Nvidia is basically doing nothing and primarily just playing 'hard ball' with Apple... they are blowing it. The window to shape what the new graphics interface looks like is probably almost closed at this point.
     
  20. Zdigital2015 macrumors 65816

    Joined:
    Jul 14, 2015
    Location:
    East Coast, United States
    #20
    I think NVIDIA views having any of its technologies take a back seat to Metal 2 and ML Kit (Core ML 3) as completely unacceptable. Imagine Apple allowing NVIDIA back in only to see them prioritize CUDA (NVIDIA writes the CUDA driver now; I don't see them letting Apple do it or sharing proprietary tech with them) while Core ML 3 lags behind. I suspect NVIDIA would also try hard to push AMD out any way they could, as they want to dominate the Deep Learning and Machine Learning fields even more than they seem to already.

    I think NVIDIA is really more a competitor and Apple won't let them back in the door after whatever happened in the past between them.
     
  21. deconstruct60 macrumors 604

    Joined:
    Mar 10, 2009
    #21
    I was mainly setting a lower bound (plus the BTO order cost, which doesn't fold in the base GPU cost Apple is making folks fold into the price of the card; Apple doesn't particularly give a "rebate" credit for the card you traded up from).


    I think the $10K figure is for the top-end ML cards, not the Quadro. But still, Apple is topping those three with four of theirs, all four adding up to the $10K range maybe. (Three RTX 8000s is about $15K, so if Apple's quad fits into $15K they'd be doing pretty well. But the "we have Metal, not CUDA" factor means it's about more than this one simple benchmark where everything is tilted in their direction. It's not the Quadro that is going to 'murder' MPX kit (and Mac Pro) sales if they price these too high.)

    Where the price ends up depends on whether these Apple MPX GPU kits are a moderate-to-high-volume selection for the Mac Pro 2019 or a relatively low one. If 60-80% of folks are buying them, Apple probably could wrangle AMD out of the multiple triple-digit markups they are slapping on the basic card component. AMD would make more because Apple is going to sell many thousands of these, and having deployed GPUs is better than corner-case stuff. Apple did that with the MP 2013 GPUs, where the D500 and D700 were priced way below what AMD was charging for FirePro models.

    If Apple is thinking that most folks are going to skip MPX (that would be a bit odd, but...) then the run rate would be low, and Apple would be in the same boat as AMD in wanting to "print money" off of the far fewer units sold. So split the triple-digit markups. What is also probably missing are some more reasonable MPX GPUs in the middle, with a similar impact (number sold largely disconnected from overall Mac Pro sales, so prices higher). If there is still a "hole waiting to be filled" in the MPX lineup at launch, they'll need to mention that if they don't want more "$999 stand" bad PR.
    --- Post Merged, Jun 7, 2019 ---
    That kind of "your tech has to lose for mine to win" mindset is probably the whole root issue, whether it is Apple, Nvidia, or most likely both (to slightly differing degrees) posturing with that attitude. Apple isn't going to 'cave' on this because Metal is directly tied into all the rest of the Apple OS instances. If Nvidia wants to pick a fight with iOS, it is going to lose.

    It isn't necessarily about a "back seat"; it is about having a co-equal peer. If Nvidia can't achieve better than a 'tie', then Apple owning the OS means they own the tie-breaker. At the end of the day, Nvidia is just a subcomponent subcontractor in the ecosystem.


    Chuckle. Like the "embrace, extend, extinguish" they did on OpenCL... yeah, it wouldn't be surprising. But that tactic isn't necessary. When Jobs got back to Apple he announced that Apple had to leave behind the notion of "Microsoft has to lose for Apple to win". [Well, he also needed a giant load of MS money to keep the lights on at the time too.] Apple has done better since.


    Trying hard to be an extremely good subcontractor involves going to the folks running the overall project, finding out what they want to do, aligning with their major strategic goals, and looking to weave your own goals in with theirs. A tactic that involves kicking AMD in the kneecaps (and perhaps Apple, indirectly) to force AMD out of the picture makes for a poor partner. Nvidia GPUs getting along substantially better with Intel GPUs than AMD's do would be an "out-compete" rather than a "push out of the way".

    Nvidia is heading for a similar kind of bubble on Deep Learning as they had on cryptocurrencies. Harvesting data from everywhere and hauling it back to mega cloud data repositories has limits. On-device DL/ML is coming up and will take over as the driving force of the overall opportunity. Nvidia is spending lots of energy digging a deeper moat to slow that down, but it is coming like a flood. It is more a matter of how much time they are buying until the levee is breached.


    It isn't really about being a competitor. Apple gets along pretty well with Microsoft. Companies with broad footprints know they have "coopetition" challenges with other bigger companies.
     
  22. Zdigital2015 macrumors 65816

    Joined:
    Jul 14, 2015
    Location:
    East Coast, United States
    #22
     
  23. JesterJJZ thread starter macrumors 68020

    Joined:
    Jul 21, 2004
    #23
    I never thought Apple would release a tower again and look what happened. While I don't think it's likely, I don't put Nvidia completely out of the picture. Never know.

    I'd love to fill the new Mac Pro with 2080ti cards.
     
  24. deconstruct60 macrumors 604

    Joined:
    Mar 10, 2009
    #24
    It is not impossible, but if Apple sells a surprising number of these and eGPU on macOS grows at a healthy clip (and some of Nvidia's high-growth targets don't pan out), then we could see a change. It wouldn't be surprising for Nvidia to at least take a bit of a 'wait and see' position here.

    Nvidia jockeying the RTX to sit in MP 2010-2012 machines on 2016-2017 macOS versions stuck in time only further enables the "wait and see" positioning. That too didn't really align with where Apple was strategically going.


    This system skews a substantive amount of the power toward the CPU socket provision. It isn't really built to absolutely max out on the largest possible number of standard cards. The thermal scope is a bit wider, but it still has that classic Mac Pro scope of "just two big cards".

    This Mac Pro is similar to the Mac Pro 2013 in that it is aimed toward future GPU cards. This time they have lots more wiggle room if the roadmaps are off. Getting 2017's "four cards" worth into two in 202x is what it is geared toward.
     
