It's official: no NVIDIA in the new Mac Pro

Discussion in 'Mac Pro' started by erigas, Jul 25, 2013.

  1. erigas macrumors member

    Apr 6, 2011
    Atlanta GA
    "Regarding the Mac Pro, Estes says NVIDIA is “very robustly in the mac market, and disappointed that we didn’t get the design with the new Mac Pro.” It is true that NVIDIA graphics are used in other Mac product lines, from the MacBook Pro to the iMac — and these are the lines that are selling for Apple."

    From the fxguide article on the new Quadro K6000 card.
  2. KBS756 macrumors 6502a

    Jan 27, 2009
    I believe they will get in eventually ... probably even by launch, by making press statements like this, so I won't judge till it's out.

    Apple is favoring OpenCL heavily, and AMD is currently king there by a long way.
  3. ActionableMango macrumors G3


    Sep 21, 2010
    And the iTube locks down even further. :mad:

    It takes some seriously wishful thinking to believe Nvidia will be available in the Mac Pro by launch time right after reading an article where an Nvidia exec says they aren't in the Mac Pro.
  4. tamvly macrumors 6502a


    Nov 11, 2007
    Well, as we've seen in the past, things change over time.
  5. ToomeyND macrumors 6502

    Sep 14, 2011
    Perhaps I don't understand the limitations as well as you all, but couldn't they mean that NVIDIA isn't in the base configuration?
  6. handsome pete macrumors 68000

    Aug 15, 2008
    Unlikely. Based on that quote and Apple documentation so far, it seems like it'll be AMD only for now.
  7. tuxon86 macrumors 65816

    May 22, 2012
    High-end pro graphics cards like the Quadro are a specialized niche market, especially the top $$$$ models. Why should they start a whole new production line for a tiny subset of a niche market?

    They'll wait a couple of years and see if the nMP sells enough to justify the R&D and retooling investment. In the meantime they make plenty of money selling standard PCIe graphics cards to HP, Dell, BOXX, Lenovo and all the custom builders out there.
  8. handsome pete macrumors 68000

    Aug 15, 2008

    Well they could just license the tech to Apple.
  9. tuxon86 macrumors 65816

    May 22, 2012
    Considering Apple's reputation for screwing their business partners, I don't think Nvidia would go for that.
  10. Photovore macrumors regular

    Dec 28, 2011
    I would have liked to try my OpenCL code on AMD's GCN architecture, but I don't know that I'll be doing that any time soon! See:

    I just (<2 wks ago) replaced a 5870 Mac Edition with a used GTX 570 off of eBay ... and the kernel that took over 40 ms to compute on the 5870 calculates in just under 5 ms on the 570. I mean, Moses on a moped! Stupidly stunning fast.

    ... and I had spent so many hours optimizing for the 5870, including vectorizing to float4s ... to then have it quite thoroughly blown out of the water by a card at 26% of the cost....

    When the nMP comes out, I'll rent one to do some timing checks! [I can now (with the 570) do full HD (1920x1080) at 60fps, which is plenty for now, but some day more/bigger screens may be desirable.]
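    For readers unfamiliar with the float4 optimization Photovore mentions: OpenCL lets a kernel load and operate on four floats per lane at once. A rough pure-Python sketch of the idea (a hypothetical brightness-scale kernel, not Photovore's actual code):

    ```python
    # Sketch: the scalar kernel touches one element per step; the
    # "float4" version processes four at a time, mimicking OpenCL's
    # vector load/multiply/store. Hypothetical example kernel only.

    def scale_scalar(pixels, gain):
        # one multiply per loop iteration
        return [p * gain for p in pixels]

    def scale_float4(pixels, gain):
        # four multiplies per iteration, like float4 lanes
        # (assumes len(pixels) is a multiple of 4)
        out = []
        for i in range(0, len(pixels), 4):
            x0, x1, x2, x3 = pixels[i], pixels[i + 1], pixels[i + 2], pixels[i + 3]
            out.extend((x0 * gain, x1 * gain, x2 * gain, x3 * gain))
        return out

    pixels = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
    assert scale_scalar(pixels, 2.0) == scale_float4(pixels, 2.0)
    ```

    On a VLIW architecture like the 5870's this kind of hand-vectorization mattered a lot; GCN and Fermi-class cards (like the 570) schedule scalar work much more effectively, which is part of why the speedup surprised him.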
  11. OS6-OSX macrumors 6502a


    Jun 13, 2004
    Don't forget about Apple's ongoing feud with Adobe!
    Probably no coincidence that Nvidia is not in the nMP. Adobe using CUDA to "harness the power of the graphics processing unit (GPU)" was a game changer.
    Nvidia just got caught in the middle of the feud! :(
  12. nigelbb macrumors 65816

    Dec 22, 2012
    A standard PCIe graphics card won't fit in the new Mac Pro, so it would require Nvidia to make a very specialized product. Presumably there will be Thunderbolt expander boxes providing PCIe slots for the new Mac Pro, just like the current Cubix boxes.
  13. Umbongo macrumors 601


    Sep 14, 2006
    Of course it isn't a coincidence. AMD have to get OpenCL out there with better support and viability, and it is in software/solution vendors' interest to break NVIDIA's dominance. Apple are likely getting heavy discounts on the GPU solution for the new Mac Pro because of this, at prices NVIDIA aren't willing to go to; just as it is with the game console market. Adobe are also heavily backing OpenCL now for the same reasons, but they have a lot less sway over the market due to just being a software company.
  14. GermanyChris, Jul 27, 2013
    Last edited: Jul 27, 2013

    GermanyChris macrumors 601


    Jul 3, 2011

    I think AMD cut them a deal and they went with it, there's probably not any more to the story.
  15. Cubemmal macrumors 6502a

    Jun 13, 2013
    That's definitely part of it. Also AMD accomplishes more with less silicon and less heat, as long as you have optimized drivers and graphics code at least. This is possibly advantageous with their thermal core solution.

    Finally, as others have said, the biggest reason is probably CUDA, which Apple will never support.
  16. brand macrumors 601


    Oct 3, 2006
  17. DawgBone macrumors newbie

    Jan 17, 2013
    I know that you are correct in using plural verbs with "Apple", "NVIDIA", etc.

    But it is jarring to my ear! My own opinion, for what it is worth, is that a noun like "Apple" can be used in BOTH the plural and singular sense. As a plural, it means a large conglomerate of people. As a singular noun, it means a large monolithic corporation.

    But then, you are from England, and I am from the US. We just borrowed your language ...
  18. ArchAndroid macrumors regular


    Aug 26, 2012
    London, England
    The 5870 is one of the older generations of AMD cards, which were based on the VLIW architecture; the 7000 series and the current FirePro series are based on the GCN architecture, so they are much stronger in compute.
  19. Photovore macrumors regular

    Dec 28, 2011
    What is "Wendy's" hamburgers in the US is "Wendy" in the UK, just to toss in another twist....
  20. DawgBone macrumors newbie

    Jan 17, 2013
    I'm confused, P ...
  21. Photovore macrumors regular

    Dec 28, 2011
    Uhuh! I just wonder how much, and would rather rent than buy, just to see -- however, as the 570 can do full HD I think I'm set for now. (i)

    I was stunned by getting ~9x perf out of the 570 vs the 5870; I had hoped for ~3x, based upon LuxMark benches ... (and, by reported FLOPs, it should have been 10x as fast as the 330M in my laptop [same manufacturer, nVidia] ... but it's 30x, and ~9x the AMD card).

    (i) For now, while I have to run every show myself, I'm set with my MP 4,1 and the 570. However, when this becomes a product that a sound man can run, I'll have to be able to spec a "standard" system that's sufficient, and the nMP it is, I think -- I just wanna get my own numbers, as obviously I can't rely on either reported theoretical FLOPs or LuxMark benches! (I did try to select the ones most representative of the kind of computation I'm doing, but there's no comparison to benching your own code....)
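    Photovore's point that nothing beats timing your own code can be sketched with a minimal harness like this (pure Python stand-ins for two implementations of the same kernel; on real hardware you would time the actual OpenCL kernel on each card):

    ```python
    import time

    def bench(fn, *args, repeats=5):
        # Run fn several times and return the best wall-clock time in seconds;
        # taking the minimum reduces noise from other processes.
        best = float("inf")
        for _ in range(repeats):
            t0 = time.perf_counter()
            fn(*args)
            best = min(best, time.perf_counter() - t0)
        return best

    # Hypothetical stand-ins for the "same kernel, two devices" comparison.
    def scale_loop(xs, gain):
        out = []
        for x in xs:
            out.append(x * gain)
        return out

    def scale_comp(xs, gain):
        return [x * gain for x in xs]

    data = [float(i) for i in range(100_000)]
    t_loop = bench(scale_loop, data, 2.0)
    t_comp = bench(scale_comp, data, 2.0)
    print(f"loop: {t_loop:.4f}s  comprehension: {t_comp:.4f}s  "
          f"speedup: {t_loop / t_comp:.2f}x")
    ```

    The ratio you get from your own workload is the only number that matters, which is exactly why theoretical FLOPs and generic benchmarks mispredicted the 570 so badly here.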

  22. seveej macrumors 6502a


    Dec 14, 2009
    Helsinki, Finland
    After having read the linked article, my feeling was that Estes could have meant exactly that -- that nVidia would have gladly been the base offering...

    But hey, maybe we're all reading too much into a heavily cut snippet from an interview, in an article focusing on an utterly different matter...

  23. Photovore macrumors regular

    Dec 28, 2011


    Same yellow sign, same cartoon smiling face with the red hair, but instead of "Wendy's" it says "Wendy" . . . .
  24. Cisco_Kid macrumors 6502


    Apr 24, 2005
    British Columbia
    And destroyed it...
  25. throAU macrumors 603


    Feb 13, 2012
    Perth, Western Australia
    Apple don't want CUDA to survive. They want OpenCL to take off, so that their math processing is CPU/GPU agnostic and the same code runs and makes use of the GPU in Haswell etc., AMD, or Nvidia just as well. That way they can't be screwed by Nvidia, and they don't end up dependent on a technology that may not evolve as fast as they'd like.

    Nvidia can get on the bandwagon as well or become irrelevant.
