Reason why nVidia is failing hard and Apple is bailing on them

Discussion in 'Mac Pro' started by jav6454, Mar 25, 2011.

  1. jav6454 macrumors P6

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #1
    Ever wonder why Apple suddenly went with ATI/AMD with almost every Mac's GPU selection?

    Well, ladies and germs, boys and squirrels, the answer is simple. nVidia produces GPUs that either:

    A. Burn down
    B. Have high power consumption
    C. Are barely within 1% of the competition
    D. Are made from wood screws and have terrible 1.7% yields

    However, let me allow nVidia, or rather one of their GPUs, to show you what I mean.

    nVidia GPU burned/zapped.

    Not even a moderate overclock. This is nVidia's flagship, competing against AMD's HD 6990, and it couldn't even hold an overclock without blowing up.

    Don't get me wrong, nVidia has made great GPUs, but as of late, their offerings in the high-end sector are really bad [the GTX 460 and GTX 560 are exceptional products for their prices].
     
  2. toxic macrumors 68000

    Joined:
    Nov 9, 2008
    #2
    Apple switches on a regular basis. The performance of the high-end chips is irrelevant; none of them go in Macs anyway.
     
  3. nanofrog macrumors G4

    Joined:
    May 6, 2008
    #3
    I see that others seem to be blaming the drivers (267.52 to be specific), but given the physical location of the spark (just beneath the PCIe power connectors), it appears to be the voltage regulator that went, not the GPU (that would have been bigger, with a lot more smoke). That OC seemed rather mild IMO as well, and it should have been able to take it (an increase of 0.065V from stock).

    This isn't meant as a defense of the 590 or nVidia (I'm not a big fan of their products these days), but this one appears to be a different issue, not a poorly designed or manufactured GPU. Granted, their GPUs are pigs power-wise, but whoever designed/manufactured the PCB should have built a better voltage regulator (one that can handle the load). Decreasing voltages and clocks (...) via drivers is a cheaper fix, however, than recalling the initial cards that have already been produced, even if they're still sitting on pallets at the manufacturing facility (i.e. PCB rework to replace the parts that are blowing with those capable of withstanding the load).

    After all, the information I've seen is that expected pricing is $600. Not a budget card by any means, so they should have done better.
     
  4. IceMacMac, Mar 26, 2011
    Last edited: Mar 26, 2011

    IceMacMac macrumors 6502

    Joined:
    Jun 6, 2010
    #4
    I always wonder about agendas in threads such as this...corporate, political or other biases often seep into "facts".

    I think Nvidia and ATI make great products for the PC. I'm hopeful that we'll see more of the high end cards for Macs with Lion. We'll see.

    I wouldn't hesitate to buy from either vendor, but here is how I see it:
    --Nvidia rocks with CUDA...and offers more for 3D professionals and video editors (at least for now, as FCP has been left dormant while Premiere leads. Could change any day.)
    --Nvidia's drivers on Mac aren't so good (I blame Apple)
    --Nvidia options are almost non-existent on the Mac right now (I blame Apple)
    --Nvidia cards might run a little hotter and thus require an extra dB or two of fan noise.

    I would love to see the world standardize on OpenCL over CUDA...but that's a ways off.
     
  5. nanofrog macrumors G4

    Joined:
    May 6, 2008
    #5
    From my perspective, it's based mainly on technical issues, such as high power consumption, excessive heat/insufficient cooling, poor drivers, low yields (which drive up costs), and high failure rates (GeForce 8 & 9 series in particular). I also have issues with their business practices, since they still don't publish documentation, only provide closed drivers (binaries only), and to date have failed to come clean (full acknowledgement) about the failure rates and their cause in the GeForce 8 & 9 series (users as well as vendors have been burnt by this - pun intended :eek: :p).

    So there's some validity IMO; it's not just an "I don't like them, but can't give any solid evidence to support it" sort of thing.
     
  6. TheStrudel macrumors 65816

    TheStrudel

    Joined:
    Jan 5, 2008
    #6
    Well, ATI/AMD cards have pretty much always performed better in OS X for the same approximate power level.

    I think that's almost entirely due to drivers, but personally, I see no reason to bother with Nvidia if the situation is never likely to improve, and it really hasn't, yet.

    And if drivers are being improved to the point where we'll have multiple choices that don't even necessitate flashing, well, then OS X users will be running out of reasons to use Nvidia cards unless they need a Quadro.
     
  7. Hellhammer Moderator

    Hellhammer

    Staff Member

    Joined:
    Dec 10, 2008
    Location:
    Finland
    #7
    That happens if you use the old GeForce 267.52 drivers. GeForce 267.71 supposedly fixed this issue.

    Currently AMD is ahead of NVidia, but as in the past, that can change very quickly.

    Actually, it will be priced at $700 (to compete with the AMD 6990, which goes for ~$700), at least according to AnandTech.
     
  8. MacHamster68 macrumors 68040

    MacHamster68

    Joined:
    Sep 17, 2009
    #8
    nvidia, hmm... yes, cheap and readily available in huge numbers. Some work if you leave them stock or add a bigger heatsink or a faster fan, as nvidia often seems a bit optimistic about the cooling capabilities of their chosen fans/heatsinks.
    Anyway, I've been an ATI fan since the mid '80s and would never let an nvidia GPU even near any computer in my possession, no matter how fast it might be; I'd only fit one on special demand.
     
  9. jav6454 thread starter macrumors P6

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #9
    Seeing as how this thread is in the Mac Pro section and Mac Pros can be shipped with the 5870, I'd say this qualifies as high end.

    They switch according to how well the manufacturer makes their product. Currently AMD is leaps ahead of nVidia and its house-burning GPUs.
     
  10. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #10
    How so? CUDA is the exact same thing as OpenCL, except with vendor lock-in.

    Apple made the smart move here.
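    To put that in concrete terms, a trivial element-wise add looks nearly line-for-line identical in the two. This is just a minimal sketch for comparison (vec_add is a made-up name, and the launch line is only an illustration):

    [CODE]
    #include <cuda_runtime.h>

    // CUDA version of a trivial element-wise add; one thread handles one element.
    __global__ void vec_add(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
        if (i < n) c[i] = a[i] + b[i];
    }

    // Launched from the host as, e.g.:
    //   vec_add<<<(n + 255) / 256, 256>>>(d_a, d_b, d_c, n);

    /* The OpenCL C equivalent is essentially the same kernel:

       __kernel void vec_add(__global const float *a, __global const float *b,
                             __global float *c, int n) {
           int i = get_global_id(0);  // global work-item index
           if (i < n) c[i] = a[i] + b[i];
       }
    */
    [/CODE]

    The real differences are in the host-side setup and in which vendors' hardware will run it.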
     
  11. jav6454 thread starter macrumors P6

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #11
    The only advantage is that CUDA is much more available right now and more widely used. However, OpenCL is getting there. Windows already leverages it, and OS X started with SL.
     
  12. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #12
    But CUDA is used more because NVidia is still pushing CUDA hard.

    This is why CUDA bothers me. NVidia has a cross-platform solution available to them, but they're still pushing the proprietary solution.
     
  13. dukebound85 macrumors P6

    dukebound85

    Joined:
    Jul 17, 2005
    Location:
    5045 feet above sea level
    #13
    Indeed. I remember back in the 8800GT era, nvidia was wiping the floor with ATI and everyone was wondering if ATI could even compete in the future.

    It is always cyclical.
     
  14. dime21 macrumors 6502

    Joined:
    Dec 9, 2010
    #14
    A driver that was released to the public which causes your hardware to blow up is inexcusable.

    I'm guessing Apple switched for the same reason HP and everyone else did - they got burned bad by the 8 and 9 series mobile chip failures. Yes, it was nvidia's fault that the chips were bad, but the laptop vendors were the ones stuck with the enormous cost of replacing failed system boards, plus the lost customers who got frustrated with one laptop failing after another. I know five people, including myself, who purchased HP laptops with the 8600m GT. All 5 failed in under a year. Only 3 were covered by HP's warranty. The other 2, despite having the 8600m GT chip, were declined by HP as "unaffected models". Unaffected my ass. I hope nvidia loses a lot of business.
     
  15. Hellhammer Moderator

    Hellhammer

    Staff Member

    Joined:
    Dec 10, 2008
    Location:
    Finland
    #15
    But CUDA is NVidia-only, so it's not a surprise that they are pushing it. As long as some software uses CUDA instead of OpenCL, the people who use that software will prefer NVidia GPUs. They are doing the same thing with PhysX from what I have read: paying game developers to use PhysX.

    When they can't fight against AMD in raw performance, they have to use grey methods to create at least some market for their products.
     
  16. Pressure macrumors 68040

    Pressure

    Joined:
    May 30, 2006
    Location:
    Denmark
    #16
    I think the reason is much more down to earth.

    AMD probably outbid nVIDIA on unit price, hence winning this round.
     
  17. jav6454 thread starter macrumors P6

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #18
    Exactly. True, a customer with an nVidia GPU gets better performance, but at what price? Progress on GPU performance is hindered.

    Ever see the "Meant to be played" slogan pop up in games? That's nVidia paying developers to optimize games for nVidia GPUs. Of course it's going to look as if nVidia has the upper hand. In reality they don't, and benchmarking tools show this.

    Also, comedic relief: [image]
     
  18. CaptainChunk macrumors 68020

    CaptainChunk

    Joined:
    Apr 16, 2008
    Location:
    Phoenix, AZ
    #19
    From a business standpoint, NVIDIA's reasoning for this is pretty sound. NVIDIA can market CUDA and make it exclusive to their cards. They ink business partnerships with companies like Adobe (i.e. Premiere Pro) to lock their customers into buying high-end NVIDIA cards. It isn't about CUDA being better than OpenCL or vice-versa. It's about artificially creating a competitive edge.

    NVIDIA did a similar thing with PhysX, in a way. They gobbled the tech up from Ageia (PhysX started as a standalone, add-in card that ran alongside a machine's primary graphics card) and made it exclusive to their GPUs. NVIDIA even went as far as intentionally crippling software (CPU hosted) PhysX in order to promote the sale of a PhysX enabled NVIDIA card.

    This all seems pretty deceptive, but in the end, it's business.
     
  20. Joshuarocks macrumors 6502

    Joshuarocks

    Joined:
    Mar 12, 2011
    Location:
    Somewhere in Cyberspace
    #21
    Nvidia GeForce 7800 GT

    Hello,

    In addition to my 6-core Mac Pro, I acquired a mint-condition Power Mac G5 Quad w/ the better LCS in it. My Quad came with the Nvidia GeForce 7800 GT. So far I notice this is a very LOUD video card.

    Can someone confirm that this video card is a lousy one, due to the noise it produces? I figured I'd ask this question here, as many 1,1 Mac Pro owners used to have the 7300 GT in their Mac Pros.

    Is there a firmware update to silence this card? The fan is huge on this thing.

    Thanks,

    Josh
     
  21. RebootD macrumors 6502a

    RebootD

    Joined:
    Jan 27, 2009
    Location:
    NW Indiana
    #22
    I'm using the CUDA GTX 285 because I use Adobe Premiere CS5/After Effects. If and when Adobe decides to allow AMD cards to have the same GPU acceleration I'd be more than happy to switch back.

    Pretty sure Apple just doesn't want to support NVIDIA's proprietary CUDA tech (plus it makes the current FCP seem slower than dirt). Besides, only Apple gets to have proprietary tech on their machines. ;)
     
  22. TheStrudel macrumors 65816

    TheStrudel

    Joined:
    Jan 5, 2008
    #23
    It's not a case of allowing GPU acceleration, it's a case of explicitly supporting OpenCL...

    I would have hoped by now that OpenCL would have seen more traction, but CUDA did get to market first, as it were. Regrettably.

    But these things take time.
     
  23. jav6454 thread starter macrumors P6

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #24
    Another failed round for nVidia.
     
  24. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #25
    CUDA will only give them traction for so long. Heck, I'm a CUDA developer and even I gave up on NVidia after the 10.6.3 driver fiasco. Used to have a dual NVidia GPU system. Now one of those GPUs is replaced with a 5870.

    I'd also take NVidia more seriously if they supported Tesla on the Mac.
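    For anyone curious what that looks like from the developer side: a CUDA app typically probes for a usable device at startup, roughly like this minimal sketch (standard CUDA runtime calls; the fallback comment is my assumption about what an app does next):

    [CODE]
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void) {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);

        // On a machine whose NVidia card was swapped for a 5870, this is
        // where a CUDA app finds out it has nothing to run on.
        if (err != cudaSuccess || count == 0) {
            printf("No CUDA-capable GPU found: %s\n", cudaGetErrorString(err));
            return 1;  // assumed fallback: a CPU or OpenCL code path
        }

        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            printf("Device %d: %s (compute %d.%d)\n",
                   i, prop.name, prop.major, prop.minor);
        }
        return 0;
    }
    [/CODE]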
     
