NVIDIA promises 'fastest DX11 GPU on the planet' very, very soon (video)

Discussion in 'Distributed Computing' started by twoodcc, Nov 7, 2010.

  1. macrumors P6

    twoodcc

    Joined:
    Feb 3, 2005
    Location:
    Right side of wrong
    #1
    from engadget

    this could be great for GPU folding. lower temps are always good
     
  2. macrumors G5

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #2
    And that still leaves a power hungry GPU; 300W, seriously? Good for folding, but still one bad GPU.

    Let's see what the GTX 600 series brings.
     
  3. thread starter macrumors P6

    twoodcc

    Joined:
    Feb 3, 2005
    Location:
    Right side of wrong
    #3
    well if it really does have lower temps, and less fan noise, then maybe they are at least going in the right direction
     
  4. macrumors Penryn

    Eidorian

    Joined:
    Mar 23, 2005
    Location:
    Indianapolis
    #4
    I've heard it's more gaming-oriented than Fermi for GPGPU. I got a boost from 10,400 ppd to 14,800 ppd just from updating my drivers.
     
  5. macrumors G5

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #5
    Obviously, to cut down die size and thereby increase efficiency while decreasing heat and power consumption, something had to go.

    The GPGPU area of the die seems smaller than on the GF-100.
     
  6. thread starter macrumors P6

    twoodcc

    Joined:
    Feb 3, 2005
    Location:
    Right side of wrong
    #6
    dang. which driver do you have now?
     
  7. macrumors G5

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #7
    Agreed; maybe a beta driver will be released with a performance increase across the line?


    Please share.....
     
  8. macrumors Penryn

    Eidorian

    Joined:
    Mar 23, 2005
    Location:
    Indianapolis
    #8
    I only updated to 260.99 on my tower. Keep in mind this is under Windows as well.
     
  9. thread starter macrumors P6

    twoodcc

    Joined:
    Feb 3, 2005
    Location:
    Right side of wrong
    #9
    ok thanks. i'm downloading it now
     
  10. macrumors 65816

    voyagerd

    Joined:
    Jun 30, 2002
    Location:
    Rancho Cordova, CA
    #10
    I have a GTX 480 now... hmm.. get another one for SLI or get a GTX 580 and maybe SLI..
     
  11. macrumors Penryn

    Eidorian

    Joined:
    Mar 23, 2005
    Location:
    Indianapolis
    #11
    I finally got some more of those newfangled 925 point units again. I'm up to 15,600 ppd now.

    The GTX 580 just launched as well. You're probably still better off with an array of GTX 460s in a folding rig, though.
     
  12. macrumors G5

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #12
    The 580 wasn't that much of a flop; in fact, they did a nice job with it. It's more like what the GTX 480 should have been (Vista vs Win7 comes to mind).

    Let's see what AMD brings with Cayman.
     
  13. thread starter macrumors P6

    twoodcc

    Joined:
    Feb 3, 2005
    Location:
    Right side of wrong
    #13
    nice :) wanna fold with that thing when you're not using it? join our team! :cool:

    yeah probably. at least bang for your buck. but i would imagine the 580 would be a monster for folding

    nice to see you give some props ;)
     
  14. macrumors regular

    Meldar

    Joined:
    May 3, 2008
    Location:
    pocket of liberalism in farm country
    #14
    They demoed it with Black Ops? Game looks like it was drawn by little kids. :|

    I'm not a fan of NVIDIA's supposed "capping" or decreasing of their GPUs' DP speeds or power, but having just bought an overclocked BFG GTX 260 for my custom machine (not finished or in use yet), I can't really go around saying that NVIDIA doesn't make great products.

    As for power usage... 300W isn't that bad compared to the 525W that mine needs to reach its full potential: 60 amps on a single 12V rail. The remaining 225W is relegated to everything else in my machine, and that's way over what the other components actually need!

    The GTX series has nowhere to go but up in terms of quality and (if it's truly such a huge concern) down in terms of noise. I'm used to exhaust fans that sound like jet engines anyway. Plus, with faulty baseboard heating, it's not such a bad deal...
     
  15. macrumors G5

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #15
    Props are given when the situation deems it. In this case, they made good on reducing heat and noise. I still think the power consumption for a single chip is high; a 10W drop is nothing.
     
  16. thread starter macrumors P6

    twoodcc

    Joined:
    Feb 3, 2005
    Location:
    Right side of wrong
    #16
    well hopefully nvidia's cards continue to get better. for the gaming and folding community.

    care to join our folding team with that rig? :cool:

    i concur.
     
  17. macrumors regular

    Meldar

    Joined:
    May 3, 2008
    Location:
    pocket of liberalism in farm country
    #17
    Theoretically I am already part of the MacRumors F@H team. Perhaps I'll get my custom machine running some -bigadv units once I actually build it.

    At the moment I crunch for numerous BOINC projects but F@H is definitely an interest too, especially given my recent and somewhat unhealthy obsession with GPUs.
     
  18. thread starter macrumors P6

    twoodcc

    Joined:
    Feb 3, 2005
    Location:
    Right side of wrong
    #18
    great! yeah if you have a thing for GPUs, folding at home is something you might enjoy! :cool:
     
  19. macrumors 6502a

    Joined:
    Nov 6, 2009
    #19
    Again I pose the question: why don't Mac Pros have SLI or CrossFire?

    I realize Intel is a tad... pushy, but come on, it would be sweet to put these things in SLI :)
     
  20. thread starter macrumors P6

    twoodcc

    Joined:
    Feb 3, 2005
    Location:
    Right side of wrong
    #20
    yes it would. i don't think folding at home supports SLI, but yeah it would be cool to be able to do it
     
  21. macrumors 68020

    Dreadnought

    Joined:
    Jul 22, 2002
    Location:
    Almere, The Netherlands
    #21
    So, if I get this right, this GPU is soooo power-hungry and inefficient, and therefore runs so hot, that Nvidia fitted it with water cooling as standard... Let's connect it to the central heating of your house :D
     
  22. macrumors G5

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #22
    nVidia is using vapor chamber technology. This isn't anything new; Sapphire, an ATI OEM, has been using it for ages, and even AMD used it on their dual-GPU flagship, the HD 5970.

    Sapphire's use of the chamber stems from the fact that Sapphire can market their cards as much cooler than the competition. That is a really good selling point on a GPU, taking into account that the chip by itself already runs cool. AMD used the chamber to cool the entire dual-GPU setup; in that case it was needed, as two Cypress chips create much more heat than one.

    Now, nVidia has had problems with its GF-100 chip. When it came out, it was not even fully enabled: it was missing 32 cores, it consumed enough power to turn on three 100W light bulbs, and it generated so much heat that nVidia came out with a massive grill and five-heatpipe solution to keep the chip cool. Even with that, the chip was still way too hot: 60°C idle and 96°C under load. I don't even have to mention the noise levels.

    What nVidia did with the GTX 580 is put lipstick on the pig, lots of lipstick. It is still the same chip with a few more FP16 units, and some of the GPGPU capability is gone. A revised chip yields a lower power consumption, by about 10W. Heat, as said before, was decreased with the new vapor chamber. That doesn't mean the GPU is good, though; it just means nVidia finally has something to compete against the HD 5000 series, which has been out for more than a year. And by the looks of the HD 6000 series, it seems nVidia will once again be battling a giant.

    AMD is going to play the same power game (something I am not a fan of). In other words, AMD is going for the same high power consumption as the GF-100 chip, but with their new GPU architecture. Right now we have the Barts chips (HD 6850/HD 6870), which by all reports are smaller Cayman chips with lower TDPs and power draw. Cayman is reported to be bigger (but still smaller than Cypress) and to consume just 10W less than the GTX 580. If AMD can pull off the performance numbers, they've got a winner and nVidia will be, well, toasted.
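    For folding, these power comparisons really come down to points per day per watt of board power. A minimal sketch of that arithmetic, using the wattages discussed in this thread (GF-100 around 300W, GTX 580 about 10W less, Cayman reported 10W under the GTX 580); the ppd figures are hypothetical placeholders for illustration, not real folding benchmarks:

```python
# Back-of-the-envelope folding efficiency comparison.
# Board-power numbers follow the thread; the ppd values are
# hypothetical placeholders, NOT measured benchmarks.

def ppd_per_watt(ppd: float, watts: float) -> float:
    """Folding points per day earned per watt of board power."""
    return ppd / watts

cards = {
    # name: (hypothetical ppd, board power in watts)
    "GF-100":  (14_000, 300),  # ~300W, per the thread
    "GTX 580": (16_000, 290),  # 300W minus the ~10W drop
    "Cayman":  (16_000, 280),  # reported 10W under the GTX 580
}

# Rank the cards from most to least efficient.
for name, (ppd, watts) in sorted(cards.items(),
                                 key=lambda kv: ppd_per_watt(*kv[1]),
                                 reverse=True):
    print(f"{name:8s} {ppd_per_watt(ppd, watts):5.1f} ppd/W")
```

    Under these made-up ppd numbers, a 10W saving barely moves the efficiency needle; real gains would have to come from higher ppd, not the small power drop.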
     
  23. macrumors regular

    Joined:
    Apr 2, 2008
    #23
    Just added mine :)
     


  24. macrumors G5

    jav6454

    Joined:
    Nov 14, 2007
    Location:
    1 Geostationary Tower Plaza
    #24
    >GTX580

    Shiny new toy, I see.
     
  25. thread starter macrumors P6

    twoodcc

    Joined:
    Feb 3, 2005
    Location:
    Right side of wrong
    #25
    alright! thanks for folding for our team! that card should produce a lot of points!
     
