Discussion in 'Distributed Computing' started by twoodcc, Nov 7, 2010.
this could be great for GPU folding. lower temps are always good
And that still leaves a power-hungry GPU; 300W, seriously? Good for folding, but still a bad GPU otherwise.
Let's see what GTX600 series bring.
well if it really does have lower temps, and less fan noise, then maybe they are at least going in the right direction
I've heard it's more gaming oriented than Fermi when it comes to GPGPU. I got a boost from 10,400 ppd to 14,800 ppd just from updating my drivers.
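Just for fun, a quick sketch of what that driver update works out to in relative terms (the ppd figures are the ones quoted above; the script itself is just illustrative arithmetic):

```python
# PPD figures quoted in the post above
old_ppd = 10_400
new_ppd = 14_800

# Relative throughput gain from the driver update alone
gain_pct = (new_ppd - old_ppd) / old_ppd * 100
print(f"PPD gain: {gain_pct:.1f}%")  # -> PPD gain: 42.3%
```

Roughly a 42% jump with no hardware change, which is why people keep an eye on driver releases for folding rigs.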
Obviously, to cut down die size and thereby increase efficiency while decreasing heat and power consumption, something had to go.
GPGPU area of the die seems smaller than what GF-100 has.
dang. which driver do you have now?
Agreed, maybe some beta driver to be released with performance increase across the line?
I only updated to 260.99 on my tower. Keep in mind this is under Windows as well.
ok thanks. i'm downloading it now
I have a GTX 480 now... hmm.. get another one for SLI or get a GTX 580 and maybe SLI..
I finally got some more of those newfangled 925 point units again. I'm up to 15,600 ppd now.
The GTX 580 just launched as well. You're probably still better off with an array of GTX 460 in a folding rig though.
The 580 wasn't that much of a flop; in fact they did a nice job with it. It's more like what the GTX 480 should have been (Vista vs Win7 comes to mind).
Let's see what AMD brings in with Cayman.
nice wanna fold with that thing when you're not using it? join our team!
yeah probably. at least more bang for your buck. but i would imagine that the 580 would be a monster for folding though
nice to see you give some props
They demoed it with Black Ops? Game looks like it was drawn by little kids. :|
I'm not a fan of NVIDIA's supposed "capping" or decreasing of their GPUs' DP speeds or power, but having just bought an overclocked BFG GTX 260 for my custom machine (not finished or in use yet), I can't really go around saying that NVIDIA doesn't make great products.
As for power usage...300W isn't that bad compared to the 525W that mine needs to reach its full potential. That's 60 amps on a single 12V rail...and the remaining 225W is relegated to everything else in my machine - and that's way over what the other components actually need!
The GTX series has nowhere to go but up in terms of quality and (if it's truly such a huge concern) down in terms of noise. I'm used to exhaust fans that sound like jet engines anyway. Plus, with faulty baseboard heating, it's not such a bad deal...
Props are given when the situation deems it. In this case they made good on reducing heat and noise. I still think the power consumption for a single chip is high, though. A 10W drop is nothing.
well hopefully nvidia's cards continue to get better. for the gaming and folding community.
care to join our folding team with that rig?
Theoretically I am already part of the MacRumors F@H team. Perhaps I'll get my custom machine running some -bigadv units once I actually build it.
At the moment I crunch for numerous BOINC projects but F@H is definitely an interest too, especially given my recent and somewhat unhealthy obsession with GPUs.
great! yeah if you have a thing for GPUs, folding at home is something you might enjoy!
Again I pose the question: why don't Mac Pros have SLI or CrossFire?
I realize Intel is a tad... pushy, but come on, it would be sweet to put these things in SLI
yes it would. i don't think folding at home supports SLI, but yeah it would be cool to be able to do it
So, if I get this right, this GPU is soooo power-hungry and inefficient, and therefore runs so hot, that Nvidia fitted it with fancy cooling as standard... Let's connect it to the central heating of your house.
nVidia is using vapour chamber technology. This isn't anything new. Sapphire, an ATI OEM, has been using that technology for ages now, and even AMD used it on their dual-GPU flagship, the HD5970.
Sapphire's usage of the chamber lets them market their cards as running much cooler than the competition. That is a really good selling point on a GPU, taking into account that the GPU by itself is already cool. AMD used the chamber to cool the entire dual-GPU setup. In that case it was needed, as two Cypress chips create much more heat than a single one.
Now, nVidia has had problems with its GF-100 chip. When it came out, it wasn't even fully enabled: it was missing 32 cores, it consumed enough power to run three 100W light bulbs, and it generated so much heat that nVidia came out with a massive grill and 5-heat-pipe solution to keep the chip cool. Even with that, the chip was still way too hot: 60°C idle and 96°C load temperatures. I don't even have to mention noise levels.
What nVidia did with the GTX580 is put lipstick on the pig, lots of lipstick. It is still the same chip with a few more FP16 units. Also, some of the GPGPU capability is gone. The revised chip yields a lower power consumption, by about 10W, and heat, as said before, was decreased with the new vapour chamber. That however doesn't mean the GPU is good. It just means nVidia now has something to compete with the HD5000 series, which has been out for more than a year. Also, by the looks of the HD6000 series, it seems nVidia will once again be battling a giant.
AMD is going to play the same power game (something I am not a fan of). In other words, AMD is going for the same high power consumption rates as the GF-100 chip, but with their new GPU architecture. Right now we've got the Barts chips (HD6850/HD6870), which, by reports, are smaller Cayman chips with lower TDPs and power draw. Cayman is reported to be bigger (but still smaller than Cypress) and to consume just 10W less than the GTX580. If AMD can pull off the performance numbers, they've got a winner and nVidia will be, well, toasted.
Just added mine
Shiny new toy I see.
alright! thanks for folding for our team! that card should produce a lot of points!