
motionwind

macrumors newbie
Original poster
Mar 19, 2013
Hello "MAC" world,

I own an 8-Core MacPro 3.1 which was sold to me with a GeForce 8800GT Video card.

Because I do a lot of high-detail motion picture rendering I need a video card that is CUDA 1.3 (or higher) compatible. I bought a used Gigabyte Nvidia GTX 570 card, model GV-N570SO-13I, REV 1.0. It is a "super overclocked" version of the GTX 570 (I didn't realize that when I bought it) and has one 6-pin and one 8-pin power connector (which doesn't seem to be compatible with the setup of the MacPro?).

Here are some more specs (my "overclocked" version is in the 2nd column):

http://www.gigabyte.com/products/comparison/list.aspx?ck=3&pids=3685,3876

Are the power requirements too high (or different from other GTX 570s that are known to work)? Did I make a wrong choice, or might it be possible to get this thing connected and working? I need to know soon, because I'm not sure how much time I have (if it's even possible) to return the used equipment for my money back.


Thanks very much for any advice!

Jens
 

motionwind

macrumors newbie
Original poster
Mar 19, 2013
I just found some information on the power consumption of the card in question. It suggests that this over-clocked design actually uses "LESS" power than the "reference" GTX 570!?

http://www.hardwarecanucks.com/foru...eforce-gtx-570-super-overclock-review-13.html

If this is true, then it should be possible to use it with my MacPro (since there are many reports of the GTX 570 running without problems). Still, I wonder how this could be true, because the benchmarks show a peak consumption of 330 Watts, whereas a single PCI slot (together with the two 6-pin connectors) seems to provide a total of only 225W (see the following post):

https://forums.macrumors.com/threads/1440150/

Apple itself reports a TOTAL MAXIMUM PCI SLOT USAGE of 300W (see the following link for the Mac Pro Early 2008 spec). Is this the power consumption of the PCI slots alone, or does it include the 2x75W provided by the two 6-pin connectors?

http://support.apple.com/kb/sp11

All in all, I'm still confused about whether I can put a card in my MacPro 3.1 that might draw a CONTINUOUS 330W for many hours on end, and I hope someone with a little more technical background than myself might be able to provide some insight (even though it's just opinion and I will NOT hold them to it).
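To make the budget question above concrete, here is a small sketch of the arithmetic. The per-rail figures are the usual PCIe conventions (75W from the slot, 75W per 6-pin cable), not anything Apple publishes for this machine specifically, so treat this as an illustration, not a spec.

```python
# Rough power-budget check for a Mac Pro 3,1 -- a sketch based on
# figures quoted in this thread; the per-rail limits are assumptions
# from the general PCIe convention, not Apple documentation.
SLOT_WATTS = 75          # PCIe x16 slot (spec maximum)
SIX_PIN_WATTS = 75       # per auxiliary 6-pin PCIe connector
EIGHT_PIN_WATTS = 150    # an 8-pin connector, for comparison

def available_power(six_pin_cables, eight_pin_cables=0):
    """Total watts a card could draw from the slot plus its aux cables."""
    return (SLOT_WATTS
            + six_pin_cables * SIX_PIN_WATTS
            + eight_pin_cables * EIGHT_PIN_WATTS)

# The Mac Pro 3,1 logic board offers two 6-pin headers:
budget = available_power(six_pin_cables=2)   # 75 + 75 + 75 = 225 W
peak_draw = 330                              # benchmarked peak quoted above

print(budget)              # 225
print(peak_draw > budget)  # True -> the benchmarked peak exceeds the budget
```

Which is exactly the apparent contradiction: 330W peak against a nominal 225W available, unless Apple's 300W figure means something more generous.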

Thanks :)
 

motionwind

macrumors newbie
Original poster
Mar 19, 2013
Another thing that confuses me ... the APPLE SPEC (see my link above) also claims ...

"Multiple graphics card configurations including two, three, or four ATI Radeon HD 2600 XT cards"

... how could it be possible to support up to "FOUR" ATI Radeon HD 2600 XT cards, which use 126W EACH AT IDLE!? That would mean that even at idle the four cards would draw over 500W (4 x 126W = 504W), which is about 200W more than the Apple PCI card spec (of 300W total) allows?

Here's the reference for the power consumption ...

http://www.techpowerup.com/reviews/ATI/HD_2600_XT/17.html

Where is my math wrong? I know it's common to have a multiple-card configuration, so should a single card that uses "no more" than the combination of two normal cards really pose an issue or big risk (as many people seem to imply in other posts, without any proof of power-consumption-based failures)?
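The arithmetic behind the question can be written out directly. Note that whether the quoted 126W is card-only draw or whole-system draw is exactly the open question here; the sketch just takes the figure at face value.

```python
# Four HD 2600 XT cards at the quoted 126 W idle figure versus
# Apple's stated 300 W combined PCI budget. The 126 W number is
# taken from the techpowerup review linked above, at face value.
CARDS = 4
IDLE_WATTS_EACH = 126
APPLE_PCI_BUDGET = 300

total_idle = CARDS * IDLE_WATTS_EACH
print(total_idle)                     # 504
print(total_idle - APPLE_PCI_BUDGET)  # 204 -> ~200 W over the stated budget
```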

regards, Jens
 

motionwind

macrumors newbie
Original poster
Mar 19, 2013
Success :) !

I first received a 2nd 6-pin cable in the mail (I had ordered it before the 6-to-8 cable, back when I didn't know I had mistakenly ordered an overclocked video card). Since I had to wait another 3 weeks for the 6-to-8-pin adapter, I got a little impatient and researched some more ... found this ... http://www.tomshardware.com/reviews/...e,3061-12.html ... and realized that the extra 2 pins on the 8-pin card-side connector are both GROUNDs ... meaning no extra power is "required". They only signal the card that it may activate OVERCLOCKING and DRAW EXTRA POWER.

The article also explained that there is normally only ONE safe way to plug the 6-pin PCIe cable into the card: "unless you force it", it will only fit one way (with the snap-on holder facing upward, you put the 6-pin into the 8-pin socket toward the RIGHT).

So I figured I'm safe to use ONLY 6-pin PCIe cables for the time being (of course, this configuration bypasses the over-clocking, so it doesn't take full power advantage of the card). When I get the 6-to-8 adapter I will probably give that a go too :), but I can happily confirm that I've been working with the card for half a day with no problems so far (ok, let's see after some weeks). Before any HEAVY, LONGER rendering activities I'm still going to do some more tests and monitor temperatures. -- regards, J.
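The connector difference described above can be sketched as pin counts. These follow the general PCIe auxiliary-power convention (an assumption on my part, not taken from the article): the 8-pin plug adds two grounds that double as sense pins, and no additional +12V supply lines.

```python
# Sketch of the 6-pin vs 8-pin PCIe power connector difference:
# the two extra pins on the 8-pin plug are grounds (used as sense
# pins), not extra +12V lines. Pin counts follow the usual PCIe
# convention -- an illustration, not wiring guidance.
SIX_PIN = {"+12V": 3, "GND": 3}
EIGHT_PIN = {"+12V": 3, "GND": 5}  # two grounds double as sense pins

extra_pins = sum(EIGHT_PIN.values()) - sum(SIX_PIN.values())
extra_supply_lines = EIGHT_PIN["+12V"] - SIX_PIN["+12V"]

print(extra_pins)          # 2 -> the two extra pins on the 8-pin side
print(extra_supply_lines)  # 0 -> no extra +12V lines, grounds only
```

This is consistent with why plugging only a 6-pin cable into the 8-pin socket can work: the supply lines are the same, and the card simply sees the "extra power available" signal as absent.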
 

motionwind

macrumors newbie
Original poster
Mar 19, 2013
... a few weeks later

I just wanted to report that I am STILL using the Gigabyte NVIDIA GTX 570 SOC video card, hooked up with 2x 6-pin cables (leaving the 2 extra pins of the 8-pin video card connector unconnected) ... and in this configuration I can CONFIRM that there is NO overheating (it runs rather cool according to the temperature monitor), and even with its 3 fans it is substantially less noisy than my previous NVIDIA 8800GT! I think it's quieter because the 3 fans don't have to work as hard to cool the thing down compared to the single fan on the 8800GT, and of course because the OVERCLOCKING functionality is not active. I have been doing several overnight renders in Blender with CYCLES without any problems and without the temperature rising much. I have TONS of other graphics programs and have tested them all, with 0 problems. All of this works correctly on OS X Mountain Lion v10.8.3. I am very happy.

PLEASE NOTE: Render times with Blender/Cycles are about 2 to 3 times faster than with CPU rendering on average (ok, that's not as substantial as I had hoped, but it proves that upgrading the card can be useful, and it certainly saves me valuable time).
 