Current Model = 980W Output
Assuming a nominal wall voltage of 115 VAC, the efficiency comes out to 71% at full load. Not wonderful.
I wonder if the new Mac Pros will use an 80 Plus power supply.
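For what it's worth, that 71% figure and the 80 Plus bar are easy to sanity-check. Here's a minimal sketch, assuming a nameplate input current of roughly 12 A at 115 VAC; the input rating is my assumption, only the 980 W output figure comes from the posts above:

```python
# Rough PSU efficiency estimate from nameplate ratings.
# ASSUMPTION: the supply's max input current is about 12 A at
# 115 VAC -- check the label; only the 980 W output rating comes
# from this thread.

RATED_OUTPUT_W = 980.0        # continuous DC output rating
INPUT_CURRENT_A = 12.0        # assumed nameplate input current
WALL_VOLTAGE_V = 115.0        # nominal US line voltage

input_w = INPUT_CURRENT_A * WALL_VOLTAGE_V   # 1380 W drawn from the wall
efficiency = RATED_OUTPUT_W / input_w        # ~0.71

# Base 80 Plus certification requires >= 80% efficiency at
# 20%, 50%, and 100% of rated load.
print(f"Input power: {input_w:.0f} W")
print(f"Efficiency:  {efficiency:.0%} at full load")
print(f"80 Plus bar: {'met' if efficiency >= 0.80 else 'not met'}")
```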
...the thing I am really hoping for is a GPU refresh - a GTX 260 or HD 4870 will suffice!
Which is quite normal for a switched-mode power supply, but I am pretty sure an expensive power supply ($299.95) should have better efficiency than that.
How did you measure the efficiency, out of curiosity?
It's not every day you max out a 980-watt power supply.
The 2006 Mac Pro only used around 200 watts at load running Cinebench in the standard configuration.
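One wrinkle with that kind of number: a wall meter only measures input power, so you can't get efficiency from it alone; you have to assume one to estimate the DC load. A rough sketch using the 71% full-load figure from above (real supplies are usually less efficient at light load, so treat this as a ballpark):

```python
# Estimate the DC load behind a wall-meter reading.
# ASSUMPTION: the ~71% full-load efficiency also holds here;
# switched-mode supplies are usually less efficient at light
# load, so this is a ballpark, not a measurement.

wall_reading_w = 200.0     # 2006 Mac Pro running Cinebench (from thread)
assumed_efficiency = 0.71  # full-load figure from earlier in the thread
psu_rating_w = 980.0       # rated continuous DC output

dc_load_w = wall_reading_w * assumed_efficiency
print(f"Estimated DC load: {dc_load_w:.0f} W")              # ~142 W
print(f"PSU utilization:   {dc_load_w / psu_rating_w:.0%}") # ~14%
```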
See, the thing there is, we don't know if Apple is going to shaft us with compatibility again.
At this point, I don't believe that they CAN, as any future cards would have EFI64 ROM, so at least the Penryn Mac Pro users would be able to use them, even if the first-rev users are left out.
Does anyone know of any reason that the cards would not work unless Apple specifically made them not work?
It is a "tock," actually. New architectures are "tocks" and shrinks are "ticks." I've done several i7 builds already, and they are "tocks" no matter how you slice it.
The next cards aren't coming until Q4 with DX11. What's out now is what we're getting, which definitely isn't a bad thing. As far as drivers go, I'll probably go with the Nvidia card - ATI's Linux drivers are a joke, and more often than not, nonexistent.
Aside from OEMs not following Nvidia's reference design, and using smaller EEPROMs for the firmware...
Um, the Mac Pro uses FB-DIMMs... More expensive than regular DDR2 it is... The new Mac Pro using Gainestown would use ECC DDR3; if it used Beckton, it would need FB-DIMMs (DDR3, I believe).

I can deal (heh) with dual Harpertowns and DDR2 because I can add a bunch of DDR2 on the cheap right now. While DDR3 has dropped dramatically, I doubt many Pro kits will have nearly the same price point.
DDR3 should be much cheaper at the end of the year, once more people adopt i7 and AMD comes out with AM3.
I hate to admit this too. All DDR3 is STUPIDLY expensive. ECC memory is STUPIDLY expensive.
Combine them and 16GB of DDR2 FB-DIMMs looks like chump change.
Yeah, after Intel and AMD literally force DDR3 into the mainstream. It sounds like DDR2 again, but why does it feel so much worse this time?
Let's just jump to DDR5, since we already have GDDR5 memory.
Maybe because the performance increase doesn't really justify being forced to buy it for the sake of having it?
The voltage drop is nice for laptops, but the performance gains just aren't there. I remember people saying they'd die for their DDR-400 before going to the stupidly expensive DDR2-533.
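The peak-bandwidth math backs that up: each 64-bit channel moves 8 bytes per transfer, so the generational jumps look modest on paper. A quick sketch using standard JEDEC transfer rates (not numbers from this thread):

```python
# Theoretical peak bandwidth of one 64-bit DIMM channel:
# (transfers per second) x (8 bytes per transfer). Real-world
# gains were smaller, since latency in nanoseconds barely moved
# between generations.

def channel_bandwidth_gbs(transfers_mt_s: float, bus_bits: int = 64) -> float:
    """Peak bandwidth in GB/s for a given transfer rate in MT/s."""
    return transfers_mt_s * 1e6 * (bus_bits / 8) / 1e9

for name, rate in [("DDR-400", 400), ("DDR2-533", 533), ("DDR3-1333", 1333)]:
    print(f"{name:<10} {channel_bandwidth_gbs(rate):5.2f} GB/s")
# DDR-400     3.20 GB/s
# DDR2-533    4.26 GB/s
# DDR3-1333  10.66 GB/s
```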
You're not wrong; you're just a little off on the context, though. The performance isn't really there due to the number of pins that can be run. FB-DIMM was supposed to help alleviate that. GPU memory has stupidly high bandwidth thanks to really short traces and how the memory controller is set up. I mean, look at ATI: for a while they had a 512-bit bus. Coupled with high-speed RAM, that allowed crazy bandwidth. But it was also expensive - the board traces, the pinout on the GPU, etc. IIRC they have since gone back to a 256-bit bus with even higher-clocked RAM.
Note: I could be really wrong about all this...
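The trade-off described above is just bandwidth = (bus width / 8) x effective transfer rate. A rough sketch with approximate retail specs; the specific cards and clocks are my assumptions, not figures from the thread:

```python
# GPU memory bandwidth = (bus width in bytes) x (effective
# transfer rate). ASSUMPTION: the cards and clocks below are
# approximate retail specs, not figures from this thread.

def gpu_bandwidth_gbs(bus_bits: int, effective_mt_s: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return (bus_bits / 8) * effective_mt_s * 1e6 / 1e9

# ATI HD 2900 XT: 512-bit bus, GDDR3 at ~1650 MT/s effective
print(f"512-bit GDDR3: {gpu_bandwidth_gbs(512, 1650):.1f} GB/s")  # ~105.6
# ATI HD 4870: 256-bit bus, GDDR5 at ~3600 MT/s effective
print(f"256-bit GDDR5: {gpu_bandwidth_gbs(256, 3600):.1f} GB/s")  # ~115.2
```

So a narrower bus with faster memory can match or beat the wide one, while keeping board routing and the GPU pinout much cheaper.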