nMP FirePro Dxxx clocks and TDP

Discussion in 'Mac Pro' started by 666sheep, Dec 24, 2013.

  1. 666sheep, Dec 24, 2013
    Last edited: Dec 24, 2013

    666sheep macrumors 68040

    666sheep

    Joined:
    Dec 7, 2009
    Location:
    Poland
    #1
    D300:
    – GPU 800 MHz @ 1125 mV (Boost 850 MHz @ 1175 mV), memory 1270 MHz (5080 MHz effective), TDP 116 W

    D500:
    – GPU 650 MHz @ 1025 mV (Boost 725 MHz @ 1075 mV), memory 1270 MHz (5080 MHz effective), TDP 108 W

    D700:
    – GPU 650 MHz @ 918 mV (Boost 850 MHz @ 1100 mV), memory 1370 MHz (5480 MHz effective), TDP 108 W

    Taken from the PC BIOSes extracted from the nMP EFI update.
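    As a quick sanity check of the memory numbers: GDDR5's effective data rate is conventionally quoted as four times the memory clock, and multiplying by the bus width gives the bandwidth. A rough sketch follows; the 256-bit/384-bit bus widths are taken from Pressure's post further down, not from these VBIOSes:

    # GDDR5: effective data rate = memory clock * 4.
    # Bus widths (256-bit D300, 384-bit D500/D700) come from a later post in this
    # thread, not from the extracted VBIOSes.
    cards = {
        # name: (memory clock in MHz, bus width in bits)
        "D300": (1270, 256),
        "D500": (1270, 384),
        "D700": (1370, 384),
    }

    for name, (mem_mhz, bus_bits) in cards.items():
        effective_mhz = mem_mhz * 4                           # e.g. 1270 -> 5080
        bandwidth_gbs = effective_mhz * bus_bits / 8 / 1000   # bits -> bytes, MB/s -> GB/s
        print(f"{name}: {effective_mhz} MHz effective, ~{bandwidth_gbs:.0f} GB/s")

    That works out to roughly 163 GB/s for the D300, 244 GB/s for the D500 and 263 GB/s for the D700.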
     
  2. MartinAppleGuy macrumors 68020

    MartinAppleGuy

    Joined:
    Sep 27, 2013
    #2
    Thanks. Just out of curiosity, what clock does my GT 750M 1GB GDDR5 run at? I know this is a Mac Pro forum; I'd just like to compare...
     
  3. Quash macrumors regular

    Joined:
    Sep 27, 2007
    #3
    Wow, those are some low clocks on the D500.
    With these clocks a D300 would basically be equal to a 7970M/8970M.

    It would surprise me if the D500 were faster at games than a D300.
    I suspect it will be significantly slower; that's gonna piss off some people.
    The D500 will be superior for (double-precision) GPU compute.

    It was to be expected, though: they needed to keep each GPU to roughly 100-120 W to not fry the power supply.
     
  4. iBug2 macrumors 68040

    Joined:
    Jun 12, 2005
    #4
    So the D300 has the highest TDP? That's interesting. If 116 W is OK for one GPU, why are the D500 and D700 capped at 108 W?
     
  5. Quash macrumors regular

    Joined:
    Sep 27, 2007
    #5
    Maybe because it's less likely to be ordered with high-end CPUs :confused:
    Even though they all have the same TDP, the 6-core will use more power than a 4-core at max load.
     
  6. Dranix macrumors 6502a

    Dranix

    Joined:
    Feb 26, 2011
    Location:
    Gelnhausen, Germany
    #6
    Yahoo! Doesn't look so bad for my dual D300 then ;)
     
  7. slughead macrumors 68040

    slughead

    Joined:
    Apr 28, 2004
    #7
    Thanks for this! I want to find VirtualRain's table and see how close he was with his estimates.

    The W9000 is 975 MHz; the 7970 GE is 1,000 MHz.
     
  8. Serban Suspended

    Joined:
    Jan 8, 2013
  9. Pressure macrumors 68040

    Pressure

    Joined:
    May 30, 2006
    Location:
    Denmark
    #9
    The D300 is based on Pitcairn (1280 stream processors), while the D500 (1536 stream processors) and D700 (2048 stream processors) are both based on Tahiti.

    The D300 has 2 GB of RAM on a 256-bit memory bus, while the D500 has 3 GB of RAM on a 384-bit memory bus.

    Moreover, the double-precision performance of the D300 is 1/16 of its 2 TFLOPS single-precision rate, while the D500's is 1/4 of its 2.2 TFLOPS.

    In other words, the D300 has 125 GFLOPS of double-precision performance and the D500 has 550 GFLOPS.
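    As a rough cross-check of those numbers, here is the standard GCN peak-FLOPS arithmetic (2 FMA ops per stream processor per clock). This is only a sketch using the stream-processor counts and DP ratios above plus the base/boost clocks from the first post; which clock the headline figures assume is exactly what the thread is debating, so both are shown:

    # Peak-FLOPS arithmetic for these GCN parts: 2 ops (FMA) per stream processor per clock.
    # Stream-processor counts and DP ratios come from the post above; base/boost clocks
    # come from 666sheep's first post.
    cards = {
        # name: (stream processors, base clock GHz, boost clock GHz, DP ratio)
        "D300": (1280, 0.800, 0.850, 1 / 16),
        "D500": (1536, 0.650, 0.725, 1 / 4),
        "D700": (2048, 0.650, 0.850, 1 / 4),
    }

    for name, (sps, base, boost, dp_ratio) in cards.items():
        sp_base = sps * 2 * base / 1000    # single precision, TFLOPS, at base clock
        sp_boost = sps * 2 * boost / 1000  # single precision, TFLOPS, at boost clock
        print(f"{name}: {sp_base:.2f}-{sp_boost:.2f} TFLOPS SP, "
              f"{sp_base * dp_ratio * 1000:.0f}-{sp_boost * dp_ratio * 1000:.0f} GFLOPS DP")

    That prints roughly 2.0-2.2 TFLOPS SP for the D300, 2.0-2.2 TFLOPS SP for the D500 and 2.7-3.5 TFLOPS SP for the D700, which lands close to the quoted 2 / 2.2 TFLOPS and 125 / 550 GFLOPS figures.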
     
  10. wildmac macrumors 65816

    Joined:
    Jun 13, 2003
    #10
    So I would assume that this still means the D500 is a better performer, depending on what you are doing with the card. (In my case, it's PS, LR, and WoW.)
     
  11. slughead macrumors 68040

    slughead

    Joined:
    Apr 28, 2004
    #11
    VirtualRain's estimates were based on the FLOPS. I wonder how the nMP will stack up with the power and clock constraints:

    [Image: VirtualRain's estimate table]
     
  12. Quash macrumors regular

    Joined:
    Sep 27, 2007
    #12
    Well, if the original poster's clock speeds are correct, these guesstimates are quite a bit off.

    Yes, the D500 is gonna murder a D300 in double-precision GPU compute.

    But for single precision, or stuff like games, it's gonna be very, very close if the clock speeds from 666sheep are correct. So for a lot of people the D500 might not be a good value pick.

    We'll know within a week, I guess.
     
  13. adr1974 macrumors regular

    Joined:
    Nov 15, 2007
    #13
    What types of apps benefit more from double precision vs single precision?
     
  14. BEIGE macrumors member

    Joined:
    Oct 10, 2008
    #14
    Scientific simulation that really requires accuracy, so there are no rounding errors over long runs or many iterations. Everything else is pretty much single precision; even 3D rendering on the GPU is single precision, because it's accurate enough to give you realistic-looking results, so the significant speed hit of double precision wouldn't gain you much.
     
  15. Dranix macrumors 6502a

    Dranix

    Joined:
    Feb 26, 2011
    Location:
    Gelnhausen, Germany
    #15
    So the D500 is essentially wasted money for most?
     
  16. ZnU macrumors regular

    Joined:
    May 24, 2006
    #16
    50% more VRAM, 50% more memory bandwidth, 20% more cores. That should be a significant difference for some workloads even leaving aside the better double precision performance.
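    Those ratios check out against the figures quoted earlier in the thread; a quick sketch (D500 vs. D300, both with a 1270 MHz memory clock, so the bandwidth gain is just the wider bus):

    # D500 vs. D300, using the specs quoted earlier in this thread.
    d300 = {"vram_gb": 2, "bus_bits": 256, "stream_processors": 1280}
    d500 = {"vram_gb": 3, "bus_bits": 384, "stream_processors": 1536}

    for key in d300:
        gain = (d500[key] - d300[key]) / d300[key] * 100
        print(f"{key}: +{gain:.0f}%")   # vram_gb: +50%, bus_bits: +50%, stream_processors: +20%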
     
  17. goMac macrumors 603

    Joined:
    Apr 15, 2004
    #17
    The estimates are the boost clock, so they're not terribly far off.
     
  18. Quash macrumors regular

    Joined:
    Sep 27, 2007
    #18
    The W8000 has 40% more cores, 15% more memory bandwidth, and its cores are clocked only 6% slower than a W7000's. But the W8000 is still significantly slower than a W7000 at games (around 25%). Read the Tom's Hardware review.

    Now, a W8000 is not the same as a D500, but still. (The D500 has more memory bandwidth in exchange for fewer cores.)

    All I'm saying is: Tahiti cores are not the same as Pitcairn cores. You can't just compare them by multiplying the number of cores by their frequency.
    I would wait until your application is benchmarked to make sure you don't waste $400.

    The D700s will always be faster, btw ;)

    OK, fair enough, but it remains to be seen whether you can boost for significant amounts of time, because otherwise the frequency difference is quite big.
     
  19. deconstruct60 macrumors 604

    Joined:
    Mar 10, 2009
    #19
    The D500/D700, under sustained long-term computation load, are going to hold up better with the lower TDP (especially if there is also a hefty CPU component to the workload). The D300 won't. Given that the D300 lacks ECC, the heavy-lifting GPGPU loads are far more likely to be thrown at the D500/D700; the D300 is far more likely to see burst-oriented workloads (or just plain lightweight ones).

    The D300 is also, at least in the first year or so of usage, likely to run more solo workloads (with the other GPU mostly dark). That also is going to give a single card more headroom.
     
  20. FredT2 macrumors 6502

    Joined:
    Mar 18, 2009
    #20
    Can you explain how this works?
     
  21. deconstruct60 macrumors 604

    Joined:
    Mar 10, 2009
    #21
    Self-segregation. Those folks who know they have software that can put both GPUs to work are likely going to buy the D500 or D700. With both at work, that is just that much more work they can get done.

    Those folks primarily in the camp whose software can't use more than one GPU, and/or who aren't GPU limited, are far more likely to choose the D300. If you're not GPU limited, you're not going to buy more than you need, even more so when the GPUs come in pairs.

    I think Apple knows that some fraction of the folks buying Mac Pros aren't going to leverage the GPUs in pairs. Given that the basic Pitcairn powering the D300 isn't exactly a world-beater anyway, it makes less sense to crank its clock down as far.

    If Apple moves a large number of dual-GPU Mac Pros and dual-GPU MBPs (and, to a lesser extent, dual-GPU iMacs), there will be more software over time that leverages it. Apple has been pointing to GCD, OpenCL, etc. for years. Eventually it sinks in to software developers that they are actually serious about it.
     
  22. FredT2 macrumors 6502

    Joined:
    Mar 18, 2009
    #22
    OK, I get it. So another question: what exactly do the FLOPS numbers of these cards mean? One (me) would think, based on the published numbers (2 vs. 2.2), that the performance difference between the D300 and D500 would not be very great.
     
  23. Dranix macrumors 6502a

    Dranix

    Joined:
    Feb 26, 2011
    Location:
    Gelnhausen, Germany
    #23
    It isn't, in float performance; the D300 is just 200 GFLOPS slower. Gaming-wise it should be faster than the D500.
     
  24. mrsavage1 macrumors regular

    Joined:
    Feb 1, 2010
    #24
    Where does it say that the D500 and D700 are using ECC?
     
  25. 666sheep thread starter macrumors 68040

    666sheep

    Joined:
    Dec 7, 2009
    Location:
    Poland
    #25
    They don't use ECC memory, just regular GDDR5. There are no ECC tables in these VBIOSes.
     
