GTX 980 Ti on cMP

Discussion in 'Mac Pro' started by maximage77, Jun 15, 2015.

  1. maximage77 macrumors newbie

    Joined:
    Nov 4, 2014
    #1
    So where are we standing on this at the moment? There was a thread on here a few days ago that has for some reason disappeared ...
     
  2. Gwendolini macrumors 6502

    Gwendolini

    Joined:
    Feb 5, 2015
    Location:
    random
    #2
    It did not disappear, it just went to page 2 or 3, as there are two threads about it, one benchmark thread and that other thread, both by MacVidCards.
     
  3. Lauwie macrumors regular

    Joined:
    Jun 17, 2011
    #3
It should work with the Nvidia web drivers, as it is basically a trimmed-down Titan X.
    Though check with MacVidCards as to when they have flashed versions available so you can enjoy all that speed ;)
     
  4. maximage77 thread starter macrumors newbie

    Joined:
    Nov 4, 2014
    #4
Ahhh, there was a thread on here that has definitely disappeared. I still had the tab open, and it now refreshes to an unknown link. I had also posted in it, and that post no longer shows in my post history. That thread had MacVidCards outlining some of his testing etc.
     
  5. Gwendolini macrumors 6502

    Gwendolini

    Joined:
    Feb 5, 2015
    Location:
    random
    #5
    Hmm, maybe it was showing illegal images of nude chips?
     
  6. Synchro3 macrumors 65816

    Synchro3

    Joined:
    Jan 12, 2014
    #6
    Got a copy of a part of this thread:

GTX 980 Ti in a cMP 5,1 X5690 @ 3.46 GHz

    CUDA-Z
    Single-precision Float: 6418.66 Gflop/s
    Double-precision Float: 207.253 Gflop/s
    32-bit integer: 2147.72 Giop/s
    24-bit integer: 1519.07 Giop/s

    Unigine Valley Benchmark (Preset: Extreme HD):
    FPS: 50.9
    Score: 2130
    Min FPS: 22.5
    Max FPS: 86.6
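For reference, those CUDA-Z figures line up with the card's paper specs. A back-of-the-envelope check (reference clocks assumed; real cards boost higher, which is why the measured numbers above come out a bit bigger):

```python
# Rough theoretical throughput for a GTX 980 Ti at reference specs.
# Actual boost clocks vary per card, so measured numbers will differ.
CUDA_CORES = 2816        # GM200: 22 SMs x 128 cores
BOOST_CLOCK_GHZ = 1.075  # NVIDIA reference boost clock
FP64_RATIO = 1 / 32      # Maxwell's FP64:FP32 rate

# Each core can retire one fused multiply-add (2 flops) per cycle.
fp32_gflops = CUDA_CORES * 2 * BOOST_CLOCK_GHZ
fp64_gflops = fp32_gflops * FP64_RATIO

print(f"FP32: {fp32_gflops:.0f} Gflop/s")  # ~6054 at reference boost
print(f"FP64: {fp64_gflops:.0f} Gflop/s")  # ~189 at reference boost
```

The measured 6418 Gflop/s just means the card was boosting well past the 1075 MHz reference clock, which is normal behavior.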
     
  7. maximage77 thread starter macrumors newbie

    Joined:
    Nov 4, 2014
    #7
    There is also a thread on netkas about the 980 Ti that has been locked with a link simply pointing across to the thread on here that disappeared. No one seems to be speaking about the 980 Ti on Mac. Is it a bit like Fight Club? :)
     
  8. ActionableMango macrumors 604

    ActionableMango

    Joined:
    Sep 21, 2010
    #8
    You can see the June 11 snapshot of it in Google Cache--I don't see any controversial or secretive content in the thread.

This thread we're in now is talking about the 980 Ti and nobody seems to be deleting it. There is also this thread:
    http://forums.macrumors.com/threads/nvidia-geforce-gtx-980-ti.1888207/
     
  9. Upgrader macrumors regular

    Upgrader

    Joined:
    Nov 23, 2014
    #9
MacVidCards appears to be selling them flashed and ready to run on the internal Mac Pro PSU.
     
  10. maximage77 thread starter macrumors newbie

    Joined:
    Nov 4, 2014
    #10
    They sure are ... ordered one a few days ago ...
     
  11. Upgrader macrumors regular

    Upgrader

    Joined:
    Nov 23, 2014
    #11
    Would be good to hear how you go with this. Another Ti user on another thread is having issues with overheating and card quirks. Interested to see if this is a thing across the board.
     
  12. Shamgar macrumors regular

    Joined:
    Jun 28, 2015
    #12
    I got my GTX 980 Ti from MVC earlier this week. It took a bit to get going. The card was not properly modified to run on 6 pin power, so it wouldn't boot properly when installed with the supplied cables. I eventually figured out that was the problem, and switched back to my previous set of cables which included a 6-to-8 pin version. After that, it loaded up fine.

    Boot screen support worked from the get go. It can run on the EFI drivers if needed, which is helpful. It runs fine in OS X and seems to stay pretty cool and quiet. I've had no issues with heat, power, or fan speed. It is a little bit quieter and cooler than my previous GTX 680.

    I'm having trouble getting Daz3d to consistently recognize it as a valid device for Iray rendering, which cripples rendering times there. I think it might not be recognizing the 6GB of memory when starting a render. Fortunately, that's a minor hobby, so I'm not too concerned about getting that working in a hurry. It's recognized by Cuda-Z just fine, so it's not a matter of it being unavailable as a CUDA device, just something with that particular app. I don't use anything else with CUDA, so I can't speak more to that.

On the Boot Camp side for gaming, it was a noticeable step up from my GTX 680. The 980 Ti can easily handle anything at 1440p. However, my six-core 3.33GHz CPU (W3680) is starting to show its age. My average frame rates are a solid 60+, but dips are not uncommon as the GTX 980 Ti is starved for instructions. The dips are more pronounced in some games than others. The only other quirk is that I cannot control the LED system from GeForce Experience. So that green light is going to stay on.

The constant rebooting during the initial troubleshooting caused my 840 EVO and Velocity Solo x2 to stop talking to each other. They both work, just not with each other. I had to move my boot drive to the optical bay, and that bandwidth bottleneck is rather noticeable in certain usages. I'll try a clean reformat of my 840 sometime next week and see if that gets it going again.
     
  13. Redneck1089 macrumors 65816

    Redneck1089

    Joined:
    Jan 18, 2004
    #13
Doubt any of that is causing the dips -- the hex-core 3.33GHz is more than adequate, and PCIe 1.0 x16 shouldn't cost much more than 5 fps compared to running PCIe 2.0 or 3.0. Drivers, or just particularly intensive areas of games, are most likely causing the dips you're experiencing.
     
  14. Shamgar macrumors regular

    Joined:
    Jun 28, 2015
    #14
    The 980 Ti is running in PCIe 2.0 mode in both OSs, though I suppose I still need to verify that for Windows. The W3680 is still quite beefy as a workstation CPU, but it is lagging noticeably in the single threaded performance compared to the kinds of current gen CPUs typically paired with a GPU this powerful. If a workload exceeds that single-thread capacity, such as a major spike in draw calls, I'll see a frame rate drop paired with a GPU usage drop. Something like an overclocked 4790k would be able to handle those workloads more readily. But building a gaming PC with that kind of CPU has its own trade-offs, which is why I went with just a GPU upgrade.
     
  15. Upgrader macrumors regular

    Upgrader

    Joined:
    Nov 23, 2014
    #15
But pairing two X5690 3.46GHz chips with a 980 Ti for 3D work won't suffer the same issues, will it? I'm not a gamer.
     
  16. Upgrader macrumors regular

    Upgrader

    Joined:
    Nov 23, 2014
    #16
I didn't think the card was physically modified to use a 6-pin in the 8-pin socket. I thought he just swapped the bundled 8-pin cable for a 6-pin one, and you plug it into the 8-pin socket and it works fine?? Did you email him about the issue?
     
  17. Shamgar macrumors regular

    Joined:
    Jun 28, 2015
    #17
Games rarely take advantage of more than four cores, and rely heavily on one in particular, at least until the next generation of graphics APIs comes into play. Having a single thread control the GPU workload creates a potential bottleneck in the system if the single-thread performance can't keep up when the workload spikes. 3D work does not have that kind of issue. Rendering a predetermined image is a much more predictable and parallel matter, and your CPUs should not bottleneck the performance of a 980 Ti (or two) in any way.
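To make that bottleneck concrete, here's a toy frame-pacing model (the millisecond figures are made up purely for illustration, not from any real game) showing why a CPU spike shows up as an fps dip *and* a GPU-usage drop at the same time:

```python
# Toy frame-pacing model: each frame waits for whichever is slower,
# the CPU thread feeding draw calls or the GPU executing them.
# All millisecond figures are fabricated for illustration.

def frame_stats(cpu_ms, gpu_ms=10.0):
    """Return (fps, gpu_utilization) for one frame."""
    frame_ms = max(cpu_ms, gpu_ms)  # the slower side sets the frame time
    return 1000.0 / frame_ms, gpu_ms / frame_ms

# Steady scene: CPU keeps up, GPU is the limit -> full utilization.
print(frame_stats(cpu_ms=8.0))   # (100.0, 1.0) -- 100 fps, 100% GPU load

# Draw-call spike: CPU falls behind -> fps dips AND GPU load drops together.
print(frame_stats(cpu_ms=25.0))  # (40.0, 0.4) -- GPU idle 60% of the frame
```

That paired drop (low fps with the GPU sitting under 100%) is the usual signature of a CPU-bound dip, as opposed to a GPU-bound one where utilization stays pegged.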
     
  18. netkas, Jul 11, 2015
    Last edited: Jul 11, 2015

    netkas macrumors 65816

    Joined:
    Oct 2, 2007
    #18

I've found a way to check that:
    Run GPU-Z, open the Sensors tab, and keep an eye on GPU load.
    Then run F1 2014 using its built-in benchmark.

    For me, GPU usage is always lower than 100% because I'm CPU-bound (even at 4K) on a 2.8 GHz Yorkfield.

    This might show single-core perf - http://forum.netkas.org/index.php/topic,11176.0.html
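GPU-Z is a Windows GUI tool; if you'd rather log utilization from a script, `nvidia-smi` (which ships with the NVIDIA drivers) can dump it as CSV via `--query-gpu=utilization.gpu --format=csv`. A minimal parser for that output (the sample text below is fabricated, but matches the documented CSV shape):

```python
# Parse the output of:
#   nvidia-smi --query-gpu=utilization.gpu --format=csv
# Sample text is fabricated for illustration; real output has the same shape.

def parse_gpu_util(csv_text):
    """Return GPU utilization percentages, one per GPU, from nvidia-smi CSV."""
    lines = [ln.strip() for ln in csv_text.strip().splitlines()]
    # First line is the header ("utilization.gpu [%]"); the rest are values.
    return [int(ln.rstrip(" %")) for ln in lines[1:]]

sample = """utilization.gpu [%]
87 %
"""
print(parse_gpu_util(sample))  # [87]
```

Polling that in a loop during the benchmark gives the same CPU-bound signal as watching GPU-Z's Sensors tab: utilization stuck well under 100% while frame rates dip.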
     
  19. Shamgar macrumors regular

    Joined:
    Jun 28, 2015
    #19
    The supplied cables were 6-pin, as advertised. But the card will not start properly without believing it has 8-pin power. If the last two pins are not receiving power, the GPU will not start up fully. You can fake it by putting jumpers in an 8-pin cable, which is what I have, or you can short the +2 pins on the GPU so that a 6-pin cable will work.

    To clarify, I have not emailed him about it as I solved it for myself. Though I suppose I should inform him that there is a quality-control issue somewhere.
     
  20. Upgrader, Jul 11, 2015
    Last edited: Jul 11, 2015

    Upgrader macrumors regular

    Upgrader

    Joined:
    Nov 23, 2014
    #20
  21. Shamgar macrumors regular

    Joined:
    Jun 28, 2015
    #21
