
So where do we stand on this at the moment? There was a thread on here a few days ago that has for some reason disappeared ...
 
It did not disappear; it just went to page 2 or 3. There are two threads about it: one benchmark thread and that other thread, both by MacVidCards.
 
It should work with the NVIDIA web drivers, as it is basically a trimmed-down Titan X.
Though check with MacVidCards on when they will have flashed versions available so you can enjoy all dem speed ;)
 
Ahhh, there was a thread on here that has definitely disappeared. I still had the tab open, and it now refreshes to an unknown link; plus, I had also posted in it, and that post no longer shows in my post history. That thread had MacVidCards outlining some of his testing, etc.
 
Hmm, maybe it was showing illegal images of nude chips?
 
Got a copy of part of that disappeared thread:

GTX 980 Ti in a cMP 5,1, X5690 @ 3.46 GHz

CUDA-Z:
Single-precision Float: 6418.66 Gflop/s
Double-precision Float: 207.253 Gflop/s
32-bit integer: 2147.72 Giop/s
24-bit integer: 1519.07 Giop/s

Unigine Valley Benchmark (Preset: Extreme HD):
FPS: 50.9
Score: 2130
Min FPS: 22.5
Max FPS: 86.6
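
If anyone wants to sanity-check a card the same way from the command line, here's a minimal sketch using the CUDA runtime API. It assumes the CUDA toolkit is installed; the file name and build line are just examples:

```c
/* devinfo.cu -- minimal sketch: list CUDA devices and their key specs.
 * Build (assuming the CUDA toolkit is installed): nvcc -o devinfo devinfo.cu */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA devices found -- check the driver install.\n");
        return 1;
    }
    for (int i = 0; i < count; i++) {
        struct cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s\n", i, prop.name);
        printf("  Compute capability: %d.%d\n", prop.major, prop.minor);
        printf("  Multiprocessors:    %d\n", prop.multiProcessorCount);
        printf("  Global memory:      %.0f MB\n",
               prop.totalGlobalMem / (1024.0 * 1024.0));
        printf("  Clock rate:         %.0f MHz\n", prop.clockRate / 1000.0);
    }
    return 0;
}
```

As a rough cross-check of the single-precision figure: peak throughput is about 2 FLOPs per core per clock, so 2 x 2816 cores x a ~1.14 GHz boost clock works out to roughly 6.4 TFLOP/s, which lines up with the CUDA-Z number above.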
 
There is also a thread on netkas about the 980 Ti that has been locked with a link simply pointing across to the thread on here that disappeared. No one seems to be speaking about the 980 Ti on Mac. Is it a bit like Fight Club? :)
 
You can see the June 11 snapshot of the disappeared thread in Google Cache; I don't see any controversial or secretive content in it.

This thread we're in now is talking about the 980 Ti, and nobody seems to be deleting it. There is also this thread:
https://forums.macrumors.com/threads/nvidia-geforce-gtx-980-ti.1888207/
 
MacVidCards appears to be selling them flashed and ready to run on the internal Mac Pro PSU.
 
They sure are ... ordered one a few days ago ...
Would be good to hear how you go with this. Another Ti user on another thread is having issues with overheating and card quirks. Interested to see if this is a thing across the board.
 
I got my GTX 980 Ti from MVC earlier this week. It took a bit to get going. The card was not properly modified to run on 6-pin power, so it wouldn't boot properly when installed with the supplied cables. I eventually figured out that was the problem and switched back to my previous set of cables, which included a 6-to-8-pin version. After that, it loaded up fine.

Boot screen support worked from the get-go. It can run on the EFI drivers if needed, which is helpful. It runs fine in OS X and seems to stay pretty cool and quiet. I've had no issues with heat, power, or fan speed. It is a little quieter and cooler than my previous GTX 680.

I'm having trouble getting Daz3D to consistently recognize it as a valid device for Iray rendering, which cripples rendering times there. I think it might not be recognizing the 6GB of memory when starting a render. Fortunately, that's a minor hobby, so I'm not too concerned about getting it working in a hurry. The card is recognized by CUDA-Z just fine, so it's not unavailable as a CUDA device; it's just something with that particular app. I don't use anything else with CUDA, so I can't speak more to that.
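
In case it helps narrow down the Iray issue, here's a quick sketch (same caveats as the one earlier in the thread; nothing DAZ-specific, just the CUDA runtime) to confirm whether CUDA itself sees the full 6GB:

```c
/* meminfo.cu -- sketch: report total/free memory as seen by the CUDA runtime.
 * Build: nvcc -o meminfo meminfo.cu */
#include <stdio.h>
#include <cuda_runtime.h>

int main(void)
{
    size_t free_b = 0, total_b = 0;

    cudaSetDevice(0);   /* first (and here, only) CUDA device */
    cudaFree(0);        /* no-op call that forces context creation */
    if (cudaMemGetInfo(&free_b, &total_b) != cudaSuccess) {
        printf("cudaMemGetInfo failed -- driver problem?\n");
        return 1;
    }
    printf("Total: %.0f MB, free: %.0f MB\n",
           total_b / (1024.0 * 1024.0), free_b / (1024.0 * 1024.0));
    /* ~6144 MB total here would point the finger at the app, not the card */
    return 0;
}
```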

On the Boot Camp side for gaming, it was a noticeable step up from my GTX 680. The 980 Ti can easily handle anything at 1440p. However, my six-core 3.33GHz CPU (W3680) is starting to show its age. My average frame rates are a solid 60+, but dips are not uncommon as the GTX 980 Ti is starved for instructions. Some games are more pronounced than others. The only other quirk is that I cannot control the LED system from GeForce Experience, so that green light is going to stay on.

The constant rebooting during the initial trouble-shooting caused my 840 EVO and Velocity Solo x2 to stop talking to each other. They both work, just not with each other. I had to move my boot drive to the optical bay, and that bandwidth bottleneck is rather noticeable in certain usages. I'll try a clean reformat of my 840 sometime next week and see if that gets it going again.
 
Doubt any of that is causing the dips. The hex-core 3.33GHz is more than adequate, and PCIe 1.0 x16 shouldn't cost much more than 5 fps compared to running PCIe 2.0 or 3.0. Drivers, or just particularly intensive areas of games, are most likely causing the dips you're experiencing.
 
The 980 Ti is running in PCIe 2.0 mode in both OSes, though I suppose I still need to verify that for Windows. The W3680 is still quite beefy as a workstation CPU, but its single-threaded performance lags noticeably behind the current-gen CPUs typically paired with a GPU this powerful. If a workload exceeds that single-thread capacity, such as a major spike in draw calls, I'll see a frame-rate drop paired with a GPU-usage drop. Something like an overclocked 4790K would handle those workloads more readily, but building a gaming PC around that kind of CPU has its own trade-offs, which is why I went with just a GPU upgrade.
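
To put rough numbers on that single-thread ceiling, here's a back-of-envelope sketch; every figure in it is hypothetical, chosen only to show the mechanism:

```c
/* Back-of-envelope model of a single-thread draw-call bottleneck.
 * All numbers are hypothetical illustrations, not measurements. */
#include <stdio.h>

int main(void)
{
    double us_per_call = 40.0;  /* assumed CPU cost per draw call, one thread */
    int    calls_calm  = 300;   /* quiet scene */
    int    calls_spike = 800;   /* busy scene: many more objects on screen */

    double cpu_ms_calm  = us_per_call * calls_calm  / 1000.0;  /* 12 ms */
    double cpu_ms_spike = us_per_call * calls_spike / 1000.0;  /* 32 ms */

    /* The submitting thread caps frame rate no matter how fast the GPU is */
    printf("Calm scene: %.0f ms of CPU per frame -> at most %.0f fps\n",
           cpu_ms_calm, 1000.0 / cpu_ms_calm);    /* ~83 fps */
    printf("Busy scene: %.0f ms of CPU per frame -> at most %.0f fps\n",
           cpu_ms_spike, 1000.0 / cpu_ms_spike);  /* ~31 fps */
    return 0;
}
```

That's the pattern described above: the GPU sits partly idle while the one thread feeding it falls behind, so frame rate and GPU usage drop together.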
 
But pairing two 3.46GHz X5690 chips with a 980 Ti for 3D work won't suffer the same issues, will it? I'm not a gamer.
 
I didn't think the card was physically modified to use a 6-pin cable in the 8-pin socket. I thought he just supplied a 6-pin cable in place of the bundled 8-pin one, and you plug it into the 8-pin socket and it works fine? Did you email him about the issue?
 
Games rarely take advantage of more than four cores and rely heavily on one in particular, at least until the next generation of graphics APIs is in place. Having a single thread control the GPU workload creates a potential bottleneck if single-thread performance can't keep up when the workload spikes. 3D work does not have that kind of issue: rendering a predetermined image is a much more predictable and parallel job, and your CPUs should not bottleneck the performance of a 980 Ti (or two) in any way.
 
I've found a way to check whether the CPU is holding the GPU back: run GPU-Z, open the Sensors tab, and watch GPU Load. Then run F1 2014 and use its built-in benchmark.

For me, GPU usage is always lower than 100% because I'm CPU-bound (even in 4K) on a 2.8 GHz Yorkfield.

This might show single-core performance: http://forum.netkas.org/index.php/topic,11176.0.html
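
If you'd rather log GPU load over a session than watch the GPU-Z window, the same counter is exposed by NVML, the library behind nvidia-smi. A rough sketch, assuming the NVML header and library shipped with the driver/CUDA toolkit are available:

```c
/* gpuload.c -- sketch: poll GPU utilization once per second via NVML
 * (the same counter GPU-Z and nvidia-smi display).
 * Build example on Linux: gcc gpuload.c -lnvidia-ml
 * On Windows (where GPU-Z runs), link against nvml and swap sleep() for Sleep(). */
#include <stdio.h>
#include <unistd.h>
#include <nvml.h>

int main(void)
{
    nvmlDevice_t dev;
    nvmlUtilization_t util;

    if (nvmlInit() != NVML_SUCCESS)
        return 1;
    nvmlDeviceGetHandleByIndex(0, &dev);   /* first GPU */

    for (int i = 0; i < 60; i++) {         /* one minute of samples */
        if (nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS)
            printf("GPU %3u%%  mem %3u%%\n", util.gpu, util.memory);
        sleep(1);
    }
    nvmlShutdown();
    return 0;
}
```

If GPU load sits well under 100% while frame rates dip, that's the CPU-bound signature described above.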
 
The supplied cables were 6-pin, as advertised, but the card will not start properly unless it believes it has 8-pin power. If the last two pins are not receiving power, the GPU will not start up fully. You can fake it by putting jumpers in an 8-pin cable, which is what I have, or you can short the +2 pins on the GPU so that a 6-pin cable will work.

To clarify, I have not emailed him about it, since I solved it myself. Though I suppose I should let him know there is a quality-control issue somewhere.
 