
iamMacPerson

macrumors 68040
Original poster
Jun 12, 2011
I'm building a new 5,1 rig for Final Cut and purchased a GeForce GTX 970 for it. I know that Nvidia cards have never been stellar for OpenCL, but I thought it was worth a shot. However, everyone has told me that AMD is pretty much the best and only option for Final Cut, so I purchased a 7970 and popped it in.

Comparing both cards with LuxMark Simple Ball, it seems the 970 far excels in OpenCL over the 7970 with 11k vs 8k (my 6,1 with D300s scored almost 14k on the same test). Is the 7970 still superior to any Nvidia cards out there and the benchmarks are just messed up, or is the 970 actually better than the AMD offerings? What card would you recommend for Final Cut?

EDIT: I'm also considering an R9 380, RX 460 or 480 card.
 
Nvidia Maxwell cards have almost double the OpenCL performance of their Kepler predecessors.

So your Maxwell GTX 970 isn't bad at all for OpenCL.

Kepler cards have bad OpenCL performance.
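
If anyone wants to sanity-check what OpenCL actually reports for each card before trusting a LuxMark number, here's a minimal sketch (assuming Python with the pyopencl package installed; nothing here is Mac- or card-specific):

[CODE]
# Minimal sketch: list every OpenCL device the system exposes, with the
# basic specs a benchmark like LuxMark would see. Assumes pyopencl is installed.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(f"{platform.name} :: {device.name}")
        print(f"  compute units : {device.max_compute_units}")
        print(f"  max clock     : {device.max_clock_frequency} MHz")
        print(f"  global memory : {device.global_mem_size // (1024 ** 2)} MB")
        print(f"  OpenCL version: {device.version}")
[/CODE]

If a card shows up here with an unexpectedly low clock or compute-unit count, that points at the card or driver rather than the benchmark.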
 
Ahh ok, that makes sense. I'm also thinking about getting a newer AMD card. It seems all the issues I was reading about with the RX 460/480 apply only to Hacks, because of the iGPU on the newer CPUs. Now I'm wondering if the upgrade to an RX 460, RX 480 or R9 380X might be worth it. Would anyone recommend one over the other for Final Cut? I know the 460 and 380 work OOB and the 480 needs a kext hack. I'm fine with either, I just want to get the best for my rig.
 
I just did the same thing. I bought a 5,1 and tried to make it the most powerful 5,1 that I can. I have two R9 390X cards in it, because AMD is supposed to work better with FCPX and Compressor, which I use.

There still seems to be a difference between OpenCL benchmark scores and eventual performance in the software, depending on how well the program is written to support the card. The 980 Ti seems to have better benchmarks; I'm interested to know how that translates into real performance.
 
Comparing both cards with LuxMark Simple Ball, it seems the 970 far excels in OpenCL over the 7970 with 11k vs 8k (my 6,1 with D300s scored almost 14k on the same test).

A 7970 scoring only 8k in LuxMark Ball doesn't sound quite right. I don't have that card, but even my 7950 can score >11k, and from memory, a 7970 should score 13k or more.
 
Yep, there's something wrong with your 7970; it should be a lot faster than that. Is it a MacPro3,1? Those have a nasty power management bug when paired with HD 7xxx AMD cards.
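
If you're not sure what the machine identifies as, a quick sketch like this (plain Python on top of macOS's sysctl; the warning string is just illustrative) prints the model identifier:

[CODE]
# Sketch: print the Mac model identifier, e.g. "MacPro3,1" or "MacPro5,1".
# Note that a 4,1 flashed to 5,1 firmware will report MacPro5,1 here.
import subprocess

model = subprocess.check_output(["sysctl", "-n", "hw.model"], text=True).strip()

print(model)
if model == "MacPro3,1":
    print("3,1 detected -- watch for the HD 7xxx power management issue mentioned above")
[/CODE]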
 
a 7970 should score 13k or more.
Agreed. This is my R9 280X result, which I just ran while still using the MP during a render:
[Attached screenshot: LuxMark result, 2017-01-22]
 
Yep, there's something wrong with your 7970; it should be a lot faster than that. Is it a MacPro3,1? Those have a nasty power management bug when paired with HD 7xxx AMD cards.

No, it's a 4,1 -> 5,1 conversion. I thought there was a problem with the card too, and I ended up sending it back. It was an eBay purchase, so that could have something to do with it (we all know how eBay can be :rolleyes:). Does anyone even sell the 7970, 7950 or R9 280X new anymore?
 
The 7970 was introduced at the end of 2011, so it's rather unlikely to find a new one today. ;)
 
There is a possibility that the R9 280 and R9 280X are still available new in some smaller local stores...

Am I correct in assuming the 280 is the 7950 and the 280x is the 7970? Are the differences great enough to warrant the extra money?

Also, I noticed I can get the R9 380 or RX 460/480 from Amazon new for a reasonable amount. Any reason to get the 280/79xx over the newer GPUs?
 
These are newer revisions of the Tahiti chips. The 280 (Tahiti Pro2) and the second revision of the 280X (XTL) have improved power efficiency, usually better (quieter) cooling, and higher clocks than their predecessors.
 
The 7950 Mac Edition is still available new, even though I don't think it's worth that much, and I wouldn't recommend anyone buy it now. But it is available.

https://eshop.macsales.com/item/Sapphire/100352MAC2/

Also, I can still find quite a lot of 280X cards available new on Amazon, even though I personally consider them overpriced now. But again, they're available.

My understanding is that the 280 and 7950 share the same device ID, and the 280X and 7970 also share the same device ID. So you can treat them as the same card in the real world. However, there are some minor differences that are virtually transparent to the user.
 
R9 280 = HD 7950 Boost
R9 280X = HD 7970 GHz Edition

Virtually the same as normal 7950/7970 cards, just slightly refined.
 
R9 280 = HD 7950 Boost
R9 280X = HD 7970 GHz Edition

Virtually the same as normal 7950/7970 cards, just slightly refined.

I think that's not entirely true; my R9 280 has exactly the same speed as the HD 7950, both come with stock 800/1250 MHz clocks.
 
Do any of them get recognized as the D700? I've read there is tighter integration of the D700 with FCPX, so a more powerful card might even perform slower.
 
Do any of them get recognized as the D700? I've read there is tighter integration of the D700 with FCPX, so a more powerful card might even perform slower.

It's cosmetic; you can make the system identify the GPU as whatever you want (including a D700), but it won't make any difference. Everything is based on the device ID.

And both the 7970 and the R9 280X have exactly the same device ID as the D700.
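
For anyone who wants to see what a card actually reports (rather than the cosmetic name), something along these lines works; it's a sketch built on the standard system_profiler SPDisplaysDataType output, and the string matching is just illustrative:

[CODE]
# Sketch: pull the Chipset Model and Device ID lines out of system_profiler.
# A 7970, an R9 280X and a D700 should all show the same Device ID here,
# which is why macOS treats them as the same card.
import subprocess

out = subprocess.check_output(["system_profiler", "SPDisplaysDataType"], text=True)

for line in out.splitlines():
    stripped = line.strip()
    if stripped.startswith(("Chipset Model:", "Device ID:")):
        print(stripped)
[/CODE]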
 
The reference R9 280 has slightly higher clocks as well as >900 MHz boost states (in contrast to the reference 7950, with its 800 MHz clock and no boost), so I'm not sure what's wrong with your 280.

My comment was more about the underlying silicon, though, which is (slightly) different, as 666sheep mentioned.
 
The reference R9 280 has slightly higher clocks as well as >900 MHz boost states (in contrast to the reference 7950, with its 800 MHz clock and no boost), so I'm not sure what's wrong with your 280.

My comment was more about the underlying silicon, though, which is (slightly) different, as 666sheep mentioned.

Understood, that's mainly on the silicon side.

And I just checked the AMD website; you're right, the "standard" R9 280 should clock at 933 MHz.

I guess my HIS IceQ R9 280 wanted to stay with the dual 6-pin design (non-standard), which is why it also stays at 800 MHz. The standard 280 should clock at 933 MHz but comes with a 6+8-pin config.

In fact, I was trying so hard to find a dual 6-pin R9 280 (luckily I got it), I didn't even know it shouldn't exist until now. :D
 
I know we're going slightly off topic, just one last remark:
My Sapphire R9 280 also has dual 6-pin and came with an 850 MHz base clock (940 MHz boost). I'm usually running it at 1.1 GHz with stock voltage, so I guess I didn't draw the worst chip in the silicon lottery. :)
 