
rsths12

macrumors newbie
Original poster
Jul 19, 2018
I do a lot of work with the Creative Cloud suite (Premiere, After Effects, Lightroom, etc.) and Cinema 4D, and I recently purchased a Razer Core X and a Sapphire RX Vega 64 to boost the performance of my 2018 MacBook Pro. While all the programs definitely run faster, I'm not seeing as much of a performance gain as I expected from the Vega 64. For example, the card only scores about 130K in OpenCL compute on Geekbench.

Would a workstation class card such as the new AMD Radeon Pro WX 8200 give me better performance?

The number and types of graphics cards have been a bit confusing, and my initial research seemed to suggest that consumer and workstation cards are the same hardware, but that workstation cards have more reliable components and longer warranties. Naturally, I went with the higher-spec'd but cheaper consumer card. The more I read, however, the more it seems that AMD claims workstation cards are optimized for programs such as Premiere.

Just looking for a bit of guidance from people who know more than I do, as I can't seem to find any benchmarks or articles online that give solid answers on the differences between workstation and consumer cards.

Thanks!
 
This is pretty much what you can expect from eGPUs. The OpenCL score for an internal Vega 64 is around 170K; lose around 25% of that to Thunderbolt 3 limitations, and you have your 130K result.

The better the card, the more performance you will lose in an eGPU. The Vega 56 and 64 seem to be the sweet spot for eGPU use. A GTX 1080 or better would be overkill, and you would see a 30%+ loss.

I would in fact recommend the Vega 56, as it is usually cheaper and the performance loss and cost will be lower. The OpenCL score difference between the 56 and 64 is minimal according to Geekbench results.
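For what it's worth, the numbers above line up: a quick back-of-the-envelope sketch (using the ~170K internal score and ~25% Thunderbolt 3 loss quoted in this post, which are this poster's estimates, not official figures) reproduces roughly the 130K eGPU result.

```python
# Rough arithmetic behind the eGPU estimate quoted above.
# Assumed figures from this post: ~170K internal Vega 64 OpenCL score,
# ~25% performance loss over Thunderbolt 3.

def egpu_estimate(internal_score: int, tb3_loss: float = 0.25) -> int:
    """Estimate the OpenCL score a card delivers when run over TB3."""
    return round(internal_score * (1.0 - tb3_loss))

print(egpu_estimate(170_000))  # 127500, close to the 130K reported
```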
 
Would a workstation class card such as the new AMD Radeon Pro WX 8200 give me better performance?
I have a Vega FE, which under Windows supports AMD's pro drivers, just like the workstation cards. In the same enclosure, a Razer Core X, Geekbench OpenCL shows 155K; my RX 580 gets 130K. What cable do you have, the same one that came with the box? I bought the 6 ft active cable with 100W power delivery; it was almost $60. Maybe that's the bottleneck? And the Vega FE is pretty much a Vega 64, just with unlocked drivers on Windows; I doubt there is anything special for it on the Mac.
 
As somebody who doesn't follow Vega, why didn't you pick an Nvidia card? Don't the Adobe programs use CUDA if available, and thus run far faster on Nvidia cards?
 
As somebody who doesn't follow Vega, why didn't you pick an Nvidia card? Don't the Adobe programs use CUDA if available, and thus run far faster on Nvidia cards?
Because current driver support for Nvidia cards on the Mac is abysmal, and getting their cards to work is a pain in the neck. I'd love to have an MBP+eGPU build with CUDA (for machine learning) under macOS, but I guess that's not going to happen. You can probably get it done under Boot Camp, but at that point I think one would be better served by a cheaper Windows desktop, with the added benefit that the GPU wouldn't be TB3-bottlenecked.
 
Because current driver support for Nvidia cards on the Mac is abysmal, and getting their cards to work is a pain in the neck. I'd love to have an MBP+eGPU build with CUDA (for machine learning) under macOS, but I guess that's not going to happen. You can probably get it done under Boot Camp, but at that point I think one would be better served by a cheaper Windows machine, with the added benefit that the GPU wouldn't be TB3-bottlenecked.

That's a very good reason! Didn't even think of that, but now that you mention it, there hasn't been an official Mac with an Nvidia card in a long time, has there?

Oops.
 
That's a very good reason! Didn't even think of that, but now that you mention it, there hasn't been an official Mac with an Nvidia card in a long time, has there?

Oops.
The GTX 780 Ti is the last one supported natively. For anything newer you need to wait for Nvidia to release web drivers, which are always delayed after each minor OS update, and you need to hack your way into making the old drivers work in the meantime. And app support is quite bad; I had instances on a Mac Pro where the 5770 was faster than a 1080.
 
As somebody who doesn't follow Vega why didn't you pick an Nvidia card? Don't the Adobe programs use CUDA if available - thus being far more accelerated on the Nvidia cards?

Adobe CC programs, like Premiere Pro, have had OpenCL and Metal acceleration as well for some time now. The differences between all the Mercury renderer options are pretty minimal these days, with Metal getting better on macOS, as Adobe seems to be putting all its effort into that acceleration path on macOS now.
 
This is pretty much what you can expect from eGPUs. The OpenCL score for an internal Vega 64 is around 170K; lose around 25% of that to Thunderbolt 3 limitations, and you have your 130K result.

The better the card, the more performance you will lose in an eGPU. The Vega 56 and 64 seem to be the sweet spot for eGPU use. A GTX 1080 or better would be overkill, and you would see a 30%+ loss.

I would in fact recommend the Vega 56, as it is usually cheaper and the performance loss and cost will be lower. The OpenCL score difference between the 56 and 64 is minimal according to Geekbench results.

Makes sense, thanks for the info. I am still curious, though: who are workstation cards recommended for? AMD markets them to media creators, but it doesn't seem like many people pay the premium for them, opting instead for consumer cards.
 
I just ordered a Razer Core X and a PowerColor Red Devil RX Vega 64 8GB HBM2.

I’m a big gamer, and after lots of research that’s what I settled on as the best, least-hassle setup.
 
I just ordered a Razer Core X and a PowerColor Red Devil RX Vega 64 8GB HBM2.

I’m a big gamer, and after lots of research that’s what I settled on as the best, least-hassle setup.
Just be aware that this will not work out of the box in Boot Camp; look up instructions on eGPU.io. I think it was during one of the Mojave betas: when I got my Razer Core X I booted Windows, it didn't work, and while I was browsing the web looking for a solution I smelled burning plastic. The Vega FE had its fan off but was in a high power state. You could have cooked on top of the Razer and burned your hand touching the outside. It scared the hell out of me; I thought I had just fried $1K worth of equipment. It survived, though. I didn't try to boot Windows again until Mojave was released officially, and it doesn't do it anymore.
 
Just be aware that this will not work out of the box in Boot Camp; look up instructions on eGPU.io. I think it was during one of the Mojave betas: when I got my Razer Core X I booted Windows, it didn't work, and while I was browsing the web looking for a solution I smelled burning plastic. The Vega FE had its fan off but was in a high power state. You could have cooked on top of the Razer and burned your hand touching the outside. It scared the hell out of me; I thought I had just fried $1K worth of equipment. It survived, though. I didn't try to boot Windows again until Mojave was released officially, and it doesn't do it anymore.

Thanks :)

eGPU.io is pretty awesome. I see there is a guide there for Boot Camp! :)

Tbh the only game on the horizon I'm really itching to play that needs Boot Camp is Witcher 3. There are already so many games on Steam for Mac, and I have Witcher 2 ready to start on Mac when the new GFX card arrives!
 
This is pretty much what you can expect from eGPUs. The OpenCL score for an internal Vega 64 is around 170K; lose around 25% of that to Thunderbolt 3 limitations, and you have your 130K result.

The better the card, the more performance you will lose in an eGPU. The Vega 56 and 64 seem to be the sweet spot for eGPU use. A GTX 1080 or better would be overkill, and you would see a 30%+ loss.

I would in fact recommend the Vega 56, as it is usually cheaper and the performance loss and cost will be lower. The OpenCL score difference between the 56 and 64 is minimal according to Geekbench results.

According to Geekbench scores, the 580 gets 113K in OpenCL vs the Vega 64’s 130K, so it really seems like macOS essentially flatlines on benefits past the 580 (maybe that’s why they only bothered to promote the Blackmagic 580 setup to start with). In Boot Camp, with the same Vega 64 over the same Thunderbolt cable, I got something like 178K. So it’s not the eGPU bottlenecking; it’s either the OS or software optimization on the Mac platform.

It’s actually amazing to see how well the 560X runs in Boot Camp. It choked on video memory limitations more than I noticed raw speed as a handicap.
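To put a number on the gap described above, a small sketch (using the 130K macOS and ~178K Boot Camp figures reported in this post; these are one user's results, not a controlled benchmark) suggests macOS leaves roughly a quarter of the card's Boot Camp throughput unused.

```python
# Compare the same Vega 64 eGPU's Geekbench OpenCL scores under macOS
# vs Boot Camp (figures from the post above: 130K macOS, ~178K Boot Camp).

def relative_loss(macos_score: int, bootcamp_score: int) -> float:
    """Fraction of Boot Camp performance lost when running under macOS."""
    return 1.0 - macos_score / bootcamp_score

loss = relative_loss(130_000, 178_000)
print(f"{loss:.0%} lower under macOS")  # prints "27% lower under macOS"
```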
 
According to Geekbench scores, the 580 gets 113K in OpenCL vs the Vega 64’s 130K, so it really seems like macOS essentially flatlines on benefits past the 580 (maybe that’s why they only bothered to promote the Blackmagic 580 setup to start with). In Boot Camp, with the same Vega 64 over the same Thunderbolt cable, I got something like 178K. So it’s not the eGPU bottlenecking; it’s either the OS or software optimization on the Mac platform.

It’s actually amazing to see how well the 560X runs in Boot Camp. It choked on video memory limitations more than I noticed raw speed as a handicap.


Oh wow, that's a massive difference. I was going with AMD to avoid the hassle, but when there is such a difference it seems like Boot Camp is a must whatever card you use. But then I could also just buy an Nvidia card like the new 2080 Ti and sack off ever using it on the Mac partition. Why get AMD when the Mac side is that badly optimised? :/
 
As somebody who doesn't follow Vega, why didn't you pick an Nvidia card? Don't the Adobe programs use CUDA if available, and thus run far faster on Nvidia cards?

A long-standing Adobe CUDA myth that needs to die.

Adobe uses the Mercury engine for acceleration, which piggybacks on top of OpenGL and OpenCL. But since those are being deprecated in macOS, all of Adobe's apps will migrate to full Metal support very soon. We can see that happening on the iPad iOS apps first.

There is some CUDA support in some Adobe apps, but it means nothing in real-world use. I have tested exporting video very thoroughly using OpenCL or CUDA, and the difference is negligible. These APIs are dying and will be replaced on macOS and Windows.
 
I just ordered a Razer Core X and a PowerColor Red Devil RX Vega 64 8GB HBM2.

I’m a big gamer, and after lots of research that’s what I settled on as the best, least-hassle setup.

I'll be curious how it does. I tried to play BOPS 4 last night on my new MBP. It played fine, except I saw a lot of screen tearing. I had VSYNC on too, but tried it both on and off. I couldn't tell what my FPS was, since I haven't figured out how to turn it on in the game. This is with my U3415W Dell ultrawide. I used to play fine with this monitor on my old gaming system, but then upgraded to a 165 Hz monitor before I sold it with that system. I wonder if it's just me not being used to 60 Hz anymore...
 
It’s actually amazing to see how well the 560X runs in Boot Camp. It choked on video memory limitations more than I noticed raw speed as a handicap.

I think Windows does a great job making all kinds of hardware work, and it's kind of a shame that Apple lets performance go unused in its own OS.
 