
stevemiller

macrumors 68020
Original poster
Oct 27, 2008
so after the disappointment of missing out on being able to exchange my 560x machine for a Vega 20, I indulged in a Black Friday deal on a Vega 64 + Razer Core X enclosure.

just tried them out this evening. I'm not getting any errors, but across every test I've tried, the egpu seems to be making the system perform worse than the built-in 560x.

on a Blender Cycles GPU benchmark, the Vega 64 is literally twice as slow as the 560x. I tried it with an external monitor running directly off the card, to avoid two-way throughput on the Core X's included Thunderbolt 3 cable, and again with no external display; both gave the same terrible results. tried rebooting, disconnecting and reconnecting the gpu, nothing helped.

thinking it might be a Blender issue, I tried the Cinebench OpenGL test. not only was the Vega 64 still slower than the built-in 560x, but when I ran the benchmark on the 560x with the egpu still plugged in, even the 560x ran slower than when the Vega 64 was fully disconnected from the system.

I'm on Mojave, 10.14.1. can anyone think of anything I'm overlooking, or do I potentially have a lemon gpu/enclosure?

edit: screengrab of cinebench results:
1. 560x with egpu unplugged
2. Vega 64
3. 560x with egpu still plugged in
 
Try running a 3D benchmark such as Unigine Heaven or GFXBench. If those are slower, there is something wrong. I wouldn't pay much attention to Cinebench; that benchmark doesn't work.
 
I think we both ordered the same thing after missing out on the exchange for the Vega. Mine arrives tomorrow. I can run these same tests and let you know what I get.
 
I opted to just get a 555x / 2.2GHz / 32GB RAM instead of a Vega 20 config ($700 difference). I'd like to see the benchmarks!
 
I would think an egpu would be faster. Having the Vega in mine has helped a lot.
 
I've seen several "disappointing" eGPU performance threads on other forums, where people are getting significantly less performance from their GTX 1070/1080 Ti on their TB3-equipped Windows laptops etc - so it could be an eGPU thing in general?
 
I've seen several "disappointing" eGPU performance threads on other forums, where people are getting significantly less performance from their GTX 1070/1080 Ti on their TB3-equipped Windows laptops etc - so it could be an eGPU thing in general?
TB3 is limited to PCIe x4 bandwidth, whereas a soldered or socketed GPU usually gets PCIe x16, so x4 can be a bottleneck for powerful GPUs.
 
An eGPU should (of course) be a lot faster:
http://barefeats.com/macbook_pro_2018_egpu.html
If you have already tried rebooting, you need to systematically eliminate/isolate the cause. Do you have another machine you could test with, to see if it shows the same behaviour?

agreed, something has to be up. in that barefeats article, they ran the exact Blender benchmark I was doing and got 10x better results (I know they had a Frontier Edition, but a Vega 64 should at least be in the same ballpark, and certainly not slower than the 560x).

as for a means of isolating the issue, I guess I'll see if there's anything at my office I can use to test the Vega 64 card.
 
Oh man, Blender is the main reason I would consider getting eGPU for my 2017 MBP. This is not what I wanted to read.
 
TB3 is limited to PCIe x4 bandwidth, whereas a soldered or socketed GPU usually gets PCIe x16, so x4 can be a bottleneck for powerful GPUs.

An eGPU only takes a 5-10% performance hit compared to an internally installed desktop card. He should still be miles and miles ahead of his 560X.

The error is in his configuration. Either the app isn't configured correctly or it doesn't support eGPU. He can try forcing the eGPU via the "Prefer External GPU" checkbox in the app's Get Info window.
 
TB3 is limited to PCIe x4 bandwidth, whereas a soldered or socketed GPU usually gets PCIe x16, so x4 can be a bottleneck for powerful GPUs.

True, but this only becomes really noticeable if there is a lot of data copying to and from the host machine. And since the PCIe bus is already an order of magnitude slower than the GPU's own memory, performance would be bad in such a scenario even with 16 lanes. For real-world applications, going down to 4 PCIe lanes results in around a 10% performance reduction on average, which is not terrible.
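To put rough numbers on that: here's a back-of-envelope sketch (assuming PCIe 3.0's roughly 0.985 GB/s of usable bandwidth per lane, and ignoring Thunderbolt's own protocol overhead) comparing how long a one-time scene upload would take over x16 vs x4. The 2 GB scene size is just an illustrative figure, not from any benchmark:

```python
# Back-of-envelope PCIe transfer-time comparison (approximate figures).
# Assumes PCIe 3.0 at ~0.985 GB/s of usable bandwidth per lane.
PER_LANE_GBS = 0.985

def transfer_time(data_gb, lanes):
    """Seconds to move data_gb gigabytes over the given number of lanes."""
    return data_gb / (lanes * PER_LANE_GBS)

scene_gb = 2.0  # hypothetical scene: textures + geometry uploaded once per render
t_x16 = transfer_time(scene_gb, 16)  # internal slot
t_x4 = transfer_time(scene_gb, 4)    # best case over Thunderbolt 3

print(f"x16: {t_x16:.2f}s  x4: {t_x4:.2f}s  extra: {t_x4 - t_x16:.2f}s")
```

Even in that case the narrower link only costs a fraction of a second extra on a render that runs for minutes, which is why lane count alone can't explain a 2x slowdown.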
 
True, but this only becomes really noticeable if there is a lot of data copying to and from the host machine. And since the PCIe bus is already an order of magnitude slower than the GPU's own memory, performance would be bad in such a scenario even with 16 lanes. For real-world applications, going down to 4 PCIe lanes results in around a 10% performance reduction on average, which is not terrible.

Could it be to do with using the internal vs an external display? I've read that eGPUs perform noticeably better on external displays?
 
Could it be to do with using the internal vs an external display? I've read that eGPUs perform noticeably better on external displays?
To me this would make sense: when connected to an external display, the GPU just outputs directly to the display it is plugged into.

For the internal display, the GPU's output has to travel back up the TB3 cable and then somehow get funneled to the internal screen through the CPU.
 
a little bit more info to follow up:

some further benchmarks at the very least suggest the egpu hardware likely isn't the issue. LuxMark and Geekbench OpenCL scores appear in line with what I'd expect. Likewise, Unigine Valley benchmarked similarly to results I'd seen online. Unigine Heaven was for some reason a bit lower than expected, but still decent.

the bigger issue is of course I didn't buy it to run benchmarks! I use blender pretty extensively for work and for personal projects, and I'd hoped the Vega 64 would give a good boost to both Cycles GPU rendering and the upcoming Eevee realtime viewport. There's clearly some sort of bug with my particular combination of egpu + macOS + blender that's preventing those benefits from being realized.

eager to hear if there are any other users out there who are trying to do the same setup as me and have had similar issues or better luck. or if FrostyF wants to continue to twist the knife, I'd be curious to hear how the Vega 20 performs running the blender bmw27 benchmark! :p

I'm also going to get Unity and Unreal installed tonight and see if the card gives me any benefits there. otherwise it's gonna be tough to justify keeping the egpu. :(
 