
Hello,

Looking for recommendations for good eGPU for 2020 13" MacBook Pro, 10th Gen i7, 1TB, 16GB. MBP is on order, should be arriving soon!

Intended applications are gaming, video editing/processing (including 360 videos) with FC Pro, photos, etc

Thanks in advance for your help!
 
I recommend getting the Razer Core X. It's a great but fairly affordable box that you can put your own GPU into. That also means it doesn't get outdated as easily as a fixed-GPU box, and you can upgrade it down the road. I'd then advise putting either a Navi card like the 5700 or 5700 XT in it, or, if you can get one really cheap, a Vega card.

Do not expect photo work to be improved by it. Video editing and gaming, yes, but photo work would be slower using the eGPU, as the time to transfer the assets over the Thunderbolt link would be more than the time to just process them locally.
 
Thanks guys! Seems to be a unanimous recommendation for the Razer Core. The enclosure + graphics card option is more expensive but seems to be the way to go. I am also leaning towards the 5700 XT graphics card.

Any thoughts on whether the Razer Core X Chroma is worth the extra $100?

I also have an older Dell 5K monitor (UP2715K) which needs two DisplayPort connections in 5K mode (the 5K signal is split between the two DisplayPorts). I believe variants of the 5700 XT cards have multiple DisplayPort outputs. Will I be able to connect the 5700 card to the monitor in 5K mode?

Thanks again.
 
I purchased a Sonnet Breakaway 550 and placed a 5600XT inside and I couldn't be happier. Most of my use is video editing, some After Effects, graphic design and some light gaming such as Fortnite.

Photoshop performance has been top notch for me. Same with Illustrator.

Chroma == $100 for a TB3 dock... I thought that was a deal.
 
OK, cool.
Good to know. Thanks for sharing!
 
I have a 13" 10th gen I5 32Gb Ram on the way. I was considering an eGPU for some video work but most of my work is 2D graphic stuff. Someone here mention that it actually makes it slower for image editing...
How do you do? Do you switch it off when working on photos and graphic design work?

I'm confused....
Thanks!
 
I have a 13" 10th gen I5 32Gb Ram on the way. I was considering an eGPU for some video work but most of my work is 2D graphic stuff. Someone here mention that it actually makes it slower for image editing...
How do you do? Do you switch it off when working on photos and graphic design work?

I'm confused....
Thanks!

So it's a bit more complicated than 2D just being slower. Photoshop will more often than not be faster with the built-in GPU, but I would imagine Illustrator could be faster, though I don't have experience with that in particular.

Basically it comes down to this: how much time does it take to do the required computation, and how much time does it take to transfer the data that computation needs?

The internal GPU does computations more slowly but transfers data faster; for the external GPU it's the other way around.

So if a certain operation in a program takes 2 seconds to compute and 1 second to transfer on the internal GPU (way larger numbers than it'd probably be in reality for a lot of operations), that's 3 seconds in total. If the eGPU can do the calculation in 0.5 seconds but it takes 3 seconds to transfer, that's 3.5 seconds, even though it computed it way faster. So eGPU workflows benefit the most when there's a lot of computation needed without the data transfer size scaling as much as the computational needs.

Regarding turning it off: whether you can do that easily or need to eject it depends on how the program is written.
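
To put those numbers in concrete terms, here's a quick Python sketch of the same trade-off. The figures are the made-up ones from above, purely for illustration; real values depend on your Mac, the GPUs involved, and the data.

[CODE]
# Toy model of the trade-off: total time = time to move the data + time to compute.
# The numbers are the illustrative ones above, not measurements.

def total_time(transfer_s, compute_s):
    return transfer_s + compute_s

internal = total_time(transfer_s=1.0, compute_s=2.0)  # slower compute, but the data stays local
external = total_time(transfer_s=3.0, compute_s=0.5)  # fast compute, but everything crosses the TB3 link

print(f"internal GPU: {internal:.1f} s, eGPU: {external:.1f} s")
print("eGPU wins" if external < internal else "internal GPU wins")
[/CODE]

Flip the numbers towards heavy compute on roughly the same amount of data (say 10 seconds of compute against the same transfer times) and the eGPU comes out well ahead, which is exactly the "lots of computation without much extra data" case described above.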
 
I've seen YouTube videos where they said using the ports on the Chroma for something like an SSD, or anything more than a mouse and keyboard, would negatively affect GPU performance due to the limited bandwidth of TB3. I'm not sure if this is true because I have the regular version.
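
For what it's worth, the concern is easy to reason about as a rough bandwidth budget. All of the figures below are assumptions for illustration (around 22 Gbps is the commonly quoted usable PCIe data rate of a TB3 link), not measurements from a Chroma:

[CODE]
# Back-of-the-envelope budget for a single TB3 link; every figure here is an assumption.
TB3_USABLE_GBPS = 22.0  # commonly quoted usable PCIe data rate of Thunderbolt 3
gpu_gbps = 18.0         # assumed GPU traffic during a heavy gaming or export workload
ssd_gbps = 6.0          # assumed fast external SSD hanging off the dock's ports

shortfall = (gpu_gbps + ssd_gbps) - TB3_USABLE_GBPS
if shortfall > 0:
    print(f"Oversubscribed by {shortfall:.0f} Gbps - GPU and SSD traffic will contend for the link")
else:
    print("Enough headroom for both")
[/CODE]

A mouse and keyboard use a negligible slice of that budget, which is why they are usually considered fine.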
 
This is an excellent explanation! Thanks so much!
I'm currently using mostly Affinity Photo and Designer (to avoid Adobe's subscription model), and Sketch for UI design. Affinity apparently supports multiple GPUs and eGPUs, so I'll test it. Ideally, applications would intelligently know which workloads to keep on the internal GPU and which are better delegated to the eGPU, so you'd always come out ahead.
 
The issue with making apps automatically pick the GPU is that the app can't know in advance how long an operation will take on either one until it has actually done it. It will change between Mac models, between GPU models in both the internal and external setup, and even between data. As an arbitrary example, if you apply a colour filter to a totally white photo in Affinity versus a normal picture taken with a camera, the execution time could be different.

The best that could be done would be rough estimates, but to get the best overall result you'd still have to do your own testing, with your own Mac and GPUs, on data that roughly represents your normal workflow.
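
If you do end up testing it yourself, the idea is just to time a representative operation with the eGPU preferred and again with it ejected, then compare. A minimal Python sketch of that kind of timing loop is below; render_preview is a hypothetical placeholder for whatever operation dominates your own workflow (in practice you'd trigger the real operation in Affinity or Sketch rather than run a script).

[CODE]
import time

def average_seconds(run_operation, repeats=5):
    # Average a few runs to smooth out noise from caching and background tasks.
    start = time.perf_counter()
    for _ in range(repeats):
        run_operation()
    return (time.perf_counter() - start) / repeats

def render_preview():
    # Hypothetical stand-in for a representative operation on your own data.
    sum(i * i for i in range(1_000_000))

print(f"average: {average_seconds(render_preview):.3f} s per run")
[/CODE]

Run the same measurement in both configurations and keep whichever setup wins for the work you actually do.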
 