I think it could be a toss-up between the Sonnet box and the OWC box... They look to be the same.
Back in the days of FireWire, SCSI and Zip drives, I purchased a Sonnet processor upgrade for a G4. It fried, and I was not happy at the time. I replaced it with one from OWC and it fried too, but they replaced it really fast and all was well in the world. Guess I'm still holding a little grudge.
 
So the only way I can get my Mini to use the eGPU is to connect the monitor to the Mini on initial startup, then disconnect it and connect it to the eGPU to get eGPU graphics. Is there a workaround for this?

Update:

1. If you have a Mac mini (2018) with FileVault turned on, make sure to connect your primary display directly to Mac mini during startup. After you log in and see the macOS Desktop, you can unplug the display from Mac mini and connect it to your eGPU.
 
So the only way I can get my Mini to use the eGPU is to connect the monitor to the Mini on initial startup, then disconnect it and connect it to the eGPU to get eGPU graphics. Is there a workaround for this?

Update:

1. If you have a Mac mini (2018) with FileVault turned on, make sure to connect your primary display directly to Mac mini during startup. After you log in and see the macOS Desktop, you can unplug the display from Mac mini and connect it to your eGPU.
If your monitor has more than one input, is there a problem/downside to having both the mini and the eGPU connected using different inputs, and just switching the input on the monitor from the mini (during boot and login) to the eGPU after login is complete? The only downside I can identify is the use of an extra port on the mini.
 
A comment about the Asus housing and the Blackmagic housing...

The important thing about the e-mail from Asus Support (post #186) is that one can purchase the XG Station Pro housing knowing that one can use it with the Vega 56 GPU. I'd want to do some careful calculations before using it with the Vega 64, but I'm wary of the 64 anyway for reasons of heat and noise, and I'm not convinced that I even have a use case for it.

I like the build quality, small footprint and looks of the Asus housing, and especially that the power supply is separate from the housing and that it comes with a 1.5m/5' Thunderbolt 3 cable.

The main argument for the Blackmagic is that it is allegedly markedly quieter than other enclosures. I can't speak for the OWC, Razer or Sonnet enclosures, but having used the Asus with an RX 590, I think that, at least relative to the Asus, this is just sales talk with no basis in reality.

At the same time, the Blackmagic has a clear weakness; namely, that one can't change out the GPU, whether to the new RX 590, a Vega, a GPU from the upcoming AMD Navi series, or an Nvidia GPU when, and if, Apple and Nvidia bury the hatchet.

The price of the Blackmagic may have been understandable when it was released, but with prices for GPUs coming back to earth I can't imagine forking out US$700 for it.

Here in the UK, the Blackmagic Pro with its Vega 56 is £1199. The Asus box is around £250; add a Vega 56 card for, what, £500, and that's a saving of about £450!
The standard Blackmagic price I could just about stomach, but the Pro model is just a pure rip-off.
 
So the only way I can get my Mini to use the eGPU is to connect the monitor to the Mini on initial startup, then disconnect it and connect it to the eGPU to get eGPU graphics. Is there a workaround for this?

Sorry for the question: will I have the same problem if I use the mini with a monitor that has 2-3 inputs? (My monitor has DisplayPort, HDMI, VGA, etc.)


Do you have the problem every time you restart, or only on the first boot?
 
Here in the UK, the Blackmagic Pro with its Vega 56 is £1199. The Asus box is around £250; add a Vega 56 card for, what, £500, and that's a saving of about £450!
The standard Blackmagic price I could just about stomach, but the Pro model is just a pure rip-off.

The U.S. price of the Blackmagic Pro is $1200. The August 2017 retail launch price of the Vega 56 GPU was $400.* On Black Friday weekend, ASRock was selling its version of the Vega 56 for $340, and I believe that the retail price is going to settle in the next few weeks at $400 to $450. This means that you are paying about $750 to $800 for the enclosure.

It's instructive to read through Blackmagic's web page on its RX 580 and Vega 56 products: https://www.blackmagicdesign.com/products/blackmagicegpu/

Blackmagic's main claim is that its enclosure is "super quiet". However, there does not appear to be any difference in design between its RX 580 and Vega 56 enclosures, even though the Vega 56 is the more powerful GPU. Also, Blackmagic essentially says, correctly, that noise is a function of fan operation:

Extruded from a single piece of aluminum, the Blackmagic eGPU features a machine anodized finish and a unique thermal grill that's designed for balanced airflow, convection cooling and efficient heat dissipation. This allows the variable speed fan to run more slowly, resulting in super quiet operation.

The fact of the matter is that the AMD "partners" that make these AMD GPUs differentiate their products, in part, via heat dissipation and fan operation features. Looking at gaming sites, it's clear that gamers believe, rightly or wrongly, that some partner designs are better at heat dissipation and quiet operation than others. The key point is that there is no difference in principle between what these partners do and what Blackmagic is doing. Indeed, there's a pretty good chance that one of the partners designed the heat dissipation and fan operation features of the GPUs in Blackmagic's enclosure.

For the last three weeks, I've been using Asus's XG Station Pro enclosure with Sapphire's Nitro+ RX 590 GPU, including for gaming with the X-Plane flight simulator. Heat dissipation and noise are absolutely not problems. I have not tested the Blackmagic GPU, but I have a lot of trouble believing that it is somehow "better" when it comes to these issues. Indeed, Blackmagic does not even claim that its enclosure is better; this is just how some people apparently choose to interpret its statements.

Meanwhile, one winds up with an external GPU that can't be upgraded. Forget about AMD's roadmap, which calls for replacement of the Polaris series of GPUs over the next year, and forget about using a Nvidia GPU when, and if, Nvidia and Apple settle their differences.

At US$1200, the Blackmagic Pro strikes me as a very hard sell. Blackmagic's prices might have been understandable when GPUs were selling for crazy money due to demand from cryptocurrency miners, but not now.

* For launch dates and original prices of AMD GPUs, see post #189.
 
I would like to see these enclosures in person, but so far I haven't identified a store in NY where I am able to do so. B&H, which stocks all three (Asus, Razer and Sonnet), apparently does not have any of them on display.

Based solely on reading and videos, I am inclined to purchase the Asus. It looks well built, it looks good, I like that it has a power switch and a 1.5m Thunderbolt cable, and that the power supply is outside the enclosure. It is also said to be the quietest of the three.

I would place the Sonnet enclosures second. My main reservations are looks, size, noise and suggestions on the internet that there may be issues with quality of components, including with the ports. I would like to get clarity on the third issue in particular.

I am impressed with the Razer's design as it relates to getting a card in and out of the enclosure, but given that I don't expect to be changing cards with any regularity, it's not an important feature. For me, the Razer is too big and, by all accounts, too noisy.

I also regard the Sonnet and Razer as complete non-starters unless I can use them with a cable at least 1.5m long. As I understand it, that may be possible given that I will not be using the enclosure to power a laptop (apparently, it's a bandwidth issue). A cable would apparently add about US$60 to the cost. I plan to send an e-mail to both manufacturers asking about this.

I do use the Sonnet Breakaway Box with a 2m Thunderbolt 3 cable (€70!) and it works fine.
 
If you have a Mac mini (2018) with FileVault turned on, make sure to connect your primary display directly to Mac mini during startup. After you log in and see the macOS Desktop, you can unplug the display from Mac mini and connect it to your eGPU.

@Rockies's post, which is the clearest and most constructive post that I've seen on the FileVault issue, prompted me to try something with my monitor that may be useful to others.

I have been interested in finding an easy way to switch between my mini's internal GPU and my external GPU. There are many things for which I don't need the external GPU and I'd like to minimise power consumption. If you use FileVault, or if your eGPU generates unwelcome noise or heat, an easy way to switch might also be attractive.

On my new monitor, I can summon up an on-screen control panel that lets me select my monitor input port. I have a choice of four HDMI ports, one DisplayPort port and one Thunderbolt port.

As an experiment, I ran an HDMI cable from my mini's HDMI port to one of the monitor's HDMI ports. Then I ran a Thunderbolt 3 cable from one of my mini's Thunderbolt 3 ports to the Thunderbolt port on my Asus external GPU enclosure. Finally, I ran a DisplayPort cable from my external RX 590's DisplayPort port to my monitor's DisplayPort port.

It turns out that I can use the monitor's on-screen control panel to switch between the monitor's HDMI connection to my mini and its DisplayPort connection to my external GPU. The only additional thing that I have to do is turn my external GPU enclosure on or off depending on which connection I want to use.

Coming from an iMac, I don't know whether my monitor is typical. If it is, there would seem to be a simple way to switch between the mini's internal GPU and its external GPU, without the need to mess with cables and port plugs.

P.S. I haven't tested it, but there's no reason why I couldn't do this using two HDMI cables and ports (my RX 590 has two HDMI ports) rather than one HDMI and one DisplayPort.
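
On the software side, if you ever want to confirm that macOS has actually attached the eGPU (for example, after switching monitor inputs), here is a minimal sketch using the stock system_profiler tool. The script is purely illustrative and not part of any product discussed here:

#!/usr/bin/env python3
# Minimal sketch: list the GPUs macOS currently sees, using the stock
# system_profiler tool (assumes macOS; illustrative only).
import subprocess

report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

# Each "Chipset Model" line names a GPU, e.g. "Intel UHD Graphics 630"
# or "Radeon RX 590"; the eGPU appears here once macOS has attached it.
for line in report.splitlines():
    if "Chipset Model" in line:
        print(line.strip())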
 
I have a question arising from the test discussed in the post above (#208). Coming from an iMac, my knowledge of current monitor functionality is very limited. Can anyone say whether the monitor function discussed in that post - ability to easily switch between input ports - is now common?

I do realise that this is not the sort of issue that most people are likely to encounter with their monitor day to day, and I'd be surprised if it gets much attention, if any, in user manuals. As far as I know, this idea has not even been raised, let alone tested, in the many discussions about FileVault.

If anyone can test whether their monitor supports this functionality, that would be great. It is highly unlikely that the monitor that I am using is the only one that will support this.

Thanks
 
I have a question arising from the test discussed in the post above (#208). Coming from an iMac, my knowledge of current monitor functionality is very limited. Can anyone say whether the monitor function discussed in that post - ability to easily switch between input ports - is now common?

I do realise that this is not the sort of issue that most people are likely to encounter with their monitor day to day, and I'd be surprised if it gets much attention, if any, in user manuals. As far as I know, this idea has not even been raised, let alone tested, in the many discussions about FileVault.

If anyone can test whether their monitor supports this functionality, that would be great.

Thanks

Yep, extremely common. Some monitors even have built-in KVMs with multiple USB inputs as well.
 
Yep, extremely common. Some monitors even have built-in KVMs with multiple USB inputs as well.

Ha!

For those, like me, who haven't got a clue what KVM stands for, it's apparently a Keyboard, Video, Mouse switch.

There's been a lot of ink spilled on external GPUs that could have been more focused, if not saved, if people were thinking in these terms.
 
Ha!

For those, like me, who haven't got a clue what KVM stands for, it's apparently a Keyboard, Video, Mouse switch.

There's been a lot of ink spilled on external GPUs that could have been more focused, if not saved, if people were thinking in these terms.

Yeah, I've mentioned it several times now throughout the various eGPU discussions. Yet I'm still regularly seeing people afraid to embrace eGPU because they don't realize they only have to press a few buttons on their monitor when they boot into Windows or when entering the FileVault/firmware password.

The value added by an eGPU is much greater than the slight hassle of pressing buttons on one's monitor during bootup.
 
If your monitor has more than one input, is there a problem/downside to having both the mini and the eGPU connected using different inputs, and just switching the input on the monitor from the mini (during boot and login) to the eGPU after login is complete? The only downside I can identify is the use of an extra port on the mini.

I do realise that this is not the sort of issue that most people are likely to encounter with their monitor day to day, and I'd be surprised if it gets much attention, if any, in user manuals. As far as I know, this idea has not even been raised, let alone tested, in the many discussions about FileVault.
 
Is anyone running a Gigabyte Vega 56 on their Mac? Any compatibility issues, or is it straight plug-and-play?
 
Does using an eGPU over Thunderbolt 3 cause a performance penalty on your GPU?

For instance, I'm looking into a Mac mini or MacBook Pro 15" and pairing one of those with a Vega 64. If there's a performance penalty, would it have the equivalent performance of a Vega 56, or something slower?

Currently I'm leaning towards a 15-inch 2018 MacBook Pro with an eGPU as my sole computer for FCPX editing.

I'm sure I've asked this question in this thread before, but I've not found a conclusive answer with facts to back it up anywhere online. I also don't feel like experimenting, as it would be quite an expensive experiment.
 
Does using an eGPU over Thunderbolt 3 cause a performance penalty on your GPU?

For instance, I'm looking into a Mac mini or MacBook Pro 15" and pairing one of those with a Vega 64. If there's a performance penalty, would it have the equivalent performance of a Vega 56, or something slower?

Currently I'm leaning towards a 15-inch 2018 MacBook Pro with an eGPU as my sole computer for FCPX editing.

I'm sure I've asked this question in this thread before, but I've not found a conclusive answer with facts to back it up anywhere online. I also don't feel like experimenting, as it would be quite an expensive experiment.

Yes, there is a performance penalty.

https://forums.macrumors.com/threads/external-gpu-egpu-resources.2154653/page-5#post-26823254
 
Yeah, I've mentioned it several times now throughout the various eGPU discussions. Yet I'm still regularly seeing people afraid to embrace eGPU because they don't realize they only have to press a few buttons on their monitor when they boot into Windows or when entering the FileVault/firmware password.

The value added by an eGPU is much greater than the slight hassle of pressing buttons on one's monitor during bootup.

Are you connecting the monitor to both the Mac mini and the eGPU?
 
Does anyone know if these comparisons make sense for X-Plane 11?

I have an RX 590 and I've started using it with X-Plane 11. If you use X-Plane, you know that the bottom line is frame rate, and that frame rate depends not just on choice of resolution, but very much on choices in graphics/rendering settings and flight area, some flight areas requiring manipulation of a lot more visual data than others (e.g. NYC vs Nantucket Island). Plus, some of X-Plane's rendering settings are CPU intensive and some are GPU intensive.

There are so many performance variables in X-Plane - one can spend a lot of time tinkering with them - that I think it's a good idea to keep in mind @rmdeluca's caveat in his post: "These are of course just guidelines. Performance in individual titles and vs. different models of GPU cards will vary."
 
Yeah, it depends on how the GPU is being utilized. For gaming, the trend will usually be that you see more relative slowdown as you transition from GPU-limited to CPU-limited frame rates. In other words, as resolutions and quality settings go down and frame rates increase.

If you want to get the equivalent of a GPU's native performance (i.e. if it were seated in an x16 PCIe 3.0 slot in an otherwise equivalent spec PC), the rule of thumb is that you need to go up one "class" of GPU. For example, my GTX 1080 as an eGPU is doing almost exactly what a GTX 1070 in a desktop can do at 3440x1440 in Rise of the Tomb Raider with Ultra settings (49.56 vs. 49.7 FPS).

So, if you want:

- Native GTX 1080 Ti performance from an eGPU, use an RTX 2080 Ti. This is your best shot for 4K @ 60 FPS ultra settings in modern games.
- Native GTX 1080 performance from an eGPU, use an RTX 2080 or GTX 1080 Ti. This should be good for WQHD (3440x1440) @ 60+ FPS ultra settings in modern games. Slightly overkill for 2560x1080 or even QHD (2560x1440).
- Native GTX 1070 performance from an eGPU, use a GTX 1080, RTX 2070 or Vega 64. This should be good for 1080p @ 60+ FPS ultra settings in modern games.
- Native GTX 1060 performance from an eGPU, use a GTX 1070 or Vega 56. This should be fine for 1080p at high settings.
- A Radeon RX 590 will be fine for gaming at medium settings @ 1080p.

These are of course just guidelines. Performance in individual titles and vs. different models of GPU cards will vary.

These guidelines do not apply if you're using the eGPU for something else like OpenCL/CUDA.
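
To make the rule of thumb easy to apply, here is a minimal sketch that just encodes the pairings from the list above as a lookup table. The card names come from this thread; the helper function itself is purely illustrative:

# Hedged sketch: the "go up one class" rule of thumb from the list above,
# encoded as a lookup table. These are guidelines, not measurements.
NATIVE_TO_EGPU = {
    "GTX 1080 Ti": ["RTX 2080 Ti"],
    "GTX 1080": ["RTX 2080", "GTX 1080 Ti"],
    "GTX 1070": ["GTX 1080", "RTX 2070", "Vega 64"],
    "GTX 1060": ["GTX 1070", "Vega 56"],
}

def egpu_candidates(native_target):
    """Cards that, used over Thunderbolt 3, should roughly match the
    named card's native x16 PCIe 3.0 performance."""
    return NATIVE_TO_EGPU.get(native_target, [])

print(egpu_candidates("GTX 1070"))  # ['GTX 1080', 'RTX 2070', 'Vega 64']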
So, for instance, I'm only planning to use the eGPU with Final Cut Pro and DaVinci Resolve, so I shouldn't lose performance based on that, correct?
 
So, for instance, I'm only planning to use the eGPU with Final Cut Pro and DaVinci Resolve, so I shouldn't lose performance based on that, correct?

No. You'll almost always lose some performance running a card over TB3 as opposed to what it could have done if it were running natively in an x16 PCIe 3.0 slot with all other hardware being the same. In most workloads, we're seeing the overhead come out to around 15-20%.

What you really care about though is performance relative to what you have now.

The UHD 630 iGPU in your Mini is anywhere from 3 to 13 times slower than the GPUs you currently can use under Mojave, if we include all AMD GPUs between an RX 560 and RX Vega 64, inclusive.

If you have no prior reference to compare to other than the 2018 Mini's iGPU, any card in the above range will be faster with workloads that can properly utilize an eGPU than what you had before. The only time the overhead really matters is if you're trying to replicate a specific performance level seen on an otherwise equivalent desktop machine. Then, the "go up one class of GPU" rule of thumb I provided will rarely steer you wrong.
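
As a rough worked example of that overhead (the 15-20% figure is from this thread; the helper is illustrative only):

# Back-of-the-envelope only: applies the ~15-20% Thunderbolt 3 overhead
# quoted above to a card's native performance.
def egpu_effective(native_perf, overhead=0.175):
    """Estimate eGPU throughput as a fraction of native x16 PCIe 3.0
    performance, assuming the midpoint of the 15-20% overhead range."""
    return native_perf * (1.0 - overhead)

# A card 10x faster than the UHD 630 natively is still ~8x faster over
# Thunderbolt 3, so the overhead rarely changes the buying decision.
print(egpu_effective(10.0))  # -> 8.25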

If you're buying an eGPU to accelerate specific professional tasks, you should buy as much GPU as your budget allows and not worry about the fact that it's going to run 15 or 20% slower as an eGPU. Right now the bang-for-buck sweet spot is between an RX 580 and RX Vega 64. The price for an RX 580 as an eGPU starts at around $440:

https://www.amazon.com/Sonnet-Breakaway-Thunderbolt-Expansion-GPU-350W-TB3Z/dp/B077K8KNDS?th=1

The Vega 64 is currently available for around $400:

https://www.newegg.com/Product/Product.aspx?Item=N82E16814202326

But you'll have to put it in an enclosure that usually retails for around $350 (there are others that will work too):

https://www.sonnetstore.com/products/egfx-breakaway-box-650?variant=7209888055330
 
Sapphire, one of the AMD "partners", has started selling an eGPU enclosure. MSRP is US$330, but Amazon is selling it for $260. It's also being sold bundled with a Sapphire RX 580.

Overview here: https://egpu.io/forums/thunderbolt-...apphire-gearbox-thunderbolt-3-egfx-enclosure/

The screenshot below is from http://www.sapphirepgs.com/productdetail.asp?IDno=121&tag=spec&lang=eng



[Screenshot: Sapphire GearBox Thunderbolt 3 specifications, from the Sapphire product page]
 
I have an RX 590 and I've started using it with X-Plane 11. If you use X-Plane, you know that the bottom line is frame rate, and that frame rate depends not just on choice of resolution, but very much on choices in graphics/rendering settings and flight area, some flight areas requiring manipulation of a lot more visual data than others (e.g. NYC vs Nantucket Island). Plus, some of X-Plane's rendering settings are CPU intensive and some are GPU intensive.

There are so many performance variables in X-Plane - one can spend a lot of time tinkering with them - that I think it's a good idea to keep in mind @rmdeluca's caveat in his post: "These are of course just guidelines. Performance in individual titles and vs. different models of GPU cards will vary."
All true. Once you have your settings the way you like them, maybe you could share them so we can see what you can do with X-Plane on the RX 590 in an eGPU.
 
No. You'll almost always lose some performance running a card over TB3 as opposed to what it could have done if it were running natively in an x16 PCIe 3.0 slot with all other hardware being the same. In most workloads, we're seeing the overhead come out to around 15-20%.

What you really care about though is performance relative to what you have now.

The UHD 630 iGPU in your Mini is anywhere from 3 to 13 times slower than the GPUs you currently can use under Mojave, if we include all AMD GPUs between an RX 560 and Vega 64, inclusive.

If you have no prior reference to compare to other than the 2018 Mini's iGPU, any card in the above range will be faster with workloads that can properly utilize an eGPU than what you had before. The only time the overhead really matters is if you're trying to replicate a specific performance level seen on an otherwise equivalent desktop machine. Then, the "go up one class of GPU" rule of thumb I provided will rarely steer you wrong.

If you're buying an eGPU to accelerate specific professional tasks, you should buy as much GPU as your budget allows and not worry about the fact that it's going to run 15 or 20% slower as an eGPU. Right now the bang-for-buck sweet spot is between an RX 580 and Vega 64. The price for an RX 580 as an eGPU starts at around $440:

https://www.amazon.com/Sonnet-Breakaway-Thunderbolt-Expansion-GPU-350W-TB3Z/dp/B077K8KNDS?th=1

The Vega 64 is currently available for around $400:

https://www.newegg.com/Product/Product.aspx?Item=N82E16814202326

But you'll have to put it in an enclosure that usually retails for around $350 (there are others that will work too):

https://www.sonnetstore.com/products/egfx-breakaway-box-650?variant=7209888055330
Thanks for such a detailed response. I was considering getting the mini; however, I currently have a top-spec 2017 5K iMac with the 580, but I need a laptop and don't want to keep using two systems.

I'm considering a MacBook Pro 15" 2018 with the six-core mobile processor and a Vega 64 via eGPU. I'm hoping it'll give me better graphics performance in Final Cut for video effects and other GPU-accelerated tasks, since for certain videos I do, this element causes me the biggest delay.

I guess I'm trying to figure out whether a MacBook Pro with a Vega 64 can give me more performance than the 580 in the iMac.
 