If you want proper, official drivers that list support for Maxwell, Pascal, Turing, and RTX cards, you need to put the spotlight on Nvidia. People blaming Apple are misguided and not using their heads.
Again, NVidia submitted the drivers to Apple long ago. It's up to Apple to approve them, or, if there's a problem, send them back to NVidia for improvements. Apple has done neither, so the drivers are stuck in limbo, and that's squarely on Apple. Now, if Apple did communicate issues to NVidia and NVidia hasn't rectified them, then yes, that would be on NVidia. But from what we know, that's not the case.

It could be that Apple is releasing an API so companies can develop their own drivers directly, as the article linked above states, and that's why they aren't approving the drivers or communicating with NVidia: this will all be moot soon.
 
LoL, you just contradicted yourself.
Anyway, take a look here: https://www.techsiting.com/best-thin-and-light-gaming-laptops/
The Nvidia GeForce RTX 2070 8GB sounds pretty desktop-class to me.

Ok, I stand corrected, and I'm not afraid to say so :) Though if the desktop-class GPU is being heavily used, I wonder just how large an effect that has on battery life. The other benefits of an eGPU I wrote about are still valid (adding power, upgrading the GPU, etc.).
 
I really wish eGPU review articles would come up with some way of reporting noise levels for typical setups. While I care about performance and cost, noise is actually one of my top priorities (which, as you'll see, is why I bought a Blackmagic).

Incompatibility with Thunderbolt 3 displays is a problem. All of these eGPUs except the Blackmagic prevent you from using the LG UltraFine 5K.

I'm not sure most people here care about the LG UltraFine 5K though, do they? IMO, the Blackmagic has a lot more going for it than that. I don't have an LG 5K, but I bought the Blackmagic.

It’s interesting you assume it’s Apple’s fault.

I don't think we have to assume.

These eGPUs are confusing to me. Sure, having a laptop that you can game on would be cool, and sure, a laptop that you can do more heavy editing on would be cool as well. But at what point should we just get a desktop? The eGPU is $400 (if I understand correctly, that is not including the actual GPU...), another $300-400 for a great GPU, and a decent screen is another $500. That's almost $1,500. Seems crazy to me. Maybe I'm missing something?

Different workflows and use cases for different people. If you primarily use a laptop because you're out and about a lot, then it is great to just sit at your desk and plug in to a bunch more power, screens, etc. Or, in my case, I don't want to build a desktop and Apple didn't have any good desktops that fit my needs, so adding one to a Mac mini created the nearly perfect system for me.

And the only time you get a bottleneck is if you use a laptop with it without an external monitor. Then you'd take about a 30% performance hit. If you use an external monitor the performance hit is minor, if any at all.

... I prefer to use my Mac. It's not just about gaming but also media creation. My girls can take their MacBook Airs, just plug in, and use an ultrawide monitor and game on it too. Or work on video edits. Everyone in the house can make use of it. It's really rather convenient.

I'd also add: be careful about the ports on these eGPUs. If you're using them as a 'dock', you can easily hit performance bottlenecks. Yes, if you only use the GPU aspect, it's about a 10% hit or less (i.e., more like 'up to a 10% performance hit': you only see the full 10% if you're doing something that actually maxes out GPU communication).

Good point about usefulness though. My son has a 13" MBP, and my wife a MBA. If either of them needed some extra GPU performance (and could get it away from me) they could use it by just plugging in. And, if I ever need more GPU-power, I can get a new eGPU and pass my current one down to my son, etc.

Nvidia has no working relationship with Apple after burning all their bridges by releasing one POS card after another that caused all sorts of issues.

Yeah, and I personally got 'burned' by the whole nVidia thing with two MBPs I owned (one, Apple fixed for a while, but it eventually prematurely became useless to me), yet I'm enough of an adult that I'm past that... and I'm sure it was financially harder on me than on Apple.

While they play their little infantile spats, they are hurting a bunch of their users in substantial ways.

Also, with Windows computers you can have it all: gaming and productivity.

And, well, Windows.
I run my setup with Boot Camp, so I'm well aware of how both sides of the fence work (including decades of work in IT, much of it with Windows).

I cringe every time I need to do much work in Windows. Sure, once you're in a game, I suppose there isn't a ton of difference (unless you're trying to hook a controller up or something like that), but they aren't even close in terms of productivity overall (unless you're just talking raw hardware performance possibilities).

To many of us, the OS matters a lot too!

Actually, looking at the numbers again, this Razer enclosure with a 580 is still $100 cheaper than the equivalent Blackmagic. And you can upgrade your card, while you can't on the Blackmagic, which is a dealbreaker. Of course, you can get the cheaper Razer Core X enclosure and save $200.

Some of the early enclosures had issues; now it's fine. Have you read the crappy Blackmagic reviews on Apple's site?

Only $100 more? Worth every penny (and more) to me! The thing about the Blackmagic is the design, being quiet, compatibility, etc. (It's a bit like the Mac/Windows debate, actually.)

Yes, upgradability is a problem, but depending on your situation (see above), you can easily hand them down, so they don't become instantly useless.

re: reviews - I've read all kinds of reviews and articles (with comment sections), and most of them are clueless people just complaining about it being overpriced, having no upgradability, etc. Are there some real concerns I should be aware of? I love mine... guess I should go leave a review.

Does Windows support external GPUs?

Yes, but it can be tricky getting it going. That said, it seems to be improving all the time. Just weeks before I got mine going, it involved all kinds of really tricky hacks. I was able to get mine going (I was going to write an article on how) without any hacks, just some tricky 'right order' type stuff. Now, from what I've read, it's relatively straightforward (I haven't tried again yet), just knowing when to switch cables, etc.

I'm assuming/hoping it will eventually be almost plug-n-play with a core set of eGPUs.

(Note: I'm talking about Boot Camp and eGPUs here... they are plug and play in MacOS. Also, I 'haven't tried again' because I'm mid-project for a few months. After that, I'll probably start from scratch again just to see how the situation has improved.)
 
How do I connect this to my Newton? It's been running kinda slow lately...

I think you need to get the PCMCIA -> PCI adapter and then a PCI -> PCIe adapter. Connect a Thunderbolt 3 card and just a bit of driver software and you should be golden. I have an nVidia RTX 2080 TI connected to my HP 200LX and it almost doubled the performance of the internal display! :-D
 
Only $100 more? Worth every penny (and more) to me! The thing about the Blackmagic is the design, being quiet, compatibility, etc. (It's a bit like the Mac/Windows debate, actually.)

Yes, upgradability is a problem, but depending on your situation (see above), you can easily hand them down, so they don't become instantly useless.

Sure, everyone's needs and priorities are different. For me, upgradability is #1, then performance. I don't care about sound; I've been used to having several loud PCs running in the room for decades, so that's a non-issue for me. But I can see how it would be an issue for others.
 
Sure, everyone's needs and priorities are different. For me, upgradability is #1, then performance. I don't care about sound; I've been used to having several loud PCs running in the room for decades, so that's a non-issue for me. But I can see how it would be an issue for others.

There is also one other advantage of the Blackmagic I've recently discovered (and hopefully it is safe to do)...

[Image: the Blackmagic eGPU stacked on top of the Mac mini]


It fits nicely on top of the mini, and though I can't tell too much difference yet in terms of fan-speeds or temps, I would think it would help the overall cooling situation just a bit (as the body of the mini gets relatively hot). It also makes for a space-saving/neat desk setup.

But, yeah, some might not care about quiet at all. Then, I'd probably agree with the better value of upgradable eGPU boxes. But, even running 100%, this thing just makes almost no discernible sound unless you know what you're listening for and are right on top of it. The mini is dramatically more noisy (though if you turn off TurboBoost, it is generally pretty quiet too).

Partly, I just like a quiet environment. But, I also do recording, so quiet has practical application. :)
 
Oh I got it, but a stupid post deserves a stupid response, don’t you think?

Sorry that went right over your head... whooossshhhh

Makes sense now doesn't it? Try to keep up mmmmmm kaaaaaay?

So your point is that Macs that need this eGPU are “so far behind” Windows PCs... which... also need this eGPU???

Are you sure you thought this through? Guess you're the one who missed the point of your own post... D'oh!!!

Bizarre post, hmmmm... yup avatar checks out.

:confused::eek::rolleyes:

The eGPU is needed for WINDOWS laptops .... not WINDOWS DESKTOPS. Try to keep up .... um kaaaaaaaay?
 
There is also one other advantage of the Blackmagic I've recently discovered (and hopefully it is safe to do)...

[Image: the Blackmagic eGPU stacked on top of the Mac mini]

It fits nicely on top of the mini, and though I can't tell too much difference yet in terms of fan-speeds or temps, I would think it would help the overall cooling situation just a bit (as the body of the mini gets relatively hot). It also makes for a space-saving/neat desk setup.

But, yeah, some might not care about quiet at all. Then, I'd probably agree with the better value of upgradable eGPU boxes. But, even running 100%, this thing just makes almost no discernible sound unless you know what you're listening for and are right on top of it. The mini is dramatically more noisy (though if you turn off TurboBoost, it is generally pretty quiet too).

Partly, I just like a quiet environment. But, I also do recording, so quiet has practical application. :)
Really? My mini is dead silent unless I'm video encoding, which ramps up the temp and the fan. Other than that, I can't hear it at all. I don't turn off Turbo Boost.
 
How are Macs far behind in this sense? You can't have a desktop-class GPU in a laptop (dedicated power-hungry gaming laptops don't count), and good luck trying to upgrade any GPU that's in a laptop. An eGPU case such as this is the perfect device to gain graphics performance if you don't have, nor want, a desktop. The negative for some is the lack of Nvidia support in macOS, but that's not an issue at all for many. Now, if you are saying that iMacs are far behind, you might have an argument, but an eGPU like this can really benefit the laptop user.

I'm not referring to laptops; laptops will always be less powerful than a desktop because of the small form factor.

I'm referring to iMacs, Mac Pros, and Mac minis vs. Windows desktops.
Oh I got it, but a stupid post deserves a stupid response, don’t you think?

Sorry that went right over your head... whooossshhhh

Makes sense now doesn't it? Try to keep up mmmmmm kaaaaaay?

So your point is that Macs that need this eGPU are “so far behind” Windows PCs... which... also need this eGPU???

Are you sure you thought this through? Guess you're the one who missed the point of your own post... D'oh!!!

Bizarre post, hmmmm... yup avatar checks out.

:confused::eek::rolleyes:

BEST ... PRESIDENT ... EVER
 
Really? My mini is dead silent unless I'm video encoding, which ramps up the temp and the fan. Other than that, I can't hear it at all. I don't turn off Turbo Boost.

That's odd. At first, I thought maybe you didn't have the i7, but see you do from your profile. Maybe your other boxes are too noisy? ;) But, seriously, I was a bit surprised how easily the fans spun up on it. I wanted the more cores, but don't care so much about the TB most of the time, so I just disabled that. Now, I have to do something pretty crazy to get the fans to spin up.

Also, BTW, if you can use HEVC h.265 for your video encoding output, it is WAY faster than using the CPUs (as it uses the T2) and keeps the cores running cooler, too.
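For anyone who wants to try that hardware path from the command line, here's a minimal sketch using ffmpeg's VideoToolbox-backed encoder. This assumes an ffmpeg build with VideoToolbox support; `input.mov`, `output.mp4`, and the 8M bitrate are hypothetical placeholders, not anything from this thread.

```shell
# Hardware HEVC encode via VideoToolbox (a sketch, assuming ffmpeg was
# built with VideoToolbox support; file names and bitrate are examples).
# -c:v hevc_videotoolbox selects the hardware encoder instead of CPU x265;
# -tag:v hvc1 tags the stream so QuickTime/Apple players recognize it.
ffmpeg -i input.mov -c:v hevc_videotoolbox -b:v 8M -tag:v hvc1 output.mp4
```

One caveat others in this thread have hit: the VideoToolbox path doesn't do 10-bit HEVC on these machines, so if you need 10-bit you're back to software encoding.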
 
That's odd. At first, I thought maybe you didn't have the i7, but see you do from your profile. Maybe your other boxes are too noisy? ;) But, seriously, I was a bit surprised how easily the fans spun up on it. I wanted the more cores, but don't care so much about the TB most of the time, so I just disabled that. Now, I have to do something pretty crazy to get the fans to spin up.

Also, BTW, if you can use HEVC h.265 for your video encoding output, it is WAY faster than using the CPUs (as it uses the T2) and keeps the cores running cooler, too.
My other boxes are rarely turned on so it's just the mini.
I do use h.265, but the software I use, Handbrake, doesn't support hardware encoding. Well, it does with VideoToolbox; however, that doesn't do 10-bit encoding, which I need.
Can't use an AMD card and enclosure because none of those cards support 10-bit. Have to wait for Navi for those.
 
Again, NVidia submitted the drivers to apple long ago. It's up to apple to approve them.

They submitted drivers for Mojave, they claim, which have not been approved. This should tell you something stinks in the drivers because Apple can't lawfully reject drivers unless there is a problem. Do a quick Google search for the bugs in the web drivers over the last 4 years. Then take a look at the driver download pages for Sierra and High Sierra: there are no official drivers for Maxwell or Pascal, even though Apple had approved the drivers for those operating systems. Official support is only for the 6/7 series, including the Quadros from that family.

This is not something to debate. We have already debated this like mad on Mac Pro and Hackintosh forums for 4 years. The debate descended into lunatic brand tribalism instead of pressing Nvidia to be straight with the facts: are newer cards officially supported by them, or will they keep it in perpetual buggy beta? Ask Nvidia instead of going loony on Internet forums.

Nvidia options and user upgrades will probably be back in the future but not until a new Mac Pro and 10.15 ships. Until then Nvidia has time to finalize the drivers properly. Apple can then roll them into the next version of macOS.
 
2 years and I am still waiting for a solution that isn't the Blackmagic...
I use my 5K and accept the performance loss from going eGPU > MacBook > display, but your setup sounds very interesting. Any more details or pics/specs?

I have a Gigabyte Z390 Ultra, which comes with a Thunderbolt header on it (a must-have on the motherboard for this to work), and it's hooked up to a Gigabyte Titan Ridge/Alpine Ridge Thunderbolt AIC.
For the GPU's display output, two DisplayPort cables connect to the Thunderbolt AIC on the back. The Gigabyte Thunderbolt AIC then connects to the LG UltraFine 5K. This way, the Thunderbolt AIC binds the dual DP 1.2 signals from the GPU and feeds the monitor.

Mic, camera, daisy chaining, and hubs all work out of the box from there..... :)
 
But, yeah, some might not care about quiet at all. Then, I'd probably agree with the better value of upgradable eGPU boxes. But, even running 100%, this thing just makes almost no discernible sound unless you know what you're listening for and are right on top of it. The mini is dramatically more noisy (though if you turn off TurboBoost, it is generally pretty quiet too).

Reading this and your other comment it is very similar to many observations I’ve had with the Blackmagic.

My office setup is a clamshell 2018 MBA with the 580 BM and some nice Edifier speakers. I def do not want any fan noise and stability / integration into the ecosystem is paramount to me.

The clamshell/dock setup with the BM is insanely good. Over one USB-C cable I get power, HD audio, display, webcam, and an HQ mic. It's amazing.

I don't care about upgradeability because it is a fine performer, and I'll sell it at a discount to go to either the Vega or whatever comes next. That kind of money is not worth the hassle for me.

The BM is an excellent piece of hardware. I think it is truly underrated.

Interesting to see it stacked on a mini!

I am very curious if the 580 BM would run this 6K monitor, though. Seems like it might be pushing it, even for that.
 
My other boxes are rarely turned on so it's just the mini.
I do use h.265, but the software I use, Handbrake, doesn't support hardware encoding. Well, it does with VideoToolbox; however, that doesn't do 10-bit encoding, which I need.
Can't use an AMD card and enclosure because none of those cards support 10-bit. Have to wait for Navi for those.

Hehe, I was just joking about the other boxes. :)

I suppose it is all about usage, though, regarding fans spinning up. If you just do browser, email, or stuff like that, they won't spin up. But almost any heavy stuff I do spins them up, and quickly. You don't notice the same?

For example, I run Folding@home (which, yes, pushes things) in the background most of the time. When I first installed and launched it (when I got the mini), the fans went like crazy. So I started removing cores (i.e., reducing total CPU usage). I think I got down to 1 core before the noise dropped to a reasonably quiet level (still not what I'd call quiet...). From around 20-30% load and up, I may as well just run all-out, as the noise ramps up so much.

I guess I was thinking that if I were running over 50% total CPU utilization, maybe I should expect some fan noise, but not at 20%. And it isn't like I'm in a hot environment running at 20% for hours before they kick in, either. We're talking a minute or less (more like 10 seconds, probably).

Or, launch Minecraft... lots of fan noise. Or, more than a few seconds of anything that pushes the CPUs up (like any video encoding, rendering, etc.).

The good news, I guess, is that a lot of the day-to-day things a typical user does probably no longer push the CPUs hard enough... in other words, they are probably only using a few percent of the available performance most of the time.

BUT (and that's why I mentioned disabling Turbo Boost), I think the problem is that Apple just designed the thermal system WAY too close to the edge. Turbo Boost doesn't impact overall performance THAT much (what is it, like 20% or less?). I still get all my cores, but with TB off, the fans seem capable of running at low enough speeds to not be too noisy while keeping the CPUs at a semi-happy temp (it seems to hold around 90°C, which seems high, but I'm told is the design and within spec). And I can run it like that 24x7 with nearly no noise.

That makes me think if Apple had done just a bit better job with the thermals, the full potential of the CPUs could have been realized. I don't think, even with fans screaming, we're getting the full potential out of these things. I guess they wanted to keep that same case, but I'd happily double the size if they could give me better cooling with that space!

They submitted drivers for Mojave, they claim, which have not been approved. This should tell you something stinks in the drivers because Apple can't lawfully reject drivers unless there is a problem.

I think you lost me at the part about Apple can't lawfully... why not? They can reject anything they want to, afaik.

Interesting to see it stacked on a mini!
I am very curious if the 580 BM would run this 6K monitor, though. Seems like it might be pushing it, even for that.

I hope it is OK (stacked that way), but I don't see why not. The mini seems physically strong enough and it should help the overall heat situation. The BM has tons of excess cooling capability.* The BM is kind of heavy, but not compared to what I've seen some people put on minis, like a monitor. The weight is very evenly distributed and near the edges for the contact points. I mainly did it to save space.

* How do I know? :) When I did my first Boot Camp setup, I couldn't get the GPU drivers working properly, so sometimes the BM would just randomly ramp its fans up to what I assume was full speed (I think about 1600 rpm; it normally runs at ~525-550 rpm even at 100% utilization). It sounded like a small furnace running on my desk! The proper AMD drivers resolved that issue. But when it was running at 100% with the fans spun up to full, I think the GPU temperature wasn't too much over room temp (i.e., in the mid-to-high 20s C), if I remember correctly.

What I'd really like is to put my Mac mini guts in that BM unit. I'd bet it could keep the mini cool, even running full out, w/o much if any noise, too. After a few years (and warranties expired), I'll have to find a used BM and mini and do some experimenting.
 
Hehe, I was just joking about the other boxes. :)

I suppose it is all about usage, though, regarding fans spinning up. If you just do browser, email, or stuff like that, they won't spin up. But almost any heavy stuff I do spins them up, and quickly. You don't notice the same?

For example, I run Folding@home (which, yes, pushes things) in the background most of the time. When I first installed and launched it (when I got the mini), the fans went like crazy. So I started removing cores (i.e., reducing total CPU usage). I think I got down to 1 core before the noise dropped to a reasonably quiet level (still not what I'd call quiet...). From around 20-30% load and up, I may as well just run all-out, as the noise ramps up so much.
Folding@home would push the CPU so I'd expect the fans to spin up for sure.
I run a plex server so serving inside or outside the home doesn't cause the fans to get loud. My wife does some photo work and that doesn't do it either.

There's an app, Macs Fan Control, that shows the temps on all the sensors inside the mini, and even with the CPU under full load and running hot, the other components in the mini stay relatively cool. So the design is very good in terms of isolating and removing CPU heat from the rest of the mini.
 
These eGPU’s are confusing to me. Sure having a laptop that you can game on would be cool, and sure a laptop that you can do more heavy editing on would be cool as well. But at what point should we just get a desktop? eGPU IS $400 (if I understand correctly that is not including the actual GPU...) another $300-400 for a great GPU a decent screen that is another $500. That almost $1,500. Seems crazy to me. Maybe I’m missing something?

The point is that you get to upgrade those components as you want, instead of just replacing an iMac once it has aged. For example, I have a 6-core Mac mini with 32GB of RAM. Once my Radeon 64 is long in the tooth, I can upgrade it to whatever the latest GPU supported by macOS is. And once I want to replace my display, I just replace it.

That is the benefit for me, for using these enclosures. For other people it might be that they can GPU enable their laptop for gaming, and not have to own a desktop at all.
 
Folding@home would push the CPU so I'd expect the fans to spin up for sure.
I run a plex server so serving inside or outside the home doesn't cause the fans to get loud. My wife does some photo work and that doesn't do it either.

There's an app, Macs Fan Control, that shows the temps on all the sensors inside the mini, and even with the CPU under full load and running hot, the other components in the mini stay relatively cool. So the design is very good in terms of isolating and removing CPU heat from the rest of the mini.

Yeah, it wasn't that I didn't expect F@H to push the CPU, but how quickly it started the fans, and how low I had to drop overall CPU usage to quiet them somewhat (until I discovered turning off Turbo Boost). On the previous Mac I ran F@H on seriously (a 2012 quad-core iMac), I could go up to 50-60% before I'd hear the fan.

re: Plex, I suppose you'd be OK until it started transcoding. (BTW, I'm a bit ticked at Plex, as they don't allow you to just transfer videos to devices, but seem to want to re-encode everything. And, it looks like people have been requesting that to change since like 2013. :( )

Good to hear the rest of the components stay cool, though.
 
That's because the drivers are built into macOS. Nvidia submits them to Apple, who signs them and puts them in the OS. You don't download them from Nvidia's site.
Exactly. Apple included native support for all the Nvidia models they used in their Macs.
All the cards or models that Apple didn't include in their Macs are Nvidia's problem,
because Nvidia is the one that has to provide the drivers for those as a separate download.

Is there any Mac with Pascal or Turing?
No, right? So Apple is not obligated to make drivers for those.

Don't think I'm defending Apple; I'm an Nvidia fan, and I'm very upset at Nvidia and equally upset at Apple.
 
Yeah, it wasn't that I didn't expect F@H to push the CPU, but how quickly it started the fans, and how low I had to drop overall CPU usage to quiet them somewhat (until I discovered turning off Turbo Boost). On the previous Mac I ran F@H on seriously (a 2012 quad-core iMac), I could go up to 50-60% before I'd hear the fan.

re: Plex, I suppose you'd be OK until it started transcoding. (BTW, I'm a bit ticked at Plex, as they don't allow you to just transfer videos to devices, but seem to want to re-encode everything. And, it looks like people have been requesting that to change since like 2013. :( )

Good to hear the rest of the components stay cool, though.

Even transcoding, it stays silent, as it's not pushing the CPU all that hard. At least not to 100% load.
 
Exactly. Apple included native support for all the Nvidia models they used in their Macs.
All the cards or models that Apple didn't include in their Macs are Nvidia's problem,
because Nvidia is the one that has to provide the drivers for those as a separate download.

Is there any Mac with Pascal or Turing?
No, right? So Apple is not obligated to make drivers for those.

Don't think I'm defending Apple; I'm an Nvidia fan, and I'm very upset at Nvidia and equally upset at Apple.

I never said Apple had to make the drivers. NVIDIA gave them to Apple long ago.
 
I never said Apple had to make the drivers. NVIDIA gave them to Apple long ago.
Hello there. I made a mistake: instead of replying to both you and soy Capitan,
I only replied to you,
but I wanted to reply to him too.

So one part of my message was to you and the other part was to him.
I'm trying to find the message you replied to so I can fix my comment,
but I guess it's too late to fix it now.

Basically, my first paragraph was for you and the second paragraph was for soy Capitan.
Regards
 