I'm not sure what the problem is, sorry. You might find more help over at egpu.io; there are real experts there.

No worries.

I reset the SMC, cleared the PRAM, disabled SIP with csrutil again...

The RX 480 works again with the background daemon running...

AND got a slightly higher LuxMark score:

[Attachment: Screen Shot 2017-01-22 at 2.33.33 AM.png (LuxMark result)]
 
Are there any GPUs coming out this year that have Thunderbolt 3/USB-C outputs for video?

I love the size and portability of my 13" MBP, but I want to be able to drive two 5K UltraFines from it, and the only way I can do that is with an eGPU enclosure.
 
I've been running my eGPU setup on a nMP in Yosemite since Dec 2015 with little to no issues. I use a 980 Ti for rendering only; I don't need to play games or hook up another monitor to it. Just last night I updated my machine to El Capitan, and I'm a bit worried about this whole deal with SIP being disabled, etc.

I just discovered that individual parts of SIP can be disabled, so I rebooted into Recovery Mode and used 'csrutil enable --without kext', since I figured that if it's only the modified kexts that need to be loaded, then everything else doesn't need to be disabled. My system sees the 980 Ti after the fact, so that's good. Are there any potential issues with leaving parts of SIP enabled and disabled simultaneously?

Here's Terminal's output after 'csrutil status':

Code:
System Integrity Protection status: enabled (Custom Configuration).

Configuration:
    Apple Internal: disabled
    Kext Signing: disabled
    Filesystem Protections: enabled
    Debugging Restrictions: enabled
    DTrace Restrictions: enabled
    NVRAM Protections: enabled

This is an unsupported configuration, likely to break in the future and leave your machine in an unknown state.

I'd also like to note that I used the automate-eGPU script and discovered (maybe in this thread) that the Thunderbolt cable needs to be plugged into the last two TB ports on the nMP (Bus 0). I was initially having boot issues with the cable plugged into other ports, including the dreaded folder with the question mark, but resetting the PRAM and re-disabling SIP got things back up and running again.
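For anyone who wants to try the same partial configuration, the steps are just these (a sketch; the csrutil commands have to be run from the Recovery Mode Terminal, not a normal session):

Code:
# In Recovery Mode (hold Cmd+R at boot) > Utilities > Terminal:
csrutil enable --without kext
reboot

# Back in macOS, confirm the custom configuration:
csrutil status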
 

From my understanding, you don't need to keep SIP disabled forever. You disable SIP, run automate-eGPU, then re-enable SIP, assuming you are using the Nvidia Web Driver.

Any time you do an OS update or upgrade, you will need to go through the same process again.
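In practice the cycle looks something like this (just a sketch; check the automate-eGPU README for the exact invocation and flags):

Code:
# In Recovery Mode (hold Cmd+R at boot), open Utilities > Terminal:
csrutil disable
reboot

# Back in macOS, run the script:
chmod +x automate-eGPU.sh
sudo ./automate-eGPU.sh

# Once everything works, back in Recovery Mode:
csrutil enable
reboot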
 

Awesome! Thank you, raha613. I just re-enabled SIP and the system sees the card. I don't recall seeing in the thread that this was possible. I was going off the (helpful) information from hkoster1 on page 2 and was under the assumption that it needed to remain disabled for the eGPU to be seen.

Kexts in the /System/Library/Extensions folder can now only be modified and loaded when System Integrity Protection (SIP) is switched off.

I'm putting together a duplicate eGPU rig for my work machine, and I don't think my IT department will be thrilled that I'm hacking my Mac, so I'm testing this at home first so I can understand what exactly I'm getting myself into.
 
I ran into a problem with my card. The problem persists after I changed to an RX 460 (from the 750 Ti), but it seems to occur way less frequently.
The system just crashes out. I thought it happened only in games and other hardware-stressing workloads, but the crash also happens when I'm just watching videos; there's no logic to it.
My Mac mini was practically on fire (OK, it's a joke), reaching 70 degrees Celsius in no time while gaming, so I set Macs Fan Control to a constant RPM at about half of max speed. No crashes have occurred since then, but since I made this change only about 4 days ago, I think it's too early to say I'm rid of it, so I came here to check if anybody has seen something like this...
 
The crashing is unusual, but the temperatures are normal. My MacBook heats up to 80 degrees after booting with the eGPU plugged in, due to the data compression, but it cools down to 50-60 degrees after a minute or two. Then I fire up a game, and it slowly heats up to 70-80 degrees. But yeah, I have Macs Fan Control keep the CPU temperature under control. I'd rather wear out the fan and have to replace it than wear out my MacBook and have to replace it, due to cost and also because I don't like any of the new MacBooks; my next laptop will probably be an Alienware or XPS running Debian or Ubuntu.
 

I'm using a mid-2011 Mac mini (i5-2415M, TB1).
It happened to me too, but now, with the fan held at a constant 3500 rpm, it's not even getting over 60 °C under full load.
More than a 10 °C difference; a really big deal.
By the way, it seems that the Gigabyte RX 460 2GB GDDR5 (which doesn't need an external power cable) is natively compatible with macOS. You just need to edit automate-eGPU to patch the correct kext (the 9510 one; the 9500 one only existed until macOS Sierra 10.12.2) and make it bootable, but it's really just Cmd+F and changing every 9500 to 9510 in the script.
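If you'd rather not do the find-and-replace by hand, it's a one-liner in Terminal (a sketch, assuming the script file is named automate-eGPU.sh):

Code:
# Keep a .bak backup of the original, then swap every 9500 reference for 9510
sed -i.bak 's/9500/9510/g' automate-eGPU.sh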
 
Yeah, OS X has built-in drivers for AMD GPUs because Apple's currently invested there. Apple used to focus more on Nvidia, including drivers in the OS, but now they leave that up to Nvidia. And for a while, there weren't any Nvidia drivers for new GPUs on the Mac, which is a big disadvantage of macOS: inconsistent GPU driver development. Apparently, that hiatus from Mac GPU development was caused by a lack of collaboration on Apple's part. That's one of the reasons why Nvidia is far more up-to-date with their Linux drivers: the operating system is open-source, so they don't need help. Also, Nvidia GPUs are often used in supercomputers, which run Linux. Due to Apple's hardware choices of late, I'm looking into Linux.
 
Linus Torvalds and the Linux community disagree with you. Nvidia is known to have the worst GPU drivers on Linux known to man, and that has been the case forever. AMD and Intel stepped up their game and even open-sourced their drivers; they actually write Linux drivers, unlike Nvidia. Nvidia is a cheapskate because their driver consists of two parts: the exact same closed-source proprietary blob used in Windows, and a kernel module that is no more than a messenger between that blob and the kernel. If it worked properly, that would be alright with quite a lot of people, but it doesn't. Out of all the graphics drivers for UNIX/Linux platforms, the one from Nvidia sucks the most, and thus everyone who uses UNIX/Linux avoids Nvidia hardware.

That said, Nvidia is the king when it comes to GPGPU; their CUDA has become the de facto standard there, and it is the only reason why you'd still see them in Linux servers. OpenCL is the more open alternative, supported by far more manufacturers, but it isn't used that often. Even Nvidia puts OpenCL on their cards, although they are not very dedicated to it; their focus is entirely on CUDA. This has led to AMD having the better OpenCL implementation.

The two reasons above are probably why Apple is using AMD cards instead of Nvidia.

Btw, GPU driver development on every platform has been and still is a PITA. Graphics drivers are some of the crappiest software on the planet. The makers of the drivers are the only ones to blame for the inconsistent driver development you speak of.
 

The Nvidia Linux drivers might not be great, but at least they're there. It took Nvidia an extra year to make Pascal drivers for macOS because Apple didn't want to help. How do the Nvidia drivers compare to the Nouveau drivers?
 
Linus Torvalds and the Linux community disagree with you. Nvidia is known to have the worst GPU drivers on Linux known to man, and that has been the case forever. ...

NVIDIA proprietary drivers were far better than anything AMD put out on Linux until 2015-2016, when AMD started their AMDGPU open-source driver project and began to care about supporting Linux. Go look at any Phoronix benchmark article from the past seven years if you need further proof. I believe you are correct in that RadeonSI has had far better support from AMD than nouveau has from NVIDIA.

Apple uses AMD cards because they went all-in on OpenGL/OpenCL when they launched the redesigned Mac Pro in 2013. With the redesigned Mac Pro, I don't think they will mind if NVIDIA comes along, as OpenCL seems to have stagnated.
 
Are there any GPUs coming out this year that have Thunderbolt 3/USB-C outputs for video?

I love the size and portability of my 13" MBP, but I want to be able to drive two 5K UltraFines from it, and the only way I can do that is with an eGPU enclosure.

What you want isn't possible. A single 5K monitor uses ~32 Gbps of bandwidth. TB3 has a limit of 40 Gbps, so there's no possible way for you to run a high-end GPU plus two 5K monitors off a single TB3 cable.

Basically there aren't enough TB controllers in the 13" MBP to do what you want.
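As a rough sanity check on those numbers (raw pixel rate only, ignoring blanking and link-encoding overhead):

Code:
5120 x 2880 px x 60 Hz x 24 bit/px ≈ 21.2 Gbps   (8 bits per channel)
5120 x 2880 px x 60 Hz x 30 bit/px ≈ 26.5 Gbps   (10 bits per channel)

Add blanking and encoding overhead and you end up in the ~25-32 Gbps range per 5K stream, so two streams alone would saturate a 40 Gbps TB3 link before counting any PCIe traffic to the GPU.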
 

Hmm. I know that eGPUs actually don't use the full bandwidth of the Thunderbolt link most of the time. However, I don't know that a single high-end GPU exists yet that could drive two 5K monitors simultaneously.
 

Let's say there's a GPU that can drive two 5Ks. How do you get that video signal to a 13" MBP?
 
You should be able to run two 5K monitors connected directly to the eGPU via DisplayPort 1.3 or 1.4. However, I wouldn't expect an eGPU to drive an UltraFine connected directly to the MacBook Pro via Thunderbolt 3.
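For reference, the per-connector DisplayPort numbers (payload figures are after 8b/10b link encoding, if I have them right):

Code:
DP 1.2 (HBR2):     21.6 Gbps raw, ~17.3 Gbps payload -> 5K60 needs two cables (MST)
DP 1.3/1.4 (HBR3): 32.4 Gbps raw, ~25.9 Gbps payload -> 5K60 at 8 bpc fits on one cable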
 
NVIDIA proprietary drivers were far better than anything AMD put out on Linux until 2015-2016, when AMD started their AMDGPU open-source driver project and began to care about supporting Linux. Go look at any Phoronix benchmark article from the past seven years if you need further proof. I believe you are correct in that RadeonSI has had far better support from AMD than nouveau has from NVIDIA.

Apple uses AMD cards because they went all-in on OpenGL/OpenCL when they launched the redesigned Mac Pro in 2013. With the redesigned Mac Pro, I don't think they will mind if NVIDIA comes along, as OpenCL seems to have stagnated.

They went all-in on OpenCL, not OpenGL. AMD specializes in OpenCL, while Nvidia specializes in OpenGL.
 
Hmm. I know that eGPUs actually don't use the full bandwidth of the Thunderbolt link most of the time. However, I don't know that a single high-end GPU exists yet that could drive two 5K monitors simultaneously.

You could just plug the monitors into the eGPU like everybody else.
 
You should be able to run two 5K monitors connected directly to the eGPU via DisplayPort 1.3 or 1.4. However, I wouldn't expect an eGPU to drive an UltraFine connected directly to the MacBook Pro via Thunderbolt 3.

If I remember correctly, someone has gotten it to work with the 5Ks connected directly to the MBP and the eGPU driving them. In fact, I believe this is the only way to do it.

You could just plug the monitors into the eGPU like everybody else.

TB3 has a maximum bandwidth of 40 Gbps. A single 5K is ~32 Gbps, so you can't run two of them plus a GPU through a single cable.
 
They went all-in on OpenCL, not OpenGL. AMD specializes in OpenCL, while Nvidia specializes in OpenGL.

Okay. I agree that NVIDIA does a lot more with OpenGL than AMD. I don't see how that changes the picture for Apple in 2013. Care to clarify?

Apple went all-in on OpenCL and OpenGL in 2013 when they decided to use dual AMD FirePro-class cards. Now that Vulkan and Metal are out of the bag, NVIDIA is committing to writing drivers for macOS again, and OpenCL is kind of "meh" in terms of adoption and support from Apple, that situation has changed.
 
If I remember correctly, someone has gotten it to work with the 5Ks connected directly to the MBP and the eGPU driving them. In fact, I believe this is the only way to do it.



TB3 has a maximum bandwidth of 40 Gbps. A single 5K is ~32 Gbps, so you can't run two of them plus a GPU through a single cable.
That's my point. The only way to avoid the bandwidth issue is to have the eGPU drive the monitors by itself. An eGPU has two parts: a GPU and an adapter. The GPU is a regular PCIe GPU, which has ports on it. The adapter takes the PCIe connection from the GPU and bridges it over to Thunderbolt for use with computers that don't have PCIe slots. GPUs generally have three kinds of video ports: DisplayPort, HDMI, and DVI.

Okay. I agree that NVIDIA does a lot more with OpenGL than AMD. I don't see how that changes the picture for Apple in 2013. Care to clarify?

Apple went all-in on OpenCL and OpenGL in 2013 when they decided to use dual AMD FirePro-class cards. Now that Vulkan and Metal are out of the bag, NVIDIA is committing to writing drivers for macOS again, and OpenCL is kind of "meh" in terms of adoption and support from Apple, that situation has changed.
I'm not sure if it was that they were already heavily invested in OpenCL and just wanted more performance for their money, or if they just wanted to switch to AMD because it was cheaper. Another possibility is that it was just another way of keeping Macs from looking like gaming machines, which could do two things: 1) get gamers to leave Apple, or keep them from asking too much of Apple, and 2) keep Apple from looking like a gaming company so businesses will be more interested in buying Macs.
 
That's my point. The only way to avoid the bandwidth issue is to have the eGPU drive the monitors by itself. An eGPU has two parts: a GPU and an adapter. The GPU is a regular PCIe GPU, which has ports on it. The adapter takes the PCIe connection from the GPU and bridges it over to Thunderbolt for use with computers that don't have PCIe slots. GPUs generally have three kinds of video ports: DisplayPort, HDMI, and DVI.

If I'm understanding you correctly: two 5K monitors connected directly to the eGPU, and then the eGPU directly to the MBP? If that's the setup, how can all that bandwidth get through a single cable to the MBP? If it works, I'd like to see it working.
 
Most of the data for the monitors is actually coming from the GPU, not the Mac. That’s kind of the whole point of eGPUs - to get better graphics performance while lessening demand on the CPU.
 