
RonaldSwanson

macrumors newbie
Original poster
May 14, 2019
I've gotten conflicting reports. I'm a heavy Adobe Camera Raw/Lightroom user and need the GPU acceleration. I've read that as long as the eGPU is driving a single monitor, it works as it should.

Just wanted to confirm. Thank you for the help!
 

DYER

macrumors 6502
Oct 4, 2008
London, UK
So, from my experience it's not going well.

When editing RAW files - from a Canon EOS RP and Nikon D600 (don't ask) - there's clearly something not working correctly as I can't even rotate the images in crop mode without half the image greying out for fractions of a second. If anything, performance is worse...

JPEGs are fine, but sadly that doesn't help me when editing RAW.

It's bitterly disappointing as I did expect more. I also found that Lightroom sees no performance improvement either - this is despite running a Vega 64 connected to my i7 mini, with a 4K Samsung display connected to the Razer Core - so the setup is correct.

I've fiddled with the eGPU settings but to no real benefit as of yet...
 

iluvmacs99

macrumors 6502a
Apr 9, 2019
DYER said:
So, from my experience it's not going well. [...] If anything, performance is worse...

Have you tried this yet?

https://egpu.io/forums/mac-setup/potentially-accelerate-all-applications-on-egpu-macos-10-13-4/
RonaldSwanson said:
I've read that as long as the eGPU is driving a single monitor, it works as it should. Just wanted to confirm. [...]

Yes and maybe. The problem with macOS is that, unlike Windows, it does not have a system-wide eGPU management system, so you have to coerce macOS into using the eGPU as the de facto GPU for acceleration. The script above forces macOS to use the eGPU in Mac applications that support GPU acceleration. The sticking point is Adobe's GPU preference: in the past, Adobe preferred CUDA cores from Nvidia, so if you have an Nvidia card you will get better acceleration than with an AMD card.
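
For what it's worth, my understanding is that the linked script boils down to setting macOS's per-app GPU preference (the same thing the "Prefer External GPU" checkbox in Finder's Get Info does on 10.14+). A rough sketch of doing it by hand for a single app - the Lightroom Classic bundle ID below is just an example, so look up the real one for your app first:

  # Look up the app's bundle identifier (app name here is just an example)
  osascript -e 'id of app "Adobe Lightroom Classic"'

  # Ask macOS to prefer the removable (external) GPU for that app
  # (bundle ID is an assumption - use the one printed above)
  defaults write com.adobe.LightroomClassicCC7 GPUSelectionPolicy -string preferRemovable

  # Remove the override later if it causes trouble
  defaults delete com.adobe.LightroomClassicCC7 GPUSelectionPolicy

Relaunch the app after changing the preference.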
 
Last edited:

CreeptoLoser

Suspended
Jul 28, 2018
Birmingham, Alabama
The Adobe apps are using it. They have to if the GPU is powering the display. The problem has always been that Adobe apps use OpenGL-based acceleration in their Mercury engine, which is not a low-level API. They need to move to Metal and stop holding back. We need a low-level API that uses the GPU properly.
 

iluvmacs99

macrumors 6502a
Apr 9, 2019
The issue with Photoshop is that it is photo editing software, so it focuses its development on cards that support 10bit-per-channel output (30bit = 10bit x 3 color channels), which makes the roughly 1.07-billion-color space of a 10bit display possible and allows better AdobeRGB or ProPhoto coverage. Consumer GPUs don't necessarily support 10bit/channel output. Its Mercury rendering engine is no longer efficient at speeding up rendering output on all but a few GPU functions, like Smart Sharpen and the render tree, with the latest CPUs. The Mercury engine is really dated. When I chose the graphics card for my Mac Pro, I settled on the RX-580, knowing that I would get better acceleration than the stock iGPU in the Mini 2018 (UHD 630) but not the 10bit support that the pro range offers, like the Radeon Pro series or the Nvidia Quadro series. An eGPU is necessary if you need to run a graphics card that supports 10bit/channel output, and the only card I know of that supports this in Metal is the Nvidia Quadro K5000, which has the Mac EFI boot screen and works in Mojave. I'm actually planning to get an Nvidia Quadro 4000 Mac Edition as a second, Photoshop-specific GPU in my Mac Pro, since it supports 10bit and CUDA, while I use the RX580 specifically for DaVinci Resolve and the Topaz AI suite of apps.
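
To put actual numbers on the 1.07 billion figure, here is the arithmetic as a quick Terminal check (plain bash, nothing assumed beyond that):

  echo $(( (1 << 10) ** 3 ))   # 10 bits/channel: 1024^3 = 1073741824 (~1.07 billion colors)
  echo $(( (1 << 8) ** 3 ))    # 8 bits/channel:  256^3 = 16777216 (~16.7 million colors)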

That does not mean the Vega 64 is wasted. Adobe has been working on Adobe Sensei, its own machine learning platform, and if it is applied to the Creative Cloud suite in 2020 and beyond, it could mean that cards like the Vega 64 will benefit more from AI than my RX-580. Right now I really enjoy using Topaz Sharpen AI, Adjust AI and Gigapixel AI, and they have mostly replaced Smart Sharpen, size output and layer masking in Photoshop at much faster speeds. I'm sure Adobe will counter with their own AI versions of Smart Sharpen, size output and layer masking. In the meantime, you need to choose a specific card, and sometimes an older, more compatible GPU is more beneficial to Photoshop than a more powerful, modern one. And since I don't plan to upgrade to the latest Adobe CC anytime soon, I'll probably buy an Nvidia Quadro, stay on High Sierra, and have 10bit output support when I need it.
 
Last edited:

CreeptoLoser

Suspended
Jul 28, 2018
Birmingham, Alabama
iluvmacs99 said:
The issue with Photoshop is that it focuses its development on cards that support 10bit-per-channel output... [...] I'll probably buy an Nvidia Quadro, stay on High Sierra, and have 10bit output support when I need it.

Yes, Mercury is long in the tooth now and no longer efficient, but they could build it on top of Metal; it is still built on top of OpenGL and OpenCL. I think they have progressed slowly here so that they can still support older Macs that lack Metal support.

Photoshop emulates 30 bit display via dithering - the setting is there in Preferences > Performance. It's done that for years, because macOS didn't support 10 bit output until El Capitan. Photographers managed for years. Now we have native 10 bit support.

Your Radeon 580 or Vega 64 supports 10 bit color - see System Information for the details. All Radeons since 2013/14 support 10 bit in macOS as long as you pair them with a decent monitor. So you don't need the Quadro, which is not an eGPU-supported card, so it's off topic anyway. I don't know why you think you need CUDA for Photoshop. There are only two or three filters that use CUDA, and they are accelerated with OpenCL too.
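
If you'd rather not click around, the same information is visible from Terminal; the grep is just a convenience, and the exact label varies a bit between macOS versions:

  # List the graphics/display info and pick out the bit-depth lines
  system_profiler SPDisplaysDataType | grep -i depth
  # A 10-bit pipeline typically reports something like:
  #   Framebuffer Depth: 30-Bit Color (ARGB2101010)
  # (older releases label it "Pixel Depth" instead)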
 
Last edited:

iluvmacs99

macrumors 6502a
Apr 9, 2019
CreeptoLoser said:
Your Radeon 580 or Vega 64 supports 10 bit color. [...] So you don't need the Quadro, which is not an eGPU-supported card, so it's off topic anyway. I don't know why you think you need CUDA for Photoshop. [...]

In the past we were told that Photoshop is more efficient with CUDA (Nvidia cards rather than AMD), and even when Puget Systems tested the latest Nvidia RTX cards against their AMD equivalents, most of the time the Nvidia cards came out slightly ahead. The reason I am thinking CUDA is that, unlike on PCs, there is a limited selection of GPU cards available natively for Macs - that is, GPU cards with a Mac EFI boot screen - so application developers build their apps around the cards that are officially supported by Apple. And if you read the disclaimer from Adobe about their latest CC applications, they do NOT guarantee full acceleration with the latest GPU cards, even if those cards meet or exceed the minimum specifications. If I understand it correctly, consumer gaming cards offer 10bit output in gaming applications. They do not, however, offer 10bit display output in professional applications like Photoshop. So while it's true that Macs offer 10bit output support in El Capitan and beyond, you still need a display card that can drive a 10bit (or 8bit+FRC) display, and as far as I was told, only workstation-class graphics cards can do this. In fact, this is the main distinction of a workstation card like the Radeon Pro or Nvidia Quadro: it allows display of 10bit/channel color in professional applications, not only in games.
 
Last edited:

CreeptoLoser

Suspended
Jul 28, 2018
Birmingham, Alabama
iluvmacs99 said:
In the past we were told that Photoshop is more efficient with CUDA (Nvidia cards rather than AMD)... [...] This is true with DXO Labs as well as Topaz, as they both told me that they don't offer better performance with GPU acceleration like you find in the equivalent Windows applications, because Apple doesn't offer more PC GPU options. So even though I have a Mac Pro 5,1 with the same RX580 card, I get "BETTER" performance under Windows Boot Camp running the same apps than I do under macOS. How can that be?!? That is the inefficiency of macOS. Now I don't know about Mojave and how different it is. 10bit support in macOS is only available with consumer versions of Radeon GPUs in "games", I think, not in professional applications. It would show up under the Advanced Graphics Processing tab in Photoshop (30bit should show up). It doesn't on mine, and I have a 10bit display. :)

You can only get access to a 10bit display for professional apps on macOS if you have the FirePro cards. The only monitors that can display this, and I have access to them, are monitors that support 8bit+FRC (fake 10bit) and true 10bit, which show the full 1.07-billion-color space.

Sorry bro, but you're being very off topic here. Ronald, the OP, is a Mac user inquiring about an eGPU with his Mac mini.

Puget are a PC builder testing on Windows. PCs, Quadros and GeForces aren't relevant to this discussion. Neither are FirePro cards. None of these things are supported for eGPU use with Macs.

Dozens of forum members over the years have spoken about Radeons supporting 10 bit color on Macs since El Capitan. You can open System Information right now and check in 30 seconds. Anyone can google 'macos radeon system profiler' and see the screenshots users have posted for years.

Please don't post misinformation. We have seen members talking about Nvidia and PC sock puppets invading these discussions for a couple of years. You don't want to be mixed up with those sock puppets if you aren't one. Please stay on topic for the sake of the OP. It's not fair to confuse people.

BTW, I'm an ex-Adobe engineer and still take part in the insider program.
 
Last edited:

iluvmacs99

macrumors 6502a
Apr 9, 2019
920
673
CreeptoLoser said:
Sorry bro, but you're being very off topic here. [...] Puget are a PC builder testing on Windows. PCs, Quadros and GeForces aren't relevant to this discussion. [...] Please don't post misinformation. [...]

The discussion here is: does Photoshop utilize the eGPU correctly? So I'm not sure why Quadro cards are not relevant, as some people do use 10bit displays as opposed to 8bit. In the PC world, it is well understood that consumer cards like the GeForce output 10bit/channel in DirectX but only 8bit/channel through OpenGL, so applications like Photoshop on the PC, which use OpenGL, can only output up to 8bit/channel and no more. If you play games through DirectX, however, you do get 10bit/channel with a GeForce. In Windows, displays are completely agnostic, which means the display can only show whatever the card can output. Workstation GPU cards support 10bit/channel through OpenGL, at least on the PC.

On Macs, starting with El Capitan, you do get 10bit display support, which had been lacking while the PC had it for quite a while. You can see that in the Adobe Photoshop CC Performance pane, where it shows four dialog boxes. What is interesting is that some sites show a GeForce card being used through OpenGL and providing 30bit display support on a Mac under Photoshop. The reason you get those Nvidia and PC sock puppets is that it is well known that GeForce cards only support 8bit/channel output in OpenGL on the PC (that's a hardware thing), so confusion arises when all of a sudden the same GeForce card can output 10bit/channel on a Mac when the hardware can only do 8bit/channel in OpenGL on the PC. I think some PC users get jealous when a Mac with a normal GPU can do 10bit where you'd need a workstation GPU to do the same 10bit with OpenGL on a PC.

But the fact remains: is it true 10bit display or dithered 10bit that you get on the Mac with a normal GPU? I was told, by a prominent community of Mac graphics people, that it is not true 10bit display with Photoshop. I don't mind learning more and updating my understanding of 10bit/channel display, but just seeing it in the System Profiler is not enough. Are there professional graphic artists who can attest to the true nature of 10bit display compared to the PC equivalent, rather than assumptions based on the System Profiler saying so, or someone saying so without a blind test?
 
Last edited:

CreeptoLoser

Suspended
Jul 28, 2018
Birmingham, Alabama
iluvmacs99 said:
The discussion here is: does Photoshop utilize the eGPU correctly? [...] But the fact remains: is it true 10bit display or dithered 10bit that you get on the Mac with a normal GPU? [...]

Alright, but you're not telling me anything new here. I was at Adobe when Photoshop was ported from OS 9 to OS X. It took a long time because OS X was undergoing so many changes that we had to race to keep up. The OP just wanted to know if performance can be better, and the only answer is yes - it can be better once Photoshop implements a low-level API like Metal. That's the simple answer the OP needs.

Photoshop for iPadOS uses Metal... so watch this space.
 