The discussion here is whether Photoshop utilizes the eGPU correctly, so I'm not sure why it wouldn't be relevant to Quadro cards, since some people do use 10-bit displays as opposed to 8-bit. In the PC world it is well understood that consumer cards like the GeForce output 10 bit/channel in DirectX but only 8 bit/channel through OpenGL, so applications like Photoshop on the PC, which use OpenGL, can only output up to 8 bit/channel and no more. Play games through DirectX, however, and you do get 10 bit/channel from a GeForce. In Windows the display itself is completely agnostic: it simply shows whatever the card is able to send it. Workstation GPU cards, on the other hand, do support 10 bit/channel through OpenGL, at least on the PC.
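An application can actually see this limit for itself: it requests a 10/10/10 framebuffer and then asks OpenGL what the driver really granted. Here is a minimal sketch of that probe, assuming freeglut is available; the display string and the legacy GL_*_BITS queries (valid in the compatibility context GLUT creates) are my choices for illustration, not something from this thread.

```c
/* Sketch: request a 10-bit-per-channel OpenGL framebuffer and report
 * what the driver actually granted. On consumer GeForce cards under
 * Windows this has historically come back as 8 bits per channel.
 * Assumes freeglut; build with: gcc probe.c -lglut -lGL */
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* Ask for 10/10/10/2 explicitly; the driver may silently fall back. */
    glutInitDisplayString("red=10 green=10 blue=10 alpha=2 double");
    glutCreateWindow("bit-depth probe");

    GLint r = 0, g = 0, b = 0;
    glGetIntegerv(GL_RED_BITS,   &r);  /* legacy queries; fine in the  */
    glGetIntegerv(GL_GREEN_BITS, &g);  /* compatibility context GLUT   */
    glGetIntegerv(GL_BLUE_BITS,  &b);  /* creates by default           */
    printf("framebuffer depth: R%d G%d B%d\n", r, g, b);
    return 0;
}
```

On a setup that only exposes 8 bit/channel through OpenGL this prints R8 G8 B8 even though 10 bits were requested.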
On Macs, 10-bit display support arrived with El Capitan, after the PC had had it for quite a while. You can see it in Adobe Photoshop CC's Performance pane, where the advanced graphics settings show four checkboxes, including one for 30-bit display. What is interesting is that some sites show a GeForce card being used, and supported through OpenGL, to provide 30-bit display on a Mac under Photoshop. That is where the Nvidia and PC sock puppets come in: it is well known that GeForce cards only deliver 8 bit/channel output in OpenGL on the PC (long assumed to be a hardware limitation), so confusion arises when the same GeForce card suddenly outputs 10 bit/channel on a Mac while being held to 8 bit/channel in OpenGL on the PC. I think some PC users get jealous when a Mac with an ordinary GPU can do 10-bit where, on a PC, you need a workstation GPU to get the same 10-bit through OpenGL.
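On the Mac side you can check what the OS itself claims for the display, independent of Photoshop. A minimal sketch, assuming macOS with Core Graphics; note that CGDisplayModeCopyPixelEncoding is deprecated since 10.11 but still answers the question on the El Capitan era systems discussed here:

```c
/* Sketch: print the pixel encoding macOS reports for the main display.
 * A 30-bit mode shows up as the 10-10-10 encoding string rather than
 * the usual 8-8-8 one. Build with: clang depth.c -framework CoreGraphics */
#include <CoreGraphics/CoreGraphics.h>
#include <stdio.h>

int main(void)
{
    CGDisplayModeRef mode = CGDisplayCopyDisplayMode(CGMainDisplayID());
    CFStringRef enc = CGDisplayModeCopyPixelEncoding(mode); /* deprecated but informative */
    char buf[128] = "unknown";
    if (enc)
        CFStringGetCString(enc, buf, sizeof buf, kCFStringEncodingUTF8);
    /* "--------RRRRRRRRGGGGGGGGBBBBBBBB" means 8 bit/channel,
       "--RRRRRRRRRRGGGGGGGGGGBBBBBBBBBB" means 10 bit/channel */
    printf("main display pixel encoding: %s\n", buf);
    if (enc) CFRelease(enc);
    CGDisplayModeRelease(mode);
    return 0;
}
```

Of course, as argued below, what the OS reports and what the panel truly resolves are two different questions.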
But the question remains: is what you get on the Mac with an ordinary GPU true 10-bit display, or dithered 10-bit? I was told by a prominent community of Mac graphics people that it is not true 10-bit display in Photoshop. I don't mind learning more and updating my understanding of 10-bit/channel display, but a readout in the system profiler is not enough. Are there professional graphic artists who can attest to the true nature of the 10-bit display compared to the PC equivalent, rather than assuming it is so just because the system profiler, or someone else, said so without a blind test?
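One practical way to run that kind of test is a shallow gradient ramp. Below is a small sketch that writes a 16-bit grayscale ramp with exactly 1024 steps: viewed 1:1 in Photoshop, a genuine 10-bit pipeline renders it essentially smooth, an 8-bit pipeline shows about 256 visible bands, and temporal dithering shows faint noise at the band edges. The 1024-column size and the PPM output format are my choices for illustration, not anything prescribed in this thread.

```c
/* Sketch: write a 16-bit binary PPM ramp (ramp.ppm) with one column
 * per 10-bit gray level, for eyeballing true 10-bit vs dithered vs
 * 8-bit output. Build with: gcc ramp.c && ./a.out */
#include <stdio.h>

int main(void)
{
    const int W = 1024, H = 256;            /* one column per 10-bit level */
    FILE *f = fopen("ramp.ppm", "wb");
    if (!f) return 1;
    fprintf(f, "P6\n%d %d\n65535\n", W, H); /* maxval 65535 = 2 bytes/sample */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            /* map column 0..1023 onto the full 16-bit range */
            unsigned v = (unsigned)x * 65535u / (W - 1);
            unsigned char px[6] = { v >> 8, v & 0xff,   /* R, big-endian */
                                    v >> 8, v & 0xff,   /* G */
                                    v >> 8, v & 0xff }; /* B */
            fwrite(px, 1, 6, f);
        }
    fclose(f);
    return 0;
}
```

Comparing the same ramp side by side on a Mac and on a PC with a workstation card would be a far more convincing answer than any profiler screenshot.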