In the past we were told that Photoshop is more efficient with CUDA (Nvidia cards rather than AMD), and even when Puget Systems tested the latest Nvidia RTX cards against their AMD counterparts, the Nvidia cards came out slightly ahead most of the time. The reason I suspect CUDA matters is that, unlike on PCs, there is only a limited selection of GPU cards available natively for Macs (that is, cards with a boot EFI screen), so application developers build their apps around the cards officially supported by Apple. And if you read the disclaimer from Adobe about their latest CC applications, they "DO NOT" guarantee full acceleration with the latest GPU cards even if those cards meet or exceed the minimum specifications. The same is true of DxO Labs and Topaz: both told me they can't offer the same gains from GPU acceleration that you find in the equivalent Windows applications, because Apple doesn't support a wider selection of PC GPUs.

So even on my Mac Pro 5,1 with the same RX 580 card, I get "BETTER" performance running the same apps under Windows Boot Camp than I do under macOS. How can that be?!? That is the inefficiency of macOS. (I don't know about Mojave and how different it is.)

As for 10-bit support in macOS: with consumer Radeon GPUs I believe it is only available in games, not in professional applications. Photoshop will show it under the Advanced Graphics Processing settings (a 30-bit display option should appear). It doesn't on mine, and I have a 10-bit display.
Under macOS you can only get 10-bit output in professional apps with the FirePro cards. The monitors that can display this, and that I have access to, are either 8-bit+FRC panels ("fake" 10-bit) or true 10-bit panels, which display the full 1.07 billion color space.
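For what it's worth, the "1.07 billion" figure falls straight out of the per-channel bit depth; here's the quick arithmetic (plain math, nothing vendor-specific):

```python
def total_colors(bits_per_channel: int) -> int:
    """Total displayable colors for an RGB panel at a given bit depth."""
    levels = 2 ** bits_per_channel  # shades per channel (R, G, or B)
    return levels ** 3              # all R x G x B combinations

print(total_colors(8))   # 16,777,216 -> the usual "16.7 million" for 8-bit
print(total_colors(10))  # 1,073,741,824 -> the "1.07 billion" for true 10-bit
```

An 8-bit+FRC panel advertises the same 1.07 billion number, but it reaches the extra shades by rapidly dithering between adjacent 8-bit levels rather than displaying them natively, which is why it gets called "fake" 10-bit.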