The Vega in the iMac Pro can actually do hardware HEVC decoding (and it is obviously a dGPU).
Also, it's confirmed that the RX 580 (the same GPU as in the Apple eGPU developer kit) is also able to perform HEVC hardware decoding in the current 10.14 beta. I have this GPU in my Mac Pro now and have tested it myself (see the following screenshot). A 4K HDR HEVC BT.2020 video plays back smoothly in QuickTime Player on my Samsung CHG90 with just the RX 580.
View attachment 774878
I tried to show you all the required info in a single unmodified screenshot (I only removed the serial numbers).
The video is Sony_4K_HDR_Camp.mp4, which is widely available on the internet. Clip info is as MediaInfo shows.
CPU workload is very low (even though QuickTime is running). The CPU is a W3690 (as per my signature), which is definitely not capable of decoding 10-bit high-bitrate HEVC either in software or in hardware.
The GPU is the Apple-recommended (for the Mac Pro 5,1) Sapphire PULSE RX 580 8GB, which shows a reasonable workload while decoding the video. And System Information confirms it is the only GPU installed, and that it is displaying 30-bit colour.
iStat showed both the CPU and GPU usage, and the monitor was running at around 60 FPS (because the video was playing at a smooth 60 FPS). If no video is playing, it looks like this (2.2 FPS in this case). Also, QuickTime does not show up among the CPU processes because it places no demand on the CPU.
View attachment 774880
Or, if playback is not smooth, it shows the actual playing frame rate. I used IINA to play the same clip with software decoding: the CPU was stressed to 1000%, yet managed only 10 FPS (the colours also look oversaturated in IINA).
View attachment 774883
And if I moved the mouse pointer around etc. to push the FPS higher, it easily jumped to 144 FPS, because the CHG90 is a 144 Hz monitor.
View attachment 774882
Anyway, I think this is good enough to show that macOS can play 10-bit HDR BT.2020 HEVC smoothly with dGPU hardware decoding, and this is on a 9-year-old Mac Pro without a Skylake (or newer) CPU installed.
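For anyone who wants to check their own machine rather than trust my screenshots, here is a minimal Swift sketch (my own illustration, not part of the test above) that asks VideoToolbox whether hardware HEVC decode is available, using `VTIsHardwareDecodeSupported`, which exists since macOS 10.13:

```swift
import VideoToolbox

// Ask VideoToolbox whether the installed GPU offers hardware HEVC decode.
// Note: this reports HEVC decode support in general; it does not
// distinguish 8-bit Main from 10-bit Main 10 profiles.
if VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC) {
    print("Hardware HEVC decode is available")
} else {
    print("Hardware HEVC decode is NOT available")
}
```

On my Mac Pro 5,1 with the RX 580 under the 10.14 beta this should report support; whether a given clip actually uses the hardware path still depends on the player, as the IINA software-decoding result above shows.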