That’s a stretch as far as performance arguments go. With 75 Mbps 10-bit 4K HEVC files, which is 3X the bitrate of the highest-bitrate Apple TV+ 4K HDR streams and 5X the bitrate of Netflix 4K streams, my 3-year-old quad-core i5-7600 iMac uses less than 10% CPU.

Just because it’s hardware accelerated doesn’t mean you’re always getting the maximum energy efficiency.
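To make the comparison above concrete, here is a quick back-of-the-envelope sketch of the per-service bitrates those multipliers imply. The 75 Mbps figure comes from the post; the Apple TV+ and Netflix numbers below are simply the implied values (75/3 and 75/5), not published specs.

```python
# Implied stream bitrates from the "3X" and "5X" claims above.
# Only the 75 Mbps local-file figure is from the post; the rest
# is derived arithmetic, not official service specifications.
local_file_mbps = 75

apple_tv_plus_mbps = local_file_mbps / 3  # "3X" claim -> 25 Mbps
netflix_4k_mbps = local_file_mbps / 5     # "5X" claim -> 15 Mbps

print(f"Implied Apple TV+ 4K HDR peak: {apple_tv_plus_mbps:.0f} Mbps")
print(f"Implied Netflix 4K:            {netflix_4k_mbps:.0f} Mbps")
```

Those implied numbers (roughly 25 Mbps and 15 Mbps) are in the ballpark of what those services are commonly reported to stream at, which is why the 75 Mbps files make a reasonable stress test.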
The T2 uses far less power than the i9 and Vega 48. The T2 also has an AES crypto engine built right in that works really well with FairPlay (which I’m sure Intel chips can handle fine, but not as well as the T2).
I get that Apple wanted its own solution here, but IMO decode efficiency isn’t the major concern. I suspect the issues are:
1) The DRM solution used on Windows is an MS/Intel technology; Apple would have to pay royalties to use it and would have no control over its design or robustness.
and/or:
2) Apple wanted something that would be easily implemented across iOS, iPadOS, and macOS, the latter on both Intel Macs and Arm Macs.
With those same uber-high-bitrate files mentioned above, CPU usage is roughly 25% on my lowly fanless Core m3-7Y32 MacBook, and multitasking isn’t a problem at all.

Long battery life.
And with iMacs, performance is still critical if you have a video playing while doing other things.