Not sure what you mean. 4K HEVC video is about to become mainstream, or at least it will be in the Apple world as of this fall. Once iOS 11 ships, even the iPhone 7 Plus will record 4K HEVC video, and of course it will play it back in real time, since it has a hardware 4K HEVC decoder.
These same videos may bring 2015 MacBooks and MacBook Pros to their knees. And even if they can play them, it will be at the cost of terrible battery life.
This is the same scenario as the H.264 transition. I told people back then that they needed hardware H.264 decoding, but they just said that if software decoding could be done on the computer, it would be fine. Well, not quite. The 2.4 GHz 2008 white MacBook can play 1080p H.264, but it struggles, can't multitask, the fan becomes annoyingly loud, and the battery gives up in no time. In contrast, the 2.26 GHz 2009 MacBook Pro has hardware decoding and doesn't break a sweat: very low CPU usage.
Some initial reports suggest QuickTime's software decoding of HEVC in High Sierra will be more efficient than the current free decoder implementations, but the thing is, it's still software decoding. Even if playing a video needs, say, only 40% CPU across all cores, that's still way, way more than the roughly 5% you would get with hardware decode (meaning a 2017 iMac or 2017 MacBook Pro, or a 2017 MacBook for 10-bit).
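If you want to know which side of that 40%-vs-5% divide a given Mac falls on, a minimal sketch using Apple's VideoToolbox framework (available from macOS 10.13 High Sierra) can ask whether a hardware HEVC decoder is present. Note this only reports whether any hardware HEVC decode path exists; it does not distinguish 8-bit from 10-bit profile support, and the printed CPU figures are just the ballpark estimates from the discussion above, not measurements:

```swift
import VideoToolbox

// VTIsHardwareDecodeSupported reports whether this machine has a
// hardware decoder for the given codec (macOS 10.13+ API).
if VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC) {
    print("Hardware HEVC decode available: expect playback in the ~5% CPU range")
} else {
    print("Software HEVC decode only: expect heavy CPU load, fan noise, and battery drain")
}
```

On a 2017 machine this should take the first branch; on Skylake-and-earlier Macs without an HEVC decode block, the second.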
1. My point with "let software-optimised decoding" was this: let them optimise the software for HEVC decoding where there is no hardware support, and that will bring the 650 percent down to within spec. It will take a toll on battery, but please, let's not go there. You want to do everything, everywhere? ONE device for everything? How high will your expectations of a portable computer go? Should your MBP double as your shower gel too? I know, Apple sucks. Still, if you really want to watch 10-bit 4K on an 8-bit display, you could just keep the MBP plugged in and not worry about battery. A portable is always a compromise. That is not going to change. My car cannot fly, and an aircraft would have difficulty finding a parking space on most streets.
2. You are right, but the real question is one of need. Do we need 10-bit decoding capability when the MBP can neither display it on its own screen nor output it via USB-C? 10-bit is a special requirement right now, and it might take about two years to become mainstream enough for people to "need" hardware decode.
For now, if the 2016 can handle 10-bit decode in software and 8-bit decode in hardware, I think that is sufficient for at least two years. No?
10-bit decoding capability is a special requirement for those who want top quality and have the means to display that top quality right now. If I were to buy a 10-bit-capable TV today, I would love to have a computer that can decode 10-bit in hardware and output it over its ports. The real need for 10-bit is for those who have displays that can actually show those 10 bits, and, importantly, we need content that is 10-bit. It is trickling in now; give it a year or two, right? By then, even Apple will have caught up with 10-bit displays and output capabilities. What's the fuss and rush? Why make people who are perfectly happy with their 2016 machines second-guess their purchase and themselves?
If 4K is here and mainstream enough to necessitate Kaby Lake today, should we start wondering about Kaby Lake's ability or inability to play 8K, since that is just over the horizon? When will we just sit and enjoy content, rather than forever desiring better and better quality? There was news I heard that somewhere on the planet a photograph shot with an iPhone won an award over those shot with superior full-frame and crop-frame DSLRs and lenses worth gazillions of dollars. Imagine the horror. The judges had to explain that the composition and content of a photograph matter more than its technical aspects, something today's generation of photographers seems out of touch with, lost in a sea of technical specifications.
Even Apple built its business on not leading with technical specs, right? They were about content creation and the enjoyment of that content, never bothering with how a certain 200 MHz chip would compete with the 250 MHz of another company.
Of course, there will be things that 2015 machines won't be able to do. But here is the thing: even though 2017 machines are capable of hardware decoding of 10-bit, the display of the very machine that proudly decodes 10-bit is incapable of showing those 10 bits. To add insult to injury, it can't even output that 10-bit. How different is that from 8-bit Skylake in practice?
It is like giving a $100 bill to a Martian: though the bill has the value of a hundred dollars in itself, it is useless because the Martian cannot spend that value anywhere. What will I do with a Kaby Lake processor thrown into a 2017 MBP if the MBP's display is not 10-bit, and the MBP cannot output 10-bit to a 10-bit-capable display for me to enjoy? What will I do with those 2 extra bits? Make a fuss on MacRumors?