IINA and MPV decode videos through the GPU when supported.
With MPV, a Sony nature demo video at 4K 60 fps is entirely software decoded on my 4.2 GHz Kaby Lake i7-7700K with Radeon Pro 580 running 10.12 Sierra. And even on this high-powered machine the video stutters, with 650% CPU usage. This is the fastest consumer Mac in existence, folks.

Hardware decode is mandatory for 4K HEVC playback to be viable on any Mac.

I have not tried IINA yet.

EDIT:

IINA doesn't work either. 650% CPU usage.
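For anyone who wants to confirm this directly instead of inferring it from CPU graphs: the High Sierra SDK adds a one-line VideoToolbox query for hardware decode support. A minimal Swift sketch, assuming you're building against the 10.13 SDK (the function doesn't exist on Sierra):

```swift
import CoreMedia
import VideoToolbox

// VTIsHardwareDecodeSupported() is new in the macOS 10.13 SDK.
// It reports whether this machine has a hardware decoder for a given codec.
let hevc = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
let h264 = VTIsHardwareDecodeSupported(kCMVideoCodecType_H264)

print("Hardware HEVC decode: \(hevc)") // likely false on pre-Kaby Lake Macs
print("Hardware H.264 decode: \(h264)") // true on anything remotely recent
```

On a Kaby Lake 2017 Mac this should print true for HEVC; on the older machines discussed here it should print false, which is exactly why the players fall back to 650% software decoding.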
 

Let software-optimised decoding become available, then we will see how much of a difference there is for normal users, enthusiast users, and truly demanding professionals.
 
Not sure what you mean. 4K HEVC video has already become mainstream, or at least will be mainstream in the Apple world as of this fall. Even the iPhone 7 Plus will be recording 4K HEVC video this fall after the iOS 11 update, and will of course be able to play it back in real time, since it has a hardware 4K HEVC decoder.

These same videos may bring 2015 MacBooks and MacBook Pros to their knees. And even if they can play them, it will be with terrible battery life.

This is the same scenario as the h.264 transition. I told people back then that they needed hardware h.264 decoding, but people just said that if software decoding could be done on the computer, it would be fine. Well, not quite. The 2.4 GHz 2008 white MacBook can play 1080p h.264, but it struggles, can't multitask, the fan becomes annoyingly loud, and the battery gives up in no time. In contrast, the 2.26 GHz 2009 MacBook Pro has hardware decoding and doesn't break a sweat. Very low CPU usage.

Some initial reports suggest QuickTime's software decoding of HEVC in High Sierra will be more efficient than current free decoder implementations, but the thing is, it's still software decoding. Even if playback of a video needs, say, only 40% CPU across all cores, that's still way, way more than the 5% you would get with hardware decode (meaning a 2017 iMac, 2017 MacBook Pro, or, for 10-bit, a 2017 MacBook).
 

1. My point with "let software-optimised decoding become available" was: let them optimise the software for HEVC decoding where there is no hardware support, and that will bring the 650 percent down to within spec. It will take a toll on battery, but please let's not go there. You want to do everything everywhere, is it? ONE thing for everything? How high will your expectations of a portable computer go! Can your MBP become shower gel for you to use? I know, Apple sucks. Still, if you really want to watch 10-bit 4K on an 8-bit display, you could just keep the MBP plugged in and not bother about battery. A portable is a compromise, always. This is not going to change. My car cannot fly, and an aircraft will have difficulty finding parking space on most streets.

2. You are right, but the question is one of need. Do we need 10-bit decoding capability when the MBP can neither display it on its own screen nor output it via USB-C? 10-bit is a special requirement right now, it seems, and might take about two years to become mainstream enough for people to "need" hardware decode.

For now, if the 2016 can handle 10-bit decode via software and 8-bit decode via hardware, I think it is sufficient for at least two years. No?

10-bit decoding capability is a special requirement for those who want top quality and have the means to display that top quality right now. If I were to buy a 10-bit-capable TV today, I would love to have a 10-bit-capable computer that can decode via hardware and output via its ports. The real need for 10-bit is for those who have displays that can show 10 bits, and importantly, we need content that is 10-bit. It is trickling in now; give it a year or two, right? By that time even Apple will have caught up with 10-bit displays and output abilities. What's the fuss and rush? Why make people perfectly happy with their 2016 machines second-guess their purchase and themselves!

If 4K is here and mainstream enough to necessitate Kaby Lake today, we should start wondering about Kaby Lake's ability or inability to play 8K, since that is just over the horizon, yes? When will we just sit and enjoy content, rather than forever desire better and better quality? I heard news that somewhere on the planet a photograph shot with an iPhone won an award, beating photos shot with superior full-frame and crop-frame DSLRs and lenses worth gazillions of dollars. Imagine the horror. The judges had to explain that the composition and content of a photograph matter more than its technical aspects, something today's generation of photographers seems to be out of touch with, lost in a sea of technical specifications.

Even Apple built its business on not leading with technical specs, right? They were about content creation and the enjoyment of said content, never bothered about how a certain 200 MHz would compete with the other company's 250 MHz.

Of course, there will be things that 2015 machines won't be able to do. But here's the thing: despite 2017 machines being capable of hardware decoding of 10-bit, the display of the machine that so proudly decodes 10-bit is incapable of displaying it. To add insult to injury, it can't even output that 10-bit signal. How different is it from 8-bit Skylake, then, in practice?

It is like giving a $100 bill to a Martian: though the bill has the value of a hundred dollars in itself, it is useless because the Martian cannot use that value anywhere. What will I do with a Kaby Lake processor thrown into a 2017 MBP if the MBP display is not 10-bit, and the MBP cannot output 10-bit to a 10-bit-capable display for me to enjoy? What will I do with those two extra bits? Make a fuss on MacRumors?
 
The displays are the iMacs and MacBook Pros themselves, as they are 10-bit displays. And even if you're not running a 10-bit display, it's likely that QuickTime and macOS will do the proper dithering to compensate for 8-bit, as this stuff is built into the OS. Also, if you have a pre-existing 10-bit file, it will not magically transform itself into an 8-bit file for easy playback just because you happen not to have a 10-bit display.
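Nobody outside Apple knows exactly which dithering QuickTime will use, but the idea is simple enough to sketch. Here is a toy one-dimensional error-diffusion pass in Swift (an illustration of the technique, not Apple's implementation) that requantizes 10-bit codes to 8-bit while carrying the rounding error forward:

```swift
// Toy 1-D error-diffusion dither: requantize 10-bit codes (0...1023) to
// 8-bit (0...255), pushing each sample's rounding error into the next
// sample so the local average level is preserved and banding breaks up.
func ditherTo8Bit(_ row: [Int]) -> [UInt8] {
    var error = 0.0
    return row.map { code10 in
        let ideal = Double(code10) / 1023.0 * 255.0 + error
        let out = min(255, max(0, Int(ideal.rounded())))
        error = ideal - Double(out) // carry the residual forward
        return UInt8(out)
    }
}

// 10-bit code 513 has no exact 8-bit equivalent (it maps to ~127.9).
// Straight rounding would output a flat run of 128s; the dithered row
// mixes in the occasional 127 so the average stays near 127.9.
print(ditherTo8Bit(Array(repeating: 513, count: 12)))
```

Neighbouring 10-bit levels that would collapse into the same 8-bit code under straight rounding instead come out as a mix of adjacent codes, which is why dithered 8-bit output can hide most of the banding.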

And no, 8K is not right around the corner in any meaningful sense.

Today with h.265, we are about where h.264 was back in 2009. It should be noted that while software decoding of h.264 did improve, it didn't improve enough to solve the problems of software playback. Machines that struggled with 1080p h.264 in 2009 continued to struggle four years later.
 
So you are suggesting that we had better dump the 2016 and go for the 2017 MBPs? And, for my knowledge and reference, where are you getting that MBP displays are 10-bit?
 
No, not at all. I'm saying that if you are buying now, lean heavily toward a 2017 model. I also told people in 2016 to wait one more year if they could. However, if you already own a 2016, it depends upon usage, but it's hard to justify spending all that money again after just one year. Luckily, Apple is implementing partial support for the 2016 models, so that may be enough for many. But if you're buying today, it would not be a good idea to buy a 2016 IMO, unless you're getting a killer deal on it.

As for the displays, they are not quite true 10-bit from what I understand, but they are functionally 10-bit. They are wide-gamut displays on the iMac and MacBook Pro. It's right in the Apple specs for the 2016, in fact:

https://support.apple.com/kb/SP747?locale=en_CA

The MacBook does not get a wide-gamut display, however.
 

Of course, buying today there is no doubt about the 2017. Not one bit. Beyond doubt, go for the 2017 if buying today. I myself considered selling off my 2016, but that was because of the keyboard updates reported by many here. In India the machines are not in stores yet for me to test, so I am at a precipice about it. 4K HEVC support makes no sense to me today, considering the transmissions we receive from our cable TV suppliers are 1080p HD at most, and that across close to 200 channels. The only reason I am inclined to take a loss of $700 on my 2016 MBP to buy a new 2017 is the keyboard, since I am a content writer and would use the keyboard heavily. Even then, I have been thinking of adding a mechanical Cherry MX Blue keyboard for when I type like crazy, so there is that. External keyboard for when I am typing heavily = $50. Satisfaction = >100%. Buying a new 2017 MBP will mean eating the $700 loss and getting a slightly amped-up keyboard.

Also... if I do sell my machine today, I do not think I will buy a 2017 machine. I will manage with my 2011 instead, since my travel will cease in a few days and this 13-inch would become a desktop machine anyway. So I will wait for 2018, when maybe I will get 16GB RAM on the 13-inch models as well. There is that thought as well.

Selling it today means I will get back $1500 of the $2200 I spent on it.
 
Hardware decoding of 4K 10-bit HEVC will eventually become as ubiquitous as hardware H.264 decoding is today.

I worry more about Apple's transition to a unified I/O port (Thunderbolt 3) while excluding every other port. HDMI and DP have already advanced to support 10-bit, but these new versions are not in the Thunderbolt 3 spec (obviously). So while other PC manufacturers will be able to use the latest and greatest ports as they become available, will Mac users have to wait for the likes of Thunderbolt 4, etc.?

For all the conveniences of Thunderbolt 3, all of which I appreciate, this seems to be a major drawback for pro users IMHO. Each individual port/protocol gets to improve faster than these 'meta' ports like Thunderbolt.
 
As for the displays, they are not quite true 10-bit from what I understand, but they are functionally 10-bit. They are wide-gamut displays on the iMac and MacBook Pro.

10-bit and wide gamut are two different things. Gamut is the range of colors the display can reproduce, and 8-bit or 10-bit tells you how many steps there are between each color. The MBPs and iMacs have P3 displays, which means they can reproduce a wider range of colors, but they are still 8-bit, meaning there are still only 256 steps between black and "maximum" red (#000000 to #FF0000). This means that for gradients, there can still be color banding between each color (though proper dithering can mask this).
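The arithmetic behind the banding is easy to demonstrate. A rough Swift sketch (pure numbers, no display pipeline involved) that counts how many distinct codes a black-to-red ramp across a 3840-pixel-wide 4K frame actually gets at each depth:

```swift
// 8-bit gives 2^8 = 256 codes per channel; 10-bit gives 2^10 = 1024.
// Quantize a smooth 0.0...1.0 ramp at each depth and count the distinct
// levels that survive across a 3840-pixel-wide gradient.
func distinctLevels(width: Int, bits: Int) -> Int {
    let maxCode = (1 << bits) - 1
    let codes = (0..<width).map { x in
        Int((Double(x) / Double(width - 1) * Double(maxCode)).rounded())
    }
    return Set(codes).count
}

print(distinctLevels(width: 3840, bits: 8))  // 256  -> bands ~15 px wide
print(distinctLevels(width: 3840, bits: 10)) // 1024 -> bands ~3.75 px wide
```

A 15-pixel run of identical color sitting next to the neighbouring code is what you perceive as banding in slow gradients (sky, fog); at 10 bits the bands shrink to a quarter of that width, and dithering hides most of what remains.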
 
I worry more about Apple's transition to a unified I/O port (Thunderbolt 3) while excluding every other port.

That is the kind of stuff that weirds me out. By the time a technology matures enough to give really sweet and delicious fruit, we axe that tree, plant a new one, and call it the most amazing yet. This helps commercialism, but it does not help buyers.
 
There's been some talk lately that the 2017 15" Pro may have a 10-bit display, but I actually spoke with Apple marketing and confirmed it's still an 8-bit panel. So are the new iMacs, but they use some advanced built-in dithering methods to simulate higher precision.

I don't understand: why 8-bit only, if the graphics card natively supports 10-bit?

All the latest Polaris Radeons support it (MBP 15" 2016 and 2017); see the Video Engine section here:
https://www.notebookcheck.net/AMD-s-Polaris-Architecture.172663.0.html

Please forgive my ignorance :)
 
That's what I was suggesting by "functionally 10-bit", although I admit that is probably an inappropriate description.

In any case, given that iOS 11 and macOS High Sierra betas have been out there for weeks now, as have the new 2017 Macs, I'm surprised nobody is testing HEVC playback on these new machines. Maybe they feel they need to wait for the Public Beta before releasing this info? Or maybe most just don't care enough to actually test this.
 
Waiting for public beta most likely.
Me too!

I did install iOS 11 on my iPhone 7 Plus and recorded some 4K 2160p30 video in HEVC. This video won't play cleanly via software playback on my 2010 2.93 GHz Core i7-870 iMac with 10.12 Sierra, regardless of the software I use. It's really bad with VLC, a complete disaster. With IINA it's OK but quite stuttery. And QuickTime in Sierra has no support for it at all.

But of course, it's smooth as buttaa for playback on the iPhone 7 Plus. The video also plays easily on my 2017 4.2 GHz Core i7 iMac, with only about 150% CPU usage.

Out of interest, I tried the same file on a 2017 12" MacBook. Using MPV or IINA, the video was very stuttery, with CPU usage up to about 360%. The smoothness was roughly on par with the 2010 2.93 GHz Core i7. Again, I'm waiting for the High Sierra public beta for native OS hardware playback.
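Once the betas are installed, by the way, there's no need to eyeball third-party players at all: AVFoundation will simply tell you whether the OS considers a file natively playable. A tiny sketch (the file path is a made-up placeholder, point it at your own recording):

```swift
import AVFoundation

// Ask AVFoundation whether this OS/machine can play the file natively.
// The path below is a placeholder; substitute your own HEVC clip.
let url = URL(fileURLWithPath: "/Users/me/Movies/iphone-hevc-test.mov")
let asset = AVURLAsset(url: url)
print("Natively playable: \(asset.isPlayable)")
```

On Sierra this should come back false for the iPhone's HEVC clips, matching QuickTime's refusal to open them; on High Sierra it should flip to true, regardless of whether the decode then happens in hardware or software.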
 

I can't see a 10-bit display in an rMBP that soon, as they consume a lot more power (at a guess, almost double), and that goes against current design thinking. To my knowledge the only current laptop with a 10-bit display is the HP ZBook 17 G3. There have been many more dating back to the old 8740w, but it seems to be a very niche market, and at this time only HP is playing in it, for pros needing to do colour work on site.
 
Well, has anyone considered that Apple doesn't support this in the current OS, and/or that VLC may not be the optimal player for this? Perhaps this will change in High Sierra and it will be supported officially.
 

I switched over to IINA some days ago after years of VLC. The best thing about it is its use of colour space, and I think it is far more responsive than VLC. Oh, and it supports the Touch Bar. :p The picture definitely looks better in IINA on the new MacBook Pros than in VLC. Try it. :)
 

Well... that was easy. It plays 10-bit H.265 video with no problems, while VLC sucks. I never liked VLC on the Mac... The color space is better than VLC's too, but not as good as QuickTime's. Can we change the color space to match QuickTime?
 

QT is native Apple. :)

VLC has not managed to handle HEVC very nicely just yet. The 2.xx versions do not show anything when playing an HEVC Main10 video on my 2016 MBP; only the audio works. VLC 3.xx shows the first scene and then just plays audio. IINA at least plays the video, though super slowly. I think by the time the codecs are ripe and Apple releases High Sierra, IINA should kill it on Skylake. :) The moment I tried IINA, I knew VLC was going to be retired.

You say you are playing 10-bit H.265 video with no problem. Mind giving me the link to download it? Also, which machine are you playing it on?
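For anyone unsure what their test file actually contains, CoreMedia can at least settle the codec question. A hedged sketch (the "clip.mov" path is hypothetical, and it assumes the High Sierra SDK for the kCMVideoCodecType_HEVC constant):

```swift
import AVFoundation
import CoreMedia

// Print whether each video track of a file is HEVC ('hvc1').
// "clip.mov" is a placeholder path; point it at your own file.
let asset = AVURLAsset(url: URL(fileURLWithPath: "clip.mov"))
for track in asset.tracks(withMediaType: AVMediaTypeVideo) {
    for case let desc as CMFormatDescription in track.formatDescriptions {
        let codec = CMFormatDescriptionGetMediaSubType(desc)
        print(codec == kCMVideoCodecType_HEVC ? "HEVC" : "something else (\(codec))")
    }
}
```

It won't distinguish Main from Main10, but it does confirm you're feeding the players the codec you think you are.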
 

2016 MacBook Pro 2.9 GHz. The link is in this thread someplace. Plays buttery smooth for me.
 
Try playing this video:

Camp (Nature) 4K HDR Demo (Sony)

http://www.4ktv.de/testvideos/

It's a Sony 4K HEVC 60 fps video. Even my 4.2 GHz Core i7 iMac can't play it perfectly via software in IINA, with 650% CPU usage (out of 800%). It does pretty well, but there are occasional stutters. It really needs hardware decoding. I'm waiting for High Sierra's public beta to try hardware playback.

There is an 8-bit version and a 10-bit version.
 
13" right? Now I will find the link and see for myself...

15"

Interesting... my 2016 MacBook Pro runs it perfectly. No skips or stutters. Yes, the 10-bit one. For comparison I ran it on my i7-2600K Windows 10 PC via VLC, and it looked and ran terribly. I'm happy with the result, and I'm sure it will be even better in High Sierra.
 