In my opinion (I use only MP4/M4V 1080p/720p H.264 and a few H.265 files):
IINA: colors 7/10, and some crashes.
QuickTime Player: colors 9/10 and low CPU usage.
 
The Geekbench OpenCL score is 3x lower than under Sierra. I don't know yet what this is about.
I assume this was with a beta release.
What about Metal and OpenCL under the final release of High Sierra?
On my Mac Pro 5,1 with a GeForce 980 under Sierra, Metal is 54193 and OpenCL is 114385, which was respectively 10% and nearly 20% better than under El Capitan. If the scores under High Sierra are worse, I have another reason not to upgrade.
Can anyone post Geekbench results under High Sierra?
 
Sorry if this is a newb question, but can a MacBook Pro 2016 with Touch ID play 4K HEVC files? I tried downloading some of the sample files I saw in this thread, and they either don't open at all in QuickTime, or play with tons of lag and stutter in VLC. This is normal, right, since it's not Kaby Lake?
 
You need to know that 4K (UHD) is 4 times the pixels of mainstream 1080p, and it often comes with HDR and 10-bit color depth, so it's more resource-intensive. The file size is also huge. To handle this they developed HEVC (High Efficiency Video Coding), which is a complex algorithm to decode and also demands lots of resources. So dedicated hardware is recommended to decode these high-resolution files with such a complex codec. If a system doesn't have that (or can't utilise the hardware), decoding is done in software, which puts a lot of pressure on the CPU, causing stutter or lag.

You didn't mention if the MBP is 15" or 13".

Last year's MBP (Skylake) has support for 8-bit HEVC decoding built into the CPU. The 2017 model with Kaby Lake adds 10-bit support. So if you are playing a 10-bit HEVC 4K file on a 2016 MBP, it will use more CPU than on a 2017 model. This applies to both the 13" and 15" models.

However, the 15" models have dedicated GPUs with hardware acceleration. In Windows, the 15" MBP plays 4K HEVC files just fine. macOS, however, doesn't use the dGPU for this. How ironic.
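To put rough numbers on the post above (a back-of-the-envelope sketch; nothing here is Mac-specific), 4K UHD really is four times the pixels of 1080p, and 10-bit 4:2:0 frames carry 25% more raw data than 8-bit ones:

```shell
# Pixel counts: 4K UHD vs. 1080p
uhd=$((3840 * 2160))   # 8,294,400 pixels
fhd=$((1920 * 1080))   # 2,073,600 pixels
echo "UHD/1080p pixel ratio: $((uhd / fhd))"   # prints 4

# Raw bytes per decoded 4:2:0 frame (12 bits/pixel at 8-bit, 15 bits/pixel at 10-bit)
echo "8-bit 4K frame:  $((uhd * 12 / 8 / 1000000)) MB"   # prints 12 MB
echo "10-bit 4K frame: $((uhd * 15 / 8 / 1000000)) MB"   # prints 15 MB
```

Every one of those frames has to be reconstructed 24-60 times per second, which is why a complex codec like HEVC pushes the CPU so hard without hardware assistance.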
 
Don't forget the OS version: only High Sierra supports hardware decoding of HEVC.
 
Thanks. I have the 15-inch model and the latest OS. So my model should play 8-bit 4K content easily but not 10-bit? Can I use Parallels to play 4K video in Windows? Good to know.
 
Your 15" 2016 model has hardware that can support 8-bit HEVC decoding on the CPU and 10-bit HEVC decoding on the dGPU. But again, this comes down to whether that hardware acceleration is actually used or not.

Windows will use your CPU and GPU for hardware acceleration whenever necessary to give you a great experience. If you are a videophile, you can even use madVR as a renderer combined with MPC-HC for an amazing movie experience.

macOS, however, is a bit weird in this respect. Until High Sierra (released last month), macOS didn't have any HEVC hardware acceleration at all. So even if your computer has the needed hardware, the OS won't use it. The more pathetic part is that, now that 10-bit HEVC decoding is supported in High Sierra, the system still won't use the dedicated GPU for it. It's limited to the decoder on the CPU chip (which is limited to 8-bit on Skylake). Why Apple chose not to use the hardware they put in their own devices, I don't know.

So you get the idea. On your computer, in Windows, you'll have no problem at all playing super-high-bitrate (300 Mbps+) 4K HEVC 10-bit HDR content. But on macOS, since it's not using the dGPU, you're still limited to that 8-bit support, so there will be noticeably higher CPU usage. I hope Apple fixes this in a future release of macOS, because it doesn't make sense and it's driving some people crazy.

EDIT: Just for info, bitrate makes a big impact on actual image quality. I would even risk saying that a very low-bitrate 4K video file can look worse than a high-bitrate 1080p one. Resolution is only part of the picture. If I'm not wrong, UHD Blu-ray discs have a maximum bitrate standard of around 120-130 Mbps. So if you see files on the internet claiming 300-400 Mbps, you can be fairly certain that in real life you would never come across a movie or music video encoded at a bitrate that high.

Also, on another note, just so you know, it doesn't make much difference in resource usage whether you watch a 4K file on your laptop's screen (which is not 4K), on a 4K monitor, or projected onto a wall.
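The bitrate claim above is easy to sanity-check with simple arithmetic (a sketch; the 120 Mbps figure is the poster's, taken at face value):

```shell
# Approximate file size for a 2-hour movie at a given video bitrate
bitrate_mbps=120
seconds=$((2 * 3600))
# megabits / 8 = megabytes; / 1000 = gigabytes (decimal)
size_gb=$((bitrate_mbps * seconds / 8 / 1000))
echo "2 h at ${bitrate_mbps} Mbps is about ${size_gb} GB"   # prints 108 GB
```

So even at disc-level bitrates a single movie comes out over 100 GB, which is one more reason claimed 300-400 Mbps files are unrealistic for everyday content.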
 
They probably don't have the resources to develop the drivers. See their latest quarterly earnings report. They're broke. :rolleyes:
 
Haha.

They had amazing growth over the past couple of quarters, especially in the wearables department. Mac too.

They must have some good reason I guess, as to why they are not using dGPUs to their full potential.
 
I suppose they don't want to give full HEVC API support until they themselves have content or software ready for it. Why give competitors an edge on their own platform... they need more money for Unca Cook's money bin.



FCPX 10.4 will have HEVC support, but which hardware will it support? Only the iMac Pro and the soon-to-be-released MacBook Pro updates with Kaby Lake chips?

UPDATE: Because even the iMac Pro won't be sufficient for HDR content creation (it has a non-HDR display), this leaves a market for a Mac Pro and a new display. Maybe Apple wants to sell iMac Pro owners a 27" ProMotion OLED HDR display with 1,000 nits to fully benefit from FCPX 10.4? Clever strategy, Apple.
 
When it comes to video decoding/encoding acceleration, Apple is the worst of the vendors. They are so lazy it's like they're insulting their customer base.
It took ages to get H.264 hardware decoding, and it was limited to QuickTime at first. There are many codecs they don't implement, they've never used the encoding abilities of dGPUs, and as for decoding, it seems to be limited to Intel GPUs these days. The fact that they never backport new features to earlier OSes only makes it worse.
Apple only moves when complaints are so loud that the CEO hears them from his tower. VDADecoder was released only after Adobe publicly said they could not use the hardware for Flash video playback. If they hadn't, hardware decoding might still be limited to AVFoundation. Thankfully Apple moved it to a lower-level framework.

In general, anything related to GPUs appears to be severely controlled by Apple. AMD/Nvidia provide no tools, nothing to tweak their GPUs, no video player/encoder using their hardware. Is it because they think it's not worth it? I tend to believe Apple tells them NOT to release any Mac software besides bare-bones 3D drivers (which Apple totally controls).
 
It's a shame that Apple is essentially "behind" Microsoft in the department of HEVC hardware decode support. I remember when they took the lead in adopting and implementing H.264.

I guess the times (and the CEO, and Apple) have changed. I'm just glad we finally have some support for HEVC.
 
Ooo... I just tried mpv.

It seems to be aimed at pro users, and the build I downloaded doesn't have hwdec support. But man, is it awesome!

I'll read up on it and build one tomorrow. It fixes EVERYTHING. I'll keep you posted.
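For anyone else experimenting with mpv, hardware decoding is switched with the `hwdec` option, either per run or in the config file. A minimal sketch of an mpv.conf (the path is mpv's default config location; whether VideoToolbox actually engages depends on your build and hardware):

```ini
# ~/.config/mpv/mpv.conf
hwdec=videotoolbox   # request macOS VideoToolbox hardware decoding
# hwdec=no           # uncomment to force software decoding instead
```

You can also compare the two modes per file from the terminal, e.g. `mpv --hwdec=videotoolbox movie.mkv` versus `mpv --hwdec=no movie.mkv`, and watch the CPU usage difference in Activity Monitor.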
 
Multiple crashes and oversaturated colors with IINA,
so... deleted.
QuickTime and Movist are working very well.
VLC handles MP4/M4V subtitles badly.
 
What are the real-world use cases where this is an actual problem?

Considering Macs use dithering for 10-bit and come nowhere near the 1,000-nit spec of HDR, it would seem to me that playing back complex 10-bit HEVC video isn't optimal anyway. With some effort you can encode H.264 complex enough that it won't play back on nearly any device; however, that's unnecessary, so we just don't do it.

When I ask for the real-world use cases, I'm being genuine. I'm sure there are legitimate reasons; I just don't know them, so I'm curious.
 
It would be an incredible waste of time to have to re-encode existing 10-bit HEVC video files to 8-bit HEVC or H.264 just to play them back.

Your question is like asking why we need 1080p H.264 hardware decode support on an old MacBook that only has a 1280x800 screen.
 
I'm not denying the validity of your point, but you dodged the question.

What is the current real-world use case, though? Where are you getting 10-bit HEVC video files in quantities large enough that this is a real problem? What is this limitation preventing you from doing?

I don't feel your analogy is very accurate, because you trivialized how the H.264 growing pains faded with time. When current Macs are as old as a MacBook with a 1280x800 screen, HEVC will be supported just as well as H.264 is on that particular MacBook.
 
Uh, no. Back in the day, people would say it didn't matter that such a machine didn't have 1080p H.264 hardware decode support, whereas a number of us made a point of waiting for machines that could do it in hardware.

That's why I bought a MacBook Pro in 2009 with the GeForce 9400M, and that's why I bought a cheap Windows laptop in 2010 with the Intel GMA 4500MHD. Fast forward a few years, and those who bought machines without such support were kicking themselves. (I considered getting the 2008 aluminum MacBook, but it didn't have FireWire or a backlit keyboard, so I waited one more year.)

Some of us are making purchasing choices now for similar reasons. I didn't buy the 2015 and 2016 MacBooks because the keyboards sucked and because they didn't have full hardware HEVC decode support. Apple this year has already said they are moving toward HEVC going forward.

The 2017 Macs support the above. As a bonus, the 2017 Macs also have hardware DRM support for 4K streaming. Now, it's true that currently neither iTunes nor Netflix supports 4K streaming to Macs at all, but if/when it does come, it is extremely likely that neither the 2015 nor the 2016 Macs will get it. No guarantee for the 2017s either, but it's likely, since they have the hardware support for it.

Thus, I waited and bought the 2017 MacBook and the 2017 iMac. Although I could be wrong, my prediction is that 2018 will bring 4K streaming to Macs with macOS 10.14, and only 2017 or later Macs will be supported.

BTW, I don't know if I understood your post 100% correctly, but it sounds like the contention you made is inaccurate. A 2015 or 2016 Mac will NEVER be able to support hardware 10-bit 4K HEVC decode. Even 5-10 years from now it won't, because the hardware doesn't exist in the machine. (Actually, it does exist in some third-party GPUs, but Apple has stated it requires a certain Intel generation, namely 7th gen or later. Apple is not supporting it on Nvidia or AMD GPUs in these machines.)
 
The iMac Pro's Intel CPUs do not include an integrated GPU, so we were wondering how Apple would do hardware HEVC in the iMac Pro.

Well, it turns out the iMac Pro includes an Apple A10 Fusion chip, which has full hardware HEVC support for 10-bit decode and 8-bit encode.

https://www.macrumors.com/2017/11/19/imac-pro-a10-chip-hey-siri/

So this may be how Apple has chosen to address the issue, still (as expected) leaving AMD (and Nvidia) GPUs completely unsupported.
 
I was feeling so good, until I found the truth about the current hwdec API of macOS, VideoToolbox. Maybe I didn't pay much attention to the details earlier.

No matter how much we discuss 2017 Macs supporting 10-bit hardware decoding, it's far from a smooth experience. I found that hardware decoding by nature degrades image quality in the rendered frames in some cases. Though my Mac can do a full 10-bit hardware decode for HEVC, the VideoToolbox API doesn't support 10-bit output yet. So when I play a 10-bit HEVC HDR remux and it's being decoded by VideoToolbox with less than 30% CPU, there are weird quality drops: color banding, artefacts, and in some cases a complete inability to render the frame at all.

When I play the same file without hwdec, the CPU shoots up to 400%, but man, the quality is superb. The laptop becomes a heated slab of aluminium, but the video quality is fantastic. This limitation is because of the API. And QuickTime is really pathetic when handling a BT.2020 encode. I tried remuxing, and the colors are all off.

VAAPI and D3D11, however, do support 10-bit output. Try playing the same file on Windows or Linux; it blows the Mac right out of the water. Not that I play super-high-bitrate 10-bit HDR files every day, but it would be nice to have the ability. And nobody knows when Apple will add support for it.

I've spent some time on GitHub raising issues and talking to people in their Freenode IRC channel. VideoToolbox has limitations; mpv has the same limitations, and so do all of its forks.
 
Nope, you are wrong. VideoToolbox supports 10-bit HEVC and can decode it to a 10-bit pixel format. Are you sure it's not an mpv issue?

QuickTime on the Mac supports BT.2020 and can display it with the right colors. The thing it doesn't support is HDR.
 