Good to hear. Someone should try the same procedure on a 2016 MBP 13" with Touch Bar (2.9 GHz i5), if they have a copy of Windows on hand.
No GPU decode on that. CPU usage on an i5 would be very high.
FFmpeg and HandBrake can't do that. MP4Box can remux hev1 to hvc1 in a somewhat clunky way, so that's what I built the script for.
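For anyone curious what that kind of remux looks like, here's a minimal sketch of the general approach (not the original poster's script): it assumes MP4Box is on your PATH, that the file has one video and one audio track, and that your GPAC build writes an hvc1 sample entry when it re-imports the HEVC track, which is worth verifying afterwards with `MP4Box -info`.

```python
#!/usr/bin/env python3
"""Sketch of an hev1 -> hvc1 remux by re-importing the tracks with MP4Box.

Assumptions: MP4Box (GPAC) is installed, the source has one video and one
audio track, and the build tags the re-imported HEVC track as hvc1.
The bitstream is only copied, never re-encoded.
"""
import subprocess
import sys

def remux(src: str, dst: str) -> None:
    # Pull the tracks out of the old container and write a fresh MP4.
    subprocess.run(
        ["MP4Box",
         "-add", f"{src}#video",   # first video track
         "-add", f"{src}#audio",   # first audio track (remove if there is none)
         "-new", dst],
        check=True,
    )

if __name__ == "__main__":
    remux(sys.argv[1], sys.argv[2])
```

The point of the exercise is just that QuickTime looks for the hvc1 sample entry; the video data itself is untouched.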
If I were on Apple's marketing team, I'd hold back the HEVC GPU support until they're ready to release the next version of FCPX. That way they'd have an edge over the competition. Along with the release of the iMac Pro they could promote the huge difference HEVC makes. Apple doesn't want to give Adobe months to retool Premiere Pro before they get FCPX out.

And Apple got criticised for a long time for being backwards, closed-minded and falling behind Windows. This year they signalled several times that they accept the criticism and will modernise the OS. That meant taking a lead with HEVC, improving Metal, attracting more gamers, developing VR, supporting eGPUs, and reintroducing the modular tower Mac.
You see now? If we get all of the above but then a big FU for HEVC decode on the GPU, it would be a real insult to Mac users. We're being sold the idea that Apple is finally going to be serious about graphics again. Let's see it happen. Otherwise it's another year of ******** hype and products with planned obsolescence built in.
That's all I need to say now; my video made the point more clearly.
If I were on Apple's marketing team, I'd hold back the HEVC GPU support until they're ready to release the next version of FCPX.
Sure, it's controversial, but not that much different from what CUDA is for Nvidia. Apple doesn't want to open up too much and give the competition an edge, and neither does Nvidia with CUDA.

That's potentially illegal, of course, and similar to the trouble Google and Microsoft have had. Apple, too, has been slapped for this.
It's possible that Apple has already done this with FCP by having the GPU encode drivers activate only within that application and not be available to other apps.
On the Mac Pro forum I famously benchmarked Premiere h264 encoding on a Mac Pro. It was a very thorough test comparing the rendering time in macOS and Windows on the same machine.
On macOS, when we selected the OpenCL or CUDA renderer, encoding was no faster than with the CPU.
The Windows version encoded 4 times faster. This was a sign to us that the macOS version of Premiere didn't have access to h.264 encoding on the GPU.
Here is my Mac retirement home:
But for more modern purchases, I've become more selective and buy less often. I wondered if Apple would take the easy way out and just support Intel chips for HEVC hardware decode, so I waited. I felt the HEVC transition was going to be a huge one, perhaps not as big as the h.264 transition, but big nonetheless, so I didn't want to take too many chances on this feature.
This. Exactly this. The new codec for mass use needs to gain popularity. 2-3 years and we will have a ripe, mature codec.
I haven't done this test, and I don't have a link, but I thought I read somewhere that some people claim that x265 superfast is similar quality to Intel Quick Sync for h.265 HEVC encoding, while being roughly as fast too.

Now I'm curious... has anyone run a quality/bitrate comparison between x264 and Apple's HEVC encoder? Part of why h.264 was so huge was the perceptible quality improvements... if HEVC is just about reducing file size then it's useful for streaming but less compelling for personal media. And even then, I wonder how Apple's encoder compares to x264.
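Not something I've run either, but for anyone who wants to try that comparison, here's a minimal sketch. It assumes an ffmpeg build with libx264, libx265, hevc_videotoolbox (Apple's hardware HEVC encoder as exposed through ffmpeg) and libvmaf; the source clip name, the 8 Mbps target and the presets are placeholders, not anything from this thread.

```python
#!/usr/bin/env python3
"""Rough quality-per-bitrate comparison: x264 vs x265 vs Apple's VideoToolbox HEVC.

Encode the same clip at the same target bitrate with each encoder, then score
each result against the source with VMAF (higher is better).
"""
import subprocess

SOURCE = "source.mov"   # placeholder reference clip
BITRATE = "8M"          # hold bitrate constant so the scores are comparable

ENCODERS = {
    "x264.mp4": ["-c:v", "libx264", "-preset", "medium", "-b:v", BITRATE],
    "x265.mp4": ["-c:v", "libx265", "-preset", "superfast", "-b:v", BITRATE],
    "vtb.mp4":  ["-c:v", "hevc_videotoolbox", "-b:v", BITRATE, "-tag:v", "hvc1"],
}

for out, args in ENCODERS.items():
    # Encode the candidate.
    subprocess.run(["ffmpeg", "-y", "-i", SOURCE, *args, out], check=True)
    # Score it against the source; the VMAF value is printed in ffmpeg's log.
    subprocess.run(
        ["ffmpeg", "-i", out, "-i", SOURCE, "-lavfi", "libvmaf", "-f", "null", "-"],
        check=True,
    )
```

Comparing x264 against an HEVC encoder at equal bitrate is exactly the "is it only about file size" question: if the HEVC encodes don't score noticeably higher, the benefit really is just smaller files.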
I haven't done this test, and I don't have a link, but I thought I read somewhere that some people claim that x265 superfast is similar quality to Intel Quick Sync for h.265 HEVC encoding, while being roughly as fast too.
However, it's not really about encoding speed IMO. For me it's about CPU usage. x265 superfast would still use a ton of CPU.
For me though, the main issue is CPU usage for playback. With some of the TV demos for 4K 10-bit HEVC, my iMac using Kaby Lake's Quick Sync uses less than 10% CPU, and even my MacBook only uses 25% CPU. For software-only playback, it doesn't even work properly on either machine. Can't play those files cleanly even with 100% CPU, and multitasking is impossible. For my i5-7600 iMac, it's like trying to play a 20 Mbps h.264 1080p file on a 2.0 GHz Core Duo without hardware h.264 playback support. You're able to get clearly rendered video but it's still unwatchable because of all the stutters. For the i7-7700K iMac, it's almost smooth but not quite, and not only is the CPU at maximum, so is the case fan, drowning out the audio. And on the MacBook with software-only decode it's a total lost cause.
Which demo videos? Some of the videos are harder to decode than others. The one I have been using the most is the Sony Camp 10-bit (not 8-bit) 76 Mbps HEVC video. Using IINA, I couldn't even play it completely cleanly with an i7-7700K decoding in software on the Mac.

On the Skylake Pentium I tried the above videos on, Windows Media Player did software decoding of 10-bit 4K HEVC at 25% CPU. That's a dual-core CPU with no Hyper-Threading. 60 fps playback and real-time scrubbing.
Which demo videos?
25% CPU on a dual-core system on Windows means 50% of a CPU core, right? I really don't like how Windows reports CPU usage.

Sony Camp, Bravia glassmaking, and a few of the other recent ones with the highest bit rates on that site.
25% CPU on a dual-core system on Windows means 50% of a CPU core, right? I really don't like how Windows reports CPU usage.
Anyhow, 50% of a core for software decoding of a 4K 60 fps HEVC video just seems impossible. A 4K 24 fps video takes ~250% CPU on my Kaby Lake i5 in VLC. The Sony Camp (Nature) 60 fps video is unplayable (350+% CPU and the image stays black). In QuickTime, it takes 25% of a core (adding up the usage of QuickTime Player, VTDecoderXPCService and WindowServer; a measurement sketch follows below).
I'd be very surprised if WMP could do miracles and use something like 8x less CPU than VLC for software decoding. My guess is that you have hardware decoding going on (by the GPU, maybe?).
I just tested on Windows 10 with VLC, and the Sony video is as unplayable as it was on High Sierra. Blank image with the CPUs pegged at 95% or more.
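If anyone wants to reproduce the "add up QuickTime Player + VTDecoderXPCService + WindowServer" measurement mentioned a couple of posts up, here's a small sketch. The process names are just the ones from this thread, and %CPU as reported by ps is per core, so 400% would mean four cores fully busy; WindowServer also draws everything else on screen, so this over-counts a little.

```python
#!/usr/bin/env python3
"""Sample the combined CPU usage of the playback-related processes once a second."""
import subprocess
import time

WATCH = ("QuickTime Player", "VTDecoderXPCService", "WindowServer")

for _ in range(30):  # one sample per second for 30 seconds
    out = subprocess.run(["ps", "-axo", "%cpu,comm"],
                         capture_output=True, text=True).stdout
    total = 0.0
    for line in out.splitlines()[1:]:   # skip the header row
        line = line.strip()
        if not line:
            continue
        cpu, _, comm = line.partition(" ")
        if any(name in comm for name in WATCH):
            total += float(cpu)
    print(f"playback-related CPU: {total:.0f}%")
    time.sleep(1)
```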
Thanks. Just note, though, that it would be most useful to test QuickTime in 10.13.

I just tested both of the Sony Camp videos on my hackintosh with an i7-7700K @ 5.1 GHz and a GTX 1080 on Sierra 10.12.6.
- On VLC, it was unwatchable.
- QuickTime reported that the files could not be opened.
- IINA played both files fine.
- Camp SDR 8-bit showed ~260-300% (out of a possible 800%) CPU usage.
- Camp HDR 10-bit showed ~260-370% (out of a possible 800%) CPU usage.
The way to do the test for software playback in QuickTime would still be in 10.13.

In High Sierra with a Kaby Lake CPU, it plays just fine. I'm curious to know whether QuickTime could handle it in software, though.