That's exactly it, you hit the nail on the head, man! Mine is the 2016 15" with full specs. Extreme buyer's remorse. T_T I am a designer too, OCD as ****.

I started by collecting VCDs in my childhood, and I've seen how things progressed over to DVD, dual-layer DVD, then Blu-ray, etc., and now UHD Blu-ray. 4K is going to be mainstream sooner or later. At the pace technology is progressing, it will be sooner rather than later, maybe in a couple of years. 2-3 years ago, 4K UHD Blu-ray was not really that popular a thing, if I'm not wrong. But now there are many titles, and new ones are releasing in UHD BD format.

A $4,000 MacBook Pro can't beat a $1,300 MacBook Pro (not mine, mine is a TB one); just thinking of this would make one angry. But hey... not everyone has to be upset. This 4K media frenzy doesn't interest everybody. I watch stuff a lot, so it's a big deal to me, but I know a lot of people who couldn't care less about the resolution of the movie they watch. People started crying when Apple removed the SD card slot, saying that the "pro" people can't use it anymore. It might be an issue for some people, yeah, but certainly not for everybody. There are many professions where the "pro" people don't use their SD card slot AT ALL. Things can be like that here as well. For people who are not super into high-definition stuff, 4K isn't a matter of importance.

In your case, I know I have no place to say. But if I were in your place and were very much into 4K, I would sell my laptop, even if I lost some money, and save myself some frustration. But if I didn't have to think about 4K this much, I wouldn't think twice about it. A laptop typically lasts 5-6 years for most people; your mileage may vary. So do whatever reduces your worry.

One OCD designer to another.
 

I think it is a great lesson =) Your case is very well thought out; you were wise to have waited! In this case, could I just have all my files converted to 8-bit so my MBP will be able to decode them? I do understand your OCD though, super design OCD!
 
That is so not true, man. I was very, very close to buying one from Amazon in January, but I missed the narrow window of good deals. And then I didn't buy because I couldn't buy. But I was actively looking for deals (back to school) and was planning to wait for a while. Later, when I found out that refreshed Macs were coming in June, I felt compelled to wait further. Then when I searched for what's new, I got to know about all of this. It wasn't planned, and I'm not wise. It was just luck and coincidence.

To answer your question... yes. If you play 8-bit HEVC, it will play with full hardware acceleration at less than 40-50% CPU usage for a (roughly) 80-100 Mbps stream.

Do one thing... download this file: http://4kmedia.org/sony-camping-in-nature-4k-demo/
Play it in QuickTime Player in High Sierra and see how it handles it. Below is a chart of some tests I did last month.

[Attachment: Media Playback - Stats.png]
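
If you want to check whether a given file is 8-bit or 10-bit HEVC before testing, here's a minimal sketch (assuming ffprobe from ffmpeg is installed, e.g. via Homebrew; the helper name and command-line usage are just for illustration):

```python
#!/usr/bin/env python3
"""Report codec and bit depth of a video file using ffprobe."""
import json
import subprocess
import sys

def probe_video(path: str) -> dict:
    # Ask ffprobe for the first video stream's codec and pixel format as JSON.
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name,pix_fmt,width,height",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)["streams"][0]

if __name__ == "__main__":
    stream = probe_video(sys.argv[1])
    # 10-bit HEVC reports a pixel format like 'yuv420p10le'; 8-bit is plain
    # 'yuv420p'. Only the 10-bit files need Kaby Lake for hardware decode.
    depth = "10-bit" if "10" in stream["pix_fmt"] else "8-bit"
    print(f"{stream['codec_name']} {stream['width']}x{stream['height']} "
          f"({stream['pix_fmt']}, {depth})")
```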
 
If you have a 2016 15" Pro and need to watch some high-bitrate 10-bit HEVC video, just install Boot Camp and watch it in Windows! Works great there!
 
2016 maxed-out 15" Pro here.
I've tested 5 different sample videos: 2 from Sony (Camp and Sword), 2 from Samsung (Nature and Sport), and the jellyfish one. All 4K 10-bit HDR. Tested in IINA and QT. They all stutter in High Sierra.

I've tested the same videos in Boot Camp Windows. Buttery smooth.

Thanks, Apple :/
 
Windows uses your dGPU to do the decoding in hardware. macOS doesn't.

There is a petition out there; somebody from MR started it. It asks Apple to support HEVC decode on the dGPU. Please support it.
 
The "Pro" Software from Apple supports Hardware Acceleration.
That's all Apple wants to sell you.

Even the old "QuickTime Pro" Player once had editing features, but this happened at a time when Apple did not sell Millions of iPhones. So they do not spend time on real features.

Let's just hope the Animoji don't stutter and run smooth.
 
Terrible....
 
Sucks big time.
 
I will point out that this difference in hardware decode capability in Intel chips was known well before the 2016 MacBook Pros were even released.

What we weren't sure about was whether or not Apple would support hardware decode in their dGPUs, but it was the safer bet to assume they would not, since so many of their laptops don't have dGPUs these days.

In fact, the main reasons I waited for 2017 to buy was because of these two factors:
1) Only Kaby Lake has full 10-bit hardware HEVC decode.
2) Only Kaby Lake has full hardware support for 4K DRM.

Apple has implemented #1 already, obviously. I don't know if Apple will implement #2, but the machine is ready for it. (For #2, think iTunes 4K and Netflix 4K; neither is currently supported on any Mac.)

Skylake has neither, but this fact already made the mainstream tech media by summer 2016, before Apple launched the 2016 MacBook Pros.

BTW, Kaby Lake also has full 10-bit VP9 decode (whereas Skylake has no 10-bit VP9 decode support), although I'm not sure if this will ever be implemented.
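
To keep it all in one place, here are the decode claims from this post and the tests earlier in the thread restated as a quick lookup table (a sketch summarizing what's written above, not an official Intel feature matrix):

```python
# Hardware decode support by Intel generation, per the claims in this thread.
HW_DECODE = {
    "Skylake (6th gen, 2016 MBP)": {
        "HEVC 8-bit": True,    # plays with hardware acceleration per tests here
        "HEVC 10-bit": False,  # stutters badly; no full fixed-function decode
        "VP9 10-bit": False,
        "4K DRM": False,       # no iTunes 4K / Netflix 4K
    },
    "Kaby Lake (7th gen, 2017 MBP)": {
        "HEVC 8-bit": True,
        "HEVC 10-bit": True,
        "VP9 10-bit": True,    # in silicon, though macOS may never use it
        "4K DRM": True,        # hardware is ready; OS support still pending
    },
}

for gen, caps in HW_DECODE.items():
    line = ", ".join(f"{name}: {'yes' if ok else 'no'}" for name, ok in caps.items())
    print(f"{gen} -> {line}")
```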
 
Your comment is true and painful =P
Thank you for the tips! I tried those videos, and it's true, as you said:
8-bit is smooth.
10-bit is unwatchable!

You were very wise to get the 2017 version!
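
If you do end up converting 10-bit files down to 8-bit so the 2016 machine can hardware-decode them, something like this rough ffmpeg recipe would do it (a sketch assuming ffmpeg with libx265 is installed; note the extra color depth, and effectively the HDR, is thrown away, and any re-encode costs time and some quality):

```python
#!/usr/bin/env python3
"""Re-encode 10-bit HEVC to 8-bit HEVC with ffmpeg (lossy; drops HDR depth)."""
import subprocess
import sys

def to_8bit_hevc(src: str, dst: str, crf: int = 20) -> None:
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-map", "0",            # keep all streams (audio, subtitles, ...)
         "-c", "copy",           # ...copied through untouched
         "-c:v", "libx265",      # ...except video, which is re-encoded
         "-pix_fmt", "yuv420p",  # force an 8-bit pixel format
         "-crf", str(crf),       # quality target; lower = better and bigger
         dst],
        check=True,
    )

if __name__ == "__main__":
    to_8bit_hevc(sys.argv[1], sys.argv[2])
```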
 
(Stupid) question, guys:
I've also got the maxed-out 2016 15" MBP and have downloaded the Sony camping file suggested in post #253.
When you try it out on your 2016 or 2017 MacBooks, do you play it on the built-in display, or do you have a 4K monitor connected? Sorry if this is a stupid question or if it's been answered before, but I'm asking because while the video doesn't show anything more than a black screen in VLC, with fans spinning like crazy, QuickTime opens the file no problem. It runs buttery smooth in full screen and zoomed in to actual size, with a CPU load of 4%, on integrated graphics. The only limitation is the MBP's display, as I don't have a 4K display to connect it to...
 
I'm also very frustrated with that. I've signed the petition, but there are only a few signatures.
Has anyone already tried to contact Craig Federighi?
 
Thought I'd chime in as a 2016 13" MBP (TB) owner.

1) I purchased the tMBP knowing about the Skylake limitation and that Kaby Lake would address it.
2) It wasn't an issue for me, as the screen on the MBP is neither 4K nor HDR. And if I wanted to watch 4K HDR content, it wouldn't be on a 13" screen anyway (personally).
3) To consume (watch) 4K content, most new TVs will decode HEVC 4K HDR (or an Xbox One S or some such device would).

It would have been cool if the 2016 MBPs had the ability, but the chances of me using it, if it did, are so slim anyway that it makes no difference to me. No buyer's remorse here.

However, I must say that macOS seems to be technically lagging in this regard. If Windows can accomplish it with ease on the same hardware, why not macOS? It might just be that Apple is more focused on watches and HomePods these days, and not so much on Macs anymore (big surprise, right?)

Cheers
 
Apple can do it. They are in fact already doing it on the iMac Pro with AMD Vega. They just don't want to bother with it on the rest of the Macs, because it's not worth it to them to implement it on AMD Polaris: it's only a few machines, and those machines were marketed without this feature anyway. This way it's an easy cutoff. ALL 7th-gen or later get it; ALL 6th-gen or earlier do not. You won't have a situation where some 6th-gen machines have it and some don't, which could annoy almost as many people.
 
I very recently made a big mistake, although had I known about this earlier, I assume the outcome wouldn't have been any different.

I recently suggested that a friend of mine opt for a MacBook Air. It seemed more than capable for the things that would be done on it. My mistake was that I didn't read the spec sheet for the 2017 MBA: in 2017 it still has a 5th-gen Intel processor, whereas I imagined that in June they had refreshed all Macs to Kaby Lake.

Nonetheless, the owner doesn't and won't care about HEVC or 4K anyway, so no issues there. My only remorse is that I didn't know. :p
 
Hmmm, it's strange that Apple didn't use a Skylake processor.

I should clarify that I DO care about HEVC, 4K and HDR. However, I personally prefer to consume my media on my big-screen TV. To top it off, from what I've heard/read, most encoding is apparently (not sure) done in software, so hardware support was inconsequential.

Nowadays it's more important to me that my TV, gaming box and phone be able to hardware-decode than my Mac.
 
Not really. Common media players like QuickTime, IINA, VLC, etc. WILL use hardware acceleration (hwdec) if it's available. But MPlayer and mpv won't by default, even if you have hwdec. There is a reason: quality issues, API limitations, etc.
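
For example, with mpv you have to opt in explicitly (a minimal sketch, assuming mpv is installed, e.g. via Homebrew; "video.mkv" is a placeholder file name):

```python
import subprocess

# mpv leaves hwdec off by default, for exactly the quality/API reasons above;
# on macOS you opt in with the VideoToolbox backend.
subprocess.run([
    "mpv",
    "--hwdec=videotoolbox",  # request hardware decode via VideoToolbox
    "--msg-level=vd=v",      # verbose decoder log shows whether hwdec engaged
    "video.mkv",             # placeholder file name
])
```

Running the same file with and without the flag while watching Activity Monitor makes the difference obvious.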
 
I'm not sure what your 'not really' refers to?

The quote from my previous post was about ENcoding. Hardware encoding, from what I've read, is generally of a much lower quality than software. Decoding is always better in hardware.

Cheers
 
By 'not really' I meant 'no'. When encoding/decoding is done in hardware, an API handles all of that, and the better the API, the better the results. Encoding or decoding in software uses processor resources, huge amounts of them in the case of high-bitrate streams. When the same work is done by actual hardware embedded physically in the SoC, the CPU isn't stressed nearly as much. I can't tell you about the quality of hardware encoding from personal experience. But what I can tell you for sure is that hardware decode is NOT always better. There are many restrictions and limitations in the APIs in many cases. macOS's VideoToolbox sucks, but Linux's VAAPI/VDPAU or Windows' DXVA is much better.
 
While hardware encoding is faster because it makes use of instructions on the metal, that doesn't mean the quality of the output image is as good as possible. While software encoding uses more power and is less efficient, the flexibility and customization of the different encoding parameters make for a slower yet better-quality encode. (Just so we're clear, I'm talking about using a chip's specific instruction set for a specific codec when referring to HW encoding.)

From what I've read, that is probably why encoders such as HandBrake encode in software (even though H.264 support has been baked into most SoCs for almost 10 years). However, I'm not aware of how pro software such as Premiere or FCP does encodes. (A recent example that springs to mind is the Raspberry Pi, which has H.264 support on the SoC; while faster at encoding, the quality of the output from the HW encoder is far lower than when encoding in software.)

I suppose HW encoding is best suited for cameras recording to a specific format, where software encoding would be too slow, or for transcoding purposes (such as Plex). IMHO.

I'm curious: does anyone know if Premiere or FCP uses hardware encoders for production-quality output when available? (I can understand that while editing it would make sense to use hardware encoding, as it is much faster.)

Cheers
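
As a concrete illustration of that trade-off, ffmpeg on a Mac exposes both paths, so you can compare them on the same clip (a sketch assuming an ffmpeg build with libx265 and VideoToolbox support; the file names are placeholders):

```python
import subprocess

SRC = "master.mov"  # placeholder input clip

# Software encode (libx265): slow, fully tunable (presets, CRF, etc.),
# and usually better quality per bit.
subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libx265",
                "-preset", "slow", "-crf", "20", "-an", "sw.mkv"], check=True)

# Hardware encode (Apple VideoToolbox): much faster, far fewer knobs,
# bitrate-driven, and generally lower quality at the same size.
subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "hevc_videotoolbox",
                "-b:v", "8M", "-an", "hw.mp4"], check=True)
```

Comparing the two outputs at similar file sizes usually shows exactly the quality gap described above.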
 
So, I have a 2016 15" Pro. Some 10-bit HEVC plays okay at lower bitrates, but that 60 fps, 75 Mbps 4K HDR Sony camping video brings my poor machine to its knees. Constant stutters in QuickTime.

UNTIL...

I plug in my Radeon RX 580 external GPU. I guess Apple must be using hardware acceleration on the eGPU, because that same video plays back beautifully on an external monitor with the eGPU attached. That makes me feel a lot more at ease about not having the 2017 model: 10-bit HEVC acceleration is the only substantial upgrade, and the eGPU, which I bought for other reasons anyway, fixes it for my 2016 model.
 