Just updated to High Sierra, but I'm barely able to play H.265 files. What the hell is the issue?
 
What hardware? What software? What file?
I just installed High Sierra. Didn't update anything else. Nothing seems different.

The same IINA player is still using the same resources as it did in Sierra.

I tried with Jellyfish 4K 10-bit HEVC (140 Mbps) and Planet Earth II S01E01 HEVC 2160p UHD BluRay HDR DTSHD5.1-DDR (45 Mbps, 10-bit).

The information window is still showing "Hw Decoder: No" as opposed to "Hw Decoder: videotoolbox".

I guess the players will be updated soon, right?

[Attached screenshot: Screen Shot 2017-09-26 at 8.55.29 AM.png]
 
That player doesn't support hardware HEVC decode yet. I have no idea when it will get it.
That's upsetting. I thought the players were ready for hardware decode, and the OS was what we were waiting for.

Most of my movies are in an MKV container, which QuickTime can't play. What are you guys doing to check whether hardware decode is here?

I saw your post about QuickTime using hardware decode on your m3 MacBook. Did you try any media file in a player other than QT (like MPV, VLC, or IINA) to check hardware decode?
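
Incidentally, for anyone who wants a programmatic check rather than eyeballing player stats: High Sierra's VideoToolbox framework gained VTIsHardwareDecodeSupported(), which asks the OS directly whether the current machine can hardware-decode a codec. A minimal Swift sketch; note it answers per codec, not per profile, so it won't distinguish 8-bit from 10-bit HEVC:

[CODE]
import VideoToolbox

// VTIsHardwareDecodeSupported is new in macOS 10.13 (High Sierra).
// It reports codec-level support only; a "yes" for HEVC on Skylake
// still means 8-bit (Main profile) only.
let hevc = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
let h264 = VTIsHardwareDecodeSupported(kCMVideoCodecType_H264)
print("HEVC hardware decode: \(hevc ? "yes" : "no")")
print("H.264 hardware decode: \(h264 ? "yes" : "no")")
[/CODE]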
 
I've only just used IINA for the very first time, so I don't have a Sierra comparison to make. In Sierra I only ever tried opening the files with VLC, which won't open either file in High Sierra.

Yeah, that's the problem. I did the same test with a similar file and IINA on my Late 2013 Mac Pro, both before and after I updated to High Sierra. No change.

So, @EugW is correct. Unless you have a Skylake or Kaby Lake CPU, you're not going to get hardware playback of these H.265 files, even after a High Sierra/iMovie/FCPX update. And unless you have Kaby Lake, you're not going to hardware-encode them for export, either. Regardless of dGPU :(
My 2016 15" CTO TB MBP with max CPU and GPU cannot play the Jellyfish 4k H265 file with VLC. It doesn't even attempt to open with Quicktime. Gonna try IINA now, just heard about it reading this thread!

Edit: Jellyfish 120 Mbps 4K 10-bit opens fine with IINA!
Sony 4K HDR Camp.mp4 opens with IINA, but it stutters a bit.

Edit 2: Sometimes Jellyfish stutters as well. IINA does not use my discrete GPU by default; I have to force it to. By default it uses the integrated GPU, which sits at around 45-50% usage playing the file while CPU usage hits around 90-96%. On the discrete GPU it uses around 10-15% GPU and CPU usage is around 45-50% or so. Can't say I'm too impressed by this performance, tbh. So much for the discrete card handling H.265 in High Sierra; I guess they never said it would handle 4K 10-bit H.265!

You mention Jellyfish H.265 files, but I only see H.264 versions?
 
The Late 2016 MacBook Pro's AMD Radeon Pro GPUs do support HEVC 10-bit hardware decoding. This means the vast majority of HEVC videos (which are encoded in 10-bit) could benefit from hardware decoding on these machines and be watched flawlessly, compared with software decoding, which makes playback sluggish.

When playing HEVC 10-bit videos in Windows 10 via Boot Camp, hardware decoding on the AMD Radeon Pro GPU is used and the video plays beautifully, with very low CPU usage.

macOS High Sierra does not use the AMD Radeon Pro GPU's video hardware decoding capabilities at all; it only uses the Intel Core processor's. This means Late 2016 MacBook Pros, which have Intel Skylake processors, can only hardware-decode 8-bit HEVC video, leaving 10-bit files to software decoding and thus almost unplayable.

I created a petition asking Apple to issue a macOS High Sierra software update that enables HEVC 10-bit hardware decoding on the AMD Radeon Pro GPUs, so the Late 2016 MacBook Pro can hardware-decode 10-bit HEVC.

Please sign here and let's put pressure on Apple!
https://www.ipetitions.com/petition/hevc-10-bit-hardware-decoding-in-macbook-pro-2016
 
You mention Jellyfish H.265 files, but I only see H.264 versions?

I have a 4K 10-bit HEVC Jellyfish file I downloaded months ago (forget where from) just to test the 2016 TB MBP's ability to run it smoothly.


Signed. I'll be pleasantly surprised if Apple actually updates the OS to better support its own less-than-a-year-old MBPs.
 

Sucks. You say the "vast majority of HEVC videos (coded in 10-bit)"... do you mean that most HEVC videos/captures are normally 10-bit, as opposed to 8-bit? Do we know if the iPhone 8/8 Plus HEVC captures are 10-bit or 8-bit?
 
Most 4K HEVC is probably 8-bit, but there is a lot of 10-bit stuff too. The iPhone 7/7 Plus output 8-bit, but I don't know what the 8/8 Plus/X output; my guess is 8-bit. For some reason, Apple isn't utilizing Kaby Lake's ability to hardware-encode 10-bit, and I suspect the Ax chips haven't been built for it either. But that is just a guess.
 
Now... on my system, a 2017 MBP 13" TB, the 60 fps, 70 Mbps 4K HEVC Sony Camp file plays super fine with hardware acceleration. It doesn't break a sweat at all, playing the file at less than 50% load on one CPU core.

ONLY in QuickTime Player, though, because it's the only player with hardware decoding as of now.

The other day I noticed something interesting that a fellow MR reader mentioned in his post: QuickTime uses significantly fewer resources.

I bought iStat Menus 2 days ago and have been using the menu item that shows the remaining battery time. Though I knew what to expect, I was surprised to see how much playing a 2-hour film in IINA differs from playing the same file in QT, strictly measuring battery life.

So I've decided to go with QT for now, as much as possible, until some other player comes along that uses a similar amount of CPU (and in turn battery). There are major limitations to QT, though, that I've found so far.

Does anybody know how to access/edit/delete the controls/settings of QuickTime Player?
 
I just found this thread after doing my own testing. I got a 2017 15" MacBook Pro yesterday and thought I'd open some 2160p x265 Planet Earth footage, since my old iMac couldn't handle it. I knew Kaby Lake's hardware decode is supported in High Sierra, so I expected playback to be buttery smooth. I opened the file in VLC and it stuttered really badly! So I wondered whether some MP4 footage would fare any better in QuickTime, and guess what: x265 2160p runs perfectly in QT. Then I opened the same file in VLC and got the same stuttering the MKV Planet Earth file had.

So I sort of conclude that VLC hasn't been updated to support x265 hardware decoding on Macs, or High Sierra's new software decode on older Macs. Who knows?
 
Everyone's different, but the main factor to consider is how everything in your media ecosystem will grow. For example, whenever I record video of even just my kids on my iPhone, I use the best option that is available on it. Right now, with my iPhone 7 Plus and the iOS 11 beta, all my video is 4K HEVC 8-bit. Would I downgrade to 1080p H.264 just because my other machines can't handle 4K HEVC? No, I'd rather get machines that can play it natively.

That is completely unplayable on my old MacBook Pro. On more recent MacBook Air machines it can be played back, for example, but they still show moderate CPU usage (and thus decreased battery life). In this instance, Skylake MacBook Pros (2016) are fine because they decode 8-bit 4K HEVC in hardware. These files also work fine on the 2016 and 2017 MacBooks. So, in that context, would you actually recommend the MacBook Air or a refurb 2015 MacBook (which can't play it properly)?

Also, these videos don't work at all on my iPad Air 2, even with iOS 11, because the task is simply too much for its triple-core A8X. Overall the iPad Air 2 is a great little device, and in some ways it's better than the most recent iPad, but if someone were to ask me which to buy today, I'd say the newer iPad, no question, simply because it has 8-bit hardware HEVC decoding built into its dual-core A9.

May I ask you a question?
I'm still not understanding 4K and whether I can play it on my Mac. Some articles I read say my Mac can't play it; others say it can. I have a 2016 MacBook Pro Touch Bar 15". Will I only be able to view 4K video via an external monitor, or can it run natively on my screen?

This whole topic is so confusing to me.
 
Your 2016 can try to play any HEVC video via software. However, for some 4K, software will be too slow.

To play it cleanly, the ideal is hardware playback. Your 2016 is capable of playing 8-bit 4K in hardware (at least for files that are compatible with QuickTime).

However, Apple does not support hardware playback of 10-bit 4K on 2016 Macs. So for most 10-bit 4K, your 2016 Mac will fail to play it back cleanly. It can try, but it will stutter like mad, unfortunately, because it is software playback only.

tl;dr:

If the file is compatible with QuickTime, then on your 2016 MacBook Pro:

8-bit 4K HEVC playback should be fine (hardware playback)
10-bit 4K HEVC playback will be attempted but it will likely play poorly with lots of stutters (software playback)
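
If you want to check which codec a QuickTime-compatible file actually uses, here is a rough Swift sketch with AVFoundation (the file path is a placeholder I made up). Caveats: AVFoundation won't open MKVs, and distinguishing Main (8-bit) from Main 10 (10-bit) HEVC still needs something like MediaInfo:

[CODE]
import AVFoundation

// Placeholder path; AVFoundation only opens QuickTime-compatible
// containers (MOV/MP4), not MKV.
let asset = AVAsset(url: URL(fileURLWithPath: "/Users/me/Movies/sample.mp4"))

for track in asset.tracks(withMediaType: .video) {
    for case let desc as CMFormatDescription in track.formatDescriptions {
        // Four-character codec code: 'hvc1'/'hev1' = HEVC, 'avc1' = H.264.
        let fourCC = CMFormatDescriptionGetMediaSubType(desc)
        let code = String((0..<4).map { i in
            Character(UnicodeScalar(UInt8((fourCC >> ((3 - i) * 8)) & 0xFF)))
        })
        let dims = CMVideoFormatDescriptionGetDimensions(desc)
        print("Codec: \(code), \(dims.width)x\(dims.height)")
    }
}
[/CODE]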
 
Let me clarify this for you. I love doing this. :p

4K (3840x2160, also referred to as 2160p) is four times the resolution of 1920x1080 (2x the width and 2x the height), but there are other things in play here too, like high dynamic range and bit depth. With the larger resolution and extra data, file sizes become huge, so 4K files are typically very resource-intensive; if the file has a higher bitrate or frame rate, even more so.
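
To put numbers on that: 3840 x 2160 is 8,294,400 pixels per frame versus 2,073,600 for 1920x1080, exactly four times as many. Bitrate translates directly into file size, too: the 45 Mbps Planet Earth file mentioned earlier in this thread works out to about 5.6 MB per second, so a two-hour film at that bitrate would be roughly 40 GB.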

A bit of history: back in 2007, IIRC, H.264 became popular, and it remains one of the most widely used codecs to this day. Almost all (I think) CPUs with integrated graphics today have a dedicated hardware block that handles encoding/decoding of H.264 files so the CPU cores don't take the load. This is what's meant by support for "hardware en/decoding". If a system has hardware support, the CPU stays relatively idle; if it doesn't, the whole job is done in software and the CPU is under heavy stress. So even an H.264-encoded 4K file might not play properly if the CPU is weak or struggles to cope with the heavy 4K resolution.

[Maybe you already know all this better than I do. In that case, just ignore the part above.]

Anyway, even with that codec, encoding a 4K file at a decent bitrate takes huge space. To get around this, a new codec was developed a couple of years ago: H.265 (aka High Efficiency Video Coding, or HEVC), the successor to H.264. It reduces file size drastically, which is why it's so popular for 4K media, but it is a very resource-heavy algorithm to decode.

Now, as in the old days, the good people at the CPU manufacturers decided to put dedicated hardware into the integrated GPU to handle this immense load: decoding/encoding the new HEVC codec in hardware. Intel's 6th-gen Skylake processors had this first, with native 8-bit HEVC hardware support, and the 7th-gen Kaby Lake processors have full 10-bit HEVC hardware support. 8-bit vs 10-bit just refers to colour bit depth, i.e. how many shades each colour channel can represent.

[Again, if you already knew all this, just ignore it.]

Nonetheless, most of the 4K stuff I've seen so far is 10-bit. Your 15" 2016 MBP has a Skylake CPU, so it can handle 8-bit 4K HEVC just fine; the hardware does all of that. But for 10-bit, the CPU has to take a significant load to handle the decoding in software, so you will see high CPU usage, faster battery drain, and a warmer laptop. Now, remember that your Mac also has a dedicated GPU which can handle 10-bit 4K HEVC just fine, because it has its own hardware HEVC block, but macOS doesn't support using it yet. Windows does, so you can use it via Boot Camp.

Whether you watch the 4K movie on your tiny laptop screen, feed it to an OLED TV, or project it in a gigantic theatre doesn't matter from the CPU's perspective (well, it matters, but not in this context of HEVC support). If it plays on your monitor, it will play wherever you feed it, and vice versa.

Hope that cleared up some of your confusion. :)

EDIT: Please correct me if I've messed up the timeline anywhere or gotten any terminology wrong.
 

WOW - THANK YOU BOTH!!

Definitely cleared up those questions and I appreciate this a lot!! :):)
 

Wow, @Sovon Halder. If most of the 4K files you saw were 10-bit, then that means the 2016 MacBook Pro with Skylake is already obsolete for the 4K future???


The thing that is really frustrating is that my 15" 2016 has that dedicated GPU which can do all that decoding, but Apple doesn't support it, making my MBP not future-proof.
 
Unfortunately yes, if 10-bit 4K HEVC is important to you, then you're out of luck.
 

I think 4K is becoming a standard. But in your experience, what's the usual bit depth for 4K if you purchase it from iTunes or other sources?

And why do you think Apple won't support the dGPUs in the 2016 15" MBPs?
 
Because they didn't say they would. They said you need a 2017 model or later.
 

You mean bit depth. Physical UHD Blu-ray discs are encoded in 10-bit HEVC to support the Rec. 2020 colour space. Not entirely sure what iTunes or Netflix does. Streaming 4K is worse than UHD Blu-ray anyway; some might say a 30 Mbps H.264 1080p stream looks better than a 15 Mbps H.265 4K one on fast scenes or scenes that are mostly dark. Resolution isn't everything.
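
Back-of-the-envelope math supports that: at 24 fps, 30 Mbps spread over 1080p is roughly 30,000,000 / (2,073,600 x 24) ≈ 0.60 bits per pixel, while 15 Mbps over 4K is roughly 15,000,000 / (8,294,400 x 24) ≈ 0.075 bits per pixel. That is eight times fewer bits per pixel, more than HEVC's roughly 2x efficiency advantage over H.264 can make up.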

And only Apple knows why they intentionally chose not to use the hardware they put in their own computers. Business tricks, maybe.
 

Omg, I hope it's not some business trick; that would be a dark turn of events.

Do you yourself have a 2016 15" model?
 

I don't like what Apple is doing either. They should never have released MacBooks with Skylake CPUs at the end of 2016 in the first place. And for the people who adopted the newly designed Macs back then, the launch of the revised 2017 specs was just an upsetting event. At least I would be upset if I had purchased a 2016 model only to find out later how I'd messed up.

I'm from India; we have to wait about a month for new stuff to launch here. I bought a stock-configuration 2017 13" MacBook Pro last month, with discounts and all. I'd been planning to buy a MacBook for about a year, because my older system was a real pain in the butt. The products launched in India in December 2016, but I missed some good deals, and as I waited a couple of months there was news of new-generation Macs with Kaby Lake coming in April.

Back then I didn't know anything about 4K, UHD, or Skylake/Kaby Lake in MacBooks. But when I found out, I felt very relieved about not purchasing earlier. All I do is watch movies, listen to music, and chat online. I'm a designer/developer by profession, and I kind of have OCD (in a good way). I guess all designers have OCD to some extent.

Nevertheless, when Apple announced at their event that they would officially support external GPUs and add native HEVC support to Macs, many people were happy. I was happy too. But Apple is very particular about their vocabulary. During the presentation everything seemed dandy, but there were hidden points and missing topics. Later we found out that High Sierra would use hardware for HEVC, but only the integrated GPUs. Why? I dunno; maybe they have valid reasons I don't understand. So people who already felt bad about buying a system that got revised just six months later, right when they saw light at the end of the tunnel, thinking their system would support hardware HEVC in macOS after all because it has a dedicated GPU, found out their hopes were ruined because Apple chose to go with the integrated GPUs only. Why?? I don't fracking know. I feel bad too. It's not right. They should at least acknowledge that they will do something about this in the future, whenever that is. My view is that when I'm paying a year's worth of savings for a product, I'm buying an experience; I need to feel happy looking at it and using it. And that isn't the case for 2016 MacBook Pro owners.
 

That's exactly it, man, you hit the nail on the head! Mine is the 2016 15" with full specs. Extreme buyer's remorse. T_T I'm a designer too, OCD as ****.
 