
doctor93

macrumors newbie
Original poster
Jun 26, 2017
6
1
Hello! I am going to buy a new 256GB MacBook Pro 13" (non-Touch Bar) from the United States.
My use is quite simple: browsing, email, movies, music, and occasional photo and video editing (no gaming).
I am torn between the 2016 model with Skylake and the 2017 model with Kaby Lake.
The price difference is $180, for which I get the newer Kaby Lake processor and higher-frequency RAM (2133 MHz vs. 1866 MHz).
Is the $180 difference worth paying, or should I save the money and get the older 2016 model?

Apple 13.3" MacBook Pro (Mid 2017, Space Gray) MPXT2LL/A
or
Apple 13.3" MacBook Pro (Space Gray, Late 2016) MLL42LL/A
 

doctor93

macrumors newbie
Original poster
Jun 26, 2017
6
1
I think it is worth it to go for the Kaby Lake model. Plus, there may be undocumented changes in the manufacturing process since then, which improve build quality as the process has matured.
I agree about the more mature process in the newer one, but getting the Skylake at $1,299 feels like a great deal. Is there any significant difference for my usage?
 

aevan

macrumors 601
Feb 5, 2015
4,455
7,090
Serbia
I agree about the more mature process in the newer one, but getting the Skylake at $1,299 feels like a great deal. Is there any significant difference for my usage?

Short answer: no.

Long answer: no, there isn't any significant difference for your usage. Maybe an extra hour of battery life if you watch a lot of 4K movies; otherwise, everything is the same. If you have a good deal on the 2016 model, go for it.
 

doctor93

macrumors newbie
Original poster
Jun 26, 2017
6
1
Short answer: no.

Long answer: no, there isn't any significant difference for your usage. Maybe an extra hour of battery life if you watch a lot of 4K movies; otherwise, everything is the same. If you have a good deal on the 2016 model, go for it.
I am mostly going to use it indoors, and I am not into 4K movies or editing. Everything I have is 720p or 1080p (my DSLR video recordings, my 49" LED TV).
 

nph

macrumors 65816
Feb 9, 2005
1,045
214
I can just add that the keyboard is likely to be improved a bit on the 2017, based on reports and my own experience. There could be other minor internal improvements as well. But again, is it worth 20%...?
 

New_Mac_Smell

macrumors 68000
Oct 17, 2016
1,931
1,552
Shanghai

Err, top of the page:

"Apple already updates the MacBook Pro 13 after around 8 months. Modern Kaby Lake processors and faster SSDs"

Apple never said anything about SSDs on the Pro, pretty useless journalism right there.

"Our test model is equipped with the 256 GB PCIe-SSD with the designation AP0256. Not only the designation is similar to the predecessor, we can also see similar performance results"

You don't say? Could it be because it's actually the same? When they can't even be bothered to check basic facts, I'm not going to trust anything else they say.

OP: Get the newest one, then you don't have to worry. It's not necessarily worth the cost as an upgrade, and if you can get a good deal on a 2016 it'll be fine. But if I were buying today I'd want a new machine and not an 8-month-old one, at least not for the sake of $180.
 

EugW

macrumors G5
Jun 18, 2017
14,189
11,961
For a $180 price difference, IMO, the choice is clear. Get the 2017 model unless your budget is really, really tight. It has all the latest media upgrades, some of which are absent on the Skylake model. Given that Apple is building their media ecosystem around these features, it would be risky to go with the lower model just to save $180, esp. when the 2017 is faster too.

Put it this way, one poster in another thread was trying to play back 10-bit HEVC in High Sierra and it was completely unwatchable on his Skylake model. The 8-bit HEVC worked perfectly without high CPU usage. Why? Because Skylake has 8-bit HEVC decode support in hardware but does not have 10-bit decode support in hardware. Kaby Lake has both.

If you had Skylake already, I would not recommend upgrading, but since you are buying now, I think the choice is easy to go for Kaby Lake.
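
If you want to check what your own machine reports, here's a minimal sketch using VideoToolbox's VTIsHardwareDecodeSupported (a real macOS 10.13 API; note it only reports codec-level support, so it can't distinguish 8-bit Main from 10-bit Main10):

```swift
import CoreMedia
import VideoToolbox

// Minimal sketch: ask VideoToolbox whether this Mac has a hardware
// decode path for a given codec. Requires macOS 10.13 (High Sierra).
// Note: the check is codec-level only, so it cannot distinguish
// 8-bit (Main) from 10-bit (Main10) HEVC profiles.
let codecs: [(name: String, type: CMVideoCodecType)] = [
    ("H.264", kCMVideoCodecType_H264),
    ("HEVC",  kCMVideoCodecType_HEVC),
]

for codec in codecs {
    let hw = VTIsHardwareDecodeSupported(codec.type)
    print("\(codec.name): hardware decode \(hw ? "available" : "not available")")
}
```

On a Kaby Lake machine under High Sierra you'd expect HEVC to report hardware decode; the 8-bit vs. 10-bit split described above happens at the profile level, below what this call exposes.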
 

DarkSel

macrumors 6502
Dec 22, 2012
278
81
For a $180 price difference, IMO, the choice is clear. Get the 2017 model unless your budget is really, really tight. It has all the latest media upgrades, some of which are absent on the Skylake model. Given that Apple is building their media ecosystem around these features, it would be risky to go with the lower model just to save $180, esp. when the 2017 is faster too.

Put it this way, one poster in another thread was trying to play back 10-bit HEVC in High Sierra and it was completely unwatchable on his Skylake model. The 8-bit HEVC worked perfectly without high CPU usage. Why? Because Skylake has 8-bit HEVC decode support in hardware but does not have 10-bit decode support in hardware. Kaby Lake has both.

If you had Skylake already, I would not recommend upgrading, but since you are buying now, I think the choice is easy to go for Kaby Lake.

You are right that the integrated graphics on Skylake cannot play back HEVC 10-bit in hardware. However, the Radeon Pros in both the 2016 and 2017 15-inch models are capable of HEVC 10-bit. Only people with the 13-inch 2016 models have to worry.
 

leman

macrumors Core
Oct 14, 2008
19,335
19,373
Err, top of the page:

"Apple already updates the MacBook Pro 13 after around 8 months. Modern Kaby Lake processors and faster SSDs"

Apple never said anything about SSDs on the Pro, pretty useless journalism right there.

"Our test model is equipped with the 256 GB PCIe-SSD with the designation AP0256. Not only the designation is similar to the predecessor, we can also see similar performance results"

You don't say? Could it be because it's actually the same? When they can't even be bothered to check basic facts, I'm not going to trust anything else they say.

Yep, they made a basic mistake here, but in all honesty, notebookcheck is the only reviewer right now that actually bothers with in-depth, sane benchmarking. I definitely trust them much more than any other reviewer.
 

EugW

macrumors G5
Jun 18, 2017
14,189
11,961
You are right that the integrated graphics on Skylake cannot play back HEVC 10-bit in hardware. However, the Radeon Pros in both the 2016 and 2017 15-inch models are capable of HEVC 10-bit. Only people with the 13-inch 2016 models have to worry.
Apple is not supporting HEVC 10-bit hardware decode on AMD GPUs.

And just to confirm, the guy who tried the 10-bit videos in High Sierra had an AMD Radeon Pro 460 GPU in his 2.9 GHz Skylake MacBook Pro. It was a stuttery mess for 10-bit 4K HEVC in QuickTime, but 8-bit was perfectly smooth. He even tried forcing usage of the AMD GPU, to no avail.
 

New_Mac_Smell

macrumors 68000
Oct 17, 2016
1,931
1,552
Shanghai
Apple is not supporting HEVC 10-bit hardware decode on AMD GPUs.

And just to confirm, the guy who tried the 10-bit videos in High Sierra had an AMD Radeon Pro 460 GPU in his 2.9 GHz Skylake MacBook Pro. It was a stuttery mess for 10-bit 4K HEVC in QuickTime, but 8-bit was perfectly smooth. He even tried forcing usage of the AMD GPU, to no avail.

Why would they not support it? It's up to software makers to make use of it, whether it's on the CPU or GPU. QuickTime is not a good player to use for that stuff and shouldn't be looked at for 'cutting edge' codec support. Most Intel chips going back a few generations also support hybrid hardware/software decoding through the iGPU (not so much on the Mac, though). Basically, Kaby Lake allows for more efficient encoding, but in real terms there's little difference in decoding.
 
  • Like
Reactions: macintoshmac

jrasero

macrumors regular
Feb 26, 2011
114
9
NYC
Your usage is very similar to mine, and I got a 2017 MBP non-TB i7 16GB 128GB.

Performance-wise there's no difference on basic tasks; one could argue you could get away with a MacBook Air or MacBook. The only thing that would really push it would be video editing.

Battery life is a good hour or so better with the 2017, which in itself might be worth it.

Maturation: the keyboard, which many on here have bantered about, has not changed. My "W" key is sometimes extra clicky, whether the laptop is hot or cold, and doesn't have the same travel feel as the rest of the keys, but it is fully functional.

Honestly, if Apple had the refurbished 13" non-TB i7 16GB I would have gotten that over the 2017. Now, if you're looking into the 15", I'd say Skylake is worth it.
 

EugW

macrumors G5
Jun 18, 2017
14,189
11,961
Why would they not support it? It's up to software makers to make use of it, whether it's on the CPU or GPU. QuickTime is not a good player to use for that stuff and shouldn't be looked at for 'cutting edge' codec support. Most Intel chips going back a few generations also support hybrid hardware/software decoding through the iGPU (not so much on the Mac, though). Basically, Kaby Lake allows for more efficient encoding, but in real terms there's little difference in decoding.
We're not talking about just movie players. We're talking about stuff like iMovie and Photos video export and anything else QT might be involved with in macOS, including Apple's pro applications when they're updated. HEVC support is top to bottom in High Sierra, but you won't get that for 10-bit HEVC if you have Skylake, regardless of what other GPU you have. Apple simply isn't supporting them for this.

[Attached image: hevc2.jpg, Apple's WWDC slide on HEVC support]


BTW, Apple isn't supporting VP9 either, so any VP9 decoding in, say, Chrome is purely software based, even on Kaby Lake Macs, which fully support VP9 decoding in hardware. Early reports indicate this is also true on High Sierra. No hardware support for VP9, regardless of the hardware or software used.

Hell, even on the PC side, look at Netflix 4K on an AMD GPU. It can support it both from the hardware HEVC decoding perspective and for DRM reasons, but it's taken them forever to implement this on anything other than Kaby Lake.

Now it's possible that eventually Apple will add support for AMD GPUs, but my guess is they won't, because there isn't really a good reason for them to. They'd rather just sell you a Kaby Lake Mac.
 
  • Like
Reactions: nol2001 and kazmac

New_Mac_Smell

macrumors 68000
Oct 17, 2016
1,931
1,552
Shanghai
We're not talking about just movie players. We're talking about stuff like iMovie and Photos video export and anything else QT might be involved with in macOS, including Apple's pro applications when they're updated. HEVC support is top to bottom in High Sierra, but you won't get that for 10-bit HEVC if you have Skylake, regardless of what other GPU you have. Apple simply isn't supporting them for this.

[Attached image: hevc2.jpg, Apple's WWDC slide on HEVC support]


BTW, Apple isn't supporting VP9 either, so any VP9 decoding in, say, Chrome is purely software based, even on Kaby Lake Macs, which fully support VP9 decoding in hardware. Early reports indicate this is also true on High Sierra. No hardware support for VP9, regardless of the hardware or software used.

Hell, even on the PC side, look at Netflix 4K on an AMD GPU. It can support it both from the hardware HEVC decoding perspective and for DRM reasons, but it's taken them forever to implement this on anything other than Kaby Lake.

Now it's possible that eventually Apple will add support for AMD GPUs, but my guess is they won't, because there isn't really a good reason for them to. They'd rather just sell you a Kaby Lake Mac.

But the issue is over hardware support. It could be that Apple chooses not to make use of the GPU, but we won't know until High Sierra actually comes out. Hybrid decoding has been around since Broadwell, maybe? Which is why it says "All Macs" for software decode in that chart. Skylake is more than capable of encoding/decoding 10-bit; it's just that Kaby Lake is more efficient with it, so it will be better on battery and CPU usage. And from a working perspective, you'll be able to encode stuff just fine on a lot of Macs.

Real-time playback, I feel, is the concern people are having? Which is fair, but we really need to wait for release to find that out. Technically, though, there's no reason Skylake would have an issue with it, regardless of AMD support.

For what it's worth, it doesn't bother me at all, so I'm not trying to defend anything. And as a question: are people really watching 10-bit 4K content on their MacBook Pros, which have 8-bit '3K' screens? If that's the case, it seems a little OTT to me. I suppose you could port it out, but why wouldn't you use a $100 Blu-ray player instead of faffing with cables?
 
  • Like
Reactions: macintoshmac

EugW

macrumors G5
Jun 18, 2017
14,189
11,961
But the issue is over hardware support. It could be that Apple chooses not to make use of the GPU, but we won't know until High Sierra actually comes out. Hybrid decoding has been around since Broadwell, maybe? Which is why it says "All Macs" for software decode in that chart. Skylake is more than capable of encoding/decoding 10-bit; it's just that Kaby Lake is more efficient with it, so it will be better on battery and CPU usage. And from a working perspective, you'll be able to encode stuff just fine on a lot of Macs.
Well, not really. That's not correct, unless you're talking about software decoding/encoding. Skylake has full 8-bit hardware decode, but only hybrid 10-bit decode. However, Apple seems to have chosen not to bother with hybrid decoding on Skylake. It's either hardware 8-bit decoding or software for 10-bit.

For encoding, Skylake has partial hardware encoding support for 8-bit, and no hardware encoding at all for 10-bit. It must be done in software.

Software works well, but it is very CPU intensive, and if you have to use it all the time you'll get loud fans and crappy battery life, as well as lousy multitasking.
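
To put that in concrete terms, here's a rough sketch of what an HEVC export looks like with the new High Sierra API; the preset name is real (AVAssetExportPresetHEVCHighestQuality, macOS 10.13+), but the file paths are hypothetical placeholders. The same call presumably picks up hardware assistance on Kaby Lake and falls back to the CPU-heavy software encoder elsewhere:

```swift
import AVFoundation

// Rough sketch of an HEVC export under High Sierra. The preset
// AVAssetExportPresetHEVCHighestQuality is a macOS 10.13 API; on
// Kaby Lake the encode can be hardware-assisted, while older chips
// fall back to the CPU-heavy software encoder. Paths are placeholders.
let input  = URL(fileURLWithPath: "/tmp/input.mov")       // hypothetical
let output = URL(fileURLWithPath: "/tmp/output-hevc.mov") // hypothetical

let asset = AVAsset(url: input)
if let session = AVAssetExportSession(asset: asset,
                                      presetName: AVAssetExportPresetHEVCHighestQuality) {
    session.outputURL = output
    session.outputFileType = .mov
    session.exportAsynchronously {
        print("Export status: \(session.status.rawValue)")
    }
}
```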

If you are not aware, that slide I posted above is Apple's slide, indicating what they do and what they don't support. It's straight from the horse's mouth: a slide from one of the developer sessions at WWDC.

Real-time playback, I feel, is the concern people are having? Which is fair, but we really need to wait for release to find that out. Technically, though, there's no reason Skylake would have an issue with it, regardless of AMD support.
As mentioned, Apple does not support hardware decode of 10-bit HEVC on Skylake, period. This has already been confirmed by some users running the High Sierra beta. And one shouldn't expect this to change, considering that Apple has already flat out stated they don't support hardware decode of 10-bit HEVC on Skylake.

For what it's worth, it doesn't bother me at all, so I'm not trying to defend anything. And as a question: are people really watching 10-bit 4K content on their MacBook Pros, which have 8-bit '3K' screens? If that's the case, it seems a little OTT to me. I suppose you could port it out, but why wouldn't you use a $100 Blu-ray player instead of faffing with cables?
It's a Pro laptop, after all. Why wouldn't you want to be able to import a 4K 10-bit file and scrub through it on your pro video editing machine, for example?

Plus, if you happen to have a 4K 10-bit file, it's not as if you're going to transcode it to 8-bit 1080p first before you watch it.

BTW, these were the same arguments made when h.264 was first introduced by Apple. Now look at where we are with h.264. If your computer doesn't support h.264 hardware decode, it's a real PITA even just for surfing.

---

So in 2017, if all that is holding you back from getting a Kaby Lake machine over a Skylake machine is $180, that's not a very good reason, unless you have a really, really tight budget.
 
  • Like
Reactions: kazmac and MrGuder

MrGuder

macrumors 68040
Nov 30, 2012
3,026
2,012
Well, not really. That's not correct, unless you're talking about software decoding/encoding. Skylake has full 8-bit hardware decode, but only hybrid 10-bit decode. However, Apple seems to have chosen not to bother with hybrid decoding on Skylake. It's either hardware 8-bit decoding or software for 10-bit.

For encoding, Skylake has partial hardware encoding support for 8-bit, and no hardware encoding at all for 10-bit. It must be done in software.

Software works well, but it is very CPU intensive, and if you have to use it all the time you'll get loud fans and crappy battery life, as well as lousy multitasking.

If you are not aware, that slide I posted above is Apple's slide, indicating what they do and what they don't support. It's straight from the horse's mouth: a slide from one of the developer sessions at WWDC.


As mentioned, Apple does not support hardware decode of 10-bit HEVC on Skylake, period. This has already been confirmed by some users running the High Sierra beta. And one shouldn't expect this to change, considering that Apple has already flat out stated they don't support hardware decode of 10-bit HEVC on Skylake.


It's a Pro laptop, after all. Why wouldn't you want to be able to import a 4K 10-bit file and scrub through it on your pro video editing machine, for example?

Plus, if you happen to have a 4K 10-bit file, it's not as if you're going to transcode it to 8-bit 1080p first before you watch it.

BTW, these were the same arguments made when h.264 was first introduced by Apple. Now look at where we are with h.264. If your computer doesn't support h.264 hardware decode, it's a real PITA even just for surfing.

---

So in 2017, if all that is holding you back from getting a Kaby Lake machine over a Skylake machine is $180, that's not a very good reason, unless you have a really, really tight budget.
Good post.
I wonder whether, once Apple releases High Sierra, that's where we will see the Kaby Lake processors really show their efficiency, in slightly better battery life compared to the Skylake processors. I've been hearing people say the 2017 battery run times are the same as the 2016's; it's possible that once High Sierra kicks in we will see better times, especially when running video. I'm not talking hours of difference, but it should be enough for those with Kaby Lake to see some modest increases.
 

ls1dreams

macrumors 6502a
Aug 13, 2009
631
237
It's not just 4K movies. Hardware VP9 decoding means YouTube videos will play back with WAY lower CPU utilization, improving both heat and battery life. (Yes, even at 720p.)
 

EugW

macrumors G5
Jun 18, 2017
14,189
11,961
It's not just 4K movies. Hardware VP9 decoding means YouTube videos will play back with WAY lower CPU utilization, improving both heat and battery life. (Yes, even at 720p.)
Yeah. Kaby Lake supports hardware VP9 acceleration, but AFAIK it hasn't been implemented in Chrome for Macs yet, either in Sierra or High Sierra.

That's one reason why I tend to use Safari on Macs instead of Chrome.

Luckily though, VP9 is fairly easy to decode in software. Way, way, way easier than HEVC.
 

ls1dreams

macrumors 6502a
Aug 13, 2009
631
237
We're not talking about just movie players. We're talking about stuff like iMovie and Photos video export and anything else QT might be involved with in macOS, including Apple's pro applications when they're updated. HEVC support is top to bottom in High Sierra, but you won't get that for 10-bit HEVC if you have Skylake, regardless of what other GPU you have. Apple simply isn't supporting them for this.

[Attached image: hevc2.jpg, Apple's WWDC slide on HEVC support]


BTW, Apple isn't supporting VP9 either, so any VP9 decoding in, say, Chrome is purely software based, even on Kaby Lake Macs, which fully support VP9 decoding in hardware. Early reports indicate this is also true on High Sierra. No hardware support for VP9, regardless of the hardware or software used.

Hell, even on the PC side, look at Netflix 4K on an AMD GPU. It can support it both from the hardware HEVC decoding perspective and for DRM reasons, but it's taken them forever to implement this on anything other than Kaby Lake.

Now it's possible that eventually Apple will add support for AMD GPUs, but my guess is they won't, because there isn't really a good reason for them to. They'd rather just sell you a Kaby Lake Mac.

How is it possible for Apple to not support VP9 decoding in this case? Would Chrome be able to directly leverage the Intel chipset features? Or does macOS somehow limit this?

I personally don't care about 4K, but I would like to be able to stream 720p and 1080p VP9 YouTube videos with minimal CPU usage to save on heat and battery life. (I hate fans spinning.)
Luckily though, VP9 is fairly easy to decode in software. Way, way, way easier than HEVC.

I'm not sure that's true. VP9 software decoding on my old MacBook really crushes the CPU.
 

EugW

macrumors G5
Jun 18, 2017
14,189
11,961
How is it possible for Apple to not support VP9 decoding in this case? Would Chrome be able to directly leverage the Intel chipset features? Or does macOS somehow limit this?
I dunno. I was wondering myself. But the bottom line is that the hardware support in Chrome on the Mac just isn't there yet.

I personally don't care about 4K, but I would like to be able to stream 720p and 1080p VP9 YouTube videos with minimal CPU usage to save on heat and battery life. (I hate fans spinning.)
If you switch to Safari, you'll get the h.264 stream. It may be higher bitrate, but CPU usage will be minimal.

I'm not sure that's true. VP9 software decoding on my old MacBook really crushes the CPU.
Yeah, of course it depends on the machine. On a really old MacBook, I'd stick to h.264 (i.e., Safari). However, on my 2010 iMac, I have no problems. Mind you, that's partially because it's a quad-core i7 with Hyper-Threading. ;) OTOH, very high-res HEVC is a total disaster on the 2010 iMac.

Which MacBook? FWIW, on my MBP 13" 2009 2.26 GHz Core 2 Duo, everything in Safari up to 1080p h.264 is buttery smooth. However, I just got a 2017 MacBook Core m3 to replace it. :)

Then again, my 2008 2.4 GHz MacBook Core 2 Duo is terrible at everything, since it doesn't even have hardware h.264 decoding. However, I don't mind so much with that machine, since I bought it used for like 400 Canuck bucks or something like that many years ago. It's a kitchen surfing and recipe machine, and I don't mind so much if it gets covered in flour and chocolate. 720p h.264 is good enough for that purpose.
 

New_Mac_Smell

macrumors 68000
Oct 17, 2016
1,931
1,552
Shanghai
Well, not really. That's not correct, unless you're talking about software decoding/encoding. Skylake has full 8-bit hardware decode, but only hybrid 10-bit decode. However, Apple seems to have chosen not to bother with hybrid decoding on Skylake. It's either hardware 8-bit decoding or software for 10-bit.

For encoding, Skylake has partial hardware encoding support for 8-bit, and no hardware encoding at all for 10-bit. It must be done in software.

Software works well, but it is very CPU intensive, and if you have to use it all the time you'll get loud fans and crappy battery life, as well as lousy multitasking.

If you are not aware, that slide I posted above is Apple's slide, indicating what they do and what they don't support. It's straight from the horse's mouth: a slide from one of the developer sessions at WWDC.


As mentioned, Apple does not support hardware decode of 10-bit HEVC on Skylake, period. This has already been confirmed by some users running the High Sierra beta. And one shouldn't expect this to change, considering that Apple has already flat out stated they don't support hardware decode of 10-bit HEVC on Skylake.


It's a Pro laptop, after all. Why wouldn't you want to be able to import a 4K 10-bit file and scrub through it on your pro video editing machine, for example?

Plus, if you happen to have a 4K 10-bit file, it's not as if you're going to transcode it to 8-bit 1080p first before you watch it.

BTW, these were the same arguments made when h.264 was first introduced by Apple. Now look at where we are with h.264. If your computer doesn't support h.264 hardware decode, it's a real PITA even just for surfing.

---

So in 2017, if all that is holding you back from getting a Kaby Lake machine over a Skylake machine is $180, that's not a very good reason, unless you have a really, really tight budget.

Out of curiosity, did you watch the HEVC/HEIF seminar, or did you just take that slide? I've seen that slide passed around a few times, is all. You have to remember we're right at the start, and trusting a beta as hard proof is very risky; it's a beta, after all, and not a polished, finalised release.

Essentially what Apple has said is: full H/W support for 10-bit on Kaby Lake, full 8-bit H/W support on Skylake, and full software support for anything that is getting High Sierra. They mention earlier in the seminar the differences between the decodability and playability of files, and note that there's a lot they still don't know and that they are relying on developers to optimise everything:

"So how do you make the determination that a format is suitable for playback on a given device? So AVFoundation, through its API, supports the notion of "isPlayable." And this indicates whether a device's video level supports the movie for playback. If true, you should experience smooth playback without incurring any significant power or [inaudible] cost for videos of extended durations.

For example, even though Apple captured 4K30 is decodable across all of our supported systems, it's unlikely to be marked as playable on some of our older hardware like the iPhone 5S. So this is a call out to developers.

It's really, really important, at this junction, for you to be observing these playable state to ensure we provide the best possible user experience.
"

As I mentioned, many previous Intel CPUs support hybrid decoding/encoding; I would be surprised if Apple's "software" path was doing this 100% without any assistance, as that'd be quite remarkable on older Macs. And the only thing in the way then was royalty fees, I believe, with it working on Windows but not macOS.

All I'm trying to point out is that currently Kaby Lake works better with 10-bit, but it is in no way doom and gloom for every previous Mac out there. That's based as much on what Apple has not said as on what they have, and on the fact that it's a very early beta. Of course, if you were a professional video producer working with 10-bit 4K you'd want the Kaby Lake, as it is better, and of course if buying today you'd want the Kaby Lake (as I said earlier), as it doesn't make sense to purchase an older computer for a minimal price difference. But until later in the year, when High Sierra comes out and people have a chance to use it, we cannot be as certain as you seem to be. It could go either way; as I said, it personally doesn't bother me, I'm just looking objectively at what is going around.
 

EugW

macrumors G5
Jun 18, 2017
14,189
11,961
Out of curiosity, did you watch the HEVC/HEIF seminar, or did you just take that slide? I've seen that slide passed around a few times, is all. You have to remember we're right at the start, and trusting a beta as hard proof is very risky; it's a beta, after all, and not a polished, finalised release.

Essentially what Apple has said is: full H/W support for 10-bit on Kaby Lake, full 8-bit H/W support on Skylake, and full software support for anything that is getting High Sierra. They mention earlier in the seminar the differences between the decodability and playability of files, and note that there's a lot they still don't know and that they are relying on developers to optimise everything:

"So how do you make the determination that a format is suitable for playback on a given device? So AVFoundation, through its API, supports the notion of "isPlayable." And this indicates whether a device's video level supports the movie for playback. If true, you should experience smooth playback without incurring any significant power or [inaudible] cost for videos of extended durations.

For example, even though Apple captured 4K30 is decodable across all of our supported systems, it's unlikely to be marked as playable on some of our older hardware like the iPhone 5S. So this is a call out to developers.

It's really, really important, at this junction, for you to be observing these playable state to ensure we provide the best possible user experience.
"

As I mentioned, many previous Intel CPUs support hybrid decoding/encoding; I would be surprised if Apple's "software" path was doing this 100% without any assistance, as that'd be quite remarkable on older Macs. And the only thing in the way then was royalty fees, I believe, with it working on Windows but not macOS.

All I'm trying to point out is that currently Kaby Lake works better with 10-bit, but it is in no way doom and gloom for every previous Mac out there. That's based as much on what Apple has not said as on what they have, and on the fact that it's a very early beta. Of course, if you were a professional video producer working with 10-bit 4K you'd want the Kaby Lake, as it is better, and of course if buying today you'd want the Kaby Lake (as I said earlier), as it doesn't make sense to purchase an older computer for a minimal price difference. But until later in the year, when High Sierra comes out and people have a chance to use it, we cannot be as certain as you seem to be. It could go either way; as I said, it personally doesn't bother me, I'm just looking objectively at what is going around.
I watched most of the seminar and skimmed the rest (online, though, as I am not a coder and was not at WWDC).

Bottom line: you are being far, far too optimistic about future Skylake support. When Apple says, "We will support this, but not this" for codec support on Macs, you should probably take it at face value, especially when we already know that Skylake is actually incapable of full 10-bit hardware decode; the silicon simply isn't there.

Software optimization will continue to improve, of course, but IMO you'd be foolish to buy a 2016 Skylake model today, in June 2017, if you have any intention of dealing with 10-bit HEVC video now or in the future, unless your budget is really, really tight or you're getting a killer deal on it. $180 off is not a killer deal. I see you agree on that part at least.

Lower-bitrate 10-bit HEVC will play fine on Skylake Macs with optimized software decoders, but CPU utilization will remain high and battery life will suffer. That's just the nature of the beast. That's exactly what happened with h.264, and it's likely what will happen with h.265. We are a decade later, and h.264 continues to suck on machines with no hardware h.264 support.

If you KNOW you will never deal with 10-bit 4K HEVC, then fine.
 
  • Like
Reactions: doctor93