The #1 complaint here is always about battery life. Letting the customer decide to drain their battery watching 4K would result in knee-jerk reactions here and threats of class-action lawsuits, and you know this to be true.
As already mentioned several times, battery life is not an issue with the models released 2017 or later, since they already have 4K HDR decode support fully implemented in hardware, with low CPU usage. (I'm excluding the 2017 MacBook Air though, since that uses 2015 hardware.)

While T2 may also do hardware HEVC decode, it isn't actually necessary for this purpose, since Apple already supports such hardware HEVC decode on Kaby Lake Macs.

The difference though is the T2 appears to support a more restrictive DRM layer than what is required on other platforms.

So I’ll be able to play 4K content on my 16” MacBook but not on my 2017 iMac 5K?
Yes, but please also note this:

I'm not sure how it applies to Netflix, but I wonder if it means it will be downsampled to 1080p on MacBook Pros.





Why do you hold Apple responsible for a poor decision by Netflix? Perhaps you should direct your anger at Netflix, as even my 2012 Retina MacBook Pro can play 4k video content. This isn't an Apple decision.
See above. Netflix doesn't require this on other platforms. This appears to be an Apple restriction.
 
My MacBook 13" from 2016 with a T1 chip can play YouTube 4K videos without a problem. Does HDR need that much processing power?
 
My MacBook 13" from 2016 with a T1 chip can play YouTube 4K videos without a problem. Does HDR need that much processing power?
Check out my HEVC thread I linked in this post:


With the 4K HDR HEVC demo files I was testing, using software playback, I would not be able to cleanly play them even with 100% CPU usage on a 2017 iMac Core i7-7700K. Hardware playback is definitely needed for 4K HDR HEVC streaming.

YouTube is not HEVC, but VP9.


Absolutely disappointing that we can't stream 4K Netflix on the beautiful 4K / 5K iMac displays. The "battery life" excuse doesn't fly on an iMac.
The "battery life" excuse doesn't fly on a 2017 MacBook Pro either. And it's moot on earlier machines because they don't have hardware support for this, and thus are too slow to support such playback.
 
 
Absolutely disappointing that we can't stream 4K Netflix on the beautiful 4K / 5K iMac displays. The "battery life" excuse doesn't fly on an iMac.
I think the battery life subject was invented somewhere in the comments. The article doesn't state anything about loss of battery life, especially since it mentions iMacs that obviously don't use battery.

From the article: "It's not clear why Macs need a T2 security chip to play back 4K HDR content, given that Windows machines obviously don't, but it could be that this is Netflix's way of ensuring that viewers aren't trying to stream the high-definition content on older Macs, which could result in less-than-stellar performance."
 
From the article: "It's not clear why Macs need a T2 security chip to play back 4K HDR content, given that Windows machines obviously don't, but it could be that this is Netflix's way of ensuring that viewers aren't trying to stream the high-definition content on older Macs, which could result in less-than-stellar performance."
I've already responded about this, but you seem to be ignoring those posts. This performance claim does not apply to the 2017 models. All of the 2017 models (except for the 2017 MacBook Air, which uses 2015 hardware) have full 4K HDR HEVC decode support in hardware already, so HEVC playback performance is not an issue at all on those machines.
 
Netflix does not support AMD GPUs on the Windows side for 4k/HDR. Could this have anything to do with it? Apple's discrete GPUs are all AMD-based.
 
I've already responded about this, but you seem to be ignoring those posts. This performance claim does not apply to the 2017 models. All of the 2017 models (except for the 2017 MacBook Air, which uses 2015 hardware) have full 4K HDR HEVC decode support in hardware already, so HEVC playback performance is not an issue at all on those machines.
No, I was not ignoring your post. I was addressing a member's comment misquoting the article about battery life loss; the article never gave battery life as the reason for needing the T2 chip.
 
Netflix does not support AMD GPUs on the Windows side for 4k/HDR. Could this have anything to do with it? Apple's discrete GPUs are all AMD-based.
No, because discrete GPUs are unnecessary in the first place. All recent Intel consumer CPUs include an integrated Intel GPU, and Netflix on Windows supports these iGPUs for hardware HEVC playback. The supported Intel iGPUs were first released in 2016 in Windows machines.

And as you know, all recent Macs (aside from Apple's Arm dev kit) are Intel. The only recent Macs that do not include an Intel iGPU are the iMac Pro and the Mac Pro.
 
Also wanted to mention that for the briefest period (Big Sur beta 5) Safari was supporting YouTube VP9 playback at 8K/4K to the built-in monitor on a Late 2013 rMBP, using the Intel Iris 5100. It was beautiful and did a decent job, and maxed out the GPU but left the CPU around 10-15% usage.

Don't know who initiated it, but as of Beta 6 and up the feature has been nerfed, much to the consternation of older Mac owners and Big Sur testers everywhere.

The point being Apple could support HEVC 4K playback on older machines going back to 2012, but they won't because they don't want to bother supporting older machines at that level.

I can agree with their decision to leverage the T2 chip for transcoding, as it leaves the CPU free to do other stuff, but I think totally ignoring older users just stinks.
 
Nobody can help me?

Hi, am I missing something? (Or did I miss the solution in this thread?)

- I have a MacBook Air 2020
- I have a Netflix UHD subscription
- I have macOS 11 beta 9
- I have Safari 11

How can I tell whether a title is playing in 4K/UHD?
Breaking Bad should be 4K, but I'm not seeing it when I play the title.

Can anybody tell me what I'm doing wrong?

Greetings
 
Also wanted to mention that for the briefest period (Big Sur beta 5) Safari was supporting YouTube VP9 playback at 8K/4K to the built-in monitor on a Late 2013 rMBP, using the Intel Iris 5100. It was beautiful and did a decent job, and maxed out the GPU but left the CPU around 10-15% usage.

Don't know who initiated it, but as of Beta 6 and up the feature has been nerfed, much to the consternation of older Mac owners and Big Sur testers everywhere.

The point being Apple could support HEVC 4K playback on older machines going back to 2012, but they won't because they don't want to bother supporting older machines at that level.

I can agree with their decision to leverage the T2 chip for transcoding, as it leaves the CPU free to do other stuff, but I think totally ignoring older users just stinks.
This is not correct.

HEVC decoding (Netflix and iTunes) is much, much harder than VP9 decoding (YouTube). While some older machines can do 4K VP9 (albeit using a LOT of CPU cycles), those same machines can't do 4K HDR HEVC decode. Hardware decoding is required for 4K HDR HEVC.

However, 4K HDR HEVC per se does not necessarily require T2 either. Kaby Lake or later Intel iGPUs also support this in hardware.

BTW, Kaby Lake or later Intel chips also have full support for VP9 decode in hardware. There is limited hardware VP9 decode support in earlier Intel chips though.
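The generational capabilities discussed above can be condensed into a small lookup sketch (simplified and hypothetical; actual support varies by SKU, so treat this as a guide, not a definitive matrix):

```python
# Condensed sketch of Intel iGPU fixed-function decode support by
# generation, per the discussion above. Details vary by SKU.
DECODE_SUPPORT = {
    "Skylake (2015)":   {"hevc_8bit": True, "hevc_10bit": False, "vp9_full": False},
    "Kaby Lake (2016)": {"hevc_8bit": True, "hevc_10bit": True,  "vp9_full": True},
}

def can_decode_4k_hdr_hevc(generation: str) -> bool:
    """4K HDR HEVC needs 10-bit (Main 10) hardware decode."""
    caps = DECODE_SUPPORT.get(generation)
    return bool(caps and caps["hevc_10bit"])

print(can_decode_4k_hdr_hevc("Kaby Lake (2016)"))  # True
print(can_decode_4k_hdr_hevc("Skylake (2015)"))    # False
```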

[Attached table: 7th Gen Intel Core (Kaby Lake) video decode support]




P.S. The above table represents one of the reasons I chose to buy a MacBook and iMac in 2017. Both are Kaby Lake.
 
Netflix does not support AMD GPUs on the Windows side for 4k/HDR. Could this have anything to do with it? Apple's discrete GPUs are all AMD-based.

AMD does support HDCP though. Might not be understanding your point correctly.

[Attached screenshot: Radeon Software showing HDCP support enabled]
 
That was my initial thought, but what exactly is the DRM here?
Everyone playing high-definition content needs to support HDCP, a technology that encrypts the video signal between your graphics card and the monitor so that evil hackers can't record the video directly from the graphics card's output. You then also have to make sure that HDCP cannot be turned off. It's DRM that is completely invisible to you as long as it works; a typical sign of it is an external monitor that suddenly just shows snow until you unplug it and plug it back in, because usually _everything_ is always encrypted.

Anyone know what hardware I would need to record 2 hours of 4k video at 60 fps, uncompressed?
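For the recording question, a rough sizing sketch (assuming 8-bit RGB at 3 bytes per pixel; 10-bit formats would be larger still):

```python
# Storage needed for 2 hours of uncompressed 4K video at 60 fps.
width, height, fps = 3840, 2160, 60
seconds = 2 * 3600
bytes_per_pixel = 3  # assumed 8-bit RGB; 10-bit formats need more

total_bytes = width * height * bytes_per_pixel * fps * seconds
print(total_bytes / 1e12)           # ~10.7 TB total
print(total_bytes / seconds / 1e9)  # ~1.5 GB/s sustained write speed
```

So roughly 10-11 TB of storage and a ~1.5 GB/s sustained write rate, which is why uncompressed capture rigs tend to use fast RAID arrays or a compressed intermediate codec instead.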
 
Or is it a DRM limitation, not necessarily performance?

Performance is BS. Every Intel CPU since Kaby Lake can do HEVC Main 10 decoding and is more than capable. Kaby Lake was introduced in 2016.

Not sure what Apple is thinking, but the video IP in the Intel CPU is more than capable.

DRM then? I don't see the problem if you only show the decoded video on the built-in display. Of course 4K UHD requires HDMI 2.0a with HDCP 2.2 if you use an external monitor/display.
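The external-display constraint comes down to link bandwidth. A rough check of the raw active-pixel data rate for 4K60 (ignoring blanking intervals and link-encoding overhead):

```python
# Raw active-pixel data rate for 3840x2160 @ 60 Hz, 8 bits per channel RGB.
bits_per_second = 3840 * 2160 * 60 * 3 * 8
print(bits_per_second / 1e9)  # ~11.9 Gbit/s

# HDMI 1.4 tops out around 10.2 Gbit/s of TMDS bandwidth, so 4K60 needs an
# HDMI 2.0-class link -- plus HDCP 2.2 on top for protected UHD content.
```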
 
T2 has hardware acceleration capabilities for video. That’s why it’s required.

With Windows, it’s possible, but you’re going to lose out on performance and battery life since it’ll probably use software decoding. Apple thinks that’s bad UX, so they imposed the requirement.

EDIT: for those who don't believe:

View attachment 961739
It's hardware accelerated from 2016 MacBook Pro onwards, it's just moved on to the T2 chip.

 
Why do you hold Apple responsible for a poor decision by Netflix? Perhaps you should direct your anger at Netflix, as even my 2012 Retina MacBook Pro can play 4k video content. This isn't an Apple decision.

***EDIT - Appears from further reading that this is probably an Apple restriction. Poo on Apple!
I think the proper thing before blaming anyone is to (a) find out who is really responsible, (b) determine whether there are genuinely necessary reasons (I don't think the T2 chip itself should be required, but maybe all machines with a T2 chip also contain a powerful video decoder), and (c) see whether this is fixed or changed in a few weeks.
 
It's hardware accelerated from 2016 MacBook Pro onwards, it's just moved on to the T2 chip.

View attachment 961881
Only 8-bit HEVC on those 2015/2016 machines.

10-bit HDR HEVC requires the 2017 models.


I think the proper thing before blaming anyone is to (a) find out who is really responsible, (b) determine whether there are genuinely necessary reasons (I don't think the T2 chip itself should be required, but maybe all machines with a T2 chip also contain a powerful video decoder), and (c) see whether this is fixed or changed in a few weeks.
a) Very likely not Netflix. That leaves Apple.

b) Performance is not a necessary reason since we already know that Apple supports 4K HDR HEVC playback on older hardware. DRM isn't a necessary reason either since there are other ways to achieve the DRM support (like on Windows), but Apple likely requires T2 in its DRM implementation, so it becomes the reason.

c) That ship has sailed, my friend.
 
T2 has hardware acceleration capabilities for video. That’s why it’s required.

With Windows, it’s possible, but you’re going to lose out on performance and battery life since it’ll probably use software decoding. Apple thinks that’s bad UX, so they imposed the requirement.

EDIT: for those who don't believe:

View attachment 961739
What UX do you need when watching a video from YouTube? I am pretty sure that Apple's decision has more to do with "encouraging" users to upgrade their hardware than anything else. Consider the results:
  • macOS users with, say, 2018 hardware can't play 4K HDR, period.
  • Windows users with the same hardware have a choice: watch 4K HDR with some tradeoffs (battery life might be one, but is that an issue when your laptop is plugged in?) OR watch the lower-quality video that macOS users have as their only option.
It should be obvious to everyone which approach is better for the users.
 
The #1 complaint here is always about battery life. Letting the customer decide to drain their battery watching 4K would result in knee-jerk reactions here and threats of class-action lawsuits, and you know this to be true.
Really? There are plenty of applications that drain the battery fast. Have you ever seen a related class action lawsuit? Where is that need for defending Apple at all costs (using ridiculous arguments) coming from?
 