Everyone playing high-definition content needs to support HDCP, a technology that encrypts the video between your graphics card and the monitor, so that evil hackers can't record a video directly from the graphics card's output. You then have to make sure that HDCP cannot be turned off. So it's DRM that is completely invisible to you as long as it works. A typical sign of it is an external monitor that suddenly just shows snow until you unplug it and plug it back in, because usually _everything_ is always encrypted.

Anyone know what hardware I would need to record 2 hours of 4k video at 60 fps, uncompressed?
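A rough back-of-the-envelope estimate for the question above. The bit depth and chroma subsampling here are assumptions (10-bit 4:2:2); plug in whatever format your capture card actually records:

```python
# Storage/bandwidth estimate for uncompressed 4K (UHD) 60 fps capture.
# Assumes 10-bit 4:2:2, which averages 20 bits per pixel; adjust
# bits_per_pixel for other formats (e.g. 8-bit 4:2:0 = 12, 10-bit 4:4:4 = 30).
width, height, fps = 3840, 2160, 60
bits_per_pixel = 20
seconds = 2 * 60 * 60  # two hours

bytes_per_second = width * height * fps * bits_per_pixel / 8
total_tb = bytes_per_second * seconds / 1e12

print(f"{bytes_per_second / 1e9:.2f} GB/s sustained write speed")
print(f"{total_tb:.1f} TB of storage for two hours")
```

So you'd need roughly a gigabyte-per-second-class write path (fast NVMe or a RAID array) and several terabytes free, which is why practically everyone records with at least lossless compression instead.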

I see, that makes sense. I'm not sure you can do much more than annoy people, though, who might have done this easily in the past. I can just whip out my iPhone and take a 4K video of the video if I wanted to. Not that I'm making excuses here.
 
I was really looking forward to updating to Big Sur for 4K Netflix, but now I need to buy a new 4K monitor that supports HDCP 2.2? :rolleyes:
 
MacOS users with, say, 2018 year hardware, can't play 4K HDR, period.
To nitpick, yes they can. 4K HDR HEVC support has been built right into macOS since 2017, and hardware decoding of 4K HDR HEVC is fully supported on 2017 and later Macs.

What you're probably referring to is 4K HDR with DRM.

Windows users with same hardware have a choice: watch 4K HDR with or without some tradeoffs (battery life might be one of those, but when your laptop is plugged in is it an issue?) OR watch lower quality video (the one that MacOS users have as the only option).
Not quite. On Windows, if you want to watch DRM'd 4K HDR on Netflix, you can only do so if you have specific hardware, like Kaby Lake or later machines.

I was really looking forward to updating to Big Sur for 4K Netflix, but now I need to buy a new 4K monitor that supports HDCP 2.2? :rolleyes:
HDCP 2.2 has always been a requirement for external monitors for DRM'd 4K streaming. We've known that for over 5 years now. That's why I waited until new AV receivers were released in 2015 with HDCP 2.2 support before I updated my receivers (in 2016).
 
To nitpick, yes they can. 4K HDR HEVC support is built right into macOS, and hardware decoding of 4K HDR HEVC is fully supported on 2017 and later Macs.

What you're probably referring to is 4K HDR with DRM.


Not quite. On Windows, if you want to watch DRM'd 4K HDR on Netflix, you can only do so if you have specific hardware, like Kaby Lake or later machines.


HDCP 2.2 has always been a requirement for external monitors for DRM'd 4K streaming. We've known that for over 5 years now. That's why I waited until new AV receivers were released in 2015 with HDCP 2.2 support before I updated my receivers (in 2016).
Thanks for the clarification. The situation is clearly more nuanced than my statement claimed. Still, your clarification confirms that T2 is not a necessary requirement for playing 4K HDR with DRM. That's an arbitrary restriction on Apple's part (either due to their laziness or because of some more nefarious reason).
 
So if I install Big Sur, I will be unable to stream 4K HDR Netflix anymore? Seems kind of messed up to remove a feature I already have...
 
I don't know why this is a surprise, the same applied to iTunes 4K video.

(The text on Apple's website stated that it was post-2018 devices, but when Catalina came out several of us on the forum tested it and this was incorrect - the 2019 iMac did not get 4K video from iTunes.)

And yes, it's clearly a DRM limitation. Windows has a wider range of supported hardware because Microsoft arranged for hardware manufacturers to build in dedicated hardware support for their PlayReady DRM years ago. Apple aren't going to enable that in MacOS as they would have to pay a royalty. And it appears that the stronger variations of Fairplay DRM need a T2 chip.
 
HDCP 2.2 has always been a requirement for external monitors for DRM'd 4K streaming. We've known that for over 5 years now. That's why I waited until new AV receivers were released in 2015 with HDCP 2.2 support before I updated my receivers (in 2016).
I didn't know this, and I'm not sure the average consumer knows this either. I'm not upset at Netflix, it's just disappointing you need a special monitor for it.
 
I didn't know this, and I'm not sure the average consumer knows this either. I'm not upset at Netflix, it's just disappointing you need a special monitor for it.

The studios announced it in 2013 even.

Any 4K monitor manufacturer that didn't include HDCP 2.2 is a crook IMO.
 
Thanks for the clarification. The situation is clearly more nuanced than my statement claimed. Still, your clarification confirms that T2 is not a necessary requirement for playing 4K HDR with DRM. That's an arbitrary restriction on Apple's part (either due to their laziness or because of some more nefarious reason).
I don't think laziness is a good answer. My guess is that Apple wanted their own DRM mechanism that fits their own ecosystem plans and which is very robust, and they decided to implement this through their T2 chip. Remember also that Apple is switching away from Intel.

Since they were doing this they decided also to implement their own decode acceleration in T2 as well, maybe for a more consistent result across different Mac hardware.

I wonder if the ARM Macs need T2 or T3 to do this, or if the DRM support will be embedded right on Apple's Mac Arm chips.


I didn't know this, and I'm not sure the average consumer knows this either. I'm not upset at Netflix, it's just disappointing you need a special monitor for it.
Well, many consumers might not know this, but for example most Best Buy sales drones that work in their AV department should know this at least.

This was a problem in 2014, but by 2016, HDCP 2.2 was almost ubiquitous (outside Macs). Back in 2014, manufacturers were releasing 4K-compatible devices that didn't support HDCP 2.2. That sucked.

The only caveat with stuff like TVs, though, is that on some devices, not all the ports support HDCP 2.2. For example, some TVs made in the last couple of years may have, say, 3-4 HDMI ports, but only 1-2 of them support HDCP 2.2.


I don't know why this is a surprise, the same applied to iTunes 4K video.

(The text on Apple's website stated that it was post-2018 devices, but when Catalina came out several of us on the forum tested it and this was incorrect - the 2019 iMac did not get 4K video from iTunes.)

And yes, it's clearly a DRM limitation. Windows has a wider range of supported hardware because Microsoft arranged for hardware manufacturers to build in dedicated hardware support for their PlayReady DRM years ago. Apple aren't going to enable that in MacOS as they would have to pay a royalty. And it appears that the stronger variations of Fairplay DRM need a T2 chip.
That's a good point. If Apple had to pay royalties to use that DRM method, that pretty much guaranteed they wouldn't use it.
 
That’s pretty lame, glad I don’t subscribe to Netflix and do not have a 4K TV yet
4K Netflix on a good (and large) 4K TV is totally awesome.

Same with 4K iTunes content & Apple TV+ through Apple TV 4K.

EDIT:

My main setup is an Apple TV 4K with 65" LG OLED, from a 7 foot seating distance. The boost from 4K HDR is obvious.

My secondary setup is with a mid-end 43" Sony LCD from a further seating distance. The boost from 4K there is much less obvious.
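The difference between those two setups can be sanity-checked with a quick angular-resolution estimate. This assumes the common ~60 pixels-per-degree rule of thumb for 20/20 vision and a flat 16:9 screen viewed head-on, so treat it as a sketch, not vision science:

```python
import math

def pixels_per_degree(diag_in, h_pixels, distance_in, aspect=(16, 9)):
    """Angular pixel density for a flat screen viewed head-on."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)   # screen width from diagonal
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / fov_deg

# 65" panel at a 7-foot (84") seating distance, as in the post above
for name, px in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: {pixels_per_degree(65, px, 84):.0f} px/deg")
```

At that distance 1080p lands below the ~60 px/deg acuity threshold and 4K lands well above it, which matches the "boost from 4K HDR is obvious" observation; on a smaller screen farther away, both resolutions exceed the threshold and the difference fades.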
 
Netflix does not support AMD GPUs on the Windows side for 4k/HDR. Could this have anything to do with it? Apple's discrete GPUs are all AMD-based.

AMD GPUs have been supported since 2018 with the right drivers.

Some info about the requirements for 4K UHD playback through Netflix:
- a supported NVIDIA, Intel, or AMD GPU
- Microsoft Edge browser (PlayReady 3.0)
- 4K UHD monitor with HDCP 2.2 support
- the operating system must have an H.265 (HEVC) decoder installed/integrated
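The important property of that list is that it's conjunctive: every box must be ticked or Netflix silently drops you to 1080p. A hypothetical sketch (the dict keys and values here are made up for illustration, not any real Netflix API):

```python
# Hypothetical model of the Windows 4K-playback checklist above.
# One weak link anywhere in the chain means no 4K stream.
REQUIRED = {
    "gpu_supported": True,   # supported NVIDIA, Intel, or AMD GPU
    "browser": "edge",       # Edge with PlayReady 3.0
    "hdcp": "2.2",           # on every display/receiver in the chain
    "hevc_decoder": True,    # H.265/HEVC decoder present in the OS
}

def meets_4k_requirements(system: dict) -> bool:
    """All requirements must hold simultaneously."""
    return all(system.get(key) == value for key, value in REQUIRED.items())

# An otherwise-perfect setup with an HDCP 1.4 monitor still fails:
print(meets_4k_requirements({**REQUIRED, "hdcp": "1.4"}))  # False
```

This is why an old AV receiver or one wrong HDMI port is enough to lose 4K even on otherwise-capable hardware.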
 
4K Netflix on a good 4K TV is totally awesome.

Same with 4K iTunes content & Apple TV+ through Apple TV 4K.
I bought two Apple TV 4Ks a few months ago as we cut cable, so I'm ready once the old TVs die...
 
The studios announced it in 2013 even.

Any 4K monitor manufacturer that didn't include HDCP 2.2 is a crook IMO.

Sure, but in 2013 I was not thinking at all about 4K video.

Well, many consumers might not know this, but for example most Best Buy sales drones that work in their AV department should know this at least.

This was a problem in 2014, but by 2016, HDCP 2.2 was almost ubiquitous (outside Macs). Back in 2014, manufacturers were releasing 4K-compatible devices that didn't support HDCP 2.2. That sucked.
Bought a monitor online, so I didn't get to talk with a Best Buy sales drone 😂. Unfortunately, I have a Dell monitor released in 2015, and it doesn't support it.
 
Bought a monitor online, so I didn't get to talk with a Best Buy sales drone 😂. Unfortunately, I have a Dell monitor released in 2015, and it doesn't support it.
Ugh, bad timing. Yeah, in 2013-2015 a lot of AV receivers were also available that were 4K capable but which didn’t support 4K DRM. So, you could watch 4K home movies with them but no commercial 4K content. 🙄
 
I am still waiting for the flat-edged version of the iPad mini to complement my 11” iPad Pro. 😉
 
Right now it works fine WITHOUT T2, 4K, whatever. The T2 is NOT REQUIRED to decode video and does not DECODE video right now.

It's BS. Do some research.

Or test it... like I just did.

Safari REQUIRES T2 for DRM (FairPlay encryption) + 4K HDR because that's how Apple designed it. Other browsers, like Chrome or Edge on Windows, don't. I already explained that it works in Windows, and obviously Windows doesn't need T2, like the original article said.

Done talking with you.
 
The Core i9-9900K in my 2019 iMac (3.6GHz/64GB/1TB SSD/Vega 48) has 10-bit HEVC decode built into the iGPU, and the Vega 48 is more than capable of doing the same. This is an artificial limitation, or politics, or Apple being d*cks when they don't need to be.
Just because it's hardware accelerated doesn't mean you're always getting the maximum energy efficiency.
The T2 uses far less power compared to the i9 and Vega 48. Also, the T2 has an AES crypto engine built right in that works really well with FairPlay (which I'm sure Intel chips can do fine, but not as well as the T2).

I never said Intel or GFX cards *can't* do HEVC decoding. They can obviously do it, as I've used FFmpeg with Quick Sync and CUDA to decode/encode H.265 files before. It's just not as power efficient, so the Safari team decided T2 was the way to go.

You have a choice to use Chrome, so it's really up to Netflix to support Chrome's DRM.
 
The one you pointed out is an encoder, so I don't see what it has to do with this, because receiving = decoding. It might be related to DRM instead, unless the T2 is also used for decode and Apple decided to use only that.

No. It says HEVC codec. A codec does both encoding and decoding. Otherwise it would have said HEVC encoder, like it said with the Apple video encoder above.
 
It's hardware accelerated from the 2016 MacBook Pro onwards; it's just moved onto the T2 chip.

View attachment 961881
1. There is zero context in that image. I'm sure it has to do with HEVC, but it's pointless without any links. "Hardware accelerated" could mean anything (hardware-accelerated machine learning, for example).
2. Yes, I understand Vega graphics and Intel processors can do HEVC. However, the T2 chip uses far less energy. The Safari team is power conscious with its browser, so they limit it to the T2. Also, FairPlay encryption works well with the T2's hardware-accelerated AES crypto engine, so there's even more reason to use the T2.
3. It's up to Netflix to support Chrome, which won't require T2 (but likely uses software DRM, which would mean bad performance and being easily bypassed, so maybe Netflix doesn't want that).
 
What UX do you need when watching a video from YouTube?

Long battery life.

And with iMacs, performance is still critical if you have a video playing while doing other things.

Lastly, you have a choice of using Chrome. If Netflix doesn't want to support Chrome, that's the fault of Netflix, so go ask them why Chrome isn't supported (I suspect there's a good technical reason for Edge and Safari being supported but not Chrome).
 