Well, at least they could give me the choice to lose battery life or performance and display a pop-up: "Your Mac is old, performance might suck, switch to Full HD."

Still don't believe Macs from 2015/2017 can't play it, when my mediocre TV from 2015 can.
I am pretty sure those Macs are limited to HD, not 4K. That is what the article said.
 
Terrible explanation... the T2 has nothing to do with playing video. The decoding is all handled by the GPU; Intel GPUs all have hardware decoders. As a matter of fact, I just tried this on both my MacBook Pros, one with a T2 and one without. The CPU load was the same.

bs.

The T2 chip info specifically mentions HEVC decoding.

Are you running Big Sur? Were you decoding HEVC? Were you using software to play it that would take advantage of the T2 chip? (If not, you wouldn't get the advantage.)

Two different MacBooks with the exact same specs other than the T2, and the "CPU load was the same"?
 
Here is an article on Windows capability for Netflix 4K HDR:

Netflix is adding High Dynamic Range (HDR) support to both its Windows 10 app and the Microsoft Edge browser. Netflix’s hardware requirements for HDR on PC mean you’ll need Intel’s 7th-generation or higher processors to support the HDR10 encoded content, and Intel’s integrated GPU or an Nvidia 1050 or higher. AMD GPUs or even CPUs are currently unsupported.
Netflix’s support of HDR on the PC side is a boost to the content available to Windows 10 users, arriving shortly after iPhone X and Galaxy Note 8 support. There’s a lack of games that support HDR on PC, and gaming monitors that even enable HDR content to be displayed. Dell only announced its first HDR monitor earlier this year, and LG will be showing its new HDR monitor at CES early next month. We’re hoping we’ll see a number of new HDR monitors at CES and throughout 2018.


So: Kaby Lake or better with Intel's integrated GPU, or a discrete GPU that isn't bottom-end. Apparently 4K HDR has some processing-oomph requirement.
 
T2 has hardware acceleration capabilities for video. That’s why it’s required.

With Windows, it's possible, but you're going to lose out on performance and battery life since it'll probably use software decoding. Apple thinks that's bad UX, so they made it a requirement.

EDIT: for those who don't believe:

[Attachment 961739]

The one you pointed out is an encoder, so I don't see what it has to do with this, because receiving means decoding. It might be related to DRM instead, unless the T2 is also used for decoding and Apple decided to rely only on that.
 
Just Apple making excuses to make you buy a new Mac.

Apple users have money, sooo they should replace their devices frequently. Doesn't matter if an Android TV box can play 4K; we tell you that you can't because you don't have the cryptographic chip (which has nothing to do with video processing), and that's that.

and people need excuses =)
 
Having a perfectly working 2015 MacBook Pro, one with all the 'legacy' ports, this grinds my gears.


I understand. I support a video-editing infrastructure; we have 760 TB of storage, and after 5-6 years it is end-of-lifed... no longer supported, so we replace it at $100K per 200 TB. It still works, but if you don't replace it you won't be supported for software/firmware updates.
 
That was my first thought. That maybe it is Netflix saying we are going to protect our assets and you will need some legit way to view it.
As mentioned, this appears to be a DRM limitation, but imposed by Apple.

Netflix doesn't require it on Windows (obviously, since no non-Apple machines have T2), and is perfectly happy to stream 4K HDR HEVC titles on 2016/2017 Kaby Lake machines (and does it with low CPU usage).

Also note that Apple has been supporting hardware-accelerated 4K HDR HEVC (with low CPU usage) since 2017, and even highlighted that feature at WWDC. Take a look at Apple's table below, from WWDC 2017. "10-bit Hardware Decode" on "7th Generation Intel Core processor" means hardware 4K HDR HEVC acceleration on 2017 Macs with Kaby Lake CPUs.

[Image: hevc2.jpg — Apple's WWDC 2017 HEVC decode support table]


And I'll post this again, since it demonstrates how well that hardware decoding works on all those 2017 Macs. Very low CPU usage (<10%) on the iMac 2017, and relatively low CPU usage (~25%) on the MacBook 2017, with 4K HDR bitrates much higher than what streaming video uses.
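For anyone who wants to verify this on their own machine, here's a minimal sketch using Apple's VideoToolbox framework (macOS 10.13+). `VTIsHardwareDecodeSupported` and `kCMVideoCodecType_HEVC` are real APIs; the rest is just printing the result:

```swift
import VideoToolbox

// Ask VideoToolbox whether this Mac has a hardware HEVC decoder.
// Per the WWDC 2017 table above, this should be true on Kaby Lake
// (2017) and later Intel Macs.
let hevcHW = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
print("Hardware HEVC decode: \(hevcHW ? "supported" : "unsupported")")
```

Note this only tells you whether the silicon can decode HEVC in hardware, not whether Netflix's DRM path will actually use it.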



Seriously, if I had a 2018 iMac and was looking forward to watching Netflix in 4K, and Apple told me, "sorry, you need a new Mac"... do you think I would spend the money on a new Mac to watch Netflix in 4K? Quite the opposite. I would be pissed off, and I would continue using my Mac for as long as possible. You sometimes find marketing people with that attitude; they usually get fired quickly because they don't actually sell things but drive people toward competitors.
I bought my 2017 MacBook and my 2017 iMac "knowing" they had the hardware to stream DRM'd 4K HDR HEVC. My prediction was that in 2018 Apple would release a new version of macOS that supported 4K DRM on new 2018 Macs, including a new 2018 iMac, and I believed they would also support it on 2017 Kaby Lake models.

Well, I was wrong on several counts:

1) Apple didn't even release a 2018 iMac. The next iMac wasn't until 2019.
2) Even the 2019 iMac didn't get supported.
3) The reason the 2019 iMac didn't get supported was that Apple chose to be more restrictive, requiring the T2 chip for this purpose, which meant the iMac line didn't get supported until a new model was released in 2020.

I guess I bought at the right time after all. I bought in 2017 because my 2010 iMac was showing its age, but at the time I was afraid I had bought too early since Apple hadn't announced DRM'd 4K streaming yet. Well, while I did buy too early for this feature, it wasn't until three years later that Apple finally supported it in iMacs, so I'm glad I bought when I did. Plus, it's been great using my 2010 iMac as a secondary monitor for my dual-iMac setup during all that time.
 
Terrible explanation... the T2 has nothing to do with playing video. The decoding is all handled by the GPU; Intel GPUs all have hardware decoders. As a matter of fact, I just tried this on both my MacBook Pros, one with a T2 and one without. The CPU load was the same.

bs.
You clearly don’t know what the T2 handles then. It has video acceleration built into it.
 
Draining quicker is nothing unexpected, in my opinion. When I play a game on my Mac, or a Full HD video, it also drains quicker than when doing nothing, so it makes sense that streaming 4K drains it even quicker. Can I use my battery how I want, or only how Apple wants?

Let's see how bad the performance is on Windows then, without the magical T2 chip...
The #1 complaint here is always about battery life. Letting customers decide to drain their battery watching 4K would result in knee-jerk reactions here and threats of class-action lawsuits, and you know it.
 
Hi, am I missing something? (Or did I not see the solution on this thread?)

- I have a MacBook Air 2020
- I have a Netflix UHD subscription
- I have macOS 11 beta 9
- I have Safari 11

How do I recognize whether a title is 4K/UHD?
Breaking Bad should be 4K, but I'm not seeing it when I play the title.

Can anybody tell me what I'm doing wrong?

Greetings
 
Cancelled Netflix years ago for Pureflix and never looked back. Even though we've got a 2020 MacBook Air, this T2 chip requirement is nonsense.
How odd... Pureflix is strictly for streaming Christian, "faith-forward" content. If you switched to Pureflix from Netflix, as you say, the reason is that you preferred religious-based programming. That's entirely different from what is being discussed here.
 
I think this is it, boys. I'm switching to PC. The straw that broke the camel's back.
Why do you hold Apple responsible for a poor decision by Netflix? Perhaps you should direct your anger at Netflix, as even my 2012 Retina MacBook Pro can play 4K video content. This isn't an Apple decision.

***EDIT - Appears from further reading that this is probably an Apple restriction. Poo on Apple!
 