I think the actual requirement is for HDCP 2.2, which at minimum requires integrated Intel UHD graphics. It's coincidental that all the Macs that support it also came with the T2.

So for example the 2017 MBP with the T1 only came with Intel HD graphics, while the 2018 MBP with the T2 came with Intel UHD.
 
The one thing that worries me about the move to Apple silicon is that they would have complete control over offering new functionalities only for new hardware, in the same way they are doing with iPads from time to time.
This news does not ease my worries.

That’s what Apple’s been doing with the iPhone ever since the A-series chip debuted, though.

And all in all, it hasn’t been so bad. For example, iOS 14, released in 2020, is compatible with the iPhone 6S released in 2015.

Of course, there are some things that an iPhone 6S can't do on iOS 14 because of hardware limitations, but [throttlegate aside] Apple doesn't take away features that were available when the device was released, e.g., 3D Touch – it's officially discontinued, but iOS 14 still supports it on the devices that have it [iPhones 6S through XS].
 
T2 has hardware acceleration capabilities for video. That’s why it’s required.

With Windows, it's possible, but you're going to lose out on performance and battery life since it'll probably use software decoding. Apple thinks that's bad UX, so they made the T2 a requirement.
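For anyone who wants to check their own machine, here's a minimal Swift sketch (just a capability query, not how Safari or Netflix actually gate playback) that asks VideoToolbox whether the Mac has a hardware HEVC decoder:

```swift
import VideoToolbox

// Ask VideoToolbox whether this Mac has a hardware HEVC decoder.
// If this returns false, HEVC playback falls back to software decoding
// on the CPU, which costs performance and battery life.
let hevcInHardware = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
print("Hardware HEVC decode available: \(hevcInHardware)")
```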

EDIT: for those who don't believe:

[attached screenshot]
How convenient. I call ******** on that one. Any modern GPU, even Intel's integrated graphics, supports native HEVC decoding.
 
And how does FairPlay perform with that instruction set on an Intel chip? I'll be waiting for your answer.

Gee, Apple designed FairPlay's algo, and they also designed the T2 chip. I wonder which chip will perform more efficiently.
Since when is 100% optimal efficiency an "All-or-Nothing" goal? Customers are just fine with 80% Good-Enough solutions for mundane stuff like this.
 
The one thing that worries me about the move to Apple silicon is that they would have complete control over offering new functionalities only for new hardware, in the same way they are doing with iPads from time to time.
This news does not ease my worries.

It's the main thing everyone has been worried about: everything will be locked into the Apple App Store, with Apple deciding what can run on a Mac.

Some people say it's a good thing, but I personally liked the flexibility of running apps from outside the App Store and also running Windows every so often.

We will enjoy increased performance and battery life for MacBooks, but I'm not sure it actually benefits the desktop Macs, where performance per watt is not as important. The thermal envelope of the chips would benefit from less throttling, but I have never felt the effect of throttling on the iMac Pro in a significant way.
 
We are talking specifically about 4K HDR (10-bit) HEVC here, along with the HDCP 2.2 requirement.

Not all Intel CPUs support that. If I remember correctly, only Kaby Lake and later (i.e., not the 2016 MacBook Pro).

So I guess Apple simplified it to T2 only.
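For what it's worth, AVFoundation exposes a coarse version of that check. A minimal sketch, assuming macOS 10.15 or later; it reports general HDR playback eligibility rather than the HDCP 2.2 level specifically:

```swift
import AVFoundation

// Coarse capability check: can this Mac play HDR content at all?
// (Separately, AVPlayer's outputObscuredDueToInsufficientExternalProtection
// property becomes true at runtime when the content's HDCP requirement
// isn't met by the current display chain.)
if AVPlayer.eligibleForHDRPlayback {
    print("This machine reports HDR playback eligibility")
} else {
    print("No HDR playback on this machine")
}
```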
 
That is not very Apple and flies right in the face of "It just works". Apple's laptops are known for their exceptional battery life, and something that drains your battery 2-3x quicker or worse really undermines that.

Draining quicker is nothing unexpected, in my opinion. When I play a game on my Mac or a full HD video, it also drains quicker than when doing nothing, so it makes sense that streaming 4K drains it even quicker. Can I use my battery how I want, or how Apple wants?

Let's see how bad the performance is on Windows then, without the magical T2 chip...
 
How convenient. I call ******** on that one. Any modern GPU, even Intel's integrated graphics, supports native HEVC decoding.
1. You're forgetting Safari's FairPlay (hardware DRM), which uses the T2's AES crypto engine for better performance (see the AES sketch after this list)
2. AMD's and Intel's hardware-accelerated HEVC wouldn't match the power efficiency of the T2. Much better battery life with the T2
3. Netflix can always support Chrome for non-T2 4K HDR playback
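To illustrate what hardware-backed AES means in practice, here's a minimal CryptoKit sketch. It's plain AES-GCM, not FairPlay's actual key pipeline (which isn't public API); CryptoKit simply routes it through whatever AES hardware the platform offers, whether that's Intel's AES-NI or the T2/Apple silicon crypto engine:

```swift
import CryptoKit
import Foundation

// Plain AES-256-GCM round trip. This only illustrates hardware-backed AES;
// the real FairPlay key exchange and decryption path is not exposed to apps.
let key = SymmetricKey(size: .bits256)
let message = Data("sample video segment".utf8)

do {
    let sealedBox = try AES.GCM.seal(message, using: key)
    let decrypted = try AES.GCM.open(sealedBox, using: key)
    assert(decrypted == message)
    print("Encrypted \(message.count) bytes and decrypted them successfully")
} catch {
    print("AES-GCM round trip failed: \(error)")
}
```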
 
The one thing that worries me about the move to Apple silicon is that they would have complete control over offering new functionalities only for new hardware, in the same way they are doing with iPads from time to time.
This news does not ease my worries.

I don't know what new features I got over time with Intel processors or AMD graphics.
With the iPad, we've gotten new functionalities over time, far more than with products using non-Apple chips.
 
Not for power efficiency in Macs. They do ridiculous stuff like putting totally inappropriate i9s in MBPs, delivering minor OS updates as giant multi-GB downloads, and so on.

I think you sort of went off topic there. Apple has always tried to be as power efficient as possible with Safari. They got rid of Flash early, they paused background tabs before Chrome did, they refused to implement the VP8/VP9 codecs in the browser since they're not hardware accelerated (which is why 4K YouTube doesn't work), etc.
 
Intel chips since Kaby Lake have had hardware HEVC (10-bit) codec support. Granted, this only adds the 2017 models to the supported list, but still...
Not sure how many good HDR monitors are available, though, that don't cost as much as a decent 55" TV.
 
These things are not alike...
Choosing not to use a service and having an opinion on said service. Shouldn't be that hard for you to comprehend.

An eXpLaNaTiOn jUsT fOr YoU: I disagree with using a service like Netflix and also disagree with the requirements for viewing 4K HDR content. Those do not need to be mutually exclusive.
 
I think the actual requirement is for HDCP 2.2, which at minimum requires integrated Intel UHD graphics. It's coincidental that all the Macs that support it also came with the T2.

Or, it's not coincidental and it really does depend on the T2 because there aren't enough non-T2 Macs with the requisite GPUs to make it worth Apple's while to support.

Unfortunately, Apple are getting to the "if I drop a $20 bill it's not worth my while to stop and pick it up" stage... so, tough luck for iMacs which have half-decent GPUs but no T2 chip, then.

The one thing that worries me about the move to Apple silicon is that they would have complete control over offering new functionalities only for new hardware,

That ship sailed as soon as Apple introduced the T2 chip - long before the Apple Silicon announcement.

Apple could have locked the Intel Mac down whenever they wanted to, and the T2 makes that easy. As soon as they've replaced the 21.5" iMac with a T2 model, they'll have a clear run at making future macOS versions T2-only and locked down (they'll have to support existing non-T2 Macs with maintenance updates for a reasonable period, but they're under no obligation to give existing Macs new features). Apple Silicon is irrelevant here: if Apple wants a lockdown, they can do it, and even the T2 alone makes it easier and gives them a "security" figleaf.
 
The T2 chip is far more efficient for video decoding than using the dedicated GPU.
And don't forget the AES algo, which the T2's built-in AES crypto engine handles quite well for Apple's FairPlay DRM.

I never said a MacBook Pro 2017 couldn't do it, but in Apple's eyes, it's not good UX.

Apple knows you have a choice to use Chrome, so it would be up to Netflix to implement Chrome support for older Macs.
I mean, Intel CPUs have had AES acceleration for literally 10+ years. The most likely reason they are using the T2 is that it, like an iPhone (in fact, very much like an iPhone), is a closed system, which makes it a lot harder to access the decrypted video stream. Someone will still find a way around it; not to mention that a public jailbreak was released for the T2 just the other day, in fact the same day the news about the Netflix 4K T2 requirement came out.

Whatever the reason, everyone in this thread is right: the T2 chip, the Intel chip, and the AMD GPU in machines that have one are all capable of decoding any current Netflix HEVC stream with hardware acceleration. It is likely that the T2 does it with better energy efficiency, but if you ask me, it is way, way more likely that the reason has to do with making life difficult for people attempting to download decrypted Netflix content.

...which, if you ask me, is a futile endeavour. Yet here we are.
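On the AES-NI point, a quick way to see the flag for yourself. This sketch assumes an Intel Mac, since the machdep.cpu.features sysctl doesn't exist on Apple silicon:

```swift
import Foundation

// Read the Intel CPU feature string and look for the AES (AES-NI) flag.
// This sysctl is Intel-only; on Apple silicon it simply isn't present,
// so an empty result means "not an Intel Mac" rather than "no AES".
func intelCPUFeatures() -> [String] {
    var size = 0
    guard sysctlbyname("machdep.cpu.features", nil, &size, nil, 0) == 0, size > 0 else {
        return []
    }
    var buffer = [CChar](repeating: 0, count: size)
    guard sysctlbyname("machdep.cpu.features", &buffer, &size, nil, 0) == 0 else {
        return []
    }
    return String(cString: buffer).components(separatedBy: " ")
}

print("AES-NI present: \(intelCPUFeatures().contains("AES"))")
```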
 