Here is the answer to this question... you all won't like it:

It's not clear why Macs need a T2 security chip? Because Apple wants to control our lives and watch us when we use our systems. This is why the T2, or "call home" chip, is something I am glad I don't have. I want full control of my Macs, not Apple.
 
So you’re telling me my i9-9900K and Radeon Pro Vega 48 aren’t powerful enough to decode 4K video without an old iPhone-class co-processor? What? I don’t even need this on my 16” MBP, which can do it thanks to the T2, because it doesn’t even have a 4K display.
 
Exactly this! I would really love to upgrade my 16” MBP to ARM, but lately Apple has been behaving like a dominant market force that prioritizes business brute force over customer experience.

But what’s the alternative then? Sticking with an MBP that is very quickly going to be phased out by Apple with "ARM exclusive features"? At the same time, I really don’t want to switch to Windows; I really like macOS, other than the ever-increasing number of bugs. There’s really not enough competition in the market, so either way you’re trapped with **** companies...

What a stupid question... There are plenty of alternatives. Just stay away from the T2 chip, as this will give Apple control over your life when on the Mac. My 2015 dual-graphics machine does not suffer from this issue, and who the hell needs Ultra 4K? I don't.
 
As far as I am aware, my guess is that Netflix requires Intel SGX and/or some form of GPU encryption, and Apple doesn't use those encryption technologies. They use their own, which is only on the T2. These encryption schemes are part of Netflix's requirements (and possibly those of all other Hollywood video content).
 
With Windows it’s possible, but you’re going to lose out on performance and battery life since it’ll probably use software decoding. Apple thinks that’s bad UX, so they put the requirement in place.

To be fair, every Intel CPU since 2015-2016 has Quick Sync acceleration support for H.265, which Apple could enable.

But I guess most of those machines also have T2 chips in them, so Apple probably figured they'd just code it for the T2, as that's the path forward now that they're ditching Intel. A shame, though.
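For context, here's my own rough summary of when Quick Sync actually picked up HEVC decode, compiled from Intel's public Quick Sync tables, so treat the generation labels as approximate. The catch with "since 2015-2016" is that Netflix 4K is 10-bit HEVC, which only arrived with Kaby Lake:

```python
# Rough map of Intel Core generations to Quick Sync HEVC decode support.
# This is my summary of Intel's published capability tables, not anything
# Apple or Netflix publishes, so treat it as approximate.
QUICK_SYNC_HEVC = {
    "haswell":    None,      # 4th gen (2013): no hardware HEVC decode
    "broadwell":  "hybrid",  # 5th gen (2014): partial/hybrid 8-bit HEVC
    "skylake":    "8-bit",   # 6th gen (2015): fixed-function HEVC 8-bit
    "kabylake":   "10-bit",  # 7th gen (2016): adds HEVC Main10 and VP9
    "coffeelake": "10-bit",  # 8th/9th gen: same media block as Kaby Lake
}

def can_decode_netflix_4k(generation: str) -> bool:
    """Netflix 4K streams are 10-bit HEVC, so only Main10-capable
    generations qualify for full hardware decode."""
    return QUICK_SYNC_HEVC.get(generation) == "10-bit"
```

So a 2015 Skylake machine can hardware-decode 8-bit HEVC, but for the 10-bit streams Netflix actually serves in 4K, Kaby Lake (2016) is the realistic floor.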
 
Just Apple making excuses to get you to buy a new Mac.

Apple users have money, sooo they should change their devices frequently. It doesn't matter that an Android TV box can play 4K; you're told your Mac can't because it doesn't have the cryptographic chipset (which has nothing to do with video processing), and that's that.

For those who voted negative: if it's not just an Apple excuse...

why can my Android TV box play Netflix 4K?


Pretty much.

Apple keeps pushing me away.
 
To be fair, every Intel CPU since 2015-2016 has Quick Sync acceleration support for H.265, which Apple could enable.

But I guess most of those machines also have T2 chips in them, so Apple probably figured they'd just code it for the T2, as that's the path forward now that they're ditching Intel. A shame, though.

Yes, along with AMD graphics being able to do it too. However, those aren't as energy efficient.

There's also DRM (FairPlay encryption), which is accelerated by the T2. So it just makes a lot of sense to require the T2 for Safari.
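The gate being described would boil down to something like this sketch. All the names here are made up for illustration; the real checks live inside Safari and FairPlay and aren't public:

```python
# Sketch of the capability gate described above: 4K is offered only when
# BOTH hardware HEVC decode and hardware-backed DRM (FairPlay on the T2)
# are present. Class and function names are hypothetical; the actual
# logic inside Safari is not public.
from dataclasses import dataclass

@dataclass
class MacCapabilities:
    hw_hevc_decode: bool  # iGPU/dGPU or T2 can decode 10-bit HEVC
    t2_fairplay: bool     # T2 present, so FairPlay runs in hardware

def max_netflix_resolution(caps: MacCapabilities) -> str:
    """Return the highest stream tier this machine would be offered."""
    if caps.hw_hevc_decode and caps.t2_fairplay:
        return "2160p"  # 4K: both decode and DRM are hardware-backed
    return "1080p"      # otherwise the stream is capped
```

On this model, a 2019 16" MBP (T2) gets "2160p" while a 2015 MBP, which can still hardware-decode some HEVC, is capped at "1080p" purely by the DRM half of the check.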
 

Sure, on machines that have it (the T2).

Kicking machines as recent as this year's models that don't have a T2 to the curb is a bit of a slap in the face to those owners, though. Especially iMacs, which aren't too concerned with battery consumption and are explicitly marketed as "4K" iMacs.

The discrete GPU and Intel iGPU can play 4K without burning through too much power.
 

I'm sure it can be done, but I don't think Apple is deliberately disabling it on older machines. There's just a lot of work needed to make it run on machines from the last 2-3 years, and it isn't going to play out as elegantly as with the T2 chips. Call them lazy, yeah, but calling it planned obsolescence would be inaccurate.
 
That is not very Apple and flies right in the face of "It just works". Apple's laptops are known for their exceptional battery life and something that drains your battery 2-3x quicker or worse really flies in the face of that.

Their battery life has been anything but exceptional for a long time. At least in the 16" they bumped up the battery specs a bit again, so I only need to recharge once during the work day if I don't do any photo editing. If I do that for more than 30 minutes, I'm stuck on a charger anyway.
 
And how does FairPlay perform with that instruction set on an Intel chip? I'll be waiting for your answer.

Gee, Apple designed FairPlay's algorithm, and they also designed the T2 chip. I wonder which chip will perform more efficiently.

Talk about "throwing buzzwords".

Gee, much lower-spec Windows machines can display the stuff without problems, so no amount of buzzing will make Apple's decision anything other than greed.
 

Gee, Netflix just needs to implement Chrome and it'll work everywhere. But go on, blame Apple for having different priorities and different encryption methods in Safari.


:rolleyes:
 


Microsoft's PlayReady doesn't seem to have problems with the Intel decoder. You're acting like it's a heavy software decoder.

You seem to imply the T2 method is 'better' without evidence. They're both hardware decoders, no?
 
Call them lazy, yeah, but calling it planned obsolescence would be inaccurate.

I think it's a bit of both. Apple typically doesn't go out of their way to hamper older products, but Apple is also quite quick to drop features from them when supporting them is otherwise inconvenient. That works out to the same thing.

A similar example would be watchOS dropping support for Force Touch even on the older watches that have hardware support for it. Sure, Apple could continue to support Force Touch in software for the sake of those models, but it would take extra effort, and they're old models anyway, so they don't bother.
 

I disagree with the watchOS example. I believe it's not planned obsolescence but an effort to simplify development for third-party developers. There's not much extra work needed to keep Force Touch around, as the bulk of the work has been done already (maybe just the new Sleep app needs some extra work). However, asking third-party developers to support both Force Touch and non-Force Touch UIs means developers would need to buy a Series 3 plus an SE or a 6 and test both types of UI. I suspect the Apple Watch so far hasn't been too hot a platform for developers to make apps for, so Apple is trying as hard as they can to get people to make more apps.
 
Here is the answer to this question... you all won't like it:

It's not clear why Macs need a T2 security chip? Because Apple wants to control our lives and watch us when we use our systems. This is why the T2, or "call home" chip, is something I am glad I don't have. I want full control of my Macs, not Apple.

Obviously, Mal Blackadder, you have no idea that the T2 chip is there to ALLOW APPLE FULL CONTROL and CENSORSHIP of Apple users... no thank you, my 2015 is free of this problem.
 
2019 16-inch MacBook Pro

Working for me in 4K :) on Big Sur
 

Attachments

  • Screenshot 2020-10-03 at 18.54.59.png (155.6 KB)
My guess is that this might have more to do with Apple’s migration to Apple Silicon.
 
T2 has hardware acceleration capabilities for video. That’s why it’s required.

With Windows it’s possible, but you’re going to lose out on performance and battery life since it’ll probably use software decoding. Apple thinks that’s bad UX, so they put the requirement in place.

EDIT: for those who don't believe:

View attachment 961739
Exactly this. FCPX gains incredible performance boosts at times on specific renders and exports thanks to this.

The sad part, though: some GPUs also have hardware-based encoders and decoders (which FCPX also uses alongside the T2's help). But maybe they're not suitable for this type of 4K stream.
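To put some back-of-envelope numbers on the battery argument (the wattages here are my guesses for illustration, not measurements of any real machine):

```python
# Back-of-envelope battery math for a 2-hour 4K movie on a 58 Wh
# MacBook battery. The wattage figures are assumed values to show the
# shape of the argument, not benchmarks.
BATTERY_WH = 58.0
MOVIE_HOURS = 2.0

def battery_fraction_used(decode_watts: float) -> float:
    """Fraction of the battery consumed by decoding alone."""
    return decode_watts * MOVIE_HOURS / BATTERY_WH

software_decode = battery_fraction_used(12.0)  # several busy CPU cores
hardware_decode = battery_fraction_used(1.5)   # fixed-function block
```

With these assumed figures, software decode would eat roughly 40% of the battery over the movie versus about 5% for a fixed-function decoder, which is the gap Apple's "bad UX" argument hinges on, even if the exact wattages differ machine to machine.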
 