Just because it's hardware accelerated doesn't mean you're always getting the maximum energy efficiency.
The T2 uses far less power than the i9 and Vega 48. The T2 also has an AES crypto engine built right in that works really well with Fairplay (which I'm sure Intel chips can handle fine, just not as well as the T2).
That’s a stretch as far as performance arguments go. With 75 Mbps 10-bit 4K HEVC files (3X the bitrate of the highest-bitrate Apple TV+ 4K HDR streams and 5X the bitrate of Netflix 4K streams), my 3-year-old quad-core i5-7600 iMac uses less than 10% CPU.
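To make the bitrate comparison above concrete, here is a tiny sketch. The streaming figures (25 Mbps for Apple TV+, 15 Mbps for Netflix) are back-computed from the "3X" and "5X" ratios in the post, not official numbers:

```python
# Illustrative bitrate arithmetic; the streaming bitrates below are
# assumptions derived from the ratios in the post, not published specs.
LOCAL_FILE_MBPS = 75        # 10-bit 4K HEVC test file from the post
APPLE_TV_PLUS_MBPS = 25     # assumed: 75 / 3
NETFLIX_4K_MBPS = 15        # assumed: 75 / 5

def ratio(local_mbps: float, stream_mbps: float) -> float:
    """How many times higher the local file's bitrate is than the stream's."""
    return local_mbps / stream_mbps

print(ratio(LOCAL_FILE_MBPS, APPLE_TV_PLUS_MBPS))  # 3.0
print(ratio(LOCAL_FILE_MBPS, NETFLIX_4K_MBPS))     # 5.0
```

In other words, the test files are a deliberately harder workload than anything a streaming service actually delivers, which is what makes the sub-10% CPU figure notable.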

I get that Apple wanted its own solution here, but IMO decode efficiency isn’t a major concern. I suspect the issues are:

1) The DRM solution used in Windows is a MS/Intel technology and Apple would have to pay royalties to use it and would have no control over its design or robustness.

and/or:

2) Apple wanted something that would be easily implemented across iOS, iPadOS, and macOS, the latter on both Intel Macs and Arm Macs.

Long battery life.

And with iMacs, performance is still critical if you have a video playing while doing other things.
With those same uber high bitrate files mentioned above, CPU usage is roughly 25% on my lowly fanless Core m3-7Y32 MacBook, and multi-tasking isn’t a problem at all.
 
Stupid comment. The T2 literally has a hardware HEVC codec built in. Please educate yourself next time before calling "bs".
Most recent Intel CPUs, from about 2017 on, have HEVC decoding in hardware via Intel Quick Sync, so a T2 chip is not required. Windows machines with those chips can play Netflix in 4K if they use the Microsoft Edge browser. It is total BS that you need a T2 chip; my 2017 5K iMac has HEVC via Quick Sync built into its CPU. This is Apple making users of older Macs into second-class citizens.
 
Most recent Intel CPUs, from about 2017 on, have HEVC decoding in hardware via Intel Quick Sync, so a T2 chip is not required. Windows machines with those chips can play Netflix in 4K if they use the Microsoft Edge browser. It is total BS that you need a T2 chip; my 2017 5K iMac has HEVC via Quick Sync built into its CPU. This is Apple making users of older Macs into second-class citizens.

Yes, Intel and AMD graphics each have their own hardware implementations of HEVC; the 2015 iMac has the hardware. However, there's also Safari's DRM, which relies heavily on AES for Fairplay. There's no guarantee that a 2015 iMac could perform well with DRM enabled, but the T2's AES crypto engine performs nicely with Fairplay.

Also, there's nothing preventing Netflix from supporting Chrome. Why is it Apple's fault when really it's Netflix's choice not to support Chrome?
 
Cancelled Netflix years ago for Pureflix and never looked back. Even though we've got a 2020 MacBook Air, this T2 chip requirement is nonsense.

Never heard of Pureflix so I looked it up..

Pure Flix is an evangelical Christian film production and distribution studio founded by David A. R. White and Russell Wolfe, inspired by Netflix. Pure Flix produces Christian films, including God's Not Dead, Do You Believe?, Woodlawn, The Case for Christ, and Unplanned.

Although this 4K limitation seems baseless, Pureflix doesn't sound like a like-for-like replacement for Netflix :eek:
 
FWIW: I answered earlier on a 'why technically' post, so here's an opinion.

This is purely about safeguarding a nice clean 4K uninterrupted video stream, simple as that.

There are possibly (and I haven't touched a Windows system with a long mud-covered stick for 10+ years) frameworks to call on in Windows that Netflix was happy to use (let's not forget that many browsers call on native rendering and JS frameworks in the OS underneath), or that they were unhappy using on Windows, but that horse has already bolted. Possibly they didn't want to expose their valuable assets any further and looked for stronger frameworks.

Now, I haven't looked into the implementation, but I speculate that there could be an opportunity, with a new API, for a provider such as Netflix to add DRM capabilities (like we saw with BRDM) to a stream of data, and that would be the perfect reason for choosing a secure part of the ecosystem where the code could not be debugged and/or single-stepped.

The possibility of uploading custom pseudo-code to a framework is only one possible answer, and probably wrong, but my point is that the T2 is probably being used to do something novel in this area that satisfies Netflix's requirements in this case. Knowing Apple, though, it would be standards-based and open, so developers who work on the front end (I've been working on cloud systems recently) might know, and might care to share if they're not under NDA.
 
Yes, Intel and AMD graphics each have their own hardware implementations of HEVC; the 2015 iMac has the hardware. However, there's also Safari's DRM, which relies heavily on AES for Fairplay. There's no guarantee that a 2015 iMac could perform well with DRM enabled, but the T2's AES crypto engine performs nicely with Fairplay.

Also, there's nothing preventing Netflix from supporting Chrome. Why is it Apple's fault when really it's Netflix's choice not to support Chrome?
Apple has made a conscious decision to cripple 3-year-old iMacs with regard to 4K Netflix. Apple could just as easily have used the AES hardware built into the Intel chips as a fallback for machines that don't have the T2. What other features will be withheld from 3-year-old machines because of an arbitrary decision by Apple that has nothing to do with whether the hardware can handle it?
 
3. It's up to Netflix to support Chrome, which won't require the T2 (but likely uses software DRM, which would mean worse performance and be easily bypassed, so maybe Netflix doesn't want that).

Netflix only allows software DRM to get 720p, on any platform.

Notably Chrome has access to hardware DRM on Chromebooks, and gets 1080p on that hardware only.
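The tiering described above (software DRM capped at 720p everywhere, hardware DRM on Chromebooks unlocking 1080p, and T2-backed hardware DRM required for 4K in Safari) can be sketched as a lookup. The tier names are hypothetical labels for discussion, not Netflix's actual policy table:

```python
# Simplified model of the resolution caps described in this thread.
# Tier names are invented for illustration; the mapping reflects the
# claims in the posts above, not an official Netflix document.
MAX_RESOLUTION = {
    "software_drm": "720p",               # any platform, per the post above
    "hardware_drm_chromebook": "1080p",   # Chrome on Chromebook hardware
    "hardware_drm_t2": "4K",              # Safari on T2-equipped Macs
}

def max_resolution(drm_tier: str) -> str:
    """Return the maximum stream resolution for a DRM tier, defaulting
    to the universal software-DRM cap for unknown tiers."""
    return MAX_RESOLUTION.get(drm_tier, "720p")
```

The point of modelling it this way is that the cap follows the DRM robustness level, not the decode capability of the machine, which is the crux of the disagreement in this thread.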
 
Netflix only allows software DRM to get 720p, on any platform.

Notably Chrome has access to hardware DRM on Chromebooks, and gets 1080p on that hardware only.
That's because software DRM is prone to screen recording. Hardware DRM is therefore needed, and who knows how Fairplay-encrypted 4K HDR performs on non-T2 devices using Intel processors to decrypt.

This isn't an Apple issue, it's Netflix needing to protect its content.
 
T2 has hardware acceleration capabilities for video. That’s why it’s required.

With Windows, it’s possible, but you’re going to lose out on performance and battery life since it’ll probably use software decoding. Apple thinks that’s bad UX, so they put the requirement in place.

EDIT: for those who don't believe:

How does that justify it being required? The T2 having hardware acceleration means that computers with the T2 are better at playing those videos but you are suggesting it's impossible to play these videos without the T2 chip and that is not correct.

Apple's machines are rightly lauded for their high levels of performance and the way that performance persists with age. Generally, a 5-year-old MacBook Pro is going to perform better than a 5-year-old Windows laptop of the same price point.

Apple using that performance as a selling point while simultaneously saying the older laptops can't handle this video is perplexing.
 
The #1 complaint here is always about battery life. Letting the customer decide to drain their battery watching 4K would result in knee-jerk reactions and threats of class-action lawsuits, and you know it.
Let's be honest... even the not-so-smart people realize that 4K video playback probably requires more resources than 1080p. It's like towing a boat with your truck on the highway and then complaining that fuel consumption is higher than advertised.
But you mentioned one very relevant point: "letting customers DECIDE." Something Netflix really has issues with.
In the past it was possible to force-select certain resolutions, but not anymore. In the case of 4K there could (and should) be a selectable option for whether content is streamed in HEVC HDR 4K, possibly running your system at its limits, or at something lower.
 
How does that justify it being required? The T2 having hardware acceleration means that computers with the T2 are better at playing those videos but you are suggesting it's impossible to play these videos without the T2 chip and that is not correct.

Apple's machines are rightly lauded for their high levels of performance and the way that performance persists with age. Generally, a 5-year-old MacBook Pro is going to perform better than a 5-year-old Windows laptop of the same price point.

Apple using that performance as a selling point while simultaneously saying the older laptops can't handle this video is perplexing.

You're forgetting the DRM side of it, which uses the T2's AES crypto engine. Could Apple implement Safari's Fairplay in a way that spins up CPUs/GPUs at 100% to decrypt 4K HDR content? Sure, but that's terrible UX.

Also, there's a reason Netflix doesn't support Chrome's soft DRM for 4K HDR: it's prone to screen recording. But it's perfectly doable, yet Netflix gets zero flak for refusing support.
 
You're forgetting the DRM side of it, which uses the T2's AES crypto engine. Could Apple implement Safari's Fairplay in a way that spins up CPUs/GPUs at 100% to decrypt 4K HDR content? Sure, but that's terrible UX.

Also, there's a reason Netflix doesn't support Chrome's soft DRM for 4K HDR: it's prone to screen recording. But it's perfectly doable, yet Netflix gets zero flak for refusing support.
There is no need for DRM to cause 100% CPU usage.

Apple has its own reasons for doing what they did with regards to streaming 4K, but performance on 2017 Macs is not one of them.

Let's be honest... even the not-so-smart people realize that 4K video playback probably requires more resources than 1080p. It's like towing a boat with your truck on the highway and then complaining that fuel consumption is higher than advertised.
Here's a hint: Car analogies almost always suck.

Anyhow, 4K HDR HEVC decode overhead is low if the machine has hardware decoding implemented, and all Macs since 2017 (except for the 2017 MacBook Air) have it.

The only performance argument you might be able to stretch into making with Kaby Lake Macs is for the 2017 MacBook, because that one might use more than 20% CPU at times.

But even then the CPU overhead is not that big of a deal, and multi-tasking still works fine while watching such video. And it's only that machine. 4K HDR HEVC decode performance is certainly a complete non-issue on the 2019 iMac; we're talking single-digit CPU usage.
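The hardware-decode claim above (all Macs since 2017 have hardware HEVC decode, except the 2017 MacBook Air, which kept an older pre-Kaby-Lake CPU) can be written as a simple rule. This is a simplification of the thread's claims, not Apple's official compatibility matrix:

```python
def has_hw_hevc_decode(model: str, year: int) -> bool:
    """Simplified rule from this thread: Macs from 2017 onward have
    hardware HEVC decode (Kaby Lake Quick Sync or later), with the
    2017 MacBook Air as the one stated exception. Illustrative only."""
    if year < 2017:
        return False
    if model == "MacBook Air" and year == 2017:
        return False
    return True
```

For example, `has_hw_hevc_decode("iMac", 2019)` is `True` while `has_hw_hevc_decode("MacBook Air", 2017)` is `False`, matching the exception called out above.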
 
You're forgetting the DRM side of it, which uses the T2's AES crypto engine. Could Apple implement Safari's Fairplay in a way that spins up CPUs/GPUs at 100% to decrypt 4K HDR content? Sure, but that's terrible UX.

Also, there's a reason Netflix doesn't support Chrome's soft DRM for 4K HDR: it's prone to screen recording. But it's perfectly doable, yet Netflix gets zero flak for refusing support.
On what basis do you make the claim that it would spin up the GPU/CPU to 100%? Intel CPUs have had AES built in for a very long time. Can you provide any proof that it would use 100% of CPU or GPU performance, or are you just making things up as you go along? So far in this thread you have said that only the T2 could decode HEVC properly; once that was disproved, you went on about how it would adversely affect battery life, though for the life of me I can't see how that would affect an iMac desktop. Now you are claiming the T2 is needed to handle AES, even though Intel chips are just as capable of AES encryption and decryption, as proved by the fact that Windows systems handle Netflix 4K HDR perfectly. So please show proof of your claims. The fact is, Apple and only Apple decided to implement DRM using the T2 chip and nothing else, even though the Intel CPUs are perfectly capable of doing so.
 
On what basis do you make the claim that it would spin up the GPU/CPU to 100%?

"100%" wasn't literal.


Intel CPUs have had AES built in for a very long time

So has AMD; they've been used for crypto mining (with algorithms that use AES). Guess which chip of the three is the most power-efficient for this use case?

Just because a chip has AES implemented doesn't mean it performs EXACTLY the same as the T2. The T2 has the full package and is perfect for 4K HDR with Fairplay encryption.

So far in this thread you have said that only the T2 could decode HEVC properly

Nowhere did I say Intel couldn't decode HEVC. Go ahead, post the comment where I said that. Stop lying and stop putting words in my mouth.
 
"100%" wasn't literal.




So has AMD; they've been used for crypto mining (with algorithms that use AES). Guess which chip of the three is the most power-efficient?

Just because a chip has AES implemented doesn't mean it performs EXACTLY the same as the T2. The T2 has the full package and is perfect for 4K HDR with Fairplay encryption.



Nowhere did I say Intel couldn't decode HEVC. Go ahead, post the comment where I said that. Stop lying and stop putting words in my mouth.
Again, why is power efficiency a reason on a desktop? Apple could easily have written Fairplay to fall back to the required hardware features built into the Intel CPU, but they deliberately chose not to. In the summer, when Apple announced the move to Arm CPUs, they told people not to worry: Apple planned to carry on making Intel-based systems for two years and to support them longer than that. And yet within months they do this. At this point anyone thinking about buying an Intel-based Mac should reconsider, because you never know what Apple will do next. Stop making claims and then saying you didn't mean them literally when someone asks for proof.

BTW, in your first post on this thread you stated "T2 has hardware acceleration capabilities for video. That’s why it’s required." until it was pointed out that Intel chips can decode HEVC in hardware. Either you were unaware at that point that Intel CPUs could do that and were spouting off about things you really know nothing about, or you were saying Intel chips couldn't do it. Which is it?
 
So has AMD; they've been used for crypto mining (with algorithms that use AES). Guess which chip of the three is the most power-efficient?
I can’t believe you are actually trying to make this argument.

The 4K streaming DRM overhead on Intel is almost negligible. Is decreasing that power utilization mission critical? Obviously not.

Stop trying to mislead people into thinking this is about CPU performance on recent Macs, because it isn’t.
 
Again, why is power efficiency a reason on a desktop? Apple could easily have written Fairplay to fall back to the required hardware features built into the Intel CPU, but they deliberately chose not to. In the summer, when Apple announced the move to Arm CPUs, they told people not to worry: Apple planned to carry on making Intel-based systems for two years and to support them longer than that. And yet within months they do this. At this point anyone thinking about buying an Intel-based Mac should reconsider, because you never know what Apple will do next. Stop making claims and then saying you didn't mean them literally when someone asks for proof.

Obvious reason: thermals. The GPU/CPU in a tight enclosure such as the iMac will throttle when heavily utilized. Having 4K HDR playing in the background or on a separate display using the GPU/CPU would slow down all of the other processes that need them.

Meanwhile, the T2 chip stays passively cooled even running at 100% utilization 24/7.

I wrote my thesis on a lossless GPGPU video codec using CUDA, and I've contributed to the FFmpeg project. I know a thing or two about hardware-based video compression, as I've worked in this area in my professional career. Please don't lecture me on this stuff; I'm sure you wouldn't be able to hold a candle to me in a discussion of video codecs. So when I said 100% utilization, it obviously wasn't literal. If that's your main argument, then it sounds like you're just unreasonably angry at Apple for whatever reason and need to vent. In that case, the argument is meaningless.

Do you understand now? I hope you do because I'm sick of your trolling attitude so I'm going to not bother continuing this discussion with you. Feel free to reply but I'm done conversing with you.
 
Obvious reason: thermals. The GPU/CPU in a tight enclosure such as the iMac will throttle when heavily utilized. Having 4K HDR playing in the background or on a separate display using the GPU/CPU would slow down all of the other processes that need them.

Meanwhile, the T2 chip stays passively cooled even running at 100% utilization 24/7.

I wrote my thesis on a lossless GPGPU video codec using CUDA, and I've contributed to the FFmpeg project. I know a thing or two about hardware-based video compression, as I've worked in this area in my professional career. Please don't lecture me on this stuff; I'm sure you wouldn't be able to hold a candle to me in a discussion of video codecs. So when I said 100% utilization, it obviously wasn't literal. If that's your main argument, then it sounds like you're just unreasonably angry at Apple for whatever reason and need to vent. In that case, the argument is meaningless.

Do you understand now? I hope you do because I'm sick of your trolling attitude so I'm going to not bother continuing this discussion with you. Feel free to reply but I'm done conversing with you.
It uses hardly any CPU or GPU resources. If you have proof otherwise, please post it, but I can tell you from actual experience that this is not the case.
 
That’s what Apple’s been doing with the iPhone ever since the A-series chip debuted, though.

And all in all, it hasn’t been so bad. For example, iOS 14, released in 2020, is compatible with the iPhone 6S released in 2015.

Though, of course, there are some things that an iPhone 6S can’t do on iOS 14 because of hardware limitations, but [throttlegate aside], Apple doesn’t take away features that were available when the device was released, e.g., 3D Touch – it’s officially discontinued, but iOS 14 still supports it on the devices that have it [iPhones 6S through XS].

...but it is in fact “discontinued” on older Apple Watches
 
Does this mean the macOS TV app will finally play UHD content? It's outrageous that you can't play your UHD iTunes movies in UHD on your Mac, only on an Apple TV if you have one.

Reasons I refuse to even consider a lot of Apple products outside of the iPhone. I like the iPhone but man, it doesn’t play nice at all with anything non-Apple really...
 