I touched on this earlier, but this copy protection is supported by the Intel chips in all of the 2017 iMacs. In fact, this is one of the reasons I specifically bought my iMac in 2017 (and not the 2015 model). It is not supported by the Intel Xeon chips in the 2017 iMac Pro (which have no integrated graphics), but it is supported by the iMac Pro's AMD GPUs. All the 2017 MacBooks and 2017 MacBook Pros support this as well. (Note though that it is NOT supported by the 2017 MacBook Air, which still uses an older Broadwell chip.)

So either it's due to some other separate hardware requirement, or else it's just that Apple arbitrarily decided not to support the 2017 Macs.
 

This is essentially a game changer. Up until this point, the 2019 iMac seemed like a normal performance-focused spec refresh of the 2017 iMac, but now that we know it supports HDR playback while the 2017 iMac is left behind, it's far more than a spec bump and is in a much better position. How is HDR content going to be represented on the 5K display, though? As far as I'm concerned, the iMac's 5K display is not a true HDR panel.
 
My original plan was to wait for both the copy protection and a 6-core chip, but the chip delays killed that plan. I couldn't wait until 2018 for a new machine, so I bought the 2017 4-core thinking it would have the copy protection necessary for iTunes and Netflix 4K (since it has the hardware necessary for 4K streaming on the Windows side). That turned out to be false, but in retrospect it worked out, since there was no 2018 iMac anyway. I would have had to wait until 2019, almost 2 years later.

The downside is I won't get the 4K streaming I was expecting from the 2017 models. At least non-DRM'd 4K HDR does play fine on the 2017 iMac (but not on the 2015 iMac). Better than nothing, I suppose.

Those of you with the 2019 iMac won't get true HDR, but you will nonetheless get a good dithered image, in 4K. Would 4K HDR actually make a difference on an iMac, given the screen is only 27"? I would say yes, but only if you're watching full screen from close up. If you're just watching in a window it won't help much, obviously, and 4K won't help much either if you're more than about 3 feet away.
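
If you want to sanity-check that 3-foot figure, here's a rough back-of-the-envelope sketch (my own assumed numbers: ~23.5" of visible width on a 27" 16:9 panel, and ~60 pixels per degree as the usual 20/20 acuity threshold):

```swift
import Foundation

// Rough check: how many pixels per degree does 4K-wide content give you
// on a 27" 16:9 screen at ~3 feet?
let screenWidthInches = 23.5       // assumed visible width of a 27" 16:9 panel
let contentPixelsAcross = 3840.0   // 4K content scaled to full screen width
let viewingDistanceInches = 36.0   // ~3 feet

let halfFOVDegrees = atan((screenWidthInches / 2) / viewingDistanceInches) * 180 / .pi
let pixelsPerDegree = contentPixelsAcross / (2 * halfFOVDegrees)
print(String(format: "%.0f px/deg at %.0f inches", pixelsPerDegree, viewingDistanceInches))
// ~106 px/deg, already well past the ~60 px/deg acuity limit, so from
// farther away the extra 4K detail is mostly invisible.
```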
 
With 500 nits of max brightness, the current iMacs are not exactly HDR displays either.
The new XDR display, with 1000/1600 nits, will be.
 
I get that, but Apple seems to be pushing a narrative that the 2019 iMacs are capable of HDR, and I don't think they would do so if the machines weren't capable of HDR playback. We'll see when Catalina is released what they really mean by "HDR", because we certainly know that true HDR requires a display that can reach brightness levels of more than 500 nits.
 
There is no technical issue preventing this. Zip. Zero. None. We know this because 4K streaming works on Windows via Boot Camp, and has since 2017.

But don’t let that stop the mindless, armchair speculation going on here.

iTunes is going away, remember? There will be a new video app. Let's see what happens.
 
[Attached screenshot of the Apple footnote being discussed]


We will need to wait a bit. I don't know if the wording of this footnote is half-assed or intentionally vague.

It can be interpreted in many ways.

Maybe 2018 Macs are the requirement for Dolby Atmos (like a later footnote says), and a 4K screen is required for 4K playback. Apple has streamlined 4K HEVC HDR decoding to the point that Haswell can handle it (my 2013 can), although I know there are DRM requirements publishers will likely impose.
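
As an aside, if you want to check whether your own Mac has a hardware HEVC decoder (versus the software path an older machine like my 2013 falls back to), VideoToolbox has a query for exactly that; a minimal sketch:

```swift
import CoreMedia
import VideoToolbox

// True only when a fixed-function HEVC decoder is present (macOS 10.13+);
// older Macs report false and decode HEVC in software instead.
let hasHardwareHEVC = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
print("Hardware HEVC decode: \(hasHardwareHEVC)")
```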

Also, the footnote specifically says 4K-resolution screens. Again, meaning what? If "screens" implies built into the Mac, that limits it to a couple of iMacs, and if "4K" is literal, to a couple of 21" iMacs....

Apple has incentive to sell their movies to everyone, and no good reason to limit access. Keep in mind this is a public company; Apple needs their services to take off so they can get on stage and say "TV+ now has over 40 quadrillion users!" whoooaaaaa.....

I think we just need to be patient and wait for clarification.
 

4K streaming is one thing... Dolby Vision and HDR are a lot more.
 

I know you weren't quoting me, so forgive my intrusion, but....

With the enhancements to macOS, my 2013 can decode 4K 60 Hz 10-bit with HDR10 metadata, from a much more complex encode than a streaming service uses, all in software and all in real time. Granted, it's one hell of a load on the CPU, but that's an extreme case on an older machine. The 2017, with hardware decoding, shouldn't break a sweat.

I think the issue is just an oversight/poor wording, or something related to the content publishers, like how they restrict playback software.
 
Hmmm....

After doing further investigation, this could be an HDCP issue as well, at least on external displays. 2018 Macs started getting Intel's Titan Ridge TB3 controller, which includes DP 1.4. Prior to that, the TB3 controller used DP 1.2, which doesn't support HDCP 2.2, a requirement for 4K content streaming.

We'll see...
 
The Intel CPUs in the 2017 Kaby Lake iMacs support HDCP 2.2. In fact, even for the 2017 MacBook, Apple specifically waited for the second revision of the Kaby Lake Core m3, which includes HDCP 2.2. The first version of the Kaby Lake Core m3-7Y30 (SR2ZY) does not, but Apple doesn't use the m3-7Y30 at all, so it's moot. (A quick way to check which chip your own machine has is sketched after the list below.)

Core m3-7Y30 (SR2ZY) - No HDCP 2.2 (2016 model). Apple does not use this.
Core m3-7Y30 (SR347) - Has HDCP 2.2 (2017 model). Apple does not use this.
Core m3-7Y32 (SR346) - Has HDCP 2.2 (2017 model). This is what Apple uses.
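
(Reading the CPU brand string is one quick way to confirm which chip your own machine has; a minimal sketch below. Note it reports the model number, e.g. m3-7Y32, but not the S-spec stepping.)

```swift
import Foundation

// Read the CPU brand string via sysctl. On a 2017 MacBook this prints
// something like "Intel(R) Core(TM) m3-7Y32 CPU @ 1.10GHz".
var size = 0
sysctlbyname("machdep.cpu.brand_string", nil, &size, nil, 0)
var brand = [CChar](repeating: 0, count: size)
sysctlbyname("machdep.cpu.brand_string", &brand, &size, nil, 0)
print(String(cString: brand))
```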

However, regardless of external monitor support, Apple doesn't support 4K streaming on the built-in Kaby Lake iMac screens either.

We can only hope there will be some way to patch the OS to allow streaming support on those 2017 iMacs, since the hardware support does seem to be there.
 

What do you mean by "patch"? If Apple doesn't support HDR content on pre-2019 iMacs, then there has to be a reason for that.
 
That's the point. We can't figure out any reason why 4K streaming won't work on the 2017 Kaby Lake iMac models, aside from the fact that Apple has declared them unsupported.

One reason I specifically bought the 2017 iMac was that, on paper, it checked all the boxes needed for 4K streaming. (The 2015 iMac does not.) Now it's 2019 and the 2017 iMac still checks all the required boxes, except Apple has decided not to support 4K streaming on it. Ironically, 4K streaming (Netflix) actually works on those 2017 iMacs... if you run Windows.

As for the patch: for example, 10.12 Sierra and 10.13 High Sierra are not supported on my mid-2009 MacBook Pro Core 2 Duo. However, the hardware is basically identical to the late-2009 MacBook, which came out only a few months later. So @dosdude1 and friends created software patches for both Sierra and High Sierra, and both OSes can be installed on the 2009 MacBook Pro and work fine. The main annoyance compared to a fully supported machine is that some of the latest Security Updates from late 2018 and 2019 won't easily install. But otherwise High Sierra runs just fine... which isn't a surprise, because internally it's basically the same as another machine that is fully supported.
 

I'm sorry, maybe I was too vague.

HDCP 2.2 is content protection (DRM) required by movie studios for streaming services to use on 4K/HDR media; Apple will not be an exception. For HDCP-protected media to be properly decrypted and displayed on screen, all the hardware involved has to be HDCP compliant. So the monitor, the GPU (either integrated or discrete), the port and its controller on the Mac/PC, and the cable between the monitor and the Mac/PC all need to be HDCP 2.2 compatible. Any break in that chain and the HDR stream can't be decrypted; if you're lucky you'll get the SDR stream, and if you don't have any HDCP support you'll get a content error.
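
To make that weakest-link rule concrete, here's a toy model of the chain (purely hypothetical types made up for illustration, not any real macOS API):

```swift
// Hypothetical model: each hop (GPU, port/controller, cable, monitor)
// advertises an HDCP level; the chain is only as strong as its weakest hop.
enum HDCPLevel: Int, Comparable {
    case none = 0, v1 = 1, v2_2 = 2
    static func < (a: HDCPLevel, b: HDCPLevel) -> Bool { a.rawValue < b.rawValue }
}

func effectiveHDCP(chain: [HDCPLevel]) -> HDCPLevel {
    chain.min() ?? .none
}

// GPU and monitor are fine, but a DP 1.2 port/controller caps out at HDCP 1.x.
let chain: [HDCPLevel] = [.v2_2, .v1, .v2_2, .v2_2]
switch effectiveHDCP(chain: chain) {
case .v2_2: print("4K HDR stream decrypts")
case .v1:   print("fallback to the SDR stream, if you're lucky")
case .none: print("content error")
}
```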

That isn't really a big deal, because HDCP is baked into the standards it's riding on. Thunderbolt is DisplayPort video out with PCIe access, and HDCP 2.2 is a feature of DisplayPort 1.3 and later.

[Attached screenshot: DisplayPort version/HDCP feature table]


And finally, the problem. The Thunderbolt controller used in the 2017 Macs is Intel's Alpine Ridge, and it uses DisplayPort 1.2, which does not support HDCP 2.2. So the requirement that the port and its controller handle HDCP decryption is not met.

Thunderbolt 3 controller used in 2017 and earlier (Alpine Ridge)
https://ark.intel.com/content/www/u...1/intel-jhl6540-thunderbolt-3-controller.html

Thunderbolt 3 controller used in 2018 and later (Titan Ridge)
https://ark.intel.com/content/www/u...0/intel-jhl7540-thunderbolt-3-controller.html

2018 Macs began using Titan Ridge which is DisplayPort 1.4 with HDCP 2.2 support.

Unless Apple can fill that gap via a firmware update, which might not be possible given the large bandwidth increase going from DP 1.2 to DP 1.3, there is an actual hardware limitation in the 2017 models.
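
For concreteness, here's a small sketch putting the two controllers side by side (figures from the DisplayPort specs: effective data rate after 8b/10b line encoding is 17.28 Gbit/s for DP 1.2/HBR2 versus 25.92 Gbit/s for DP 1.3+/HBR3, and HDCP 2.2 arrives with DP 1.3):

```swift
// Illustrative comparison of the DP links behind the two TB3 controllers.
struct DisplayPortLink {
    let name: String
    let effectiveGbps: Double  // after 8b/10b line encoding
    let supportsHDCP22: Bool
}

let alpineRidge = DisplayPortLink(name: "Alpine Ridge (2017 Macs, DP 1.2)",
                                  effectiveGbps: 17.28, supportsHDCP22: false)
let titanRidge  = DisplayPortLink(name: "Titan Ridge (2018+ Macs, DP 1.4)",
                                  effectiveGbps: 25.92, supportsHDCP22: true)

for link in [alpineRidge, titanRidge] {
    print("\(link.name): \(link.effectiveGbps) Gbit/s effective, HDCP 2.2: \(link.supportsHDCP22)")
}
// That 17.28 -> 25.92 Gbit/s jump is the bandwidth gap a firmware update
// alone probably can't bridge.
```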

All that is for external monitors. Built-in displays can likely be addressed with software updates and politics with the streaming services.
 
Great explanation, thank you.
 
All that is for external monitors. Built-in displays can likely be addressed with software updates and politics with the streaming services.
Indeed. IOW, we know of no technical reason why the 2017 iMacs can't stream DRM'd 4K on their built-in monitors.

Personally, I don't care that my 2017 MacBook can't stream DRM'd 4K, or that my 5K iMac can't do it on an external screen. However, I'm irritated that I can't do it on my built-in 2017 iMac screen. It's actually my only 4K+ computer screen; my other 4K screens are TVs, and I have other 4K sources for those.
 
Hello,
I'm not that computer savvy, so is there a reason why the iMac Pro (2017) isn't getting the 4K/Dolby Vision/HDR support? I purchased mine 4 months ago, and it is still full price on the Apple Store. This seems really short-sighted of Apple, screwing over loyal customers who spend small fortunes on their computers. I hope that whatever the reason is, it can be dealt with and corrected before the official release.
 
See post #40. That is why.
 
Assuming the eGPU hangs off the Mac via the PCIe bus, and the card has HDMI 2.0 or DP 1.3/1.4 with HDCP 2.2 support, why wouldn't it work?
Am I overlooking something?
Read Post #40.

"And finally the problem. The thunderbolt controller used in the 2017 Mac is Intels Alpine Ridge and its using DisplayPort 1.2 which does not have support for HDCP 2.2. So the requirement of the port and controller to handle HDCP decryption is absent."
 