
mcc

macrumors member
Original poster
Jul 18, 2007
I have a Mac Pro 3,1 (Early 2008, 2 x 2.8 GHz Quad-Core Xeon) running OS X 10.7.4 that had an NVIDIA 8800GT installed when purchased. My display is a 30" Cinema Display.

I just purchased and installed Apple's ATI Radeon 5770 Graphics Card Upgrade.

The graphics card works flawlessly EXCEPT that my HD movies purchased from iTunes are no longer viewable: both QuickTime and iTunes report that an HDCP-capable display must be installed.

Before uninstalling and returning the ATI Radeon 5770, is there a way to disable the HDCP feature?

I'm assuming that HDCP is a hardware feature implemented in the ATI Radeon 5770 and that it can be disabled by the OS X kernel.
 
Your solutions are:

1. Remove the DRM from the iTunes movie (hooray for DRM screwing over people who legitimately buy the content)
2. Return the card
3. Get an HDCP-capable display, which the 30" isn't
 
mcc - I have the exact same setup and have not had any issues playing iTunes rented content or DVDs. I don't have any purchased movies, though, so I'm not sure whether that's any different.
 
Have you already tried removing (recommend renaming or compressing it instead):

/System/Library/Extensions/AppleUpstreamUserClient.kext
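If you'd rather set it aside than delete it, something along these lines should do it from Terminal (the .disabled name is just my suggestion; paths assume a stock 10.7 install):

Code:
sudo mv /System/Library/Extensions/AppleUpstreamUserClient.kext /System/Library/Extensions/AppleUpstreamUserClient.kext.disabled
sudo touch /System/Library/Extensions

Touching the Extensions folder prompts the kext cache to rebuild on the next boot; reverse the mv to put it back.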

Your solutions are:

1. Remove the DRM from the iTunes movie (hooray for DRM screwing over people who legitimately buy the content)

Yup. If there's a way to do this that doesn't require iTunes DRM to function correctly for the stripping, I'm all ears. I've done it, but I had to have iTunes DRM working correctly before I could strip the DRM. It was a PITA, flaky, and it wouldn't work if iTunes/QT threw HDCP errors on playback.

What's really fun is that Apple puts DRM on free movies too (e.g., Truth in 24, Truth in 24 II).
 
mcc - I have the exact same setup and have not had any issues playing iTunes rented content or DVDs. I don't have any purchased movies, though, so I'm not sure whether that's any different.

The difference may be in the way you have iTunes configured. I recently configured iTunes for 1080p playback. Previously, it had been configured for 720p playback, after Apple released a patch roughly three years ago that allowed HD content to be played on a Mac Pro with an NVIDIA 8800GT.

You can change iTunes to use standard definition as the default; however, the movies lose all their clarity and definition when viewed full-screen on a 30" Cinema Display.
 
Have you already tried removing (recommend renaming or compressing it instead):

/System/Library/Extensions/AppleUpstreamUserClient.kext




I haven't modified anything. I would assume, for diagnostic purposes, that there are bit(s) that can be cleared or set in an ATI Radeon HD 5770 device register to enable or disable the HDCP circuitry. Since the graphics card and display must be detected when the system boots, Apple should be able to modify its EFI boot code to disable the circuitry when an older Cinema Display is detected.

There could also be some "design to cost" issues involved, where the ATI "motherboard" sees only 2 dual-link DVI ports or 4 mini-DVI ports and may not be able to support disabling the HDCP logic on a per-port basis.
 
In that case, you might want to.
;)

In case I might want to, which file(s) in the hierarchy you suggested should I be looking at?

Okay, I'm an "Old Fart" who got sucked into IT network security because I was one of the few people who knew where the bodies were buried.

It's been years since I was in engineering writing compilers or modifying operating systems. Which part of the directory hierarchy should I be looking at?
 
The only thing I can think of to try is disabling the HDCP handshake by unloading the kext.

In Terminal:

disable:

Code:
sudo kextunload /System/Library/Extensions/AppleUpstreamUserClient.kext

enable:

Code:
sudo kextload /System/Library/Extensions/AppleUpstreamUserClient.kext

Verify it's loaded/unloaded in System Profiler -> Software -> Extensions
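If you'd rather check from Terminal, kextstat should show whether it's still loaded (no output from the grep means it isn't):

Code:
kextstat | grep -i AppleUpstream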

Alternatively, you can move it/compress it and reboot.

It might still not play the HD version, but that's the first thing I'd try.
 
I know this is an old thread, but if people are still checking: does anyone know how to disable HDCP on a 2014 Mac Pro running Yosemite (10.10.3)? I will never be using copyrighted material, and I'm trying to loop 9 screens but keep getting HDCP issues.
 
Your solutions are:

1. Remove the DRM from the iTunes movie (hooray for DRM screwing over people who legitimately buy the content)
2. Return the card
3. Get a HDCP display, which the 30" isn't

Even if you remove the DRM, there is a problem with HDCP. I have tested with Requiem, and HDCP is distinct from Apple DRM.


I have tested it; it works without problems.
 