HDCP works over DVI and in fact was developed for DVI... before HDMI existed. Additionally, HDMI is single-link DVI plus 8-channel digital audio rolled into a single, compact connector. The HDMI video specs (even the latest, v1.3) all define standard resolutions and timings based on DVI v1.1.
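To illustrate how close the two interfaces are at the wire level, here's a rough signal-level summary (signal names per the public specs; a simplified sketch, not a full pinout):

[code]
# Signals shared by single-link DVI and HDMI Type A.  An HDMI-to-DVI
# adapter is purely passive pin reassignment because the TMDS link
# is electrically the same on both connectors.
SHARED = [
    "TMDS Data0/1/2 (+/-)",  # three pixel-data channel pairs
    "TMDS Clock (+/-)",      # link clock pair
    "DDC (SDA/SCL)",         # I2C bus carrying EDID and the HDCP handshake
    "Hot Plug Detect",
    "+5V",
]

# What HDMI adds on top of single-link DVI:
HDMI_ONLY = [
    "CEC",             # one-wire consumer control bus
    "Audio/aux data",  # multiplexed into TMDS blanking ("data islands"),
                       # not separate wires -- so audio is lost through
                       # a plain DVI adapter
]

for sig in SHARED:
    print("shared:", sig)
for sig in HDMI_ONLY:
    print("HDMI only:", sig)
[/code]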
The store I was working with has been testing various setups with Cable/Satellite providers and audio manufacturers in their labs, and it doesn't work. Basically, HDMI has problems with crossovers. What happens is that the HDCP is not being passed. [/quote]
Actually, most of the problems out there with HDMI devices are pretty much limited to satellite and cable receivers. Strange, but true... Starting with Hughes and Echostar... Echostar is the worst, as their HDMI port isn't even HDMI. They call it an "HDTV Port" and they have not licensed the HDMI spec. It only works with a very small percentage of displays out there and does not properly interface HDCP, even if you use an HDMI-to-DVI adapter (nothing more than a cable or adapter with pin reassignment) and send it to a DVI-HDCP display. This is currently a huge problem with their VIP-622 DVR and VIP-211 HD receiver.

DirecTV has terrible HDMI audio bugs with their HR-250 DVR. The updated DVR, which is in beta market testing right now, is reportedly much better. The older H-10 receivers had the same problems as the HR-210/250. The current H-20 is better, but still has some issues.

Most SA cable receivers have broken HDMI, and on them it's actually not the HDCP that's the big problem: they didn't properly implement the EIA/CEA-861 timings. Connecting them to a DVI display usually works because DVI is required to adapt to differing timings (or at least to a certain range of common formats); HDMI is not.
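To give an idea of what "the timings" means here, a rough sketch of a few of the standard CEA-861 formats a source is expected to hit exactly (figures per the published tables; illustrative, not exhaustive):

[code]
# A few of the standard EIA/CEA-861 timings an HDMI source is expected
# to emit exactly.  Totals include blanking; the relationship is:
#   pixel_clock = h_total * v_total * frame_rate
TIMINGS = {
    # name:     (h_active, v_active, h_total, v_total, pixel_clock_mhz)
    "480p60":   (720,  480,  858,  525,  27.0),
    "720p60":   (1280, 720,  1650, 750,  74.25),
    "1080i60":  (1920, 1080, 2200, 1125, 74.25),  # 30 frames = 60 fields
    "1080p60":  (1920, 1080, 2200, 1125, 148.5),
}

for name, (ha, va, ht, vt, clk) in TIMINGS.items():
    frames = clk * 1e6 / (ht * vt)  # frames per second
    print(f"{name}: {ha}x{va} active in {ht}x{vt} total, "
          f"{clk} MHz -> {frames:.2f} frames/s")
[/code]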
[quote]So a workaround is to use DVI, which ignores HDCP completely.[/quote]
Uh, no. But most displays with a true DVI connector are far more tolerant of the signals they can accept, because they use a true DVI interface and not an HDMI interface, which is a subset of DVI and only has to adhere to the resolutions and timings set forth within the HDMI spec. And to restrict that further, they don't even have to do that... They only have to support the resolutions and timings that the manufacturer wishes them to. And most manufacturers are using one of the four different Silicon Image chipset variants on the market right now, all of which are fairly limited, even the latest two additions to the SI family, which support 1080p. The newest is the v1.3 chipset, with the expanded color depth capabilities and more timing options included.
If you're working around an HDMI issue by going to DVI, then it's most likely not an HDCP issue, but rather a signal timing issue. However, DVI is also more tolerant than HDMI of HDCP signals appearing in different places and timings within a format, so this still could be the case... What it all comes down to is that manufacturers are trying to be too literal with their HDMI implementations instead of allowing some slack or dynamic ability. Rather stupid on their part, but I think it's really a way for the consumer electronics industry to essentially stall and delay the massive rollout of HDMI/HDCP. I'm not usually one for conspiracy theories, but the CE industry does not like HDCP - content providers have forced it upon them.

Same thing with 1080p, but in reverse. Content providers have fought 1080p from the start, even broadcasters, often citing extra costs or lack of hardware capability as an excuse, but it really comes down to pressure from content providers on manufacturers. 1080p at 24, 30, 50 and 60 Hz were ATSC-defined standards right from the beginning, and there's no reason they couldn't have been implemented from day one. 1080p30 has no different bandwidth requirements than 1080i @ 60Hz, and 1080p24 is even less demanding than that. It took HD-DVD and BluRay to push 1080p into the mainstream for new displays, and even with that, look at the prices: true 1080p displays this year are still 40% cheaper than pseudo-1080p displays from last year.
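To put rough numbers behind that bandwidth point, a quick back-of-the-envelope sketch (counting active pixels only; the real CEA-861 timings add roughly 20-30% blanking overhead on top of these):

[code]
# Back-of-the-envelope pixel rates (active pixels only).
def mpix_per_sec(width, height, rate, interlaced=False):
    lines = height // 2 if interlaced else height  # a field is half a frame
    return width * lines * rate / 1e6

for name, rate in [
    ("1080i60", mpix_per_sec(1920, 1080, 60, interlaced=True)),
    ("1080p30", mpix_per_sec(1920, 1080, 30)),
    ("1080p24", mpix_per_sec(1920, 1080, 24)),
    ("1080p60", mpix_per_sec(1920, 1080, 60)),
]:
    print(f"{name}: {rate:6.1f} Mpixels/s")

# 1080i60:  62.2   -- identical to 1080p30
# 1080p30:  62.2
# 1080p24:  49.8   -- lighter still
# 1080p60: 124.4   -- the format that actually needed new hardware
[/code]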
Seriously, HDCP is part of the DVI spec, and the shift to HDMI has nothing to do with wanting HDCP in there. The primary reasons behind HDMI are that it's actually cheaper to implement than DVI (being a smaller-focus subset of DVI) and that it includes audio. DVI is considered a mid-range to professional connection (even though it has proliferated into consumer computer components); HDMI was targeted at the consumer right from the start.
All of that, with the exception of the audio, can pass on DVI. DVI can also do even more over dual-link connections, with layered HDCP and support for stereoscopic displays or other forms of integrated data streams. HDMI was supposed to get dual-link capability with the 1.3 update, but the dual-link portion of the spec was pulled (literally at the last minute) because it would have required a new connector type to be introduced.
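As a rough illustration of what dual link buys you (pixel clocks from the usual CEA/VESA tables; a simplified sketch at 24-bit color, with 165 MHz as the commonly cited single-link ceiling):

[code]
# Rough DVI link-capacity check at 24-bit color.
SINGLE_LINK_MAX_MHZ = 165.0  # commonly cited single-link TMDS clock ceiling

def links_needed(pixel_clock_mhz):
    # Dual link carries two pixels per TMDS clock, doubling throughput.
    return 1 if pixel_clock_mhz <= SINGLE_LINK_MAX_MHZ else 2

for name, clock in [("1080p60", 148.5),
                    ("1920x1200@60 CVT-RB", 154.0),
                    ("2560x1600@60 CVT-RB", 268.5)]:
    links = links_needed(clock)
    print(f"{name}: {clock} MHz -> {'dual' if links == 2 else 'single'} link")
[/code]
[/QUOTE]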
Thank you so much for your clean and concise explanation.
I have literally walked into the store with my credit card in hand ready to buy my new gear and ended up walking out scratching my head, more confused and frustrated than before.
Quick question (I apologize for being off topic, but perhaps this can help others too). So would this solution work: use HDMI to go from the Elite Monitor to the Elite A/V receiver, and then use component connectors for everything else, since the TV is just a giant monitor?
Thanks again