
Docadd (macrumors newbie, joined Nov 5, 2013, Oxford, UK)
Original poster
Hello,

I've got a Yamaha RX-V475 receiver which supports 4K but, I believe, does not support HDR.

Now I've done a workaround by plugging the Apple TV (old one) directly into my TV (Samsung K8000) via HDMI, and used an optical cable to get the audio into the AV receiver.

When I watched the Apple presentation it looked like 4K and HDR can be combined. Is this correct, or have I got my wires crossed? I'm just wondering if I've done a lot of faffing around with my receiver for nothing.
 
I don't know what you mean by HDR and 4K combined; that's just how it is.
You plug the Apple TV 4K in and you get whatever your TV supports. Feeding audio from your TV to your receiver is never ideal, but it can be done if your TV supports it: either optical out or ARC from the TV.
In that regard the new Apple TV is no different from your current one.
 
What I mean is the new Apple TV 4K is 4K HDR but my receiver is only 4K compatible. Does that mean I can get the 4K but not the HDR quality?
 
You could use an optical cable instead, then. I'm not sure how your TV works, but on my Samsung KS8000 I just connect everything with HDMI to the little box (One Connect, or something) and then plug the optical out into my receiver. I think optical only supports 5.1 though, so if you have 7.1+ you're SOL.
 
What I mean is the new Apple TV 4K is 4K HDR but my receiver is only 4K compatible. Does that mean I can get the 4K but not the HDR quality?
If you run it through your receiver, that is correct.
Does your TV support HDR? If so, the only way to get HDR is to run the Apple TV through the TV.
You could use an optical cable instead, then. I'm not sure how your TV works, but on my Samsung KS8000 I just connect everything with HDMI to the little box (One Connect, or something) and then plug the optical out into my receiver. I think optical only supports 5.1 though, so if you have 7.1+ you're SOL.
Yes, Dolby Digital 5.1, not DD+, so the sound isn't that great.
 
You could use an optical cable instead, then. I'm not sure how your TV works, but on my Samsung KS8000 I just connect everything with HDMI to the little box (One Connect, or something) and then plug the optical out into my receiver. I think optical only supports 5.1 though, so if you have 7.1+ you're SOL.

I have the same TV and have just purchased an optical cable to get this working. Will I lose sound quality using an optical cable rather than HDMI? By the way, it's 5.1.
 
99% of people won't notice the difference, plus I don't think the Apple TV even supports the lossless audio formats. If your sound system is so high end that you COULD notice the difference, your receiver would most likely already support HDR anyway.
 
I have the same TV and have just purchased an optical cable to get this working. Will I lose sound quality using an optical cable rather than HDMI? By the way, it's 5.1.
Optical and HDMI out of the TV to the receiver deliver the same audio.
The best you can get out of the Apple TV is HDMI from the Apple TV to the receiver.
Does the new Apple TV 4K support DD+ then?
Yes, 7.1 DD+

The difference between DD and DD+ is VERY noticeable.
 
It seems my JS8500 can only pass up to 24-bit/48kHz audio, so I cannot upmix (DTS Neo:6) into my receiver via optical at anything higher than that.

If I connect HDMI into the receiver I can input 24-bit/96kHz, but then PLII does not work.

So no luck for upmixed Qobuz hi-res at present.
 
Optical and HDMI out of the TV to the receiver deliver the same audio.
The best you can get out of the Apple TV is HDMI from the Apple TV to the receiver.
Yes, 7.1 DD+

The difference between DD and DD+ is VERY noticeable.

So with my 5.1 system, I should be fine with the optical cable solution? My receiver does not support HDR but does support 4K, so without buying a new receiver I think this is my only option.
 
So with my 5.1 system, I should be fine with the optical cable solution? My receiver does not support HDR but does support 4K, so without buying a new receiver I think this is my only option.
Sounds like it. But put a new receiver on your wishlist!
 
The difference between DD and DD+ is VERY noticeable.
That depends entirely on the codec configuration. When Netflix introduced DD+, they streamed at 192kbps low-bitrate mode, which sounded audibly worse than the 384kbps DD they used before. Not sure what configurations they use today though. At higher bitrates such as 640kbps I personally cannot detect a difference between DD and DD+ with 5.1 channels. Of course DD+ does support higher bitrates than that, so this may or may not bring some audible improvements.
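The bitrate comparison above is easy to sanity-check with a little arithmetic: spread each total bitrate across the six discrete channels of a 5.1 mix. A quick sketch (the stream names and rates come from this thread, not from any official Netflix or Dolby spec):

```python
# Compare the per-channel bitrate of the 5.1 stream configurations
# mentioned in this thread. These figures are forum-reported, not
# official; they just illustrate why a 192 kbps DD+ track can sound
# worse than a 384 kbps DD track despite the newer codec.

def per_channel_kbps(total_kbps, channels):
    """Average bitrate available per channel, in kbps."""
    return total_kbps / channels

# A 5.1 mix carries 6 discrete channels.
configs = {
    "DD+ 192 kbps (early Netflix)": 192,
    "DD 384 kbps": 384,
    "DD 640 kbps": 640,
}

for name, rate in configs.items():
    print(f"{name}: {per_channel_kbps(rate, 6):.0f} kbps/channel")
```

The point being that codec efficiency only buys so much: 32 kbps per channel leaves DD+ far less to work with than plain DD at double or triple the total rate.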
 
The Apple TV 4K is HDCP 2.2 over HDMI 2.0a.

When I plugged an HDCP 2.2, HDMI 2.0a source into an HDMI 2.0 receiver, I got no picture. It was from a PC with an Nvidia card; even changing the settings to 1080p etc. still didn't allow the picture to pass through.

HDMI 2.0a receivers are new this year (there were a couple that got updates last year). Buying an HDMI 2.0a receiver this year means that when HDMI 2.1 comes out next year you'll be outdated again.

I'm 99.9% sure that you won't, as you say, get 4K just without HDR/Dolby Vision. You won't get anything, because it won't pass through. The solution will be to plug directly into your TV, and I don't think it will pass the sound out over optical to your receiver. We'll find out when the Apple TV 4K gets into the hands of someone who knows what to test for.
 
In your case I suspect the best results will be via HDMI ARC. With a K8000 TV and an RX-V475, you really shouldn't be using optical any more, as you're only shooting yourself in the foot by doing so.
 
That depends entirely on the codec configuration. When Netflix introduced DD+, they streamed at 192kbps low-bitrate mode, which sounded audibly worse than the 384kbps DD they used before. Not sure what configurations they use today though. At higher bitrates such as 640kbps I personally cannot detect a difference between DD and DD+ with 5.1 channels. Of course DD+ does support higher bitrates than that, so this may or may not bring some audible improvements.
I can't speak to when DD+ was introduced on Netflix, but now the difference is huge.
 
One issue that may pop up is with DV (Dolby Vision) HDR content. Most of the TVs out there that can display DV cannot display it at 60 Hz; most are only capable of HDR10 at 24 and 60 Hz, and DV at 24 Hz only. This would also affect Netflix, Vudu and Amazon, which do have DV content; it may not display properly on DV-capable TVs. So we may be forced to use 30 Hz, which creates a lot of judder, or drop down to 1080p. From the Apple support page:

"If your TV doesn’t support HDR10 or Dolby Vision at 60Hz (50Hz in Europe), Apple TV 4K can use these formats at 30Hz (25Hz in Europe), but you'll need to manually select a lower refresh rate in Settings > Video and Audio. Using lower refresh rates can result in poor performance, or choppy video when navigating on the home screen, within apps, or playing games. In these cases, Apple recommends lowering resolution to 1080p at 60Hz (50Hz in Europe) instead, and letting your television upscale to 4K. You will still be able to use HDR10 and Dolby Vision at these resolutions."
 
Any streaming service: Netflix, Amazon, etc. I don't know what audio bitrate they're using now, just that the difference is obvious.
OK, I just fired up my Roku and started "Rogue One" in Netflix. According to my AVR the 5.1 DD+ track still has 192 kbps. I have a hard time believing that this low-bitrate track sounds any better than DD at 384 kbps or more. I could easily hear compression artifacts in the opening music.
 
OK, I just fired up my Roku and started "Rogue One" in Netflix. According to my AVR the 5.1 DD+ track still has 192 kbps. I have a hard time believing that this low-bitrate track sounds any better than DD at 384 kbps or more. I could easily hear compression artifacts in the opening music.
You'll need to do an A/B comparison. It's pretty obvious. I would be absolutely shocked if you couldn't hear the difference. To me DD+ sounds closer to TrueHD than to DD.
 
At 192kbps? No way. That's really low for a 6-channel track.
Perhaps your Roku limited the bitrate? I don't know. I tested on my LG B6 over ARC and would flip the input back and forth to my Shield, the LG being DD and the Shield DD+.
 
Perhaps your Roku limited the bitrate? I don't know. I tested on my LG B6 over ARC and would flip the input back and forth to my Shield, the LG being DD and the Shield DD+.
Very likely what you heard is just one of them being louder than the other (which doesn't mean it's better).
 
Perhaps your Roku limited the bitrate? I don't know. I tested on my LG B6 over ARC and would flip the input back and forth to my Shield, the LG being DD and the Shield DD+.

Keep in mind that some TV makers implement ARC weirdly; some only allow 2.0 through, for some reason. If anything, I would test HDMI vs. optical, since both are known to support the formats in question (DD+ over HDMI, DD over optical).
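For anyone following along, the rough capability differences between the audio paths discussed in this thread can be summarised like so (a simplification based on the posts above; "typical ARC" means a pre-eARC set, and individual TVs vary):

```python
# Rough summary of which audio formats each connection path in this
# thread can carry. Based on the discussion above, not an exhaustive
# spec: optical tops out at DD 5.1, typical (pre-eARC) HDMI ARC adds
# DD+, and only a direct HDMI link carries lossless TrueHD.

transports = {
    "Optical (TOSLINK)":  {"DD 5.1": True,  "DD+ 7.1": False, "TrueHD": False},
    "HDMI ARC (typical)": {"DD 5.1": True,  "DD+ 7.1": True,  "TrueHD": False},
    "HDMI (direct)":      {"DD 5.1": True,  "DD+ 7.1": True,  "TrueHD": True},
}

def transports_for(fmt):
    """Return the connection paths that can pass the given format."""
    return [name for name, caps in transports.items() if caps.get(fmt)]

print(transports_for("DD+ 7.1"))
print(transports_for("TrueHD"))
```

Which is the whole trade-off in this thread: the OP's optical workaround preserves 4K HDR on the TV but caps the audio at DD 5.1.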
 