Another good video! Sadly my TV doesn't do Dolby Vision; glad it does HDR10 tho...
 
Very interesting. So it is finally confirmed that the Apple TV does indeed play DV-only content in HDR10. Hopefully Apple can provide the correct metadata for each film as the platform matures. Or better yet, add HDR10+.
 
Does the DV to HDR10 conversion also work with the Netflix app? If I switch from DV to HDR10 in the Apple TV settings, Stranger Things on Netflix loses the DV tag and only 4K remains, but no HDR tag is added.
 
So outputs are fudged from a DV feed for HDR10-only sets? It would seem that sets without tone mapping are at a disadvantage.
 
So you are not able to select the chroma option when DV is enabled? That sucks.

[Attached screenshot: atvchroma.JPG]
 
I’m sorry, but this video is not good news at all for owners of HDR10-only TV sets.
What you are watching is a native Dolby Vision movie down-converted on the fly into fake HDR10 by the Apple TV.
If you watch the complete video, there is a comparison between the values used for the HDR10 layer on a native HDR10 UHD Blu-ray and the values created by the Apple TV's on-the-fly conversion.
The Apple TV applies the same fixed set of values regardless of the movie you are watching.
This is completely faked!
Apple needs to add an HDR10 logo on the iTunes Store for movies that are capable of a true native HDR10 presentation.
I don’t want to buy HDR movies and end up watching them in a fake, down-converted HDR...
 
Maybe that's how streaming/broadcast box manufacturers have decided HDR should be done? It sounds as though you really need 4K UHD Blu-ray. In a way it's like buying an Audi A4 and expecting it to be identical in every respect to the Porsche 911 you loved so dearly. £5, £8, £10 iTunes films compared to £20-£40 4K Blu-ray discs and £630 for the Oppo player. In reality, the ATV and the Oppo 4K Blu-ray player are very different and yet complement each other beautifully. Rent the film at a second's notice from iTunes, and if you love it and want to own a copy, buy it on 4K UHD Blu-ray.
However, if you were not comparing side by side, I'm sure most ATV 4K and iTunes customers would be blown away by the picture quality of the film while sat on their sofas at a distance from the TV, and of course, when Dolby Atmos comes to the ATV, it should make the audio experience better.
As I said, renting an iTunes film whenever you suddenly decide you'd like to is a true luxury. (I remember the days as a kid where you'd go to the video store and put your name on a waiting list for a new rental. Sometimes you'd wait 2 or 3 weeks; now I wait 2 seconds for it to start :) ) And if you demand the very best, buy it on disc and see it in a new way.
Sadly, iTunes isn't quite there yet for the AV purists out there, but give it two years and, maybe or maybe not, we'll see how the market goes!
I guess in 8 years we've come a long way. In 2009 broadband in Europe was 1.5 Mbps for the majority of cities; now we expect 250-300 Mbps here in the UK. In 2009 iTunes films were in SD, then we saw 720p HD, 18 months later 1080p, and now 4K HDR/DV. Crazy how the audio options haven't changed, though.
 
That is not the problem here.
I'm not talking about the difference in quality between a UHD Blu-ray and a UHD stream from iTunes; of course there is a difference.
I'm talking about the fact that if you have an HDR10 TV set, Apple needs to send the correct HDR10 metadata, not a faked down-conversion from a Dolby Vision feed.
The UHD Blu-ray in this video is only there to show us the true values of the native HDR10 metadata versus what Apple is doing with this unwanted conversion.
 

Well, sadly, it seems that Apple are doing things their way. Certainly the SDR to HDR conversion is bizarre. It seems Apple are still learning from this process, and it's very early days. I get the impression that the ATV 4K is thinking of the future and is aimed solely at OLED TVs; surely having backlight and contrast settings at 100% all of the time cannot be good for the longevity of an LCD TV? Maybe they are learning. Don't waste your beautiful life worrying about what they are doing. Apple clearly don't listen. Just enjoy it for what it is and enjoy other things. Just my take on it and how I live. It's a dead product at the end of the day; it doesn't live or breathe or make your life amazing (it won't have a lasting effect on you). Don't let it consume you! :)
 
Now, after replying to a different thread ("Does Dolby Vision Content from iTunes play on my HDR10 TV?"), I would say that you are wrong about the HDR10 output being a faked down-conversion. I would say any DV stream for the aTV 4K (Profile 5) has HDR10 as its base layer. Hence, nothing is faked; the DV enhancement layer is simply ignored.
My Despicable Me UHD Blu-ray (one of the few in DV so far) must have the same setup, because it plays in HDR10 on a non-DV screen.
 
Have you watched the video? The guy is showing that this is not the case at all.
As for the UHD Blu-ray, that is absolutely normal, because HDR10 is part of the standard and has to be present at a minimum. Unfortunately, this is not the case for streaming services.
 
Can't say about Netflix et al., but from what I've read of the aTV spec (DV Profile 5 only) and the DV level specs, I conclude (maybe falsely; I'd love to learn better) that a DV Profile 5 movie needs to have an HDR10 base layer.
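
If anyone can get at the raw file, the declared profile and layer flags live in the stream's Dolby Vision configuration box (dvcC). Here is a rough sketch of how one might read them, assuming the box payload has already been pulled out with an MP4 demuxer; the field offsets follow my reading of the Dolby Vision ISOBMFF spec, so treat them as an assumption, not gospel:

```python
import struct

def parse_dvcc(payload: bytes) -> dict:
    """Decode a dvcC/dvvC box payload (box header already stripped).

    Assumed layout: version (2 bytes), then profile(7) | level(6) |
    rpu(1) | el(1) | bl(1) packed into 16 bits, then the BL signal
    compatibility ID in the top 4 bits of the next byte.
    """
    major, minor = payload[0], payload[1]
    bits = struct.unpack(">H", payload[2:4])[0]
    return {
        "dv_version": f"{major}.{minor}",
        "dv_profile": bits >> 9,            # should be 5 on iTunes/aTV content?
        "dv_level": (bits >> 3) & 0x3F,
        "rpu_present": bool(bits & 0b100),  # dynamic metadata
        "el_present": bool(bits & 0b010),   # enhancement layer
        "bl_present": bool(bits & 0b001),   # base layer
        # 1 = HDR10-compatible base layer (per the spec, as I read it)
        "bl_signal_compatibility_id": payload[4] >> 4,
    }
```

If bl_present is set and the compatibility ID says HDR10, the base layer should be watchable as-is on an HDR10 set; if not, the aTV really would have to synthesize something.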
I watched that YT video as well, and I still do not understand where the postulate that the aTV is intrinsically converting from DV into HDR10 came from. Based on what evidence?
We could only tell for sure if we could see the metadata on the incoming stream.
In HDR, so many things are handled inside the display that I would just as easily say the artefacts show how LG does its tone mapping. DV has to go through a validation process with Dolby Labs, but HDR10 is at the mercy of every vendor.
PS: The MaxCLL/MaxFALL values from the Panasonic Blu-ray player seem to be really off the charts (10000/4000). The aTV's output of 1000/400 matches the standard values seen on most HDR-graded content.
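For those unfamiliar with those two numbers: MaxCLL is the brightest single pixel anywhere in the programme, and MaxFALL is the highest frame-average brightness. A toy sketch of how they would be computed from decoded frames (per-pixel luminance in nits; illustration only, real encoders derive these from the graded master):

```python
def maxcll_maxfall(frames):
    # frames: iterable of 2-D lists of per-pixel luminance in nits
    maxcll = maxfall = 0.0
    for frame in frames:
        pixels = [nits for row in frame for nits in row]
        maxcll = max(maxcll, max(pixels))                   # brightest pixel overall
        maxfall = max(maxfall, sum(pixels) / len(pixels))   # brightest frame average
    return maxcll, maxfall
```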
White clipping beyond the 1000-nit mark is also a known strategy for many tone mappers in today's HDR panels; only Samsung has been reported to do a roll-off. Sonys, for example, are known to clip anything beyond their capabilities.
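To illustrate the difference between the two strategies, a toy tone mapper (the knee and peak values are made up; real sets use smooth curves tuned per panel):

```python
def tonemap_clip(nits: float, panel_peak: float = 750.0) -> float:
    # Sony-style: anything above the panel's capability is simply lost
    return min(nits, panel_peak)

def tonemap_rolloff(nits: float, panel_peak: float = 750.0,
                    source_peak: float = 1000.0, knee: float = 0.75) -> float:
    # Samsung-style: track the source linearly up to a knee, then compress
    # the remaining highlight range into what the panel can actually show
    knee_nits = knee * panel_peak
    if nits <= knee_nits:
        return nits
    t = (min(nits, source_peak) - knee_nits) / (source_peak - knee_nits)
    return knee_nits + t * (panel_peak - knee_nits)
```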

Even though the picture differences shown in that video are undeniable, the mere difference in MaxCLL and MaxFALL values causes the tone-mapping engine inside the display to produce a completely different response.
Also, the demonstration with the aTV OSD causing a shift in highlights does not prove anything. That menu always lowers the luminance of the rest of the aTV UI (i.e., it darkens the picture), which in that test pushed the specular highlights below LG's clipping/knee level. That's it.

So, the bottom line for me is still: where did the claim that this all happens inside the aTV (and not inside the LG) come from?

And the bottom-bottom line? HDR tech is so complex and new that we do not understand it completely yet. We would need someone from Dolby Labs to explain these things here.

PPS: I also heard, for the second time in his presentation, that his camera does not have enough dynamic range to capture the highlights produced by the OLED panel. I have a really hard time believing this! Today's CMOS sensors have plenty of DR; that is one of the reasons display tech is now being pulled up, to reveal at least some of it. My own Sony a6300 has no trouble capturing a DR wide enough to cover 1000 nits.
What is true, though, is that YouTube delivery to an SDR screen will not retain that DR. But the camera has it.
 

As for the evidence: the developer OSD shows that the incoming stream is Dolby Vision on both Apple TVs, regardless of the fact that one is set to Dolby Vision and the other to HDR10.
Concerning the metadata, we have the values; that is what the reviewer captures with the little HDMI box.
The metadata are identical for four completely different movies. That is clearly not the result of a native HDR10 stream.
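
For reference, what such a box captures on the wire is the CTA-861.3 Dynamic Range and Mastering InfoFrame that the source device sends alongside the video. A rough sketch of how its data bytes decode, based on my reading of the spec (take the field offsets as an assumption):

```python
import struct

def parse_drm_infoframe(payload: bytes) -> dict:
    # payload: data bytes of a CTA-861.3 Dynamic Range & Mastering InfoFrame
    eotf = payload[0] & 0x07        # 0=SDR, 1=traditional HDR, 2=ST 2084 (PQ), 3=HLG
    descriptor = payload[1] & 0x07  # 0 = Static Metadata Type 1
    vals = struct.unpack("<12H", payload[2:26])  # twelve little-endian uint16 fields
    return {
        "eotf": eotf,
        "descriptor": descriptor,
        # display primaries and white point, in units of 0.00002
        "primaries": [(vals[i] * 0.00002, vals[i + 1] * 0.00002) for i in (0, 2, 4)],
        "white_point": (vals[6] * 0.00002, vals[7] * 0.00002),
        "max_mastering_nits": vals[8],           # 1 cd/m2 units
        "min_mastering_nits": vals[9] * 0.0001,  # 0.0001 cd/m2 units
        "max_cll": vals[10],
        "max_fall": vals[11],
    }
```

If those last four fields come out identical for four different films, the box really is telling you the output metadata is static.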

I don't know; maybe the reviewer is wrong, or his little HDMI box doesn't work properly. But the guy seems pretty serious and is well regarded in the TV review department.
In fact, I hope he is wrong ;)
 
That little box is indeed nifty, but it shows the output from the aTV.
If we could also see the input to it, then we could draw a more solid conclusion about possible manipulation inside tvOS.
Plus, there is a fair chance that the iTunes HDR grade (master) is different from what goes to the Blu-ray factory.
Actually, I am already thinking about how to get hold of the incoming stream. It remains to be seen whether DRM has been applied to the content only, or also to the envelope (MP4 or TS headers and metadata). In the latter case, I do not see myself trying to decode that stream's parameters.
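
A quick first test, if I ever capture the stream to a file, would be to walk the top-level MP4 boxes and see whether the container structure is readable at all. A hypothetical sketch (box layout per ISOBMFF; actually capturing the stream is the hard part):

```python
import struct

def list_top_level_boxes(path: str) -> None:
    # With FairPlay-style DRM the sample data is encrypted, but the container
    # headers often are not; if this prints sane box names (ftyp, moov,
    # mdat, ...), the envelope itself is decodable.
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, box_type = struct.unpack(">I4s", header)
            name = box_type.decode("latin-1")
            if size == 1:                    # 64-bit extended size follows
                size = struct.unpack(">Q", f.read(8))[0]
                payload = size - 16
            elif size == 0:                  # box runs to end of file
                print(name, "(to end of file)")
                break
            else:
                payload = size - 8
            print(name, size)
            f.seek(payload, 1)               # skip the payload
```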
 
We've been switching back and forth on Netflix with Stranger Things 2 in 4K HDR. My new Apple TV is killing HDR10 :(
None of the highlights pop; everything is muted and muddy compared to my Sony's native Netflix HDR10 output. I feel ripped off. Even the sky in the snowy mountain screensaver is blown out (damn you, Apple!)
Also, HomeKit doesn't even work!

I might return this.
 