Have you watched the video? The guy is showing that this is not the case at all.
As for UHD Blu-ray, this is absolutely normal, because HDR10 is part of the standard and has to be present at a minimum. Unfortunately, this is not the case for streaming services.
Can’t say about Netflix et al. But from what I read in the aTV spec (it handles only DV profile 5) and the DV profile/level specs, I conclude (maybe falsely, would love to learn better) that a DV profile 5 movie needs to have an HDR10 base layer.
I watched that YT video as well, and I still do not understand where the postulate that the aTV is internally converting from DV into HDR10 came from. Based on what evidence?
We could only tell for sure if we could see the metadata on the incoming stream.
In HDR, so many things are handled inside the display that I could just as easily say the artefacts show how LG chose to do its tonemapping. DV has to go through a validation process with Dolby Labs, but HDR10 handling is at the mercy of every vendor.
PS: The MaxCLL/MaxFALL values from the Panasonic Blu-ray player seem to be really off the charts (10000/4000). The aTV's output of 1000/400 is much closer to the standard values seen on most HDR-graded material.
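If anyone wants to verify what the source itself carries, here is a minimal sketch (assuming ffprobe from FFmpeg is installed) that dumps the static HDR10 side data, including MaxCLL/MaxFALL, from a local file. Note this only shows what is in the file, not what a player actually signals in the HDMI InfoFrame; checking that would need a capture card or an HDMI analyzer.

```python
# Minimal sketch: dump the static HDR10 metadata (Content light level / MaxCLL,
# MaxFALL, Mastering display) that ffprobe can read from a local source file.
# Assumes ffprobe (FFmpeg) is on the PATH.
import json
import subprocess
import sys

def dump_hdr_side_data(path: str) -> None:
    # Read the side data of the first video frame only; the static HDR metadata
    # is attached there in ffprobe's output for HDR10 content.
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-read_intervals", "%+#1",               # just the first frame
        "-show_entries", "frame=side_data_list",
        "-of", "json",
        path,
    ]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    data = json.loads(result.stdout)
    for frame in data.get("frames", []):
        for side_data in frame.get("side_data_list", []):
            # Expect entries like "Content light level metadata" (MaxCLL/MaxFALL)
            # and "Mastering display metadata" on HDR10-graded files.
            print(json.dumps(side_data, indent=2))

if __name__ == "__main__":
    dump_hdr_side_data(sys.argv[1])
```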
Clipping whites beyond the 1k-nit mark is also a known strategy for many tonemappers in today's HDR panels; only Samsung has been reported to do a roll-off. Sonys, for example, are known to clip anything beyond the panel's capabilities.
Even though the picture differences shown in that video are undeniable, the mere difference in MaxCLL and MaxFALL values can cause the tonemapping engine inside the display to produce a completely different response (see the sketch below).
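To make that concrete, here is a toy sketch (not any vendor's actual algorithm) of a hard clip versus a simple roll-off, showing how the same pixel value lands in a very different place depending on whether MaxCLL is signalled as 1000 or 10000 nits. The 800-nit panel peak and 75% knee are just assumed numbers for illustration.

```python
# Toy sketch (not any vendor's real algorithm) of why the clip-vs-rolloff choice
# and the signalled MaxCLL change what ends up on screen. All values in nits.
PANEL_PEAK = 800.0  # assumed panel capability for the illustration

def hard_clip(nits: float, panel_peak: float = PANEL_PEAK) -> float:
    # Track the source 1:1 and simply clip everything beyond the panel's peak.
    return min(nits, panel_peak)

def roll_off(nits: float, max_cll: float, panel_peak: float = PANEL_PEAK,
             knee: float = 0.75) -> float:
    # Track the source 1:1 up to knee*peak, then linearly squeeze the rest of
    # the range (up to the signalled MaxCLL) into the remaining panel headroom.
    knee_pt = knee * panel_peak
    if nits <= knee_pt:
        return nits
    span_in = max(max_cll - knee_pt, 1.0)
    span_out = panel_peak - knee_pt
    return knee_pt + min(nits - knee_pt, span_in) / span_in * span_out

print("nits   clip   rolloff@MaxCLL=1000   rolloff@MaxCLL=10000")
for sample in (200.0, 600.0, 900.0, 1500.0, 4000.0):
    print(f"{sample:6.0f} {hard_clip(sample):6.0f}"
          f" {roll_off(sample, 1000.0):14.1f}"
          f" {roll_off(sample, 10000.0):18.1f}")
```

With MaxCLL at 10000 the roll-off has to reserve headroom for highlights that may never appear, so everything above the knee gets compressed much harder than with MaxCLL at 1000 - which is the kind of visible difference the video attributes to the aTV alone.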
Also, the demonstration with the aTV OSD causing a shift in highlights does not prove anything. That menu always lowers the luminance of the content behind the aTV UI (i.e. it darkens the picture), which in that test moved the specular highlights below the LG's clipping/knee level. That's it.
So, the bottom line for me, still, is: where did the statement that all this happens inside the aTV (and not inside the LG) come from???
And the bottom-bottom line? HDR tech is so complex and new that we do not understand it completely yet. We'd like someone from Dolby Labs to explain these things here.
PPS: I also heard, for the second time in his presentation, that his camera does not have enough dynamic range to capture the highlights produced by the OLED panel. I have a really hard time believing this! Today's CMOS sensors have plenty of DR; that is one of the reasons display tech is now being pulled up, to reveal at least some of it. My own Sony s6300 has no trouble capturing a DR wide enough to cover 1000 nits.
What is true, though, is that YouTube delivery to an SDR screen will not be able to retain that DR. But the camera has it.
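As a back-of-the-envelope check on the camera DR point above (assumed numbers, not measurements): with the exposure set so that clipping sits just above a ~1000-nit highlight, the span down to usable shadow detail is only around ten stops, well within what current CMOS sensors are quoted at.

```python
# Back-of-the-envelope check (assumed numbers, not measurements) of the dynamic
# range needed to capture an OLED highlight and shadow detail in the same shot.
from math import log2

highlight_nits = 1000.0  # roughly the peak level the video is arguing about
shadow_nits = 1.0        # assumed darkest area we still want texture in on camera

stops_needed = log2(highlight_nits / shadow_nits)
print(f"span to capture: ~{stops_needed:.1f} stops")  # ~10 stops

# Current CMOS sensors are commonly quoted at 13-14 stops of DR, so the capture
# side is not the bottleneck; the SDR delivery chain is.
```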