Start with Dynamic Contrast; that setting is usually the culprit...
Another thing: if your TV doesn't have Dolby Vision but only HDR10, then the AppleTV will attempt to convert the signal to HDR10 and some quality is lost.
This is why the Netflix app that comes with your TV looks better, as Netflix will actually serve a native HDR10 version of the movie and not DV converted to HDR10. Wish there was a way to change this on the AppleTV, but it forces DV (at least the last time I checked; I never use Netflix on the AppleTV for this reason).
I don’t believe that’s the way this works at all.
I have a 900E and the ATV apps play DV content over HDR just as effectively as the internal apps, and to my eye, better — Netflix included.
From what I've read, the TV's capabilities are reported to the ATV, which then requests the applicable stream from the server: SD, HD, 4K, HDR, or DV. These streams are all discrete encodings, not downscaled versions of one master 4K DV file.
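For what it's worth, here's a toy sketch of that capability-driven selection. All the names and values are mine for illustration, not Apple's or Netflix's actual API:

```python
# Hypothetical sketch: a player picking a discrete stream variant based on
# what the display reports it can handle. Names are illustrative only.

DISPLAY_CAPS = {"resolution": "4k", "hdr": "hdr10"}  # as reported by the TV

# Each title exists as separate, discretely encoded variants on the server.
VARIANTS = ["sd", "hd", "4k-sdr", "4k-hdr10", "4k-dolby-vision"]

def pick_variant(caps: dict) -> str:
    """Return the best variant the display can natively handle."""
    if caps["resolution"] not in ("4k",):
        return "hd" if caps["resolution"] == "hd" else "sd"
    if caps["hdr"] == "dolby-vision":
        return "4k-dolby-vision"
    if caps["hdr"] == "hdr10":
        return "4k-hdr10"
    return "4k-sdr"

print(pick_variant(DISPLAY_CAPS))  # -> "4k-hdr10"
```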
These are both correct; however, the first quote is more accurate when it comes to the AppleTV 4K.
Keep in mind there isn't a specifically encoded version of every movie/show for each type of HDR (HDR10 and DV). The HDR information is metadata (tracks), and, terrible analogy aside, it's like a file carrying multiple audio tracks (stereo, 5.1, etc.) rather than separate encodes. Netflix uses packages (IMF) and their own profiles (CE4 in this case), which saves on resources and makes sure there aren't multiple slightly different versions due to multiple slightly different masters.
Currently, content makers supply Netflix with a single Dolby Vision master (VDM) that has been pre-color-graded. From there Netflix derives every version of the movie/show (HDR type, codec, etc.) from that single master, and further color grading is performed on each version to match the master as closely as possible. Netflix can derive all of their content from the master using Dolby's toolkit. ITU-R BT.2100 (I think) is the open HDR standard, and since it's a standard, it's "easy" for Dolby to develop tools in the toolkit for Netflix to extract the HDR10 base and adjust it to match the master.
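For anyone curious why that derivation is workable at all: both HDR10 and DV store luminance on the same PQ curve from ITU-R BT.2100 / SMPTE ST 2084; the formats differ in metadata, not in the transfer function. A quick Python sketch of the standard PQ EOTF:

```python
# PQ EOTF constants from ITU-R BT.2100 / SMPTE ST 2084.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Map a normalized PQ code value [0, 1] to absolute luminance in nits."""
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0) / (C2 - C3 * e)) ** (1 / M1)

print(f"{pq_eotf(0.5):.1f} nits")   # ~92 nits; mid code values stay dark
print(f"{pq_eotf(1.0):.1f} nits")   # 10000 nits at the top of the curve
```

Since the curve is absolute and standardized, tooling on either side can map code values to the same nit levels; what differs between the formats is the metadata telling the display how to treat them.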
Color grading is required due to the shift in color space between the Dolby Vision master, HDR10, and SDR. A trim pass is required for SDR since the master has a much higher dynamic range than SDR can reproduce.
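As a rough illustration of why the trim pass exists (the curve and the peak values here are made up for the example; real trims are graded by hand, not by a single formula):

```python
# A minimal sketch: the HDR master's luminance range has to be compressed
# to fit SDR's ~100-nit ceiling. A simple Reinhard-style operator is used
# purely as an illustration of that compression.

SDR_PEAK = 100.0     # nominal SDR peak in nits
HDR_PEAK = 4000.0    # assumed peak of the graded master

def tone_map_to_sdr(nits: float) -> float:
    """Compress an HDR luminance value into the SDR range."""
    normalized = nits / HDR_PEAK
    compressed = normalized / (1.0 + normalized)   # Reinhard operator
    return compressed * SDR_PEAK * 2.0             # rescale toward SDR peak

for nits in (50, 500, 4000):
    print(f"{nits:>5} nits in master -> {tone_map_to_sdr(nits):.1f} nits in SDR")
```

Note how badly a naive curve crushes the low end (50 nits lands around 2.5 nits), which is exactly why a human trim pass is needed.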
The problem here, and the reason some HDR10 smart TV apps look better than HDR10 apps on the AppleTV, is that the AppleTV 4K converts Dolby Vision to HDR10 when the display doesn't support DV. That alone isn't a problem; the issue is that the content is no longer custom-tailored by the producer and Netflix, it's generic. This is why Netflix displays "Dolby Vision" on the AppleTV 4K even if you don't have a DV TV, btw.
Here is the evidence for that.
Top is an AppleTV 4K displaying a Dolby Vision source on an HDR10 TV; bottom is an HDR10 4K Blu-ray source on an HDR10 TV. Those columns show differences in the metadata for luminance and lighting. So in the end you get this...
IMO it's not actually "bad", it's just different; you still get better contrast and lighting, with colors that look better compared to SDR. In the above example the lightning in the background is blown out on the AppleTV but the character is visible, which is the opposite (albeit as intended) on the Blu-ray.
Those screenshots are from this YouTube video, btw. If you watch it in full screen (or with a dark perimeter) it's a lot easier to see the difference in the dark detail.
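To make those luminance metadata columns a bit more concrete, here's a toy sketch of HDR10's static MaxCLL/MaxFALL values, the whole-title numbers a generic on-device DV-to-HDR10 conversion has to synthesize. All the numbers are invented for illustration:

```python
# Per-frame nit values (peak and frame-average) as they might come out of
# analyzing the graded master. In reality this is computed per pixel over
# every frame of the title.
frames = [
    {"peak_nits": 950.0,  "avg_nits": 180.0},
    {"peak_nits": 3200.0, "avg_nits": 420.0},   # e.g. the lightning scene
    {"peak_nits": 600.0,  "avg_nits": 95.0},
]

# HDR10 static metadata: a single pair of numbers for the entire movie.
max_cll  = max(f["peak_nits"] for f in frames)  # Maximum Content Light Level
max_fall = max(f["avg_nits"]  for f in frames)  # Maximum Frame-Average Light Level

print(f"MaxCLL={max_cll:.0f} nits, MaxFALL={max_fall:.0f} nits")
```

Dolby Vision instead carries dynamic per-scene trims, which is exactly the tailoring that gets lost in a generic conversion.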
Since most content providers will fall back on HDR10 via Dolby's HDR toolkit, there should be an identifier in the AppleTV plus a setting for the type of HDR. That way you'd get the content as intended. Maybe there is some technical reason this can't be done, or maybe it doesn't matter enough. However, video- and audiophiles aren't exactly known for accepting "generic" and "not quite as intended" lol.