This is almost certainly because the TV you're using isn't actually capable of properly producing an HDR image. Many manufacturers sell TVs that advertise "HDR" support, but it's in name only: the panel itself isn't capable enough to display an HDR image correctly. For these TVs, HDR support really just means they can accept the HDR signal format and put something on screen. To properly display an HDR signal on their more limited panels, they'd need very capable dynamic tone mapping, so the signal can be adjusted on the fly to fit within what the panel can actually display. These TVs almost never have that, because, like a better panel, implementing it properly is fairly costly.
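To give a rough idea of what that tone-mapping step has to do, here's a toy sketch in Python. The 350-nit panel peak and knee point are made-up numbers just for illustration; a real TV does this per frame on the decoded PQ signal with far more sophistication.

```python
def tone_map_nits(luminance_nits, panel_peak=350.0, knee_fraction=0.75):
    """Map a source luminance value (in nits) into a limited panel's range.

    Below the knee (a fraction of the panel's peak brightness) the signal
    passes through unchanged; above it, highlights get a Reinhard-style
    roll-off so bright detail is compressed instead of hard-clipped at the
    panel's maximum. Assumed numbers (350-nit panel, 0.75 knee) are
    hypothetical, not from any real TV.
    """
    knee = knee_fraction * panel_peak
    if luminance_nits <= knee:
        return luminance_nits  # shadows/midtones: leave untouched
    headroom = panel_peak - knee
    excess = luminance_nits - knee
    # Roll-off curve that asymptotically approaches panel_peak, so a
    # 1000- or 4000-nit highlight still lands below the panel's limit.
    return knee + headroom * excess / (excess + headroom)


if __name__ == "__main__":
    for nits in (100, 300, 600, 1000, 4000):
        print(f"{nits:>5} nits in -> {tone_map_nits(nits):6.1f} nits out")
```

Without something like this, a cheap "HDR" TV either clips everything above its peak brightness (blown-out highlights) or dims the whole image, which is exactly the washed-out or dark look people complain about.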
Dolby Vision is supposed to be the solution to this on lower-end TVs, and from what I've seen it works fairly well on lower-end panels that implement it, but it obviously doesn't help with HDR10 content or with TVs that don't support Dolby Vision at all. The resulting Dolby Vision picture on lower-end TVs is usually fairly good (at a minimum, no worse than SDR), but honestly it often doesn't look much different from SDR anyway, again because of the panel's limitations.
The correct solution would be for TV manufacturers to stop adding HDR signal support to TVs that can't really display it, but that won't happen.
Alternatively, if you have an Apple TV 4K, you're probably best off setting the video format to SDR in the settings and disabling the Match Dynamic Range option if it's enabled. That forces streaming services to stream 4K SDR when available, or fall back to 1080p SDR when it isn't.