There are actually techniques to improve the quality when upscaling. LG and NVIDIA have AI upscalers that work quite well.
Then it is time to ask: how do they do that? There is no information for the missing pixels, yet they have to get from 1080 to 2160 lines. That can only mean the missing pixels are invented by the TV that is performing this magic. And when I do the maths, it means they have to invent roughly three quarters of everything you see on the screen, because a 1080p frame contains only a quarter of the pixels of a 2160p panel.
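To put a number on that, here is a quick back-of-the-envelope sketch in Python, assuming the standard 1920x1080 and 3840x2160 resolutions:

```python
# Rough check of how many pixels have to be invented when 1080p
# material is shown on a 2160p (4K UHD) panel.
src_pixels = 1920 * 1080   # pixels in the 1080p source frame
dst_pixels = 3840 * 2160   # pixels on the 4K panel

invented = dst_pixels - src_pixels
print(f"source pixels:   {src_pixels:,}")
print(f"panel pixels:    {dst_pixels:,}")
print(f"invented pixels: {invented:,} ({invented / dst_pixels:.0%} of the screen)")
# -> about 75% of what you see is interpolated, not 50%.
```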
I don't believe that the quality of the picture is improved. It is just adapted to your screen without any improvement.
The best way to view 1080p on a 4K screen is to set either the Apple TV or the 4K TV to follow the original material. The Apple TV I have can only be set to fixed output formats, and I find those formats preferable. I have set it to the apparently highest quality, namely Dolby Vision at 2160p.
However, not everything I stream comes in that quality. Some, or rather most, movies are still 1080p. And when the Apple TV recalculates everything to 2160p, it takes a long time before I see anything. That is very annoying when watching short items like the news or YouTube: every time I switch to something else, the Apple TV needs a long time to work out the picture again.
The same thing happens when I also set it to 'show the original format': it then takes a long time before the home screen appears in Dolby Vision.
Therefore I am not using 'show original format' on the Apple TV. I use it on my LG OLED TV instead; that one switches between formats much more quickly. What the difference is between upscaled 1080p on a 4K TV and original, non-scaled 1080p on a 4K TV is beyond my understanding, because one way or the other the empty pixel information has to be calculated. But it will most certainly not be better.
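For what it's worth, the simplest form of that calculation is nearest-neighbour scaling, where every 1080p pixel is just repeated in a 2x2 block of panel pixels. Here is a minimal Python sketch (the frame is a made-up grayscale example, not a real video pipeline):

```python
def upscale_nearest_2x(frame):
    """Duplicate every source pixel into a 2x2 block.

    frame is a list of rows, each row a list of pixel values
    (a toy stand-in for a 1080p image). The result has twice the
    width and height, but carries no new information.
    """
    out = []
    for row in frame:
        doubled_row = [p for p in row for _ in range(2)]  # repeat horizontally
        out.append(doubled_row)                           # repeat vertically
        out.append(list(doubled_row))
    return out

# A tiny 2x2 "frame" becomes 4x4 after upscaling.
small = [[10, 20],
         [30, 40]]
for row in upscale_nearest_2x(small):
    print(row)
```

Whether the TV does this, bilinear filtering, or something AI-based, it is still guessing the in-between values from the same 1080p data.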
Only when you feed a 4K signal to your 4K TV will you see a real difference.