"An issue with 8K is that it can't get as bright as many bright LCD TVs. You are talking closer to 1,000 nits, whereas some 4K LCD TVs are over 2,000 nits."

Are you talking full field or window? Those numbers mean little if you don't specify what you're actually measuring. And if you're going by spec sheets (which say little about actual performance), then manufacturers already claim 4,000 nits for 8K.
Also energy consumption... 😆
Here's a pic from Peter Montoulieu's latest installation: a 200" Sony µLED wall. This thing converts so much energy into heat that you need HVAC to run it.
https://www.avsforum.com/threads/sony-crystal-led-owners-thread-2020-solfar.3125600/post-60267708
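To put the heat issue in rough numbers: essentially every watt a display draws ends up as heat in the room. A back-of-the-envelope sketch in Python, assuming a hypothetical 5 kW draw for a wall this size (an illustrative figure, not a measured Sony spec):

```python
# Back-of-the-envelope heat load for a large direct-view LED wall.
# The 5 kW draw is a hypothetical figure for illustration, not a Sony spec.
WATTS_TO_BTU_PER_HR = 3.412   # 1 watt dissipated = 3.412 BTU/hr

display_draw_watts = 5_000    # assumed draw for a 200" microLED wall

# Essentially all power a display draws ends up as heat in the room.
heat_btu_per_hr = display_draw_watts * WATTS_TO_BTU_PER_HR
cooling_tons = heat_btu_per_hr / 12_000   # 1 ton of cooling = 12,000 BTU/hr

print(f"Heat load: {heat_btu_per_hr:,.0f} BTU/hr (~{cooling_tons:.1f} tons of cooling)")
```

At that assumed draw you're looking at roughly 17,000 BTU/hr, i.e. about a ton and a half of dedicated cooling just for the display.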
Not really, the same rules apply as always. Back in the day we used line doublers for laserdiscs, then quadruplers, then MUSE LD came along. Then DVD, HD DVD and Blu-ray. Upscaling always looked better if done right. The availability of 4K made upscaling less of a requirement, but there's still a benefit in doing it, and it can be done much more cheaply today than back then. A cheap Faroudja doubler ran $30k back then; Snell & Wilcox and Teranex started in the $50k range, up to $150k depending on options. Today quality scaling can be done for under $10k, just not with anything that's integrated into a TV or source device. An HTPC with madVR, a Lumagen or an Envy are your best bets. With what manufacturers are putting into TVs/projectors/players these days, I wouldn't bother...

I agree. Image quality for 4K sources is usually best unaltered. Upscaling benefits tremendously from processing, but if the source is already 4K, processing more often makes it worse.
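If anyone wants to see why the scaling algorithm matters so much, here's a toy Python/Pillow sketch comparing the cheapest possible upscale (nearest-neighbor) against Lanczos resampling. This only illustrates the principle; it's nowhere near what madVR, a Lumagen or an Envy actually do, and input.png is just a placeholder for any 1080p frame:

```python
# Toy comparison: nearest-neighbor vs. Lanczos upscaling with Pillow.
# "input.png" is a placeholder; use any 1080p frame you have on hand.
from PIL import Image

src = Image.open("input.png")     # e.g. a 1920x1080 source frame
target = (3840, 2160)             # 4K UHD

# Nearest-neighbor: the cheapest possible scaler, blocky edges.
cheap = src.resize(target, Image.Resampling.NEAREST)

# Lanczos: a windowed-sinc filter, much better edge and detail handling.
good = src.resize(target, Image.Resampling.LANCZOS)

cheap.save("upscaled_nearest.png")
good.save("upscaled_lanczos.png")
```

Flip between the two outputs at 100% zoom and the difference is obvious, which is the whole argument for putting the scaling in capable hands instead of whatever the TV ships with.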
At some point, the display device itself becomes the problem. An expensive 1080p option can still deliver more effective resolution than a cheap 4K solution. In the end, the materials used to build the device come into play.