Thought I'd give my 2 cents on the QLED discussion.
A year ago I decided to get an Nvidia BFGD (a 65" 4K 120Hz monitor) as my main monitor.
I waited and waited for it to be released until I decided to go for a TV instead.
I checked out LG's latest high-end OLED offerings and Samsung's Q9F in the store and compared their visual quality.
I really wanted OLED, because... OLED is awesome! But I was worried about burn-in since I'll be using it as a computer monitor first and foremost.
The difference between them was indiscernible to me, and it made clear that QLED is a lot better than a normal LCD.
Colors were as good as OLED's. Brightness was great, better than OLED. No risk of burn-in. The full-array local dimming backlight delivers complete darkness in dark scenes, although not with per-pixel precision like OLED.
Another thing I valued was frame rate, since I will also use the TV for PC gaming. LG supports 120Hz at 1080p, but Samsung supports 120Hz at 1440p.
I weighed pros and cons and decided to go for QLED.
Then Samsung announced their 8K line, to be launched just weeks later, with HDMI 2.1 support to be added in 2019 (through a free hardware upgrade), which will hopefully include 4K at 120Hz. And brightness is rated at 3000 nits.
I decided I'd get one.
I now have a Samsung Q900R 65" hooked up as my main computer monitor.
So, am I happy with it? Yes.
Does it make my Acer X34A (which is a beast) cry? Yes.
Is it perfect? No.
Does it beat OLED in every way? No.
Does it beat normal LED in every way? Yes, oh yes.
Compared to OLED, it can struggle, for example, with white subtitles on a dark background. Full-array local dimming can't handle that, so there is a halo around them. In a space scene with stars, the stars will be dimmer than they are supposed to be.
But that is the only con.
Full-array local dimming plus 3000 nits of brightness combined with HDR content is almost jaw-dropping. There is nothing else on the market that comes close, and I guarantee you it is not a marketing gimmick.
I sit about 3-4 feet away from the TV. The move from 4K to 8K takes us into "higher resolution than the eye can perceive" territory, even this close to the screen. I'd say the HDR performance makes a MUCH larger impact than 8K does, though, and in that regard I'm sure I would have been happy with 4K as well.
BUT, 8K enables me to have pixel perfect 1440p.
4K signal: 1 source pixel becomes a 2×2 block on screen.
1440p signal: 1 source pixel becomes a 3×3 block on screen.
1080p signal: 1 source pixel becomes a 4×4 block on screen.
So now I get pixel perfect sharpness at 120Hz while gaming, and pixel perfect sharpness at 4K 60Hz when working.
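The mapping above boils down to simple divisibility: a signal is pixel-perfect when the 8K panel's width and height are both the same exact multiple of the signal's. A minimal Python sketch (the helper function is my own illustration, not anything the TV actually runs):

```python
# Check which common signal resolutions map to exact NxN pixel
# blocks on an 8K (7680x4320) panel. Illustrative only.

PANEL = (7680, 4320)  # 8K UHD

SIGNALS = {
    "4K":    (3840, 2160),
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
}

def integer_scale(signal, panel=PANEL):
    """Return the integer factor N if each source pixel maps to an
    exact NxN block of panel pixels, else None."""
    sw, sh = signal
    pw, ph = panel
    if pw % sw == 0 and ph % sh == 0 and pw // sw == ph // sh:
        return pw // sw
    return None

for name, res in SIGNALS.items():
    n = integer_scale(res)
    print(f"{name}: {n}x{n} block per source pixel")
```

On a 4K panel, by contrast, only 1080p gets an exact 2×2 mapping; 1440p lands on a non-integer factor and has to be interpolated, which is why it looks soft there.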
And this is the main reason 8K is great for me, at least until HDMI 2.1 arrives and GPUs can handle all games at 4K 120Hz.
Sorry for the long post!
