I've seen both. I've also got a Panasonic plasma and a 2018 QLED (now at my mother's house, as she likes it better than I do; I watch all my movies on my projector anyway, so the plasma handles most of my TV duty. I still prefer the plasma for SD material since the scaler on the QLED sucks).
I love the contrast on OLED (perfect blacks), and the newer QLEDs have quite decent blacks and tons of nits for HDR, but when I can't tell at a glance whether a movie is 2K or 4K from 6 feet away, WTF is the point? HDR? Sorry, but I have to compare the two versions side by side to tell what it's doing with the exposure in one versus the other. The picture looks good on both, and contrast/highlights are pretty damn subjective, IMO.

I also see no point in adding HDR to old movies, as it's revisionist. People get upset about studios messing with film grain, but HDR is OK? Some don't want Atmos added even when the director and the sound crew approve, because it changes what was originally heard in the theater, yet those same people jump on the 4K/HDR wagon even though NO THEATER ON EARTH (outside a 70mm print) ever showed much more than 720p-equivalent resolution (I can quote the study to prove it if need be). That's due to the resolution lost by the time a third-generation print reaches the theater. 70mm preserved maybe 2K by comparison, since it's still taken from a 35mm master twice removed.

I find the sheer hypocrisy about what was seen in, say, 1979 with Alien at the theater versus the 4K re-release with HDR amusing, to say the least. I complain that it doesn't get an Atmos update (they offered; Ridley turned it down), and they (on the Blu-ray forums, that is) have a field day insisting the 70mm 6.0 soundtrack was as good as it got in 1979 (similar to 5.1 today, but not exactly, as it used four on-screen speakers, or in later years three on-screen with stereo surrounds, which is VERY similar to 5.1). And I'm like: yeah, but 2K was about as good as it got outside true 70mm productions due to the losses above (720p or less on regular 35mm prints). Needless to say, people don't like looking in the mirror.
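Since the resolution claim always draws fire, here's the back-of-the-envelope arithmetic behind it. Converting a print's resolution in line pairs per mm to a "pixel equivalent" is trivial; the lp/mm values below are round numbers I picked to illustrate the conversion, not figures from the study itself:

```python
# Convert film print resolution (lp/mm) to an approximate pixel equivalent.
# One line pair needs ~2 pixels to represent. Image widths: ~21 mm for a
# 35mm print, ~48.5 mm for 70mm. The lp/mm values are illustrative guesses.
def pixel_equivalent(lp_per_mm: float, image_width_mm: float) -> int:
    return round(2 * lp_per_mm * image_width_mm)

print(pixel_equivalent(30, 21))    # ~1260 px wide: roughly 720p territory
print(pixel_equivalent(25, 48.5))  # ~2425 px wide: roughly 2K
```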
Honestly, I couldn't care less either way as long as it looks and sounds good, but people will argue to the ends of the earth about these things, and all that REALLY matters is whether someone is happy with what they have. I'd like a 4K projector, but I'm not giving up 3D to get one, as 3D is night-and-day different from 2D whereas 4K is just a linear improvement of sorts, IMO. I also need lens shift in my room, and that pretty much kills everything but the Epson pseudo-4K 3D projectors and the high-end Sony models.
I have yet to see any HDR knock my socks off, but then I've only watched so many 4K films, since the set is no longer at my house; it was too small to compete with a 92" screen. Maybe when a reasonably priced rollable OLED arrives around that size, I can ditch the projector entirely (though that would likely leave me without 3D, since newer sets don't support it, just projectors). And as I've said, 3D is where it's at; 4K is a JOKE compared to good 3D. I really don't understand the 3D hate out there, but then most people like Windows, and that doesn't register with me either.
You see, that could be every piece of music on Earth. Life is way too short to listen to GARBAGE (not meaning Steely Dan) just because it's well recorded. I can record a kid banging on a cymbal and piano with life-like precision. That doesn't mean anyone in their right mind would want to hear it.
It's simple. It's subjective. What I mean is that I think it's absurd to listen to music you DO NOT LIKE just because it's recorded well (see above about the kid banging on a piano). But that is the audiophile mentality. They want the best-recorded music they can find to show off their $50k 2-channel system with 200-pound Class A amps. It doesn't matter whether they like the music. Studies I've read suggest the typical audiophile might have $10k-$50k in equipment but own maybe $200-$500 in music. A "musicphile" (if there were such a word) might own $50k in music and have $500 in their 2-channel system. Personally, I prefer a bit more balance than that, but then I've tried both extremes over the years.
You make strange assumptions. Yes, I'm in the U.S. There is no "standard" here. Please don't make things up to support your point, because it makes no difference at all. 60-65" instead of 55" makes ZERO difference either way. It's still TINY compared to a 90-120" screen. I mean TINY. You look at a set sitting against a wall. I look at a picture over half the size of my wall (the short one). Yes, you can sit two feet from your 65" screen and fill a similar portion of your field of vision, but it's darn uncomfortable to focus on a screen two feet in front of your face for a long period (like staring at a monitor non-stop for hours). 8-10 feet away is a much more comfortable viewing distance, but it requires a larger screen to fill your field of view. 20 feet is better yet, but you'd need a 150-200" screen to do the same.
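If anyone wants to check the geometry instead of arguing, it's simple trig. A quick sketch (Python; 16:9 screens assumed, using the sizes and distances from my examples):

```python
import math

def viewing_angle_deg(diagonal_in: float, distance_ft: float,
                      aspect: float = 16 / 9) -> float:
    """Horizontal viewing angle in degrees for a flat 16:9 screen."""
    # Screen width from its diagonal: w = d * a / sqrt(a^2 + 1)
    width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)
    return math.degrees(2 * math.atan(width_in / (2 * distance_ft * 12)))

for diag, dist in [(65, 2), (65, 10), (92, 8), (200, 20)]:
    print(f'{diag}" at {dist} ft: {viewing_angle_deg(diag, dist):.0f} degrees')
```

Run the numbers and you get roughly 99 degrees for 65" at 2 feet, 27 degrees for 65" at a comfortable 10 feet, 45 degrees for my 92" at 8 feet, and about 40 degrees for a 200" screen at 20 feet. That's exactly my point: at sane distances, the 65" set fills a fraction of the view.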
Again, what are you talking about? You're inventing strawmen for an argument I'm really not interested in either way. What people? What theater plays over 100dB except for peaks? Dolby reference is 85dB average, which is quite safe for 8 hours. Have you been to a Dolby Atmos certified theater? It doesn't matter to my point, because I don't go to the theater much anymore. I watch at home on a 92" screen, 8 feet away, with 11.1.6 sound (17 high-quality PSB speakers and a large subwoofer that is dead flat to 20Hz). There is no poor quality. There is also no tiny picture. I typically play 8dB under Dolby reference (77dB average with 97dB peaks), although I did watch Raiders of the Lost Ark the other day at reference level (5.1 upmixed to 11.1.6).
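For anyone checking the level math: it's just subtraction from the calibration point. A minimal sketch, assuming the standard Dolby figures of 85dB SPL average with 20dB of headroom (105dB peaks per main channel; the 105 number follows from my 97dB-at-minus-8 figure above):

```python
# Dolby reference calibration: 85 dB SPL average with 20 dB of headroom,
# i.e., 105 dB peaks per main channel (LFE is allowed ~10 dB hotter).
REF_AVG_DB, REF_PEAK_DB = 85, 105

def playback_levels(offset_db: float) -> tuple[float, float]:
    """Average and peak SPL with the master volume offset_db below reference."""
    return REF_AVG_DB - offset_db, REF_PEAK_DB - offset_db

print(playback_levels(8))  # (77, 97): my usual listening level
print(playback_levels(0))  # (85, 105): straight reference
```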
The bottom line is this. If you're happy with a 65" 4K HDR set with a sound bar or whatever, great. If you like streaming video, fine. My arguments in this thread are about reasons to buy physical media (better picture, better sound, has a digital copy included most of the time anyway, etc.). You're less likely to notice shortcomings of streaming on a smaller set too.