Wrong, wrong and wrong. CRT was exactly the same, yet you can't use it for HDR content (please try it). Do the test on your OLED. Calibrate it to a peak white of 5 nits, then try HDR. That will easily show you how wrong you are.
Your next argument will be that you meant within a reasonable range (one that OLED can manage). There's actually a lot more involved than that, such as screen size, the area of the image that needs to be bright, ambient light, viewing distance, etc.
1) Sorry, you said "
In fact, most of the time I find myself using the screens at lower than 50% brightness.". So I assumed you mean the brightness setting in the menus. The overall brightness of the panel can be adjusted with the "Backlight / OLED Light" setting. The LG manuals are wrong and bad in what they say about brightness: "
Brightness Adjusts the overall screen brightness. The closer to 100, the brighter the screen will become.
You may use the Brightness settings specifically to set the dark part of the image.". The first sentence is misleading. The second is correct, the setting is used to set black level.
So why is the first sentence misleading? It's not entirely wrong. What happens is a mapping from an input value (what your source is sending) to an output value (on screen). There's no identity mapping anymore (ignore tone mapping for now); instead, your black level is mapped to a higher (or lower) value. This can result in a higher or lower overall APL. However, it also results in decreased dynamic range or clipped blacks. Contrast does the same, btw, only for whites. And if your display is capable enough, raising black level and whites won't result in reduced dynamic range from the source.
Look at this as an example (image quickly grabbed from ResearchGate):
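If you'd rather see it as numbers than a graph, here's a minimal sketch of the mapping I'm describing (made-up 8-bit values for illustration, not any TV's actual processing):

```python
# Minimal sketch of an input->output video level mapping (illustration only).
# 8-bit video range: black = 16, white = 235.

def map_level(code, brightness_offset=0, contrast_gain=1.0,
              display_min=16, display_max=235):
    """Map a source code value to a display code value.

    brightness_offset shifts where source black lands (Brightness control),
    contrast_gain scales toward white (Contrast control). Anything outside
    what the display can show is clipped -> lost shadow/highlight detail.
    """
    out = (code - 16) * contrast_gain + 16 + brightness_offset
    return max(display_min, min(display_max, round(out)))

# Identity-ish mapping: source black (16) -> 16, source white (235) -> 235.
print(map_level(16), map_level(235))          # 16 235

# Raise Brightness: black is mapped higher, overall APL goes up,
# but the top end clips sooner -> reduced dynamic range.
print(map_level(16, brightness_offset=10))    # 26
print(map_level(230, brightness_offset=10))   # clipped to 235

# Lower Brightness: near-black codes all land on the display minimum -> crushed blacks.
print(map_level(20, brightness_offset=-10))   # 16 (the detail at code 20 is gone)
```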
2) You do not want to raise black level per se; you want to calibrate the black level of your input source (which varies with the source) to your display. Ideally you want to do this for each individual source you're using (and not just the black level, but "everything"). While the protocols are standard, the actual signal a source puts out varies by manufacturer and will be different depending on the model of your disc player, streaming client, PC/Mac, etc. It also varies with the firmware of the device, the transceiver, and the GPU (including its drivers). What you want to do, to get the representation as intended, is match/calibrate your specific source to your specific display. And you want to do it every now and then, because your display's characteristics change over time, particularly non-uniform changes as with OLED, Plasma and CRT.
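The classic example of why "black" differs per source is full-range vs. video/limited-range RGB. A rough sketch, assuming plain 8-bit levels (illustration only, real devices layer more on top of this):

```python
# A PC/GPU often outputs full-range RGB (black = 0, white = 255), while video
# sources use limited/video range (black = 16, white = 235). If source and
# display disagree on which is in use, blacks are crushed or look washed out.

def full_to_limited(code):
    """Full range (0-255) -> limited/video range (16-235)."""
    return round(16 + code * (235 - 16) / 255)

def limited_to_full(code):
    """Limited/video range (16-235) -> full range (0-255), with clipping."""
    out = round((code - 16) * 255 / (235 - 16))
    return max(0, min(255, out))

print(full_to_limited(0), full_to_limited(255))    # 16 235
print(limited_to_full(16), limited_to_full(235))   # 0 255

# Mismatch case: the display expects limited range but the source sends
# full range untouched -> source black 0 sits below display black 16,
# so everything from 0..16 collapses into the same black (crushed shadows).
```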
Check it out: open System Preferences in macOS, then Displays, go to Color, then open the profile. You will see a few of these parameters there. It will also give you the option to calibrate and change settings independently of your display.
As far as iPhones/iPads go, don't fall for the hype. The displays are not very good; there's mainly marketing at work. Neither iPad nor iPhone can be properly calibrated on their own. However, you can do this at the software level for individual applications. Take Affinity, for example, which does graphics/print. You can import specific profiles into the software, and it will change each individual pixel value for you in software and make things look the way they are supposed to look, so the calibration is done in the "software source" and not the display. The problem is the back-and-forth workflow required, because you can only create/change color profiles on a desktop. You can use test patterns to measure on the iPhone/iPad though.
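If you want to see what that per-pixel profile conversion looks like in code, here's a hedged sketch using Pillow's ImageCms module (the file names are placeholders; Affinity does the equivalent internally with its own profile handling):

```python
# Sketch of software-side color management: converting an image's pixel values
# through ICC profiles before display, instead of calibrating the panel itself.
# File names below are placeholders for illustration.

from PIL import Image, ImageCms

src = Image.open("artwork.png").convert("RGB")        # placeholder source image
src_profile = ImageCms.createProfile("sRGB")          # assume an sRGB source space
# A measured profile for the target display/print condition (placeholder path).
dst_profile = ImageCms.getOpenProfile("measured_display.icc")

# Convert every pixel from the source space to the measured target space.
converted = ImageCms.profileToProfile(
    src, src_profile, dst_profile,
    renderingIntent=ImageCms.INTENT_PERCEPTUAL,
)
converted.save("artwork_profiled.png")
```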
Raising black level on OLED works not by turning pixels off, but by keeping them very dim. Ironically, that's the part OLED is pretty bad at: OLED has very poor near-black performance, as seen in the shadow detail of low-APL scenes. The "turning off" thing is an old trick to get better measurements and used to be the backdoor for on/off CR measurements. They do similar things with laser engines in projection. This works well on a total fade to black, but as soon as some other pixels are lit (say, stars in space), it becomes irrelevant, because another instrument kicks in: your eyes. You will perceive the black level differently as your iris adjusts. A much better method is to measure on/off CR with some white pixels lit, but outside the measurement area. This accounts for light spill from the panel, glass and other reflections. Of course the resulting number may not look that nice, and marketing won't be happy.
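Here's a rough sketch of such a pattern (geometry made up for illustration; use proper patterns and a real meter for actual work):

```python
# A near-black field with a few white squares lit OUTSIDE the central metered
# area, so the panel cannot simply switch everything off and report an
# unrealistically low black level.

import numpy as np
from PIL import Image

W, H = 1920, 1080
frame = np.zeros((H, W), dtype=np.uint8)        # full black field

# Small white "stimulus" squares near the corners, away from the metered
# center, to account for light spill from panel, glass and room reflections.
for y, x in [(60, 60), (60, W - 110), (H - 110, 60), (H - 110, W - 110)]:
    frame[y:y + 50, x:x + 50] = 255

# The meter reads black in the center; white is measured with a separate
# full-field or window pattern as usual.
Image.fromarray(frame, mode="L").save("onoff_black_with_spill.png")
```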
3) What I'd pick depends on the situation. What is the use case? Watching the news? Playing games? Movies? TV shows? Watching sports? Display size? Viewing distance? Room properties? Also, at what budget?
µLED is the top choice for displays (non-projection) if you can manage the size, heat/cooling and higher cost versus smaller OLED/LED TVs (<100").
OLED has an absolute black level advantage over LED, as long as you're in a dark/black room. However, it comes with color uniformity issues, banding and poor shadow detail performance. As with all things, whether you see it or not can depend on the content (the low-APL scenes of The Long Night from a proper source vs. an NFL game, ...).
LED has a higher black level, but is better in color uniformity (that's why you can seamlessly combine LED panels and not OLED), generally better in shadow detail (there are poor LED models out there) and generally free of banding. Again, this can depend on the specific model and panel quality. A higher-priced Z-series Sony is better than a cheaper model, as usual.
As to LED vs mini-LED, that depends on the implementation. How many dimming zones? Is it a dual layer LCD, so there's a dimming zone for every pixel? This is hard to answer and depends on the model.
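To give you a feel for why zone count matters, here's a toy sketch (not any vendor's algorithm) where each zone's backlight is driven to the brightest pixel inside it:

```python
# Toy local dimming model: with few zones, one bright star forces a whole zone
# up -> haloing / raised blacks around it. A dual layer LCD is the limit case
# of one "zone" per pixel.

import numpy as np

def zone_backlight(luma, zones_x, zones_y):
    """Per-zone backlight drive = max pixel luminance inside the zone."""
    h, w = luma.shape
    levels = np.zeros((zones_y, zones_x))
    for zy in range(zones_y):
        for zx in range(zones_x):
            tile = luma[zy * h // zones_y:(zy + 1) * h // zones_y,
                        zx * w // zones_x:(zx + 1) * w // zones_x]
            levels[zy, zx] = tile.max()
    return levels

# A black frame with a single bright "star".
frame = np.zeros((1080, 1920))
frame[500, 960] = 1.0

coarse = zone_backlight(frame, 8, 4)        # 32-zone backlight
fine = zone_backlight(frame, 96, 54)        # 5184-zone backlight
print((coarse > 0).sum(), "of", coarse.size, "zones lit")  # 1 of 32, but each zone is huge
print((fine > 0).sum(), "of", fine.size, "zones lit")      # 1 of 5184, a much smaller lit area
```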
But in general, here's what I'd do. Since TVs are small (the usual 65" to 85", maybe up to 100"), they take up a small area of your field of view. This is generally not good for perception, as it has the same effect as someone shining a flashlight in your eyes. You could correct this by sitting closer, but I guess no one wants to sit that close to a small screen. A seating distance to screen width ratio of 1:1 makes for a good immersive experience; some like to sit closer. So sitting about 20' away from a 20' wide screen is great. In more professional settings there's an easy trick to compensate for the flashlight effect of TVs: place a D65 backlight behind the TV with a brightness that matches your setup (assuming your TV is calibrated to that color temperature). Not only will this be better for the perception of the image, it will also deal with the overall black level. Your eyes will perceive a slightly elevated black level the same as the off-level of an OLED. And yet, it won't be clipped (remember mapping?).
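To put numbers on that seating ratio (simple geometry, nothing more):

```python
# Horizontal field of view for a given screen width and seating distance
# (use the same units for both).

import math

def horizontal_fov_deg(screen_width, distance):
    return math.degrees(2 * math.atan(screen_width / (2 * distance)))

print(round(horizontal_fov_deg(20, 20)))   # 1:1 ratio, 20' screen at 20' -> ~53 degrees
# A 65" 16:9 TV is ~56.7" (about 4.7') wide; at a typical 9' couch distance:
print(round(horizontal_fov_deg(4.7, 9)))   # ~29 degrees -> a much smaller slice of your view
```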
So that is my answer: pick the right technology for what you want to do and make it work. If you have to rearrange the room, construct some things, etc., do it. That's the only way to do it properly. That's also why you design home theater rooms for acoustics and visuals and then build them, instead of just putting some equipment in a room, which never gives top performance.
No technology is perfect; they're all flawed one way or another. So pick what suits you best and either live with the consequences or deal with them and make them work. LED isn't perfect, but OLED isn't as perfect as people make it out to be either.
I come from a CRT projection approach in home theater; I used to have pitch-black rooms with velvet material that absorbed all reflections. This changed over time, and barely anyone is doing it the old way these days. What changed is that we have much bigger screens now (between 15' and 25' wide), and equipment for large screens got more expensive ($50k to $100k for the old CRTs vs. $150k to $700k for modern light cannons).
There's always a compromise, no matter what you do. Christie seems to have the holy grail in their pocket, but it's nowhere to be seen yet (demo at CEDIA this year). I'm keeping a very close eye on it.