So let me get this straight:
  • iPads - a mixture of mini-LED, LED, and possibly OLED, or maybe, by 2023, it will be a mixture of mini-LED and OLED, with LED having been phased out completely on all SKUs?
  • iPhone - LED (only on SE) and OLED (all other models)
  • MacBook Air/Pro - LED (on Air, possibly) and mini-LED (on Pro)
  • iMac - LED
  • Mac Pro - mini LED (on Pro Display XDR)
So, pretty confusing in terms of display tech across all product lines. Do I have this right? Do you think Apple will ever standardize the display tech to be all OLED or all mini-LED in all of their products?
When micro LEDs become available, if ever, they will combine the advantages of OLEDs & LEDs and become dominant.
 
  • iPads - a mixture of mini-LED, LED, and possibly OLED, or maybe, by 2023, it will be a mixture of mini-LED and OLED, with LED having been phased out completely on all SKUs?
Unlikely that LED will be phased out completely any time soon as long as an entry-level 10.x-inch iPad still exists. It's basically the iPhone SE of the iPad family.

So, pretty confusing in terms of display tech across all product lines. Do I have this right? Do you think Apple will ever standardize the display tech to be all OLED or all mini-LED in all of their products?
Well, never say never, but in the short term, I think it's going to be a mix.

It's also important to keep in mind that OLED/LED/Mini-LED aren't things that Apple really talks about. It barely mentioned "Mini-LED" during its iPad Pro unveiling, and I can't remember hearing "OLED" used much — if at all — even when the iPhone X arrived back in 2017, much less anytime since.

Apple prefers to emphasize marketing terms like "Super Retina XDR" and "Liquid Retina XDR", where clearly "Super" has become synonymous with "OLED" and "XDR" with the higher contrast ratios offered by both OLED and Mini-LED.

On the basis of that, the lineup sounds less confusing, and Apple can continue to market the cooler-sounding displays as selling points for the higher-end models.

Further, as others in this thread have pointed out, the shift seems to be toward Mini-LED in Apple's larger "Pro" class of devices, likely because it has basically zero problems with burn-in. I also suspect that creating smaller mini-LED screens might be more challenging right now — hence the 11-inch iPad Pro sticking with the older and more established display technology, at least for now.
 
  • Love
Reactions: anshuvorty
Really? I have never seen an OLED TV push out 2000 nits like my QLED TV.

And my QLED TV has the same black backgrounds as OLED, except it can push much, much higher nits too.

I was expecting the nits statement. Nits aren’t important when the contrast is infinite and you have perfect blacks - which is why an OLED doesn’t need to hit a high nits target. And no, your Samsung doesn’t have even remotely close to the black levels of an OLED, nor does it hit 2000 nits on even a 2% window. The best Samsung has a 25,000:1 contrast ratio. It also has terrible viewing angles and handles motion like garbage compared to an OLED.
 
  • Disagree
Reactions: JBGoode
I was expecting the nits statement. Nits aren’t important when the contrast is infinite and you have perfect blacks - which is why an OLED doesn’t need to hit a high nits target. And no, your Samsung doesn’t have even remotely close to the black levels of an OLED, nor does it hit 2000 nits on even a 2% window. The best Samsung has a 25,000:1 contrast ratio. It also has terrible viewing angles and handles motion like garbage compared to an OLED.

Handle motion? My TV does 120Hz no problem and it is extremely low latency. It is actually one of the best TVs on the market for the Xbox Series X. And QLED is just as good with black as OLED.

Low brightness = bad HDR. It’s as simple as that. So OLEDs aren’t good.
 
  • Like
Reactions: Seanm87
Been using OLED iPhones since the very first X (I’ve had four different ones) and never had any issue with burn-in.
Also had a few Samsung Note phones with OLED, as well as a Samsung Galaxy Book Windows laptop with OLED that was heavily used; again, no burn-in.
LCD/LED in all its forms is obsolete, and once you experience a good OLED you’ll never want to go back to the dull colors and ugly grey blacks of even the best LCD.
what about QLED and miniLED?
 
We have an LG TV that’s OLED. The quality is jaw-droppingly beautiful. If you’ve been in a Dolby Theater, it looks exactly the same. Even though it’s a 2D picture, it really has incredible depth. We can have it in a room with skylights and it’s perfectly fine during the day; it’s amazing how bright it gets. Would I use it outdoors? No. There is no better picture and value than investing in an OLED today. You will not regret the purchase of an OLED TV. All that nonsense about burn-in, color shift, and off-axis issues applies to older models. Incredible strides have been made with today’s OLED TVs. Try one out; once you see what it’s like you won’t go back to LED TVs.

How does it do with older TV recordings? I kind of like to watch older TV shows from the 80s, like sitcoms. Do those look better or worse?
I am sure it’s great on the modern 4K stuff.
 
what about QLED and miniLED?
QLED is LCD (with better blacks and better colors), but it’s still LCD.

OLED is organic; each pixel can turn on or off, versus LCD, which has one large panel that needs a backlight or edge light.

Mini-LED is better than other forms of LCD: it has many more, much smaller backlights. It has better brightness than OLED, but blacks still aren’t as deep.

With OLED, black is deep, inky black because the pixels can simply be turned off.
 
  • Like
Reactions: jhollington
You are aware that brightness actually controls the black level (low IRE) of your image and not the peak whites (high IRE), right? So if anything, if you want to make the image brighter (raise peak white), you'd have to adjust the contrast setting. This goes way back to the analogue days.

I'd be really careful with such claims, because your OLED TV isn't even DCI compliant and can't process cinema content. So it can't look the same. It's technically not possible.


No, it's the same with brand new models as well. But keep telling yourself that, maybe it makes you feel better.
Why do you think LG implemented a timer in the firmware of current models that will, when run 8 hours straight (or was it 10?) without changing the channel, void your warranty?

And what is UHD Premium certification, besides a sticker that a manufacturer pays for? Have you actually measured an OLED screen? Which one? What spectroradiometer did you use (just to make sure its wavelength range is suitable for OLED and its specific panel types)?

Just for reference, you can easily get a UHD Premium certification with the following requirements:
  • resolution of at least 3840x2160
  • input must accept 10-bit and the BT.2020 color space (does not have to cover it 100%)
  • must conform to the SMPTE ST 2084 EOTF, with a minimum peak white of 1000 cd/m² at a 0.05 cd/m² black level, or 540 cd/m² at a black level of 0.0005 cd/m²
In other words, if UHD Premium certification were an Olympic event, it would be like doing the high jump with the bar set at 1' (the quick arithmetic below shows why).
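To put a number on that bar height, here is my own quick calculation using only the figures above (not part of the certification text): the two allowed peak-white/black-level combinations correspond to very different contrast ratios, and the first one is well within reach of an ordinary full-array LCD.

```python
# UHD Premium peak-white / black-level options quoted above, as contrast ratios.
lcd_path = 1000 / 0.05      # 20,000:1 -- the LCD-friendly combination
oled_path = 540 / 0.0005    # 1,080,000:1 -- the OLED-friendly combination
print(f"{lcd_path:,.0f}:1 vs {oled_path:,.0f}:1")
```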



I wish people would at least sometimes try to understand these things from a technical perspective. I've been working in image processing for decades and was a research scientist at the institute that came up with the H.264 standard... so what's posted on this forum sometimes really makes me cringe and bite my nails. :rolleyes:

1) I was talking about the brightness of the display as in the nits thing; they claim higher nits is better. I am saying I do not need so many nits unless you are displaying a TV under the noon sun.

2) Even if brightness deals with blacks, why would I want to raise the brightness? That will only make the blacks look more grey. And many displays, including my iPhone and iPad, do not have a contrast adjustment, just brightness.

Also, how does raising brightness on OLED work, since OLED literally turns black pixels OFF?

3) So, since you work in image processing as a scientist, which TV technology would you pick? OLED, QLED, or mini-LED?
 
how does it do with older tv recordings? I kind like to watch older tv shows from 80s like sitcoms , do those look better or worse?
I am sure its great on the modern 4K stuff.
I'm in the same boat... I watch a ton of 480p stuff from that era — I have an iTunes and Plex library full of it, in fact — and I can't say there's really much of a benefit for that stuff.

The LG CX technically offers upscaling improvements, but honestly there's only so much it can do. I've never been able to get a definite answer on this, but logically I don't think it tries to upscale unless it knows that it's getting a 480p signal in the first place. Since the Apple TV 4K always spits everything out at whatever resolution you've set it to, that means in my case I'm feeding old 80's sitcoms into my LG CX at 2160p. I haven't tried setting my Apple TV to 480p output, as I don't expect it would make that much of a difference, and it's not like I'm going to manually switch modes every time I want to watch something different anyway.

Of course, LG does also offer the usual assortment of noise reduction and other picture quality features that help in a variety of other ways, so it's still a great picture as long as you expect it to look like what it is — 480p content from the 80's 😏

In the case of the Apple TV 4K, the biggest thing to watch out for if you're going to watch older content on any HDR TV is to make sure the Match Dynamic Range setting is enabled on the Apple TV (see https://support.apple.com/en-us/HT208288). Otherwise, your Apple TV is going to try and send everything out in HDR, which is going to make older shows look horribly blown out in terms of saturation and contrast.

As an added bonus, as long as you're matching SDR/HDR with the LG CX, you'll be able to use different picture settings for each, so you can fine-tune your SDR picture without worrying about it affecting Dolby Vision 4K shows.
 
Nits aren’t important when the contrast is infinite and you have perfect blacks
Wrong, wrong and wrong. CRT was exactly the same, yet you can't use it for HDR content (please try it). Do the test on your OLED. Calibrate it to a peak white of 5 nits, then try HDR. That will easily show you how wrong you are.

Your next argument will be that you meant within a reasonable range (one that OLED can manage ;) ). There's actually a lot more involved than that, such as screen size, the area of the image that needs to be bright, ambient light, viewing distance, etc.
1) I was talking about the brightness of the display as in the nits thing; they claim higher nits is better. I am saying I do not need so many nits unless you are displaying a TV under the noon sun.

2) Even if brightness deals with blacks, why would I want to raise the brightness? That will only make the blacks look more grey. And many displays, including my iPhone and iPad, do not have a contrast adjustment, just brightness.

Also, how does raising brightness on OLED work, since OLED literally turns black pixels OFF?

3) So, since you work in image processing as a scientist, which TV technology would you pick? OLED, QLED, or mini-LED?
1) Sorry, you said "In fact, most of the time I find myself using the screens at lower than 50% brightness," so I assumed you meant the brightness setting in the menus. The overall brightness of the panel is adjusted with the "Backlight / OLED Light" setting. The LG manuals are poorly worded on brightness: "Brightness: Adjusts the overall screen brightness. The closer to 100, the brighter the screen will become. You may use the Brightness settings specifically to set the dark part of the image." The first sentence is misleading. The second is correct: the setting is used to set the black level.

So why is the first sentence misleading? It's not entirely wrong. What happens is a mapping from an input value (what your source is sending) to an output value (on screen). There's no identity mapping anymore (ignore tone mapping for now); your black level is mapped to a higher (or lower) value. This can result in the overall APL being higher or lower. However, it also results in decreased dynamic range or clipped blacks. Contrast does the same, btw, only for whites. And if your display is capable enough, raising the black level and whites won't result in reduced dynamic range from the source.
Look at this as an example:
[Figure: gray-level transformation by a tone curve T(x), quickly grabbed from ResearchGate]
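If it helps, here is a minimal sketch of that mapping in Python (my own illustration, not anything a TV actually runs): brightness shifts where source black lands, contrast scales where source white lands, and anything pushed outside the panel's range is clipped, i.e. lost dynamic range.

```python
import numpy as np

def apply_tone_curve(pixels, black_level=0.0, white_gain=1.0):
    """Map source values (0..1) to display values (0..1).

    black_level : what a 'Brightness' control effectively does -- it offsets
                  where source black lands on the panel.
    white_gain  : what a 'Contrast' control effectively does -- it scales
                  where source white lands.
    Anything pushed outside 0..1 is clipped: crushed blacks or blown whites.
    """
    out = black_level + white_gain * np.asarray(pixels, dtype=float)
    return np.clip(out, 0.0, 1.0)

ramp = np.linspace(0.0, 1.0, 11)                  # a simple grey ramp from the source
print(apply_tone_curve(ramp))                     # near-identity mapping
print(apply_tone_curve(ramp, black_level=0.1))    # raised blacks: shadows look grey
print(apply_tone_curve(ramp, black_level=-0.1))   # lowered blacks: shadow steps clip to 0
```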


2) You do not want to raise the black level per se; you want to calibrate the black level of your input source (which varies with the source) to your display. Ideally you want to do this for each individual source you're using (and not just the black level, but "everything"). While the protocols are standard, the actual input information varies by manufacturer and will be different depending on the model of your disc player, streaming client, PC/Mac, etc. It also varies with the firmware of the device, the transceiver, and GPUs (including their drivers). What you want to do, to get the representation as intended, is match/calibrate your specific source to your specific display. And you want to do it every now and then, because your display characteristics change over time, particularly non-uniform changes as with OLED, plasma and CRT.

Check it out: open System Preferences in macOS, then open Displays, go to Color, then open the profile. You will see a few of these parameters there. It will also give you the option to calibrate and change settings independent of your display.

As far as iPhone/iPads go, don't fall for the hype. The displays are not very good. There's mainly marketing at work. Neither iPad nor iPhone has the ability to be properly calibrated on its own. However, you can do this at a software level for individual applications. Take Affinity, for example, which does graphics/print. You can import specific profiles into the software, and that will change each individual pixel value for you in software and make things look the way they are supposed to look, so the calibration is done in the "software source" and not the display. The problem is the back-and-forth workflow required, because you can only create/change color profiles on a desktop. You can use test patterns to measure on the iPhone/iPad, though.

Raising the black level on OLED works by not turning pixels off, but having them very dim. Ironically, that's something OLED is pretty bad at: OLED has very poor near-black performance, as seen in the shadow detail of low-APL scenes. The "turning off" thing is an old trick to get better measurements and used to be the backdoor for on/off CR measurement. They do similar things in laser engines for projection. This works well on a total fade to black, but as soon as some other pixels are lit (say, stars in space), it becomes irrelevant because another instrument kicks in (your eyes). You will perceive the black level differently as your iris adjusts. It's a much better method to measure on/off CR with some white pixels lit, but outside the measurement area. This accounts for light spill by the panel, glass and other reflections. Of course the resulting number may not look that nice and marketing won't be happy.
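A toy calculation of why that measurement detail matters (the numbers here are made up purely for illustration, not measurements of any particular panel):

```python
# How a little stray light from lit pixels outside the measurement window
# deflates an "infinite-looking" on/off contrast-ratio measurement.
peak_white = 600.0      # cd/m^2, assumed panel peak
ideal_black = 0.0005    # cd/m^2, "pixels off" reading on a very sensitive meter
spill = 0.02            # cd/m^2 of stray light once nearby white pixels are lit

print(f"everything off:       {peak_white / ideal_black:,.0f}:1")
print(f"with spill accounted: {peak_white / (ideal_black + spill):,.0f}:1")
```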


3) What I'd pick depends on the situation. What is the use case? Watching the news? Playing games? Movies? TV shows? Watching sports? Display size? Viewing distance? Room properties? Also, at what budget?

µLED is the top choice for displays (non-projection) if you can manage the size, heat/cooling and higher costs vs. small OLED/LED TVs (<100").

OLED has a better absolute black level than LED, as long as you're in a dark/black room. However, it comes with color uniformity issues, banding and poor shadow-detail performance. As with all things, whether you see it or not can depend on the content (The Long Night's low-APL scenes from a proper source vs. an NFL game, ...).

LED has a higher black level, but is better in color uniformity (that's why you can seamlessly combine LED panels and not OLED), generally better shadow detail (there are poor LED models out there) and generally no banding. Again, this might depend on the specific model and panel quality. A higher-priced Z-series Sony is better than a cheaper model, as usual.

As to LED vs mini-LED, that depends on the implementation. How many dimming zones? Is it a dual layer LCD, so there's a dimming zone for every pixel? This is hard to answer and depends on the model.

But in general here's what I'd do. Since TVs are small (the usual 65" to 85", maybe up to 100"), they take up a small area of your field of view. This is generally not good for perception, as it has the same effect as someone shining a flashlight in your eyes. You could correct this by sitting closer, but I guess no one wants to sit so close to a small screen. A seating-distance-to-screen-width ratio of 1:1 is a good immersive experience; some like to sit closer. So sitting about 20' away from a 20' wide screen is great. In more professional settings there's an easy trick to compensate for the flashlight effect of TVs. Place a D65 backlight behind the TV with a brightness that matches your setup (assuming your TV is calibrated to that temperature). Not only will this be better for the perception of the image, it will also deal with the overall black level. Your eyes will perceive a slightly elevated black level the same as the off-level from OLED. And yet, it won't be clipped (remember mapping?).
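To put the seating-distance point in numbers, here is a quick back-of-the-envelope sketch (my own, assuming a 65" TV is roughly 4.7' wide):

```python
import math

def horizontal_fov_degrees(screen_width, seat_distance):
    """Horizontal angle a flat screen subtends at the seat (same units for both)."""
    return math.degrees(2 * math.atan(screen_width / (2 * seat_distance)))

print(f"{horizontal_fov_degrees(20, 20):.0f} degrees")   # 1:1 ratio above: ~53 degrees
print(f"{horizontal_fov_degrees(4.7, 9):.0f} degrees")   # ~65" TV at 9': only ~29 degrees
```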

So that is my answer, pick the right technology for what you want to do and make it work. If you have to rearrange the room, construct some things, etc. do it. That the only way to properly do it. That's also why you design rooms for home theaters for acoustics and visuals, then build them and not just put some equipment in a room, which never gives top-performance.

No technology is perfect; they're all flawed one way or another. So pick what suits you best and live with the consequences, or deal with them and make them work. LED isn't perfect, but neither is OLED, whatever people make it out to be.

I come from a CRT-projection approach to home theater; I used to have pitch-black rooms with velvet material that absorbed all reflections. This changed over time, and barely anyone does it the old way these days. What changed is that we have much bigger screens now (between 15' and 25' wide), and equipment for large screens got more expensive ($50k to $100k for the old CRTs vs. $150k to $700k for modern light cannons).

There's always a compromise, no matter what you do. Christie seems to have the holy grail in their pocket, but it's nowhere to be seen yet (demo at CEDIA this year). I'm keeping a very close eye on it.
 
The LG CX technically offers upscaling improvements, but honestly there's only so much it can do.
That is perfectly normal for display devices; the processing is usually pretty poor. Get a Lumagen Radiance Pro and it will do a much better job of upscaling SD content. In addition, for HDR you also get dynamic tone mapping, which brings performance to a whole new level.
 
  • Like
Reactions: jhollington
OLED for an iOS device? Hmmm, doesn't this mean it will get burn-in of the GUI from the most-used apps, like, say, Affinity or a browser like Safari?



Mini-LED TVs are a thing? Do they have a blooming effect? I am in the market for a new TV but I am so confused about which tech to pick. I would go with OLED, but what are off-axis colours? I thought OLED gives better viewing angles?



What is with the craziness about "bright" displays? Unless you are watching in your outdoor garden or in an all-glass room, all screens are pretty viewable to me. In fact, most of the time I find myself using the screens at lower than 50% brightness. Even if you have big windows, during a movie you can just lower the curtains.

HDR-capable displays make the brights very bright and the darks very dark - looks like you’re actually there. Not possible without a tech like miniLED that can truly get bright.

We have an LG TV that’s OLED. The quality is jaw-droppingly beautiful. If you’ve been in a Dolby Theater, it looks exactly the same. Even though it’s a 2D picture, it really has incredible depth. We can have it in a room with skylights and it’s perfectly fine during the day; it’s amazing how bright it gets. Would I use it outdoors? No. There is no better picture and value than investing in an OLED today. You will not regret the purchase of an OLED TV. All that nonsense about burn-in, color shift, and off-axis issues applies to older models. Incredible strides have been made with today’s OLED TVs. Try one out; once you see what it’s like you won’t go back to LED TVs.

and I guess PWM-induced headaches from OLED displays are a fantasy, too?
 
QLED is LCD (with better blacks and better colors), but it’s still LCD.

OLED is organic; each pixel can turn on or off, versus LCD, which has one large panel that needs a backlight or edge light.

Mini-LED is better than other forms of LCD: it has many more, much smaller backlights. It has better brightness than OLED, but blacks still aren’t as deep.

With OLED, black is deep, inky black because the pixels can simply be turned off.

You left out:

MicroLED has the best of all possible worlds - brightest image and best blacks and no image retention
 
I was expecting the nits statement. Nits aren’t important when the contrast is infinite and you have perfect blacks - which is why an OLED doesn’t need to hit a high nits target. And no, your Samsung doesn’t have even remotely close to the black levels of an OLED, nor does it hit 2000 nits on even a 2% window. The best Samsung has a 25,000:1 contrast ratio. It also has terrible viewing angles and handles motion like garbage compared to an OLED.

Motion handling surely isn’t about panel tech?
 
In more professional settings there's an easy trick to compensate for the flashlight effect of TVs. Place a D65 backlight behind the TV with a brightness that matches your setup (assuming your TV is calibrated to that temperature). Not only will this be better for the perception of the image, it will also deal with the overall black level. Your eyes will perceive a slightly elevated black level the same as the off-level from OLED. And yet, it won't be clipped (remember mapping?).
And the backlight's brightness is what?
The right amount depends on the display's contrast.
If you put a 100-nit backlight behind it and the max brightness of a scene is 10 nits, the backlight will spoil it.
This is of course hypothetical, because we know that making a scene look "dark" doesn't mean there are no peaks.
Without peaks it just looks underexposed.

I find this nit race a bit odd.
If you want to use the screen in an uncontrolled environment, no display will be optimal.
The display's surface will be over 10 nits even when it's off if the room is bright.

We have to remember that the temporal contrast perception of the eye has limited dynamics.
At any given moment, the human eye can only resolve about 6.5 stops, even if the full range over time is 46.5 stops. The full range drops with age.
Looking at the full midday sun is only about 10x too bright for the eye (~10^8 cd/m²); with an ND1.2 filter (i.e. roughly 1/16 transmission) it's just 6.25 Mcd/m², and the blackest black the eye can then distinguish is around 62,500 cd/m².
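Spelling out that arithmetic (my reconstruction of the figures above, with the ~6.5-stop instantaneous range rounded to roughly 100:1):

```python
sun_luminance = 1e8        # cd/m^2, the figure quoted above
nd_transmission = 1 / 16   # ND1.2 is roughly 1/16 transmission
instant_range = 100        # ~6.5 stops is 2**6.5 ~= 90:1, rounded to 100:1

attenuated = sun_luminance * nd_transmission   # 6.25e6 cd/m^2
black_floor = attenuated / instant_range       # ~62,500 cd/m^2
print(f"{attenuated:,.0f} cd/m^2 peak, ~{black_floor:,.0f} cd/m^2 darkest distinguishable level")
```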

So if you want optimal picture, you'll have to control the room's light anyway.
Why not control it in a way that's optimal for the display?

So, if the display has dynamic contrast, there's no need for more than 200:1 at any instant.
Beyond that it's just like the megapixel game: advertising.

Do people really enjoy using their HDR1000 TVs at full brightness, even at night, because otherwise they would miss "the full HDR experience"?

Color accuracy is of course a point, but in near black people usually don't notice the color much, if it's not just way off.
For pure contrast, OLED wins in a dimly lit room but loses in every other case.

Btw, if you want a cinematic experience at home with a TV, you can always move closer. Or decide not to have a cinematic experience and watch from far away, on the other side of a big room.

End note: I miss 3D TV; I like it in the cinema and, surprisingly, also at home.
Hopefully it will come back soon, when they have to upgrade UHD-BD to sell all the movies again. And new TVs...
 
  • Like
Reactions: pdoherty
HDR-capable displays make the brights very bright and the darks very dark - looks like you’re actually there. Not possible without a tech like miniLED that can truly get bright.



and I guess PWM-induced headaches from OLED displays are a fantasy, too?

But why do I want HDR? From what I have seen, HDR is just not a normal-looking image. It's like it has the saturation turned way up. Sounds like another 3D TV novelty thing?
 
  • Disagree
Reactions: Seanm87

OK, thanks, this was informative; hopefully others benefit from it too. I appreciate the effort.

1) So what can a guy like me do to calibrate his screen? I once tried getting images online and calibrating to them, but the results were horrible, IMO. Do you recommend I get something like a Datacolor Spyder? I can't justify the price at $215.

2) You seem to prefer QLED over OLED; you seem to imply that OLED's only advantage is the deep darks/off pixels. So when is OLED the better choice?

3) With the D65, are you suggesting I install a light behind my TV? That sounds distracting?! I have seen it as a feature in some TVs.

4) I see you recommend a device called the Lumagen Radiance Pro; that costs like $7,000+?! I don't think that's reasonable for a slightly better picture. What is it even used for, with so many HDMI inputs?
 
Faster processor, better cameras including depth sensors. Plus it’s called Pro. 🏋️‍♀️
lol. Yeah, it is called Pro. And yes, it will be faster. But I don’t think people will pay up because the Pro iPads have better cameras. You don’t really think “camera” when buying an iPad. To me, one of the biggest selling points of spending more for an iPad Pro besides the speed (which isn’t really being utilized by the software yet)... is the screen. To give the better OLED screen to the iPad Air first makes zero sense, regardless of what this article says. I just don’t believe it.
 
I wonder what the biggest display Apple will get to on these things will be. They're inching along in small steps.
 
But why do I want HDR? From what I have seen, HDR is just not a normal-looking image. It's like it has the saturation turned way up. Sounds like another 3D TV novelty thing?

Capturing more of the dynamics between bright and dark areas is an improvement. Maybe you’re just not used to it.
 
So if you want optimal picture, you'll have to control the room's light anyway.
Why not control it in a way that's optimal for the display?
Sorry, I am talking about controlled environments. If you're watching with light sources in the room or light coming through a window, you won't be able to see elevated blacks anyway.

The right amount of D65 backlight behind the TV is enough to make the black level look "black" and not elevated.
So, if the display has dynamic contrast, there's no need for more than 200:1 at any instant.
For the eye only? Yes. From a neurological point of view when it comes to perception, no. More in the 800-1000:1 ballpark.
Color accuracy is of course a point, but in near black people usually don't notice the color much, if it's not just way off.
Color shift can lead to variation in brightness.

Btw, if you want a cinematic experience at home with a TV, you can always move closer. Or decide not to have a cinematic experience and watch from far away, on the other side of a big room.
Moving closer deals with the "point light" issue from small TVs. It still doesn't give the feeling of a large screen. As long as one has clues about the size of the screen, such as a visible frame or an object next to the screen, the perception of size will only change with the perception of distance. So one still knows it's a small, TV-sized screen. That's why moving an iPad right up to your nose doesn't work; it's still perceived as an 11" or 13" screen. In addition, if a cinematic experience is the goal, moving up to the screen can be problematic when it comes to audio. Nothing can replace a properly sized screen. But of course this comes at a cost which is certainly more than a $2k or $10k TV. For professional environments price doesn't matter. For home use, it's a hobby. Some buy sneakers, some are into photography with expensive lenses, others collect cars, and some decide to have home theaters. Never question a hobby.
End note: I miss 3D TV; I like it in the cinema and, surprisingly, also at home.
Hopefully it will come back soon, when they have to upgrade UHD-BD to sell all the movies again. And new TVs...
Hate to break it to you, but 3D is dead. The whole concept with glasses is flawed. And don't forget the hardware to do it properly is expensive. The best 3D I've seen (and actually enjoyed) is the Sim2 HDR double stack, which uses triple flash at 144Hz. It looks great when the source material looks great. Unfortunately there's little such material. And at that point, I'm not sure if spending $150k on two 1080p machines with 5000 lumens for 3D is worth it at all, given there's better 4K with HDR in similar price ranges. I've never been a fan of 3D on TVs, way too small. But it's a personal preference, so to each their own.
OK, thanks, this was informative; hopefully others benefit from it too. I appreciate the effort.

1) So what can a guy like me do to calibrate his screen? I once tried getting images online and calibrating to them, but the results were horrible, IMO. Do you recommend I get something like a Datacolor Spyder? I can't justify the price at $215.

2) You seem to prefer QLED over OLED; you seem to imply that OLED's only advantage is the deep darks/off pixels. So when is OLED the better choice?

3) With the D65, are you suggesting I install a light behind my TV? That sounds distracting?! I have seen it as a feature in some TVs.

4) I see you recommend a device called the Lumagen Radiance Pro; that costs like $7,000+?! I don't think that's reasonable for a slightly better picture. What is it even used for, with so many HDMI inputs?
1) I can't recommend these cheap colorimeters. There was a study a few years back, published to ISF members, that basically said more than half of the Spyder stuff comes out of the factory with false readings, is extremely inaccurate (particularly in the low-IRE region) or has non-reproducible (random) readings. I bought a bunch of <$1500 sensors back in the day and never found them to work as I expected. They're also not suited for every type of display device. If one is serious about this stuff and willing to put in the time and effort (one has to learn the science behind it, it's not a push of a button), then I always recommend the Klein K10 as an entry-level instrument. However, it's $7k (sometimes on sale in the $5k range). Anyone who isn't willing to deep-dive into this is better off hiring an ISF-certified calibrator to do it.

2) OLED is the better choice when you're in a pitch-black room without any type of ambient light and you don't have an LED panel at hand that can compete with the black level of OLED. That depends on the number of dimming zones, whether it's a single- or dual-LCD design, etc.

Btw, technically it is a misconception that OLED is totally black. There's still a very small amount of light coming from a "turned off" pixel; it's just so low that most sensors can't read it (and eyes can't see it). You need to go into the NIST-certified, six-figure price range to be able to measure it.

OLED is also better for integrating the display into your environment (check the wall series from LG). You can also curve certain OLED screens (we've also seen curved LCDs in the past), as some of them are flexible. You can completely hide them with roll-out screens too. Power consumption can be a consideration, depending on the model.


3) Yes; properly done, it's not distracting at all. You've probably seen this on Philips TVs; they call it Ambilight. But it's not D65, so it's not color-neutral, which changes the perception of colors, and it's way too bright. They have the right idea, but the implementation is more of a gimmick.

4) Lumagen has different models, so as always, pick the right one for the job. The street price for the top model is more like $5k. I recommended it for better upscaling of SD material, but in reality it does so much more. It doesn't only upscale; it does frame-rate and aspect-ratio conversion. It can do sharpening and softening, and can remove edge-enhancement artifacts, noise, artifacts from older film prints, etc. It has full 3D LUTs for calibrating source devices, and dynamic tone mapping for each frame, which brings HDR to a whole new level of quality by having a fully calibrated chain from source to display, unlike standard HDR10 and Dolby Vision, which are generic, static (at least for now) and do not consider your specific setup at all. You can also have different setups depending on the situation. For example, with projectors, 3D (as pointed out) requires more light than 2D. So you can have calibrated 2D and 3D settings, even with multiple lamps (if not laser-based) turned on in the projectors. I've seen installations with up to 4 lamps in a single projector, depending on the required light output.

All of that is only useful when your display device is up to it, of course. It makes no sense at all for bad displays. I'm going to go one step further and say that anyone who is really serious about the best video quality possible will need one of these. It's a must-have device and not a gimmick, and the difference is not small (again, depending on display quality).


Let me add a few things in general, because I think there's a lot of misconception when people talk about "nits" and brightness. The term these days is mostly used as a marketing thing: more = better. But technically a nit is the amount of light equal to one candela per square meter. So this is specific to an area of one m² (about 10.8 sq ft). Change the area and the definition isn't valid anymore. No one is arguing that an OLED isn't bright enough when putting up a full white field. It will probably be so bright that you have to look away or close your eyes. But again, this is not what it is about. What we want is small objects to be super bright, such as stars in the black of space. Let's say your display is rated at 600 nits; then it can't reproduce stars at 600 nits, because in order to do that, the star would have to be one m² in size on your display. However, when the star is only a few pixels large, you only get a fraction of those 600 nits from the area of one m². So in order to show these stars at a very high brightness, you need a much higher brightness in one m²; that is why displays of 2000 nits and more make sense. Not to show a full white field and blind you with it, but to be able to have high brightness for smaller objects.
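To make that "fraction of one m²" point concrete, here is a rough illustration with assumed numbers (a ~65" 4K panel and a 50-pixel star; none of these figures come from the post above): the light actually thrown toward the viewer scales with luminance times emitting area, so a tiny highlight at a given nit level contributes almost nothing compared with a full field.

```python
# Rough, illustrative numbers only: how little of the panel's light a tiny highlight is.
panel_width_m, panel_height_m = 1.43, 0.80    # roughly a 65" 16:9 panel (~1.14 m^2)
panel_area = panel_width_m * panel_height_m

star_pixels = 50                              # a small star, a few dozen pixels
pixel_pitch_m = 0.37e-3                       # ~0.37 mm pixel pitch on a 65" 4K panel
star_area = star_pixels * pixel_pitch_m ** 2  # ~6.8 mm^2

luminance = 600.0                             # cd/m^2 panel rating
# On-axis luminous intensity is roughly luminance x emitting area:
print(f"star is {star_area / panel_area:.1e} of the screen area")
print(f"~{luminance * star_area * 1e3:.1f} mcd from the star vs ~{luminance * panel_area:.0f} cd full field")
```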

Here's a practical example. Buy a small, dim flashlight and have someone shine it into your eyes from some distance. You will be OK. Then use 100 of those flashlights and do the same. Then 1000, and so on. Many flashlights will be too bright to look at; however, if you want extreme brightness from a single light source (flashlight), you have to buy a brighter one.

So in addition to the simple nits number, we have to look at what we want to do and put it into perspective. If anyone is familiar with the series "Home Before Dark" on Apple TV, the following is specific to episode 5 of season 2, which I recently looked at from a technical point of view. The color primaries are BT.2020; the color primaries of the mastering display were P3. The mastering display luminance is between 0.005 cd/m² and 1000 cd/m². The maximum content light level in that episode is 1597 cd/m²; that is the brightest pixel in the entire episode. However, the maximum frame-average light level is 163 cd/m², i.e. the highest frame-average luminance in the entire episode. It seems "dim" at only 163 cd/m², however the frame could be mostly dark with the bright pixels going up to >1000 cd/m² (remember, this is an average over all pixels). And that is why you need high brightness: to properly show those small bright parts of the image. It becomes less of a problem the larger the objects get. Some definitions here: https://docs.microsoft.com/en-us/windows/win32/api/dxgi1_5/ns-dxgi1_5-dxgi_hdr_metadata_hdr10
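A small sketch of what those two metadata numbers mean (my own simplified illustration, not how mastering tools actually compute them; the real MaxCLL/MaxFALL values are derived from per-pixel color components per the HDR10 metadata rules):

```python
import numpy as np

def maxcll_maxfall(frames):
    """frames: iterable of 2D arrays of per-pixel luminance in cd/m^2.

    MaxCLL  ~ brightest single pixel anywhere in the program.
    MaxFALL ~ highest frame-average luminance in the program.
    """
    maxcll, maxfall = 0.0, 0.0
    for frame in frames:
        frame = np.asarray(frame, dtype=float)
        maxcll = max(maxcll, float(frame.max()))
        maxfall = max(maxfall, float(frame.mean()))
    return maxcll, maxfall

# A made-up, mostly dark 4K frame with one small, very bright highlight -- this is
# how MaxCLL can sit near 1600 cd/m^2 while MaxFALL stays down around 160 cd/m^2.
frame = np.full((2160, 3840), 160.0)
frame[:40, :40] = 1597.0
print(maxcll_maxfall([frame]))     # -> (1597.0, ~160.3)
```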

Also, someone mentioned Dolby Cinema theaters above. One has to consider that home HDR content is usually mastered to at least 1000 nits. We're already seeing 4000 nits, and the industry is making moves to push to 10000 nits. HDR in Dolby Cinema theaters, however, is mastered to 108 nits. So you really can't compare the two or use Dolby theater content at home without further processing (leaving aside the specific equipment needed). So why is that? Well, here comes human perception, which is not linear and in general a little strange (but hey, it works). A bunch of papers exist (also from Dolby) that say that, given two objects (one small, one large) with the same brightness, the larger object is perceived as brighter. So in general, large objects seem brighter than smaller objects. Given the large screen size in Dolby Cinema theaters, that might make sense when it comes to mastering at 108 nits (I personally think it could be more, but less than home HDR; there's a bit more involved, but let's ignore that to not blow up the thread). That also mirrors the general experience in our home theaters, where screens are usually between 15' and 25' wide for the enthusiastic movie fan. I have seen some 35'+ wide home theaters, but they're not that common. Mastering-wise (in nits), they could fall between the massive Dolby screens and the small TVs. I'm saying could, because depending on equipment and quality one can use both home content and cinema content.


All of that brings me to answer your question from above:
But why do I want HDR?
To have that large dynamic range between "the black of space and that super-bright tiny star far, far away". So it's not really about that super-massive on/off contrast ratio of millions:1 (infinite:1 is nonsense, it's marketing). It is the intra-scene contrast ratio that you want for HDR.

I hope this puts things a bit more into perspective, even if I've simplified it and skipped some stuff.
 
Wrong, wrong and wrong. CRT was exactly the same, yet you can't use it for HDR content (please try it). Do the test on your OLED. Calibrate it to a peak white of 5 nits, then try HDR. That will easily show you how wrong you are.

Your next argument will be that you meant within a reasonable range (one that OLED can manage ;) ). There's actually a lot more involved than that, such as screen size, the area of the image that needs to be bright, ambient light, viewing distance, etc.

I think you mean... "right, right, right".

Which is why they have a minimum standard, well above 5 nits.

Good straw man attempt though. You almost got me to humor you further.
 
Handle motion? My TV does 120Hz no problem and it is extremely low latency. It is actually one of the best TVs on the market for the Xbox Series X. And QLED is just as good with black as OLED.

Low brightness = bad HDR. It’s as simple as that. So OLEDs aren’t good.

Absolutely incorrect.

It's about contrast. The brightest of brights matched to the darkest of darks, all present within the same scene -- which is why there's a minimum black level you're ignoring.
 