I think you mean... "right, right, right".

Which is why they have a minimum standard, well above 5 nits.

Good straw man attempt though. You almost got me to humor you further.
Sorry, I mean exactly what I said, even if that hurts your feelings when you have to convince others in order to make yourself believe that you made the right investment. You shouldn't care, as long as you're happy with it. It's already outdated again; you have to learn to live with it... it's tech. I once bought a Faroudja video processor for $30k, it was outdated soon after, and today an Apple Watch has more processing power. After that I bought a Teranex for $125k; guess what, outdated, and the above-mentioned Lumagen blows it away. Our displays are most likely outdated as well.

So didn't you say above, and I quote:
Nits aren’t important when the contrast is infinite and you have perfect blacks - which is why an OLED needs not hit a high nits target.
?
Now you do exactly what I predicted you'd do. ;)
So what is that minimum standard? What norm? And why is it a standard?

Your "infinite" argument does not make sense, as infinite contrast doesn't actually exist. The reason people quote that is that most cheap sensors can't read such low values and display either Inf., Error or something similar. Using an extremely high-end and expensive sensor (six figures, usually NIST certified), you can read the black-level value of an OLED.
What sensor are you using btw?

These sensors are usually reserved for lab work. The company manufacturing the raw material LG uses in its OLEDs is using these sensors. How do I know? Because I was there when the plant was built and when it opened. The company doing it is Merck KGaA, with HQ in Germany. I've been a consultant for them in the past, so I have some idea about what they're doing.

Please feel free to humor us all (including the video guys working in mastering whom I forwarded this thread to). And don't worry, I teach at a university and am used to know-it-all-but-in-fact-know-nothing first-semester math/EE/CS students, so I'm used to being humored.


It's about contrast. The brightest of brights matched to the darkest of darks present all within the same scene -- which is why there's a minimum black level you're ignoring.
What contrast? There isn't just one contrast, so at least be more specific. How do you match bright to dark? That would result in zero dynamic range available, which makes no sense. I assume you mean the delta between black and peak white? Why do you need a black level for that? You could just as well go the other way and start from the top. As long as DR doesn't change, you can squeeze all values in there. Whether you have a loss of DR or not is another question.

So what luminance resolution is OLED able to resolve?

I posted an example above about "Home Before Dark": how would you solve this on a display (no matter if OLED or otherwise) that does not have the desired peak brightness? In the end you want to display things as intended.

What is your take on the Universal/Sony/Warner HDR mastering process for discs and the BDA's requirements for studio releases? How do you solve this? Do you see it as problematic at all?
 
  • Like
Reactions: jhollington

I'm sorry, I meant exactly what I said -- to softly state you're wildly incorrect, that you made a BS straw man to sound smart, and to try to gaslight your way through things -- all on display again here.

I don't need to use a sensor - these things are captured by professionals and available to me with a simple google search. I don't need to be a technician to see the differences, or to be backed up by every single professional review ever written. You're a prime example of why doctors have to carry malpractice insurance; even they get things wrong.

What are the minimum specs? Well, you were kind enough to provide them already: 540 nits at a 1,080,000:1 contrast ratio, which OLED crushes currently; or 1000 nits at 20,000:1 contrast -- a massive difference... so I don't know why you create a straw man at 5 nits, or use a CRT as an example -- you're a pro after all, you should know that a CRT contrast is an ANSI 100:1, that it can't get black once the electrons are firing unless the entire scene is black, that it can't realistically exceed around 1200p because of interference in the gas (even though 1080i CRTs at RETAIL were very real). Increasing the horizontal deflection rate beyond 1200 would create a trade-off in reasonable brightness and eliminate it from the discussion... or you'd end up with an air traffic control screen weighing 300 pounds for a 20" square monitor. But, you know this... you're the genius, so explain why you'd ever think this was relevant. You can't, because you'd have to admit you're wrong.
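As an aside, if those are the usual "peak nits plus contrast ratio" certification pairs, the black levels they imply fall straight out of the division; a quick sketch (the tier labels are just shorthand, not official names):

```python
# Back-of-the-envelope check of what the two quoted tiers imply for black
# level, assuming each is a "peak luminance + contrast ratio" pair.
tiers = {
    "540-nit tier":  (540, 1_080_000),   # peak cd/m^2, contrast ratio
    "1000-nit tier": (1000, 20_000),
}

for name, (peak, contrast) in tiers.items():
    black = peak / contrast              # implied black floor in cd/m^2
    print(f"{name}: implied black level {black:.4f} nits")

# 540-nit tier:  implied black level 0.0005 nits (OLED-like blacks)
# 1000-nit tier: implied black level 0.0500 nits (typical good LCD blacks)
```

Which is the whole disagreement in one number: the 540-nit tier banks on near-zero blacks, the 1000-nit tier on raw peak brightness.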

Yet, here we are... not only is your straw man garbage, you double down with more straw man arguments. DCI requires 90% P3, OLED is 98%. I think the real term you were looking for was Rec.2020, which is a 37% increase in spectrum from DCI-P3. Sure, it doesn't meet Rec.2020, yet. Quantum Dot OLED is next year... with Samsung leading the pack, I can't imagine why.......
 
Thank you for proving my point by not answering anything I asked and showing everyone that you have absolutely no idea what you're talking about. But keep telling yourself things and try to convince others to make yourself feel better.

Thank you again for making me and a bunch of other people I'm in touch with laugh a lot with this last post. Among all the nonsense you've posted, gas in a CRT is really the funniest thing in video technology I've read in a long, long time. This shows what happens when you quickly google things but don't understand them at all. Gas in CRTs was used by Thomson in the late 1890s when he experimented with electrons, as any EE student should know. However, there's no gas in CRTs used for video applications. In fact, the tube is "filled" with nothing, and that nothing is of course a vacuum. First time I've read that one for CRTs used in video applications. But I'll give you that one, it was indeed hilarious. 😂
 
I've received a few messages and was asked to comment on a few things, so here we go:

I don't need to use a sensor
Your display device is unique and different from others, even from the same model, so you do. But of course you don't have one.

What are the minimum specs? Well, you were kind enough to provide them already,
I did not; you're referring to a certification based on arbitrary values. These are not based on any actual standard, nor is the certification a standard. Again, what minimum standard? Do you know what a standard is?

so I don't know why you create a straw man at 5 nits
You claimed brightness doesn't matter.

you should know that a CRT contrast is an ANSI 100:1
For some yes, for others not. Some are less, some are more.

that it can't get black once the electrons are firing unless the entire scene is black
Wrong and can easily be measured. There's no leakage current on the phosphor.

that it can't realistically exceed around 1200p
Depends on the actual CRT and its size. Some don't even go beyond 480p. Others go beyond 1200p.


because of interference in the gas
There's no gas in CRTs used for video applications.
EDIT: I'm sure you're not referring to a Fermi gas, because that would truly be ridiculous. How about talking about dark matter for a bit? ;)
Increasing the horizontal deflection rate beyond 1200 would create a trade off in reasonable brightness and eliminate it from the discussion...
There's no raster; if anything, increasing resolution would result in blooming and in turn higher brightness.
DCI requires 90% P3, OLED is 98%.
You don't even understand what I'm talking about, because you have no experience with the equipment. Here's a starter: explain how you connect DCI-compliant equipment to an OLED TV.
 
You left out:

MicroLED has the best of all possible worlds - brightest image and best blacks and no image retention
Micro LED is still not commercially available, and price and production cost are still unrealistic, so it needs at least another half a decade to be an option.
 

I claimed contrast matters. Which is a figure you arrive at by factoring in both brightness and darkness.

So… no, I didn’t. Context matters.

Keep gaslighting. You have zero credibility.
 
Nits aren’t important when the contrast is infinite and you have perfect blacks - which is why an OLED needs not hit a high nits target.

I'm heartbroken, all this from a guy who thinks there's gas in a CRT. 😂

You keep repeating nonsense which you googled and copy&pasted without actually understanding it, all while avoiding answering a few simple questions whose answers you can't find with a quick google search and copy&paste. And yes, I did ask those questions specifically because I knew you're not able to answer them, to show everyone you have no idea what you're talking about. Anyone who is familiar with this topic could easily answer them.

I suggest you educate yourself and come back when you're actually able to contribute to this thread by answering a few of those questions.


For everyone else, I'd like to clarify something about CRT. It's long dead but deserves better. Of course the 1200p limit as claimed in this thread is nonsense. The Barco M series was among the first 5k CRTs. They were usually sold for medical applications, but could be used for anything, sold by specialized dealers. It came with a different choice of phosphor and resulting lifetime, with a peak brightness of up to 800cd/m^2. It also came in different resolutions. I've used a specific model often which had a resolution of 2560x2048 (this is directly from the data sheet). We usually drove those with custom Matrox GPUs back in the day. They were calibrated and we had to check the calibration once every month to ensure they didn't drift for critical workflows.

Another gem was the Barco Reality series, several models, with the top-end one offering the highest resolution. Out of the box, and I'm directly quoting the Belgian spec sheet here, it did up to 3,200 x 2,560 pixels. The limit came from the bandwidth of the RGB amplifiers. There was a modified version of the amp board later on which allowed full home and DCI 4k resolution (home and DCI 4k have different resolutions). There were not many of these built due to size and cost, as well as energy consumption and heat (they did eat up over 1kW). In addition, this was around a time when people were already going digital.

The first ones who did a "digital" workflow for quality assurance back in the day were the guys around Rick McCallum for the Star Wars prequels, using modified JVCs for video and a Meridian DSP speaker system for audio. I remember talking to a few guys involved back then and they were all totally happy about it. Good old times I guess. :)
 
  • Like
Reactions: jhollington

I haven't copied and pasted anything... you, however, are quite suspect. There are also no cathode rays in a CRT, yet there it is in the name. It's just a stream of electrons. The gas many people see is the phosphorus, which anyone that has broken one has seen. It's actually released as a powder, but since the hardware operates in a vacuum, as soon as the airtight seal is broken the air around you will rush in and "puff" the powder into the air... hence, for the layman (who happens to be your audience) it is perceived as a gas.

The consumer upper limit for CRTs is around 1200p. Commercial use is higher, as I've already stated with air traffic control monitors, which are 2048x2048. They also weigh 300 pounds and are 20" squares - I've already said this. Anyone that had a 35" or bigger CRT HDTV in the early 2000s knows what those weighed at 1080i.

But, you're the (fake) genius here. You knew all this.

You don't have questions, you have straw man arguments because you have no ground to stand on. They aren't meant to be answered, they're meant to confuse anyone reading them. You're gaslighting, you're a pretend know-it-all -- it's nice that you finally admitted that you're making things up and doing exactly what I said you were. You probably worked in a call center for one of these companies, at best.
 
The below is from a professional calibrator as to why OLED isn't bright enough for HDR
What a surprise. ;)

But I guess (almost) everyone knows that by now. With the rest trying to justify their purchase as a holy grail.

You have still not answered anything, because you have no idea about the technology. :rolleyes:

But hey, be happy with OLED (I have some as well, just like other technologies).



For the others reading: of course there's no gas in the phosphor or in the tube, total nonsense. The only time gas can become an issue is when the tube is being evacuated, which is why that's done in a controlled environment. There's absolutely no gas in a tube when operating it. For further detail on CRT, one might be wise to get in touch with Curt Palme, a long-time CRT guru. He's still using them today. Some tubes can be refurbished; I'm not sure if he is still offering that service through VDC. I've not been in touch with Curt for ages. I think I had my last Barco shipped from him in 2001 or 2002.
 
  • Like
Reactions: jhollington
Sorry, I am talking about controlled environments. If anyone is watching with light sources in the room or through a window, you won't be able to see elevated blacks anyway.

The right amount of D65 backlight behind the TV is enough to make the blacklevel look "black" and not elevated.

For the eye only? Yes. From a neurological point of view when it comes to perception, no. More in the 800-1000:1 ballpark.

Color shift can lead to variation in brightness.


Moving closer deals with the "point light" issue from small TVs. It still doesn't give the feeling of a large screen. As long as one has clues about the size of the screen, such as visible frame or an object next to the screen, the perception of size will only change with the perception of distance. So one still knows it's a small TV sized screen. That's why moving an iPad right up to your nose doesn't work, it's still perceived as a 11" or 13" screen. In addition, if cinematic experience is considered, moving up to the screen can be problematic when it comes to audio. Nothing can replace a properly sized screen. But of course this comes at a cost which is certainly more than a $2k or $10k TV. For professional environments price doesn't matter. For home use, it's a hobby. Some buy sneakers, some are into photography with expensive lenses, others are collecting cars and some decide to have home theaters. Never question a hobby.

Hate to break it to you, but 3D is dead. The whole concept with glasses is flawed. And don't forget the hardware to do it properly is expensive. The best 3D I've seen (and actually enjoyed) is the Sim2 HDR double stack, which uses triple flash @ 144Hz. It looks great when the source material looks great. Unfortunately there's little such material. And at that point, I'm not sure if spending $150k on two 1080p machines with 5000 lumens for 3D is worth it at all, given there's better 4k with HDR in similar price ranges. I've never been a fan of 3D on TVs, way too small. But it's a personal preference, so to each their own.

1) I can't recommend these cheap colorimeters. There was a study a few years back, published to ISF members, that basically said more than half of the Spyder stuff comes out of the factory with false readings, is extremely inaccurate (particularly in the low-IRE region) or has non-reproducible (random) readings. I bought a bunch of <$1500 sensors back in the day and never found them to work as I expected. They're also not suited for every type of display device. If one is serious about this stuff and willing to put in the time and effort (one has to learn the science behind it, it's not a push of a button), then I always recommend the Klein K10 as an entry-level instrument. However, it's $7k (sometimes on sale in the $5k range). Anyone who isn't willing to deep-dive into this is better off hiring an ISF-certified calibrator to do it.

2) OLED is the better choice when you're in a pitch-black room without any type of ambient light and you don't have an LED panel at hand that can compete with the black level of OLED. That depends on the number of dimming zones, whether it's a single- or dual-LCD design, etc.

Btw, technically it is a misconception that OLED is totally black. There's still a very small amount of light coming from a "turned off" pixel; it's just so low that most sensors can't read it (and eyes can't see it). You need to go into the NIST-certified, six-figure range to be able to measure it.

OLED is also better for integrating the display in your environment (check the wall series from LG). You can also curve certain OLED screens (we've also seen curved LCDs in the past), as some are flexible. You can completely hide them with roll-out screens too. Power consumption can be a factor, depending on the model.


3) Yes, properly done it's not distracting at all. You've probably seen this on Philips TVs, they call it Ambilight. But it's not D65, so it's not color-neutral, which will result in a changed perception of colors, and it's way too bright. They have the right idea, but the implementation is more of a gimmick.

4) Lumagen has different models, so as always, pick the right choice for the job. Street price for the top model is more around $5k. I recommended it for better upscaling of SD material. But in reality it does so much more. It does not only upscale, it does frame rate and aspect ratio conversion. It can do sharpening, softening, and can remove edge-enhancement artifacts, noise, artifacts from older film prints, etc. It has full 3D LUTs for calibrating source devices and dynamic tone mapping for each frame, which brings HDR to a whole new level of quality by having a fully calibrated chain from source to display, unlike standard HDR10 and Dolby Vision, which are generic, static (at least for now) and do not consider your specific setup at all. You can also have different setups depending on the situation. For example with projectors, 3D as pointed out requires more light than 2D. So you can have calibrated 2D and 3D settings, even with multiple lamps (if not laser based) turned on in the projector. I've seen installations with up to 4 lamps in a single projector, depending on required light output.
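To make the two headline features a bit more concrete, here is a minimal sketch of the ideas behind a calibration 3D LUT lookup and a per-frame ("dynamic") tone map. This is only an illustration of the concepts, not Lumagen's actual processing; the LUT size, knee point and 700-nit display peak are made-up values.

```python
import numpy as np

def apply_3d_lut(frame_rgb, lut):
    """Sample a calibration 3D LUT. frame_rgb: (H, W, 3) floats in 0..1,
    lut: (N, N, N, 3) table. Nearest-neighbour lookup for brevity;
    real processors interpolate between LUT nodes."""
    n = lut.shape[0]
    idx = np.clip(np.rint(frame_rgb * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

def dynamic_tone_map(frame_nits, display_peak=700.0):
    """Per-frame tone map: pass through what the display can show and
    compress only the range between a knee point and this frame's own
    peak -- analysing each frame is what makes it 'dynamic'."""
    frame_peak = float(frame_nits.max())
    if frame_peak <= display_peak:
        return frame_nits                     # frame already fits the display
    knee = 0.75 * display_peak                # start compressing above the knee
    scale = (display_peak - knee) / (frame_peak - knee)
    compressed = knee + (frame_nits - knee) * scale
    return np.where(frame_nits <= knee, frame_nits, compressed)

# Toy usage: identity 17-point LUT and a frame with one very bright highlight
grid = np.linspace(0.0, 1.0, 17)
identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
frame = np.random.rand(4, 4, 3)
print(apply_3d_lut(frame, identity_lut).shape)        # (4, 4, 3)

nits = np.full((4, 4), 50.0)
nits[0, 0] = 1600.0                                   # small specular highlight
print(dynamic_tone_map(nits).max())                   # compressed to ~700 (display peak)
```

The point of the per-frame analysis shows in the toy example: a frame whose peak already fits the display passes through untouched, while a 1600-nit highlight gets rolled off to the display's ceiling instead of clipping.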

All of that is only useful when your display device is up to it, of course. It makes no sense at all for bad displays. I'm going to go one step further and say that anyone who is really serious about the best video quality possible will need one of these. It's a must-have device, not a gimmick, and the difference is not small (again, depending on display quality).


Let me add a few things in general, because I think there's a lot of misconception when people talk about "nits" and brightness. The term these days is mostly used as a marketing thing. More = better. But technically a nit is the amount of light equal to one candela per square meter. So this is specific to an area of one m^2 (about 10.8 sqft); change the area and the definition isn't valid anymore. No one is arguing that an OLED isn't bright enough when putting up a full white field. It will probably be so bright that you have to look away or close your eyes. But again, this is not what it is about. What we want is small objects to be super bright, such as stars in the black of space. Let's say your display is rated 600 nits, then it can't reproduce stars at 600 nits, because in order to do that, the star would have to be one m^2 in size on your display. However, when the star is only a few pixels large, you only get a fraction of those 600 nits from the area of one m^2. So in order to show these stars at a very high brightness, you need a much higher brightness in one m^2; that is why displays of 2000 nits and more make sense. Not to have a full white field and blind you with it, but to be able to have high brightness for smaller objects.

Here's a practical example. Buy a small, dim flashlight and have someone shine it into your eyes from some distance. You will be ok. Then use 100 of those flashlights and do the same. Then 1000 and so on. Many flashlights will be too bright to look at; however, if you want extreme brightness from a single light source (flashlight), you have to buy a brighter one.
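To put rough numbers on that analogy: luminance (nits) is light per square metre, so the total light a patch sends out scales with its area. The screen and star sizes below are made-up example values, just to show the orders of magnitude:

```python
# Rough numbers for the "tiny star" case. Luminance (nits = cd/m^2) is per
# square metre, so a small patch emits roughly luminance * its area in candela.
screen_w_m, screen_h_m = 1.44, 0.81      # roughly a 65" 16:9 panel (assumed)
h_px, v_px = 3840, 2160                  # UHD pixel grid

pixel_area = (screen_w_m / h_px) * (screen_h_m / v_px)   # m^2 per pixel
star_area = 5 * 5 * pixel_area                           # a star ~5x5 pixels

for nits in (600, 2000):
    print(f"{nits} nits: star emits ~{nits * star_area * 1000:.2f} mcd, "
          f"a full 1 m^2 patch would emit {nits} cd")

# 600 nits:  star emits ~2.11 mcd, a full 1 m^2 patch would emit 600 cd
# 2000 nits: star emits ~7.03 mcd, a full 1 m^2 patch would emit 2000 cd
```

However high the panel's rating, a few-pixel star only ever contributes a couple of millicandela; the only lever left for making it read as intensely bright is raising the luminance itself.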

So in addition to the simple nits number, we have to look at what we want to do and put it into perspective. If anyone is familiar with the series "Home Before Dark" on Apple TV, the following is specific to episode 5 of season 2, which I recently looked at from a technical point of view. The color primaries are BT.2020; the color primaries of the mastering display were P3. The mastering display luminance is between 0.005 cd/m^2 and 1000 cd/m^2. The maximum content light level in that episode is 1597 cd/m^2. That is the brightest pixel in the entire episode. However, the maximum frame-average light level is 163 cd/m^2. That is the frame with the highest average luminance in the entire episode. It seems "dim" at only 163 cd/m^2; however, it could be mostly dark with the bright pixels going up to >1000 cd/m^2 (remember this is an average over all pixels). And that is why you need high brightness, to properly show those small bright parts of the image. It becomes less of a problem the larger the objects get. Some definitions here: https://docs.microsoft.com/en-us/windows/win32/api/dxgi1_5/ns-dxgi1_5-dxgi_hdr_metadata_hdr10
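For reference, the two numbers quoted there (MaxCLL and MaxFALL) are defined over the whole programme: the brightest single pixel anywhere, and the highest per-frame average. A minimal sketch of the bookkeeping, assuming you already have per-pixel luminance in cd/m^2 for each frame (real encoders derive it from the maximum RGB component per CTA-861.3):

```python
import numpy as np

def maxcll_maxfall(frames):
    """frames: iterable of (H, W) arrays of per-pixel luminance in cd/m^2.
    Returns (MaxCLL, MaxFALL): brightest pixel in the whole programme and
    the highest frame-average luminance in the whole programme."""
    max_cll, max_fall = 0.0, 0.0
    for frame in frames:
        max_cll = max(max_cll, float(frame.max()))
        max_fall = max(max_fall, float(frame.mean()))
    return max_cll, max_fall

# Toy example: a mostly 5-nit frame with one small 1597-nit highlight.
dark = np.full((2160, 3840), 5.0)
bright = dark.copy()
bright[:10, :10] = 1597.0
print(maxcll_maxfall([dark, bright]))   # (1597.0, ~5.0): huge MaxCLL, tiny MaxFALL
```

Which is exactly the situation described above: a frame can average only a few nits and still contain pixels far beyond 1000.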

In addition, someone mentioned Dolby Cinema theaters above. One has to consider that home HDR content is usually mastered to at least 1000 nits. We're already seeing 4000 nits and the industry is making approaches to push to 10000 nits. HDR in Dolby Cinema theaters, however, is mastered to 108 nits. So you really can't compare these two or use Dolby theater content at home without further processing (leaving aside the issue of the specific equipment needed). So why is that? Well, here comes human perception, which is not linear and in general a little strange (but hey, it works). A bunch of papers exist (also from Dolby) that say that given two objects (one small, one large) with the same brightness, the larger object is perceived as brighter. So in general, large objects seem brighter than smaller objects. Given the large screen size in Dolby Vision theaters, that might make sense when it comes to mastering to 108 nits (I personally think it could be more, but less than home HDR; there's a bit more involved, but let's ignore that to not blow up the thread). That also mirrors the general experience for our home theaters, where screens are usually between 15' and 25' wide for the enthusiastic movie fan. I have seen some 35'+ wide home theaters, but they're not that common. Mastering-wise (in nits), they could fall between the massive Dolby screens and the small TVs. I'm saying could because, depending on equipment and quality, one can use both home content and cinema content.


All of that brings me to answer your question from above:

To have that large dynamic range between "the black of space and that super bright tiny star far, far away". So it's not really about that super massive on/off contrast ratio of millions:1 (infinite:1 is nonsense, it's marketing). It is your intra-scene contrast ratio that you want for HDR.

I hope this puts things a bit more into perspective, even if I've simplified it and skipped some stuff.

Thanks, this was very informative and I appreciate the effort put into explaining it to me. I never thought of it this way, but brighter screens are wanted to show smaller objects... interesting.

May I ask which tv you use?

The below is from a professional calibrator as to why OLED isn't bright enough for HDR


A long time ago I made a decision: if you cannot tell the difference immediately with your eyes, it does not matter. No need to bring out scientific gauging tools to find which one is better. At that point it does not matter.
 
  • Like
Reactions: jhollington
Exactly this. I came to that conclusion about 18 years ago when I first set up iTunes and was trying to decide on the best bitrate to rip my CDs. I went through a whole bunch of anxiety reading all sorts of opinions and technical analyses before I finally decided to do my own blind ABX testing and realized that my aging ears that spent 20 years hanging around jet engines in the Air Force really couldn't tell the difference between fully lossless audio and a 256kbps AAC.
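For anyone curious what a blind ABX run actually involves, the protocol itself is tiny: X is randomly A or B on every trial, you guess which, and you count how often you're right. A bare-bones sketch (the play/ask callables are placeholders for whatever player and prompt you use):

```python
import random

def abx_trial(play_a, play_b, ask_guess):
    """One ABX trial: X is secretly A or B; the listener guesses which."""
    x_is_a = random.random() < 0.5
    play_a()
    play_b()
    (play_a if x_is_a else play_b)()            # play X without revealing it
    return ask_guess() == ("A" if x_is_a else "B")

def abx_session(play_a, play_b, ask_guess, trials=16):
    correct = sum(abx_trial(play_a, play_b, ask_guess) for _ in range(trials))
    # Roughly: 12+ correct out of 16 clears the usual p < 0.05 bar; anything
    # near 8/16 is indistinguishable from guessing.
    return correct, trials
```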

The same is now true with my eyes. My LG CX is a massive upgrade from my 2013-era Panasonic ST60 Plasma. While some of the higher-end LED sets may technically provide better HDR, I can't for the life of me tell the difference. On the other hand, I very much can see the much deeper blacks afforded by OLED and Plasma screens, so for me the choice was obvious.

FWIW, here's another good write-up on the differences... https://www.techradar.com/news/oled-tvs-arent-always-great-for-hdr-heres-why ... seems like a pretty balanced take IMHO, but perhaps others in this thread who obviously have far more expertise on the subject than I can weigh in with their own thoughts. Doesn't change the fact that I'm very happy with my LG CX, of course, but I'm enough of a nerd to be curious on a purely technical and theoretical level.
 
May I ask which tv you use?
Several by nature. ;)
Main displays are a Samsung 85" QN90A and an 88" LG Signature Z9 (which I thought would be the holy grail, but isn't).

Some others in addition from Panasonic, Sony and Samsung, but they get less use (bedrooms, media rooms for games, the bar at the entrance to the home theater, etc.).

I'm only using TVs for TV/series content and a game every now and then, usually not movies which is reserved for the home theater.

Currently I have an eye on the upcoming µLED TVs from Samsung (they only come in 99" and 110"). They were supposed to ship mid-2021, but probably thanks to corona, they're nowhere in sight. I'll have to wait and see what happens, but I guess µLED is the next stop.
 
Doesn't change the fact that I'm very happy with my LG CX, of course, but I'm enough of a nerd to be curious on a purely technical and theoretical level.
Perfect technology doesn't exist, everything is flawed one way or another. No-one is arguing that you can't be happy with what you use.

I'm using Macs as well and for some things I do, they're far from perfect and outright annoying. Yet, overall I am very happy with my Macs (and iPhone, iPad, Watch... Apple in general).
 
  • Like
Reactions: jhollington
The right amount of D65 backlight behind the TV is enough to make the blacklevel look "black" and not elevated.
I'm interested in what the right amount is.
If your content's brightness varies greatly, can you put up a static backlight?
If that were e.g. 10 nits, you couldn't see anything dimmer than 1:1000 of it.
So, you let the backlight define your perceived black level?
For the eye only? Yes. From a neurological point of view when it comes to perception, no. More in the 800-1000:1 ballpark.
Can you point me to any link with details about this increase from eye -> neurological?
I'd appreciate it a lot, to be able to educate myself a bit more.
Color shift can lead to variation in brightness.
Isn't that almost too small an issue to perceive?
How much weight would you put on that?
Is it something like a 10-20 value difference in 16-bit grayscale?
Moving closer deals with the "point light" issue from small TVs. It still doesn't give the feeling of a large screen. As long as one has clues about the size of the screen, such as visible frame or an object next to the screen, the perception of size will only change with the perception of distance. So one still knows it's a small TV sized screen.
I'm well aware of the psychological effect of "real sizes". But we are not in a binary world. I'm already old, so moving things too close doesn't work. But watching a movie on a "TV" at home, there is a big difference in experience if your angle of view (for the picture) is 10° or 30°. You could also argue that the sociological aspect of a movie has to be counted too, so you'd need at least 50 other viewers in the same space to have a "real" movie experience.
Hate to break it to you, but 3D is dead.
I still see 3D movies in IMAX-theaters, are they going away?
What we want is small objects to be super bright, such as stars in the black of space. Let's say your display is rated 600 nits, then it can't reproduce stars at 600 nits, because in order to do that, the star would have to be one m^2 in size on your display. However, when the star is only a few pixels large, you only get a fraction of those 600 nits from the area of one m^2. So in order to show these stars at a very high brightness, you need a much higher brightness in one m^2; that is why displays of 2000 nits and more make sense. Not to have a full white field and blind you with it, but to be able to have high brightness for smaller objects.
This is where I'd need more published studies about how the image is really perceived.
Everybody can check on Wikipedia that the eye has only a 100:1 momentary contrast ratio.
( https://en.wikipedia.org/wiki/Human_eye#Dynamic_range )
It is of course a gigantic oversimplification of things.
And maybe very far from the truth.
The references say:
[16] https://books.google.fi/books?id=DR9UyqLkgH8C&pg=PT110 p.92: "The eye is capable of registering a contrast range of approximately 1000:1." No reference for this claim is given.
[18] https://books.google.fi/books?id=LL5orppYlJsC&pg=PA1 p.1: "Humans can see detail in regions that vary 1:10^4 at any given adaptation level." No reference for this claim either.
(I added a question in the talk page of that wiki-page.)
Well, here comes human perception, which is not linear and in general a little strange (but hey, it works). A bunch of papers exist (also from Dolby) that say that given two objects (one small, one large) with the same brightness, the larger object is perceived as brighter. So in general, large objects seem brighter than smaller objects. Given the large screen size in Dolby Vision theaters, that might make sense when it comes to mastering to 108 nits (I personally think it could be more, but less than home HDR; there's a bit more involved, but let's ignore that to not blow up the thread). That also mirrors the general experience for our home theaters, where screens are usually between 15' and 25' wide for the enthusiastic movie fan. I have seen some 35'+ wide home theaters, but they're not that common. Mastering-wise (in nits), they could fall between the massive Dolby screens and the small TVs. I'm saying could because, depending on equipment and quality, one can use both home content and cinema content.
So, did I understand your meaning right:
Small objects need to have bigger contrast, compared to the background, to be perceived as very bright?
And bigger objects need less contrast between them and the background (to be perceived as a very bright object)?
A small star vs. the background needs something like 10,000:1, and a star filling up half the screen needs only something like 500:1 of star vs. background?
 

I am not sure if you know anything about this, but I keep hearing that Sony has the best "video processing" with its chip. I have no idea what that means; is it true or is it just marketing mumbo jumbo?
 

Personally, I have played DVDs on my TV and can't tell if it's Blu-ray or not, and I still think cassette tapes sound great 🤣🤣🤣
 
  • Haha
Reactions: jhollington
I am not sure if you know anything about this, but I keep hearing that Sony has the best "video processing" with its chip. I have no idea what that means; is it true or is it just marketing mumbo jumbo?

Sony has excellent motion-handling for one. Better than any others I’ve seen.
 
The Barco M series was among the first 5k CRTs. They were usually sold for medical applications, but could be used for anything, sold by specialized dealers. It came with a different choice of phosphor and resulting lifetime, with a peak brightness of up to 800cd/m^2.
I still have a few Matrox cards in a box somewhere, at least one Parhelia...
IIRC, every CRT I ever saw had this behavior that black levels would rise if you turned brightness up to the max.
What was the black level of that Barco when the peaks were set to 800 nits?
 
The consumer upper limit for CRTs is around 1200p. Commercial use is higher, as I've already stated with air traffic control monitors, which are 2048x2048. They also weigh 300 pounds and are 20" squares - I've already said this. Anyone that had a 35" or bigger CRT HDTV in the early 2000s knows what those weighed at 1080i.
When you say "consumer", I guess you mean 1080i HDTVs?

Because saying that all CRTs have a certain fixed resolution does not make any sense.

I still have 2 Sony Trinitron computer monitors in my father's basement.
They are model G520, but there was an even higher-quality version, maybe it was the F520, called the "Artisan".
These were specced 1600x1200 @ 85Hz.
With software like PowerStrip, you could tweak the picture to whatever resolution was needed.
Since Trinitrons have an aperture grille, not a shadow mask, there's no hard limit on vertical resolution. The aperture grille pitch of 0.22mm set the max horizontal resolution. Later, they made a cheaper Artisan, the C520, which had an aperture grille pitch of 0.24mm.
21" Trinitrons had a horizontal picture size of 403.8mm, so with an aperture grille pitch of 0.22mm, you could have a horizontal resolution of about 1835.
Again IIRC, you could even stretch that viewable image size a bit wider before the borders went under the chassis.
Still IIRC, I did use them at 2048x1536 @ 85Hz for some of the time... so they could take quite a lot of "overdrive"...
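A quick check of that aperture-grille arithmetic, using only the figures quoted above:

```python
# Max addressable horizontal dots ≈ viewable width / aperture-grille pitch.
viewable_width_mm = 403.8
for pitch_mm in (0.22, 0.24):            # G520/F520 pitch vs. the cheaper C520
    print(f"{pitch_mm} mm pitch -> about {int(viewable_width_mm / pitch_mm)} dots")
# 0.22 mm -> about 1835 dots; 0.24 mm -> about 1682 dots
```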

EDIT: well, wormhole to memory lane...:
Maybe I will some day tune those G520's...
 
Sorry for the late reply, things got really busy the past two weeks and I'm still working through some stuff. But here's a quick reply.

I'm interested in what the right amount is.
As I said, raise the static backlight to the point when a black screen appears black. There's no fixed number, as that depends on the display and environment.
Can you point me to any link with details about this increase from eye -> neurological?
Don't have one at hand. I doubt you'll find one that is specifically about the contrast of the eye and neurological perception; I've never seen a single one handling it all. But basically, the brain is processing multiple images, not single ones, using recurrent pathways, in a similar way to what we've seen in artificial neural networks, e.g. LSTMs. So even if a single image has low CR, combining multiple can yield increased CR "performance". There are also some papers saying that change in CR is important; since the eye is constantly moving, "scanning" the environment, that can help as well. That 800 to 1000:1 ANSI CR I mentioned is from several experiments that have been performed at/after shows like CES, Cedia, etc.
I still see 3D movies in IMAX-theaters, are they going away?
I'd say it's a niche market. But 3D was always niche.
Small objects need to have bigger contrast, compared to the background, to be perceived as very bright?
And bigger objects need less contrast between them and the background (to be perceived as a very bright object)?
Higher brightness. Dolby published a paper about this a while ago. If I find the time and the paper, I'll try to post a link. But no promises.
I am not sure if you know anything about this, but I keep hearing that Sony has the best "video processing" with its chip. I have no idea what that means; is it true or is it just marketing mumbo jumbo?
In terms of panel control, as already pointed out by someone else, yes. They're doing a better job than others. However, in terms of general video processing, they still can't compete with the likes of Lumagen. But they manage to get the most out of the panels.

I still have a few Matrox cards in a box somewhere, at least one Parhelia...
IIRC, every CRT I ever saw had this behavior that black levels would rise if you turned brightness up to the max.
What was the black level of that Barco when the peaks were set to 800 nits?
We used some Parhelias back in the day. Black level on a CRT depends on many things, among them the electronics and the beam spot shape, and also the energy. One can easily drive CRTs into blooming to make them brighter, but resolution will suffer, burn-in becomes possible, black level degrades, etc.

I'd have to guess what the black level of a max-brightness Barco was; it's been ages since we used them. I probably still have some calibration reports somewhere (god knows where), as we had to calibrate each and every display device in use and then check the calibration at specific intervals. Funny enough, black level never really mattered as long as the overall CR was good enough and a minimum brightness could be reached. Even for medical applications.
 
  • Like
Reactions: MacBH928
As I said, raise the static backlight to the point when a black screen appears black. There's no fixed number, as that depends on the display and environment.

Does the Philips Ambilight work like a backlight? Is it better or worse? I tried to see how much it costs, but setting one up without a Philips TV costs an arm and a leg. I wonder if there is a Chinese knock-off option.
 
Does the Philips Ambilight work like a backlight? Is it better or worse? I tried to see how much it costs, but setting one up without a Philips TV costs an arm and a leg. I wonder if there is a Chinese knock-off option.
Any light should work. You want it to be neutral and dimmable, but as long as you have that, the source doesn't really matter. See https://www.waveformlighting.com/d65-bias-lighting.
 