Those curious about burn-in should look at real-world examples over at https://www.avsforum.com, https://www.avforums.com and several other forums. While the Rtings test is nice, compensation for these particular patterns is at least partially handled in firmware to prevent burn-in; others are not. As usual, there's variance among displays. Manufacturers are learning from this, though, and usually improve burn-in resistance from generation to generation. It also becomes less of an issue with new technology. When people throw away their TVs/displays after two or three years to buy something new, it's less of a problem. The same can be said for an iPad: just upgrade every year. :apple:
Right - I think the Rtings tests are essentially a "worst case scenario", which is why I pointed them out. The generational changes are also why I mentioned that the TVs they used are the older C7 models and not current gen. I've not read anything on AVS suggesting that burn-in is going to come knocking at someone's door. I personally have the Vizio OLED H1, which has the same screen as the LG CX, and haven't had any issues - but that didn't stop me from being a little nervous about burn-in at the beginning. It's kinda like the fear some people have of owning an electric car and running out of charge in the middle of the highway.
 
It is all over the web. Is it as bad as in the photos? Usually not. But it's simple physics: the second you turn it on, OLED starts to degrade. Why do you think LG is implementing a ton of algorithms in firmware to fight burn-in? Why do you think they have a counter running in the TV firmware that tracks how long you stay on the same channel, and that after a specific amount of time without switching channels your warranty is void?

Go and measure the output of an OLED panel when it's brand new, then again after 500 hours, 1k, 2k, ...
The human eye won't notice this change, as it occurs gradually over time, but the specs do change. The claim that OLED has absolute black and therefore doesn't need to be as bright is nonsense. We've been there before, back when we bought tube-based machines (as in CRT) for $100k for our home theaters and studios. How did that go?

Not that other technologies don't drop in brightness, but not as much, and the drop is uniform.
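
For anyone who wants to put rough numbers on that, here is a minimal Python sketch of a toy luminance-decay model evaluated at the measurement intervals mentioned above (brand new, 500 h, 1k h, 2k h). The half-life constant is a made-up placeholder, not a figure for any real panel; the point is only that the loss is gradual, which is why the eye never catches it day to day.

    # Toy exponential decay: fraction of original brightness after `hours` of on-time.
    # HALF_LIFE_HOURS is an illustrative placeholder, not a measured panel spec.
    HALF_LIFE_HOURS = 30_000

    def relative_luminance(hours: float) -> float:
        return 0.5 ** (hours / HALF_LIFE_HOURS)

    for h in (0, 500, 1_000, 2_000, 10_000):
        print(f"{h:>6} h: {relative_luminance(h) * 100:5.1f}% of original luminance")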

That being said, you're still fighting the usual OLED problems with color uniformity, banding, poor shadow detail in low IRE regions, etc. That's why it's not used in professional setups. For the average Joe at home, who doesn't even have a properly calibrated display, it probably won't matter. Professional equipment is more expensive anyway.

For the home, sure, it's good enough. Too bad it's not available in immersive sizes. ~80" is tiny. I want 15' to 20' wide. If I could get that for $100k to $200k, I'd happily buy an OLED display, even if I'd have to replace it every 5 years. What remains is Barco/Christie (in all price ranges above $100k) or Sony/Samsung with their µLED walls, which go for around $800k+, consume a ton of power and require dedicated cooling. So meh, even though Christie has a nice solution in the pipeline starting at around $300k. We'll see at CEDIA later in the year. The meh is just for me; I know a few people who are very happy with their µLED Sony and Samsung walls.

So again, OLED iPads for the average Joe and mini-LED for content creators working on studio/broadcast content. Then again, without a proper calibration it doesn't make much sense, and I doubt Apple will provide these features. I also doubt many people using an iPad for this will buy the necessary calibration equipment - and yes, that is not the consumer stuff that measures all over the place, but professional equipment that actually works, which starts at around $7k depending on light source and display type and goes up to $100k+ depending on the requirements.

This is an iPad (a toy) after all.

I am not a visual pro, so I do not need super-precise colour calibration. I do use my iPad to take notes at work, but I guess I qualify as your average Joe. I guess as the technology advances, burn-in is becoming less of an issue, especially if we consider that content consumers, like myself, constantly move/change/manipulate the contents of their screens. Even when we watch a movie in full screen, the screen is never static. Good to know, though. I guess Apple will continue refining mini/micro-LED for their Pro devices and will give us less demanding folk OLED. Sounds good to me.
 
What blooming? I haven’t noticed anything at all. I’m also not looking for it. I just enjoy my first iPad ever and it’s incredible. Some people just need to look for the negative things in life. I’m keeping this iPad for a long time. I don’t need to upgrade every single year.
I'm in the exact same boat. I keep reading all of these threads with people arguing about the screen tech... honestly, this is a beautiful upgrade from my 2013 MBP and that's all I need to know. I will make my new setup last at least 5+ years.
 
I've used a new iPad Pro 12.9 - it looks great. I even tried it in a completely dark room using Books with a black background and white text. You'd need to set the brightness pretty low in that environment to be comfortable, and at that brightness, blooming, IMO, is negligible. If you crank up the brightness, well, blooming isn't as much of an issue as the generally unusable level of brightness in a dark room. miniLEDs are fine in my book.

My thinking, though, is that I'd much rather have miniLED on a laptop with apps that can do it justice in 'dark mode', and would still rather have OLED on the watch/phone/iPad.

And no matter what is driving the display, I hope I can get 800 nits - or at least as bright as an Apple Watch in the sunlight - so I can use these outside.
 
Screen tech in migration.

What will be the final destination?
Micro-LED

pros
  • Each pixel is its own light source. Like OLED.
  • Fully black background. Like OLED/Mini-LED.
  • No blooming. Unlike Mini-LED.
  • No burn-in. Unlike OLED.
  • No blue-pixel fade. Unlike OLED.

cons
  • Not available now. Probably still a few years out.
unknown
  • OLED PWM dimming causes a flashing effect for some viewers. Does Micro-LED use the same dimming method?
 
PWM dimming is not exclusive to OLED; it's also used in LCD displays. And some manufacturers decide to use low-frequency PWM (220-240 Hz), which is known to cause effects for human eyes. Not everyone notices it the same, but it affects everyone to some degree.

It's apparently sometimes even used in certain regular desktop monitors. But manufacturers never tell you a display uses PWM; you always have to wait for reviews that measure it.

And from what I've heard, microLED also uses PWM. There are already microLED displays on the market, just not for consumers.
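
To make the frequency point concrete, here is a small Python sketch of how PWM dimming works: the emitter is switched fully on and off, and perceived brightness comes from the duty cycle. The 240 Hz and 2 kHz values are just example frequencies (the first matching the low range mentioned above), not the specs of any particular display. Note how, for the same average brightness, the dark gap per cycle is much longer at the low frequency, which is what some viewers perceive as flicker.

    def pwm_timing(target_brightness: float, frequency_hz: float) -> tuple[float, float]:
        """Return (on_ms, off_ms) per PWM cycle for a brightness level in [0, 1]."""
        period_ms = 1_000.0 / frequency_hz
        on_ms = period_ms * target_brightness
        return on_ms, period_ms - on_ms

    for freq in (240.0, 2_000.0):               # low vs. high PWM frequency (example values)
        on_ms, off_ms = pwm_timing(0.25, freq)  # 25% brightness
        print(f"{freq:6.0f} Hz: on {on_ms:.2f} ms, off {off_ms:.2f} ms per cycle")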
 
Makes sense: a cheaper, less color accurate display for the mid-range iPad. With that, the Air won’t suffer from light bleed anymore. I will never understand why they used OLED in the iPhone Pro though; I wonder how long it will take them to go back to LCD. For mini-LED the display is probably too small, so I guess we have to wait all the way for microLED.
 
...And tell me, when was the last time your iPhone OLED screen burned in? You would likely replace your iPad before that could happen!
My iPad Pro 9.7 is five years old now, so I'd hate it if my screen had burn-in of the status bar icons and text.

The 13" iPP will stay Mini LED because it is more color accurate while the smaller ones will get OLED.
 
Makes sense: a cheaper, less color accurate display for the mid-range iPad. With that, the Air won’t suffer from light bleed anymore. I will never understand why they used OLED in the iPhone Pro though; I wonder how long it will take them to go back to LCD. For mini-LED the display is probably too small, so I guess we have to wait all the way for microLED.
So OLED is less color accurate in what world exactly? Color is all part of the calibration. OLED can have better calibration than an LCD and vice versa. It has no bearing on whether it's an OLED.

I don't know how many times I keep reading this on MR forums. That and the burn-in fear mongering needs to be put on a rocket and shot into space.

If the next-gen iPad Air gets OLED, I'll be selling my beloved 2018 iPP 11" and getting that. It's strange and confusing that a lower-end product will get a better display than the Pro model that's much more expensive. I'll take the savings though!
 
Color is all part of the calibration.
No, it is not. While color representation can be calibrated, it is limited by the physical light source or material emitting the light in the device. You simply cannot calibrate any device to represent any colors you like: there are limits, and for certain things you need to apply a color filter. In addition, the material in OLEDs changes with energy/heat, and this isn't a linear function. OLEDs by nature are not very uniform when it comes to this. That's also the reason you can't stitch together multiple displays as with mini-LED or µLED (well, you can, but it's not seamless). This is harder to see in mixed-content scenes, much more visible on homogeneous colors, where you will also see banding.
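
A rough way to see the "physical limit" part in code: any chromaticity a display can reproduce has to sit inside the triangle spanned by its primaries in CIE xy, and no calibration can push it outside. The sketch below uses the standard Rec.709 primaries as the example panel and the Rec.2020 green point as the target; the geometry test is generic, only the example numbers are choices made for illustration.

    def _sign(p, a, b):
        # Signed-area test: which side of line a-b the point p falls on.
        return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

    def in_gamut(target, red, green, blue):
        """True if `target` (CIE xy) lies inside the triangle spanned by the primaries."""
        d1 = _sign(target, red, green)
        d2 = _sign(target, green, blue)
        d3 = _sign(target, blue, red)
        has_neg = d1 < 0 or d2 < 0 or d3 < 0
        has_pos = d1 > 0 or d2 > 0 or d3 > 0
        return not (has_neg and has_pos)

    # Standard Rec.709 primaries (x, y) as the example panel, Rec.2020 green as the target.
    rec709_r, rec709_g, rec709_b = (0.640, 0.330), (0.300, 0.600), (0.150, 0.060)
    rec2020_green = (0.170, 0.797)
    print("Rec.2020 green reproducible on a Rec.709 panel?",
          in_gamut(rec2020_green, rec709_r, rec709_g, rec709_b))  # prints False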

You can also measure this with the proper equipment as well. It's been discussed up and down in the forums I linked to earlier in this thread. Here's a good thread on OLED uniformity: https://www.avforums.com/threads/post-your-oled-uniformity-photos-and-discuss.2325857/

Notice it's worse with low IRE patterns, which is tied to the poor shadow detail performance of OLED. They have excellent absolute black, but just above that they're lacking detail, which shows in dark/night scenes where other technologies are much better at representing detail. Game of Thrones comes to mind; The Battle of Winterfell is an example of this. Many people complained about it, because it's dark and there should be lots of detail in those dark scenes, which OLEDs fail to represent. To quote Chris Welch from The Verge:
That said, while OLEDs excel at black level and contrast, they sometimes tend to crush details in dark scenes like Winterfell’s huge battle — especially if your TV hasn't been calibrated.
Mini-LED, µLED and dual panel/chip technologies are much better at this, but also more expensive. Sometimes much more expensive, depending on size.
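
Shadow-detail claims like this are normally checked with near-black step patterns rather than movie scenes. Below is a small Python sketch (using NumPy and Pillow) that writes a simple low-IRE bar pattern from 0% up to roughly 10% of 8-bit full scale; the resolution, step count and file name are arbitrary choices for the example.

    import numpy as np
    from PIL import Image

    def low_ire_steps(width=1920, height=1080, steps=11, max_level=26):
        """Vertical bars from 0 up to `max_level` (~10% of 8-bit full scale)."""
        levels = np.linspace(0, max_level, steps).astype(np.uint8)
        bar_width = width // steps
        pattern = np.zeros((height, width), dtype=np.uint8)
        for i, level in enumerate(levels):
            pattern[:, i * bar_width:(i + 1) * bar_width] = level
        return pattern

    Image.fromarray(low_ire_steps(), mode="L").save("low_ire_steps.png")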
 
Notice it's worse with low IRE patterns, which is tied to the poor shadow detail performance of OLED. They have excellent absolute black, but just above that they're lacking detail, which shows in dark/night scenes where other technologies are much better at representing detail. Game of Thrones comes to mind; The Battle of Winterfell is an example of this. Many people complained about it, because it's dark and there should be lots of detail in those dark scenes, which OLEDs fail to represent. To quote Chris Welch from The Verge:

Mini-LED, µLED and dual panel/chip technologies are much better at this, but also more expensive. Sometimes much more expensive, depending on size.
No, you can't use Game of Thrones to justify your argument. The problem with those dark episodes of Game of Thrones was the broadcast encodes, and it sucked in general, both on OLEDs and on LCDs. It was simply bad. The cinematographer made a stupid statement saying that everything was fine with the original source material, without factoring in that these shows are re-encoded for TV.

In contrast ;), near-black detail on Netflix 4K shows like Lost in Space is amazing, including on OLEDs.

Now there is one issue with near-blacks that did differentiate some OLEDs from other TVs, and that is near-black posterization and other problems that occurred with LG TVs on poorly encoded material. However, that mostly wasn't inherent to the panels. That was due to the inferior image processing by the LG TVs' SoCs. Sony OLEDs (which used the same LG OLED panels) fared better.
 
No, you can't use Game of Thrones to justify your argument.
I'm not using GoT to make that argument; all I said was that it's good material to show this issue. Of course you'd want to use the discs for this. Plenty of other examples out there show this. You can also use masters, so as not to rely on bad encodes, or shoot your own material.

As pointed out, this can easily be measured with proper equipment. The issue, among others, is there; sometimes more, sometimes less, depending on the display.

OLED is cheap in the end, so people can make the choice. I'm curious to see what Samsung brings with their µLED TVs (besides the already available The Wall). They should come in 99" and 110" for under $150k. Too small for the home theater, maybe too big for the living room to watch the news. 88" and 76" versions will follow, no price yet. These should fill the gap.
 
I'm not using GoT to make that argument; all I said was that it's good material to show this issue. Of course you'd want to use the discs for this. Plenty of other examples out there show this. You can also use masters, so as not to rely on bad encodes, or shoot your own material.
And my point is that no, it’s not good material to show this, because it’s nothing like what you’re trying to make it out to be.

To put it another way, I know a bunch of people who tried watching it on HBO and just stopped watching because the quality was so bad. This is on both OLEDs and LCDs. They then proceeded to torrent the episode which was ripped from a better quality stream, and the near black detail was just fine, again both on OLEDs and LCDs.

Those commentators online claiming it was due to specific TV tech were basically just clueless. It was all to do with the encodes.

BTW, this is not unexpected. Whatever general algorithms they use for these broadcast encodes are especially poor for near blacks. In the past this didn’t matter because content creators didn’t try to include this type of detail in their dark scenes. Now they do, but the encoding algorithms have not been keeping up. This is what HBO demonstrated with GoT, NOT deficiencies in OLED technology.
 
To put it another way, I know a bunch of people who tried watching it on HBO and just stopped watching because the quality was so bad. This is on both OLEDs and LCDs.
Again, use proper sources. It was also used by Christie, among others, for demos - not bad encodes. Invitation-only events, selected people from the industry. Pioneer did similar events back in the day (plasma) and used LotR from masters. The issue is there. Then again, the perfect technology does not exist. Pick your poison.
 
What a rip: Apple tricked everyone into getting the sham mini-LED technology, then flips the table with OLED.

smh

1. It’s a rumor.
2. They tricked no one. I bought my 12.9 knowing full well that next year’s model will make this look like last year’s model.
3. They’re not flipping any table. They’ve been quite vocal about their screen tech progression.
 
Phones spend a lot less time with static elements, I think, and have a lot fewer UI elements. For example, drawing apps on the iPad have elements open non-stop.

TVs have real issues with it when people only watch the same channel, but they have methods to move the logos around by a few pixels or to dim that area. You wouldn't notice, but it happens - and it only works on one or two screen elements.
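
As a toy illustration of that "move the logos around a few pixels" trick (often called pixel shift or orbiting), here is a Python sketch that nudges a frame through a small cycle of offsets so a static logo doesn't always land on the same subpixels. The offsets and the schedule are made up for the example; real TVs use their own timings and also dim detected static areas.

    import numpy as np

    # Small cyclic offsets, applied every few minutes (illustrative values only).
    ORBIT_OFFSETS = [(0, 0), (1, 0), (1, 1), (0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]

    def apply_orbit(frame: np.ndarray, step: int) -> np.ndarray:
        """Return `frame` shifted by this step's orbit offset (edges wrap for simplicity)."""
        dx, dy = ORBIT_OFFSETS[step % len(ORBIT_OFFSETS)]
        return np.roll(frame, shift=(dy, dx), axis=(0, 1))

    frame = np.zeros((1080, 1920), dtype=np.uint8)
    frame[40:80, 40:200] = 255                    # a bright static "logo" near the corner
    shifted = apply_orbit(frame, step=3)          # offset (0, 1): logo moves down one row
    print("logo top-left moved to:", tuple(int(v) for v in np.argwhere(shifted == 255)[0]))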

It certainly happens with iPhones - you can see it often in mobile phone shops or repair centres!

[Photo attachments]
Retail displays are not real-world usage examples, but sure. Again, this was an issue with CRTs also. Perhaps e-paper doesn't have this issue.
 
Retail displays are not real-world usage examples, but sure. Again, this was an issue with CRTs also. Perhaps e-paper doesn't have this issue.
Yes, it was an issue with CRT as well, same for plasma. As soon as you turn it on, your "fuel" is burning away and it gets less bright. The problem with CRT, and especially plasma and OLED, is that their wear is not uniform across the whole screen. It happens individually to every pixel, while with other technologies it's uniform.

Everyone, have a look at some old tubes from a CRT projector: https://www.avsforum.com/threads/crt-wear-and-burn-analysis.293729/

The darker area in the middle of each tube is normal wear. The brighter parts around it are where no image is displayed (due to geometry, throw distance, etc., which have to be adjusted on the phosphor area with CRT projection). So this is perfectly normal and not visible during use. However, it shows exactly what happens to the phosphor of a CRT, and the same thing happens to OLED, minus the unused area around it, because you're using the whole area of the screen for the image.
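
Here is a toy model of why that per-pixel wear turns into visible burn-in while uniform wear does not: accumulate how hard each pixel has been driven, convert that to remaining brightness, and compare a static UI region against the rest of the screen. All the numbers (hours, drive levels, half-life) are placeholders for illustration, not measurements of any panel.

    import numpy as np

    HALF_LIFE_HOURS = 30_000.0   # made-up "hours at full drive to lose half the brightness"

    def remaining_luminance(full_drive_hours: np.ndarray) -> np.ndarray:
        """Per-pixel fraction of original brightness after the given full-drive-equivalent hours."""
        return 0.5 ** (full_drive_hours / HALF_LIFE_HOURS)

    drive = np.full((1080, 1920), 2_000 * 0.3)    # 2,000 h of mixed content at ~30% average drive
    drive[0:40, :] += 2_000 * 0.9                 # status-bar rows driven much harder the whole time
    lum = remaining_luminance(drive)
    print(f"status bar: {lum[:40].mean():.3f} of original, rest: {lum[40:].mean():.3f} of original")

The absolute loss in both regions is what nobody notices day to day; the few-percent difference between them is what eventually shows up as a ghost of the static element.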
 