No need for OLED unless the technique improved a lot, miniLED is more than capable for the current form factors.
What are the differences between OLED and miniLED? Asking as a layman.


Apple didn't move to OLED for a long time because these displays degrade faster than LCD. So why is Apple now considering the switch?
I think OLEDs last longer now.
 
I have an LG C2 OLED next to my Pro Display XDR and a 16" miniLED MacBook Pro, and the OLED absolutely destroys them both in terms of picture quality when watching films and TV.

If burn-in is solved then I can't wait for OLED.

I agree, but what about brightness? Normal consumer-level OLED TVs and displays usually only go up to 250–600 nits.
 
I would say it's the opposite. OLED is far superior to MiniLED in HDR as it has effectively millions of dimming zones vs. 2,500 on the MiniLED, making for a much more impressive contrast ratio. Yes, it isn't as bright, but it doesn't need to hit the same brightness due to the higher perceived contrast. This is apart from the much better color volume OLED provides, especially QD-OLED.

I have an OLED monitor, an OLED TV and a MiniLED iPad, and in the dark the OLED just crushes the MiniLED.
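To put rough numbers on the zone comparison (quick arithmetic of my own, using a 4K panel purely as an example):
Code:
# Quick arithmetic on "millions of zones vs ~2,500", using a 4K panel
# as the example (illustrative figures, not any specific product's spec).
pixels_4k = 3840 * 2160                 # 8,294,400 pixels

oled_zones = pixels_4k                  # every OLED pixel is its own light source
miniled_zones = 2500                    # the figure quoted above

pixels_per_miniled_zone = pixels_4k / miniled_zones
print(pixels_per_miniled_zone)          # ~3,318 pixels share one backlight zone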

What about in the sun, under a tree, drawing? Can you even see that 250-nit OLED display?
 
Micro LED - each subpixel (R, G or B) is its own micro LED.
Mini LED - an RGB LCD lattice covers an array of mini white LEDs, each one lights a single set of subpixels (one white LED lights one set of RGB LCD pixels).
Sony and Sharp have only JUST released Mini LED TV sets a few months ago. The tech is NEW. Or so I thought...

So I found this article very confusing, as per this quote:
"demand for mini-LED displays for use in consumer electronic devices is decreasing"
How could that be possible, I thought.
And "Apple doesn't sell a single product with mini LED," I thought.

So I went reading and realised that they brand the MacBook Pro as having a mini-LED display, in the States. I had not noticed this.

I've now realised that the marketing types (maybe😂, could be wrong) could not wait to use the next new high-tech term to sell products and jumped the gun.

A panel with zones, where each zone covers more than one subset of pixels, is not the original definition of a mini-LED display, but I guess that's what it has come to mean. Here in Japan, we don't use the term that way (I don't think).

Can anyone else comment on this? I have a TV with a lot of zones, but I would not term it a mini-LED TV.
 
I agree, but what about brightness? Normal consumer-level OLED TVs and displays usually only go up to 250–600 nits.
My LG C2 goes up to 800 nits, so it's not dark by any means, and QD-OLEDs are hitting 1500 nits. Plus the iPhone 14 Pro's OLED screen can hit 2000 nits.

I don't think Apple would switch if they didn't think that OLED's two big problems, burn-in and low brightness, were mostly solved.
 
I would say it's the opposite. OLED is far superior to MiniLED in HDR as it has effectively millions of dimming zones vs. 2,500 on the MiniLED, making for a much more impressive contrast ratio. Yes, it isn't as bright, but it doesn't need to hit the same brightness due to the higher perceived contrast. This is apart from the much better color volume OLED provides, especially QD-OLED.

I have an OLED monitor, an OLED TV and a MiniLED iPad, and in the dark the OLED just crushes the MiniLED.

In theory, yes. But actually producing a large display that has the qualities (and the volume!) Apple needs is going to be very challenging. Until a suitable technology (micro-LED) is sufficiently matured and production-ready, the mini-LED local dimming workaround is here to stay.
 
Micro LED - each subpixel (R, G or B) is its own micro LED.
Mini LED - an RGB LCD lattice covers an array of mini white LEDs, each one lights a single set of subpixels (one white LED lights one set of RGB LCD pixels).
Sony and Sharp have only JUST released Mini LED TV sets a few months ago. The tech is NEW. Or so I thought...

So I found this article very confusing, as per this quote:
"demand for mini-LED displays for use in consumer electronic devices is decreasing"
How could that be possible, I thought.
And "Apple doesn't sell a single product with mini LED," I thought.

So I went reading and realised that they brand the MacBook Pro as having a mini-LED display, in the States. I had not noticed this.

I've now realised that the marketing types (maybe😂, could be wrong) could not wait to use the next new high-tech term to sell products and jumped the gun.

A panel with zones, where each zone covers more than one subset of pixels, is not the original definition of a mini-LED display, but I guess that's what it has come to mean. Here in Japan, we don't use the term that way (I don't think).

Can anyone else comment on this? I have a TV with a lot of zones, but I would not term it a mini-LED TV.

Apple has been using miniLED backlights for a couple of years now. A miniLED display uses small LEDs as a source of white light, with a traditional LCD panel in front of it.

The technology is indeed new in the sense that it only recently matured enough to be used in actual production. And it’s very expensive, which is why not many bother. Most users don’t care about color accuracy or brightness and will find a cheaper OLED panel just as impressive for their media consumption. So only companies who build displays targeted at artists bother with this technology.
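For anyone who wants a mental model of how the local-dimming part of that works, here is a toy sketch in Python (my own simplified model, not Apple's actual backlight controller): each zone's white LED is driven to roughly the brightest pixel it sits behind, and the LCD in front then attenuates per pixel. Because each zone covers many pixels (thousands on a real panel), a dark pixel next to a bright one still gets a lit backlight, which is where blooming comes from.
Code:
import numpy as np

# Toy local-dimming model (illustrative only, not Apple's controller logic):
# a 50x50 grid of backlight zones (2,500 total) behind a 1000x1500 LCD.
H, W = 1000, 1500
ZONES_Y, ZONES_X = 50, 50
zone_h, zone_w = H // ZONES_Y, W // ZONES_X       # 20x30 pixels per zone

frame = np.random.rand(H, W)                      # target image, 0 = black, 1 = white

# Drive each zone's white LED to the brightest pixel it must light.
backlight = np.zeros((ZONES_Y, ZONES_X))
for zy in range(ZONES_Y):
    for zx in range(ZONES_X):
        patch = frame[zy*zone_h:(zy+1)*zone_h, zx*zone_w:(zx+1)*zone_w]
        backlight[zy, zx] = patch.max()

# Expand the zone levels back to pixel resolution and let the LCD attenuate.
# The LCD can't block light perfectly (about 1000:1 native contrast assumed
# here), so a dark pixel in a lit zone still leaks light -- that's the blooming.
per_pixel_backlight = np.kron(backlight, np.ones((zone_h, zone_w)))
lcd = np.clip(frame / np.maximum(per_pixel_backlight, 1e-6), 1/1000, 1.0)
displayed = lcd * per_pixel_backlight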
 
Micro LED - each subpixel (R, G or B) is its own micro LED.
Mini LED - an RGB LCD lattice covers an array of mini white LEDs, each one lights a single set of subpixels (one white LED lights one set of RGB LCD pixels).
For the OLED side:

OLED - each subpixel (R, G, or B) is its own OLED; the different colors are chemically different and can wear out at different rates.
WOLED - same as OLED but has an additional white OLED to make the image brighter, which also means that bright colors lose saturation.
QD-OLED - replaces the W, R, G, and B OLEDs with just 4 identical "white" OLEDs (I think it's actually blue under the hood), and then uses quantum dots to convert that light and set the color of each subpixel.

QD-OLED is cool because if you wanted to show a 400-nit blue on a normal OLED you'd need to run the blue subpixel at 400 nits, which burns it out quickly, whereas on a QD-OLED you set the 4 "white" OLEDs to 100 nits each and then use quantum dots to make it blue. This means you can get much brighter while only putting 1/4 of the load on your OLEDs. Also, you don't have the issue of the blue OLEDs burning out faster than the red and green ones and destroying your color accuracy.
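Putting that load-sharing argument into numbers (my own back-of-the-envelope math, assuming output scales linearly with drive level and splits evenly across the emitters, which is a simplification):
Code:
# Per-emitter load for a 400-nit saturated blue (back-of-the-envelope,
# assuming brightness scales linearly with drive -- not panel datasheet math).
target_nits = 400

# Classic RGB OLED: only the blue subpixel can emit blue, so it takes
# the whole load by itself.
rgb_oled_blue_load = target_nits                    # 400 nits on one emitter

# QD-OLED as described above: several identical emitters feed quantum dots
# that set the colour, so the load is shared between them.
emitters_per_pixel = 4                              # the "4 identical OLEDs" above
qd_oled_load = target_nits / emitters_per_pixel     # 100 nits per emitter

print(rgb_oled_blue_load, qd_oled_load)             # 400 vs 100.0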
 
I use my iPad Pro 12.9 with mini LED often in dark conditions. The blooming is out of control. Until that is addressed, it's almost unacceptable in dimly lit conditions for such an expensive product.
 
My LG C2 goes up to 800 nits, so it's not dark by any means, and QD-OLEDs are hitting 1500 nits. Plus the iPhone 14 Pro's OLED screen can hit 2000 nits.

I don't think Apple would switch if they didn't think that OLED's two big problems, burn-in and low brightness, were mostly solved.

Brightness is difficult. It all depends on how large an area of your TV we are talking about. The LG C2 will only give you 800 nits peak brightness when 2% or less of the screen is pushing that kind of peak brightness. Once you move to 25%, it drops to 400 nits; at 50%, it drops to 266 nits, and so on. For SDR, the overall brightness is limited to 422 nits.

How relevant this is will vary greatly. For SDR, the industry standard for colour calibration is to aim for 100 nits / 100 cd/m2, but this might not be sufficient if you are watching your TV in a bright room. For HDR, it's a different story, as the HDR standard references 10 000 nits for peak brightness. In theory, your HDR experience can peak at up to 10 000 nits. There is obviously no TV on the market today that can achieve this, so various tricks are applied to limit peak brightness for HDR to stay within the capabilities of your TV. But the HDR spec itself will continue pushing contrast all the way up to 10 000 nits if you have a TV capable of going that far.
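For the curious, those "various tricks" are tone mapping: the TV passes the mastered signal through unchanged up to a knee point, then compresses everything above it into whatever headroom the panel actually has. A very rough sketch of the idea (my own simplified curve, not any particular manufacturer's algorithm):
Code:
# Simplified HDR tone mapping: track the signal up to a knee point, then
# linearly compress the rest of the 10,000-nit range into the panel's
# remaining headroom (illustrative curve, not any vendor's algorithm).
def tone_map(scene_nits, panel_peak=800.0, knee=0.75):
    knee_nits = panel_peak * knee                    # faithful up to ~600 nits here
    if scene_nits <= knee_nits:
        return scene_nits
    headroom = panel_peak - knee_nits                # what the panel has left
    max_excess = 10_000.0 - knee_nits                # what the spec may still ask for
    return knee_nits + headroom * (scene_nits - knee_nits) / max_excess

for level in (100, 600, 1_000, 4_000, 10_000):
    print(level, round(tone_map(level), 1))          # 100, 600, 608.5, 672.3, 800.0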


Personally, I prefer OLED over mini-LED. Perfect black is more important to our perception of contrast than peak brightness. And it's not like everything is HDR even in 2022, and for SDR content peak brightness is not relevant beyond battling lousy lighting conditions in your room.

The main difference I noticed when watching HDR on a mini-LED TV compared to an OLED TV is that the eye-searing bright highlights get even more eye-searing. It's tough to view details in peak brightness for HDR content because pushing 1000+ nits becomes too bright to be viewed comfortably. Whereas details in dark areas can be viewed with great comfort, so having perfect black levels is much more impactful overall compared to having peak brightness extending beyond 800 nits, in my experience.


mini-LED is much better at providing decent black levels compared to a regular LED backlight, but it's far from perfect. You get a ton of halo/blooming due to the mini-LED zones being too large. OLED is infinitely better as it can turn off each individual pixel to achieve perfect black levels with no haloing, blooming or bleeding. micro-LED is going to be the best of both worlds, providing the peak brightness of LED and the black levels of OLED without the drawbacks of burn-in/image retention etc.

But micro-LED is still some years into the future. As of today, we have mini-LED and OLED, and to me, OLED beats mini-LED without any questions.
 
Regardless of all the benefits of OLED, don't use it until Tim Apple solves the flickering PWM OLEDs on iPhones.
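For anyone unfamiliar with the complaint: OLED phones typically dim by rapidly pulsing the panel and varying the duty cycle (PWM) instead of simply lowering the drive level, so at low brightness the screen is off for most of each cycle. A toy illustration of the relationship (the ~480 Hz figure is a commonly reported value for recent OLED iPhones, used here only as an assumption):
Code:
# Toy PWM dimming model: perceived brightness roughly tracks the duty cycle,
# but the panel strobes the whole time. The 480 Hz figure is a commonly
# reported value for recent OLED iPhones, used purely as an assumption.
pwm_hz = 480
period_ms = 1000 / pwm_hz                     # ~2.08 ms per on/off cycle

def pwm_state(duty_cycle, peak_nits=1000):
    """Average brightness and per-cycle on-time for a given duty cycle."""
    return peak_nits * duty_cycle, period_ms * duty_cycle

for duty in (1.0, 0.5, 0.1):
    nits, on_ms = pwm_state(duty)
    print(f"{duty:.0%} duty -> ~{nits:.0f} nits, lit {on_ms:.2f} ms of every {period_ms:.2f} ms")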
 
Absolutely not. For example, the OLED displays in the iPhone 14 Pro have 2000 nits peak brightness.
Phone displays use a different OLED technology than monitor and TV OLEDs.

TVs and monitors use either WOLED or the newer QD-OLED technology. With WOLED, all of the pixels are white, but they have RGB filters overlaid on them to produce colors, and these filters ultimately diminish the maximum brightness output. Additionally, since TVs and displays have a longer service-life expectation than a phone, there's consideration given to pixel lifespan: the more energy you push through an OLED pixel to make it brighter, the more it shortens the lifespan.

Phone displays are made using true RGB OLED technology, in which each pixel glows its respective RGB color with no filters. Because of the lack of filters they can reach a higher peak brightness, but this technology is expensive and difficult to manufacture at larger display sizes.
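One way to see why those filters matter, using idealized numbers I'm assuming purely for illustration (a colour filter passing roughly a third of a white emitter's light; real panels differ):
Code:
# Idealized comparison of how much emitted light reaches the viewer.
# Transmission figures are assumptions for illustration, not measured data.
FILTERED_TRANSMISSION = 1 / 3      # an R/G/B filter passes roughly its slice
UNFILTERED_TRANSMISSION = 1.0      # self-emissive colour, nothing in the way

emitted_nits = 900                 # what the OLED stack itself puts out

woled_visible = emitted_nits * FILTERED_TRANSMISSION        # ~300 nits on screen
rgb_oled_visible = emitted_nits * UNFILTERED_TRANSMISSION   # 900 nits on screen

# To match the unfiltered panel, the filtered one has to drive its emitters
# roughly 3x harder -- the lifespan trade-off described above. (Real WOLED
# TVs add an unfiltered white subpixel to claw some brightness back.)
print(round(woled_visible), rgb_oled_visible)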

QD-OLED is a hybrid approach using technical magic I won't get into here, but it balances the performance of true RGB OLED with the cost-effectiveness of WOLED.

There are monitors currently on the market made with true RGB OLED technology. LG has a 32" 4K UltraFine Display that came out last year based on a true RGB OLED panel but it's not cheap - it costs $3999 USD.
 
Can’t wait for the iPads to start getting OLED screens. That’s when I’ll finally upgrade my 11” Pro.

I have an LG OLED TV and it just blows my laptop and monitor out of the water when it comes to display quality. With HDR content I prefer OLED since it doesn’t need to sear my retinas to achieve sufficient contrast. SDR is no contest.
 
OLED has improved a lot. Especially QD-OLED, which Samsung says won't burn in and also can hit 1000+ nits.

They've already started making QD-OLED computer monitors, which they wouldn't do if burn-in was still a huge problem.
There are reports of burn-in on their S95B TV with a QD-OLED screen, and that TV has been out for less than a year. So I would not take Samsung's word at face value.
 
I have an LG C2 OLED next to my Pro Display XDR and a 16" miniLED MacBook Pro, and the OLED absolutely destroys them both in terms of picture quality when watching films and TV.

If burn-in is solved then I can't wait for OLED.
HDR800 is nowhere close to the HDR1600 of MiniLED.

Apple will not regress to make room for OLED. Until OLED can match the specs of MiniLED, Apple ain't budging.
 
OLED will be used on consumer Apple products, mini-LED on prosumer ones. Apple needs HDR across the board, and sufficiently bright OLED is a "cheap" way to get there. But it won't replace miniLED and local dimming any time soon. The next evolutionary step is then micro-LED, which will replace both.

Micro-LED is years away from coming to phones. It's too expensive, and they can't make it small enough yet. The only products they have now are large TVs.
 