Quite a few misconceptions about the various technologies, so here's a quick primer:

WRGB OLED - 4 white subpixels per pixel, with one going through a red filter, one through a green filter and one through a blue filter. The unfiltered white subpixel is there to boost brightness, so you get poorer colour volume (i.e. a bright blue is actually the blue subpixel plus the white subpixel). The strength of all-white subpixels is that there is no need to evaporate the red, green and blue materials separately, which can lead to mask issues (this is what knackered Samsung's OLED production: they just couldn't get yields up in 2013 to match the price drops LG achieved). The white was made from blue and yellow emitters (evo panels use blue, green and yellow).

RGB OLED (as per this screen) - the standard 3 subpixels, produced by inkjet printing in this case to avoid the masking issue. It is very likely that there are still colour filters, as OLED emitters tend to have quite broad spectra. As there is no white subpixel, the colour volume is good. Lifetime/burn-in is still a bit of an unknown.

QD-OLED - As with WRGB OLED, all the OLED subpixels are the same colour to ease production (no mask issues) - in this case 3 blue subpixels. The red and green subpixels have quantum dot layers to downconvert the blue. QDs typically have very narrow spectra, so the colours will be better/wider than regular OLED (also, no white-subpixel dilution). The blue subpixel may well need a colour filter as well. The downsides: because blue OLEDs drive all 3 colours, the efficiency of the panel is lower than RGB OLED (so this is unlikely to come to smartphones or tablets anytime soon), and blue has the shortest lifetime of the three colours, so burn-in could be an issue.

MiniLED - A regular LCD with a backlight made of very many small LEDs grouped into dimming zones. As such there is no per-pixel control (which leads to blooming), but the brightness can go high and there are no burn-in/lifetime issues.

MicroLED - very, very small LEDs, with each subpixel being an individual LED. In theory, the best of all worlds: per-pixel control, high brightness and no lifetime issues. Currently experiencing very low manufacturing yields at large screen sizes (100" and over) - 'regular' sizes are going to be even more difficult to manufacture.
 
If this monitor supported 120 Hz variable refresh rate then it would literally be the perfect monitor until MicroLED equivalents come out. I still likely wouldn't be able to justify the cost, but I would have considered it, as a good monitor is an investment for years to come. As it is, I'm likely going to go for a 42" LG C2 when they hit the market in the spring. It's a bit big and will be at more risk of actual burn-in than this one, but it'll do for a couple of years until MicroLED gets down to the 32-40" range.
 
Clearly I’m not educated at all on this subject, but I just don’t understand why the display market seems so messy and overpriced. I just don’t understand how an LG OLED 55” TV (120Hz, OLED, basically best TV picture you can buy) can be $1,000-$1,500 brand new and an iPhone 13 Pro Max (6.7” OLED, 120Hz adaptive refresh, 1000 nits sustained brightness) can be $1099 with all the other tech it has, yet when it comes to monitors (somewhere between 24”-36”), it seems no one can make an OLED, 120Hz, high PPI, at least 500 nits sustained brightness monitor at any price point.

Yes, I get it, for professional displays that need perfect color calibration and other technical requirements, I realize there’s a reason those are tens of thousands of dollars. But I just don’t understand why consumer monitors are so hard to get right. It’s pretty pathetic that you basically can’t get a monitor that matches the display quality of even the 2021 12.9” iPad Pro or 2021 MacBook Pros. Hell, there aren’t even a lot of displays sold with glossy fronts rather than the matte texture that dulls all the colors. Even the Pro Display XDR with 1000 nits sustained brightness, mini-LEDs, high PPI, and a glossy finish has its shortcomings (aside from price): no ProMotion, and the blooming that all mini-LED displays have.
 
A good description, but one point I wish to clarify:

WRGB has filters to produce the desired colour of each subpixel, blocking out the rest of the white light for the R, G and B elements. QD-OLED uses quantum dots to convert, not block, the blue on the R and G elements. Although the white OLEDs have a longer lifespan, they have to work harder, as much of their light is blocked, while with QD-OLED’s blue OLEDs all the light gets through, so they have a much easier time producing the same amount of light. The first TVs using this new format have brighter displays than LG’s WRGB panels, but seemingly by less than the blocking/converting difference would allow, so I think the QD-OLED lifespan should be OK.
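The blocking-vs-converting difference can be made concrete with a toy throughput model. All the numbers below are illustrative assumptions, not measured values for any real panel:

```python
# Toy model of how much emitted OLED light reaches the viewer.
# FILTER_PASS and QD_EFF are illustrative assumptions, not measurements.
FILTER_PASS = 0.30  # assumed fraction of white light a colour filter passes
QD_EFF = 0.90       # assumed quantum-dot blue-to-red/green conversion efficiency

def wrgb_subpixel(emitted):
    """WRGB: a colour filter blocks most of the white OLED's output."""
    return emitted * FILTER_PASS

def qd_subpixel(emitted):
    """QD-OLED: the blue light is converted, not blocked."""
    return emitted * QD_EFF

# For the same drive level, the QD subpixel delivers 3x the light here,
# which is why the white OLEDs "have to work harder" for equal brightness.
print(round(qd_subpixel(100) / wrgb_subpixel(100), 2))  # → 3.0
```

With these assumed numbers the WRGB emitter must be driven about three times harder to match a QD-OLED subpixel, which is the lifespan trade-off described above.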
 
it's perfectly acceptable to just say to yourself "this isn't for me" and move on with your life instead of saying this.
I was not criticizing, and obviously it's not for me. My puzzlement wasn't a failure to understand that it matters if you're making Hollywood movies; I just imagined that would be a very small market, but apparently that's not the case.
 
I am using the current model, the 32EP950. As someone who has used Eizo ColorEdge monitors all his life, I can say this OLED monitor is expensive but really great. The only thing I don't like is the semi anti-glare finish on the screen itself: it looks grainy when viewed up close, especially on green content.

I hope the newer version fixes the anti-glare finish issue and their software too. LG's calibration software is junk, to say the least; it's leagues below Eizo's or even Asus's PA software.
 
I wonder what its PWM frequency is. That is one scary monitor. It’s one thing to have an OLED TV 8 feet away from you, but entirely different to have this flickering monitor within arm’s reach. Ouch!

I am using the 32EP950; it has 120 Hz PWM, which is very low, but I didn't notice any flicker or eye strain. I guess the new version will be the same.
 
Probably nonexistent. Larger OLED panels typically don’t use PWM.
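For reference, PWM dimming switches the panel on and off faster than the eye can track, so average brightness is peak brightness times duty cycle. A minimal sketch of the arithmetic (the 400-nit figure is just an example value):

```python
def pwm_average_nits(peak_nits, duty_cycle):
    """Average luminance of a PWM-dimmed panel: on at peak_nits for
    duty_cycle of each period, off for the rest."""
    return peak_nits * duty_cycle

pwm_hz = 120
period_ms = 1000 / pwm_hz          # length of each on/off cycle
print(round(period_ms, 2))         # → 8.33 (ms per cycle at 120 Hz)
print(pwm_average_nits(400, 0.5))  # → 200.0 (half brightness at 50% duty)
```

The lower the PWM frequency, the longer each dark interval, which is why low-frequency PWM bothers flicker-sensitive users more than high-frequency PWM or DC dimming.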
 
It's not really on subject, but I don't understand why some professionals need that kind of color accuracy. Maybe because I am a programmer and not an artist. I can understand that you don't want your yellows looking orange and vice versa, but this level of accuracy is probably 50 times finer than anything my eyes could see.
You (and I) probably fall under the other 99% that this stuff is lost on. I'm an editor and have done plenty of color grading. No one likes when their work looks different from one viewing source to another but I can tell you that it really makes no difference to anyone but the artist(s) intimately involved with the project and maybe the producer if you're lucky enough to have one that cares. Audiences don't care about the difference between orange and yellow because they were never in the position to make that creative choice.
 
I don't really understand why the prices and specs of these are what they are when 48" 120hz OLEDs are available for a thousand bucks and are incredible panels for the money. Suddenly when you go below 48" it becomes a "monitor" and the price and availability go haywire. I understand that some of these are reference style monitors for color accurate work but that doesn't quite explain why there aren't 24-32" OLEDs for mass market consumption without a hefty pro price tag. Is it just impossible to make these with the same yield due to the pixel density on smaller panels or something? Doesn't seem to hold Apple and the other manufacturers back from making large screen OLED phones and tablets.
TV manufacturers can sell TVs for basically zero profit margin, because most of them put in software that monitors your TV viewing habits, and they sell the data. Even the few TV manufacturers that don't do this can often sell with a very small profit per unit, because the volume in the TV market is so huge. Monitors, on the other hand, need a decent profit margin baked into the cost, and a pretty hefty one at that, given the huge costs involved in ramping up a production line for something like this.

And on a technical level - this monitor is literally nothing like their consumer TV sets. This monitor uses a very different technology under the hood to deliver significantly higher color accuracy. So in short, if you have to wonder why it costs so much, you were never the target audience in the first place :)

My only real disappointment with this monitor is that it doesn't use thunderbolt. I would really prefer a Thunderbolt 3 or 4 monitor, so I can daisy chain multiple of them together, or use that thunderbolt port on the monitor for something else. It's seriously disappointing given the price
 
400 nits


For one, OLED TVs don't have anywhere near the brightness of OLED monitors.

OLED TVs claim something like 1,000 nits, but only if a small portion of the screen is bright. The moment you want a large portion of the display to be bright, it drops to around 150 nits or less. These professional monitors can stay at 400 nits even when APL is high.

The second is obviously PPI. It's easy to make a big display with low density pixels.
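The falloff from bright small highlights to dim full-field white comes from the panel's automatic brightness limiter (ABL), which enforces a roughly fixed power budget. A toy sketch of that behaviour (the specific numbers are assumptions, not measurements of any real panel):

```python
def abl_nits(window, peak_nits=1000, peak_window=0.10):
    """Toy ABL model: luminance is capped so that nits x lit screen
    fraction never exceeds the budget set by the brightest small window.
    window is the lit fraction of the screen (0-1]."""
    budget = peak_nits * peak_window       # nits x screen-fraction budget
    return min(peak_nits, budget / window)

print(round(abl_nits(0.05)))  # → 1000 (small highlight hits full peak)
print(round(abl_nits(1.0)))   # → 100  (full-field white is heavily limited)
```

Under this assumed budget, a 5% highlight window gets the full peak while a full white screen is cut to a tenth of it, which matches the shape (if not the exact numbers) of the APL behaviour described above.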

Hm, didn't realize 100% values were so low on the LGs. I actually did not notice it in practice on the one I tested over Christmas - it stays plenty bright in HDR mode (highlights eye-wateringly so) but I'll have to keep an eye out on how it does on some other more demanding content. In practice though it presents itself as one of the best panels I've ever seen. I was blown away that I could play competitive Halo in HDR at 120hz+variable refresh with zero tearing and incredibly quick response times. Made me want to buy one immediately.



I actually am *somewhat* the target audience as I work in the film industry, but I'd also argue that 90% of us don't need crazy bonkers accuracy and reference modes save for those in color correction at the final step before delivery of higher end pieces. So many big jobs are finished on a MacBook Pro display, checked on a couple smartphones, and that's that.

I guess my beef is - why can't we make small OLED TVs at all even? The smallest out there is 48" - I think many regular consumers and prosumers out there would buy the crap out of a 27-32" regular 8 bit OLED panel in the same WRGB layout as the bigger LGs. It's just odd we've got $3000 miniLED options and now $3000 OLED professional options, but nothing in the midrange. Just edge lit IPS monitors and TVs. There's even great OLED laptop displays, but if you want one that sits on your desk it's going to cost more than the whole laptop that included the OLED display in the first place.
 
I was hoping the smallest model would come in around a grand. Too bad. Well, my need for color accuracy isn't great enough to lay down $2K for it, but it would have been nice to have a completely Apple-branded Mini setup.
 

(NB: I know this isn't an Apple display, but the story is most likely here because of the current speculation over new Apple displays)

I think this is the root cause of some of the comments here - the Mac market tends to assume that you're either updating your Facebook page or (at least, in dreams) colour grading the latest Pixar blockbuster. There's nobody in between who just needs a decent display that "just works" with their Mac, has a macOS-friendly resolution, matches the design, and gives them lots of screen real estate for coding, working on documents, spreadsheets and less-than-premier-league graphics work.

If the Pro Display XDR meets your needs, or at least offers an attractive compromise versus the $20,000 reference display you might otherwise need, that's great. The problem is that, currently, if you're not in that league, Apple have nothing to offer - and most of the third-party choice is limited to not-quite-4k "UHD" displays and a bit of a lottery as to whether they will work smoothly with macOS. If you want one as a dock for your MBP, whether it has enough power delivery to actually power anything bigger than an Air is also an issue with third parties.

Sadly, for the last decade, Apple has had one job to do on displays - take the panels from the 4K and 5K iMacs, put them in matching cases and sell them at a believable price (i.e. less than the corresponding iMacs). Instead, they decided to outsource that to LG and, while the results were quite practical, they didn't "look the part"... they were also a bit late to the party, since the 5K iMac had already been around for years.

Nobody is asking Apple to make a cheap display to compete with that $600 Huawei - but if their idea of affordable is "half the price of the Pro XDR" they're holding it wrong.

The 27" LED Cinema (and the later Thunderbolt version) are the case in point - they weren't cheap but, when launched, they matched the design of the current Macs and had 1440p screens (which weren't cheap at the time), so they were arguably worth the money. The problem is that, years later, when cheap 1440p displays - often with comparable panels - were available, Apple still wanted the original premium prices for an old product that didn't even match the current Macs. I don't know what it is with Apple and pricing - but the fact that the TB display always cost the same as the non-TB LED Cinema suggests that it follows too high a logic for our puny brains.
 
I was looking at monitors myself but was looking at 32 inch 4K. I guess if I ever get into professional photography or film making, this might be something I would need but outside that for general everyday use, all I want is a decent USB C 32 Inch Monitor with some ports in it for peripherals. I think that is achievable under 4K.
 
I think this is very much to be tested in real life. As you mention, the WOLED panel is limited in that each subpixel generates white (i.e. all the colours) and a lot of this light is blocked (i.e. lost) by the RGB colour filters, which is why the white subpixel is needed to drive the brightness at higher levels. This is also why WOLED panels are not used for mobile devices: the efficiency is too poor.
QD-OLED, on the other hand, does not block any light (and I believe the QD colour-conversion layers are quite efficient), and the panels are top-emission, which allows a higher aperture ratio (i.e. each subpixel is physically larger, so it can emit more light). However, because they are based on blue OLEDs, the efficiency is lower and the lifetime is shorter. I heard rumors that the QD-OLED panels could only reach 200-300 nits when demonstrated behind closed doors at CES a couple of years ago, due to burn-in risk. Perhaps these issues have been overcome. Either way, I await testing of the various technologies by channels such as HDTVTest to see what the panels are capable of. Vincent has already put out a video exposing why the QD-OLED vs WOLED vs MiniLED demonstration at CES was misleading.
 
Does it really? I have the same one, and it only keeps the glare off the borders of the screen, lol... just 1-2 inches and that's it; it's not magical or anything.
Maybe it depends where the glare's coming from. I have 2 Atomos monitors for filming. Sometimes they're unnecessary and other times they're life-savers, even when filming indoors.
 
Real world question here: I print pictures, I don't keep them on screens forever.
I've tried my best to calibrate the OLED display in my Windows laptop but it never prints like my Mac Mini attached to my old IPS Dell U3011 display. Are these new display technologies essentially useless if I'm printing?

My old regular print guy used to use a CRT before he passed last year.
 
4K at 32 inch is not a suitable replacement for anything Apple makes, much less the XDR. It's an awful pixel density for macOS in general, too low res for 2:1 high-DPI mode, and yet also too high res for low-DPI mode.
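The arithmetic behind that: macOS's crisp 2x HiDPI mode was designed around roughly 218 PPI panels, while 32-inch 4K lands near 138 PPI, so you get either a tiny UI at native resolution or a soft fractional scale. A quick check:

```python
import math

def ppi(h_px, v_px, diagonal_inches):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(h_px, v_px) / diagonal_inches

print(round(ppi(3840, 2160, 32)))  # 32" 4K → 138 PPI (awkward for macOS)
print(round(ppi(5120, 2880, 27)))  # 27" 5K → 218 PPI (native 2x "Retina")
```

At 138 PPI, a clean 2:1 scale gives a workspace that "looks like" 1920x1080, which is very little room on a 32-inch screen, hence the complaint.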
 
The big difference hangs on the word you used: "artist(s)". If a single artist works on a project, the audience won't know whether the colours are intended or not. But if there are multiple artists, a colour switching between orange and yellow because of differences between the artists' monitors will be noticed - much of the audience may not be able to tell exactly what is wrong, just that something is wrong, unless it is really bad.

It can also be critical for companies with specific colours in their logo or products. For example, if food labels look wrong next to other products in the same range on the shop shelf, customers will take it as a sign the food may have a problem.
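Mismatches like this are usually quantified as a colour difference, ΔE. A minimal sketch using the simple CIE76 formula (the Lab values below are made up purely for illustration):

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB space.
    Rough rule of thumb: dE around 1 is barely noticeable, while
    dE above roughly 3-5 is an obvious mismatch side by side."""
    return math.dist(lab1, lab2)

brand_orange = (60.0, 45.0, 55.0)  # hypothetical target L*a*b* colour
on_screen = (60.0, 42.0, 51.0)     # hypothetical miscalibrated rendering
print(delta_e76(brand_orange, on_screen))  # → 5.0 (clearly visible)
```

Later formulas (CIE94, CIEDE2000) weight the terms to better match human perception, but the Euclidean version above is enough to show why two "close" Lab values can still be an obvious mismatch on a shelf.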
 