I don't know why most people obsess about brightness. I do not enjoy a flashlight shining into my eyes. I prefer looking at a monitor that looks like a photograph. Anything brighter is unnatural, our eyes did not evolve for that, and it taxes my visual cortex to the point that I cannot concentrate. I find OLED monitors much more comfortable, and I am using an Asus ProArt PQ22UC OLED because it is very accurate at low brightness (after calibration). An e-paper monitor, like the Dasung Paperlike, is even more comfortable. I can concentrate so much better on it; it is like reading a newspaper.

I cannot stand bright monitors; they make me feel like my eyes and brain are being tortured.

I hope color e-ink technology comes to MacBooks and other laptops one day:
I don't think it will happen while Al Bundy is Apple's CEO, but one can only hope.
 
Having owned a 4K laptop, I can say without hesitation that 4K resolution is just too much for 15" (or 16") displays. It makes no sense other than having 4K as a marketing ploy.
Having a 4K 15'' laptop, I can say the higher resolution is much more pleasant to look at than FHD. I'd gladly take that over better battery life; I'm that type of person.
For me the difference is very obvious: it's four times the pixels (twice along each axis). Also, 4K content looks gorgeous.
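For scale, a quick sanity check on the numbers (UHD "4K" versus FHD):

```python
# Pixel counts: UHD "4K" is twice FHD along each axis, so four times overall.
fhd = 1920 * 1080   # 2,073,600 pixels
uhd = 3840 * 2160   # 8,294,400 pixels

print(uhd / fhd)    # 4.0
```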
 

I don't mean to question your experience, but is it possible that your 4K laptop display looks better than other laptop displays simply because it's a much better-quality panel? Everything I've read suggests that for a laptop, Apple's Retina display pretty much tops out the PPI the human eye can discern in normal laptop use (i.e., a foot or two from your face). That's not to say there aren't better-quality panels than Apple's Retina, but a PPI much over 220 or so is lost on the human eye.
 

The point of higher max brightness is to let the device work reasonably well in environments where the ambient brightness is far higher, e.g., when you're sitting close to a window and the sun outside is shining brightly.

Sure, you can get away with turning off all lights and work in the darkness with the screen at very low brightness, but... that's not for me, and many others. Not everyone tends to stay cooped up in complete darkness for 40 hours a week. I and probably a fair number of folks like our environments well-lit, and sometimes close to sunlight.

And for us high-brightness folks, OLED "accuracy" in general (not the Asus ProArt monitor in particular) simply sucks. The tech is very promising, but companies cut too many corners for it to reach the intended performance level.

The ProArt is hopefully not cutting too many corners, considering its price point. I don't know; I haven't seen one (not the PQ22UC, at least), but all of the other OLEDs I have seen do. Then again, there are reports of it getting pretty inaccurate at higher brightness, which is par for the course for other OLEDs, so I'm not surprised.

As an aside, we have a Sony A8F 65" downstairs to watch movies on. It is good for that. Brightness is also pretty decent, and in the dark, it is simply stunning. But my main work monitor is still an LG with 500 nits brightness because it sits right next to a bright sunny window until 4PM every single day. After that, yeah, its lowest brightness sucks compared to the Sony OLED TV, but again, I don't work in complete darkness.
 

I agree with what you say in general. Fundamentally, the problem with emissive displays is that when you sit next to a window, brightness changes as the sun goes up and down, and your pupils dilate or constrict accordingly, letting more or less light in. When that happens, you need to keep adjusting your monitor's brightness to keep the light flux into your retina constant. Most people do not do that. This causes eye strain and damage to the retina.

Some EIZO monitors do this automatically. Reflective displays like e-paper (such as the Dasung Paperlike) do not suffer from this issue: they are exactly as bright as their environment. (But they have refresh-rate and ghosting issues.)

With OLED, I can increase brightness, calibrate accuracy (I have an X-Rite i1Display Pro calibrator), and use macOS software (like Gamma Control or Screen Tint) to darken the white point so it appears as white as a piece of paper on my desk, while keeping other colors intact. This makes the OLED monitor look as natural and comfortable as a photograph, and I do not have to sit in the dark.
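Matching a monitor's white to paper on the desk is straightforward photometry: a matte (Lambertian) surface under illuminance E has luminance L = ρE/π. A minimal sketch, assuming a reflectance of 0.9 for typical paper:

```python
from math import pi

def paper_white_nits(ambient_lux, reflectance=0.9):
    """Luminance (cd/m^2, i.e. nits) of a matte white surface under
    `ambient_lux` of illumination, modeled as a Lambertian reflector:
    L = reflectance * E / pi."""
    return reflectance * ambient_lux / pi

print(round(paper_white_nits(500)))   # ~143 nits under typical office lighting
```

So indoors, a paper-white match implies a target in the low hundreds of nits, not the panel's maximum brightness.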

If I try that trick with the Mac's internal LCD, increasing brightness makes the black background look bright gray. Everything looks washed out, so it does not work.
 
Well, my monitor actually has an ambient light sensor and adjusts its brightness automatically. I don't have to touch it at all throughout the day. It's not just EIZO; quite a number of monitors do this now.

And when I'm not connected to the external monitor, my MacBook also adjusts its brightness automatically. I don't have to touch it either.

So it simply "just works" for me. And that's enough while I wait for next-gen OLED and Mini LED to duke it out.
 
The excuses found here for Apple not offering a 4K and/or OLED display in MacBooks are both entertaining and sad. And I would bet my life that the same people making these excuses would be first in line to buy one if Apple started selling them.

I've personally never seen or heard of anyone with a 15-16" MBP not using the highest scaling mode, "looks like 1920x1200" or "2048x1280". A ~4K display would finally allow true @2x at this level of scaling, which is what everyone wants.
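For context, macOS renders a scaled HiDPI mode into an offscreen buffer at 2x the "looks like" size and then downsamples that to the panel's native resolution. A sketch of the arithmetic for the 16" MBP's 3072x1920 panel:

```python
def backbuffer(looks_like):
    """macOS renders a scaled HiDPI mode at 2x the logical ("looks like")
    size, then downsamples the result to the panel's native resolution."""
    w, h = looks_like
    return (2 * w, 2 * h)

native_w = 3072   # 16" MacBook Pro panel width
for mode in [(1536, 960), (1792, 1120), (2048, 1280)]:
    bb_w, bb_h = backbuffer(mode)
    print(mode, "->", (bb_w, bb_h), "downsample factor", round(native_w / bb_w, 3))
```

Only "looks like 1536x960" is an exact @2x on this panel (factor 1.0); the popular 1920x1200 mode would need a 3840x2400 panel to avoid the fractional downsample.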

Complaining about loss of colour accuracy at high brightness misses the point: colour-critical work is always done at around 100 nits under controlled lighting. And calibration only applies to one brightness level on any screen type, so increasing brightness on an LCD has the same issue. Maybe it's worse on OLED, but it's a totally moot point. Nobody does colour-critical work on a laptop in the sun or in a very bright environment. Nobody.

OLED can easily exceed the 500 nits MacBooks currently have. Look at the brightness iPhones are pumping out with their tiny little batteries. Yes, power consumption is worse at high brightness, but it's also better at low brightness. And the whole battery-life argument is total crap anyway: try actually pushing your MacBook's hardware and see how many hours the battery lasts (hint: it's 2 at best). As if anyone would care that their battery lasted 20 minutes less for a 4K OLED screen.

The only point of real concern is burn-in, but even that is mostly unfounded, and there are many ways to mitigate it. Modern OLED TVs have proven to hold up extremely well over the past few years. Maybe Apple is waiting to see how the OLEDs in other laptops hold up first. And people worried about menu-bar and Dock burn-in could easily mitigate it by simply hiding them. Why anyone keeps either of these in view at all times regardless of screen type, especially on laptops, is beyond me, tbh.

Regarding PWM, Apple does not and has never cared about this; they have made that very clear. Apple screens have always used PWM, and they even moved iPhones to OLED, where it can be more apparent. They don't care about the six people getting headaches.
 

Some time back, I watched a Linus Tech Tips video where he mentioned that the most popular laptop on Amazon was the older MBA.

It got me thinking that there may be such a thing as over-serving the market: the user may not really need the power that the latest hardware provides.

Yes, one could argue that people getting the 16” MBP are power users who use them for intensive tasks and will need every single ounce of performance the hardware can squeeze out, but also remember that these are laptops, to be carried around, and so from Apple’s perspective, factors such as portability and battery life still matter.

Using a 4K display means higher battery usage and taxing the GPU more, and I still believe that Apple ultimately settled on the resolution they did because they believed that was what allowed for the best overall user experience. You still have a fairly sharp and clear display, while enjoying excellent battery life and great performance while at it.

So not offering a 4K display doesn’t make Apple any worse or any more “cheapskate” than the competition. It just means that Apple chose to prioritise certain features more over others compared to the other PC manufacturers.
 
I don't know why most people obsess about brightness. I do not enjoy a flashlight shining into my eyes. I prefer looking at a monitor that looks like a photograph.

Because most people use these displays in daylight (or maybe even outside). It is not about shining a flashlight into your eyes; it is about matching the ambient brightness. If your surroundings are bright, you need a brighter monitor to maintain the color and contrast. But I am sure you know all this, given that you are using a pro-level monitor that costs more than a specced-out 16" MBP.


I hope color e-ink technology comes to Macbooks and other laptops one day
I don't think it will happen while Al Bundy is Apple's CEO, but one can only hope.

E-ink is nowhere near ready technologically for use in a general-purpose display. Dasung's tech is impressive, but it still severely lacks color accuracy and refresh rate compared to standard LCDs. Not sure what Tim Cook has to do with this. It's obvious that no laptop manufacturer would use this tech; it would be commercial suicide.
With OLED, I can increase brightness, calibrate accuracy (I have an X-Rite i1Display Pro calibrator), and use macOS software (like Gamma Control or Screen Tint) to darken the white point so it appears as white as a piece of paper on my desk, while keeping other colors intact. This makes the OLED monitor look as natural and comfortable as a photograph, and I do not have to sit in the dark.

And you are right: all these properties make OLED a great tech. But there is no such thing as a free lunch; this is about finding a balance. Your monitor represents the absolute pinnacle of current OLED technology, and I am sure it is both stunning and impressive. But again, it costs more than an entire MacBook Pro, and it is a standalone monitor. How do you know that putting equivalent technology in a laptop is feasible at all?


I don't think there is any disagreement here. The question is whether there is a practical difference in quality between 4K as generally used in the 15" laptop world (e.g., 3840 x 2400 in the new Dell XPS) and the near-4K Apple uses (3072 x 1920).
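The density gap is easy to quantify. A sketch, assuming a 15.6" diagonal for the Dell-class 4K panel and 16.0" for the MacBook Pro 16:

```python
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from a panel's resolution and diagonal size."""
    return hypot(width_px, height_px) / diagonal_inches

print(round(ppi(3840, 2400, 15.6)))   # ~290 (Dell XPS-class 4K panel)
print(round(ppi(3072, 1920, 16.0)))   # ~226 (MacBook Pro 16)
```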
 
Last edited:

Would there be any change in perceivable image quality, however? Don't get me wrong, I am not claiming that the lower-res panel is somehow "better"; it's about looking at trade-offs. Everything else being equal, of course I'd take a true 4K panel. The big question is: given current display technology, could Apple offer a 4K panel that would (a) enhance the image quality and (b) not sacrifice anything else (like performance or battery life)?

OLED can easily exceed the 500 nits MacBooks currently have. Look at the brightness iPhones are pumping out with their tiny little batteries.

That is a 6" display compared to a 15" one. What laptop OLED panels do you know of that can deliver this kind of brightness?


We are discussing the pros and cons of various existing display tech choices. If you have something constructive to contribute (and you do), I am sure everyone would appreciate it; there is no need for that kind of attitude, however. And yes, I would take a good-quality OLED over an LCD any time of the day. This is not about personal preferences; this is about why Apple uses what they use. Of course I would prefer an OLED panel. I care more about deep blacks and contrast than about color accuracy (I am not a digital artist). And I think we can all agree that traditional LCD tech is a dead end at this point. But looking at the OLED panels found in contemporary laptops, I understand why Apple is not hurrying: the advantages do not outweigh the disadvantages. MicroLEDs will hopefully change this.
 
Anything that relies on pixel-level precision would benefit from real 2x scaling, as would the ability to edit and view 4K content with 1:1 pixel mapping. Do you really believe 4K would affect performance on any level that matters? Even Intel's integrated GPUs have supported 4K for years now. And I've already explained why the whole battery-life argument is a farce.

There are strong rumours that Apple is going to implement mini-LED backlighting (which will also consume more power, by the way) in their products, which in my opinion brings more negatives than positives compared to OLED. Like I said, I think the only real concern Apple might have about OLED in a Mac is the uncertainty of burn-in over years of use; come to think of it, that would explain why they are going with mini-LED, at least for now.

Off the top of my head, I believe Razer offers a 4K 600-nit OLED display option on their laptops. Regarding iPhone screen size compared to 15", remember that the resolution of the largest iPhone screen is almost as high as a 15" MacBook Pro's; I don't think there's an issue there.

Outside of the uncertainty of burn-in after years of use, I don't really see any legitimate disadvantages to OLED. Remember, it could be offered as a BTO option; it wouldn't have to be forced onto everyone. Personally, I would take the risk.

Regarding attitude: if people want to say silly things and assume every decision Apple makes is the best decision for them, I am entitled to laugh/cry in response :p

MicroLED is obviously the endgame, but it's still so far off that it's not really worth mentioning.
 
Fundamentally, the reason Apple is not using OLED is that they care about their profit margin, and nothing else. IPS panels are much cheaper than OLED.
 
To be fair, resolution is only half the story, yeah. But my display actually doesn't support DCI-P3 as most MacBooks do, so in terms of color they should be better. I was just comparing FHD and 4K on a laptop; MacBook Pros are in the middle (2K-3K-ish), so I think that's where the pixels start disappearing.
 

How is the battery-life argument a farce? We can compare models such as the Dell XPS 13, which offers both FHD and 4K options; the 4K models have noticeably shorter battery life. And it seems like you just explained why Apple isn't using OLED: they plan to use mini-LED. Why switch to OLED for a short time when you are planning a larger move to newer technology?
 
I didn't say anything about a 13" laptop. And I already explained why: if you actually use the hardware inside these laptops (any brand), the battery life is abysmal. The "10+ hours" all these brands claim is only true if you stick to the tiny list of mundane tasks listed in their size-2-font, semi-translucent fine print.

Mini-LED isn't better than OLED; that's like saying a 2020 MacBook Air is better than a 2019 MacBook Pro because it's newer. It's a backlight technology that tries to mimic the benefit of a self-emissive display. Think of mini-LED as a life-support machine for LCD.

If you think the battery-life impact is such a problem, ask yourself why Apple would switch their flagship phones, where battery life and screen brightness are infinitely more important, to OLED. LCD can go brighter than OLED and consumes less power at the high end, so you must be really angry iPhones use OLED now, right? ...Right?
 
I see PCs have 4K laptops. Not too many OLED though last time I checked.

I was wondering if/when they would do it or why they don’t do it.

Because Apple doesn't play spec-sheet racing games for the sake of bigger numbers.

4K on a laptop-sized screen is irrelevant. It's not even relevant on smaller TVs outside of marketing.


Windows has variable scaling (up to 300%?) but macOS has only pixel doubling

Incorrect. The default out-of-the-box resolution on the 2020 MBA, for example, is not pixel-doubled.

The Mac hasn't done just dumb pixel doubling since Retina was released; you've been able to scale to non-pixel-doubled sizes since 2012.


There is a limit beyond which more pixels are pointless. Apple has even described the formula they use for this; it is based on the normal viewing distance for the device.
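One common formulation of that limit, assuming visual acuity of about one arcminute (Apple's exact per-device figures are not published, so treat this as a sketch): a display stops gaining from more pixels once adjacent pixels subtend a smaller angle than the eye can resolve.

```python
from math import radians, tan

def retina_ppi(viewing_distance_inches, acuity_arcmin=1.0):
    """PPI above which adjacent pixels subtend less than `acuity_arcmin`
    of visual angle and become indistinguishable to a typical eye."""
    return 1.0 / (viewing_distance_inches * tan(radians(acuity_arcmin / 60.0)))

print(round(retina_ppi(12)))   # ~286: phone-style viewing distance
print(round(retina_ppi(18)))   # ~191: typical laptop viewing distance
```

By this estimate, the 16" MBP's ~226 PPI already clears the ~191 PPI threshold at an 18-inch viewing distance.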

4K on a 13"-16" or larger laptop display is just not needed. Even if you're doing video work, people aren't generally working with individual pixels by hand these days; that hasn't really been a thing for 15-20 years. There are tools that do the job.

And if you ARE working with individual pixels, you're zooming into them, making screen resolution irrelevant.
 

The idea of editing 4K video on a laptop without proxies, just to see 1:1 pixel quality on a screen whose pixels, at around two thousandths of an inch, are too small for my eyes to resolve, leaves me very confused.

"Do you really believe 4K would affect performance?" Huh? Maybe not the screen itself, but the processing power required to render those pixels certainly does. This is why that Razer laptop you mentioned can't maintain 60 fps at 4K in any game more complex than Rocket League. In fact, its "performance mode" is exactly that: 1080p rendering, not 4K.

Why would 4K even be considered for a laptop used for editing? Because it's what your TV is? Apple did it right: using an arbitrary metric like resolution, which changes meaning with screen size, lacks consistency. Changing the pixel density changes everything from nits to color, smaller-screen devices will look sharper than their more expensive counterparts, etc. Plus, I prefer 16:10 for photo editing, for that 1:1 pixel thing you mentioned...

To each their own. I wouldn't be against 4K or higher, but not until there are decent reasons. I understand Razer's perspective with a gaming laptop, but even then it's still struggling at times according to its reviews. Which isn't surprising.
 
Because Apple doesn't play spec-sheet racing games for the sake of bigger numbers.
This is just dumb. Don't forget to send your angry letters to Apple when all of their products inevitably move to higher-resolution screens.

For years, people said the same thing about iPhone screens never needing to go beyond 326 ppi, iPhones not needing OLED, iPad minis and MacBook Airs not needing Retina displays, 2013 Mac Pros not needing expandable hardware, "average users" not needing more than 16 GB of device storage. The list goes on and on and on.

Assuming you have healthy or accurately corrected vision, 4K is easily noticeable even at smaller TV sizes.

You don't know what you're talking about regarding pixel accuracy. It's not just about "working with individual pixels": working at a resolution that doesn't divide evenly into a screen's native resolution can cause all sorts of unwanted effects where precision matters, especially (but not limited to) in video. Fine patterns can show unwanted moiré or collapse into a single block of colour, noise can appear either too strong or not at all depending on the content, sharpening can lose accuracy, and high-contrast edges can shimmer; I'm sure there's plenty more.
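A toy illustration of that failure mode, using nearest-neighbour sampling (real compositors use filtered downsampling, which trades these artifacts for softness): resampling a perfect one-pixel stripe pattern from a 4096-wide 2x backbuffer down to a 3072-wide panel, the 0.75 factor of a scaled HiDPI mode, breaks the even pattern.

```python
# A perfect 1px stripe pattern in the 2x backbuffer...
src_w, dst_w = 4096, 3072
stripes = [j % 2 for j in range(src_w)]

# ...nearest-neighbour resampled to the panel's native width:
# output pixel i reads source pixel floor(i * src_w / dst_w).
resampled = [stripes[i * src_w // dst_w] for i in range(dst_w)]

print(stripes[:12])     # [0, 1, 0, 1, ...]: even, alternating
print(resampled[:12])   # uneven runs of 0s: the source of moiré/shimmer
```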
 
I just think Apple should offer it as an upgrade but that’s just me.

AND

OLED

I’d pay for both of those upgrades even if they cost a lot more.
 
I addressed the pixel accuracy concerns in a recent reply.

I have no idea what your point is regarding gaming performance; high-end desktops can barely play games at 4K60, so obviously laptops can't either. Games aren't fixed to the resolution of your screen, so why would anyone attempt to run games at 4K60 on any laptop, let alone a MacBook? How is this relevant? What's your point? We're talking about general performance.

I couldn't really follow your third paragraph.

If you can't already see the benefits of moving to 4K now, you won't see it in the future either.
 
I think you are confusing mini-LED (which is not at all like OLED) with micro-LED (which is more like OLED). Apple is researching micro-LED, but it won't be used for years. Mini-LED is coming soon, but it is not OLED.

The MacBook's auto-brightness does not work well; it leads to too many brightness fluctuations. Also, you are confusing mini-LED with micro-LED: mini-LED is not at all like OLED, while micro-LED is like OLED but is not coming any time soon.
 
This is just dumb. Don't forget to send your angry letters to apple when all of their products inevitably move to higher resolution screens.

For years people said the same thing about iPhone screens not needing to ever go beyond 326 PPI, iPhones not needing OLED, iPad minis and MacBook Airs not needing Retina displays, 2013 Mac Pros not needing expandable hardware, "average users" not needing more than 16GB of device storage. The list goes on and on and on and on and on...

Assuming you have healthy or accurately corrected vision, 4K is easily noticeable on even smaller TV sizes.

You don't know what you're talking about regarding pixel accuracy. It's not just about "working with individual pixels": working at a resolution that doesn't divide evenly into a screen's native resolution can cause all sorts of unwanted effects where precision is important, especially (but not limited to) in video. Fine patterns can show unwanted moiré or collapse into a single block of colour, noise can appear either too strong or not at all depending on the content, sharpening can lack accuracy, high-contrast edges can exhibit shimmering, and I'm sure there's plenty more.
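Both of those failure modes are easy to demonstrate with a toy 1-D resampler. This is nearest-neighbour sampling of a one-pixel-wide alternating pattern, not how any real scaler works, but it shows why non-integer ratios misbehave:

```python
def resample_nearest(src, ratio):
    """Nearest-neighbour downsample of a 1-D row of pixel values."""
    n = int(len(src) / ratio)
    return [src[min(round(i * ratio), len(src) - 1)] for i in range(n)]

# A 1-pixel alternating pattern, e.g. fine hatching in an image.
row = [0, 255] * 12  # 24 source pixels

print(resample_nearest(row, 2.0))  # integer ratio: detail collapses to one flat value
print(resample_nearest(row, 1.5))  # non-integer ratio: irregular, moiré-like spacing
```

At an exact 2:1 ratio the fine pattern vanishes into a single block of colour (every sample lands on the same phase), while at 1.5:1 the surviving bright pixels come out unevenly spaced — exactly the moiré-style artifact described above.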
Ideal PPI is one of those things people can go back and forth arguing all day for eternity if all they can back it up with is personal opinion and anecdotal evidence. There’s no point in trying to convince anyone else because they will already have their opinion—UNLESS one can prove it with hard scientific numbers regarding what the normal human eye can distinguish under normal circumstances (with reference links). And the “normal” terms must be agreed upon as well. Otherwise the discussion just devolves into incessant noise and insults.

The only irrefutable (ie. everyone agrees) facts so far are that higher PPI means sharper detail until around a certain point, when higher becomes superfluous, and that more pixels are a trade off as it means more power consumption and higher cost to manufacture. Anything beyond that has to be proven, if the discussion is to move forward in any meaningful way.

I’m not very familiar with the scaling issues, but regardless I imagine it would need to be significantly affecting a large percentage of users in order to convince Apple to make the trade off of 4K. If that is the case (and can be proven with data), then it would make a good argument for 4k. Although “significant” and “large” are subjective terms.
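One back-of-the-envelope version of that "hard scientific number" is the one-arcminute resolution limit usually quoted for 20/20 vision. Whether one arcminute is the right threshold is itself debated (which is rather the point of the post above), but the geometry is simple:

```python
import math

def acuity_limit_ppi(viewing_distance_in, arcmin=1.0):
    """PPI at which one pixel subtends `arcmin` arcminutes of visual
    angle at the given viewing distance (in inches). Beyond this PPI,
    an eye resolving `arcmin` can no longer separate adjacent pixels."""
    pixel_size_in = viewing_distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_size_in

# Typical laptop viewing distances
for d in (12, 18, 24):
    print(f'{d}" away: ~{acuity_limit_ppi(d):.0f} PPI')
```

At 18 inches this comes out near 190 PPI, which is at least in the ballpark of the ~220 PPI Retina figure cited earlier in the thread — and it shows why the answer shifts with both the assumed viewing distance and the assumed acuity threshold.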
 
OLED isn't good, and both mini-LED and micro-LED will replace it sooner or later.

4K resolution is useless on a small screen. Besides, 4K isn't even mainstream yet; hardware still struggles with it.
 
Ideal PPI is one of those things people can go back and forth arguing all day for eternity if all they can back it up with is personal opinion and anecdotal evidence. There’s no point in trying to convince anyone else because they will already have their opinion—UNLESS one can prove it with hard scientific numbers regarding what the normal human eye can distinguish under normal circumstances (with reference links). And the “normal” terms must be agreed upon as well. Otherwise the discussion just devolves into incessant noise and insults.

The only irrefutable (ie. everyone agrees) facts so far are that higher PPI means sharper detail until around a certain point, when higher becomes superfluous, and that more pixels are a trade off as it means more power consumption and higher cost to manufacture. Anything beyond that has to be proven, if the discussion is to move forward in any meaningful way.

I’m not very familiar with the scaling issues, but regardless I imagine it would need to be significantly affecting a large percentage of users in order to convince Apple to make the trade off of 4K. If that is the case (and can be proven with data), then it would make a good argument for 4k. Although “significant” and “large” are subjective terms.
My main point is that the excuses people are making, seemingly to justify Apple's decisions, are almost entirely nonsensical, and it's really just silly behaviour that benefits nobody. For example, the concerns about the impacts on performance and battery life a 4K screen would have, despite the fact that 15" and 16" MacBooks already internally render the display above 4K when set to the highest scaling mode. Or the idea that power draw from an OLED in a MacBook would be bad, despite nobody arguing against 800-1200 nit (and soon to be 1000-1600 nit) OLEDs in iPhones.
 
And I already explained why: if you actually use the hardware inside these laptops (any brand), the battery life is abysmal. The "10+ hours" all these brands claim is only true if you stick to the tiny list of mundane tasks listed in their size-2-font, semi-translucent fine print.

My 16" easily lasts 8+ hours doing non-trivial tasks (like programming). Sure, you won't get more than 2-3 hours doing stuff like editing photos, but there are more users out there than content editors.

If you think the battery life impact is such a problem, ask yourself why Apple would switch their flagship phones, where battery life and screen brightness are infinitely more important, to OLED. LCD can go brighter than OLED and consumes less power at the high end, so you must be really angry that iPhones use OLED now, right? ...Right?

Because making a 6" panel and a 15" panel with the required characteristics and acceptable yields are vastly different things, which you of all people should know, as you obviously have some knowledge of the matter. Don't forget that Apple was a latecomer to the OLED game, yet the first OLED panels they used were the best on the market; in fact, Samsung decided to sell them to Apple rather than use them in their own flagship phones because of the costs involved! The OLEDs used in Apple's iPhones are excellent, and way better than what you find in a laptop these days. A larger OLED display with similar characteristics costs an arm and a leg.

For example, the concerns about the impacts on performance and battery life a 4K screen would have, despite the fact that 15" and 16" MacBooks already internally render the display above 4K when set to the highest scaling mode. Or the idea that power draw from an OLED in a MacBook would be bad, despite nobody arguing against 800-1200 nit (and soon to be 1000-1600 nit) OLEDs in iPhones.

a) The concern is about the power consumption of the panel itself, not the GPU (the overhead of compositing a desktop UI is trivial, and any iGPU can easily manage 8K+ these days). Many laptops offer a choice between different screens, and battery tests make it obvious that a 4K display consumes *significantly* more power. OLED laptop panels also seem to have trouble in that area.

b) As I wrote above, making a small screen is "easier" than making a large screen: you get less opportunity for defects. So if they can manufacture an exceptional-quality panel at 6", it doesn't mean they can do so at 15".
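That yield point can be illustrated with the textbook Poisson yield model (fraction of panels with zero fatal defects). The defect density below is a made-up illustrative number, not real manufacturing data:

```python
import math

def screen_area_in2(diagonal, aspect=(16, 9)):
    """Area in square inches of a screen with the given diagonal and aspect ratio."""
    a, b = aspect
    return diagonal**2 * (a * b) / (a**2 + b**2)

def poisson_yield(area_in2, defects_per_in2):
    """Fraction of panels with zero fatal defects (simple Poisson model)."""
    return math.exp(-defects_per_in2 * area_in2)

D = 0.01  # illustrative defect density, defects per square inch (assumed)
for d in (6, 15):
    a = screen_area_in2(d)
    print(f'{d}" panel: {a:.0f} sq in, yield ~{poisson_yield(a, D):.0%}')
```

A 15" panel has 6.25x the area of a 6" one at the same aspect ratio, and because yield falls off exponentially with area, the larger panel's yield drops far more than 6.25x as fast — which is the cost problem described above.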
Incorrect. The default out-of-the-box resolution on the MBA 2020, for example, is not pixel-doubled.

The Mac hasn't done dumb pixel doubling since Retina was released; you've been able to scale to non-pixel-doubled sizes since 2012.

Technically, the previous poster is correct. Apple's HiDPI modes work exclusively by pixel doubling (tripling on later iPhones). Variable-ratio scaling is implemented by rendering into a pixel-doubled memory buffer first and then downscaling, which allows better image quality, among other things. The basic difference from the Windows approach is that Windows draws directly into a "native" resolution buffer, while macOS draws into a pixel-doubled "backing store" first, then resamples it onto the native-resolution image.
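The backing-store scheme is easy to sketch numerically. The figures below are just the familiar 16" MacBook Pro panel (3072x1920) and one of its scaled modes; the function illustrates the render-at-2x-then-downsample idea, not Apple's actual code:

```python
def hidpi_buffers(looks_like, native):
    """macOS-style HiDPI: render the UI at 2x the 'looks like' size,
    then resample that backing store down to the panel's native pixels."""
    lw, lh = looks_like
    backing = (lw * 2, lh * 2)
    scale = native[0] / backing[0]  # downsampling ratio to the panel
    return backing, scale

# 16" MacBook Pro panel with the highest "More Space" scaled mode
backing, scale = hidpi_buffers((2048, 1280), (3072, 1920))
print(backing, scale)
```

The backing store comes out at 4096x2560 — more pixels than 4K UHD — which is why the earlier post could say the 15"/16" MacBooks already internally render above 4K in their highest scaling mode.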
 