
theorist9

A common configuration for consumer buyers of Mac laptops is to pair them with a large external monitor for home use. Through High Sierra, a consumer could get beautifully sharp text by spending ~$500 for a 4k 27" display (163 ppi). Beginning with Mojave, however, Apple eliminated something called subpixel text rendering*. Effectively, this means that getting optimally sharp text with MacOS now requires a Retina monitor. Some also dislike non-Retina screens because of the UI size.

The problem is that Apple doesn't offer a single consumer-priced Retina external display, i.e., something priced for the type of buyer who spends ~$1,000–$2,000 for a laptop (which is probably the largest segment of Apple's Mac market). Thus any consumer who buys a Mac laptop and wants to be able to use it with an external monitor and have an optimal MacOS experience can't, unless they move up to prosumer pricing (the $1,600+ Studio Display).

In sum, given that Apple has changed MacOS to effectively require Retina displays for optimum performance, they should offer some Retina externals for their largest market, which is consumer-class buyers.

This would also help them attract Windows switchers: Right now a big plus of Windows is that a 27" display that looks great with text is within much easier reach than it is with MacOS—$500 for a 27" 4k does it with Windows, while with MacOS you need a $1,600 27" 5k. Those willing to switch would of course accept paying more for Apple products, but not >3x as much to get about the same effective text sharpness. [Windows still has subpixel text rendering, and also has vectorized scaling, which allows UI size to be adjusted to non-integer ratios without losing sharpness the way MacOS does.]

So what should be the pricing of these displays? Honestly, I'm not exactly sure. But as a first effort:

The 2020 27" iMac's base price was $1800, so if half of that was for the display and half for the computer, I'd say $900 for the 27". Then, proportionally by area, we have:

24" = $700 (consistently, this is also half the $1300 starting price of the 24" iMac, rounded up to the nearest $100)
27" = $900
32" = $1,300

And make the stock stand height-adjustable.

I've included the 32" for higher-end consumers who need a larger screen and can't afford, and don't need, a $5,000+ Pro Display XDR. And display analyst Ross Young observed the market is moving towards larger (above 27") displays.
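
To sanity-check the area-proportional math above, here's a minimal sketch (the $900 anchor for the 27" and the 16:9 aspect ratio are my assumptions; the rounding to the nearest $100 is just for presentation):

```python
# Rough sanity check of the area-proportional pricing above.
# Assumes 16:9 panels and anchors the 27" model at $900 -- my numbers, not Apple's.
anchor_size, anchor_price = 27.0, 900.0

def area(diagonal_in):
    """Screen area (sq. in.) of a 16:9 panel with the given diagonal."""
    k = diagonal_in / (16**2 + 9**2) ** 0.5   # diagonal^2 = (16k)^2 + (9k)^2
    return (16 * k) * (9 * k)

for size in (24, 27, 32):
    price = anchor_price * area(size) / area(anchor_size)
    print(f'{size}": ~${round(price / 100) * 100}  (unrounded: ${price:.0f})')
# 24": ~$700, 27": ~$900, 32": ~$1300 -- matching the list above.
```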

*Beginning with Mojave, Apple eliminated subpixel text rendering from MacOS. Subpixel rendering significantly increases the effective pixel density in the horizontal direction by using the vertical R/G/B subpixels to more finely render text. With subpixel rendering, MacOS could look really crisp with a $500 27" 4k monitor (163 ppi). By eliminating this with Mojave, Apple effectively changed MacOS to require a Retina monitor for optimum viewing. There are probably a couple of reasons Apple eliminated it: (1) It requires knowledge of the display's pixel substructure; and (2) With the way MacOS does scaling, it creates artifacts with anything other than integer scaling. Having said that, it did seem to work pretty much without issue through High Sierra.
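
For anyone curious what "using the subpixels to more finely render text" looks like concretely, here's a toy sketch of the idea behind RGB-stripe subpixel AA. It's purely illustrative (an idealized panel with vertical R/G/B stripes), not how Quartz actually implemented it:

```python
# Toy illustration of subpixel (RGB-stripe) text anti-aliasing.
# Glyph coverage is sampled at 3x the horizontal pixel resolution; standard AA
# averages the 3 samples into one value per pixel, subpixel AA keeps one per stripe.
import numpy as np

def grayscale_aa(coverage_3x):
    """Average each pixel's 3 horizontal samples -> same value in R, G and B."""
    cov = coverage_3x.reshape(coverage_3x.shape[0], -1, 3).mean(axis=2)
    return np.repeat(cov[..., None], 3, axis=2)

def subpixel_aa(coverage_3x):
    """Keep all 3 samples -> one distinct coverage value per R/G/B stripe."""
    return coverage_3x.reshape(coverage_3x.shape[0], -1, 3)

# A glyph edge that ends halfway through one pixel's blue stripe
# (1 row, 2 pixels = 6 samples of ink coverage):
edge = np.array([[1.0, 1.0, 0.5, 0.0, 0.0, 0.0]])
print(grayscale_aa(edge))  # edge position blurred into a single gray level per pixel
print(subpixel_aa(edge))   # edge position preserved to 1/3-pixel precision
```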
 
...
*Beginning with Mojave, Apple eliminated subpixel text rendering from MacOS. Subpixel rendering significantly increases the effective pixel density in the horizontal direction by using the vertical R/G/B subpixels to more finely render text. ...

I absolutely agree that Apple should offer a 5K monitor for under $1000.

It's ridiculous that, last year, you could buy an entire 27" iMac for $1800, and now Apple is selling basically the same thing, sans computer, for $1600.

But, can't agree with you about subpixel rendering. That's the very first thing I turn off on any computer. The red and blue fringing on the edges of text drives me crazy. I know we won't agree about this but I'm glad Apple dumped the idea.
 
I absolutely agree that Apple should offer a 5K monitor for under $1000.

It's ridiculous that, last year, you could buy an entire 27" iMac for $1800, and now Apple is selling basically the same thing, sans computer, for $1600.

But, can't agree with you about subpixel rendering. That's the very first thing I turn off on any computer. The red and blue fringing on the edges of text drives me crazy. I know we won't agree about this but I'm glad Apple dumped the idea.
But you indicated you run your displays with non-integer scaling which, as I mentioned, creates artifacts (specifically color fringing artifacts) with subpixel text rendering in the older versions of MacOS that offered it. I don't use non-integer scaling, because I'm willing to give up control of UI size to get maximum text sharpness, and this also eliminates (or should eliminate--at least I don't see it) the color fringing. Given this, I'm wondering if your issue isn't with subpixel rendering per se, but rather with the problems it creates when you combine it with non-integer scaling.
 
When you say you turn subpixel text rendering off, which specific OS's are you referring to?

Every OS that I use regularly--Mac OS, Windows, and Ubuntu.

Although, as you've pointed out, Apple gave up on subpixel rendering ~3 years ago, and Microsoft has been phasing it out of their newer graphics stacks.

The reason text looks 'soft' at scaled resolutions with MacOS is that text rendering isn't perfect at 5K, either. It's still anti-aliased. And the process of scaling that down by only 25% isn't perfect.

You're basically anti-aliasing something that's already anti-aliased. Like a Xerox of a Xerox. Not ideal.
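
If it helps to picture the "Xerox of a Xerox" point, here's a toy sketch of the two resampling passes for the "looks like 2560x1440" case on a 4K panel. Pillow stands in for the compositor, and the font path is just an example; this is an analogy, not Apple's actual pipeline:

```python
# Pass 1: anti-alias text into a 2x (5120x2880) backing buffer.
# Pass 2: scale that buffer down to the panel's 3840x2160, resampling the
# already anti-aliased pixels a second time.
from PIL import Image, ImageDraw, ImageFont

# Path is an assumption -- point it at any TrueType font on your system.
font = ImageFont.truetype("/System/Library/Fonts/Helvetica.ttc", 200)

backing = Image.new("L", (5120, 2880), 255)             # HiDPI backing store
ImageDraw.Draw(backing).text((100, 100), "Anti-aliased once", fill=0, font=font)

panel = backing.resize((3840, 2160), Image.LANCZOS)     # 0.75x downscale
panel.save("resampled_twice.png")                       # softer than a native render
```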
 
Every OS that I use regularly--Mac OS, Windows, and Ubuntu.

Although, as you've pointed out, Apple gave up on subpixel rendering ~3 years ago, and Microsoft has been phasing it out of their newer graphics stacks.

The reason text looks 'soft' at scaled resolutions with MacOS is that text rendering isn't perfect at 5K, either. It's still anti-aliased. And the process of scaling that down by only 25% isn't perfect.

You're basically anti-aliasing something that's already anti-aliased. Like a Xerox of a Xerox. Not ideal.
Please take a look at my subsequent edit--I suspect the color fringing you're seeing might not be from subpixel rendering per se, but rather from artifacts that occur when you combine it with non-integer scaling. Do you see those color fringes with integer scaling?
 
But you indicated you run your displays with non-integer scaling which, as I mentioned, creates artifacts (specifically color fringing artifacts) with subpixel text rendering in the older versions of MacOS that offered it. I don't use non-integer scaling, because I'm willing to give up control of UI size to get maximum text sharpness, and this also eliminates (or should eliminate--at least I don't see it) the color fringing. Given this, I'm wondering if your issue isn't with subpixel rendering per se, but rather with the problems it creates when you combine it with non-integer scaling.

Every time I've been annoyed by sub-pixel rendering, it was on a display that was doing integer scaling. (Well, no scaling, actually.)

I've only started using non-integer scaling relatively recently and only on a couple of my displays.
 
100% agree. Apple could drop a lot of the superfluous components (camera, speakers, phone-grade chip, dial back the P3 color space) and just give us a pure, 5k display in a nice chassis for a reasonable price. I'd buy it tomorrow.

It doesn't even seem like it should cost them, or us, that much, either.

Back in 2014 when Apple released their first 5K display, it was dramatically better than the average PC monitor. I just looked up some monitor reviews from around then and it seems like a typical ~$400 PC monitor was 24", 1080p, 6-bit, and pretty dim (only 250 cd/m2).

These days, your typical $400 PC monitor is 27", IPS, 4K, 10-bit, covers 99-100% of the sRGB color gamut, has good color accuracy, gets pretty bright (350 cd/m2), and functions as a docking station (USB-C power delivery, etc.).

I recently replaced a 5K iMac with a $400 Dell monitor and was frankly pretty shocked at how similar they are.
 
I highly doubt that Apple will release cheaper options, since it's not who they are. I wish third-party manufacturers would make more options optimized for macOS. There were in fact some manufacturers that made 5K monitors, such as the Dell UP2715K, HP Z27q, and a few others. Sadly, those were discontinued and were probably expensive; the Dell 5K was $2,499 from what I checked. The LG UltraFine 5K and Studio Display are cheaper than that. The iiyama ProLite XB2779QQS was the cheapest 5K display at $900, but it's no longer available, and people say its panel quality doesn't compare and that they have more issues with it than with the UltraFine 5K. Still, it's unfortunate there aren't many good Retina third-party options that work well with a Mac.

No matter what people say, 27" 4K monitors are inferior and will never match a 5K one running at 2x ("looks like 2560x1440") scaling. 27" 4K monitors are like the 27" 1080p monitors of the 2010s: those looked bad at 81 ppi, and 4K is only double that. Also, running a 27" 4K at 2x scaling gives a 1080p-sized UI, which is just too big and a big loss of screen real estate.
 
5 years ago Apple apologized and said they were going to focus on pro users… we are seeing the results of their efforts today. Back then I only begrudgingly bought a MBP and held onto it as long as I could. Nowadays I want everything from across their whole lineup. I'm a professional user and pretty much on the verge of buying an ASD. I'm their target market and the prices make little difference to me.

I can't say I predicted this for regular Apple consumers, but I can say I'm not surprised. They are doing exactly what they said they would, and what a lot of users were asking for. I don't think anyone was thinking about the cost or feature ramifications.

Rather than be mad at Apple, I'm wondering why no third-party monitor manufacturers produce 4k/24", 5k/27", 6k/32" monitors at Retina/macOS spec. There are so many monitors in all shapes and sizes, and the Apple crowd seems to just have a huge void in options. Does Apple prevent manufacturers from producing these in any way?
 
These days, your typical $400 PC monitor is 27", IPS, 4K, 10-bit, covers 99-100% of the sRGB color gamut, has good color accuracy, gets pretty bright (350 cd/m2), and functions as a docking station (USB-C power delivery, etc.).

I recently replaced a 5K iMac with a $400 Dell monitor and was frankly pretty shocked at how similar they are.
I have an old Dell 4k 27” P2715Q that I use as my main monitor to this day (scaled at 2560x1440). Personally I think ~163ppi is more than enough. Now when I use my 14” MBP in laptop viewing distances, or my phone, yes there is a noticeable difference with retina, but side by side at the viewing distances that I use a 27” monitor the differences are minimal and I’m still in love with this old dell 4k.

I somehow lucked out and text looks crisp, and crisp enough even when I get extremely close and pixel peep. I don't have any weird scrolling issues, moire, etc. I did have some performance issues back in the day using an old Intel MBP with integrated graphics; performance improved with the i9 MacBook, though it made way too much noise and heat, and now it runs flawlessly on the M1 Pro.
 
I have an old Dell 4k 27” P2715Q that I use as my main monitor to this day (scaled at 2560x1440). Personally I think ~163ppi is more than enough. Now when I use my 14” MBP in laptop viewing distances, or my phone, yes there is a noticeable difference with retina, but side by side at the viewing distances that I use a 27” monitor the differences are minimal and I’m still in love with this old dell 4k.
...

That shouldn't be shocking. Retina resolution is generally considered to be 60 pixels per visual degree. If you look at a 4K display from 21 inches away, that works out to be 59.8 pixels per visual degree.

So from distances of 21" or greater, if you have average eyesight, you shouldn't really be able to tell a difference between a 4K and a 5K monitor.

I just checked and it looks like I usually sit ~20" away from my monitor, and sometimes I lean in a bit. I feel like my 4K monitor is noticeably softer than my 5K monitor was (especially when I lean in) but I'm getting used to it.
 
...
Rather than be mad at Apple, I'm wondering why no third-party monitor manufacturers produce 4k/24", 5k/27", 6k/32" monitors at Retina/macOS spec. There are so many monitors in all shapes and sizes, and the Apple crowd seems to just have a huge void in options. Does Apple prevent manufacturers from producing these in any way?

I suspect that Apple probably co-developed their 5K panel with LG and can contractually limit what LG does with the panels. So LG probably isn't allowed to make a 5K display except for the expensive UltraFine model that Apple presumably approved.

I don't know what's preventing other manufacturers from making 5K displays, other than that maybe 4K seems more than good enough to everybody who isn't a Mac user?
 
I have an old Dell 4k 27” P2715Q that I use as my main monitor to this day (scaled at 2560x1440). Personally I think ~163ppi is more than enough. Now when I use my 14” MBP in laptop viewing distances, or my phone, yes there is a noticeable difference with retina, but side by side at the viewing distances that I use a 27” monitor the differences are minimal and I’m still in love with this old dell 4k.

I somehow lucked out and text looks crisp, and crisp enough even when I get extremely close and pixel peep. I don't have any weird scrolling issues, moire, etc. I did have some performance issues back in the day using an old Intel MBP with integrated graphics; performance improved with the i9 MacBook, though it made way too much noise and heat, and now it runs flawlessly on the M1 Pro.
As you can see, I have that same monitor as part of my 3-display setup! It really is a beautiful display. I've read reviews of the newer Dell 4k's by those that also own the P2715Q, and they say they're just not as good.

Having said that, when I tried moving from High Sierra to Mojave with that monitor, I was like "ugh, what's wrong with the text?". I also tried Catalina--same thing. I later found out it was the lack of subpixel text rendering, so I stuck with HS until I got my 2019 iMac, and was able to replace the Dell with the iMac as my main monitor. Running the two side-by-side, I can clearly see the difference.
 
As you can see, I have that same monitor as part of my 3-display setup! It really is a beautiful display. I've read reviews of the newer Dell 4k's by those that also own the P2715Q, and they say they're just not as good.

Having said that, when I tried moving from High Sierra to Mojave with that monitor, I was like "ugh, what's wrong with the text?". I also tried Catalina--same thing. I later found out it was the lack of subpixel text rendering, so I stuck with HS until I got my 2019 iMac, and was able to replace the Dell with the iMac as my main monitor. Running the two side-by-side, I can clearly see the difference.

Surely you weren't able to run High Sierra at a scaled resolution with sub-pixel rendering enabled? As you pointed out in earlier posts, that wouldn't work, with the way Apple handles scaled resolutions.
 
Surely you weren't able to run High Sierra at a scaled resolution with sub-pixel rendering enabled? As you pointed out in earlier posts, that wouldn't work, with the way Apple handles scaled resolutions.
I've always used a scaled resolution (2:1) on my 4k. But because it's integer scaling, it's sharp and avoids the subpixel rendering artifacts. I explained this in my top post—it's not scaling that creates the artifacts, it's non-integer scaling: "With the way MacOS does scaling, it creates artifacts with anything other than integer scaling."

I do get a bigger UI with my 27" 4k than with my 27" Retina 5k (when both are set to 2:1 scaling) but, as I wrote on the earlier thread where we discussed this, I actually prefer the bigger UI because when I want to work rapidly I find I can click on the buttons and scroll bars faster. Plus I mostly work with apps whose UI's don't take up much space (Mathematica, Excel, etc.), so the larger UI doesn't cost me much real estate.
 
I've always used a scaled resolution (2:1) on my 4k. But because it's integer scaling, it's sharp and avoids the subpixel rendering artifacts. I explained this in my top post—it's not scaling that creates the artifacts, it's non-integer scaling: "With the way MacOS does scaling, it creates artifacts with anything other than integer scaling."

It does give me a bigger UI with the 27" 4k than I get with my 27" Retina iMac (when both are set to 2:1 scaling) but, as I wrote on the earlier thread where we discussed this, I actually prefer the bigger UI because when I want to work rapidly I find I can click on the buttons and scroll bars faster. Plus I mostly work with apps whose UI's don't take up much space (Mathematica, Excel, etc.), so the larger UI doesn't cost me much real estate.

Gotcha. Didn't realize that MacOS ever did subpixel AA in HiDPI modes.

Honestly, I'm surprised you could see a difference. As I posted above, if you have a 27" 4K monitor, entire pixels should barely be perceptible at ~21 inches.

If you're talking about the resolution of subpixels, i.e., 3x the horizontal resolution of entire pixels, you'd have to be 7 inches away from the screen to start to resolve those, if you have average eyesight.
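
The 7-inch figure falls out of the same geometry run in reverse; here's a quick sketch (again assuming ~163 ppi and the 60-per-degree threshold for average eyesight):

```python
# At what distance does a feature density of 3 x 163 per inch (the subpixel
# pitch of a 27" 4K panel) drop to 60 per visual degree?
import math

def distance_for_60_per_degree(features_per_inch):
    """Viewing distance (inches) at which the given density equals 60 per degree."""
    return 60 / (features_per_inch * 2 * math.tan(math.radians(0.5)))

print(distance_for_60_per_degree(3 * 163))   # ~7.0 inches
```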
 
Gotcha. Didn't realize that MacOS ever did subpixel AA in HiDPI modes.
If you have a Mac running HS, you can observe it by opening the Digital Color Meter (in Applications/Utilities) and moving the cursor over black and white text.
Honestly, I'm surprised you could see a difference. As I posted above, if you have a 27" 4K monitor, entire pixels should barely be perceptible at ~21 inches.

If you're talking about the resolution of subpixels, i.e., 3x the horizontal resolution of entire pixels, you'd have to be 7 inches away from the screen to start to resolve those, if you have average eyesight.
I don't see the pixels directly, I just see the net effect on text crispness. Plus, while I can see the difference at 21", the large tables I work with often necessitate I use small fonts to fit everything on the screen, which means I'm sometimes leaning in to about a 15" viewing distance. And my eyes are also probably more sensitive to text sharpness than others', because my close vision is pretty good—I used to be able to read the microprinting on a $20 bill.

But even with average viewers and larger viewing distances, there are studies indicating people can see improvements in sharpness at even higher resolutions than Retina. I.e., you're saying you don't see much difference between 160 and 220 ppi at 21", but the study below says users can "conclusively" distinguish between what would effectively be 200 and 300 ppi at about that distance:

Steve Jobs claimed that, because of the angular resolution limit of the human eye, nothing beyond ~300 ppi at 10"–12" would provide increased sharpness (from which all the various Retina display resolutions flowed). However, other display experts have said that the effective limit of the human eye is significantly higher, indicating there is a benefit in going beyond the standard 220 ppi resolution of current Retina displays, even if they are viewed at 20"+.

See:

https://mostly-tech.com/tag/steve-jobs/
https://www.cultofmac.com/173702/why-retina-isnt-enough-feature/

And:
https://sid.onlinelibrary.wiley.com/doi/full/10.1002/jsid.186
"This study determines an upper discernible limit for display resolution. A range of resolutions varying from 254–1016 PPI were evaluated using simulated display by 49 subjects at 300 mm viewing distance. The results of the study conclusively show that users can discriminate between 339 and 508 PPI and in many cases between 508 and 1016 PPI." [1]

[1] Spencer, Lee, et al. "Minimum required angular resolution of smartphone displays for the human visual system." Journal of the Society for Information Display 21.8 (2013): 352-360.

300 mm is 59% of 20", so those correspond to being able to distinguish 200 ppi vs. 300 ppi, and 300 ppi vs 600 ppi, at 20", respectively (same angular resolutions). And for those like me who often lean in closer, to a 15" viewing distance, it's 270 ppi vs. 400 ppi, and 400 ppi vs. 800 ppi.
 
...
Yes, Steve Jobs claimed that, because of the angular resolution limit of the human eye, nothing beyond ~300 ppi at 10"–12" would provide increased sharpness (from which all the various Retina display resolutions flowed). However, other display experts have said that the effective limit of the human eye is significantly higher, indicating there is a benefit in going beyond the standard 220 ppi resolution of current Retina displays, even if they are viewed at 20"+.
...

Yes, the articles point out that "retina resolution" varies by distance and eyesight, which is why I've been careful to specify resolution in "pixels per visual degree" (which takes distance into account) and add the "average eyesight" disclaimer to everything I've written.

It was generally accepted that "retina resolution" (for people with average eyesight) is 60 pixels per visual degree long before Jobs claimed anything.

I'm not saying it's impossible that you could tell a difference between subpixel AA on/off at that screen size and resolution, just that I'm surprised.

All my computers are running Monterey so I can't really do this experiment. I suppose I could try it with Ubuntu but their implementation of subpixel AA is pretty garbage so I don't know if that would be useful.
 
Yes, the articles point out that "retina resolution" varies by distance and eyesight, which is why I've been careful to specify resolution in "pixels per visual degree" (which takes distance into account) and add the "average eyesight" disclaimer to everything I've written.

It was generally accepted that "retina resolution" (for people with average eyesight) is 60 pixels per visual degree long before Jobs claimed anything.

I'm not saying it's impossible that you could tell a difference between subpixel AA on/off at that screen size and resolution, just that I'm surprised.
Yes, the claim didn't originate with Jobs, and many have made it. But, still, those more recent articles and studies are saying that even average viewers can see benefits at significantly more pixels per visual degree than even Apple's Retina displays offer.
All my computers are running Monterey so I can't really do this experiment. I suppose I could try it with Ubuntu but their implementation of subpixel AA is pretty garbage so I don't know if that would be useful.
Found this. All three captures are on my Dell 4k at 2:1 scaling, viewing the same black text from an article on nytimes.com at identical display size.

Top is High Sierra, default settings. You can see the colors, indicating the implementation of subpixel text rendering (it doesn't actually display the subpixels individually, so you can't see those, and thus can't assess the actual sharpness of the text from this). Middle is Catalina with font smoothing set to 0, bottom is Catalina with font smoothing set to 3.

[Attached screenshot: Digital Color Meter readings (1660369796610.png)]



I also just checked it on my 2014 MBP, which has a Retina display (2:1 scaling) and is running High Sierra. You can see the colors indicating subpixel text rendering there as well.
 
...
Top is High Sierra, default settings. You can see the colors, indicating the implementation of subpixel text rendering (it doesn't actually display the subpixels individually, so you can't see those, and thus can't assess the actual sharpness of the text from this; that would require a photograph). Middle is Catalina with font smoothing set to 0, bottom is Catalina with font smoothing set to 3. ...

Huh. Well, subpixel AA has annoyed me without fail in the past, but I have to admit that even with my face pressed up against my 163 PPI screen I can't really tell the difference between "Display native values" in the top screen shot vs. the other two, other than that the font weight appears lighter in the middle screen shot.

The text in the top screen shot doesn't seem any sharper to me, either.

I suppose we've determined that our eyes are different.
 
Huh. Well, subpixel AA has annoyed me without fail in the past, but I have to admit that even with my face pressed up against my 163 PPI screen I can't really tell the difference between "Display native values" in the top screen shot vs. the other two, other than that the font weight appears lighter in the middle screen shot.

The text in the top screen shot doesn't seem any sharper to me, either.

I suppose we've determined that our eyes are different.
As I mentioned, this program only displays the net color of each pixel. It doesn't actually display the subpixels. So I don't believe you can tell relative sharpness from this program's display—it only shows you whether subpixel text rendering is being implemented, because when it's on you see colors even with black text.

To visualize subpixel text rendering directly, you'd need comparative photos using a macro lens good enough to enable you to see the individual subpixel structure. [I'm not saying you need to use a macro lens to see the *effect* of subpixel rendering on text sharpness. That I can see from a normal distance. I'm saying to actually see it in action at the subpixel level, you'd need a lot of magnification.]

This is not a photograph, but may provide a decent visualization. Try displaying both of these together on your monitor, and then step back several feet:
[Top: no subpixel rendering; you can tell because each pixel is either all black or all white (equal intensities of R+G+B appear white). Bottom: subpixel rendering.]
[Image: magnified text without subpixel rendering (1660374394414.png)]


[Image: magnified text with subpixel rendering (1660374408339.png)]

Source: https://arnowelzel.de/en/subpixel-rendering
 
As I mentioned, this program only displays the net color of each pixel. It doesn't actually display the subpixels. So I don't believe you can tell relative sharpness from this program's display—it only shows you whether subpixel text rendering is being implemented, because when it's on you see colors even with black text.

I'm not talking about the magnified letters in the screen shots. If you zoom in on the other text ("Display native values," etc.) you can see that there's subpixel AA in the top screen shot but not the other two.

When displayed at native resolution on my 4K Dell monitor, my eyesight is not good enough to see the red and blue fringing on this text, and all the text appears to be equally sharp to me.

To visualize subpixel text rendering directly, you'd need comparative photos using a macro lens good enough to enable you to see the individual subpixel structure.

Nah. I can very easily see red and blue fringing on letters when the display resolution is 110 pixels per inch or less. I find it to be very noticeable and super-annoying. Many other people do too. Many people don't.

[I'm not saying you need to use a macro lens to see the *effect* of subpixel rendering on text sharpness. That I can see from a normal distance. I'm saying to actually see it in action at the subpixel level, you'd need a lot of magnification.] ...

I dunno. You didn't seem to be aware that the regular text in one of your screen shots had subpixel AA but the other two didn't. Are you sure you can see the effect?
 
Nah. I can very easily see red and blue fringing on letters when the display resolution is 110 pixels per inch or less.
You misunderstand. In the paragraph to which you are responding, I was specifically talking about needing high magnification to see the state of each individual subpixel. I explicitly said: "I'm not saying you need to use a macro lens to see the *effect* of subpixel rendering on text sharpness. That I can see from a normal distance. I'm saying to actually see it in action at the subpixel level, you'd need a lot of magnification." You're not talking about seeing the individual subpixels, you're talking about seeing the effect of the subpixel rendering. Again, I explicitly said you don't need high magnification for that.

Speaking of close magnification, here you can see an actual photo comparison. You can see both the loss of sharpness that bothers me without subpixel rendering, and the color fringing that bothers you with it. For me, it's a significant loss of sharpness at the expense of minor color fringing. With you, it's the other way around.

[Photo: magnified text with vs. without subpixel rendering (1660375946518.png)]


I dunno. You didn't seem to be aware that the regular text in one of your screen shots had subpixel AA but the other two didn't. Are you sure you can see the effect?
I now see that you're talking about the regular text. But: You're looking at a screenshot of subpixel rendering on a display without subpixel rendering. You can't tell the difference from that. Instead, you need to use a photo (as shown above).
 