
leman

macrumors Core
Oct 14, 2008
19,237
19,130
The proof is in the pudding: 1080p and 1440p are still mainstream, and saying it's too complicated for Apple, so they removed the feature, seems insulting to Apple's ability to make amazing products.

I'm not suggesting it's too complicated, I'm suggesting it's not worth the complexity. That's something different. Basically, my argument is that Apple gave up on customers using 1080p or 1440p and they do not consider these displays worth any dedicated effort. It's a business decision, not a question of ability.

As to the rest of your argument, I fully agree that there is a substantial subjective component. But the way I view this discussion, that subjective component applies to the arguments provided by @theorist9: some 4K displays have relatively low pixel density and may produce visible artifacts for some users. The matter of 1080p etc., at least for me, is outside this consideration entirely, because Apple has simply given up on high-quality text rendering on those displays.

In summary, we have the following topics conflated in this thread:

1. Apple deliberately dropping quality optimization for low-DPI displays
2. Some larger 4K displays producing subjectively worse results than others for some users
3. Bugs with certain display models

These are three different issues, although 1 and 2 are marginally related.
 

leman

macrumors Core
Oct 14, 2008
19,237
19,130
Well, I will concede that it's too complicated for Apple to justify the continued support.

One thing that came to my mind is that subpixel AA is fundamentally incompatible with Apple's high-DPI rendering method. Apple renders into a super-resolution buffer and then resolves (downsamples) that buffer to the native display resolution. Any subpixel smoothing present in the original super-resolution buffer will become a weird smudge of random colors when resolved. That's probably the primary technical reason why Apple dropped it. Otherwise they would need to ship two graphics stacks or something like that.
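To illustrate what I mean by the resolve step, here's a toy sketch (not Apple's actual pipeline, just 2x2 box-averaging of a buffer that already contains subpixel-AA'd text):

```swift
typealias RGB = (r: Double, g: Double, b: Double)

// 4x2 super-resolution buffer (2x scale) containing a vertical black/white edge
// that the text rasterizer smoothed with subpixel AA: the boundary column got
// per-channel coverage (0, 0.5, 1) meant for an RGB stripe at *this* resolution.
let superRes: [[RGB]] = [
    [(0, 0, 0), (0, 0.5, 1), (1, 1, 1), (1, 1, 1)],
    [(0, 0, 0), (0, 0.5, 1), (1, 1, 1), (1, 1, 1)],
]

// Resolve step: average each 2x2 block down to one physical pixel, roughly what
// the scaled-resolution ("looks like") downsampling pass does.
var resolved: [RGB] = []
for x in stride(from: 0, to: 4, by: 2) {
    var sum: RGB = (0, 0, 0)
    for row in superRes {
        for dx in 0..<2 {
            let p = row[x + dx]
            sum = (sum.r + p.r, sum.g + p.g, sum.b + p.b)
        }
    }
    resolved.append((sum.r / 4, sum.g / 4, sum.b / 4))
}

// The left physical pixel comes out tinted, (r: 0.0, g: 0.25, b: 0.5): the
// per-channel offsets that encoded subpixel positions in the big buffer no
// longer line up with any physical subpixel, so instead of a crisper edge you
// get the "smudge of random colors".
print(resolved)
```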
 

Darkseth

macrumors member
Aug 28, 2020
50
89
I was shopping for new external monitors, and I managed to get an amazing discount / deal on a brand new Dell Ultrasharp display, only to be warned not to use it with Macs as it has terrible scaling issues.

I have the same problems with my current Samsung 4K display as the letters are small as hell, and I basically have to change the resolution in macOS.

So why are ARM Macs terrible with external monitors?

This blog post may help.

I have no idea why @Xiao_Xi's post doesn't have more likes, when it's basically THE answer to this whole thread.

Here's another article: https://bjango.com/articles/macexternaldisplays/
It's a shorter, earlier version.

This YouTube video also talks about the issue, and how the guy in it went from a 4K display back to a 2560x1440 display:


So the issue is, macOS is optimized for ~110 ppi or ~220 ppi. That's why Apple chooses such strange pixel counts on all of their Retina displays (5K at 27", 6K at 32", 4K at 21.5", and roughly 4.5K for the ~23.5" panel of the 24" iMac M1). They don't care about the resolution or the "K" number; they want to stay around 220 ppi.
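If you want to check that ~220 ppi pattern yourself, the arithmetic is simple (the diagonals below are approximate panel sizes, so treat the outputs as rough):

```swift
// Pixels per inch from pixel dimensions and (approximate) diagonal size in inches.
func ppi(width: Double, height: Double, diagonalInches: Double) -> Double {
    (width * width + height * height).squareRoot() / diagonalInches
}

// Apple's "strange" pixel counts all land close to 220 ppi:
print(ppi(width: 5120, height: 2880, diagonalInches: 27))   // ≈ 218 (5K 27")
print(ppi(width: 6016, height: 3384, diagonalInches: 32))   // ≈ 216 (6K 32")
print(ppi(width: 4096, height: 2304, diagonalInches: 21.5)) // ≈ 219 (4K 21.5")
print(ppi(width: 4480, height: 2520, diagonalInches: 23.5)) // ≈ 219 (24" iMac, ~23.5" panel)

// A common 27" 4K display, for comparison:
print(ppi(width: 3840, height: 2160, diagonalInches: 27))   // ≈ 163
```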

macOS has two icon sets: one for 110 ppi, another "HiDPI" (or high-resolution) one for 220 ppi.

If your monitor is at something like 160 ppi, you have a problem. macOS either uses the 110 ppi icon set and everything is super small, so you need to scale it up and it looks less sharp.
Or macOS uses the 220 ppi icon set, everything looks super big, and it needs to be scaled down (if that's even possible, I don't know).
Both can look bad, and/or also reduce system performance, because the GPU has to do extra render-scaling steps.

What usually happens is that the worse 110 ppi icon set is used and scaled up, so it looks bad.


There's a workaround with "BetterDisplay", where you can force the HiDPI icon set. Worth a shot if you have these issues.
 

Kazgarth

macrumors 6502
Oct 18, 2020
303
836
One thing that came to my mind is that subpixel AA is fundamentally incompatible with Apple's high-DPI rendering method. Apple renders into a super-resolution buffer and then resolves (downsamples) that buffer to the native display resolution. Any subpixel smoothing present in the original super-resolution buffer will become a weird smudge of random colors when resolved. That's probably the primary technical reason why Apple dropped it. Otherwise they would need to ship two graphics stacks or something like that.
While I'm not capable of proving or disproving your claim, it doesn't have to be an always-on feature.

They could implement it as an option with a toggle like in Linux.

[Attached screenshot: a font anti-aliasing toggle in a Linux settings panel]


This way you can please both the 90% peasants and the 10% elites with 5K+ retina monitors.
 
  • Like
Reactions: theorist9

leman

macrumors Core
Oct 14, 2008
19,237
19,130
While I'm not capable of proving or disproving your claim, it doesn't have to be an always-on feature.
They could implement it as an option with a toggle like in Linux.
[Attached screenshot: a font anti-aliasing toggle in a Linux settings panel]


This way you can please both the 90% peasants and the 10% elites with 5k+ retina monitors.

Yes, because Apple is all about this kind of toggle. Their design philosophy is literally "configurability is the root of all evil", which is also why their UI has historically been so polished.

The moment Apple starts adding such tweaks they are done as a company.
 

Kazgarth

macrumors 6502
Oct 18, 2020
303
836
Yes, because Apple is all about this kind of toggle. Their design philosophy is literally "configurability is the root of all evil", which is also why their UI has historically been so polished.

The moment Apple starts adding such tweaks they are done as a company.
One, just one extra toggle (for font AA) in the display options won't suddenly change the whole design philosophy of macOS.

They added tons of new customizations in the macOS Ventura System Settings.

Why is this one (which would be the biggest quality-of-life upgrade for 90% of desktop Mac users) suddenly such a big deal that it would break their design philosophy?
 

Gudi

Suspended
May 3, 2013
4,590
3,265
Berlin, Berlin
Why is this one (which would be the biggest quality-of-life upgrade for 90% of desktop Mac users) suddenly such a big deal that it would break their design philosophy?
Because that's how it is. These are all the settings you get and all the settings you will ever need. Macs can't be improved by tweaking the settings. That's the whole purpose of macOS. You can make the text bigger or smaller, but it always looks perfect. Until you insist on running a 163 ppi display. Don't do that. It's gross!
[Attached screenshot: the macOS Displays settings pane]
 

theorist9

macrumors 68040
May 28, 2015
3,702
2,804
I have to admit that I never understood this claim. At the typical desktop viewing distances of 80-100 cm, a 32" 16:9 4K has comparable angular pixel density to the original Retina 15" at the typical laptop viewing distances of 50-60 cm. I don't sit that close to my desk monitor and I don't have any problem with text aliasing on my 32" 4K monitor. I understand that someone who sits closer is likely to notice artifacting, but why get a large display and then stick your nose in it?
And I don't understand why you'd want to change your comfortable reading distance just because you're using a larger display. I use 50-60 cm for both my laptop and my desktop. Three points:

1) Even if you are viewing a 27" display at 80-100 cm, if you have good vision, you can still benefit from a 220 ppi pixel density:

Quoting from https://sid.onlinelibrary.wiley.com/doi/full/10.1002/jsid.186:
"This study determines an upper discernible limit for display resolution. A range of resolutions varying from 254–1016 PPI were evaluated using simulated display by 49 subjects at 300 mm viewing distance. The results of the study conclusively show that users can discriminate between 339 and 508 PPI and in many cases between 508 and 1016 PPI." [1] Spencer, Lee, et al. "Minimum required angular resolution of smartphone displays for the human visual system." Journal of the Society for Information Display 21.8 (2013): 352-360.

300 mm is 1/3 of 90 cm, so what the latter users can see corresponds to the ability to distinguish between 170 ppi and 340 ppi at 90 cm.
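For anyone who wants to check that conversion, it's just the small-angle scaling (my own arithmetic, not taken from the paper):

```swift
// Small-angle scaling: a ppi threshold measured at one distance maps to
// (referenceDistance / viewingDistance) times that ppi at another distance.
func equivalentPPI(thresholdPPI: Double, referenceMM: Double, viewingMM: Double) -> Double {
    thresholdPPI * referenceMM / viewingMM
}

print(equivalentPPI(thresholdPPI: 508,  referenceMM: 300, viewingMM: 900)) // ≈ 169 ppi
print(equivalentPPI(thresholdPPI: 1016, referenceMM: 300, viewingMM: 900)) // ≈ 339 ppi
```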

I know the fact that you can't see the pixel structure at these densities and distances means the above results don't make sense to you. But that's not the right criterion to determine the cutoff at which higher density has no effect. To understand this better, I'd suggest reading the article.

2) Based on human near-peripheral vision, a 50-60 cm distance for a 27" display is within comfortable human viewing distances:

Sure, if the monitor is so large (like HDTV size) that a 50 cm viewing distance puts the edges outside your near-peripheral vision, I'll sit further away. But 27" isn't that big. The near peripheral spans 60° (https://commons.wikimedia.org/w/index.php?curid=37052186), and you can comfortably view a screen that subtends that angle. For a 27" 16:9, that's a 52 cm viewing distance.
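A rough check of that 52 cm figure (my own geometry, assuming the full panel width should subtend 60°):

```swift
import Foundation

// Distance at which a 27" 16:9 panel's width subtends 60 degrees
// (the "near peripheral" span mentioned above).
let diagonalInches = 27.0
let widthInches = diagonalInches * 16.0 / (16.0 * 16.0 + 9.0 * 9.0).squareRoot() // ≈ 23.5"
let distanceInches = (widthInches / 2.0) / tan(30.0 * Double.pi / 180.0)          // ≈ 20.4"
print(distanceInches * 2.54) // ≈ 52 cm
```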

3) Doubling the viewing distance when you switch to a large display seems to me to defeat the whole purpose of having a large display, which is to be able to view more information at once:

Let's suppose you're using a comfortable font size on a 15" laptop screen to view, say, a spreadsheet. If, when you switch to a 27" display, you choose to sit twice as far away, then to have the same comfortable font size, you need to make the fonts twice as big, which means you'll need to double the linear dimensions to display the same thing. So you won't be able to view any more rows or columns on a large display than on a small display. This makes no sense to me. I'd rather keep the font size the same, so I can view twice as many columns and rows. Indeed, with the spreadsheets I create, I'm often just at the cutoff of being able to view all of the columns on a 27" display.
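To put rough, purely illustrative numbers on that:

```swift
// Toy calculation: doubling the viewing distance forces you to double the
// font's physical size to keep the same apparent size, so the bigger display
// shows no more rows than the laptop did. Sizes below are purely illustrative.
func visibleRows(screenHeightInches: Double, rowHeightInches: Double) -> Double {
    screenHeightInches / rowHeightInches
}

let laptop   = visibleRows(screenHeightInches: 8.2,  rowHeightInches: 0.12) // 15" laptop at ~55 cm
let farDesk  = visibleRows(screenHeightInches: 13.2, rowHeightInches: 0.24) // 27" at ~110 cm, font doubled
let nearDesk = visibleRows(screenHeightInches: 13.2, rowHeightInches: 0.12) // 27" at ~55 cm, same font
print(laptop, farDesk, nearDesk) // ≈ 68, 55, 110 rows
```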

Finally, I never understood why people who disagree on this subject feel the need to use ad hominem "nose-based" characterizations of the other side, like "pixel sniffers" or "stick your nose in it". I wish this would stop.
 
Last edited:
  • Like
Reactions: bcortens

theorist9

macrumors 68040
May 28, 2015
3,702
2,804
Yes, because Apple is all about this kind of toggle. Their design philosophy is literally "configurability is the root of all evil", which is also why their UI has historically been so polished.
Not so. You can find toggles for display preferences, like "Increase Contrast", within Accessibility:

[Attached screenshot: the Accessibility display settings with the "Increase Contrast" toggle]


And macOS also allows you to toggle font smoothing (though they've moved it from System Preferences to Terminal):


[Attached screenshot: toggling font smoothing from Terminal]


The moment Apple starts adding such tweaks they are done as a company.
That's hyperbole.
 

bcortens

macrumors 65816
Aug 16, 2007
1,228
1,568
Ontario Canada
And I don't understand why you'd want to change your comfortable reading distance just because you're using a larger display. I use 50-60 cm for both my laptop and my desktop. Three points:

1) Even if you are viewing a 27" display at 80-100 cm, if you have good vision, you can still benefit from a 220 ppi pixel density:

Quoting from https://sid.onlinelibrary.wiley.com/doi/full/10.1002/jsid.186:
"This study determines an upper discernible limit for display resolution. A range of resolutions varying from 254–1016 PPI were evaluated using simulated display by 49 subjects at 300 mm viewing distance. The results of the study conclusively show that users can discriminate between 339 and 508 PPI and in many cases between 508 and 1016 PPI." [1] Spencer, Lee, et al. "Minimum required angular resolution of smartphone displays for the human visual system." Journal of the Society for Information Display 21.8 (2013): 352-360.

300 mm is 1/3 of 90 cm, so what the latter users can see corresponds to the ability to distinguish between 170 ppi and 340 ppi at 90 cm.

I know the fact that you can't see the pixel structure at these densities and distances means the above results don't make sense to you. But that's not the right criterion to determine the cutoff at which higher density has no effect. To understand this better, I'd suggest reading the article.

2) Based on human near-peripheral vision, a 50-60 cm distance for a 27" display is within comfortable human viewing distances:

Sure, if the monitor is so large (like HDTV size) that a 50 cm viewing distance puts the edges outside your near-peripheral vision, I'll sit further away. But 27" isn't that big. The near peripheral spans 60° (https://commons.wikimedia.org/w/index.php?curid=37052186), and you can comfortably view a screen that subtends that angle. For a 27" 16:9, that's a 52 cm viewing distance.

3) Doubling the viewing distance when you switch to a large display seems to me to defeat the whole purpose of having a large display, which is to be able to view more information at once:

Let's suppose you're using a comfortable font size on a 15" laptop screen to view, say, a spreadsheet. If, when you switch to a 27" display, you choose to sit twice as far away, then to have the same comfortable font size, you need to make the fonts twice as big, which means you'll need to double the linear dimensions to display the same thing. So you won't be able to view any more rows or columns on a large display than on a small display. This makes no sense to me. I'd rather keep the font size the same, so I can view twice as many columns and rows. Indeed, with the spreadsheets I create, I'm often just at the cutoff of being able to view all of the columns on a 27" display.

Finally, I never understood why people who disagree on this subject feel the need to use ad hominem "nose-based" characterizations of the other side, like "pixel sniffers" or "stick your nose in it". I wish this would stop.
This is exactly how I use my 27" display - at roughly the same distance from my face as my 14" MBP - and for exactly the same reason: I want to be able to have more content on screen.
 
  • Like
Reactions: theorist9

theorist9

macrumors 68040
May 28, 2015
3,702
2,804
One word: transparency. They wouldn't have had to remove it if they hadn't brought so much transparency into Mojave when they added dark mode, but they made a choice between transparency without color artifacts, or no transparency, and they chose the former.

Subpixel antialiasing works because you know what's underneath when you draw the character, and you can do the blending right then and there. Once the character is drawn, the context is effectively lost. However, with the advent of Core Animation and GPU-accelerated drawing, something like a text field may no longer sit on a solid background, especially if that background is a "material" that blends with the other content on the desktop (such as the translucent backgrounds added in Mojave). Then it becomes impossible to know how to do the subpixel blend when drawing the character, as it is only known at final compositing time, where you no longer have the context required to blend it properly.

One project I worked on had to selectively disable the subpixel antialiasing depending on exactly what was happening with text. If we detected that the background was solid fill, we enabled subpixel AA. But if the background was clear/transparent, which meant a clear background CALayer, we had to disable it, or it'd look even worse with color fringing where the subpixel AA assumed the wrong background color for the pixel. Something I also encountered playing with Core Animation when it first came out.

This is the same issue Apple was facing. In Mojave, if you enabled subpixel AA and turned dark mode on, you could see the color fringing from the text. In light mode it wasn't really as apparent since the AA was assuming a white background which is generally "close enough" even with some light translucency.
Yeah, one other poster mentioned that as an issue. In that case, Apple could have made it available as an option when you activate Accessibility's "Increase Contrast", since that removes transparency anyway. Doing that would make sense to me, since those who really care about text sharpness probably wouldn't mind activating Increase Contrast, as the latter also makes text easier to read.

The other issue is that it requires an RGB subpixel order (as opposed to BGR). But that was rarely an issue back when they had subpixel AA; so unless BGR monitors have become more common, it shouldn't have been an issue when they decided to eliminate it. And even if it were, I'm wondering if the OS could tell the order from the monitor metadata, and just notify the user if it can't be activated on that display.

I know they're not going back to offering subpixel AA. Given this, I think Apple should find a way to offer consumer-priced Retina external displays (equivalent in quality to those in the new iMac). There are currently none on the market. Potential buyers would be any Mac owner that doesn't do critical photography or video work, including Mini and Studio owners, as well as Air and MBP buyers who want a large external display for home use.
 
Last edited:

maflynn

macrumors Haswell
May 3, 2009
73,552
43,528
The moment Apple starts adding such tweaks they are done as a company.
Do you really believe that they would suddenly go under because they added options?

As mentioned there are already options in the accessibility section.

You continue to make logical backbends to justify Apple where Apple is failing to do something that every other mainstream OS handles seamlessly.

Sometimes Apple gets it wrong, and there's no harm in admitting that.

The video below provides examples of macOS shortcomings in areas that most other operating systems have handled for years and years. My point is that Apple and macOS are not perfect, and handling monitors is just another example.

Arguments about subpixels, buying 4K monitors, or foreground processing being too complex are just attempts to obfuscate the fact that Apple's handling of monitors is inferior to other platforms'.
 

theorist9

macrumors 68040
May 28, 2015
3,702
2,804
One thing that came to my mind is that subpixel AA is fundamentally incompatible with Apple's high-DPI rendering method. Apple renders into a super-resolution buffer and then resolves (downsamples) that buffer to the native display resolution. Any subpixel smoothing present in the original super-resolution buffer will become a weird smudge of random colors when resolved. That's probably the primary technical reason why Apple dropped it. Otherwise they would need to ship two graphics stacks or something like that.
Then why have I never noticed this with the Retina display on my 2014 MacBook Pro, which runs High Sierra and thus has subpixel AA?

More generally, subpixel AA was on all Macs released through Sept 2018 (which is when Mojave was released). The Retina MacBook Pro was introduced in Sept 2012, and the Retina iMac was introduced in October 2014. So that's six years and four years, respectively, with subpixel AA on high-DPI displays, and I don't recall complaints of the type you describe about those displays. On the contrary, they were highly lauded.
 

leman

macrumors Core
Oct 14, 2008
19,237
19,130
And I don't understand why you'd want to change your comfortable reading distance just because you're using a larger display. I use 50-60 cm for both my laptop and my desktop.

If a larger display has the same effective resolution as a smaller display, isn't it kind of obvious that it is supposed to be viewed from further away? I mean, there is also the size of the desk, the question of ergonomics, etc. The numbers I quoted are recommended viewing distances for different types of screens.

But of course, I perfectly understand that people can use displays differently. But if one wants a larger display to fit more content on it, one will need higher pixel density anyway. It is entirely unreasonable to expect higher image quality on a low-DPI display, regardless of any AA tricks. Then again, I digress. This very discussion makes it clear that Apple doesn't cater to all users' circumstances. Again, a business decision that one can view differently.

Not so. You can find toggles for display preferences, like "Increase Contrast", within Accessibility:

Accessibility ≠ configurability. The first is about providing legibility options for users with special needs. The second is about letting the user choose the technology stack or its parameters. In other words, the system cannot decide for you whether you need black borders around things because you have difficulty discerning them otherwise. But the system should be able to automatically offer you the best possible rendering method based on your hardware.

The thing is, Apple could have continued to use subpixel AA on low-res screens and disable it on retina screens, just like they did for years. But they decided to remove it altogether. Which means they simply don't care about this specific configuration. So what's even the point of talking about toggles that were never even necessary in the first place?

That's hyperbole.

Of course it is. I mean, I've been posting on these forums for a while, people should know by now that I tend to overdo it :D


Do you really believe that they would suddenly go under because they added options?

I think if they started adding configuration options like these it would signal that they are losing their edge. One of the cornerstones of Apple's vision is a certain arrogance along the lines of "we know better". It doesn't always work out, that's for sure, but that is what distinguishes them from the copycats. They're not afraid to make choices and they are not afraid to rapidly obsolete old technology. I mean, it's 2022, high-res displays are everywhere and cheap, so why would they give themselves extra work by showing consideration to a group of users they care little about?

Arguments about subpixels, buying 4K monitors, or foreground processing being too complex are just attempts to obfuscate the fact that Apple's handling of monitors is inferior to other platforms'.

Only under a very narrow definition of "handling". If you consider all the potential use cases for all the possible hardware, sure, it's inferior. And also completely irrelevant, because that's not what they are about.

Then why have I never noticed this with the Retina display on my 2014 MacBook Pro, which runs High Sierra and thus has subpixel AA?

Was subpixel AA even ever enabled on retina Macs?
 

Pressure

macrumors 603
May 30, 2006
5,063
1,399
Denmark
That's some nitpicking right there.

I personally don't care about window snapping, full-screen behaviour, natural scrolling, or the monitor limitation on the low-end chips (which will disappear when they get support for TB4). The bigger issue is having worse support for non-standard monitors (ultra-wide monitors, 8K monitors, etc.).

They are even complaining that applications are installed in the Applications folder by default ...

It's clear they have very little real-life experience with macOS, and out of the 80 people Linus employs, only 5 use macOS as their daily OS.

The video and comments section clearly come off as nitpicks from Windows users trying macOS.

It's hard to change habits, and I personally can't stand the window-snapping "feature" the times I have been forced to use Windows, for example.
 
Last edited:

theorist9

macrumors 68040
May 28, 2015
3,702
2,804
Was subpixel AA even ever enabled on retina Macs?
Yes, it was on by default, including on Retina displays. I just reconfirmed this using the Digital Color Meter to inspect black text on my 2014 Retina MBP running High Sierra.

More specifically, you could toggle it on or off by selecting or deselecting Font Smoothing. And Font Smoothing was on by default. The caveat is that Apple also had a default cutoff of, IIRC, about 6 points, below which AA would not be implemented. That cutoff was, however, user-adjustable. See also:

The thing is, Apple could have continued to use subpixel AA on low-res screens and disable it on retina screens, just like they did for years.
But that's not what they did. Like I said, it was on by default even on Retina displays.
But of course, I perfectly understand that people can use displays differently. But if one wants a larger display to fit more content on it, one will need higher pixel density anyway.
No, you just need to have the same pixel density.
It is entirely unreasonable to expect higher image quality on a low-DPI display, regardless of any AA tricks. Then again, I digress. This very discussion makes it clear that Apple doesn't cater to all users' circumstances. Again, a business decision that one can view differently.
The position you're arguing against is never one I advocated--it's a straw man. It gets tiresome to have to keep repeating this, but let's try it one last time:

First, let's define some terms: Low-res is ~90-110 ppi. Mid-res is ~160 ppi (e.g., 27" 4k). High-res is Retina (~220 ppi). [You may consider 160 ppi high-res, but the distinction is important here.] Proceeding:

1) I never advocated subpixel AA as a way to get sharp text on low-res screens. Instead, I advocated it as a way to get sharp text on mid-res screens. So when Apple abandoned subpixel AA, they weren't abandoning support of customers using low-res screens (that happened after Snow Leopard, which was the last OS that made text on low-res screens look good). Rather, they were abandoning support of customers using mid-res screens.

Specifically, with subpixel AA, which theoretically increases the horizontal resolution by 3x (the actual increase is certainly less), you could have a 4k 27" whose text was comparable in sharpness to a 5k 27" without subpixel AA. I.e., what subpixel AA offered was a way for the customer to have that "Retina" text sharpness with a $500 commodity 27" 4k display. Now, without subpixel AA, those same customers are required to shell out ~$1500 to get the same quality of user experience. That's a big ask for a Mini or MBA customer who wants an external display.

They're not going to bring back subpixel AA, so given that Apple's current OSes really need a Retina display to be properly experienced (that "curated" user experience), I think they should correspondingly offer consumer-priced large Retina displays, which the ASD is not.

[Some background: in the 00's, I happily used a low-res screen with MacOS, up through Snow Leopard. But Apple changed its text rendering with the next OS (I don't know what the specific change was), which meant text no longer looked good on low-res screens, even with the subpixel AA. That was Apple's first key transition: For text to look good, low-res screens no longer worked, forcing the customer to upgrade to mid-res. [The actual cutoff for looking decent was ~130 ppi, since it looked OK with the 1680x1050 upgraded display option on my 2011 15.4" MBP, which was 129 ppi.] After I got a 4k 27", I was again happy on MacOS until the 2nd key transition, which happened after High Sierra. Then, with the loss of subpixel AA, mid-res screens no longer looked great, so you needed to upgrade to high-res.]

If a larger display has the same effective resolution as a smaller display, isn't it kind of obvious that it is supposed to be viewed from further away? I mean, there is also the size of the desk, the question of ergonomics, etc. The numbers I quoted are recommended viewing distances for different types of screens.
You're a smart guy, so I don't know how you've gotten yourself so confused about what I've been saying, since I think I've been crystal-clear. When I say I use my 15" and 27" at the same distance, I'm referring to the 15" Retina on my MBP, and the 27" Retina on my iMac. These are not the same resolution! The 27" is about twice as big, with the same ppi, so it has about twice the pixels in each direction. You're talking as if I were comparing a 15" Retina to a 27" HD, which I clearly wasn't. Given I have about four times the screen area on my iMac as my MBP, but the same ppi, I can indeed display four times as much content on the iMac—but only if I keep the font sizes the same, which in turn requires I keep the reading distance the same. Do you understand?

I simply don't understand buying a 27" Retina external for your 15" MBP, but then moving it twice as far away, such that you can't display any more readable content on it than on your MBP—at least if you're trying to get work done (spreadsheets, calculations, coding, document prep). If you're just surfing the internet, OTOH, I suppose it doesn't really matter—but then you don't need a big display anyways.

Now the cool thing is that, with subpixel AA on my 27" 4k, I had nearly the sharpness I get with my 27" 5k currently. Which meant I could use the same viewing distance for the former as the latter, and thus display the same amount of content*, and thus have the same productivity. I.e., subpixel AA allowed me to use my $500 27" 4k as if it were a $1600 ASD (at least when it came to text; I expect the ASD is better for photography and graphics).

[*More precisely, nearly the same; getting the high sharpness also requires running 2:1 scaling, which results in a bigger UI than on the 5k; but since, with my apps, the UI is only ~5-10% of the real estate, a 40% increase in UI size means only a 2-4% reduction in real estate.]
 
Last edited:
  • Like
Reactions: ericwn

leman

macrumors Core
Oct 14, 2008
19,237
19,130
@theorist9 Thanks for the elaborate, informative post! I think there was a bit of a communication breakdown, since maybe our talking points got a bit lost. In particular, we appear to be talking about different comparison anchors. If I understand you correctly, you were primarily comparing laptop and desktop displays with the same PPI (like the retina MBP and the retina iMac), while I was comparing different-size displays of similar resolution (e.g. a retina MBP and a 4K 27"), since the topic revolved around the quality of text rendering on cheaper mid-resolution third-party 4K displays (140+ PPI).

Thanks also for the clarification about subpixel AA on the original retina Macs. It's interesting that the SE discussion you posted confirms what I have been saying — due to the downsampling step in the macOS rendering process, subpixel AA by definition cannot provide accurate rendering at scaled resolutions. I can certainly imagine that it might offer some additional subjective smoothing on mid-resolution displays simply because the image will be more blurred, but then again, Apple's approach to text rendering was always about accuracy and not subjective preference. From a technical standpoint, you'd be shipping a broken renderer.

So in the end, we are left with a bit of a conundrum. Low-resolution displays are no mystery: they just don't have the spatial resolution, and they definitely benefit from the resolution increase from subpixel AA. High-resolution displays (what Apple ships) have the spatial resolution and don't need subpixel AA. And mid-resolution displays (third-party 4K etc.) are in a weird spot. Subpixel AA will only work "correctly" if you are using a non-scaled resolution (pixel doubling), but if you are using pixel doubling you are only getting an equivalent of Full HD, so you can forget about fitting more content on the screen, at which point you might as well increase the distance to the monitor and "smooth" it that way. On the other hand, if you want more content on the screen, you need to use a scaled resolution, and using subpixel AA would result in an objectively broken text render. Not that it would be very noticeable, of course...

In the end, it's a shame for the owners of low-DPI displays, as they definitely drew the short straw here.
 

theorist9

macrumors 68040
May 28, 2015
3,702
2,804
@theorist9 Thanks for the elaborate, informative post! I think there was a bit of a communication breakdown, since maybe our talking points got a bit lost. In particular, we appear to be talking about different comparison anchors. If I understand you correctly, you were primarily comparing laptop and desktop displays with the same PPI (like the retina MBP and the retina iMac), while I was comparing different-size displays of similar resolution (e.g. a retina MBP and a 4K 27"), since the topic revolved around the quality of text rendering on cheaper mid-resolution third-party 4K displays (140+ PPI).
I think you still don't understand me. Yes, I was comparing different-sized displays of different resolution, but I consider the 27" 4k and the 27" 5k to be in the same category when it comes to resolution, at least when subpixel AA was available. So that's what was lost when subpixel AA was lost—the ability to put a $500 27" 4k in the same performance category as a $1500 27" 5k, at least for text!

I.e., we have one category of resolution:

15" MBP, 2880 x 1800

And a second category:

4k 27", 3840 x 2160 (but effectively ~8,000 x 2160 with subpixel AA)

5k 27", 5120 x 2880 (subpixel AA also increases its horizontal resolution, but the effect is not as visible, b/c its pixel density is already pretty high).

It's the subpixel AA that allows you to group the 4k 27" and 5k 27" together when it comes to resolution--its horizontal resolution increase effectively converts the 4k 27" to ~8,000 x 2,160 (I'm reducing it from triple to double, since it's only triple in theory), and thus allows the 4k 27" to give equivalent resolution to the 5k 27" for text.

Again, subpixel AA significantly increases the effective resolution, allowing one to use a 27" 4k like a 27" 5k in terms of viewing distance. If you want to speak in terms of PPI, the 5k is 218 PPI horizontally and vertically, while the 4k with subpixel AA is like 325 PPI horizontally and 163 PPI vertically.
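For the arithmetic behind those numbers (using the same 2x-not-3x assumption for the subpixel gain):

```swift
// The effective-ppi arithmetic above, using a 2x (rather than the theoretical
// 3x) horizontal gain from subpixel AA, and an approximate 27" 16:9 panel size.
let panelWidthInches = 23.5
let panelHeightInches = 13.2

let horizontalPPI = 3840.0 / panelWidthInches        // ≈ 163
let verticalPPI = 2160.0 / panelHeightInches         // ≈ 163 (unchanged by subpixel AA)
let effectiveHorizontalPPI = horizontalPPI * 2.0     // ≈ 327 with subpixel AA

print(horizontalPPI, effectiveHorizontalPPI, verticalPPI)
```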

@theorist9 Thanks for the elaborate, informative post! I think there was a bit of a communication breakdown, since maybe our talking points got a bit lost. In particular, we appear to be talking about different comparison anchors. If I understand you correctly, you were primarily comparing laptop and desktop displays with the same PPI (like the retina MBP and the retina iMac), while I was comparing different-size displays of similar resolution (e.g. a retina MBP and a 4K 27"), since the topic revolved around the quality of text rendering on cheaper mid-resolution third-party 4K displays (140+ PPI).
Actually, that's what I've been trying to explain: it's not a limitation of pixel-doubled 4k @ 27" (163 ppi)--all you need to do is adjust your default font sizes (or app zoom settings) so the text is the same absolute size as on a pixel-doubled (i.e., default) 5k@27", and you can fit nearly the same content.

If you go from pixel-doubled 218 ppi to pixel-doubled 163 ppi, the UI size does increase by (218/163)* – 1 = 34%. However, it's only the UI size that needs to increase, and the UI (depending on your app) may take up only, say, 10% of the real estate. Thus you only lose about 3-4% of real estate when going to a pixel-doubled 163 ppi display—an insignificant amount.

*I didn't square this because the UI size increases vertically only, not horizontally.
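As a sanity check of those percentages (assuming, as above, the UI takes roughly 10% of the screen height):

```swift
// The real-estate arithmetic above, as numbers (the 10% UI share is an assumption).
let uiScaleIncrease = 218.0 / 163.0 - 1.0               // ≈ 0.34, i.e. the UI gets ~34% taller
let uiShareOfScreen = 0.10                              // assume the UI takes ~10% of the height
let realEstateLost = uiShareOfScreen * uiScaleIncrease  // ≈ 0.034, i.e. a ~3-4% loss
print(uiScaleIncrease, realEstateLost)
```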

Here are screenshots with Mathematica illustrating exactly that. These are top-to-bottom half-screen screenshots, with 4k (163 ppi) on left, and 5k (218 ppi) on right. These are both run at the default (2:1 integer) scaling, so 4k = "looks like" 1080p, and 5k = "looks like" 1440p.

In spite of the difference in pixel density, the difference in screen real estate is trivial—in both cases you can display the same number of equations (eight), the only difference being a slight shift in where the 9th equation is cut off. So the real-world loss in my actual working area is insignificant. Granted, this would not be the case if you had programs where the UI took up a much larger percentage of the screen, but my apps typically don't. [Note that I don't actually use the Palette; the drop-down is just shown for illustrative purposes.]


[Attached screenshot: Mathematica on 4K/163 ppi (left) and 5K/218 ppi (right), both at default 2:1 scaling]
 
Last edited:
  • Like
Reactions: Kazgarth

Krevnik

macrumors 601
Sep 8, 2003
4,100
1,309
Isn't one of the main reasons for "sharp text" on Windows that it doesn't respect the font metrics and instead snaps the shapes to the pixel grid? Or, to put it differently, it doesn't render the fonts correctly in order to make the letters appear sharper? After being spoiled by macOS's accurate text rendering I find Windows fonts absolutely horrendous.

Windows focuses on legibility over accuracy for sure.

But the other factor is that when running at scale factors like 1.25x or 1.5x, you’re not also getting a bunch of full-screen AA applied from the downsampling of a 2x frame buffer. On Windows, either the app renders at 1x and gets upscaled (not great) or it renders at the expected scale factor. No downsampling. It does help, and it’s this non-integer scaling scenario I was focused on with my comment.

I see the same effect with images on scaled resolutions on the Mac as well, where fine detail is just harder to work with in photo editors because of the smoothing effect. Screens that are 200+ ppi help minimize this, since you're already getting as much detail as the eye will take in without planting your nose on the screen, but it's more noticeable on 163 or 137 ppi displays.

Yeah, one other poster mentioned that as an issue. In that case, Apple could have made it available as an option when you activate Accessibility's "Increase Contrast", since that removes transparency anyway. Doing that would make sense to me, since those who really care about text sharpness probably wouldn't mind activating Increase Contrast, as the latter also makes text easier to read.

Yeah, as I mentioned, this was a design choice. There's a technical reason why they had to make a design choice, but ultimately it was Apple's decision to do it as they did.

I suspect that with the push for HiDPI displays across the lineup, they figured it was a reasonable trade off. But I’d love to see the analytics reports on how many users are using different types of display modes. If they collect data on usage of displays, you can bet the numbers helped guide the decision. Or at the very least, made it harder for the folks arguing to keep subpixel AA in some form to justify the work.

Given this, I think Apple should find a way to offer consumer-priced Retina external displays (equivalent in quality to those in the new iMac). There are currently none on the market. Potential buyers would be any Mac owner that doesn't do critical photography or video work, including Mini and Studio owners, as well as Air and MBP buyers who want a large external display for home use.

This is where I really think Apple miscalculated with their approach to higher resolutions. I think they expected this to go over better in the larger market and prices to come down faster, but they didn’t. A sort of “Mac Pro 2013” moment with their display tech.

End result is that MSFT’s approach of going after full resolution independence gives a better result with today’s selection of displays than Apple’s.
 
Last edited:

Robdmb

macrumors regular
Nov 5, 2008
242
28
Are most of the issues when using scaling modes? If, for example, you have a 1440p non-Retina display (say, a 3440 x 1440), would there be issues running it at its native resolution?
 

pshufd

macrumors G3
Oct 24, 2013
9,947
14,438
New Hampshire
Are most of the issues when using scaling modes? If, for example, you have a 1440p non-Retina display (say, a 3440 x 1440), would there be issues running it at its native resolution?

If the resolution is supported, then no. I'm running a U2515H, which is a 25-inch QHD display, at native resolution, and it is fine.
 

theorist9

macrumors 68040
May 28, 2015
3,702
2,804
End result is that MSFT’s approach of going after full resolution independence gives a better result with today’s selection of displays than Apple’s.
Two things I'm not clear on:

With Apple, you only get optimum sharpness if you pick an integer scaling factor. With Windows, is the sharpness independent of the scaling factor you choose?

Windows has something called "Make text bigger". I can't test what it does, because I'm using Windows remotely, but does this change the UI size independent of the scaling factor? If so, why can't Apple do this as well—is this part of not having "resolution-independence"?
[Attached screenshot: the Windows "Make text bigger" setting]

I suspect that with the push for HiDPI displays across the lineup, they figured it was a reasonable trade off. But I’d love to see the analytics reports on how many users are using different types of display modes. If they collect data on usage of displays, you can bet the numbers helped guide the decision. Or at the very least, made it harder for the folks arguing to keep subpixel AA in some form to justify the work.
I'm wondering if they don't care about the numbers, and if it's instead more their general business philosophy of only caring about how Macs look on displays they sell themselves.

For instance, when Lion was released in July 2011, they changed the font rendering such that it required about 130 ppi to look good. That meant it didn't look good on any external display a Mac user was likely to own (4k was rare, and Macs couldn't drive it until 2013), but Apple didn't care, since it did look good on Apple's own high-res displays (the high-res option on the 15" MBP, and the standard display on the 17" MBP, both of which were 130 ppi). [The ACD, whose 106 ppi was too low to look good with Lion, was discontinued the same day Lion was released.]
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,237
19,130
Two things I'm not clear on:

With Apple, you only get optimum sharpness if you pick an integer scaling factor. With Windows, is the sharpness independent of the scaling factor you choose?

Windows has something called "Make text bigger". I can't test what it does, because I'm using Windows remotely, but does this change the UI size independent of the scaling factor? If so, why can't Apple do this as well—is this part of not having "resolution-independence"?
[Attached screenshot: the Windows "Make text bigger" setting]

It's a question of how resolution independence is achieved. In the traditional rendering model there are just pixels — you get your drawing surface, it has certain pixel dimensions, you want to draw a 1-pixel-thick line, and so you do it. If you want resolution independence, things get more complicated, as you need to distinguish between the logical coordinates/sizes (how "large" stuff looks) and the actual size in physical (hardware) pixels on screen. One obvious approach is to say that a single "logical" pixel corresponds to "x" hardware pixels, where x is some real number. That's essentially what Windows does. Apple was actually one of the first to experiment with this technology; they had a working implementation of fractional backing scale factors somewhere around Lion, I forget. The problem, though, is that this kind of fractional mapping makes rendering complicated. Applications that do their own rendering, especially, can break and might need to have parts of them redesigned. There are also non-trivial performance implications. It's simple enough to code a line-drawing algorithm that draws a line of fixed thickness. It's not that simple to code a line-drawing algorithm that draws a line of arbitrary thickness (which you need to support arbitrary backing scale factors).

So Apple took a different approach. They fixed their backing scale factors to be integer (2x or 3x currently), which simplifies algorithms and the software transition, while still allowing them to emulate arbitrary scaling (via downsampling). This decision was the main reason why most Mac software could transition to Retina graphics as quickly as it did. This approach also results in superb rendering quality most of the time, since you are essentially using supersampling AA for scaled resolutions. There are drawbacks of course, memory usage for render surfaces and memory bandwidth being the main concerns. As to performance, I don't think anyone has done detailed measurements. It is true that rendering to a higher-resolution target is generally slower, but it's not obvious that it is slower than implementing backing-scale-factor-agnostic rendering algorithms everywhere (in practice it might even end up being faster). The performance overhead of downsampling tends to be massively exaggerated: GPUs are very good at linear filtering, and those resolutions and frame rates are not even close to being a challenge, not to mention that you only need to process the small areas of the screen that have changed between two frames. Bandwidth concerns are real though, which is what we see with the M1 and multiple high-res displays trying to do GPU-intensive work. But then again, it's not like any comparable GPU would fare any better in a similar scenario, retina rendering or not...
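To make the buffer sizes concrete, here's a small sketch of the scaled-resolution arithmetic described above (my own illustration, not actual API calls):

```swift
// macOS-style: pick a "looks like" size, render it at an integer 2x backing
// scale, then downsample that buffer to the panel's native pixel grid.
let nativeWidth = 3840, nativeHeight = 2160        // 27" 4K panel
let looksLikeWidth = 2560, looksLikeHeight = 1440  // the scaled setting the user picks

let backingScale = 2                               // always an integer on modern macOS
let bufferWidth = looksLikeWidth * backingScale    // 5120
let bufferHeight = looksLikeHeight * backingScale  // 2880

print("render \(bufferWidth)x\(bufferHeight), then resolve to \(nativeWidth)x\(nativeHeight)")

// Windows-style, by contrast, keeps a single native-size buffer and hands apps
// a fractional scale factor (1.25x, 1.5x, ...): nothing gets downsampled, but
// every drawing path has to cope with the fractional factor itself.
```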
 