The manufacturer isn't Samsung. It's BOE. They have their own technology, physically different from IPS but visually equivalent.

Also, I am literally looking at this monitor right now. It has IPS-like viewing angles.

I won't reply further as I have a sense you seem to think I'm lying for some strange reason.
No, not at all--I don't think you're lying. I was pressing you because you were making claims that (at least to me) seemed surprising, without providing evidence to back them up, so I was asking for that evidence. I didn't think you were lying; rather, I suspected you were making assumptions without having determined whether those assumptions were correct.

And it turns out I was right to press for evidence, since at first you were saying that the display you linked was IPS, and now, after pressing you, it turns out it's not IPS; it's some technology used by BOE that you are claiming is visually equivalent to IPS. I know you think they're equivalent, but have you done a sufficiently careful comparison to establish that? Whether it is or not will require some independent verification. [Yes, more evidence!]

And I know Samsung isn't the manufacturer of this panel. My point is your claim that only LG could use the term IPS is wrong, since Samsung uses it for their panels. So if BOE's panel was IPS, they could use that term as well.
 
I'm not an expert on Windows, but it's my understanding that, because of vectorized scaling, it can render displays at non-integer scaling ratios without suffering the artifacts seen with MacOS. Is that not correct?
I own a Surface Pro 2, which I don't use anymore. Unlike macOS, Windows requires the developer to add scaling support for Win32 apps, which is basically most apps. Even if you set the scaling, if the developer doesn't add support, the app will either not scale or get blown up and look blurry. While Windows 10 improved support for this, it's still hit or miss, as there are limitations. I believe UWP apps handle scaling natively, but nobody really uses UWP apps. I needed to set scaling to make the device usable, since 1080p at 10 inches is small without scaling. At least it had a screen close to Retina at 208 DPI.

That is why macOS's approach is better: the developer doesn't need to do anything to add High-DPI support. But it requires 218 DPI monitors for best results.
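To make the 218 DPI point concrete, here's a toy sketch (my own illustrative numbers, not any Apple API) of the 2x-render-then-downsample arithmetic macOS uses for HiDPI:

```python
def hidpi_framebuffer(logical_w, logical_h):
    # macOS renders HiDPI content into a buffer at exactly 2x the "looks like" size
    return logical_w * 2, logical_h * 2

def resample_ratio(native_w, fb_w):
    # how the 2x framebuffer maps onto the panel's native pixels
    return native_w / fb_w

fb_w, fb_h = hidpi_framebuffer(2560, 1440)  # "looks like 2560x1440" -> 5120x2880 buffer

# 27" 5K panel (5120x2880, ~218 ppi): ratio 1.0, pixel-perfect
print(resample_ratio(5120, fb_w))  # 1.0

# 27" 4K panel (3840x2160, ~163 ppi) at the same UI size: ratio 0.75,
# a non-integer resample, which is where the softness comes from
print(resample_ratio(3840, fb_w))  # 0.75
```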
 
Thanks for letting me know. I don't use Windows directly myself (I sometimes have to remote into Windows computers), so I wasn't aware of this.

I see you found a Windows setting somewhere that enables "ClearType" but that no longer means that anything is doing subpixel-AA. The whole situation is kind of a mess. From Wikipedia:

"The font rendering engine in DirectWrite supports a different version of ClearType with only greyscale anti-aliasing,[29] not color subpixel rendering, as demonstrated at PDC 2008.[30] This version is sometimes called Natural ClearType but is often referred to simply as DirectWrite rendering (with the term "ClearType" being designated to only the RGB/BGR color subpixel rendering version).[31] The improvements have been confirmed by independent sources, such as Firefox developers;[32] they were particularly noticeable for OpenType fonts in Compact Font Format (CFF).[33][34]"

Windows is a dumpster fire of overlapping development stacks and APIs. It's possible (likely) that the APIs to do subpixel-AA font rendering are still built into Windows, but that doesn't mean anybody is using them. Win32 apps render fonts differently than UWP apps, etc. etc.

I'm not an expert on Windows, but it's my understanding that, because of vectorized scaling, it can render displays at non-integer scaling ratios without suffering the artifacts seen with MacOS. Is that not correct?

In Windows, individual programs can signal to the OS that they are aware of the OS's scaling settings and will manage their own scaling. What that usually means is that they will try to draw stuff on pixel boundaries, even when that's arguably not actual "scaling," but more, "try to make things look nicer when they're supposed to be bigger." Maybe this is what you're referring to?
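A toy sketch of the pixel-boundary idea (purely hypothetical, not any real Windows API): at a fractional scale like Windows' 125%, straight multiplication lands coordinates between device pixels, while a scaling-aware app effectively rounds each scaled coordinate onto the pixel grid, trading exact size for crisp edges:

```python
def naive_scale(logical_px, scale):
    # straight multiplication: can land between device pixels (blurry edges)
    return logical_px * scale

def snapped_scale(logical_px, scale):
    # what a scaling-aware app effectively does: round onto the pixel grid
    return round(logical_px * scale)

scale = 1.25  # Windows' "125%" setting
for x in (11, 13, 17):
    print(x, "->", naive_scale(x, scale), "vs snapped", snapped_scale(x, scale))
# e.g. 11 logical px -> 13.75 device px, snapped to 14: crisp, but sizes drift slightly
```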

What artifacts are you referring to re: MacOS scaling, BTW?
 
In Windows, individual programs can signal to the OS that they are aware of the OS's scaling settings and will manage their own scaling. What that usually means is that they will try to draw stuff on pixel boundaries, even when that's arguably not actual "scaling," but more, "try to make things look nicer when they're supposed to be bigger." Maybe this is what you're referring to?

What artifacts are you referring to re: MacOS scaling, BTW?
Perhaps I shouldn't have used the word artifacts. What I was referring to was the loss of sharpness one sees on MacOS when using non-integer scaling. My understanding (perhaps incorrect) is that Windows avoids this.

As to how Windows does that, I don't know. But it sounds plausible they could enable it the way you describe--forcing each glyph to fit into pixel boundaries. That would be consistent with Windows' overall glyph-rendering philosophy: It sacrifices shape and size to maintain sharpness. MacOS, by contrast, tries to maintain shape and size, but sacrifices sharpness to do so (https://damieng.com/blog/2007/06/13/font-rendering-philosophies-of-windows-and-mac-os-x/).

[This is a matter of taste; I prefer maximizing sharpness, while this author is of the opposite view. Though there does seem to be somewhat of a consensus among the commenters that Windows' approach is better than MacOS's with lower-DPI monitors, which is perhaps why Windows users are less in need of Retina than Mac users.]

I had an interchange with a developer of one of the MacOS scaling utilities (e.g., SwitchResX) (not that one, but I don't want to say which because I don't know if he wants to be publicly drawn into this), and he indicated one clear disadvantage of MacOS's approach to scaling: its HiDPI always uses a 2x framebuffer. Consequently, any scaling above 2:1 will produce an undersampled desktop, so the extra resolution will be wasted. This comes into play at the other extreme, with extremely high DPI monitors (e.g., Dell's 280 ppi 8k 32"), where you might want to use 3:1 scaling to get a comfortably readable UI size. He said Windows' vectorized approach doesn't have this limitation.
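To put rough numbers on the undersampling (my own arithmetic, assuming the always-2x framebuffer behavior he described): a 3:1-sized UI on that 8K panel means a "looks like 2560x1440" desktop, so macOS would render a 5120x2880 framebuffer and upscale it 1.5x to the 7680x4320 panel, leaving more than half the panel's pixels with no unique detail:

```python
panel = (7680, 4320)       # Dell 32" 8K
looks_like = (2560, 1440)  # the UI size a 3:1 scale would give

fb = tuple(2 * d for d in looks_like)   # HiDPI framebuffer is always 2x the logical size
upscale = panel[0] / fb[0]              # 1.5: the buffer must be upsampled to fit the panel
wasted = 1 - (fb[0] * fb[1]) / (panel[0] * panel[1])

print(fb, upscale, f"{wasted:.0%}")  # (5120, 2880) 1.5 56%
```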
 
Perhaps I shouldn't have used the word artifacts. What I was referring to was the loss of sharpness one sees on MacOS when using non-integer scaling. My understanding (perhaps incorrect) is that Windows avoids this.

As to how Windows does that, I don't know. But it sounds plausible they could enable it the way you describe--forcing each glyph to fit into pixel boundaries. That would be consistent with Windows' overall glyph-rendering philosophy: It sacrifices shape and size to maintain sharpness. MacOS, by contrast, tries to maintain shape and size, but sacrifices sharpness to do so (https://damieng.com/blog/2007/06/13/font-rendering-philosophies-of-windows-and-mac-os-x/).

[This is a matter of taste; I prefer maximizing sharpness, while this author is of the opposite view. Though there does seem to be somewhat of a consensus among the commenters that Windows' approach is better with lower-DPI monitors, which is perhaps why Windows users are less in need of Retina than Mac users.]

If we're just talking about text rendering, that's certainly what GDI tried to do. (Note that that link is from 2007.) Newer Windows development stacks have switched to a Mac-like (i.e., accurate) approach.

Also, from your other posts, it sounds like you haven't used Windows in a long time. I'm not sure how you came to the conclusion that you prefer GDI's text rendering.

I had an interchange with a developer of one of the MacOS scaling utilities (e.g., SwitchResX) (not that one, but I don't want to say which because I don't know if he wants to be publicly drawn into this), and he indicated one clear disadvantage of MacOS's approach to scaling: its HiDPI always uses a 2x framebuffer. Consequently, any scaling above 2:1 will produce an undersampled desktop, so the extra resolution will be wasted. This comes into play at the other extreme, with extremely high DPI monitors (e.g., Dell's 280 ppi 8k 32"), where you might want to use 3:1 scaling to get a comfortably readable UI size. He said Windows doesn't have this limitation.

Sure, but how many people are affected by that? How many people buy a 4K monitor and run it at e.g. 1280x720? The only reason I can think of for somebody to do that is if they have terrible eyesight. At which point, the issue seems moot, no?
 
If we're just talking about text rendering, that's certainly what GDI tried to do. (Note that that link is from 2007.) Newer Windows development stacks have switched to a Mac-like (i.e., accurate) approach.

Also, from your other posts, it sounds like you haven't used Windows in a long time. I'm not sure how you came to the conclusion that you prefer GDI's text rendering.
True, I haven't used Windows directly in a while. But the last time I checked, which was a couple of years ago, I found text on low-DPI (~100 ppi) monitors more tolerable on Windows than on the Mac (I didn't like it on either, but the Mac gave me more of a headache, especially when I needed to use small fonts). Is it no longer the case that Windows is sharper/better optimized than MacOS for low-DPI monitors?

If we're just talking about text rendering, that's certainly what GDI tried to do. (Note that that link is from 2007.) Newer Windows development stacks have switched to a Mac-like (i.e., accurate) approach.

Also, from your other posts, it sounds like you haven't used Windows in a long time. I'm not sure how you came to the conclusion that you prefer GDI's text rendering.

Sure, but how many people are affected by that? How many people buy a 4K monitor and run it at e.g. 1280x720? The only reason I can think of for somebody to do that is if they have terrible eyesight. At which point, the issue seems moot, no?
I explained in my post that it would be useful at the other end of the spectrum: Ultra-high DPI monitors (beyond Retina pixel density), like the 280 ppi Dell 32" 8k which, at 3:1 scaling, has the same UI size as a 187 ppi monitor at 2:1. And I think many would prefer a UI that's 14% larger than a 218 ppi Retina's (at 2:1) to one that's 28% smaller.
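The UI-size arithmetic here can be checked directly: physical UI size is governed by the panel's ppi divided by the scale factor (the "effective ppi"), and lower effective ppi means larger UI elements. A quick sketch of the numbers in this paragraph:

```python
def effective_ppi(panel_ppi, scale):
    # physical UI size is set by the panel's ppi divided by the integer scale factor
    return panel_ppi / scale

retina_2x = effective_ppi(218, 2)   # 109.0: the standard Retina UI size
dell8k_3x = effective_ppi(280, 3)   # ~93.3: same UI size as a 187 ppi panel at 2:1
dell8k_2x = effective_ppi(280, 2)   # 140.0: much denser, hence a smaller UI

print((retina_2x - dell8k_3x) / retina_2x)  # ~0.14: UI ~14% larger than Retina at 2:1
print((dell8k_2x - retina_2x) / retina_2x)  # ~0.28: at 2:1 the UI is ~28% smaller
```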

And the applicability of 3:1 scaling will increase as more such monitors hit the market and their prices come down, which they will as 8k becomes more common. Note also the earlier study I cited, which indicates people can see improvements in sharpness by going beyond Retina pixel densities. While the study said this applied to their subjects generally, this will be particularly true for those with good close vision. So I think being able to run a 32" 8k at 3:1 would make for a superb viewing experience if it were available on MacOS (without the undersampling) which, alas, it's not.
 
True, I haven't used Windows directly in a while. But the last time I checked, which was a couple of years ago, I found text on low-DPI (~100 ppi) monitors more tolerable on Windows than on the Mac (I didn't like it on either, but the Mac gave me more of a headache, especially when I needed to use small fonts). Is it no longer the case that Windows is sharper/better optimized than MacOS for low-DPI monitors?

As I've been saying, there's no such thing as "text in Windows." There are a bunch of different development stacks and APIs. Who knows what any given application will do for text rendering, or scaling.

I just looked at some screen shots from my Windows computer with scaling at 100%. Firefox, Notepad, and the command prompt do actually use subpixel-AA, but the colors are so faint that it seems like it would barely have any effect. Chrome, the system settings app, and various elements of the shell don't use subpixel-AA.

I didn't see text in any screen shot where effort was being made to align anything to pixel boundaries the way GDI used to. Not even the command prompt, which shocked me. I was sure they would still be using fixed-width bitmap fonts for the command prompt, but I guess not.

...
And the applicability of 3:1 scaling will increase as more such monitors hit the market and their prices come down, which they will as 8k becomes more common. ...

Well, when that does happen, I'm sure Apple will enable 3:1 HiDPI in MacOS, the same way they do on iPhones with bigger screens.
 
If you use Mactype on Windows, and create a rule to replace Segoe UI with Rubik (install it first), text quality is outstanding even on a 100 ppi display. Properly set-up subpixel AA coupled with the right fonts can work wonders.
 
I am all aboard any manufacturer providing above 4K screens that don’t cost a ton. While I don’t mind non-integer HiDPI scaling on MacOS at all, I just want to see higher res options. Especially 8K at 40-43" would be fantastic.
 