This discussion has been enlightening (special tip of the cap to theluggage). It has made me consider going back to PCs, because it appears there is more flexibility with monitors and resolutions on Windows than macOS.
Well, it's swings and roundabouts & please don't quote me as saying that Windows does 4k "better"!
On Windows it kinda depends on software being well-behaved: working in resolution-independent units, calling the correct OS routines to scale everything, having high-resolution bitmap assets that look good at high DPI, etc.
The Mac approach is more robust with software that doesn't understand the high-DPI modes, and copes with windows that straddle different-resolution screens etc.
The non-integer scaled modes on Mac really are very high quality (and there's a lot of misunderstanding here): running at "looks like 1440p" is much better than an actual 1440p display, and nothing like the mess you get when running a standard-definition display at a non-native resolution.
What you get on the Mac in "scaled" mode is similar to oversampling or anti-aliasing - which isn't necessarily a bad thing.
Can someone explain why macOS has this integer-scaling straitjacket but Windows does not? I don't get it.
Windows has a user-adjustable setting for the screen PPI, and well-written software uses OS calls to scale its fonts, icons, vector graphics etc. to match - that has been true since at least Windows 3.1. You've also always been able to run the display at different "resolutions" (which made sense with CRTs, but anything other than the native physical resolution on an LCD looks awful).
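To make the Windows side concrete, here's a rough sketch of the arithmetic a well-behaved app does: lay everything out in device-independent units (Windows' baseline is 96 per inch) and multiply by the user's DPI setting. The function name is made up for illustration - it's not a real Windows API call.

```python
# Hypothetical sketch of Windows-style DPI scaling: the app lays out its
# UI in logical units and multiplies by the user's DPI setting.
# BASELINE_DPI = 96 is Windows' reference ("100%" scaling).

BASELINE_DPI = 96

def to_physical_pixels(logical_px: int, user_dpi: int) -> int:
    """Scale a logical length to device pixels at the current DPI."""
    return round(logical_px * user_dpi / BASELINE_DPI)

# A 16px icon at common Windows scale settings:
for dpi in (96, 120, 144, 192):          # 100%, 125%, 150%, 200%
    print(dpi, "->", to_physical_pixels(16, dpi))
```

This is why Windows needs each app to cooperate: anything that draws in raw pixels, or only ships a 16px bitmap, looks wrong or blurry at 150% and 200%.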
MacOS just has two standard DPIs (standard def and HiDPI) which it treats as universal constants, and offers alternative "scales" by (effectively) having the GPU create a large virtual screen that is then downsampled to the actual native resolution of the display.
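The macOS mechanism described above can be sketched in a few lines - as I understand it, the window server renders a 2x virtual framebuffer at the chosen "looks like" size and the GPU downsamples that to the panel's native resolution (the function here is illustrative, not an actual macOS API):

```python
# Sketch of macOS "scaled" HiDPI mode: render a 2x virtual screen at the
# "looks like" size, then downsample it to the panel's native pixels.

def scaled_mode(looks_like, native):
    lw, lh = looks_like
    nw, nh = native
    backing = (lw * 2, lh * 2)        # 2x virtual framebuffer
    downscale = backing[0] / nw       # resample factor down to the panel
    return backing, downscale

# "Looks like 2560x1440" on a 4K UHD (3840x2160) panel:
backing, factor = scaled_mode((2560, 1440), (3840, 2160))
print(backing, factor)
```

So "looks like 1440p" on a UHD panel is really a 5120x2880 image resampled down by 4:3 - which is why it behaves like oversampling rather than like driving an LCD at a non-native resolution.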
Back in the good old 68k days,
all Mac screens were 72 PPI (so 1 point = 1 pixel*) and you couldn't really change the resolution without a hack. The practical upshot was that (say) 12 point text on the screen was actually the same physical size as 12 point text on paper. They've diverged from that since - but they still tend to keep the same
ballpark PPI while changing screen size and pixel count, as you'll see with the 21.5" iMac vs. the 24" iMac, vs. the 5K iMac, vs. the Pro Display XDR. (Note the 21.5" iMac was "true" 4K - 4096 x 2304 - not "UHD '4k' 3840-is-nearly-4k-isn't-it".)
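You can check the "same ballpark PPI" claim yourself with the diagonal-pixels-over-diagonal-inches formula. The sizes and resolutions below are from memory, so treat the exact figures as approximate:

```python
# Rough PPI check for the displays mentioned above:
# PPI = sqrt(w^2 + h^2) / diagonal_inches
import math

def ppi(w_px, h_px, diagonal_in):
    return math.hypot(w_px, h_px) / diagonal_in

displays = {
    '21.5" 4K iMac':       (4096, 2304, 21.5),
    '27" 5K iMac':         (5120, 2880, 27.0),
    '32" Pro Display XDR': (6016, 3384, 32.0),
}
for name, (w, h, d) in displays.items():
    print(f"{name}: ~{ppi(w, h, d):.0f} PPI")
```

All of them land within a few PPI of each other (around 215-220), which is what lets Apple treat HiDPI as a fixed 2x of the old standard rather than an arbitrary user setting.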
(*Well, only after Apple and Adobe
re-defined the printer's point as exactly 1/72" rather than half of thrice the thickness of Gutenberg's underarm hair, or whatever - but, other than that, close enough.)