I don't understand. What makes 218+ the preferred pixel density? I have been using a 32" 1440P display connected to my M1 Max MacBook Pro and it works fine.
If it works for you, don't sweat it. Not too many years ago, 1440p was bleeding edge - the legendary Apple 30" Cinema Display was "only" 2560x1600 (~100ppi) and there are probably still a few people out there rocking those, plus the later 1440p LED Cinema Displays... As per previous posts, it also depends on how far you prefer to sit from your screen.
One change Apple has made in recent years is disabling a feature called "sub-pixel anti-aliasing"[1], which made font rendering on non-retina displays a bit worse - so your display has actually been unusable since macOS Mojave, and any practical use you think you've had from it since is a figment of your imagination. I thought you should know that.
Realistically, though, the point is that people have been spoiled by "retina" displays, including the 5k iMac which (for some reason) Apple was able to sell at a too-good-to-be-true price until recently. Use a retina display for a while and your current 1440p screen will suddenly look like it's been constructed in Minecraft.
The root of this is a fundamental difference between Mac and PC/Windows/Linux when it comes to displays:
Displays sold primarily for PCs are based on a range of standard "pixels wide x pixels high" resolutions, e.g. 640x480 ("VGA") or 3840x2160 ("UHD"), and if you buy a bigger display you get bigger pixels, a lower PPI, and by default everything just appears bigger and blockier. Since at least Windows 3 you've been able to adjust what Windows "thinks" the PPI of the display is, and the Windows UI and well-behaved applications will re-scale their UI accordingly. The downsides are the "well behaved" requirement (app developers have to follow the rules or the results get garbled or unusably small) and that any bitmaps/other graphics assets have to be re-scaled from whatever resolution the developer provided to the user-selected PPI. Of course, CRT screens didn't really have a well-defined fixed PPI in the way that LCD screens do - just an upper limit on definition, with no guarantee that the computer's pixels would line up with the phosphor dots/stripes on the screen.
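To put rough numbers on that DPI-scaling idea, here's an illustrative Python sketch - the names are made up for the example, not the actual Win32 API:

    # Rough sketch of Windows-style DPI scaling arithmetic (illustrative only).

    BASELINE_DPI = 96  # Windows' traditional "100%" setting

    def scale_factor(user_dpi: int) -> float:
        """Factor a well-behaved app applies to its layout."""
        return user_dpi / BASELINE_DPI

    def layout_px(logical_px: int, user_dpi: int) -> int:
        """UI sizes declared in 'logical' pixels get multiplied up."""
        return round(logical_px * scale_factor(user_dpi))

    def bitmap_target_px(w: int, h: int, user_dpi: int) -> tuple[int, int]:
        """Bitmaps authored for 96 dpi have to be resampled - the source of blur/garbling."""
        f = scale_factor(user_dpi)
        return round(w * f), round(h * f)

    print(layout_px(100, 144))            # 150 - a 100px button at 150% scaling
    print(bitmap_target_px(50, 50, 120))  # (62, 62) - 62.5px rounded, so interpolation is needed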
Apple displays (built into Macs or stand-alone) have tended instead to go for a fixed PPI, and if you buy a larger screen you get more pixels and more real estate. From the original Macs through to the early-90s models, Apple's screens were in the 70-80ppi ball-park[2], so one "point" = 1/72" = close to 1 pixel, which made a lot of sense back when the killer app for the Mac was desktop publishing. Mac OS and application UIs were designed assuming ~72ppi. Although it is only a rule of thumb - there are some Mac models that don't fit - this "fixed resolution" idea has persisted, with the "standard" resolution advancing to ~110ppi by around 2010. Then "retina" displays saw a doubling of linear resolution, and macOS started offering two "standard" PPI scales: standard-def 110ppi and "HiDPI/Retina" 220ppi - and since this was an exact doubling, bitmap assets could easily be converted between the two with a minimum of artefacts.
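Roughly, in Python (illustrative numbers only - nothing here comes from a real Apple API):

    # The "1 point ~ 1 pixel at ~72ppi" idea, and why the exact 2x retina jump was painless.

    POINTS_PER_INCH = 72  # a typographic point is 1/72 of an inch

    def points_to_pixels(points: float, ppi: float) -> float:
        return points * ppi / POINTS_PER_INCH

    print(points_to_pixels(12, 72))   # 12.0  - classic Mac: a 12pt size maps straight to ~12px
    print(points_to_pixels(12, 110))  # ~18.3 - the ~2010-era "standard" density
    print(points_to_pixels(12, 220))  # ~36.7 - retina: exactly double the 110ppi value

    # The clean factor of two also means 1x bitmap assets upscale without remainders:
    icon_1x = (32, 32)
    icon_2x = tuple(d * 2 for d in icon_1x)  # (64, 64) - each source pixel maps to a 2x2 block
    print(icon_2x)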
Apart from the standard/retina factor-of-two choice (and even for that you have to jump through hoops to select a standard-def mode on a high-def display or vice versa), you can't change the PPI scale value used by the OS - instead you have "scaled" modes that effectively render at 110 or 220ppi (which shouldn't break applications) to an internal buffer and then re-sample the result to match the physical resolution of your screen. This is the "fractional scaling" bogeyman that is either near-invisible or makes your eyes bleed. YMMV.
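The back-of-the-envelope version of what a "scaled" mode does with the pixels, as a sketch of the idea rather than the actual compositor - e.g. picking "looks like 2560x1440" on a 4K (3840x2160) panel:

    def scaled_mode(looks_like: tuple[int, int], native: tuple[int, int]):
        backing = (looks_like[0] * 2, looks_like[1] * 2)  # UI rendered at 2x into an internal buffer
        ratio = native[0] / backing[0]                    # then resampled to the panel's real pixels
        return backing, ratio

    backing, ratio = scaled_mode((2560, 1440), (3840, 2160))
    print(backing)  # (5120, 2880) - the internal render target
    print(ratio)    # 0.75 - every 4 rendered pixels get squeezed into 3 physical ones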
Today, the 24" iMac, the 27" Studio Display and the Pro Display XDR are all 218ppi, the 13" MBA is 224ppi and the 14" and 16" MBPs are both 254ppi - so it's only "ball park", but still near-constant compared to the variation between the smallest (13") and largest (32") screens! Also, some MacBooks default to a fractional-scaling mode.
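You can sanity-check those figures yourself from the published panel resolutions and (marketing) diagonal sizes, e.g.:

    from math import hypot

    def ppi(w: int, h: int, diagonal_inches: float) -> float:
        return hypot(w, h) / diagonal_inches

    print(ppi(5120, 2880, 27.0))   # ~217.6 - Studio Display (spec'd as 218)
    print(ppi(2560, 1664, 13.6))   # ~224.5 - 13" MacBook Air (spec'd as 224)
    print(ppi(3456, 2234, 16.2))   # ~254.0 - 16" MacBook Pro
    print(ppi(2560, 1440, 32.0))   # ~91.8  - the 32" 1440p display from the original question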
[1] NB: There are actually good reasons for obsoleting sub-pixel anti-aliasing other than spite: it works by tweaking the brightness of the individual red, green and blue sub-pixels - of which there are 3 per pixel - around the edges of characters to smooth them out. That doesn't work well with the translucent UI elements in current OSs, and it relies on the OS knowing how the R/G/B sub-pixels are arranged on your screen - and alternative layouts like "PenTile" are increasingly common, esp. with OLED displays. If it goes wrong you get horrible rainbow fringes around the text - the same thing that used to happen if your display somehow ended up in YPbPr mode rather than RGB.
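For the curious, the core trick looks roughly like this toy sketch - heavily simplified, and real implementations (e.g. ClearType) also filter the result to keep the colour fringing down:

    def pack_row(coverage_3x, panel_order="RGB"):
        """Glyph coverage sampled at 3x horizontal resolution, packed so each
        consecutive triple drives one pixel's three sub-pixels."""
        pixels = []
        for i in range(0, len(coverage_3x), 3):
            left, mid, right = coverage_3x[i:i + 3]
            # The renderer assumes the stripes run R, G, B left-to-right...
            r, g, b = left, mid, right
            if panel_order == "BGR":  # ...but the panel might actually be the other way round
                r, b = b, r
            pixels.append((r, g, b))
        return pixels

    edge = [0.0, 0.5, 1.0, 1.0, 1.0, 1.0]  # a glyph edge crossing into a fully-covered pixel
    print(pack_row(edge, "RGB"))  # smooth luminance ramp when the assumed layout matches the panel
    print(pack_row(edge, "BGR"))  # same data on a mismatched layout -> the rainbow-fringe effect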
[2] I seem to recall the early displays being factory set to exactly 72ppi - which is easily do-able by tweaking a pot on a mono CRT - but can't find a source for that.