Not my experience - and I've been buying Dell 4K monitors literally by the pallet-load.
For Windows, most people set scaling to 150% and seldom see issues. Some want the extra screen real estate and use 125%. Only a few old apps using legacy APIs show any cruftiness.
Our Apple users either "pixel double" and get a 1080p-equivalent workspace, or put up with tiny fonts and UI elements.
The issue you're talking about has more to do with the PPI Apple chose for their own displays. They intend users to stay on 2x "pixel double" scaling, since the native pitch of their panels is twice the UI norm.
And I'm not sure what you're referring to. OS X has given me no issues when scaling slightly below 200%. For instance, running 1920x1200 on a 15" MBP (native 2880x1800), or 3200x1800 on my iMac 5K: both display the OS interface and text without problems. The only apps that struggle are pre-retina ones, and even those are usable, just not as crisp.
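For anyone wondering why slightly-below-200% still looks good: macOS renders the whole UI at exactly 2x the "looks like" resolution and then downsamples that buffer to the panel's native pixels, so nothing is ever drawn at a fractional scale. A rough sketch of the arithmetic, using my numbers above (the helper name is just for illustration):

def backing_store(looks_like, native):
    # macOS draws everything at 2x the "looks like" resolution...
    render = (looks_like[0] * 2, looks_like[1] * 2)
    # ...then the GPU downsamples that buffer to the panel's native pixels
    scale = render[0] / native[0]
    return render, scale

# 15" MBP: "looks like" 1920x1200 on a native 2880x1800 panel
print(backing_store((1920, 1200), (2880, 1800)))   # ((3840, 2400), ~1.33)

# iMac 5K: "looks like" 3200x1800 on a native 5120x2880 panel
print(backing_store((3200, 1800), (5120, 2880)))   # ((6400, 3600), 1.25)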
The problem with most HiDPI panels out there is that they aren't yet at retina pitch, so they demand an odd scaling factor between 120-180% for almost any normal usage. An 8K display would need to be ~40" to land at retina pitch, and 40" is far too large for normal desktop use, at least as the in-your-face primary monitor. If the Apple display ends up at 32" or smaller, then instead of wasting bandwidth on pixel pitch nobody needs, why not spend it on color bit depth or refresh rate? (I remember Schiller remarking something similar.)
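Quick sanity check on those numbers, taking ~220 PPI (what Apple ships on the iMac 5K) as desktop retina pitch; this is back-of-envelope, nothing official:

import math

def ppi(h_px, v_px, diagonal_in):
    # pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(h_px, v_px) / diagonal_in

print(round(ppi(7680, 4320, 40)))   # 8K at 40" -> ~220 PPI, right at retina pitch
print(round(ppi(7680, 4320, 32)))   # 8K at 32" -> ~275 PPI, pitch you can't see
print(round(ppi(3840, 2160, 27)))   # 4K at 27" -> ~163 PPI, the awkward 120-180% zone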
The input interface is also a concern, as mentioned above. Currently, with TB3, the embedded DP version is only good enough for 5K 10-bit 60Hz, and even that already requires two multiplexed DP streams (I think). If the Apple display demands anything more exotic, it won't be compatible with current Macs, at least not at its native/maximum settings.
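Back-of-envelope on the dual-stream bit: TB3 tunnels DP 1.2 (HBR2), which works out to roughly 17.28 Gbit/s of usable payload per stream after 8b/10b encoding. Blanking overhead varies with the timing standard, so the ~5% below is just an assumption:

def dp_payload_gbps(w, h, hz, bpc, blanking=1.05):
    # raw pixel data rate: width x height x refresh x bits per pixel (RGB)
    return w * h * hz * (bpc * 3) * blanking / 1e9

need = dp_payload_gbps(5120, 2880, 60, 10)
print(round(need, 1))   # ~27.9 Gbit/s for 5K 10-bit 60Hz
print(need > 17.28)     # True: one HBR2 stream can't carry it,
                        # hence the two DP streams inside the TB3 link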