Wait 'til a microLED/true HDR version comes out in 6-12 months, then you'll break the other leg

If you can justify the cost, there's no problem (or why not get the Pro XDR while you're at it) - but I'd reiterate that you can get 2-3 half-decent 4k screens for the price, and enjoy extreme "real estate".
I'd tend to agree with that myself - but it is very subjective and depends on your eyesight, your viewing habits, what software you use and how you use it...
OTOH, if my eyeballs were a decade or two younger, I'd be able to work in "looks like 2160p" when the job demanded - it's small and fiddly, but perfectly clear.
Even with "real estate" it depends on software: at "looks like 1920x1080" if you run in full screen, hide the dock and hide the menu bar the MacOS UI mostly gets out of the way, it comes down to how chunky the application's own UI is. Some apps offer their own choice of UI size, use nice compact palettes etc. and/or let you move the palettes to the second screen which you can now afford.
The 2560*1440 "look" kinda got set as the "gold standard" for Mac with the old, beloved by many, 30" Cinema Display (actually 2560x1600) in 2004 followed by the 27" iMac and 27" 2560x1440 Cinema Display in 2009/10 (which actually made the default UI slightly smaller) - but in my experience it
is quite a small UI compared to other systems.
Another side to this problem is not 4k per se, but the annoying abundance of 16:9 screens - the perfect aspect ratio for watching full-screen TV shows, pessimal for anything else. That's why I've gone for a pair of 3:2 Mateviews - the same horizontal size and resolution as a 27" 4k, but with a couple of inches of extra real-estate height tacked on the bottom.
Well, you do have the choice of 1x or 2x without changing the actual resolution. But, yes, being able to set it to 1.5x, 1.3x, 1.7x etc. as you can in Windows would be nice.
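To put rough numbers on that (just an illustrative sketch, nothing official): the effective scale factor is simply the panel's native width divided by the "looks like" width, which is why 1x and 2x are the clean choices on a 4k panel and everything in between is fractional:

    // Illustrative arithmetic only - the function and names are made up for this post.
    func effectiveScale(panelWidth: Double, looksLikeWidth: Double) -> Double {
        panelWidth / looksLikeWidth
    }

    // A 3840-pixel-wide 4k panel:
    print(effectiveScale(panelWidth: 3840, looksLikeWidth: 1920)) // 2.0 - the clean "2x" case
    print(effectiveScale(panelWidth: 3840, looksLikeWidth: 2560)) // 1.5 - the fractional setting Windows lets you pick directly
    print(effectiveScale(panelWidth: 3840, looksLikeWidth: 3840)) // 1.0 - native, small and fiddly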
Problem is, it's a bit simplistic to say that this is "just scaling the UI" - what it is doing (in Windows) is changing the PPI scale factor used to translate the internal coordinates of the OS and applications into "device coordinates", i.e. actual screen pixels (or something completely different if you're rendering to a printer).
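A rough sketch of that translation (the type and function names here are invented, not any real OS API): the OS and apps lay everything out in logical units, and the scale factor only gets applied when those are turned into actual device pixels:

    struct LogicalPoint { var x: Double; var y: Double }   // what the app works in
    struct DevicePixel  { var x: Int;    var y: Int }       // what actually gets lit up

    // Convert an app-supplied logical coordinate into a physical screen pixel.
    // scale is 2.0 for "Retina", 1.5 for a Windows-style 150% setting, etc.
    func toDevice(_ p: LogicalPoint, scale: Double) -> DevicePixel {
        DevicePixel(x: Int((p.x * scale).rounded()),
                    y: Int((p.y * scale).rounded()))
    }

    print(toDevice(LogicalPoint(x: 100, y: 100), scale: 1.5))  // DevicePixel(x: 150, y: 150)
    print(toDevice(LogicalPoint(x: 100, y: 100), scale: 2.0))  // DevicePixel(x: 200, y: 200)

The app asks for the same 100x100 "point" button either way; it's everything downstream that has to respect the factor, which is where the trouble starts.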
First problem is that it could break a lot of existing software, because it would also affect the way dialogues are designed, how pixels are plotted on the screen etc., and any situation where OS UI elements are combined with application-generated graphics. Windows has had the "variable PPI" feature since at least Windows 3, and changing it has always sort of worked, but sometimes ended up with jumbled or even unusable dialogues. The last time I used Windows seriously (not counting VMs) it worked pretty well at 150%, but there was the occasional glitch that rendered an app near-unusable.
Second problem is that it doesn't fix everything - things like (most) fonts and vector graphics can be scaled smoothly, but with things like bitmap "assets", if the application doesn't ship a version of the bitmap that matches your preferred PPI then, one way or another, it's going to get "resampled". When applications render their own graphics it's completely up to the application writer to decide what the maximum precision is: you can interrogate the OS and match the screen resolution, or you can work internally at a "good enough" resolution and then downsample to match the screen.
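As a rough illustration of the bitmap half of that (the asset structure here is invented for the example, not how Windows or macOS actually store artwork): if the app only ships 1x and 2x art, a fractional display has to pick the nearest one and resample it:

    struct BitmapAsset { let scale: Double; let widthPx: Int; let heightPx: Int }

    // Pick the shipped bitmap whose native scale is closest to the display's
    // scale factor; anything other than an exact match will get resampled.
    func bestAsset(for displayScale: Double, from assets: [BitmapAsset]) -> (asset: BitmapAsset, needsResample: Bool)? {
        guard let best = assets.min(by: { abs($0.scale - displayScale) < abs($1.scale - displayScale) }) else {
            return nil
        }
        return (asset: best, needsResample: best.scale != displayScale)
    }

    let shipped = [BitmapAsset(scale: 1.0, widthPx: 32, heightPx: 32),
                   BitmapAsset(scale: 2.0, widthPx: 64, heightPx: 64)]

    if let choice = bestAsset(for: 1.25, from: shipped) {
        // 1.0 true - the 1x art is nearest, but still has to be stretched to 40x40 device pixels.
        print(choice.asset.scale, choice.needsResample)
    }

Either way it comes out slightly soft - which is exactly the "one way or another it gets resampled" problem above.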
Only having two PPI settings to support (and with the OS enforcing a 2x scale on any obsolete apps) does keep things simple and less error-prone (and I've always been impressed with the way my iMac could cope with dragging apps between a 5k screen, a 4k screen and an old 1920x1280 display and still give reasonable results).
In the case of Blender - doesn't it have its own fully scalable UI? It's years since I dabbled with it but ISTR it did all of its own UI rendering quite independently of the OS (the "like nothing else on earth" UI was part of its ...charm).