5K is a complete waste of money if you're not a designer or videographer. You can't use them at their full resolution anyway (everything on the screen is way too small).
Not true - MacOS "Hi DPI" screen modes don't work that way. Unless you specifically enable "low resolution" or are running ancient software that doesn't support HiDPI modes, the screen is always
running at full 5k resolution. The "looks like 2560x1440" description only refers to the physical size of system text and icons which are the same as they were on the old 27" 1440p display - the actual level of detail you see on the screen is far higher. The alternative, "scaled" modes are a bit more complicated (and put extra load on the GPU and VRAM) but - at worst - consist of 2x the "looks like" resolution downsampled to 5k, which is still more detail than you'd see on a sub-5k screen.
At the very least, it sounds like I should pump the brakes here and see what is released this year. Thanks, all.
I'd suggest that you also
decide first whether you're going for Mac or PC (which isn't a debate to start in this thread) - the way MacOS and Windows handle higher-resolution screens is different, and while 5k > 4k on either system, there are reasons why 5k is particularly suitable for Mac - whereas I personally wouldn't bother with 5k on Windows.
Simplified somewhat - Mac applications are designed to work at one of two pixel sizes - standard (ballpark 100 pixels per inch) and "HiDPI" (with exactly twice as many pixels per inch). So, what works best when moving from a standard-def display to a high-def one is exactly
double the number of pixels - which is what you get going from an old 27" iMac (2560x1440) to a 5k iMac (5120x2880), so everything comes out the same physical size as on an old iMac, but with twice as many (linear) pixels worth of detail. On a 4k display at 27" the system text, menus, icons and window furniture all come out a bit large (although the level of detail is still 4k) so 4k on a Mac is more suited to the 21" models.
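The pixel-doubling arithmetic above is easy to check. A quick sketch (display sizes taken from the post; PPI is just diagonal pixel count over diagonal inches):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

old_imac = ppi(2560, 1440, 27)  # old 27" iMac - standard density
new_imac = ppi(5120, 2880, 27)  # 5k iMac - exactly 2x the linear pixels
uhd_27   = ppi(3840, 2160, 27)  # 4k at 27" - falls between the two

print(f'27" 1440p: {old_imac:.0f} ppi')  # ~109 ppi - the 'ballpark 100'
print(f'27" 5k:    {new_imac:.0f} ppi')  # ~218 ppi - exactly double
print(f'27" 4k:    {uhd_27:.0f} ppi')    # ~163 ppi - 2x UI elements come out large
```

The ~163 ppi figure is why 4k at 27" leaves HiDPI text and icons looking a bit big, and why the same panel resolution suits the smaller 21" models better.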
The alternative is to use the scaled modes that render to 2x the 'looks like' resolution then downsample to 4k/5k - which produces excellent-looking results at the expense of more load on the GPU and VRAM.
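To put rough numbers on that GPU/VRAM cost - a back-of-envelope sketch, assuming the usual render-at-2x-then-downsample behaviour and ~4 bytes per pixel for the backing store (the exact buffering macOS does internally will differ):

```python
def scaled_mode_cost(looks_like_w, looks_like_h, bytes_per_px=4):
    """Backing store for a scaled mode: rendered at 2x the 'looks like'
    size, then downsampled to the panel's native pixels."""
    render_w, render_h = looks_like_w * 2, looks_like_h * 2
    megabytes = render_w * render_h * bytes_per_px / 1024**2
    return (render_w, render_h), megabytes

# Default mode on a 5k panel: render size equals the panel, no downsample.
print(scaled_mode_cost(2560, 1440))  # ((5120, 2880), ~56 MB)

# A "looks like 3200x1800" scaled mode: renders at 6400x3600, then
# downsamples to 5120x2880 - a bigger buffer and more GPU work per frame.
print(scaled_mode_cost(3200, 1800))  # ((6400, 3600), ~88 MB)
```

The per-buffer sizes look modest, but the larger render target also means every frame is drawn and then filtered at that size, which is where the extra GPU load comes from.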
Windows applications are supposed
to be resolution independent and use OS calls to translate between inches/cm/points and pixels, and the system-wide pixels-per-inch setting can be freely changed. (In theory this is superior to MacOS; in practice, not so much, as it relies more on applications obeying the rules and gets really
confused if you have multiple displays with different scales). But if you go Windows, that's what you get (fifty billion flies can't be wrong), and it means there's no intrinsic advantage to 5k over 4k beyond the raw increase in resolution (and I can vouch that 50-year-old imperfect eyeballs won't notice that).
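For the curious, the points-to-pixels translation Windows apps are supposed to do works like this in principle - a sketch only, using the standard conventions (a PostScript point is 1/72 inch, and Windows expresses scaling against a 96-DPI baseline); the function names here are illustrative, not the actual Win32 API:

```python
def points_to_pixels(points, dpi):
    """A point is 1/72 inch, so pixels = points * dpi / 72."""
    return round(points * dpi / 72)

def scale_factor(dpi):
    """Windows expresses UI scaling relative to a 96-DPI baseline."""
    return dpi / 96

# A 12pt font at common display scales:
for dpi in (96, 144, 192):  # 100%, 150%, 200% scaling
    print(dpi, scale_factor(dpi), points_to_pixels(12, dpi))
# 96 -> 16 px, 144 -> 24 px, 192 -> 32 px
```

The multi-monitor confusion mentioned above is exactly this calculation going wrong: an app laid out against one monitor's DPI gets dragged to a monitor with a different DPI and has to redo (or fails to redo) the conversion.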
I've got a 28" 4k screen next to a 2017 5k iMac - it's a cheap & cheerful Dell S2817Q that doesn't compete with the iMac when it comes to colour reproduction (plus it has horrible controls and stand), but it's pin sharp and you have to look very closely to see a difference in detail. Of course
the 5k is better than 4k, but it's not night-and-day. Also, for general use, the iMac doesn't seem to break a sweat driving the Dell in a "looks like 2560x1440" scaled mode, so its UI roughly matches the iMac's.
(I got the Dell before my iMac when I was experimenting with a Hackintosh and didn't want to sink a lot of cash into a display that might not work. Frankly, it did the job for 'general' use for several months, although it wouldn't cut it for photography or semi-serious graphics).
The other thing to consider, if you get a MacBook, is the charging and docking facilities of the Apple/LG 5k display.
If you get a Mac Mini I'd worry about the ability of the GPU to really
drive a 4k or 5k display, including scaled modes if you wanted some flexibility about font/icon sizes. From other posts here, it sounds like you'd at least need to get 16GB RAM (the Mini uses main RAM as VRAM).
* Once upon a time, in the Classic Mac/Mac II era, all
Apple displays were physically 72 pixels-per-inch so 1 pixel = 1 PostScript Point.