My understanding is as follows. 5K is exactly 2x 1440p in each dimension, so the GPU doesn't need to interpolate anything: it can simply use four 5K pixels (2x2) to represent one 1440p pixel. So for anything "pixel" based, a 5K panel should be the better choice for displaying 1440p content. Now assume you have a 1440p BMP picture and open it full screen on a 4K screen: since 4K is only 1.5x 1440p, the GPU has to work out how to use nine 4K pixels (3x3) to display every four 1440p pixels (2x2). In other words, errors may be introduced during the "zoom" process. Then:
If all four pixels in a 2x2 1440p block are the same colour, nothing is lost, because the GPU can simply fill the 3x3 block of 4K pixels with that single colour.
But if the four 1440p pixels all have different colours, only the four corners of the 3x3 4K block can be displayed exactly as in the original picture; the other five pixels have to be interpolated by the GPU.
The following picture shows one way the GPU may interpolate the signal. The leftmost image is the original 1440p data. The 5K panel (middle) can reproduce the 1440p data perfectly. The rightmost one is 1440p simulated on 4K: 5 of the 9 pixels are "wrong", so more than 55% of the displayed pixels may be "wrong".
View attachment 742382
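To make that concrete, here is a rough sketch in Python of the two cases, assuming a plain bilinear scaler on the 4K side (real GPU/monitor scalers use more sophisticated filters, so the exact blends will differ, but the principle is the same):

```python
# Rough sketch (not any real scaler): upscale a 2x2 block of pixel values
# to 4x4 (the 5K case, exact 2x) and to 3x3 (the 4K case, 1.5x).

def upscale_2x(block):
    """Integer 2x scaling: each source pixel becomes a 2x2 square,
    so every output value is an exact copy of a source value."""
    out = []
    for row in block:
        doubled = [p for p in row for _ in range(2)]
        out.append(doubled)
        out.append(list(doubled))
    return out

def upscale_bilinear_2x2(block, w, h):
    """Naive bilinear resampling of a 2x2 block to w x h. Output pixels
    that don't line up with a source pixel get a blend of neighbours."""
    (a, b), (c, d) = block
    out = []
    for y in range(h):
        fy = y / (h - 1)                  # 0.0 .. 1.0 down the block
        row = []
        for x in range(w):
            fx = x / (w - 1)              # 0.0 .. 1.0 across the block
            top = a * (1 - fx) + b * fx
            bottom = c * (1 - fx) + d * fx
            row.append(round(top * (1 - fy) + bottom * fy, 1))
        out.append(row)
    return out

src = [[10, 20],   # four different "colours" (grey levels for simplicity)
       [30, 40]]

print(upscale_2x(src))                    # only the original four values appear
print(upscale_bilinear_2x2(src, 3, 3))    # corners exact, the other 5 are blends
```

With four different input values, the 3x3 result keeps only the corners exact and synthesises the other 5 of 9 pixels (about 56%), which is where the "more than 55%" figure comes from; feed it four identical values and every blend equals that value, i.e. the lossless case above.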
In some cases, the software may be designed to pick certain pixels and "stretch" (duplicate) them, rather than interpolate the colour of the "missing" pixels, i.e. nearest-neighbour style scaling. In that case the colours may look better and the image may look sharper, but again, it is no longer the original image.
View attachment 742384
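For comparison, here is a minimal sketch of that "stretch" behaviour, assuming plain nearest-neighbour scaling (a real scaler may choose which pixels to duplicate differently):

```python
# Rough sketch of nearest-neighbour ("stretch") scaling of a 2x2 block
# to 3x3: each output pixel copies whichever source pixel is nearest,
# so no new colours are invented, but some pixels get duplicated.

def upscale_nearest(block, w, h):
    sh, sw = len(block), len(block[0])
    out = []
    for y in range(h):
        sy = min(int((y + 0.5) * sh / h), sh - 1)      # nearest source row
        row = []
        for x in range(w):
            sx = min(int((x + 0.5) * sw / w), sw - 1)  # nearest source column
            row.append(block[sy][sx])
        out.append(row)
    return out

src = [[10, 20],
       [30, 40]]
print(upscale_nearest(src, 3, 3))
# [[10, 20, 20], [30, 40, 40], [30, 40, 40]]
```

No new colours are invented, which is why the result can look sharper, but the geometry is distorted: here the bottom-right source pixel ends up covering a 2x2 area while the others keep 1x1.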
On the other hand, for "vector" style data (e.g. fonts, lines, etc.), there shouldn't be any big difference between 4K and 5K. The GPU has to render it in real time at the panel's native resolution anyway, rather than pre-defining pixels and then "zooming" in. So at a proper viewing distance, once the pixel pitch goes beyond the human eye's angular resolution limit, they should look virtually identical.
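To put rough numbers on the viewing-distance point: a common rule of thumb is that 20/20 vision resolves roughly 60 pixels per degree. The little calculation below assumes a hypothetical 27-inch 16:9 panel viewed from about 24 inches; the exact figures depend on panel size, distance and the viewer, so treat it as an illustration only.

```python
# Rough sketch: pixels per degree of visual angle for a 27" 16:9 panel
# at a given viewing distance, compared with the ~60 ppd rule of thumb
# for 20/20 acuity. Illustrative numbers only.
import math

def pixels_per_degree(h_resolution, diagonal_inch, distance_inch, aspect=16 / 9):
    width_inch = diagonal_inch * aspect / math.sqrt(aspect ** 2 + 1)
    ppi = h_resolution / width_inch
    # degrees of visual angle covered by one inch at this distance
    deg_per_inch = 2 * math.degrees(math.atan(0.5 / distance_inch))
    return ppi / deg_per_inch

for name, h_res in (("4K", 3840), ("5K", 5120)):
    ppd = pixels_per_degree(h_res, 27, 24)
    print(f"{name}: ~{ppd:.0f} pixels per degree at 24 inches")
# Both come out above ~60 ppd, i.e. individual pixels are no longer resolvable.
```

At a typical desk distance both panels sit above that ~60 ppd threshold, so for natively rendered vector content the two should indeed be hard to tell apart.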
So, is 5K better than 4K? For displaying 1440p "pixel style" data, yes. However, in the real world we usually see 1080p sources rather than 1440p sources, and 4K is exactly 2x 1080p while 5K is a non-integer 8/3x. Therefore, IMO, 4K may be better than 5K most of the time, because when we display 1080p data full screen, it is now the 5K monitor that has to do all the interpolation (of course, this only matters if you insist on displaying the source full screen).
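A quick sanity check of which source/panel combinations scale by an integer factor (my own numbers, assuming the whole frame is scaled to fill the screen):

```python
# Scale factors from common source widths to common panel widths.
# An integer factor means plain pixel duplication (lossless);
# anything else forces the scaler to interpolate or stretch.
from fractions import Fraction

panels = {"4K (3840)": 3840, "5K (5120)": 5120}
sources = {"1080p (1920)": 1920, "1440p (2560)": 2560}

for pname, pw in panels.items():
    for sname, sw in sources.items():
        factor = Fraction(pw, sw)
        kind = "integer, lossless" if factor.denominator == 1 else "non-integer, interpolated"
        print(f"{sname} -> {pname}: x{factor} ({kind})")
# 1080p -> 4K: x2, 1440p -> 4K: x3/2, 1080p -> 5K: x8/3, 1440p -> 5K: x2
```

Only the integer cases (1080p on 4K, 1440p on 5K) can be handled by plain pixel duplication; the other two always involve some form of interpolation or stretching.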
IMO, there is no definitive "better" monitor between 4K and 5K; it really depends on the user's usage. However, considering how much less trouble it is to drive 4K nowadays, I personally prefer to go for 4K SST rather than 5K MST.