The interpolated modes on the rMBP look
much better than that because the image is actually scaled
down instead of up. The rMBP is rendering the desktop at 3840x2400, then scaling that image down to 2880x1800.
At first glance, it's hard to tell that it's running at a non-native resolution (if you don't know what to look for).
For all the details:
http://www.anandtech.com/show/6023/the-nextgen-macbook-pro-with-retina-display-review/6
I just read the article and it makes it pretty clear what's going on. However, just as I suspected, it's NOT all clear (literally). They are essentially rendering the desktop internally at double the selected resolution in each dimension (four times the pixels) and then downscaling that image to fit the screen. This works great for 1440x900 since it's an exact 1:2 multiple of the 2880x1800 native resolution. But if you run 1920x1200, the article says the UI elements (vector and font based) look sharp, but any images will have SOME blurriness to them because 1920x1200 is NOT a 1:2 scale of 2880x1800. Rendering 1920x1200 at double size (3840x2400) and then downscaling to 2880x1800 still leaves some blur, since that's a non-integer 1.33x scaling ratio. Furthermore, the article makes it very clear that older apps like Microsoft Office render blurry text on this display.
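To put actual numbers on those ratios, here's a tiny sketch of that pipeline as I understand it from the article (my own illustration in Python with made-up names, not anything Apple ships): render each "looks like" mode at 2x per axis, then see how far the result has to be squeezed to fit the 2880x1800 panel.

```python
# Sketch of the HiDPI scaling described above (illustrative only).
PANEL = (2880, 1800)

def scaling_for(looks_like):
    """Return the 2x backbuffer size and the downscale factor to the panel."""
    backbuffer = (looks_like[0] * 2, looks_like[1] * 2)
    fx = backbuffer[0] / PANEL[0]
    fy = backbuffer[1] / PANEL[1]
    return backbuffer, (fx, fy)

for mode in [(1440, 900), (1680, 1050), (1920, 1200)]:
    bb, (fx, fy) = scaling_for(mode)
    exact = fx.is_integer() and fy.is_integer()
    print(f"{mode[0]}x{mode[1]} -> renders {bb[0]}x{bb[1]}, "
          f"downscale {fx:.2f}x ({'exact, no resampling' if exact else 'non-integer, some blur'})")
```

1440x900 comes out as an exact 1.00x (no resampling at all), while 1920x1200 works out to a 1.33x downscale of a 3840x2400 image, which is where the bitmap softness comes from.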
There's one other huge problem: the GPU has to move all these pixels. A 3D game could render internally at a lower resolution to speed things up, but the GPU still has to push all those quadrupled pixels out to the screen, and from what I've read, it's barely able to do this for ordinary OS apps. Diablo 3 may play OK, but it's not exactly Crysis. What happens when you run Windows on this machine? It will just be treated like a 2880x1800 display.
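Just to size the problem (my back-of-the-envelope arithmetic, not figures from the review), here's roughly how many pixels per frame are involved in each case:

```python
# Rough per-frame pixel counts for the displays/buffers discussed above.
def megapixels(w, h):
    return w * h / 1e6

modes = [
    ("1440x900 (old 15-inch MBP panel)", 1440, 900),
    ("1920x1200 (old 17-inch MBP panel)", 1920, 1200),
    ("2880x1800 (rMBP panel itself)", 2880, 1800),
    ("3840x2400 (rMBP backbuffer in 1920x1200 mode)", 3840, 2400),
]
for name, w, h in modes:
    print(f"{name}: {megapixels(w, h):.1f} MP per frame")
```

That 3840x2400 backbuffer is roughly four times the pixels of the old 17" MacBook Pro's screen, every frame, before a game has drawn anything.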
These aren't new problems. Any time you buy a huge monitor and expect to game on it, you either need an uber-expensive gaming rig with things like SLI to push that many pixels at native resolution for perfect sharpness, or you put up with some scaling distortion in order to get playable frame rates. I expect that eventually the GPUs will catch up to this display, but all reviews indicate it's pushing the limit on the first-generation product just for basic apps. This is probably fine for an iPad, but I'm not so sure it's great for a regular computer. Fortunately, the 1440x900 mode scales perfectly and I would use it for any gaming. 1920x1200 wouldn't scale as well for bitmaps.
And this idea that a 15" notebook can easily replace a 17" (as some on here keep saying) is flawed as well. Yes, you can fit as much on the screen with a higher resolution mode, but you'd also have to stick your nose closer to the screen since it won't appear as large in an absolute sense and monitor size is chosen as much for the actual size as it is the resolutions it offers, much more so for regular HDTVs (e.g. I have a 93" screen to watch HDTV on it; I'm not pushing some mega-resolution on it, just regular HDTV and even NTSC modes).
A 15" is hard enough to look at with 1440x900. 1920x1200 would result in very tiny UI and text elements. Even Apple knows 2880x1800 is unusable as a straight up resolution (they don't offer it). People have enabled it with hacks and in Windows, but it's SO TINY on a 15" screen. In the end, Apple is taxing the heck out of a GPU just to get 1440x900 to look "sharper". True 1080p HD material on it will still have to to be scaled to fit. It may look good and it may even look great, but it won't be "native". You will only get 1:1 "perfect" images with multiples (i.e. 720x450, 1440x900 and the full 2880x1800, which is not available as a true mode in OSX).