The Idiot's Guide to UI Elements on a Retina Screen:
Do you call it 'Idiot's Guide' because it is full of misconceptions, or is there another reason for the name?
Let's take a hypothetical UI element which is 10 pixels tall and 10 pixels wide. In "Best for Retina" mode this element would be stretched to 20 by 20. Each individual pixel in the original element will be interpolated into four pixels on the stretched version. Interpolation introduces "blurriness", but because of the super-high resolution of the Retina screen, this is offset to some degree.
Wrong (or mostly wrong). Usually, UI elements such as buttons or text are vector-rendered. Vector rendering is always crisp: there is no interpolation and no blurriness. Only bitmap images, OpenGL views and custom-drawn views are interpolated (pixel-doubled), and even then only for non-retina-aware applications. As I mentioned before, retina-aware applications can choose finer levels of control.
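To make the pixel-doubling concrete, here is a minimal sketch of nearest-neighbour 2x scaling on a tiny bitmap. The list-of-lists "image" and the function name are purely illustrative, not any real API; a real compositor may interpolate instead of duplicating, which is where the blur comes from.

```python
# Sketch: pixel-doubling a tiny "UI element" represented as a 2D list.
# Nearest-neighbour doubling maps each source pixel to a 2x2 block of
# identical values; interpolating scalers blend neighbours instead.

def pixel_double(img):
    """Return the image scaled 2x in each dimension (nearest-neighbour)."""
    out = []
    for row in img:
        doubled_row = [p for p in row for _ in (0, 1)]  # duplicate columns
        out.append(doubled_row)
        out.append(list(doubled_row))                   # duplicate the row
    return out

element = [[1, 0],
           [0, 1]]
print(pixel_double(element))
# Each original pixel now covers a 2x2 block of identical values.
```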
In Scaled mode the situation is somewhat worse. The non-Retina UI element is first doubled (to 20 by 20), and then undergoes a non-integer scaling to convert the virtual 1680-wide screen to the real 2880-wide screen. Our doubled 20 by 20 element becomes a rather unwieldy 17.143 by 17.143, and inherits another level of blurriness. 1 pixel of the original UI element (or of any non-Retina image, one from the web for instance) is now mapped to 1.7143 pixels of the screen, through two blur-inducing resampling stages.
Again, this is offset somewhat by the Retina effect of the screen (where individual pixels matter less than they did previously). Some people will notice this added blurriness; some will not.
Again, this is not entirely correct. In the 1680x1050 HiDPI mode, the OS renders everything into a big offscreen 3360x2100 buffer, so your 20x20 UI element will first be drawn into this buffer. The buffer is then linearly interpolated (downscaled) to the native 2880x1800 screen. Since downscaling preserves more detail than upscaling, the quality stays very good, although, as you correctly point out, some blurriness may be perceived as a result of the interpolation.
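The figures quoted earlier fall straight out of this two-stage pipeline; a few lines of plain arithmetic (no real API involved) reproduce them:

```python
# The scaled 1680x1050 mode, step by step.
logical = 10                    # hypothetical UI element, in logical pixels
buffer_px = logical * 2         # rendered 2x into the 3360x2100 buffer -> 20

downscale = 2880 / 3360         # buffer -> panel, = 6/7, approx. 0.857
on_screen = buffer_px * downscale
print(round(on_screen, 3))      # 17.143 physical pixels

# Net factor from one logical pixel to physical pixels:
print(round(2 * downscale, 4))  # 1.7143, i.e. the same as 2880 / 1680
```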
The thing that worries me is that the 2880 Retina screen is not a standard size, so Apple cannot take advantage of the specialist hardware built into the rMBP's GPUs. All of this resampling is done in software. At the moment this software resampling is - supposedly - done on the CPU, but Mountain Lion will move this load onto the GPUs.
No idea what this is supposed to mean. There have been no "standard sizes" in GPUs since, I don't know, the year 2000 or so. The GPU will happily interpolate non-power-of-two textures for you, and this is what OS X uses it for. This interpolation is also very fast: even the Intel IGP can do it at very fast rates (several hundred times per second). If the interpolation were done on the CPU, you'd see much higher CPU usage values in Activity Monitor.
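For reference, the linear interpolation being discussed is conceptually simple. Here is a minimal 1-D CPU sketch of it (illustrative only; this is not how the GPU or OS X actually implements the resampling):

```python
# A minimal 1-D linear-interpolation resample, sketching how a
# 3360-pixel-wide buffer row could be downscaled to 2880 pixels.
def resample(src, new_len):
    """Linearly interpolate src (a list of numbers) to new_len samples."""
    if new_len == 1:
        return [float(src[0])]
    out = []
    step = (len(src) - 1) / (new_len - 1)
    for i in range(new_len):
        pos = i * step
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        frac = pos - lo
        # Blend the two neighbouring source pixels -> this blend is the blur.
        out.append(src[lo] * (1 - frac) + src[hi] * frac)
    return out

row = [0, 0, 255, 255, 0, 0, 255]   # a 7-pixel slice of the buffer
print(resample(row, 6))              # 6 physical pixels: edges get averaged
```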
Retina-aware applications are displayed in a 1-to-1 fashion on the 2880 screen, but they render at twice the pixel dimensions of non-Retina applications, so they appear at a usable size on the screen (just much clearer). However, under the scaling modes even Retina-aware applications go through the same (second) non-integer scaling as non-Retina applications to convert from the virtual resolution to the real 2880 screen.
Here I again have no idea what you are talking about. If you are talking about the 'native' 2880x1800 mode (which you can't select via the standard preference pane), retina-aware and non-retina-aware apps behave exactly the same way under it: logical pixels equal real pixels, and everything is unusably small.
If you are talking about 'best for Retina' mode, this is HiDPI 1440x900 (where each logical pixel is represented by a 2x2 block of real pixels). Again, retina-aware and non-retina-aware apps are both 2x2 scaled, so they appear the same size as they would on a 1440x900 monitor. The only thing that retina-aware applications (we really should start calling them HiDPI-aware, btw.) can do is recognize when they are running on a HiDPI display and adjust accordingly (e.g. render custom UI at a higher internal resolution). If the application does not adjust explicitly, the OS will pixel-double the content that needs adjusting (again: bitmap images, OpenGL views and custom-drawn views).
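The adjust-or-get-pixel-doubled logic can be sketched like this. All names here are hypothetical and for illustration only; a real app would query the display's backing scale factor through AppKit rather than pass it in by hand:

```python
# Hypothetical sketch of the decision a HiDPI-aware app (or the OS on its
# behalf) makes for bitmap content. Not a real API.
def pick_asset(assets, backing_scale):
    """Return (asset, pixel_doubled) for the given backing scale factor.

    assets: dict mapping scale (1 or 2) -> image file name.
    """
    if backing_scale in assets:
        return assets[backing_scale], False   # exact match: drawn crisply
    # No matching asset: the OS pixel-doubles the 1x bitmap, which blurs it.
    return assets[1], True

icons = {1: "toolbar.png", 2: "toolbar@2x.png"}
print(pick_asset(icons, 2))            # ('toolbar@2x.png', False)
print(pick_asset({1: "old.png"}, 2))   # ('old.png', True)
```

The second call shows the fallback path: an app that ships only 1x bitmaps gets them pixel-doubled by the OS, which is exactly the blurriness discussed above.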