I still don't get the concept behind Retina. If it always has to be scaled down to lower resolutions, then how is it different from a computer that runs the same resolution (the one the Retina is scaled down to) natively?
I think there are two advantages to Retina displays:
1. Retina scaled down vs. non-Retina native: the Retina will likely still produce much higher quality, thanks to its tiny dot pitch, which hides the effects of scaling extremely well (it has to be seen, really). In other words, a scaled ~220 ppi image will usually still beat a native ~110 ppi image, because the scaling artifacts occur at the pixel level, far below what the eye can make out at a Retina dot pitch; you'd need a microscope to easily see the added aliasing/blur. (The first sketch below puts numbers on this.)
2. Retina scaled down vs. non-Retina scaled down: the Retina stays crisp while the non-Retina looks horribly blurry. That is, if you buy a 15" Retina MBP, you effectively get 1440x900, 1680x1050, and 1920x1200 all in one display. OK, so only one of those maps pixel-perfectly onto the 2880x1800 panel, but your eyes won't be able to tell the difference: it's native-quality rendering on an extremely crisp Retina dot pitch, three resolutions in one computer. The last time we had that convenience was with CRT displays, and those were worse in other ways. If you instead get a non-Retina 15" laptop, you get only the resolution it was built for; all others will look blurry and (in my opinion) unusable, unless you're mirroring onto a TV or (perhaps, with varying success) playing a game. It becomes blurry and nasty because the dot pitch is so large that it can't hide the scaling artifacts, which a Retina display can. (The second sketch below shows what each scaled mode actually does.)
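To put numbers on point 1, here's a quick sketch of the dot-pitch arithmetic. The 15.4" / 2880x1800 figures are the published specs of the 15" Retina MBP; the 1440x900 comparison is a typical non-Retina 15" panel of the era:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 15.4" panels: Retina (2880x1800) vs. a typical non-Retina (1440x900)
print(f"Retina:     {ppi(2880, 1800, 15.4):.0f} ppi")  # ~220 ppi
print(f"non-Retina: {ppi(1440,  900, 15.4):.0f} ppi")  # ~110 ppi
```

The Retina panel packs roughly twice the linear pixel density, so each scaling artifact lands on a dot a quarter the area of the non-Retina one.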
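And for point 2, a sketch of what macOS does in each scaled mode, as I understand it: the desktop is rendered offscreen at 2x the logical ("looks like") resolution, then downsampled onto the native 2880x1800 grid, so the downsampling error lands on ~220 ppi dots instead of ~110 ppi ones:

```python
# The three "looks like" modes of the 15" Retina MBP, each rendered
# at 2x and then downsampled to the native panel.
NATIVE_W, NATIVE_H = 2880, 1800

for w, h in [(1440, 900), (1680, 1050), (1920, 1200)]:
    backing_w, backing_h = 2 * w, 2 * h  # 2x offscreen render target
    factor = NATIVE_W / backing_w        # ratio of the final downsample
    print(f"looks-like {w}x{h}: rendered at {backing_w}x{backing_h}, "
          f"downsampled x{factor:.3f} to {NATIVE_W}x{NATIVE_H}")
```

Note that the 1440x900 mode comes out with a factor of exactly 1.0: it maps pixel-perfectly onto the native grid with no downsampling at all, which is why it's the default.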