I just took screenshots of an app designed for the 3GS (not optimized for the Retina screen) in three different ways. You can draw your own conclusions (I have a meeting in 10 minutes and must go now).
http://i49.tinypic.com/10cuio4.jpg
From left to right:
1. This is what the app looks like at 100% size on an iPhone 4. You can see it got scaled up, so the graphics are not smooth.
The text looks much better because the iPhone 4 automatically re-renders text at the native resolution. Please just analyze the images for this exercise.
2. This is what the app looks like at 200% size on an iPad 2 running in NON-pixel-doubled mode. This is equivalent to taking a screenshot of the app running on a 3GS and then scaling it up to 200% myself on a computer.
As you can see, the images in (1) and (2) look IDENTICAL, meaning the scale-up the iPhone 4 performed did not introduce any artifacts or new colors.
3. This is what the app looks like running on an iPad 2 in pixel-doubled mode. As you can see, the image was anti-aliased, so Apple's pixel doubling actually smoothed out the image and, by definition, INTRODUCED artifacts (information and pixels not in the original image).
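The difference between (2) and (3) can be sketched in a few lines of code. This is just an illustration of the two scaling strategies in general, not Apple's actual implementation: pixel doubling (nearest-neighbor) only repeats existing pixel values, while anti-aliased scaling (here, simple linear interpolation) invents intermediate values that were never in the source.

```python
# Sketch: 2x upscaling a 1-D row of grayscale pixel values, two ways.
# Not Apple's code - just the two general techniques being compared.

def pixel_double(row):
    """Nearest-neighbor 2x: repeat each pixel. No new values are created."""
    return [v for v in row for _ in range(2)]

def linear_2x(row):
    """Anti-aliased 2x: insert the average of each pair of neighbors.
    The averages are values that may not exist in the source."""
    out = []
    for i, v in enumerate(row):
        out.append(v)
        nxt = row[i + 1] if i + 1 < len(row) else v
        out.append((v + nxt) // 2)
    return out

row = [0, 255, 0]            # a hard black/white/black edge
print(pixel_double(row))     # [0, 0, 255, 255, 0, 0] - only original values
print(linear_2x(row))        # [0, 127, 255, 127, 0, 0] - 127 is a "new color"
```

The nearest-neighbor result is blocky but honest (it matches observation 2); the interpolated result looks smoother but contains the gray value 127, which the source image never had (it matches observation 3).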
So my thought right now is: the iPhone 4 and the iPad 2's pixel doubling scale the same small image DIFFERENTLY.
Both look worse, for various reasons perhaps already theorized by some of the posters in this thread.
The big question is, how will the rMBP scale an image? Will it use the iPhone 4's method (direct, honest scaling) or the iPad's method (anti-aliased scaling)?