KnightWRX - a question to help me get my head round all this resolution stuff.
If I drop the resolution of the high-res 15" MacBook Pro (call it A) from 1680x1050 to 1440x900 and compare it to the standard-res 15" MacBook Pro at 1440x900 (call it B), the clarity on 'A' won't be as sharp as on 'B' even though the screens are the same size. I know the LCD in 'A' isn't at its native resolution, but given that the elements are the same size and resolution as on 'B', why doesn't 'A' look as sharp? Or do images appear just as sharp, and it's the font scaling that suffers for whatever reason?
That's where my confusion comes in: LCDs only look good at their native resolution, and dropping below that hurts sharpness/clarity. That's what I based my previous argument on, because I assumed elements of the same resolution would not look as sharp when upscaled to a 2880x1800 display of the same size as they would on a native 1440x900 display.
It looks blurry because the pixels don't line up. 1680 : 1440 = 7 : 6, so when you switch your display to 1440x900 it has to squeeze every six image pixels into seven physical pixels on the screen, which can't be done cleanly. With double the resolution, each image pixel maps to an exact 2x2 block of physical pixels, so the panel just repeats that pixel.
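To make the ratios concrete, here's a quick Python sketch of my own (just the horizontal direction; the vertical works the same way) showing where the image-pixel edges land on the physical panel in each case:

```python
from fractions import Fraction

def pixel_coverage(image_width, panel_width, n=6):
    # Each image pixel spans panel_width/image_width physical pixels. If that
    # ratio is not a whole number, image-pixel edges land inside physical
    # pixels and the panel has to blend neighbouring image pixels (blur).
    scale = Fraction(panel_width, image_width)
    print(f"{image_width} -> {panel_width}: each image pixel covers {scale} physical pixels")
    for i in range(n):
        start, end = i * scale, (i + 1) * scale
        exact = start.denominator == 1 and end.denominator == 1
        note = "exact" if exact else "straddles a physical pixel -> blended"
        print(f"  image pixel {i}: physical {float(start):.2f}..{float(end):.2f} ({note})")

# 1440 image pixels on the 1680-wide panel: 7:6, the edges almost never line up
pixel_coverage(1440, 1680)

# 1440 image pixels on a 2880-wide panel: exactly 2 physical pixels each
pixel_coverage(1440, 2880)
```

In the 7:6 case nearly every image-pixel edge falls inside a physical pixel, so the panel has to interpolate; in the 2:1 case every edge lines up, so nothing gets blended.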
On the other hand, say you have a hi-res (2880x1800) MBP and you want to display 1680x1050. In that case the hardware has to map every 7 image pixels onto 12 physical pixels (2880 : 1680 = 12 : 7), which still can't be done perfectly, but with _less_ blur than on your current hardware, because the physical pixels are only about half the size, so the smearing is much less visible.
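A rough back-of-the-envelope comparison of how big that smearing actually is, assuming a 15.4" 16:10 panel (roughly 13.1" wide; the exact size only scales both numbers equally, what matters is the ratio between them):

```python
import math

# Assumed panel geometry: 15.4" diagonal, 16:10 aspect ratio.
DIAGONAL_IN = 15.4
width_in = DIAGONAL_IN * 16 / math.hypot(16, 10)   # ~13.1 inches wide

for panel_px, label in [(1680, "1680-wide panel scaling 1440 up (6 -> 7)"),
                        (2880, "2880-wide panel scaling 1680 up (7 -> 12)")]:
    pitch_mm = width_in * 25.4 / panel_px          # width of one physical pixel
    print(f"{label}: physical pixel ~{pitch_mm:.3f} mm wide; "
          f"the interpolation smear is on the order of that distance")
```

The fractional scaling blurs an edge over roughly one physical pixel either way, but on the 2880-wide panel that pixel is about half as wide, so the blur is about half the physical size.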