I just figured it out. If you use a scaled resolution, for some reason the computer actually thinks your hardware is different.
 
Because that's the logical resolution. You are running the 1680x1050 HiDPI mode, which is implemented as a virtual 3360x2100 screen; the OS then downscales the image to the panel's native 2880x1800. This is basically a form of supersampling, or 'real' subpixel rendering.
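To put that in concrete terms, here's a quick sketch of the arithmetic (my own illustration, not any actual macOS API; the numbers are the ones from this thread):

```swift
import Foundation

// Sketch of the HiDPI arithmetic described above (illustration only,
// not an actual macOS API). A "looks like" 1680x1050 mode is rendered
// into a 2x backing buffer, then downscaled to the native panel.
let looksLike = (w: 1680, h: 1050)                      // logical resolution (points)
let backing = (w: looksLike.w * 2, h: looksLike.h * 2)  // 3360x2100 virtual screen
let native = (w: 2880, h: 1800)                         // 15" Retina panel

print("Rendered at \(backing.w)x\(backing.h), downscaled to \(native.w)x\(native.h)")
let scale = Double(native.w) / Double(backing.w)
print(String(format: "Downscale factor: %.3f", scale))  // ~0.857, i.e. supersampled
```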

Does it slow down the system?
 
Of course it does. The machine needs to work with four times as many pixels, plus some overhead. It doesn't make your computer feel noticeably slower, though, if that's what you are asking. Only a few badly coded applications will show the performance hit (Apple's own App Store is a prime example).
 
Sorry, I meant: when you make it display "like" 1440x900 and then switch to make it display like 1680x1050, does it work harder?
 
It's more pixels to process, so yes, it works harder. Again, the important question is: why do you ask?
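For what it's worth, here's the rough difference in numbers (my own back-of-the-envelope math, using the same 2x rule as above):

```swift
import Foundation

// Back-of-the-envelope comparison of the two scaled modes discussed
// above. Each HiDPI mode renders into a buffer twice its logical size
// in each dimension, so pixel counts grow quickly.
let modes = [(name: "1440x900", w: 1440, h: 900),
             (name: "1680x1050", w: 1680, h: 1050)]
let counts = modes.map { ($0.name, $0.w * 2 * $0.h * 2) }
for (name, count) in counts {
    print("\(name) HiDPI -> \(count) rendered pixels")  // 5,184,000 vs 7,056,000
}
let ratio = Double(counts[1].1) / Double(counts[0].1)
print(String(format: "~%.0f%% more pixels to render and downscale", (ratio - 1) * 100))  // ~36%
```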
 