I've read through a lot of the other threads on scaling and the "Best for Retina" setting, but I still don't understand exactly how it works, so I apologize if this is a dumb question about an overly discussed topic.

Basically, I'm wondering why performance would be impacted when choosing a scaled resolution (as the warning under the display resolution settings says). I would have thought that running at a higher scaled resolution would make the graphics card work *less* hard than "Best for Retina," since it doesn't have to render everything as crisply. So why does Apple warn that scaled resolutions can negatively affect performance?

My understanding was that in "Best for Retina" mode, 4 physical pixels are used for the equivalent of 1 pixel on a non-Retina display. I don't see how that ends up being less stressful on the GPU than the lower or higher scaled resolution choices. Could someone explain?
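For concreteness, here's the back-of-envelope arithmetic I'm picturing for the "4 pixels per 1" part (assuming, hypothetically, a 15" panel with a native resolution of 2880×1800 whose "Best for Retina" mode looks like 1440×900 — the exact numbers will differ by model):

```python
# Hypothetical Retina panel: native (physical) pixel grid
native_w, native_h = 2880, 1800

# "Best for Retina": UI is laid out as if the screen were 1440x900,
# with each logical point backed by a 2x2 block of physical pixels
logical_w, logical_h = native_w // 2, native_h // 2

physical_pixels = native_w * native_h        # pixels the GPU ultimately drives
logical_points  = logical_w * logical_h      # points the UI is laid out in

print(f"physical pixels: {physical_pixels}")             # 5,184,000
print(f"logical points:  {logical_points}")              # 1,296,000
print(f"pixels per point: {physical_pixels // logical_points}")  # 4
```

So by my math the GPU is already pushing the full native pixel count in "Best for Retina" mode, which is part of why I don't understand where the extra cost of the scaled modes comes from.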