
lazer155

macrumors regular
Original poster
Jul 5, 2010
I've read through a lot of the other threads on scaling and the best-for-retina settings. I still don't understand exactly how it works, so I apologize if this is a dumb question about an overly discussed topic. Basically, I was wondering why performance would be impacted (as the warning under the display resolution settings says). I would think that by running at a higher resolution, the graphics card would not have to work as hard as it does at the "best for retina" resolution, because it doesn't have to work to make things as crisp. Why does Apple warn about performance being negatively affected?

I thought that in best for retina mode, it was as if 4 pixels were being used as the equivalent of 1 pixel on a non-retina display. I don't understand how best for retina winds up being less stressful on the GPU than the lower or higher resolution choices. Could someone explain?
 
It isn't "making things crisp"; think of it purely in terms of the resolution it is rendering. In the "best for Retina" mode, it is doubling the standard 1440x900 resolution by two in each dimension, so that's 2880x1800, which happens to be the maximum resolution of the display. However, let's say you set it to the "most desktop space" option, which is effectively 1920x1200. Instead of rendering 2880x1800, the max resolution the display supports, it will still double 1920x1200 in both dimensions, which is 3840x2400. That's HUGE. The graphics card is rendering over four million more pixels than the display can support, and on top of that, it is doing the calculations to downscale the giant image to 2880x1800 with every frame it renders, which happens 60(?) times per second. This last step doesn't even happen in the best for Retina mode.
 

It isn't "making things crisp"; think of it purely in terms of the resolution it is rendering. In the "best for Retina" mode, it is doubling the standard 1440x900 resolution by two in each dimension, so that's 2880x1800, which happens to be the maximum resolution of the display. However, let's say you set it to the "most desktop space" option, which is effectively 1920x1200. Instead of rendering 2880x1800, the max resolution the display supports, it will still double 1920x1200 in both dimensions, which is 3840x2400. That's HUGE. The graphics card is rendering over four million more pixels than the display can support, and on top of that, it is doing the calculations to downscale the giant image to 2880x1800 with every frame it renders, which happens 60(?) times per second. This last step doesn't even happen in the best for Retina mode.

I see, it makes much more sense now as to why performance would be impacted. Is the scaling up to 3840x2400 and back down to 1920x1200 done in order to retain the quality of all the elements on the screen so that they do not appear pixelated?
 

At no point is the desktop scaled down to 1920x1200. It is rendered at 3840x2400, then scaled down to the native 2880x1800 panel resolution for display. That way, you get the HiDPI retina-style crispness of text, icons, etc., but with more effective pixels on screen. All modes other than "best for retina display" include this extra downscaling pass, and that's where the performance hit comes from.
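As a rough back-of-the-envelope illustration (assumed numbers, not measurements; the ~60 Hz refresh rate is an assumption), here is what that extra downscale pass adds up to per second in the "looks like 1920x1200" mode:

```swift
// Rough estimate of the per-second work added by the scaled mode.
let renderedPixels = 3840 * 2400      // 9,216,000 px drawn per frame
let panelPixels    = 2880 * 1800      // 5,184,000 px the panel can actually show
let refreshRate    = 60               // assuming ~60 frames per second

// Pixels rendered each second that the panel can never display directly:
let extraPerSecond = (renderedPixels - panelPixels) * refreshRate
print("Extra rendered pixels per second: \(extraPerSecond)")        // ~242 million

// On top of that, every output pixel must be produced by filtering the
// oversized framebuffer down to 2880x1800, once per frame:
let resampledPerSecond = panelPixels * refreshRate
print("Downscaled output pixels per second: \(resampledPerSecond)") // ~311 million
```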
 

Thank you! I finally understand it now. :D The performance impact doesn't seem to be too much, at least for me, but it is there sometimes. It's really not all that noticeable, but I was wondering what was causing it. I mainly use best for retina since it looks the best, but when I need to get a lot of work done, I find the 1920x1200 scaled mode more useful.
 