
Blackvibes · macrumors member · Original poster · Oct 30, 2014
Hey guys,

I was wondering whether there have been any experiments on how the scaled/simulated resolution option on Retina MacBooks affects battery consumption.

On all Retina MacBooks you have the option to scale/simulate the resolution to 1024x640, 1280x800, 1440x900, or 1680x1050. On MacBook Pros from 2016 onward, a 1920x1200 option has also been added.

I was worried that a higher scaled resolution could consume more battery, even though the same number of physical pixels remains active.

Is there anyone that could give an answer that isn't just a guess? :)
 
I was wondering whether there have been any experiments on how the scaled/simulated resolution option on Retina MacBooks affects battery consumption.

I am very sceptical about the possibility of such a benchmark: there are too many factors to control. You need a room where you can control temperature and radio interference, you need to make sure that repeated runs use exactly the same workload, and you need to run the test dozens of times to ensure that your results are robust. What you do with the computer is also likely to have an effect. I am fairly sure that OS X uses efficient redrawing (only updating the part of the screen that actually changes), so for instance watching videos is more taxing than, say, working with an office document. All in all, if there is a real effect, I doubt that it's easy to detect.

You are welcome to try though, could be an interesting thing :)

On MacBook Pros from 2016 onward, a 1920x1200 option has also been added.

Just a quick note here: that option has always been present (since the initial 15" Retina MBP). After all, that was the justification for discontinuing the 17" model.
 
I was wondering whether there have been any experiments on how the scaled/simulated resolution option on Retina MacBooks affects battery consumption.
I don't think you'll see any savings. The GPU still needs to drive the same number of pixels, and the battery still needs to power the same number of pixels. Resolution is a logical construct that allows images to be viewed larger; it has no bearing on power.
 
I don't think you'll see any savings. The GPU still needs to drive the same number of pixels, and the battery still needs to power the same number of pixels. Resolution is a logical construct that allows images to be viewed larger; it has no bearing on power.

Still, the question is not without merit. Higher resolutions mean moving larger blocks of memory around and more raster operations. In fact, it is safe to assume that higher resolutions mean more work for the system. The question, though, is whether the difference is large enough to have any noteworthy effect on battery life. I do not believe it is.
 
Still, the question is not without merit. Higher resolutions = moving larger blocks of memory around and more raster operations.
You're still moving around those same blocks, because the computer still has to determine which pixels to turn on and off. A scaled image is still an image; it doesn't matter to the computer whether two pixels are logically joined, because it still has to figure out which pixels to change.
 
You're still moving around those same blocks, because the computer still has to determine which pixels to turn on and off. A scaled image is still an image; it doesn't matter to the computer whether two pixels are logically joined, because it still has to figure out which pixels to change.

No, that's not the whole picture. To produce the final output, OS X will first draw the image to the 2x backing buffer and then downsample the backing buffer (or the affected portion) to the framebuffer. The framebuffer is always 2880x1800 (let's assume the 15" model for the sake of simplicity). Say you want to draw an image that is 100x100 points in size. On the "best for Retina" setting, you just need to draw it to a 200x200 destination in the 2880x1800 backing buffer, which has the same resolution as the framebuffer, so you are done there. But on the "max resolution" (1920x1200) setting, for example, you need to draw it to a 200x200 destination in the 3840x2400 backing buffer and subsequently downsample it to a 150x150 destination in the framebuffer (that's one additional resampling operation on ~22k pixels). On the "1680x1050" setting, you are drawing to a 3360x2100 backing buffer and then resampling to a 171x171 destination in the framebuffer (~29k pixels to be resampled).

Which is funny, because it means that the operation is actually cheaper when drawing to a higher-resolution target :D Then again, it makes sense, as images appear "smaller" at higher scaled resolutions and thus cover less actual pixel area. Conversely, working at higher scaled resolutions means you probably redraw larger areas more frequently (e.g. when watching a video), which would cost more in return. E.g. drawing a full frame at the 1920x1200 setting means completely filling the 3840x2400 backing buffer and then converting to 2880x1800: that's 14.4 million ROPs. At the 1680x1050 setting, that's "just" 12.2 million ROPs.
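The buffer arithmetic above can be sketched in a few lines. This is speculation about macOS internals, not a real API; all names here are made up for illustration, assuming a 15" panel with a fixed 2880x1800 framebuffer and a 2x backing buffer per scaled setting.

```python
# Rough sketch of the backing-buffer arithmetic (speculative model,
# not Apple's actual implementation).

PANEL_W, PANEL_H = 2880, 1800  # physical framebuffer of the 15" model

def backing(scaled):
    """2x backing buffer for a scaled (logical) resolution, e.g.
    a "1920x1200" setting renders into a 3840x2400 buffer."""
    return (scaled[0] * 2, scaled[1] * 2)

def dest_px(points, scaled):
    """Physical pixels an element of `points` points ends up covering
    after the backing buffer is downsampled to the panel."""
    back_w = scaled[0] * 2               # 100 pt -> 200 px in the backing buffer
    return round(points * 2 * PANEL_W / back_w)

def full_frame_rops(scaled):
    """Crude cost of a full-screen redraw: fill every backing-buffer
    pixel, then resample every panel pixel."""
    bw, bh = backing(scaled)
    return bw * bh + PANEL_W * PANEL_H

print(dest_px(100, (1920, 1200)))     # 150  (the 150x150 case above)
print(dest_px(100, (1680, 1050)))     # 171  (the 171x171 case above)
print(full_frame_rops((1920, 1200)))  # 14400000, i.e. ~14.4M
print(full_frame_rops((1680, 1050)))  # 12240000, i.e. ~12.2M
```

The numbers match the post: the per-element resample is cheaper at higher settings, but a full-frame redraw is more expensive.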

Actually displaying the image on the screen is "free" with respect to these considerations, because it is always the same no matter which setting is used: the final image resolution is always 2880x1800.

P.S. Most of this is my speculation. I assume that OS X maintains a separate framebuffer and backing buffer, but it could also use a built-in hardware resolution converter and just use the backing buffer as the framebuffer. In that case the conversion would be much cheaper.
 
There have been benchmarks posted before, and you can try using your laptop at the native setting and then at the highest scaled resolution: the battery does last a little longer at the native setting. It depends on how graphics-intensive your use is. More pixels = more work = more power.
 
the computer still has to determine which pixels to turn on and off.
This part is completely correct - but we're talking about a "virtual screen", the frame buffer.
My 13" would natively have to draw 2560x1600 pixels.
I'm running it scaled at 2880x1800, so the GPU simply HAS TO decide on and draw 1,088,000 MORE pixels.
The final scaling from whatever-resolution-the-input-is to the physical resolution is done by HW and costs virtually nothing.
(edit: seems like the hardware solution didn't really cut it and Apple have actually rolled their own scaling routines, but again, that's still rather cheap)
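That 1,088,000 figure is just the difference in pixel counts between the backing buffer and the native panel; a minimal arithmetic check (function names are purely illustrative):

```python
# Pixel-count difference for a 13" Retina MBP (native panel 2560x1600)
# run at a scaled setting whose backing buffer is 2880x1800.

def pixels(res):
    """Total pixel count of a (width, height) resolution."""
    return res[0] * res[1]

def extra_pixels(native, backing):
    """How many more pixels the GPU renders into the backing buffer
    than the panel physically has."""
    return pixels(backing) - pixels(native)

print(extra_pixels((2560, 1600), (2880, 1800)))  # 1088000 extra pixels
```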

This effect is extremely visible in graphically demanding games that run better the lower the resolution is.

But since drawing simple 2D graphics (text, primitive shapes) is generally very cheap on modern hardware, the difference is almost negligible.
 
Last edited:
I've seen a bit of experimental evidence that it affects battery life a little, but nothing really solid. It probably hasn't been pursued much because, for most people, having the scaling they like matters more than small effects on battery life.
 