
andreigherghe

macrumors member
Original poster
May 23, 2011
A lot of people are saying the RMBP isn't running at native resolution (2880*1800).

That's NOT TRUE! The RMBP is ALWAYS running at the native 2880*1800! Proof? Take a screenshot. 2880*1800 is the size :)

What changes is the DPI/GUI scale, not the resolution. The pixel density is 4x the normal one (2x in each linear dimension). The hacks don't change the resolution, they change the DPI mode: HiDPI or normal DPI. That should've been solved by Apple with a HiDPI checkbox: check it and you get HiDPI, uncheck it and you get normal DPI.

Maybe this will clear up the confusion. Also, always use the Best for Retina option. The other ones scale the resolution, create artifacts, and increase the GPU load.
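If you want to check the scale factor yourself, here's a minimal sketch (assuming a Swift command-line tool linked against AppKit; NSScreen's frame and backingScaleFactor are standard AppKit properties) that prints each screen's point size and the pixel size behind it:

```swift
import AppKit

// Print the logical (point) size and backing scale factor of each screen.
// In "Best for Retina" mode, a Retina MacBook Pro reports 1440x900 points
// with a backing scale factor of 2.0, i.e. a 2880x1800-pixel framebuffer.
for screen in NSScreen.screens {
    let points = screen.frame.size
    let scale = screen.backingScaleFactor
    print("\(Int(points.width))x\(Int(points.height)) points at \(scale)x = "
        + "\(Int(points.width * scale))x\(Int(points.height * scale)) pixels")
}
```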
 

Stetrain

macrumors 68040
Feb 6, 2009
Yep. Apps and the OS only need to know how to draw at two DPI modes: 1x (normal) and 2x (HiDPI). Just like iPhone and iPad apps.

"Looks like 1440x900" aka "Best for Retina" is actually 2880x1800 with HiDPI mode enabled.

Apps that support it render at twice the DPI; apps that don't get upscaled so that everything is the same size as on a 1440x900 screen.

"Looks like 1680x1050" is actually 3360x2100 with HiDPI mode enabled. The final image is then scaled down to 2880x1800.

"Looks like 1920x1200" is actually 3840x2400 with HiDPI mode enabled. The final image is then scaled down to 2880x1800.

The "Run at native resolution!" trick still leaves you running at 2880x1800, same as "Best for Retina", but it disables HiDPI mode. The "Best for Retina" mode is already running at "native resolution", it just has the HiDPI mode enabled.

I think that the forthcoming Anandtech review is going to go pretty deeply into this topic and the specifics of how it works. Should be interesting and clear up a lot of questions.
 

parlour

macrumors member
Jun 21, 2007
andreigherghe said:

A lot of people are saying the RMBP isn't running at native resolution (2880*1800).

That's NOT TRUE! The RMBP is ALWAYS running at the native 2880*1800! Proof? Take a screenshot. 2880*1800 is the size :)

What changes is the DPI/GUI scale, not the resolution. The pixel density is 4x the normal one (2x in each linear dimension). The hacks don't change the resolution, they change the DPI mode: HiDPI or normal DPI. That should've been solved by Apple with a HiDPI checkbox: check it and you get HiDPI, uncheck it and you get normal DPI.

Maybe this will clear up the confusion. Also, always use the Best for Retina option. The other ones scale the resolution, create artifacts, and increase the GPU load.

You are confused, too. The scaled options do not scale the resolution. The screen always uses all 2880x1800 pixels. When you pick “Looks like 1680x1050”, what is displayed is rendered in HiDPI mode at 3360x2100 and then downscaled, not to 1680x1050, but to 2880x1800. All pixels get to work; nothing is wasted.
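As a quick arithmetic check of that (the helper function is made up purely for illustration):

```swift
// Hypothetical helper (name made up for illustration): for a chosen
// "looks like" size, compute the HiDPI backing size OS X renders at and
// the factor it is downscaled by to fit the 2880x1800 panel.
func scaledMode(looksLike w: Int, _ h: Int) -> (render: (Int, Int), downscale: Double) {
    let render = (w * 2, h * 2)              // HiDPI renders at 2x in each dimension
    let downscale = 2880.0 / Double(w * 2)   // < 1.0 means downscaling; every panel pixel is used
    return (render, downscale)
}

let m = scaledMode(looksLike: 1680, 1050)
print(m.render, m.downscale)   // (3360, 2100) 0.857...
```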

Your recommendation is consequently utter bull. Yes, there is a performance impact; yes, there are artifacts; but both are hardly perceptible, and for me not perceptible at all.

The “artifacts” would in any case amount to a slight blurriness that is hard to see when you have so many pixels to work with. I can't see the difference; scaled looks just as sharp as unscaled to me.

My recommendation is to use the setting that gives you UI elements in a size you like. Don't worry about the technical details; they are pretty much irrelevant.
 

Rizzm

macrumors 6502a
Feb 5, 2012
Stetrain said:

Yep. Apps and the OS only need to know how to draw at two DPI modes: 1x (normal) and 2x (HiDPI). Just like iPhone and iPad apps.

"Looks like 1440x900" aka "Best for Retina" is actually 2880x1800 with HiDPI mode enabled.

Apps that support it render at twice the DPI; apps that don't get upscaled so that everything is the same size as on a 1440x900 screen.

"Looks like 1680x1050" is actually 3360x2100 with HiDPI mode enabled. The final image is then scaled down to 2880x1800.

"Looks like 1920x1200" is actually 3840x2400 with HiDPI mode enabled. The final image is then scaled down to 2880x1800.

The "Run at native resolution!" trick still leaves you running at 2880x1800, same as "Best for Retina", but it disables HiDPI mode. The "Best for Retina" mode is already running at "native resolution", it just has the HiDPI mode enabled.

I think that the forthcoming Anandtech review is going to go pretty deeply into this topic and the specifics of how it works. Should be interesting and clear up a lot of questions.

How does a display actually show a resolution it's not capable of displaying? There are only so many pixels.
 

Stetrain

macrumors 68040
Feb 6, 2009
Rizzm said:

How does a display actually show a resolution it's not capable of displaying? There are only so many pixels.

Next sentence: "The final image is then scaled down to 2880x1800." ;)

What I was trying to say is that OS X renders the screen in software as if it were 3840x2400. If you take a screenshot in "Looks like 1920x1200" mode, the screenshot will actually be 3840x2400. Apps think that they're running at full 2x HiDPI mode on a 3840x2400 display.

That rendered 3840x2400 image is then downscaled to 2880x1800 to fit the actual display pixels.

As has been said above, these downscaled modes actually seem to work surprisingly well according to forum members and reviewers.
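If you want to verify the screenshot size programmatically, here's a small sketch using ImageIO (the property keys are real ImageIO constants; the file path is just an example, point it at your own screenshot):

```swift
import Foundation
import ImageIO

// After taking a screenshot (Cmd-Shift-3) in "Looks like 1920x1200" mode,
// the file's pixel size should come out as 3840x2400.
let url = URL(fileURLWithPath: "/Users/me/Desktop/screenshot.png") as CFURL
if let source = CGImageSourceCreateWithURL(url, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] {
    let w = props[kCGImagePropertyPixelWidth as String] ?? "?"
    let h = props[kCGImagePropertyPixelHeight as String] ?? "?"
    print("Screenshot pixel size: \(w) x \(h)")
}
```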
 

Rizzm

macrumors 6502a
Feb 5, 2012
Stetrain said:

Next sentence: "The final image is then scaled down to 2880x1800." ;)

What I was trying to say is that OS X renders the screen in software as if it were 3840x2400. If you take a screenshot in "Looks like 1920x1200" mode, the screenshot will actually be 3840x2400. Apps think that they're running at full 2x HiDPI mode on a 3840x2400 display.

That rendered 3840x2400 image is then downscaled to 2880x1800 to fit the actual display pixels.

As has been said above, these downscaled modes actually seem to work surprisingly well according to forum members and reviewers.

It just doesn't make sense to me. Why render at that resolution only to downscale it again? Isn't everything that isn't native (2880x1800) just downscaled? Where did this doubling idea come from?
 

Stetrain

macrumors 68040
Feb 6, 2009
Rizzm said:

It just doesn't make sense to me. Why render at that resolution only to downscale it again? Isn't everything that isn't native (2880x1800) just downscaled? Where did this doubling idea come from?

Because apps can render in only two modes, 1x and 2x.

They tried arbitrary UI scaling before (for example, setting the UI scale to 1.37x). It was supposed to come in 10.4, I think, and then in 10.5. It never really worked correctly, and it was hard to develop and test your apps to work at arbitrary scale factors.

The solution is just to have two modes to develop and adjust for, 1x and 2x. It worked on the iPhone and iPad; developers actually updated their apps.

So in order to get the effective screen real estate of 1920x1200 on a 2880x1800 display, there are two possibilities:

1) Render at 1920x1200 with everything in 1x mode. Upscale to fit screen.

2) Render at 3840x2400 with everything in 2x mode. Downscale to fit screen.

The second one produces much better results because the physical display has such a high pixel density, especially when using apps that support 2x mode.
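A rough way to see why the second option wins is to compare how many rendered samples each physical pixel gets (plain arithmetic, nothing platform-specific):

```swift
// Rendered samples available per physical panel pixel, for the two ways
// to get 1920x1200 worth of UI onto a 2880x1800 panel.
let panelPixels = 2880.0 * 1800.0

let upscaled   = (1920.0 * 1200.0) / panelPixels   // ~0.44: missing detail must be invented
let downscaled = (3840.0 * 2400.0) / panelPixels   // ~1.78: extra detail is averaged, like supersampling

print(upscaled, downscaled)
```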
 

leman

macrumors Core
Oct 14, 2008
Rizzm said:

It just doesn't make sense to me. Why render at that resolution only to downscale it again? Isn't everything that isn't native (2880x1800) just downscaled? Where did this doubling idea come from?

Because downscaling produces much better quality than upscaling. And the pixels of the display are already so small that downscaling artifacts are basically imperceptible.

----------

Someday...all GUI elements will be rendered as vectors...and all will be good.

This is already the case for almost everything except pictures. Frames, buttons, text, and so on are all vector-based.
 