
13" 2020 MacBook Pro—what is your default "display scaling" resolution?

AbominableFish

macrumors newbie
Original poster
Jul 4, 2020
13
13
> Why would Netflix think it is a 2880x1800 display? It's a 2560x1600 one, so why the jump?
>
> Also, why put in a 2560x1600 resolution display if macOS is limiting me to 1680x1050?
>
> The more I read, the less I understand. In Windows it's very simple: you have a native 1080p display, you choose a 1080p resolution for it, and everything is 1080p.
>
> In my MacBook/macOS, my screen is natively 2560x1600, the interface is at 1680x1050 at most, and Netflix is displaying at 1920x1080 even though it thinks the screen is 2880x1800. How does that make sense? I must be very dumb, because I find it very confusing.
To answer your question, in case the article didn't: maybe a better way to put it is this. You have a screen that is physically 2560x1600. However, if you set 2880x1800 as your "scaling resolution" in System Preferences, you are essentially telling the OS to pretend the screen resolution is 2880x1800. You will still see it downscaled to 2560x1600, because your panel doesn't have any more physical pixels, but the operating system acts as though you have a higher-resolution display, so Netflix also thinks you're on a higher-resolution display, just like YouTube and any other program.
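If you're curious, you can see both numbers from a couple of lines of Swift (a minimal sketch, assuming a Retina MacBook with a scaled mode selected; save it as, say, scaling.swift and run it with the swift command):

```swift
import AppKit

// Print the "virtual" resolution macOS reports to apps (in points) and the
// backing-pixel resolution it actually renders before downscaling to the panel.
if let screen = NSScreen.main {
    let points = screen.frame.size                               // e.g. 1440 x 900 ("looks like" size)
    let pixels = screen.convertRectToBacking(screen.frame).size  // e.g. 2880 x 1800 (points x scale)
    print("Reported to apps: \(Int(points.width)) x \(Int(points.height)) points")
    print("Rendered backing: \(Int(pixels.width)) x \(Int(pixels.height)) pixels")
    print("Backing scale:    \(screen.backingScaleFactor)x")
}
```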

Windows actually has this as well. I forget exactly where it is, but you can choose scaling that is different from the normal display scaling (it shows up as a slider from 50-400% or something like that).

You said Netflix is displaying at 1080p. It probably "knows" that your resolution is set at 2880x1800, but it just displays 1080p to save bandwidth.

If you want more explanation about how it works on macOS feel free to ask and I'll do my best to explain what I know.
 

ProteinePlus

macrumors member
Nov 16, 2018
46
4
Thanks AbominableFish for your kind reply.

What I find confusing in macOS is that scaling is apparently managed as a "resolution option". That is, you set a "virtual" 1680x1050 or 1440x900 resolution as a way to scale your UI, even though behind the scenes your real resolution is always 2560x1600.

In Windows it doesn't work like that; you set a resolution that can be any supported combination up to the max physical resolution of your screen, and you set up UI scaling with a different option. For example, you have a 1920x1080 (1080p) resolution and the UI at 125% or 150%. Any program querying the operating system sees that the computer has a 1080p resolution (whether you are at 75%, 125%, or 150% UI scaling).

Now, my remaining doubts:

- If macOS uses these "virtual" resolutions as a means for UI scaling, why would it answer back to any querying program that the resolution is the virtual one (1440x900) instead of the real one (2560x1600)? For example, I have plenty of programs that display graphics at 1440x900 when they should be doing it at 2560x1600 and would be a lot crisper.

- Where does the 2880x1800 resolution you mention come from? A multiple of what? Why would Netflix detect that? (In my MacBook it displays at 1080p, but now I doubt that's true, since the default resolution is 1440x900.)

Thanks!
 

pshufd

macrumors 68030
Oct 24, 2013
2,792
8,512
New Hampshire
> Where does the 2880x1800 resolution you mention come from? A multiple of what?

2,880 x 1,800 is for the 2015 (and later) MacBook Pro 15 inch models. 2,560x1,600 is for the MacBook Pro 13 inch models.

I think that it's more reasonable to report the virtual resolution as applications have no access to higher resolutions.
 

AbominableFish

macrumors newbie
Original poster
Jul 4, 2020
13
13
No problem, I'm glad it helped.

As for the scaling on macOS vs Windows, I personally prefer the way it is done on macOS. The application has no information about the scaling, which is good, because then it doesn't need to decide how to display at 1x resolution, or at 1.5x scaling. It just renders everything one way, and if the user wants to make items on the screen bigger or smaller, they tell macOS and it happens system-wide; apps don't need to worry about this at all. They just continue doing what they were doing. That's why a lot of things don't scale correctly on Windows, but on macOS there are never any problems. I've never seen a scaling issue on macOS, but I have seen scaling woes on Windows (4K displays seem to be the worst for this). Basically, scaling on macOS just seems to work, and it's one thing I think Apple got right.

So I would personally argue it's good that apps don't have access to the physical resolution as-is, because that's not a metric which they should care about.

It's important to note that on macOS there is no way to tell the operating system to render at the same resolution as your display but scale everything smaller. The way you scale on macOS is by changing your virtual resolution. It's not like on Windows, where you have a fixed resolution and you can change the scaling. Both methods have advantages and disadvantages. I personally prefer the macOS way, because apps will never have scaling issues and developers don't need to deal with making extra assets for 1.5x scaling, 2.0x scaling, etc. However, if you pick any scaled option other than the exact 2x one (1280x800 on this display), you have to compromise by giving up a bit of sharpness.

> If macOS uses these "virtual" resolutions as a means for UI scaling, why would it answer back to any querying program that the resolution is the virtual one

I'm not sure what you mean here, but macOS won't scale at 1440x900 unless you have that option selected in System Preferences. If you have the 1280x800 option selected, that's how it's going to scale, and everything will be as crisp as can be. That's how I have it configured right now. If you want the greatest parity between physical and virtual pixels, I recommend you do the same.

[Screenshot: System Preferences > Displays with the 1280x800 scaled option selected]
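If you want to check whether your current setting is pixel-perfect, here's a rough sketch (the 2560x1600 panel size is hard-coded as an assumption rather than queried from the OS):

```swift
import AppKit

// Compare the backing buffer the OS renders into with the panel's native pixel
// count; they match only for the "looks like 1280x800" option on this 13" display.
if let screen = NSScreen.main {
    let rendered = screen.convertRectToBacking(screen.frame).size
    let panel = CGSize(width: 2560, height: 1600)   // assumed 13" MacBook Pro panel, not queried
    let oneToOne = Int(rendered.width) == Int(panel.width) &&
                   Int(rendered.height) == Int(panel.height)
    print("Backing buffer \(Int(rendered.width)) x \(Int(rendered.height)); 1:1 with the panel: \(oneToOne)")
}
```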


> Where does the 2880x1800 resolution you mention come from? A multiple of what?

This is another one of the things that makes talking about resolution kind of confusing. In System Preferences it says 1440x900, which is the "scaling resolution," but it really means 2880x1800, because we're on a HiDPI display. Similarly, if you chose the 1280x800 scaling option, the actual virtual resolution would be 2560x1600. You probably already know the reason for this from the article you linked.

Netflix is displaying at 1080p, but that doesn't mean Netflix thinks your screen is 1080p. It's most likely because it considers that the optimal resolution for your internet connection, or it simply can't display more than 1080p (I think in Safari it can't). But I'm quite sure that Netflix detects your screen correctly as 2880x1800.

If you're curious about how websites and apps see your screen, go to this website: http://whatismyscreenresolution.net

You'll notice that if you set the scaling resolution as 1440x900, then your screen will be identified as such (actually it is technically twice that, but many websites just report the scaled value). Basically anything in your operating system will take that as your virtual resolution.

You can also test it by taking a fullscreen screenshot and then looking at its resolution. If you're scaling at 1440x900, the screenshot will be 2880x1800, confirming that the OS and everything in it is rendered at that virtual resolution before being downscaled to your 2560x1600 screen.
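If you prefer to do that check programmatically, something like this should show the same thing (a sketch using the older CoreGraphics capture call; on recent macOS versions it needs Screen Recording permission, and the pixel count it reports is expected, not guaranteed, to match the screenshot test):

```swift
import CoreGraphics

// Capture the main display and print the pixel dimensions of the resulting image.
// With "looks like 1440x900" selected, this is expected to print 2880 x 1800.
if let capture = CGDisplayCreateImage(CGMainDisplayID()) {
    print("Captured \(capture.width) x \(capture.height) pixels")
} else {
    print("Capture failed (check Screen Recording permission)")
}
```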
 

ProteinePlus

macrumors member
Nov 16, 2018
46
4
Great explanation, AbominableFish. You may have a point in that this way of doing the scaling is easier for the developers.

Still, I'm not totally convinced this is the best way of doing things for all kinds of applications. Let me give you an example. As we know, Macs are not good at gaming, at least for intensive/complex 3D games, but cloud gaming is changing things, with options such as Google Stadia and GeForce Now.

With a good internet connection, you can play Destiny 2, all bells and whistles on, in GeForce Now on your 13" MacBook, since the heavy computation is done on the server and your Mac is only displaying a video stream, no different from Netflix. But the way macOS handles resolutions introduces a problem... Stadia or GeForce Now thinks your computer has a "1440x900" screen instead of a 2560x1600 one, and the internal rendering buffer is done at that resolution. You are definitely receiving a blurrier stream than necessary. Sure, you can push this "virtual res" up to 1680x1050, which is still below the 1080p that anybody with a lowly Chromebook would receive.

But this problem is not specific to cloud gaming; it affects local software too. If I test a graphical game on Windows with a 1080p screen, it's always crisper and more defined than the macOS version, due to it internally rendering at 1440x900 instead of a higher resolution, even though the Mac's physical screen is much more precise. And the upscaling to the HiDPI display, though it may help with text, somehow doesn't do much for graphics.

So, yes, this "virtual res" scaling is fine for desktop apps, but for heavy graphical apps I see problems, though it may be the best compromise for the Mac's target software.
 

antipodean

macrumors regular
May 2, 2014
113
67
Don't forget that photos and video are displayed at the higher resolution on a Retina display. So if you use "looks like 1280x800", photos and video elements are rendered at 2560x1600; UI elements, text, etc. use @2x HiDPI assets, so they look as sharp as on a 2560x1600 display but are sized as if the display were 1280x800.

The big difference from Windows is what happens if you select "looks like 1440x900" on a 13.3" 2560x1600 display. In macOS the frame buffer is 2880x1800 and is downscaled to 2560x1600 as the final step (presumably using something like bicubic filtering to do so as smoothly as possible). This taxes the GPU, but results in a desktop that is almost as sharp as native @2x, but with more real estate.
Same for external displays. E.g. I have my 4K 32" set to "looks like 2560x1440". Smooth as butter with my eGPU.
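To make the numbers concrete, here is the arithmetic spelled out as a tiny sketch (all values are assumed from the posts above, nothing is queried from the OS):

```swift
// "Looks like 1440x900" on a 13.3" 2560x1600 panel: macOS renders a 2x
// framebuffer and then resamples it down to the panel's native pixel grid.
let looksLike = (width: 1440.0, height: 900.0)
let framebuffer = (width: looksLike.width * 2, height: looksLike.height * 2)  // 2880 x 1800
let panel = (width: 2560.0, height: 1600.0)
let resampleRatio = framebuffer.width / panel.width                           // 1.125
print("Render \(Int(framebuffer.width)) x \(Int(framebuffer.height)), then downscale by \(resampleRatio)x to \(Int(panel.width)) x \(Int(panel.height))")
```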
 