Why would Netflix think it's a 2880x1800 display? It's a 2560x1600 one, so why the jump?

Also, why include a 2560x1600 display if macOS is limiting me to 1680x1050?

The more I read, the less I understand. In Windows it's very simple: you have a native 1080p display, you choose a 1080p resolution for it, and everything on it is 1080p.

In my MacBook/macOS, my screen is natively 2560x1600, the interface is at 1680x1050 at max scaling, and Netflix is displaying at 1920x1080, even though it thinks it's 2880x1800. How does that make sense? I must be very dumb, because I find it very confusing.
To answer your question in case the article didn't: maybe a better way to say it is this. You have a screen that is physically 2560x1600. However, if you set 2880x1800 as your "scaling resolution" in System Preferences, what that basically does is tell the OS to pretend the screen resolution is 2880x1800. Of course, you will see it downscaled to 2560x1600, because your screen can't display more pixels, but your operating system will act as though you have a higher-resolution display, so Netflix will also think you're on a higher-resolution display, just like YouTube and any other program.

Windows actually has this as well. I forget exactly where it is, but you can choose scaling that is different from the normal display scaling (it shows up as a slider from 50-400% or something like that).

You said Netflix is displaying at 1080p. It probably "knows" that your resolution is set at 2880x1800, but it just displays 1080p to save bandwidth.

If you want more explanation about how it works on macOS feel free to ask and I'll do my best to explain what I know.
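If you're curious what an app actually sees, here's a minimal sketch (my own illustration, not from the article) using AppKit's NSScreen, which is what Mac apps query for this:

```swift
import AppKit

// Minimal sketch: what macOS reports to an app on a 13" 2560x1600 panel.
// With the "looks like 1440x900" setting you'd expect roughly:
//   points:  1440x900   (the virtual resolution apps see)
//   scale:   2.0        (Retina / HiDPI backing scale factor)
//   backing: 2880x1800  (the virtual framebuffer Netflix also "sees")
for screen in NSScreen.screens {
    let points = screen.frame.size
    let scale = screen.backingScaleFactor
    let backing = screen.convertRectToBacking(screen.frame).size
    print("points: \(Int(points.width))x\(Int(points.height))  "
        + "scale: \(scale)  "
        + "backing: \(Int(backing.width))x\(Int(backing.height))")
}
```

Save it as scale.swift and run it with `swift scale.swift` to see the values for your own setup.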
 
Thanks AbominableFish for your kind reply.

What I find confusing in macOS is that apparently scaling is managed as a "resolution option". That is, you set a "virtual" 1680x1050 or 1440x900 resolution as a way to scale your UI, even though behind the scenes your real resolution is always 2560x1600.

In Windows it doesn't work like that; you set a resolution that can be any supported combination up to the max physical resolution of your screen, and you set up UI scaling with a different option. For example, you have a 1920x1080 (1080p) res and the UI at 125% or 150%. Any program querying the operating system sees that the computer has a 1080p res (whether you are at 75%, 125%, or 150% UI scaling).

Now, my remaining doubts:

- If macOS uses these "virtual" resolutions as a means of UI scaling, why would it report to any querying program that the res is the virtual one (1440x900) instead of the real one (2560x1600)? For example, I have plenty of programs that display graphics at 1440x900 when they should be doing it at 2560x1600 and be a lot crisper.

- Where does the 2880x1800 resolution you mention come from? A multiple of what? Why would Netflix detect that? (On my MacBook it displays at 1080p, but now I doubt that's true since the default res is 1440x900.)

Thanks!
 

2880x1800 is for the 15-inch MacBook Pro models (2015 and later); 2560x1600 is for the 13-inch MacBook Pro models.

I think it's more reasonable to report the virtual resolution, since applications have no access to higher resolutions anyway.
 
No problem, I'm glad it helped.

As for the scaling on macOS vs Windows, I personally prefer the way it is done on macOS. The application has no information about the scaling, which is good, because then it doesn't need to decide how to display at 1x resolution or at 1.5x scaling. It just renders everything one way, and if the user wants to make items on the screen bigger or smaller, they tell macOS and it happens system-wide; apps don't need to worry about this at all. They just continue doing what they were doing. That's why a lot of things don't scale correctly on Windows, but on macOS there are never any problems. I've never seen a scaling issue on macOS, but I have seen scaling woes on Windows (4K displays seem to be the worst for this). Basically, scaling on macOS just seems to work, and it's one thing I think Apple got right.

So I would personally argue it's good that apps don't have access to the physical resolution as-is, because that's not a metric which they should care about.

It's important to note that on macOS there is no way to tell the operating system to render at the same resolution as your display but scale everything smaller. Basically, the way you scale on macOS is by changing your virtual resolution. It's not like on Windows, where you have a fixed resolution and you can change the scaling. Both methods have advantages and disadvantages. I personally prefer the macOS way, because then apps never have scaling issues and developers don't need to make extra assets for 1.5x scaling, 2.0x scaling, etc. However, if you want to scale at anything other than 100%, you have to compromise by giving up some sharpness.
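As a minimal sketch of what that means for a developer (my illustration, not anything from this thread): the app draws in points and never consults the scale factor, and AppKit rasterizes the same code at whatever backing scale the display happens to have.

```swift
import AppKit

// This view draws a 100x100-point square. On a 2x Retina display AppKit
// rasterizes it at 200x200 device pixels; on a 1x display, at 100x100.
// The drawing code never changes and never sees the scale.
final class SquareView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        NSColor.systemBlue.setFill()
        NSRect(x: 20, y: 20, width: 100, height: 100).fill()
    }
}
```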

> If macOS uses these "virtual" resolutions as a means of UI scaling, why would it report to any querying program that the res is the virtual one

I'm not sure what you mean here, but macOS won't scale at 1440x900 unless you have that option selected in System Preferences. If you have the 1280x800 option selected, that's how it's going to scale, and everything will be as crisp as can be. That's how I have it configured right now. If you want the greatest parity between physical and virtual pixels, I recommend you do the same.

[Screenshot: the scaled-resolution options in System Preferences > Displays, with 1280x800 selected]


> Where does the 2880x1800 resolution you mention come from? A multiple of what?

This is another one of the confusions that make talking about resolution kind of annoying. In System Preferences it says 1440x900, which is the "scaling resolution." But it really means 2880x1800, because we're on a high-DPI display. Similarly, if you chose the 1280x800 scaling option, the actual virtual resolution would be 2560x1600. The reason for this you probably know already from the article you linked.

Netflix is displaying at 1080p, but that doesn't mean Netflix thinks your screen is 1080p. Most likely it decides that's the optimal resolution for your internet connection, or it simply can't display more than 1080p (I think in Safari it can't). But I'm quite sure that Netflix detects your screen correctly as 2880x1800.

If you're curious about how websites and apps view your screen, go to this website: http://whatismyscreenresolution.net

You'll notice that if you set the scaling resolution as 1440x900, then your screen will be identified as such (actually it is technically twice that, but many websites just report the scaled value). Basically anything in your operating system will take that as your virtual resolution.

You can also test it by taking a fullscreen screenshot and then looking at its resolution. If you're scaling at 1440x900, the screenshot will be 2880x1800, confirming that the OS and everything in it is rendered at that virtual resolution before being downscaled to your 2560x1600 screen.
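If you'd rather not take a screenshot, you can check the same thing programmatically. A hedged sketch using CoreGraphics (the example values assume the "looks like 1440x900" setting):

```swift
import CoreGraphics

// The current display mode reports both the point size and the size of
// the virtual framebuffer behind it.
let display = CGMainDisplayID()
if let mode = CGDisplayCopyDisplayMode(display) {
    print("mode:   \(mode.width) x \(mode.height) points")     // e.g. 1440 x 900
    print("pixels: \(mode.pixelWidth) x \(mode.pixelHeight)")  // e.g. 2880 x 1800
}
```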
 
Great explanation, AbominableFish. You may have a point in that this way of doing the scaling is easier for the developers.

Still, I'm not totally convinced this is the best way of doing things for all kinds of applications. Let me give you an example. As we know, Macs are not good at gaming, at least for intensive/complex 3D games, but cloud gaming is changing things, with options such as Google Stadia and GeForce Now.

With a good internet connection, you can play Destiny 2, all bells and whistles on, in GeForce Now on your 13" MacBook, since the heavy computation is done on the server and your Mac is only displaying a video stream, no different from Netflix. But the way macOS handles resolutions introduces a problem... Stadia or GeForce Now thinks your computer has a "1440x900" screen instead of a 2560x1600 one, and the internal rendering buffer is set to that resolution. You are definitely receiving a blurrier stream than necessary. Sure, you can play with this "virtual res" up to 1680x1050, which is still under what anybody with a lowly Chromebook would receive at 1080p.

But this problem is not specific to cloud gaming; it affects local software too. If I test a graphical game on Windows with a 1080p screen, it's always crisper and more defined than the macOS version, because the macOS version internally renders at 1440x900 instead of a higher res, even though the Mac's physical screen is far more precise. And the upscaling to the HiDPI display, though it may help with text, somehow doesn't do much for graphics.

So, yes, this "virtual res" scaling is fine for desktop apps, but for heavy graphical apps I see problems, though it may be the best compromise for the software Macs target.
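To put rough numbers on the streaming complaint (my own arithmetic, assuming the service encodes at the reported virtual resolution):

```swift
import Foundation

// Best case on the scaled Mac is "looks like 1680x1050"; a plain 1080p
// Chromebook negotiates 1920x1080. Compare the source pixel counts.
let macStream = 1680.0 * 1050.0    // ~1.76 Mpx source
let plain1080p = 1920.0 * 1080.0   // ~2.07 Mpx source
let ratio = 100.0 * macStream / plain1080p
print(String(format: "scaled Mac stream: %.0f%% of the 1080p pixel count", ratio))
// prints roughly 85%: fewer source pixels despite the sharper panel
```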
 
Don’t forget that photos and video are displayed at the higher resolution on a Retina display. So if you use “looks like 1280x800”, photo and video elements are rendered at 2560x1600; UI elements, text, etc. use @2x HiDPI assets, so they look as sharp as on a 2560x1600 display but are sized as if the display were 1280x800.

The big difference from Windows is what happens if you select “looks like 1440x900” on a 13.3” 2560x1600 display. In macOS the framebuffer is 2880x1800 and is downscaled to 2560x1600 as the final step (presumably using something like bicubic filtering to do so as smoothly as possible). This taxes the GPU, but it results in a desktop that is almost as sharp as native @2x, with more real estate.
Same for external displays. E.g., I have my 32” 4K display set to “looks like 2560x1440”. Smooth as butter with my eGPU.
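Putting that chain into a small sketch with concrete numbers (my illustration of the post above, not an Apple API):

```swift
// "Looks like 1440x900" on a 13.3" 2560x1600 panel: points are doubled
// into a virtual framebuffer, which is filtered down to the physical
// panel as the final step.
let looksLike = (w: 1440.0, h: 900.0)                       // chosen scaled size, in points
let framebuffer = (w: looksLike.w * 2, h: looksLike.h * 2)  // 2880x1800 backing store
let panel = (w: 2560.0, h: 1600.0)                          // physical pixels
let downscale = (panel.w / framebuffer.w * 1000).rounded() / 1000
print("framebuffer \(Int(framebuffer.w))x\(Int(framebuffer.h)) -> panel at \(downscale)x")
// -> framebuffer 2880x1800 -> panel at 0.889x
```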
 
Wow. I'm so glad this discussion is here. I'm trying to decide whether to keep an M1 MBP that I'm test driving or go back to my 15" 2018 MBP. So far the M1 is surpassing my expectations even though I picked up an 8GB version.

Even though it's working out well as a computer, I'm on the verge of returning it because I just couldn't get comfortable using it as a laptop. Something seemed off visually. I thought it was my glasses. I suspected that maybe the slight size difference changed the way I placed the laptop on my lap and was making my vision blurry.

No. It's that the 13" is scaled differently than my 15" is.

Well. I wish I had figured this out earlier. I've been rubbing my eyes for the past week.
 

Probably the aspect ratio. I far prefer the 15s to the 13s when the computer is on my lap.
 

Yeah, I'm only now realizing that a 13" isn't just a smaller 15". I wanted to give downsizing a try. The things I knew to be potential issues are pretty much non-factors, but the potential issues I wasn't aware of are creating some challenges for me.

I'm not ready to give up yet, but deciding between the M1 MBP I'm test driving and my old 2018 15" is proving more difficult than I expected.
 

Well, to throw more into the mix, there are the 14 and 16 inch models coming out in the next 6-12 months.
 
I expect to end up with a 16" eventually, but given the anticipation, pent-up demand, and worldwide supply chain issues, whatever I decide to go with is going to have to be viable as a daily driver for at least 6 months.
 
I’m in the same boat as you. Picked up a 13” M1 in March because I needed to go back to the office after I sold my 16” back in January to get some $$ out of it while I could (bad decision)...
I’ve always been using the 15” and 16” before the M1, so switching to such a small display was not pleasant.
I now set the resolution to 2560x1600 when I work closely on it (and 1680 when using a 2nd monitor) and the amount of space is fantastic, almost identical to a 16” - though everything is tiny in the UI. I have good eyesight, so it’s not really an issue for me, and many apps allow for a zoomed UI (VS Code, Teams, Firefox/Safari) so the content is actually decently sized while the menu bar remains small.
I was thinking, like you, that I would go back to a 16” when they’re running Apple Silicon, but the more I use the 13”, the more I feel it’s the perfect laptop size, and the power is quite sufficient for me as a developer. I can even run Windows in Parallels, run Visual Studio, and build and debug complex apps, and it just runs well. It’s not the fastest computer on the block, but it works, and it works silently, doesn’t get hot, and it’s a pleasure to use. And it’s also more practical to carry and lighter than a 16”.
So all in all, it’s a great little machine, and I prefer it a lot more than my 16”.
Just wanted to share all this, as it might help you make a decision on which one to keep/look forward to. Just get 16 GB of ram.
 

I'm at the age where you wear progressive lenses, and the one quirk about the 13" that I wasn't ready for is that, because of the modest shortening of the laptop dimensions, the screen sits just a smidge too close, and I'm having some difficulty adjusting to it.

Of course, if I stuck with it, I'd most likely adapt, but this was an unexpected challenge.

I'm definitely going with a 16GB machine when the new models come out. I intentionally went with 8GB for now because I had advised some people that they needn't worry about 8GB being too little for their modest use cases. I wanted to put my money where my mouth is and at least try it out so I know I'm not a liar. I heard from some pretty heavy users that they tried an 8GB M1 and were doing fine, so I didn't feel like it was too much of a risk, especially since I wasn't planning for this to be my endgame laptop.

I would just return it if 8GB was too limiting... and it's not, but the extra money for the 16GB model isn't an issue so I'll bump up to that when I get my endgame laptop. My most recent 15" is a 32GB. The way the M1 makes use of memory, I don't think that'll be necessary anymore.
 

I use trifocals (distance, computer, reading) when walking about but bifocals (computer, reading) when at a computer. I have the bifocals at my desktop cluster and in the living room where I use a laptop. I also have a pair of trifocals with a very tall computer section that can be used with a computer. I don't think that I could manage progressives, because you have to tilt your head or angle the glasses to get the right focus for the distance you're looking at. It is somewhat of a pain with the glasses. I have about ten pairs: single-vision, bifocals, trifocals, and sunglass and night-driving variants. But I've found that it's the most comfortable way to go.
 

I'm only on my second set of progressives. My first one a few years ago felt like something I'd never get used to, but after a week it wasn't a total struggle and after a month it felt seamless.

I was expecting my second pair to be a continuation of the first, but of course that's not how it works. You know how every time you get new glasses you go through that adjustment period where you're convinced they must have gotten the measurements wrong? They must have gotten your left eye's and right eye's prescriptions mixed up, or damnit, you must be wearing someone else's glasses. 😂

Well, it was like this for my new progressives, and it was very unfortunate that I was trying to get used to a 13" laptop at the same time I was adapting to a new set of progressives. I can't really tell if it's the laptop or my glasses when I'm having problems reading the screen.

It is getting better now, though, after a full week of using a 13" screen. I think I'll be able to adjust. I did give having a bunch of single-vision lenses some consideration, but since I can't remember where my keys are for more than 5 minutes at a time, I'd probably end up having to drive with my reading glasses. I only have a pair of single-vision glasses for night driving, and half the time I can't remember where I put them because I'm not doing that much night driving.
 