
conanthewarrior

macrumors member
Original poster
Nov 5, 2014
53
3
Hi everyone, I am a bit confused about which resolution is best to choose for videos on my 15", 2013 MBP that I got a few days ago.

I understand the screen resolution is 2880x1800, but that it actually runs at half this, scaled up by 2x to make things appear larger.

On YouTube, am I best to choose 1080 or 1440? There are enough pixels for 1440, but as the OS scales up 1440x900 by 2x, I don't know if I will actually be getting 1440, or even 1080.

1080 definitely looks much better than 720, and I think 1440 looks better still, but I'm unsure whether that last difference is just psychological.

Thanks for your help, Conan.
 

New_Mac_Smell

macrumors 68000
Oct 17, 2016
1,931
1,552
Shanghai
Choose automatic and you get the best resolution your bandwidth can support.

Curious why this is a question? There's no harm in choosing the best you can use.
 

caramelpolice

macrumors regular
Oct 6, 2012
212
232
The OS is not running at 1440x900 and blowing everything up.

The OS is running at 2880x1800. Text is rendered twice as large, and the OS and icons use higher-resolution "2X" bitmap assets so everything looks sharper. Since everything is rendered with assets at twice the resolution, you get the same "real estate" as a non-Retina 1440x900 display.

Long story short - you can watch 1080p and 1440p video on YouTube and rest assured you are seeing the full resolution.
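If you want to sanity-check that, here is a minimal Swift playground sketch (purely illustrative; the variable names aren't from any particular app) that compares the screen's size in points with the size of its backing store in physical pixels:

Code:
import Cocoa

// Logical size (points) vs. physical size (hardware pixels) of the main screen.
if let screen = NSScreen.main {
    let points = screen.frame.size                                 // e.g. 1440 x 900
    let pixels = screen.convertRectToBacking(screen.frame).size    // e.g. 2880 x 1800
    print("points:", points, "pixels:", pixels)
}

On a 15" Retina MBP at the default setting this should report 1440x900 points backed by 2880x1800 pixels, so a 1440p video still has real pixels to land on.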
 

redheeler

macrumors G3
Oct 17, 2014
8,574
9,162
Colorado, USA
On YouTube, am I best to choose 1080 or 1440? There are enough pixels for 1440, but as the OS scales up 1440x900 by 2x, I don't know if I will actually be getting 1440, or even 1080.
1440p video does take advantage of your Retina display, making it look crisper and show more detail than 1080p. If you can go all the way up to 2160p (4K) without issue, then do so, as even compared to 1440p you will see slightly more detail and less-noticeable compression artifacts.
 

conanthewarrior

macrumors member
Original poster
Nov 5, 2014
53
3
Thanks for the replies.

I asked because I was interested in whether the video is actually displayed at that resolution or not. I just like to know how things work; it may seem like a silly question, but I really wanted to know.

Oh, I misunderstood then. I thought the OS was running so it looks like 1440x900, but blowing it up by using the extra pixels to give the clarity of 2880x1800.

I can run 2160p without issue; I didn't know it would be beneficial.

Again, sorry if the question seems dumb, I just really wanted to know the answer.
 

leman

macrumors Core
Oct 14, 2008
19,409
19,491
Oh, I misunderstood then. I thought the OS was running so it looks like 1440x900, but blowing it up by using the extra pixels to give the clarity of 2880x1800.

That's exactly how it works. In the 1440x900 mode, it draws each "pixel" using 2x2 real pixels for increased detail. This is essentially what is known as supersampling AA or subsample/subpixel rendering, except that in this case the subsamples are actual physical pixels.

Other scaled resolutions work the same way.

Software that is aware of the HiDPI nature of the display can access the "subpixels" directly to prevent loss of detail when resampling images. Safari does this when playing YouTube videos.

I can run 2160p without issue; I didn't know it would be beneficial.

Well, do you see a difference? If not, does it matter?
 

conanthewarrior

macrumors member
Original poster
Nov 5, 2014
53
3
That's exactly how it works. In the 1440x900 mode, it draws each "pixel" using 2x2 real pixels for increased detail. This is essentially what is known as supersampling AA or subsample/subpixel rendering, except that in this case the subsamples are actual physical pixels.

Other scaled resolutions work the same way.

Software that is aware of the HiDPI nature of the display can access the "subpixels" directly to prevent loss of detail when resampling images. Safari does this when playing YouTube videos.



Well, do you see a difference? If not, does it matter?

Oh, I was confused because caramelpolice said the OS is not running at 1440x900 and blowing everything up.

I don't see a difference between 1440 and 2160p, no, and of course it doesn't matter. I just really wanted to know how the 1440x900 to 2880x1800 mapping was done, and you explained it perfectly for me :).
 

jerryk

macrumors 604
Nov 3, 2011
7,419
4,207
SF Bay Area
I assume YouTube is just selecting the resolution of the source. How best to display that source resolution based on your MacBook's settings is a function of macOS and the browser used.
 

caramelpolice

macrumors regular
Oct 6, 2012
212
232
leman's post is... extremely wrong. It's not supersampling AA or "subpixel rendering." It's much simpler than that.

It's literally just "everything is drawn with higher-resolution assets." Like, hypothetically, say an app uses a 32x32 image for a button on a non-Retina display. On a Retina display, it uses a 64x64 image instead. It doesn't just take the 32x32 image and blow it up; there is a separate 64x64 version made with more detail to take advantage of the extra pixels. This extends to every image, button, icon, etc. you see in macOS on a Retina display. Because everything is drawn with assets that are exactly twice the size of "non-Retina" assets, you end up with the same effective screen real estate as a non-Retina display with half (or a quarter, technically) the pixels (so a 2880x1800 display gets you the same effective screen real estate as a 1440x900 "non-Retina" display).

This is assuming you're using the "native" resolution of your display, of course ("like 1280x800", aka 2560x1600 for the 13" Pro and "like 1440x900", aka 2880x1800 for the 15" Pro). For the other resolutions, it resamples. So, for instance, the "1680x1050" mode on the 13" Pro actually renders a desktop at 3360x2100 internally, then downsamples it to the display's native 2560x1600. This means the image isn't quite as crisp as the "1280x800" mode, because you inevitably lose pixel detail when you shrink an image.
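As a rough illustration of the 32x32/64x64 button example above, here is a hedged Swift sketch (the file names button.png and button@2x.png are hypothetical) showing how a single NSImage can carry both bitmaps while occupying the same 32x32 points on screen:

Code:
import Cocoa

// One image, two bitmaps: AppKit picks the representation that best matches
// the screen's scale factor when the image is actually drawn.
let icon = NSImage(size: NSSize(width: 32, height: 32))   // size in points

if let data1x = try? Data(contentsOf: URL(fileURLWithPath: "button.png")),     // assumed 32x32-pixel file
   let rep1x = NSBitmapImageRep(data: data1x) {
    icon.addRepresentation(rep1x)
}
if let data2x = try? Data(contentsOf: URL(fileURLWithPath: "button@2x.png")),  // assumed 64x64-pixel file
   let rep2x = NSBitmapImageRep(data: data2x) {
    rep2x.size = NSSize(width: 32, height: 32)   // 64x64 pixels drawn into a 32x32-point area
    icon.addRepresentation(rep2x)
}

print(icon.size)   // 32x32 points on Retina and non-Retina displays alike

In practice you would just put both files in the app bundle or an asset catalog and let AppKit do the pairing automatically; the manual version only makes the point/pixel relationship explicit.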
 

leman

macrumors Core
Oct 14, 2008
19,409
19,491
leman's post is... extremely wrong. It's not supersampling AA or "subpixel rendering." It's much simpler than that. It's literally just "everything is drawn with higher-resolution assets."

That is the basic idea behind supersampling: you draw at a higher resolution than your display allows (subpixel precision) in order to increase visual acuity. Except that in the case of HiDPI hardware, the display actually can physically display the "subpixels". This is all about the relationship between logical and physical pixels on the screen.

Using higher-resolution assets is the logical consequence of this. Factually, they have exactly the same size as low-DPI assets (since they appear at the same dimensions on the screen); they simply increase the sampling resolution. Unfortunately, due to technically necessary decisions and conventions in computer graphics, we are conditioned into talking about resolutions in a way that ignores the basic idea behind the concept — resolution is simply about the level of spatial detail you can display. In other words, the dimensions of an image and its spatial resolution are different things. Printer people got it right. Computer people still mix it up, since they are used to the old idea of hardcoded DPI (which never really worked anyway).

I am looking forward to next-gen operating system APIs where the UI and graphical assets are specified in terms of their actual physical dimensions plus spatial information at a suitable but flexible resolution, and the OS correctly maps this information to the panel's capabilities and dimensions.

So, for instance, the "1680x1050" mode on the 13" Pro actually renders a desktop at 3360x2100 internally, then downsamples it to the display's native 2560x1600. This means the image isn't quite as crisp as the "1280x800" mode, because you inevitably lose pixel detail when you shrink an image.

At the same time it will be more accurate than a 1680x1050 on a native 1680x1050 panel.
 

caramelpolice

macrumors regular
Oct 6, 2012
212
232
That is the basic idea behind supersampling: you draw at a higher resolution than your display allows (subpixel precision) in order to increase visual acuity. Except that in the case of HiDPI hardware, the display actually can physically display the "subpixels". This is all about the relationship between logical and physical pixels on the screen.

Using higher-resolution assets is the logical consequence of this. Factually, they have exactly the same size as low-DPI assets (since they appear at the same dimensions on the screen); they simply increase the sampling resolution. Unfortunately, due to technically necessary decisions and conventions in computer graphics, we are conditioned into talking about resolutions in a way that ignores the basic idea behind the concept — resolution is simply about the level of spatial detail you can display. In other words, the dimensions of an image and its spatial resolution are different things. Printer people got it right. Computer people still mix it up, since they are used to the old idea of hardcoded DPI (which never really worked anyway).

I am looking forward to next-gen operating system APIs where the UI and graphical assets are specified in terms of their actual physical dimensions plus spatial information at a suitable but flexible resolution, and the OS correctly maps this information to the panel's capabilities and dimensions.



At the same time it will be more accurate than a 1680x1050 on a native 1680x1050 panel.

I think we might mostly just disagree on terminology.

"Supersampling" is pretty exclusively used to refer to, as you mentioned, drawing at a higher resolution and scaling down. macOS does do supersampling in high-DPI mode - that's exactly how the downsampled high-resolution scales work. But the "1280x800" and "1440x900" modes on the 13" and 15" Retina MBPs, respectively, are not supersampled by the common definition of supersampled. The image being rendered is a 1:1 perfect pixel match for the display's native resolution.

What you are getting at in terms of "logical" pixels and "physical" pixels is what macOS internally refers to as "points" (logical) versus "pixels" (physical). In other words, "pixels" in macOS parlance always refers to the physical pixels. Because macOS has to work on both non-Retina and Retina displays, software developers design their UIs and layouts in terms of "points" (which are basically units of "a non-Retina pixel"), which are then translated into actual pixels by the operating system depending on whether it's a "Retina" display or not. But this is a software implementation detail in practice, and is mostly only used for UI - things like video playback, 3D rendering, and so on can address individual pixels as they always have. From an end user's point of view, what matters is that they are not seeing a lower-resolution image that is then "blown up". Every pixel of their display is taken advantage of and can be directly addressed.
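To make the points-versus-pixels split concrete, here is a small hedged sketch (the class name is made up for illustration) of a custom NSView asking how many device pixels back its bounds, which is what a Retina-aware renderer would use when producing its own bitmaps:

Code:
import Cocoa

// A custom view that checks the full backing resolution before rendering.
class CrispView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        let pointSize = bounds.size                     // layout units (points)
        let pixelSize = convertToBacking(bounds).size   // actual device pixels
        // On a Retina panel pixelSize is twice pointSize in each dimension, so any
        // bitmap generated here should be pixelSize, not pointSize, to stay crisp.
        print("points:", pointSize, "pixels:", pixelSize)
    }
}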
 

leman

macrumors Core
Oct 14, 2008
19,409
19,491
Oh, I was confused because caramelpolice said the OS is not running at 1440x900 and blowing everything up.

Well, caramelpolice is not wrong, and what they wrote is in fact technically accurate. If there is any confusion, it's because of how we are habituated to think about resolutions and pixels (as I mentioned above).

The thing is, the OS is indeed running at a 1440x900 resolution. It's easy to check — open up a new Swift playground in Xcode and use the following code to query the dimensions of the screen:

Code:
import Cocoa

// NSScreen reports its frame in points (logical pixels), not hardware pixels.
let main_screen = NSScreen.main!

main_screen.frame.width   // e.g. 1440.0 in the default "looks like 1440x900" mode
main_screen.frame.height  // e.g. 900.0

Mine says 1680x1050 (I am running a scaled retina mode).

And that is what your software sees — it genuinely believes it is running at this resolution, just as if you were using a non-Retina mode. This is what is known as logical pixels, or points. At the same time, everything is of course drawn at the full 2880x1800 resolution, since every point is represented by more than one hardware pixel. Before Retina displays, the mapping was usually one point = one hardware pixel. After Retina displays, it's more complicated.

So yes, it's running at 1440x900, but displays it using 2880x1800 pixels. The OS is aware of this discrepancy, of course, and is able to use higher-quality assets to take advantage of the available higher spatial resolution (just as caramelpolice describes). An app that is Retina-aware can also check whether the system is running in HiDPI mode and adjust its custom drawing appropriately.
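A minimal sketch of that check (assuming the app only wants to know whether it is on a 2x panel) could look like this:

Code:
import Cocoa

// backingScaleFactor is 2.0 on Retina panels and 1.0 on non-Retina ones.
let scale = NSScreen.main?.backingScaleFactor ?? 1.0
let isHiDPI = scale > 1.0
print("scale:", scale, "HiDPI:", isHiDPI)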
"Supersampling" is pretty exclusively used to refer to, as you mentioned, drawing at a higher resolution and scaling down.

This is why I said it was the same idea as supersampling — drawing at a higher resolution than the target resolution (the target resolution is still the logical resolution, aka the resolution in points!). Of course supersampling in itself is technically different, since it contains the obligatory resolve step. I still think it's a good analogy — you take advantage of the fact that the display can draw at sub-point accuracy. But yes, it's very easy to lose the common ground here, sorry if this caused confusion.

But this is a software implementation detail in practice, and is mostly only used for UI - things like video playback, 3D rendering, and so on can address individual pixels as they always have. From an end user's point of view, what matters is that they are not seeing a lower-resolution image that is then "blown up". Every pixel of their display is taken advantage of and can be directly addressed.

Well, it depends on what your software does. Legacy apps only see the logical resolution. For instance, if you query a full-screen OpenGL context, you will get a 1440x900 buffer, etc. So if you do custom drawing, you have to take care of the point/pixel discrepancy manually (of course, if you use system APIs, they will do it for you automatically).

This opens up a world of exciting possibilities where one can combine the best of both worlds. For instance, in the 3D software I am writing, I draw the 3D scenes at reduced resolution (to allow for better performance), while I draw the UI at full panel resolution. This gives you decent graphics with good performance, and a super-crisp UI. Unfortunately, almost no game bothers doing it the right way...
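A rough sketch of that idea using Metal (the MTKView setup and the 0.5 render scale here are purely illustrative, not the actual code from my project):

Code:
import Cocoa
import MetalKit

// Render the 3D scene into a smaller drawable while AppKit keeps drawing the
// UI at the panel's full Retina resolution.
func configureRenderTarget(for metalView: MTKView, renderScale: CGFloat = 0.5) {
    let fullPixels = metalView.convertToBacking(metalView.bounds).size
    metalView.autoResizeDrawable = false   // keep our reduced drawable size on resize
    metalView.drawableSize = CGSize(width: fullPixels.width * renderScale,
                                    height: fullPixels.height * renderScale)
    // The GPU now fills fewer pixels per frame; the result is scaled up to fit the
    // view, while the text and controls around it stay at native 2x sharpness.
}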
 

ixxx69

macrumors 65816
Jul 31, 2009
1,298
879
United States
Hi everyone, I am a bit confused about which resolution is best to choose for videos on my 15", 2013 MBP that I got a few days ago.

I understand the screen resolution is 2880x1800, but that it actually runs at half this, scaled up by 2x to make things appear larger.

On YouTube, am I best to choose 1080 or 1440? There are enough pixels for 1440, but as the OS scales up 1440x900 by 2x, I don't know if I will actually be getting 1440, or even 1080.

1080 definitely looks much better than 720, and I think 1440 looks better still, but I'm unsure whether that last difference is just psychological.

Thanks for your help, Conan.
As long as you're using one of the HiDPI/scaled resolution settings, your display will show images and video at their native resolution and will only start scaling once you exceed the display resolution - e.g. a 4K image/video would scale down to 2880x1800 (to fit on the display). A 1080p HD video will play at native resolution since it's less than 2880x1800.

Some of the discussion in this thread goes a little into the weeds... if anyone wants to know how desktop scaling works, this Apple developer support doc explains it pretty well in simple, clearly defined terms...
https://developer.apple.com/library...hResolutionOSX/Introduction/Introduction.html

Cheers!
 