
Drich290195

macrumors 6502
Original poster
Apr 2, 2011
467
6
I have a mid-2015 15-inch MacBook Pro. I'm just playing with SwitchRes and have a couple of questions. It offers resolutions in both HiDPI and non-HiDPI variants.

What is the difference? What does HiDPI do? As far as I'm aware, you only get the true resolution at 1440x900. Do the scaled HiDPI resolutions still render images at the full native 1:1 pixel ratio and just scale the text, or does only the "best for Retina" setting do that?

Am I right in thinking that scaled resolutions still render the image at the native 1:1 pixel ratio and just scale the text and UI elements on screen?

I'm confused as to what HiDPI is, so any help would be gratefully received.
 
  • Like
Reactions: Marty_Macfly

Toutou

macrumors 65816
Jan 6, 2015
1,082
1,575
Prague, Czech Republic
HiDPI mode means that the display stays at a certain resolution but draws everything at twice the scale.

E.g. a 3840x2160 (4K) display can work as a huuuuge display with the screen real estate of four (normal) FullHD displays.
Or in HiDPI mode the screen real estate will be equal to a FullHD display but displayed with four times as many pixels (much sharper text, windows, icons).
The screen of your MacBook runs at 2880x1800 HiDPI — the screen real estate is equal to any other 1440x900 laptop, but is sharper. Apple calls this "retina display".

A "scaled resolution" is a term Apple uses for something that's really just feeding a non-native resolution to the display. For example, I'm running my 13" at 1440x900, scaled. The display itself is 2560x1600.
This is what happens:
- the screen is composed (window sizes, positions) as if it was a 1440x900 screen
- it's a retina screen (HiDPI), so everything is rendered x2, i.e. at 2880x1800
- the 2880x1800 result is displayed on a 2560x1600 screen
Note that the resolutions don't match each other, so each physical pixel displays the result of interpolation between neighbouring virtual pixels. This causes the image to be sliiiightly blurrier, which is a hit I'm willing to accept for more screen real estate. This is the whole magic behind "scaled resolution". (There's no magic)

Windows handles scaling differently - window and font sizes are actually changing by a percentage. This produces clear and sharp images but is much harder to implement and can cause weird compatibility problems.
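The three steps above are just arithmetic, so they can be sketched in a few lines of Python (a toy model of the pipeline, not an actual macOS API; the function name is made up):

```python
def scaled_resolution_pipeline(virtual, panel):
    """Model macOS 'scaled' HiDPI rendering for a chosen virtual resolution.

    virtual: the resolution the UI is composed at (e.g. 1440x900)
    panel:   the physical panel resolution (e.g. 2560x1600)
    Returns the backing-buffer size and the downscale factor applied
    when that buffer is pushed to the panel.
    """
    vw, vh = virtual
    pw, ph = panel
    backing = (vw * 2, vh * 2)      # HiDPI: everything rendered at 2x
    downscale = backing[0] / pw     # > 1.0 means interpolation (slight blur)
    return backing, downscale

# The 13" example from the post: 1440x900 virtual on a 2560x1600 panel
backing, factor = scaled_resolution_pipeline((1440, 900), (2560, 1600))
print(backing)   # (2880, 1800), rendered larger than the panel
print(factor)    # 1.125, so each physical pixel blends neighbouring virtual pixels
```

Any factor other than exactly 1.0 or 2.0 is where the slight blur comes from.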
 
  • Like
Reactions: JerTheGeek

Drich290195

macrumors 6502
Original poster
Apr 2, 2011
467
6
Thanks, that helps a lot.


 

leman

macrumors Core
Oct 14, 2008
19,444
19,556
HiDPI simply refers to the fact that the resolution used is much higher than what was traditionally used. In its current usage, it also implies sub-point rendering. It's very similar to how printers work: the quality of the output depends on how many small points you use to construct your image — but the resolution does not affect the size of the output.
Windows handles scaling differently - window and font sizes are actually changing by a percentage. This produces clear and sharp images but is much harder to implement and can cause weird compatibility problems.

It doesn't matter whether you scale by percentage or use a constant backing factor — there will still be interpolation, real or implied (btw, OS X did include a flexible scaling factor as an experimental feature, but that was scrapped around Snow Leopard times). The main reason why Windows appears crisper to some is their way of rendering fonts.
 

Toutou

macrumors 65816
Jan 6, 2015
1,082
1,575
Prague, Czech Republic
whether you scale by percentage or use a constant backing factor — there will still be interpolation
I was not stressing the percentage/constant difference, but if I'm not terribly wrong, DPI-aware (important!) apps in Windows can adjust their font sizes and graphics accordingly so there is no need for interpolation — the output is always native and rendered with scaling in mind.

This is a difficult but technically superior solution compared to Apple's "we make the hardware, so let's just render at 200%, and if you need anything else, well, take this interpolated picture that looks good enough".
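The DPI-aware approach can be sketched like this (a toy helper, not the actual Win32 API): every logical size is multiplied by the scale factor up front, so the output is composed directly on the native pixel grid with no frame-level interpolation.

```python
# DPI-aware scaling: instead of rendering the whole frame at 2x and rescaling,
# each logical size is converted to physical pixels before anything is drawn.
def layout_px(logical_px, scale_percent):
    # e.g. a 13px font at 200% becomes 26 physical pixels
    return round(logical_px * scale_percent / 100)

for scale in (100, 125, 150, 200):
    print(scale, layout_px(13, scale))
```

The rounding at the end is exactly where the "weird compatibility problems" creep in: apps that assume logical and physical pixels are the same thing end up misaligned at 125% or 150%.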
 

leman

macrumors Core
Oct 14, 2008
19,444
19,556
I was not stressing the percentage/constant difference, but if I'm not terribly wrong, DPI-aware (important!) apps in Windows can adjust their font sizes and graphics accordingly so there is no need for interpolation — the output is always native and rendered with scaling in mind.

This is a difficult but technically superior solution to Apple's "we make the hardware so let's just render at 200% and if you need anything else, well, take this interpolated picture that looks good enough"

It's not that simple, though.

The interpolation always happens — one way or another. You still must map the mathematical concept (e.g. a line or curve) onto the pixel grid. The main difference is that Apple's solution simply tells you that every logical point is backed by exactly 2x2 physical pixels (which is a lie, of course), while with a flexible backing factor you can have stuff like one point being backed by 1.5x1.5 pixels. The question is then how your app deals with it.

Ideally, when drawing a single-point line (and wanting to be correct), the app should use blending (interpolation) on pixels that only partially cover the line. In real life, only a few bother though — most of the time one uses a cheap algorithm which accepts or rejects a pixel based on some threshold coverage. The result? A crisp-looking line which is also wrong.

In fact, Apple's solution can lead to the more correct result under these circumstances. The app can draw the line into the 2x2 backing buffer using threshold coverage, and the window manager will then interpolate back to a 1.5x scaling factor. This gives you a much better treatment of all those partially covered pixels than what a naive drawing algorithm could do when drawing directly to a native-resolution target. What Apple is doing here is essentially super-sampling AA. I would disagree with calling this a technically inferior solution. In fact, from a technical standpoint, it's a much more involved and sophisticated system. After all, the OS needs to track and interpolate the dirty rects in a fashion that will not introduce any image artefacts — very difficult to pull off properly, and much more than the "give an app a buffer and let it do whatever it wants" approach that Windows uses.

The only real issue I can see with Apple's solution is when you really need that pixel precision. Which you of course don't get with OS X HiDPI, because you are drawing to subpixels with an unpredictable position in the final pixel grid. But with modern high-resolution screens, I very much doubt that there are a lot of valid reasons to want pixel precision.

Now, the reason why Windows is commonly known to have 'crisper' graphics is because they abuse pixel-perfect rendering. This means distorting the underlying mathematical representation of the image for the purpose of better final pixel alignment. There is a lot written on the topic of OS X vs. Windows font rendering. In the end, I guess it's a matter of personal aesthetic preference. As for me, I prefer to see things as they were actually intended by the artist, and not how they would look when snapped to the closest pixel.
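The super-sampling point can be shown with a toy example (a sketch of the idea only, not how the window server is implemented): draw a shape with a hard accept/reject threshold into a 2x buffer, then average blocks down, and the edge pixels come out with fractional coverage even though the drawing step itself never blended anything.

```python
# 'Cheap algorithm': each high-res pixel is fully on or off, no blending
def draw_threshold(size, is_inside):
    return [[1.0 if is_inside(x + 0.5, y + 0.5) else 0.0
             for x in range(size)] for y in range(size)]

# Window-manager step: average each 2x2 block down to one output pixel
def downsample_2x(buf):
    n = len(buf) // 2
    return [[(buf[2 * y][2 * x] + buf[2 * y][2 * x + 1] +
              buf[2 * y + 1][2 * x] + buf[2 * y + 1][2 * x + 1]) / 4.0
             for x in range(n)] for y in range(n)]

# A diagonal half-plane drawn hard-edged at 8x8, displayed at 4x4
hi = draw_threshold(8, lambda x, y: x > y)
lo = downsample_2x(hi)
for row in lo:
    print(row)   # diagonal cells come out 0.25: partial coverage for free
```

The threshold step on its own would give a jagged staircase; the downscale turns those edge samples into intermediate grey values, which is exactly the super-sampling AA described above.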
 

mehulparmariitr

macrumors member
Jan 13, 2022
51
4
So if I set the scale to 2560x1440 on a 4K screen, it will double 2560x1440 to 5120x2880 and then downscale to the 4K resolution.
What happens if I use the native (default) resolution of the 4K display? Will it also double, or is there no need to render up and down?
 

aevan

macrumors 601
Feb 5, 2015
4,519
7,201
Serbia
We don't use that kind of language here, OP. We call things the proper way - the way Steve did. So it's "Retina".
 

Yebubbleman

macrumors 603
May 20, 2010
6,011
2,599
Los Angeles, CA
Translated into Apple marketing, HiDPI is effectively Retina display technology. DPI stands for dots per inch, and the whole point of a Retina display is how sharp the picture is (a byproduct of how much detail is in the image; ergo, dots per inch).
 

Toutou

macrumors 65816
Jan 6, 2015
1,082
1,575
Prague, Czech Republic
So if I have set scale to 2560*1440 on a 4k screen it will double 2k to 5120*2880 and then downscale to 4k resolution.
What will happen if I use the native(default) resolution of 4k? Will it also double or there is no need of rendering up and down?
Yes, exactly. See for yourself: just take a screenshot of the whole screen and check its resolution. Screenshots are always the size of the underlying framebuffer (the virtual screen).
Native 4K renders in 4K, as usual.
 
  • Like
Reactions: mehulparmariitr

mehulparmariitr

macrumors member
Jan 13, 2022
51
4
Yes, exactly. See for yourself, just make a screenshot of the whole screen and check the resolution. Screenshots are always the size of the underlying framebuffer (the virtual screen).
Native 4K renders in 4K, as usual.
Oh, that's a cool trick, thanks for the info. I have a 27-inch Dell 4K at 163 PPI and text looks clear with 2560x1440 scaling.

Now I was thinking of getting an ultrawide 3440x1440, but all of them have a PPI of 109, which is low for text clarity. There are a few that offer 5K resolution at 34 inches, but they are above 1000 dollars. That's why I'm now thinking of getting a 32-inch 4K.
 

Amethyst1

macrumors G3
Oct 28, 2015
9,672
11,997
We don't use that kind of language here, OP. We call things the proper way - the way Steve did. So it's "Retina".

[attached image: lionrmbp.jpg]
 