What is HiDPI?

Discussion in 'MacBook Pro' started by Drich290195, Feb 13, 2017.

  1. Drich290195, Feb 13, 2017
    Last edited: Feb 13, 2017

    Drich290195 macrumors 6502

    Drich290195

    Joined:
    Apr 2, 2011
    #1
    I have a mid-2015 15-inch MacBook Pro. I'm just playing with SwitchRes and have a couple of questions. It offers options of resolutions with HiDPI and non-HiDPI.

    What is the difference? What does HiDPI do? As far as I'm aware, you only get the true resolution at 1440x900. Do the scaled resolutions with HiDPI still keep images at the full 1:1 native resolution and just scale the text, or is it only the Retina display that does this?

    Am I right in thinking that scaled resolutions will still render the image at the native 1:1 pixel ratio and just scale the text and UI elements on the screen?

    I'm confused as to what HiDPI is, so any help would be gratefully received.
     
  2. Mindinversion macrumors 6502

    Mindinversion

    Joined:
    Oct 9, 2008
    #2
  3. Toutou macrumors 6502a

    Toutou

    Joined:
    Jan 6, 2015
    Location:
    Prague, Czech Republic
    #3
    HiDPI mode means that the display stays at a certain resolution but draws everything scaled twice.

    E.g. a 3840x2160 (4K) display can work as a huuuuge display with the screen real estate of four (normal) FullHD displays.
    Or in HiDPI mode the screen real estate will be equal to a FullHD display but displayed with four times as many pixels (much sharper text, windows, icons).
    The screen of your MacBook runs at 2880x1800 HiDPI — the screen real estate is equal to any other 1440x900 laptop, but is sharper. Apple calls this "retina display".

    A "scaled resolution" is a term Apple uses for something that's really just feeding a non-native resolution to the display. For example, I'm running my 13" at 1440x900, scaled. The display itself is 2560x1600.
    This is what happens:
    - the screen is composed (window sizes, positions) as if it was a 1440x900 screen
    - it's a retina screen (HiDPI), so everything is rendered x2, i.e. at 2880x1800
    - the 2880x1800 result is displayed on a 2560x1600 screen
    Note that the resolutions don't match each other, so each physical pixel displays the result of interpolation between two virtual pixels. This causes the image to be sliiiightly blurrier, which is a hit I'm willing to accept for more screen real estate. This is the whole magic behind "scaled resolution". (There's no magic)
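
    The steps above can be sketched in a few lines of Python (illustrative only, not how macOS actually implements it): compose at a logical size, render at 2x into a backing buffer, then resample the buffer to the panel's native resolution.

    ```python
    # Hypothetical sketch of a macOS "scaled resolution" pipeline:
    # logical desktop -> 2x HiDPI backing buffer -> resample to the native panel.

    def scaled_resolution_pipeline(logical, native):
        """Return (backing buffer size, per-axis downscale factors)."""
        backing = (logical[0] * 2, logical[1] * 2)   # HiDPI: everything drawn at 2x
        factors = (backing[0] / native[0], backing[1] / native[1])
        return backing, factors

    # The example from the post: a 13" Retina (2560x1600 panel) set to
    # "looks like 1440x900".
    backing, factors = scaled_resolution_pipeline((1440, 900), (2560, 1600))
    print(backing)   # (2880, 1800) -- the intermediate render target
    print(factors)   # (1.125, 1.125) -- each physical pixel blends ~1.125 virtual pixels
    ```

    The non-integer 1.125 factor is exactly why the result is slightly blurry: physical and virtual pixel grids don't line up, so every physical pixel is an interpolated mix.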

    Windows handles scaling differently - window and font sizes are actually changing by a percentage. This produces clear and sharp images but is much harder to implement and can cause weird compatibility problems.
     
  4. Drich290195 thread starter macrumors 6502

    Drich290195

    Joined:
    Apr 2, 2011
    #4
    Thanks, that helps a lot.


     
  5. leman macrumors 604

    Joined:
    Oct 14, 2008
    #5
    HiDPI simply refers to the fact that the resolution in use is much higher than what was traditionally used. In its current usage, it also implies sub-point rendering. It's very similar to how printers work: the quality of the output depends on how many small points you use to construct your image, but the resolution does not affect the size of the output.
    --- Post Merged, Feb 14, 2017 ---
    It doesn't matter whether you scale by percentage or use a constant backing factor — there will still be interpolation, real or implied (btw, OS X did include a flexible scaling factor as an experimental feature, but that was scrapped around Snow Leopard times). The main reason why Windows appears crisper to some is the way it renders fonts.
     
  6. Toutou macrumors 6502a

    Toutou

    Joined:
    Jan 6, 2015
    Location:
    Prague, Czech Republic
    #6
    I was not stressing the percentage/constant difference, but if I'm not terribly wrong, DPI-aware (important!) apps in Windows can adjust their font sizes and graphics accordingly so there is no need for interpolation — the output is always native and rendered with scaling in mind.
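
    A toy sketch of what a DPI-aware app does under this model (illustrative Python, not the actual Windows API): the app receives a scale factor and sizes its own fonts and layout in physical pixels, so it renders natively with no resampling step.

    ```python
    # Hypothetical DPI-aware layout: metrics are multiplied by the system
    # scale factor and rounded to whole pixels, then drawn at native resolution.

    def dpi_aware_layout(base_font_pt, base_margin_px, scale_percent):
        """Scale layout metrics the way a DPI-aware app would."""
        factor = scale_percent / 100
        return round(base_font_pt * factor), round(base_margin_px * factor)

    print(dpi_aware_layout(12, 8, 150))  # (18, 12) at 150% scaling
    ```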

    This is a harder but technically superior solution compared to Apple's "we make the hardware, so let's just render at 200%, and if you need anything else, well, take this interpolated picture that looks good enough."
     
  7. leman macrumors 604

    Joined:
    Oct 14, 2008
    #7
    It's not that simple, though.

    The interpolation always happens, one way or another. You still must map the mathematical concept (e.g. a line or curve) onto the pixel grid. The main difference is that Apple's solution simply tells you that every logical point is backed by exactly 2x2 physical pixels (which is a lie, of course), while with a flexible backing factor you can have stuff like one point backed by 1.5x1.5 pixels. The question is then how your app deals with it.

    Ideally, when drawing a single-point line (and wanting to be correct), the app should use blending (interpolation) on pixels that only partially cover the line. In real life, only a few bother, though — most of the time a cheap algorithm is used which accepts or rejects a pixel based on some threshold coverage. The result? A crisp-looking line which is also wrong.

    In fact, Apple's solution can lead to a more correct result under these circumstances. The app can draw the line into the 2x2 backing buffer using a coverage threshold, and the window manager will then interpolate back to a 1.5x scaling factor. This gives you much better treatment of all those partially covered pixels than a naive drawing algorithm could achieve when drawing directly to a native-resolution target. What Apple is doing here is essentially super-sampling AA.

    I would disagree with calling this a technically inferior solution. In fact, from a technical standpoint, it's a much more involved and sophisticated system. After all, the OS needs to track and interpolate the dirty rects in a fashion that will not introduce any image artefacts — very difficult to pull off properly, and much more than the "give an app a buffer and let it do whatever it wants" approach that Windows uses.

    The only real issue I can see with Apple's solution is when you really need that pixel precision. Which you of course don't get with OS X HiDPI, because you are drawing to subpixels with an unpredictable position in the final pixel grid. But with modern high-resolution screens, I very much doubt that there are a lot of valid reasons to want pixel precision.

    Now, the reason why Windows is commonly known to have 'crisper' graphics is because it abuses pixel-perfect rendering. This means distorting the underlying mathematical representation of the image for the purpose of better final pixel alignment. There is a lot written on the topic of OS X vs. Windows font rendering. In the end, I guess it's a matter of personal aesthetic preference. As for me, I prefer to see things as they were actually intended by the artist, and not how they would look when mapped to the closest pixel.
     
