Why is rMBP scaling 1680 and HD modes?

Discussion in 'MacBook Pro' started by Ploki, Dec 28, 2012.

  1. Ploki macrumors 68020

    Jan 21, 2008
    Doesn't that hinder performance?

    Theoretically, if it's enough pixels to be "retina", one shouldn't need to render at quadrupled resolution and then downscale to the actual resolution.

    That means that the rMBP is operating at 3840x2160 - rendering that many pixels AND scaling them at the same time.

    That means that the "more real estate" option takes up a LOT of CPU.

    That said, when I saw it at the store, I don't know what some of you are yakking about, but it didn't look blurry or bad at all. It looks absolutely great in both 1680 mode and HD mode.
  2. Galatian macrumors 6502


    Dec 20, 2010
    You just answered your own question: running a retina display in a scaled option doesn't look blurry because Apple made sure OS X renders it at a higher resolution and then downscales it. Read the excellent reviews of the retina display on AnandTech and/or Ars Technica if you want a more in-depth analysis of how this works.
  3. Ploki thread starter macrumors 68020

    Jan 21, 2008
    I already did, and they did seem to mention that at a scaled resolution you do lose performance.

    But you get that anyway going from 1440 to 1680. With retina, one would think that going to 1680 would let you "gain" some performance because you need to render fewer pixels?

    Does that mean that games at 1680 actually perform WORSE than at 2880? Well, I know they don't, but that would mean only the GPU scales when it comes to games.
    That would mean that one could theoretically bypass the OS X scaling in order not to get a performance drop when using a scaled mode.

    Also, does that mean that the retina gets better performance with an external monitor and the lid closed?
  4. Mackan macrumors 65816

    Sep 16, 2007
    The OS X scaling happens when you are in desktop mode. Inside games, rendering control and scaling are taken over by the game itself. It will render frames at the chosen resolution, not double them and downscale back. If it did, gaming at an acceptable fps would be rather impossible.
  5. Galatian macrumors 6502


    Dec 20, 2010
    OK, to put it simply: what is Retina mode in OS X? In short, it renders everything at 2560x1600 under the hood and then scales it to appear like 1280x800. This is for the 13" rMBP, of course. Since 2560x1600 is exactly 4 times as many pixels as 1280x800, the scaling is pretty easy on the hardware: it is a direct integer ratio and can be easily calculated by the graphics card. Other "scaled" resolutions are not that easy. Apple could simply use a non-native resolution and be done with it, but of course this would produce very blurry pictures. Hence Apple went with the whole upscaling and downscaling. Please do read the excellent AnandTech article about it.

    You really have to understand the concept of native resolution here, though: TFT panels are produced in a way that one segment (consisting of 3 subpixels, one in each basic colour: red, green and blue) equals one pixel. So a TFT can only actually draw something at its native resolution, because it can't physically merge those pixels. Every resolution other than the native one is in fact scaled and will not look as good as the native resolution.

    Now mind you, this is just the software layer for the UI in OS X. Games, of course, can be run at any non-native resolution. For the best visuals you would not want that, though. That's actually the reason why people buy expensive graphics cards, and why many PC gamers yawn at current gaming consoles: they can barely output 1080p, which is the native resolution of most TFT televisions these days.

    Running a game at a higher resolution will always use more graphics power. But running a game at 1680 on the retina screen is just as demanding as running it on an external screen, because games usually will not use the same scaling methods Apple uses. The only advantage you would gain running it on a different screen might come from running it at that screen's actual native resolution. Of course there are more factors to what makes a good screen, and usually your everyday $200 screen has much worse colour reproduction than the fairly well-calibrated retina screens Apple uses.
  6. Krevnik macrumors 68040


    Sep 8, 2003
    Holy misinformation Batman. While there is really only one error here, it's an important one.

    When running in "Best for Display", there is no scaling being done. In HiDPI mode, the screen canvas is doubled in each dimension and everything is rendered at that scale. The trick is that the height and width of UI elements are doubled as well. That's not upscaling, because the artwork used for the elements is larger as well. The reason for this is that on this machine, the pixels are half the height and width of those on the cMBP. So a 1" tall icon is rendered using 2x the pixels, each of which has 0.5x the height: 1" x 2 x 0.5 = 1". It's a bit like a 3D game that always treats the screen as a fixed height, no matter what the resolution is. Here, though, there are reasons to stick to doubling the height, since bitmap images can't really be scaled up and down without distortion, and most apps use bitmap images for icons etc., not vector images.

    But yes, the scaled resolutions use the double-size canvas as well, for sharpness. By making sure the canvas has at least as many pixels as the screen, and running in HiDPI mode, you make sure your canvas is always downscaled to the display resolution. Upscaling attempts to create information that isn't there, and will always look fuzzy, blurry or pixelated.

    Or another way:

    1280x800 HiDPI -> Renders at 2560x1600 -> 2560x1600 (No Scaling)
    1680x1050 -> Renders at 1680x1050 -> 2560x1600 (Upscaling, Fuzzy)
    1680x1050 HiDPI -> Renders at 3360x2100 -> 2560x1600 (Downscaling, Sharp)
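    The three cases above can be sketched with a little arithmetic (Python purely for illustration; the panel size is the 13" rMBP figure from the posts, and the function names are made up for this sketch):

    ```python
    # Sketch of the three modes above for the 13" rMBP (2560x1600 panel).
    # Assumption: HiDPI doubles each dimension before any panel scaling.

    PANEL = (2560, 1600)  # native pixels

    def render_canvas(mode, hidpi):
        """Canvas the OS renders into for a chosen 'resolution'."""
        w, h = mode
        return (w * 2, h * 2) if hidpi else (w, h)

    def scaling(mode, hidpi):
        """Which way the canvas is scaled to reach the panel."""
        canvas = render_canvas(mode, hidpi)
        if canvas == PANEL:
            return "none"
        return "downscale (sharp)" if canvas[0] > PANEL[0] else "upscale (fuzzy)"

    print(render_canvas((1280, 800), True), scaling((1280, 800), True))      # (2560, 1600) none
    print(render_canvas((1680, 1050), False), scaling((1680, 1050), False))  # (1680, 1050) upscale (fuzzy)
    print(render_canvas((1680, 1050), True), scaling((1680, 1050), True))    # (3360, 2100) downscale (sharp)
    ```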
  7. Galatian macrumors 6502


    Dec 20, 2010
    Na that's actually what I meant...just didn't express myself correctly :D;)
  8. Ploki thread starter macrumors 68020

    Jan 21, 2008

    Still, you could have an option to not sacrifice performance for looks, and use upscaling + rendering at 1680x1050.

    See, I kind of saw the "retina screen" somewhat like a cathode ray. Because it's theoretically retina, you should have no problem rendering ANY resolution, because you can always "create" virtual pixels, like a cathode ray does.
  9. scenox macrumors member

    Oct 20, 2012
    Of course you can do that: download e.g. Retina DisplayMenu v0.2 (from Phoenix Dev, free). There you can set the resolution to, e.g., either 1920x1200 HiDPI or 1920x1200 (no HiDPI), and the same for 1680x1050 and others.
    As said before, elements in non-HiDPI modes look blurry if you simply upscale them, because the amount of information stays the same when upscaling.
  10. Krevnik macrumors 68040


    Sep 8, 2003
    This method can do things a CRT can't without having a similarly high resolution. The display is always 220 PPI, no matter how much screen real estate you want, so text and images should always be sharp. The real problem is that Apple needed a way to ask how much real estate you want on the screen, and the way people think of that is in terms of pixels. That makes this concept more confusing than it needs to be. When you ask the OS for 1680x1050, you are really asking for "UI elements the same size as if I had a 1680x1050 pixel display".

    But in effect, it is using virtual pixels to render, and then downscaling to the native resolution to preserve sharpness. A CRT of the same size at 1680x1050 is not as sharp as the rMBP. And as the other guy pointed out, running the rMBP in an actual 1680x1050 mode results in pixelation and blurriness everywhere.
  11. Ploki thread starter macrumors 68020

    Jan 21, 2008
    Thanks for the tip!
    Will try it the moment it arrives.

    This actually makes much more sense!

    Apple isn't offering a hi-res retina version because it's not needed at all.
  12. krravi macrumors 65816

    Nov 30, 2010
    Just curious, has anyone compared how a non-retina 1680x1050 looks vs. a retina 1680x1050?

    I have a rMBP which looks great at 1440x900, but I really want more real estate, and when scaled to 1680x1050 it doesn't look that great. I understand the pixel doubling at 1440x900...

    So just curious about non-retina MBPs that have the high-res 1680x1050 display: since it's running at native resolution, is it any better than the scaled retina version?
  13. shansoft macrumors 6502

    Apr 24, 2011
    I'll be honest with you, since I have both machines.

    The cMBP at 1680x1050 resolution has some slight aliasing in some parts if you look extremely closely at the edges. On the rMBP in 1680x1050 HiDPI mode it looks very similar to the cMBP, except you can't see the aliasing anymore (not visible until you stick your eye right up to it).

    The difference is not huge, but the clarity does add up a bit. If an app does not support the Retina display, it looks extremely blurry compared to the cMBP.

    Overall, I wouldn't say the rMBP experience is great. There are just too many problems with the rMBP right now, which really push me back to using the cMBP more.
  14. leman macrumors G3

    Oct 14, 2008
    IMHO, there are basically two reasons why Apple is doing it this way.

    First of all, this simplifies the programming API. OS X uses a parameter called the backing factor, which tells the program how many real pixels correspond to a single logical pixel. To illustrate: a non-retina mode of 1440x900 has a backing factor of 1 (one logical pixel = one real pixel), while HiDPI 1440x900 has a backing factor of 0.5 (one real pixel = half a logical pixel, i.e. each logical pixel spans 2x2 real pixels). Basically, if one wants to draw a line starting at (0,0) and going to (100, 100), one should first convert these values to real pixel coordinates, here (0, 0) - (200, 200), and then draw the corresponding pixels.
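    That conversion can be sketched in a couple of lines (Python for illustration; this follows the post's convention where the backing factor is logical/real, so HiDPI is 0.5 - note that Cocoa's actual backingScaleFactor API reports the inverse, 2.0 on Retina screens):

    ```python
    # Minimal sketch of logical-to-real coordinate conversion under the
    # post's convention: real = logical / backing_factor.

    def to_real(point, backing_factor):
        """Convert a logical (x, y) point to real pixel coordinates."""
        x, y = point
        return (x / backing_factor, y / backing_factor)

    # A line from (0, 0) to (100, 100) drawn in a HiDPI (0.5) context:
    print(to_real((0, 0), 0.5))      # (0.0, 0.0)
    print(to_real((100, 100), 0.5))  # (200.0, 200.0)
    ```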

    For most applications, this backing value doesn't even matter. OS X uses vector graphics almost exclusively to render its buttons and other controls; furthermore, the rendering of these items is handled by the OS itself. So applications which only use these standard items don't need any modification at all to work with HiDPI modes. Only when an application does some custom rendering does it need to take special care with the backing factor. Of course, the same goes for displaying image data.

    In earlier versions of OS X, Apple was playing with fractional backing factors, e.g. you were able to set it to something like 0.25, which would result in an effective logical resolution of 720*480. To emulate something like 1680x1050, one would need a backing factor of 0.58333333333... Here one can already see problems with precision and rounding in pixel coordinate calculations. Another problem is pixel images - how do we treat those properly? This is why Apple simply decided to go for an 'integer' pixel ratio: the backing factor is either 1 or 0.5. This simplifies lots of rendering algorithms (they only need to take care of these two special cases), essentially eliminates rounding problems - you don't get fractional pixel relationships anymore - and also makes the handling of pixel images trivial: you simply provide two kinds of resolution, the normal and the HiDPI (2x2) one. Apple also made it very easy to upgrade existing applications with HiDPI pixel graphics: you simply add image files with the same name but ending in '@2x', and the OS will pick them up automatically and do the rest. If you do some custom rendering in your application, you still have to add some code, but that's not a bad price to pay for resolution independence.

    With integer scaling factors, the only way to emulate other 'fractional' resolutions is to do a downscale as a post-processing step. E.g. render 1680x1050 as a 2x2-scaled 3360x2100 and then downscale to the native resolution (whatever that is). This way you can essentially implement non-integer backing factors.
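    As a quick sanity check of the numbers (Python for illustration; the native resolution is assumed to be the 15" rMBP's 2880x1800, and the function name is made up for this sketch):

    ```python
    # Sketch: emulating a fractional scale with an integer (2x) render
    # plus one downscale pass.

    NATIVE = (2880, 1800)  # assumed 15" rMBP panel

    def two_step(logical):
        """Return the 2x render canvas and the effective (fractional)
        scale factor achieved after downscaling to the native panel."""
        w, h = logical
        canvas = (w * 2, h * 2)       # integer 2x render
        effective = NATIVE[0] / w     # non-integer factor actually achieved
        return canvas, effective

    canvas, factor = two_step((1680, 1050))
    print(canvas)             # (3360, 2100), then downscaled to 2880x1800
    print(round(factor, 3))   # 1.714 - a non-integer backing ratio in effect
    ```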

    The second reason is that this approach also enhances image quality. I mentioned rounding problems with fractional backing factors. With a two-step rendering process, you first render the image to a buffer with a resolution high enough to avoid such problems altogether, and then use linear interpolation to blend together the corresponding 'fractions' of pixels. This is essentially the supersampling anti-aliasing that Apple does here.
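    The blending of pixel 'fractions' can be illustrated with a toy one-dimensional area-averaging downscale (Python, purely a sketch; the real GPU path works in 2D and the exact filter Apple uses is not documented in this thread):

    ```python
    # Toy 1D downscale: each destination pixel averages the fractions of
    # the source pixels it covers, which is the 'blending fractions of
    # pixels' idea described above.

    def downscale_1d(src, dst_len):
        ratio = len(src) / dst_len  # source pixels per destination pixel
        out = []
        for i in range(dst_len):
            start, end = i * ratio, (i + 1) * ratio
            acc = 0.0
            j = int(start)
            while j < end and j < len(src):
                # fraction of source pixel j covered by destination pixel i
                cover = min(end, j + 1) - max(start, j)
                acc += src[j] * cover
                j += 1
            out.append(acc / ratio)
        return out

    # 6 source pixels squeezed into 4: hard edges get blended, not dropped.
    print(downscale_1d([0, 0, 6, 6, 0, 0], 4))
    ```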

    Now, a 'proper' way to do resolution independence would be to abandon pixels altogether and just use units of physical space like cm. However, this would be a horrible mess for both the software and the hardware. In this regard, Apple's solution is a clever hack which ensures that it's easy to write applications which work - and look great - on both normal and HiDPI screens.
  15. krravi, Jan 1, 2013
    Last edited: Jan 1, 2013

    krravi macrumors 65816

    Nov 30, 2010
    So what you are saying is, if I run MS Visual Studio in Parallels, the cMBP at 1680x1050 will look better than the rMBP?

    O man.. I have to pack up either my 15" rMBP or the 13" MBA today. Have to make a decision. I love the 15", but unless you are working with retina-aware apps that draw the toolbars etc. big while showing the content in high res, working at 1440x900 is a waste of that space.

    The MBA, on the other hand, will only run at a max of 1440x900, but it is light and portable.
  16. Ploki thread starter macrumors 68020

    Jan 21, 2008
    On the other hand, you can run full HD mode on the rMBP.

    exactly what I was thinking with the whole cathode ray reference!

    thanks for a very insightful post
