rMBP newbie. Question about resolution?

Discussion in 'MacBook Pro' started by DannyNguyener, Mar 17, 2013.

  1. DannyNguyener macrumors regular

    Joined:
    Mar 13, 2010
    #1
    Ok, I just bought a 15" Retina. It looks crystal clear and sharp on the "Best for Retina" display option, but do I lose clarity if I switch to one of the "more space" display modes? Do I also take a performance hit if I move up in resolution? Do you guys notice any performance loss? And what resolution do you like to run yours at?
     
  2. B... macrumors 68000

    Joined:
    Mar 7, 2013
    #2
    It will not look as clear and sharp as 1440x900 because there will not be a 1:1 pixel mapping. But I prefer to keep it at 1440x900 on the 15", and 1440x900 on the 13" when I look at them at the Apple Store. On the 13", that is as much resolution as an Air, which I like. For my uses, I do not need more screen real estate than 1440x900. Even though a scaled mode is not as clear, it still looks OK because the screen is so high-res. And for what I do, there is not much of a performance hit in my tests.
     
  3. leman macrumors 604

    Joined:
    Oct 14, 2008
    #3
    I use 1680x1050 or 1920x1200. No noticeable clarity or performance loss. Other people will tell you otherwise. It's all subjective. Anyway, you have the computer in front of you - why don't you just use it the way you want and form your own impression?
     
  4. xShane macrumors 6502a

    Joined:
    Nov 2, 2012
    Location:
    United States
    #4
    Technically speaking, you might take a performance hit from running a higher scaled resolution.

    However, you'll likely not notice it unless you're doing something graphically intensive at the maximum resolution.
     
  5. theluggage macrumors 68030

    Joined:
    Jul 29, 2011
    #5
    Yes - it is inevitable - but whether or not the loss of clarity is noticeable is questionable (and the whole question is knotty because you're comparing the "clarity" of different-sized images).

    It's fine scaling up 1440x900 to the 2880x1800 of the retina display (so that old applications work) since each "big" pixel maps exactly onto a 2x2 block of real pixels.

    Any other resolution - 1680x1050 or 1920x1200 - doesn't divide neatly into 2880x1800, so rather than just doubling up the pixels it has to be scaled using some sort of "resampling" algorithm to sort out the mismatch between the pixel sizes.

    On a non-retina display, e.g. scaling up 1440x900 to 1920x1200 on a 24" display, this can cause quite noticeable blurring - but because the target resolution of the rMBP is so high, and the blurring is at the limit of your vision, it gives pretty good results.

    You'll probably be able to see it if you look from a few inches away or use a magnifying glass - so don't do that then!
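
    If you want to see the "divides neatly" point in numbers rather than take my word for it, here is a rough Python sketch (just an illustration of the arithmetic, not anything the OS actually does):

        # Pixel doubling: each logical pixel of a 1440x900 image maps exactly
        # onto a 2x2 block of physical pixels on the 2880x1800 panel.
        def block_for(logical_x, logical_y, scale=2):
            xs = range(logical_x * scale, logical_x * scale + scale)
            ys = range(logical_y * scale, logical_y * scale + scale)
            return [(x, y) for y in ys for x in xs]

        print(block_for(0, 0))      # [(0, 0), (1, 0), (0, 1), (1, 1)]
        print(block_for(719, 449))  # the 2x2 block for a pixel near the centre

        # 1680x1050 and 1920x1200, by contrast, give 2880/1680 = 1.714... and
        # 2880/1920 = 1.5 physical pixels per logical pixel - not whole numbers,
        # so neighbouring logical pixels have to be blended together.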
     
  6. leman macrumors 604

    Joined:
    Oct 14, 2008
    #6
    This is wrong. The rMBP renders the UI essentially at the 'subpixel' level. There is no upscaling from 1440x900 to 2880x1800 - doing this would be pointless and waste any benefit of the retina display. Rather, everything is rendered with more detail. This Apple developer page explains everything very neatly: http://developer.apple.com/library/...hResolutionOSX/Introduction/Introduction.html

    The same is true for 1680x1050 and 1920x1200 - everything is rendered at higher accuracy to an offscreen buffer first (3360x2100 or 3840x2400 respectively) and then downscaled to match the physical resolution. It is true that there is no perfect mapping between the logical and physical pixels here, but the image is still much more accurate than a native 1680x1050 panel would allow.
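
    To put numbers on that pipeline, here is a small Python sketch of the buffer sizes involved (purely illustrative - it just reproduces the arithmetic above, it is not anything the OS actually runs):

        # Scaled HiDPI modes: the UI is rendered at 2x the logical resolution
        # into an offscreen buffer, then the whole buffer is scaled to the panel.
        PANEL = (2880, 1800)

        def scaled_mode(logical):
            backing = (logical[0] * 2, logical[1] * 2)   # offscreen buffer
            factor = PANEL[0] / backing[0]               # downscale to reach the panel
            return backing, factor

        for logical in [(1440, 900), (1680, 1050), (1920, 1200)]:
            backing, factor = scaled_mode(logical)
            print("looks like", logical, "-> render at", backing, "-> scale by", round(factor, 3))

    (The 1440x900 case comes out with a factor of 1.0, i.e. no resampling at all, which is the "Best for Retina" mode.)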
     
  7. theluggage macrumors 68030

    Joined:
    Jul 29, 2011
    #7
    What I posted was a deliberate simplification, but I was trying to get over the point that the problem is with scaling up/down other than by integer multiples, without opening a can of worms that depends on how individual bits of software do their rendering, whether they're retina-aware, whether they use OS calls to render vector information or draw it pixel-by-pixel, etc.

    There is upscaling for old applications (which is what I was referring to) that use non-retina bitmaps or draw their graphics pixel-by-pixel.

    ...which is still not going to be as good as rendering directly to 2880x1800 (where "rendering" may include rendering at 5760x3600 and downsampling, or doing sub-pixel calculations to the same effect).

    An image rendered pixel-by-pixel by a non-retina app thinking it is writing to a 1680x1050 display certainly isn't going to be improved by pixel-doubling/re-sampling it to 3360x2100 and then downsampling to 2880x1800 - it's going to be degraded even compared to a native 1680x1050 panel.

    An image rendered pixel-by-pixel by a retina-aware app (or the OS font/vector graphics rendering engine) to a 3360x2100 display and then downsampled to 2880x1800 may be better than anything you'll see on a 1680x1050 panel but it won't be as good as a native 2880x1800 render.
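
    If you want to play with the two resampling chains yourself, here is a rough sketch using Pillow (a stand-in for illustration only - it is not OS X's actual compositor, and the 878-pixel figure is just 1024 x 2880/3360 rounded):

        # Rough illustration with Pillow (pip install pillow). A 512x512
        # checkerboard stands in for an old-school, non-retina asset.
        from PIL import Image

        src = Image.new("L", (512, 512))
        src.putdata([255 if (x // 8 + y // 8) % 2 else 0
                     for y in range(512) for x in range(512)])

        R = Image.Resampling  # Pillow >= 9.1

        # Two steps: pixel-double to 1024, then non-integer downscale (2880/3360).
        two_step = src.resize((1024, 1024), R.NEAREST).resize((878, 878), R.LANCZOS)

        # One step: resample once, straight to the final size.
        one_step = src.resize((878, 878), R.LANCZOS)

        two_step.save("two_step.png")
        one_step.save("one_step.png")  # compare softness/moire between the two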

    As I said it's a knotty comparison because you don't know what you're actually comparing.

    The one sure thing is that everything would be a lot better if more OS X apps made more use of vector graphics, less use of bitmaps and were written to be resolution independent.
     
  8. cjmillsnun macrumors 68020

    Joined:
    Aug 28, 2009
    #8
    I use a third-party utility to set the resolution to 2880x1800. I have good eyes, so I have made the decision to maximise the screen real estate.

    The display is sharp and clear. However I suspect with a magnifying glass it would look worse than at lower resolutions.

    Graphics performance is fine. Until a few days ago it wasn't, but the firmware update fixed that.
     
  9. leman macrumors 604

    Joined:
    Oct 14, 2008
    #9
    Ok, sorry, I thought you were talking about the big picture.

    I don't see why this would be the case. If I want to emulate a 1680x1050 resolution, then rendering to a 3360x2100 buffer and then downsampling to 2880x1800 is the same thing as doing 2x2 sub-pixel calculations. Of course, increasing the resolution of the backing buffer would give you more detail, but then you'd have to expose non-integer backing factors to the applications - Apple toyed with it but ultimately decided against it. The only available backing factors are 1.0 and 2.0.

    Well, in theory it could produce the same result if the up/downsampling is implemented properly, but I agree with you on this one. Again, I did not realise that you were talking about software which uses custom-rendered views.

    Again, I don't understand what you mean by a 'native' 2880x1800 render. Technically, the backing buffer resolution is higher than 2880x1800. I have no idea how one would emulate the 1680x1050 resolution better (as in, better quality) than how OS X implements it today.

    Agreed. Furthermore, I would love to see all this pixel business go away in favour of specifying UI in 'real' measurement units (e.g. using the metric or typographical system). I am sure we will see these things in the future.
     
  10. DannyNguyener thread starter macrumors regular

    Joined:
    Mar 13, 2010
    #10
    Another question - why did Apple label that specific resolution as the default or standard resolution? Why does Apple consider it "Best for Retina"?
     
  11. xShane macrumors 6502a

    Joined:
    Nov 2, 2012
    Location:
    United States
    #11
    Because the Retina display was designed to work best at that resolution, hence "Best for Retina".
     
  12. theluggage macrumors 68030

    Joined:
    Jul 29, 2011
    #12
    Say I produce my new retina-friendly 114x114 app icon. I will design it to look best at 114x114: I might paint it at 10x that resolution and then downsample it to 114x114 using the best algorithm for the job, I might construct it as vector art and let the art package render a 114x114 version, or I might lovingly tweak each pixel by eye to get the best effect.

    When that is displayed pixel-for-pixel it will look optimal.

    In a scaled mode, that icon is first going to get plotted pixel-for-pixel to a virtual 3360x2100 screen at 114x114, then it is going to get resampled by a factor of something like 2880/3360. It's not even going to end up a whole number of pixels wide (so it will get merged into the surrounding pixels). It will no longer look optimal for that screen.
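
    The arithmetic, sketched in Python purely for illustration:

        # Assets drawn 1:1 on the 2x virtual screen land on a fractional number
        # of physical pixels once the buffer is scaled down to the panel.
        VIRTUAL_W, PANEL_W = 3360, 2880        # "looks like 1680x1050" buffer vs panel
        factor = PANEL_W / VIRTUAL_W           # 6/7 = 0.857...

        for asset_px in (114, 1024):           # icon sizes mentioned in this thread
            print(asset_px, "px ->", round(asset_px * factor, 2), "physical px")
        # 114 px -> 97.71 physical px; 1024 px -> 877.71 physical px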

    The same thing is going to happen for generated graphics - the app will render them optimised for 1:1 viewing on the "virtual" screen - they then get resampled by a non-integer scale for the real screen.

    Put simply, retina-aware software rendering directly to the screen at native resolution will always give a better result than any process involving non-integer scaling.

    As to whether a "scaled" 1680x1050, rendered via a 2x buffer, on a retina display will look better/same/worse than a native 1680x1050 panel, I suspect that the answer will vary depending on whether the application can take advantage of the 2x buffer. You can't add detail that was never there - but you can use extra resolution to smooth out the jaggies.
     
  13. Ploki macrumors 68000

    Joined:
    Jan 21, 2008
    #13
    o_O

    Well, every app needs to be retina-aware in order to take advantage of the retina screen. You can "retinize" non-bitmap data with "Retinizer" to render elements such as text and other vector-drawn stuff as retina.

    If you look at a retina icon package you'll see you have 1024x1024 icons labelled as 512x512@2x. Of course you need to use double the resolution when rendering artwork!! That's the whole point.

    But 1680x1050 WILL look better on retina than on a native panel, because it's rendered at 2x and essentially "downscaled" to retina, meaning you don't see pixels anyway.
     
  14. leman macrumors 604

    Joined:
    Oct 14, 2008
    #14
    Still, with sufficiently small pixels, it does not matter whether the logical-to-physical mapping is integer or fractional. I mean, printers, cameras etc. all do some sort of interpolation (explicit or implicit, e.g. via the camera lens) to map the 'real' high-definition image to the limited resolution of the medium (e.g. the camera sensor). In the end what matters is the maximal spatial discriminative capacity of human vision. Once the pixel is small enough, the human eye is not able to recognise a blended edge between several logical pixels. What I mean is that there is some resolution of the panel such that going higher will not produce a better image to the eye - no matter which logical resolution we are trying to emulate. Of course, it is another matter entirely whether the current 'retina' display is detailed enough (I'd say not yet). But once we hit over 600ppi...
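
    A quick back-of-the-envelope version of that, assuming the usual rule-of-thumb figure of about 1 arcminute for visual acuity and a 20" viewing distance (both just assumptions for illustration):

        # Pixel density at which a single pixel subtends less than ~1 arcminute
        # at a given viewing distance (rule-of-thumb numbers, not hard limits).
        import math

        acuity_arcmin = 1.0   # assumed angular resolution of the eye
        distance_in = 20.0    # assumed viewing distance in inches (~50 cm)

        pixel_size_in = 2 * distance_in * math.tan(math.radians(acuity_arcmin / 60) / 2)
        print(round(1 / pixel_size_in), "ppi")   # ~172 ppi at this distance

    The 15" panel is about 220ppi, so where the threshold falls depends heavily on how close you sit and whose acuity figure you use.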
     
  15. theluggage macrumors 68030

    Joined:
    Jul 29, 2011
    #15
    If you look back at the original question, it was whether you would "lose clarity" by using "more space" rather than "Best for Retina" on a retina display. The "native 1680x1050" panel thing is a bit of a red herring.

    All the "more space" modes involve rendering to a virtual screen and then scaling by a non-integer factor. It is that additional, non-integer scaling that is always going to reduce quality compared to direct, "smart", rendering at 2880x1800.

    In 2880x1800, your "512x512@2x" icon will get rendered 1:1 at 1024x1024 pixels as nature intended.

    In "1650x1050" mode, it will first get rendered to the 2x virtual screen as 1024x1024 and then re-sized to 894x894 on the "real" screen. That will inevitably be degraded compared to either a 1024x1024 icon rendered 1:1 to a retina screen, or for an icon designed to be rendered at 894x894.

    If you have a non-retina app which uses an old-school 512x512 icon, then (unless OS X is really, really clever) first that will get upscaled to 1024x1024 by the OS, then it will be scaled again to roughly 878x878. Again, that is an extra step that will introduce more "artefacts" than directly upscaling 512x512 => 1024x1024 in "Best for Retina" mode. That scenario could - depending on the content and how the application did its rendering - even conceivably produce a worse result than a real 1680x1050 panel.

    ...and that's one reason why serious photographers prefer RAW mode, so that their post-processing can use the original, un-interpolated data direct from the sensor rather than something that has already been interpolated once in the camera, minimising the number of interpolating/re-sampling steps.

    It's why you shouldn't scale all your artwork to 600dpi before printing - a decent printer driver will do a better job of interpolating/resampling the original data, optimised for the characteristics of the output medium.
     
  16. Stetrain macrumors 68040

    Joined:
    Feb 6, 2009
    #16
    The "Best for Retina" on the 15" rMBP is called "Looks like 1440x900" but is actually 2880x1800, the native resolution of the display.

    All of the text and UI elements are drawn at double the size that they would normally be, which makes things the same physical size as on a 1440x900 display.

    Apple chose to exactly double the pixel density to make things easier for themselves and developers, exactly like they did on the iPhone and iPad. Developers only have to test their applications in two modes, retina (2x the size in both directions) and non-retina.

    All of the other retina resolution modes are still using double-sized assets and fonts, but in a larger virtual screen which is then scaled down to the real pixels of the display.
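
    A toy sketch of that two-mode convention (the helper function and file names are made up - it is just the @2x idea expressed in Python, not any real API):

        # Apps ship 1x and 2x artwork; the system picks based on the backing
        # scale factor, which on the rMBP is 2 in every retina mode, including
        # the scaled "more space" ones.
        def pick_asset(base_name, backing_scale_factor):
            if backing_scale_factor >= 2:
                return base_name + "@2x.png"   # double-sized artwork
            return base_name + ".png"          # legacy 1x artwork

        print(pick_asset("icon", 2))   # icon@2x.png - any retina mode
        print(pick_asset("icon", 1))   # icon.png - non-retina displays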
     
  17. Ploki macrumors 68000

    Joined:
    Jan 21, 2008
    #17
    This:
    In "1650x1050" mode, it will first get rendered to the 2x virtual screen as 1024x1024 and then re-sized to 894x894 on the "real" screen. That will inevitably be degraded compared to either a 1024x1024 icon rendered 1:1 to a retina screen, or for an icon designed to be rendered at 894x894.

    is true only if you can actually SEE the pixels. Otherwise it's just going to look smaller. If you zoom in you eventually get 1024x1024... yes, unfortunately interpolated!

    The only question here is how high a DPI you need before you stop seeing the pixels; after that it's all relative size...
     
  18. theluggage macrumors 68030

    Joined:
    Jul 29, 2011
    #18
    I don't dispute that we're well and truly into the theoretical "if you look at the screen with a magnifying glass you might just notice the difference" territory here! Personally, 1920x1200 on the screen of my 17" MBP is enough for my middle-aged retinas!
     
  19. Ploki macrumors 68000

    Joined:
    Jan 21, 2008
    #19
    So if you don't actually see the pixels, what's the problem?

    I want to try zooming in on a picture in a scaled mode to bring it up to "normal pixels" size, because I wonder if there is any aliasing going on.

    Why would there be aliasing when downscaling anyway?
     
