Can't run rMBP in 1440x900?

Discussion in 'MacBook Pro' started by Panini, Jul 2, 2012.

  1. Panini, Jul 2, 2012
    Last edited: Jul 2, 2012

    Panini macrumors regular

    Jun 12, 2012
    Palo Alto, CA
    All this time, I was under the impression that it was possible to run the rMBP in "legacy" mode via pixel doubling (4 pixels now equal 1 pixel) so that it is essentially running at 1440x900. This would, theoretically, eliminate any lag issues the rMBP faces that the legacy models do not, and the two computers would become basically identical save for form factor.

    With this impression, I felt I could run at 1440x900 while playing games to get peak performance without sacrificing quality, since this is what I'm used to on the old models. I think it is still possible in games, but I hear it performs much worse than on a native 1440x900 display. Why is this so when it is just integer scaling?

    My main question is, is 1440x900 really not possible?

  2. iBookG4user macrumors 604


    Jun 27, 2006
    Seattle, WA
    I do believe that the MacBook Pro renders it at 2880x1800 and then scales it down to 1440x900, thus the GPU is still being stressed by the increased resolution.
  3. prfrma macrumors regular

    May 29, 2010
    SwitchResX should allow you to force whatever resolution you want to run.
  4. Stetrain macrumors 68040

    Feb 6, 2009
    You cannot set the desktop resolution (for the OS and apps) to 1440x900 with HiDPI scaling disabled without using a third-party app (there are several).

    However, full screen games generally bypass the OS's desktop resolution setting. So if you set a game to run in full screen at 1440x900, that's exactly the resolution it should render at. The display will scale that up to full screen, just like when running a normal monitor at a non-native resolution.

    There are many threads where people are running games like Skyrim, World of Warcraft, and Diablo 3 at 1440x900 with much improved performance over running them at 2880x1800.
  5. Panini thread starter macrumors regular

    Jun 12, 2012
    Palo Alto, CA
    Is there any way to disable HiDPI scaling so that it renders at 1440x900 and stretches it out to fit on the 2880x1800 display? It should theoretically look identical to legacy MBPs, right? Do third party applications do this? If so, is the performance boost associated with rendering at 1440x900 present?

    Also, obviously there is going to be a huge performance boost from running Skyrim on 1440x900 vs 2880x1800, but what I am asking is if there is a performance boost from running on a native 1440x900 display vs setting the retina MBP's resolution to 1440x900.

    Thanks for your help.
  6. Stetrain macrumors 68040

    Feb 6, 2009
    Yes, third party programs (I think "SetResX" is one) will let you run at 1440x900 with no DPI scaling.

    In both that case and the case of running a game in full screen 1440x900, performance should be identical to a non-retina model with equivalent specs.
  7. Panini thread starter macrumors regular

    Jun 12, 2012
    Palo Alto, CA
    Thanks! Exactly the answer I was looking for.

    I have one more (probably unanswerable) question, though. If I used a third party program to set the desktop resolution to 1440x900, will that mean everything will run at 1440x900 even though Apple scales individual elements on the screen? Will this conflict in any way?

    I also heard that a Retina display running at half resolution (the iPad 3, in this example) looks worse than a display running natively at that half resolution (the iPad 2, in this case).
  8. Andrmgic macrumors 6502a

    Jun 27, 2007
    LCDs always look best at their native resolution.

    If you use a third party program to set the resolution to 1440x900, it will behave like a non retina macbook pro.

    However, it will look worse than a non retina macbook pro running at 1440x900 because you're running at 1/4 the native resolution of the display.
  9. Uplift macrumors 6502

    Feb 1, 2011
    Would this eliminate the problem designers are facing? Can I go to 1440x900 with no scaling and get the same result as what I see now (on my 2010 MacBook Pro)?
  10. leman macrumors G3

    Oct 14, 2008
    Of course it is possible with games. Don't be confused. The HiDPI mode (i.e. internal rendering at 2x2 resolution) applies only to the UI. Moreover, OpenGL contexts are not HiDPI by default. Basically, all OpenGL games under OS X are pixel-doubled (unless the application is HiDPI-aware and takes certain precautions). A fullscreen 1440x900 OpenGL game in OS X will render at 1440x900 internally and will then be pixel-doubled by the OS to match the screen. In Windows, the resolution is whatever you set it to.

    If you don't believe me, refer to Apple's developer documentation on resolution independence support. I am too lazy to pick out the links right now.
  11. Panini thread starter macrumors regular

    Jun 12, 2012
    Palo Alto, CA
    Why will it look worse? 4 pixels = 1 pixel. It will obviously look worse than on retina mode but why worse than a 1440x900 native display? If anything, it should look the same or better (due to the IPS panel).
  12. Stetrain macrumors 68040

    Feb 6, 2009
    There are some weird dynamics that go on, mostly because of the space between pixels. The 4 pixels = 1 pixel can look a bit different than 1 pixel on a half dpi display because of the physical layout of the pixels.

    It's not a huge difference though, just something that might be noticeable.
  13. Andrmgic macrumors 6502a

    Jun 27, 2007
    If you take standard definition content and put it on an HDTV, it looks worse than it would on an SDTV.

    Your retina display can't display anything but 2880x1800 pixels.

    When you set the resolution to 1440x900, the graphics card will stretch it to fill all 2880x1800 pixels, effectively stretching 1 pixel into the space of 4 pixels.

    I've tried running at 1440x900 on the retina macbook pro and it looks quite blurry because it gets stretched across a larger number of pixels. It looks fuzzy compared to the last gen macbook pro running at 1440x900.

    It's the same idea as if you have a cinema display or other external monitor and you set the resolution to be something less than its native resolution.
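    The "stretching 1 pixel into the space of 4 pixels" described above is just 2x2 block replication. A minimal sketch in Python (numpy, grayscale values chosen purely for illustration):

    ```python
    import numpy as np

    # A tiny 2x3 "framebuffer" of grayscale pixel values
    low_res = np.array([[10, 20, 30],
                        [40, 50, 60]])

    # Pixel doubling: each source pixel becomes a 2x2 block on the panel
    doubled = np.kron(low_res, np.ones((2, 2), dtype=low_res.dtype))

    print(doubled.shape)  # (4, 6) -- twice the size in each dimension
    print(doubled)
    ```

    Every value in `low_res` appears as a uniform 2x2 block in `doubled`; no new intermediate values are created, which is why an exact integer stretch should not, in itself, introduce blur.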
  14. shurcooL macrumors 6502a

    Jan 24, 2011
    But this shouldn't happen. That blurriness only occurs when your resolution is not an even divisor of the native resolution.

    So when you stretch a 1280x800 image to 1440x900 it will look slightly blurry, but when you stretch 1280x800 to 2560x1600 it should stay pretty much the same. Just like how non-Retina iOS apps display the same on iPhone 4 and 3GS.
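    This claim is easy to check in software. A sketch using Pillow and numpy (the random image is just a stand-in for any screen content): an exact 2x nearest-neighbour upscale reproduces every source pixel untouched, while a non-integer resize like 1280x800 to 1440x900 has no 1:1 pixel mapping and must blend or duplicate pixels unevenly:

    ```python
    from PIL import Image
    import numpy as np

    # A random 1280x800 grayscale "screen" image
    rng = np.random.default_rng(0)
    src = Image.fromarray(rng.integers(0, 256, (800, 1280), dtype=np.uint8))

    # Integer scale: 2x nearest-neighbour, each source pixel becomes a 2x2 block
    up2x = src.resize((2560, 1600), Image.NEAREST)
    a = np.asarray(src)
    b = np.asarray(up2x)
    # Every original pixel is reproduced exactly -- no interpolation, no blur
    assert np.array_equal(b[::2, ::2], a)

    # Non-integer scale: 1280x800 -> 1440x900 has no exact pixel mapping,
    # so the filter must blend neighbouring pixels (hence the slight blur)
    odd = src.resize((1440, 900), Image.BILINEAR)
    ```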
  15. Andrmgic macrumors 6502a

    Jun 27, 2007
    I'm not saying that it doesn't look better when it is an even division than when it's not, but I am saying that it doesn't look as good as 1440x900 does on a display that is natively 1440x900.

    It looks blurry as hell to me at 1440x900
  16. shurcooL, Jul 3, 2012
    Last edited: Jul 3, 2012

    shurcooL macrumors 6502a

    Jan 24, 2011
    I see, that's a shame... It's probably because Apple doesn't do the scaling properly, even though in theory the 2880x1800 Retina display running at 1440x900 should be nearly indistinguishable from a native 1440x900 panel.

    You just need to use nearest-neighbour scaling, nothing fancy.
  17. strausd macrumors 68030

    Jul 11, 2008
    I do not think it is a problem with the scaling. It has more to do with a sort of "optical illusion."

    Think about when you look at an iPad 2. The pixelation is there, but you can also see each individual pixel. You can even see the spacing between each pixel when up close. On the new iPad, that is not the case. You cannot see each individual pixel and you definitely cannot see the spacing between them. This smooth surface makes the softness of lower resolution images more apparent to the eye. Why? I do not know the specifics. But it really is all about native resolution.

    However, for the MBP, if you have it render at 1680x1050 or 1920x1200 and then scale it up to 2880x1800, you will likely never notice, because of the pixel density on such a high resolution display. And that will also improve GPU performance. But think about it. 1920x1200 is the resolution of my 24" Apple display. At a normal viewing distance, individual pixels are not distinguishable. Just think about how much more so that would be on a 15" monitor.
  18. shurcooL macrumors 6502a

    Jan 24, 2011
    You can test this out with a CRT monitor.

    Set it to 800x600 resolution and view a 800x600 image fullscreen at 1:1 scale.

    Then set the monitor's resolution to 1600x1200 and view the same 800x600 image, resized to be twice as large (i.e. 1600x1200) using nearest-neighbor scaling.

    The two views will not look exactly the same, but they will be very close. The latter scenario (i.e. imitating a Retina display showing a lower resolution) shouldn't look more blurry than the former.


    Alternatively, take a 1440x900 screenshot of anything, resize it to 2880x1800 using nearest-neighbor scaling and view it on the rMBP set to full 2880x1800 (non-Retina) resolution. That should give you an idea of best-case scaling and how it can look.
  19. Andrmgic macrumors 6502a

    Jun 27, 2007
    That is a flawed test. LCDs are fixed-pixel displays; CRTs are not.

    CRTs are able to display different video signals and resolutions without having a problem with using a non-native resolution.

    CRT monitors do not have a "native resolution".

    The short answer is: if you use something other than an LCD's native resolution, image quality is lost. If it is an even multiple, the image is not distorted or skewed, but it is still of lower quality.
  20. shurcooL, Jul 4, 2012
    Last edited: Jul 4, 2012

    shurcooL macrumors 6502a

    Jan 24, 2011
    From that very article,

    "In theory, some resolutions should work well, if they are exact multiples of smaller image sizes. For example, a 1600×1200 LCD could display an 800×600 image well, as each of the pixels in the image could be represented by a block of four on the larger display, without interpolation. Since 800×600 is an integer factor of 1600×1200, scaling should not adversely affect the image. But in practice, most monitors apply a smoothing algorithm to all smaller resolutions, so the quality still suffers for these "half" modes."

    I'm guessing that's the case. While in theory it's possible to display the 1440x900 resolution better, since it's an even multiple of the native panel resolution, what ends up happening is that it's treated the same as all other non-native resolutions - smoothing is applied, making the 1440x900 resolution appear blurry.

    This is likely a limitation of the scaler hardware. Supporting both smoothing for non-exact multiple resolutions AND nearest-neighbour scaling for exact multiple resolutions would make it even more complex. So the image quality at 1440x900 was sacrificed for the sake of reduced complexity (and therefore cost, heat, size, weight) of the scaler. There's enough complexity to deal with already, given that it works with a higher resolution than any consumer-level display before it.

    Which makes sense, given that Apple doesn't even officially support the 1440x900 non-Retina resolution, it's only available via a hack.
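    The quoted behaviour can be reproduced in software. A sketch with Pillow and numpy, using bilinear as a stand-in for whatever smoothing the scaler applies: even at an exact 2x factor, a smoothing filter introduces intermediate grey values that nearest-neighbour scaling does not:

    ```python
    from PIL import Image
    import numpy as np

    # A hard black/white checker pattern -- worst case for smoothing filters
    a = np.zeros((8, 8), dtype=np.uint8)
    a[::2, 1::2] = 255
    a[1::2, ::2] = 255
    img = Image.fromarray(a)

    nearest = np.asarray(img.resize((16, 16), Image.NEAREST))
    smooth = np.asarray(img.resize((16, 16), Image.BILINEAR))

    # Nearest-neighbour at an exact 2x multiple keeps every value pure 0/255...
    assert set(np.unique(nearest).tolist()) == {0, 255}
    # ...while the smoothing filter produces in-between greys (i.e. blur),
    # even though 2x is an integer factor of the target size
    assert len(np.unique(smooth)) > 2
    ```

    If the panel's scaler always applies a smoothing path like the second resize, that alone would explain why the "half" mode looks blurry despite being an exact multiple.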
  21. leman macrumors G3

    Oct 14, 2008
    When the pixels are small enough, this does not matter. If you want, CRTs also have a 'native' resolution, which is determined by the dot pitch.
  22. leman macrumors G3

    Oct 14, 2008
    If I understand it correctly, your quote refers to something different: the case where the resolution of a native 1600x1200 display is set to 800x600. In that case, the scaler hardware may indeed be the culprit.

    However, this is not how Retina works. In Retina mode, the resolution is set to 1440x900. Then the image (as a texture) is upscaled and rendered to the native screen. So no "non-native" smoothing is applied in the first place. The upscaling is done via normal texturing hardware, so no additional filters should be applied.

    The only explanation I can offer (aside from suggesting that the images are fine and all the people reporting blur are suffering from an optical illusion) is the following: the upscaling is indeed messed up because the image position is not always pixel-aligned. The upscaler could be applying some sort of subpixel-weighted linear interpolation, which could potentially result in blur. I have to test it for myself; I'll just take my old MBP with me and do side-by-side comparisons with the same content on both screens.

    Anyway, if there is a problem, it will be a software one. There is no way that a 2x2 uniform pixel block on a 220 PPI 15" screen will look different from a 1x1 uniform pixel on a 110 PPI 15" screen, as they have absolutely identical dimensions.
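    The arithmetic behind that last claim is a trivial check (widths in inches):

    ```python
    # Physical width, in inches, of the smallest addressable unit in each case
    retina_block = 2 / 220   # a 2x2 pixel block on the 220 PPI Retina panel
    legacy_pixel = 1 / 110   # a single pixel on the 110 PPI legacy panel

    # The two units have identical physical size on a same-sized 15" screen
    assert abs(retina_block - legacy_pixel) < 1e-12
    ```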
  23. terraphantm macrumors 68040

    Jun 27, 2009
    The difference is that SD dimensions aren't exactly half of either of the HD resolutions, so some interpolation has to be done to stretch SD content. Not the case with 1440x900 on the rMBP.
  24. mikeo007 macrumors 65816

    Mar 18, 2010
    It will not look as good. One of the posters above me said it must be a software problem, and he is correct.

    The reason: subpixel rendering

    Subpixel-rendered fonts will not look right at a non-native resolution.
    This is why anything involving text looks considerably worse on the retina MBP running at 1440x900.
