Is the GPU always driving 2880x1800 on the Retina MBP?

Discussion in 'MacBook Pro' started by dave343, Feb 15, 2015.

  1. dave343 macrumors member

    Joined:
    May 11, 2014
    #1
    Hi All,

    As I understand it (correct me if I'm wrong), the Retina MBP pushes 4 physical pixels for every 1 pixel on screen.
    Because its native resolution is 2880x1800 and most games can't be played at that (for performance reasons), how is the upscaling handled if you're playing at, say, 1440x900? Is the graphics card driving 1440x900, or does the GPU still have to drive 2880x1800 regardless of what resolution you have set in the game? Or is the LCD handling the upscale regardless of what resolution you have set in game?

    I apologize in advance if this has been asked a million times, however I've tried searching and can't find anything that answers my question definitively.
     
  2. afhstingray macrumors newbie

    Joined:
    Feb 9, 2015
    #2
    It doesn't work that way; it's the same reason that on a Windows PC, when you lower the res you get better frame rates. On my 13" rMBP I can play TF2 at native res on max with no issues. Another game I like to play is Serious Sam BFE; at HD res (not native) I can play it on high settings. Of course it will never look as good as native, but it's still very good, and if you have an external HD monitor and play at its native res it's perfect.
     
  3. dave343, Feb 15, 2015
    Last edited: Feb 15, 2015

    dave343 thread starter macrumors member

    Joined:
    May 11, 2014
    #3
    If you lower your resolution on a PC, obviously the performance goes up in games. But on the Retina, if everything is upscaled to 2880x1800 (which can't be altered, from my understanding), then when you lower the resolution in your game, on a hardware level it's still being upscaled to 2880x1800... yes/no? And if yes, is the graphics card still having to process the Retina resolution of 2880x1800, or is the LCD doing the upscaling? Hopefully I explained my question right; sorry if it's confusing. Thanks for the reply.

    I'm asking because if, regardless of what resolution you set in game, everything is upscaled at the hardware level, then your performance will always be worse than on a non-Retina MBP (since the graphics card is always having to process 2880x1800). Correct me if I'm wrong, since this is what I'm trying to understand.
     
  4. afhstingray macrumors newbie

    Joined:
    Feb 9, 2015
    #4
    I honestly can't answer your question, as I don't know the nitty-gritty of it, but both on the Mac and on a PC, when I set the game resolution lower, I get better performance.

    Perhaps one of the gurus of gaming here can explain what actually happens behind the scenes. But (I assume you've played demanding games before) even the benchmark sites and review sites will tell you that reducing the resolution of the game works.
     
  5. snaky69 macrumors 603

    Joined:
    Mar 14, 2008
    #5
    It's being rendered at twice whatever resolution you're using, then properly downscaled for the screen.
     
  6. afhstingray macrumors newbie

    Joined:
    Feb 9, 2015
    #6
    And your question in your initial post was about playing games at lower res (than native). I play Serious Sam BFE on high at HD res: no issues, smooth. At native res it's unplayable.

    TF2, since it's a less demanding, older game, I play at native res on my rMBP. Smooth as butter.
     
  7. dave343 thread starter macrumors member

    Joined:
    May 11, 2014
    #7
    OK, hopefully I understand this right: if it's being rendered at twice the resolution I have set in game, then the GPU is being taxed more than, say, a regular non-Retina Mac running at the same in-game res, since the GPU in the Retina is always taxed with 2880x1800 and then downscales to whatever res you chose in game.
     
  8. afhstingray macrumors newbie

    Joined:
    Feb 9, 2015
    #8
    Using the normal UI, it's at the native res. When you pick a game res, the GPU only does the res you pick. This is why some of the windows you might have had open might have resized after you exit the game.
     
  9. ixxx69 macrumors 6502a

    Joined:
    Jul 31, 2009
    Location:
    United States
    #9
    For something like games, isn't it maybe dependent on the game, and also on whether the game is in "full screen" or "window" mode?

    I was playing around with the Unigine Valley benchmark app, and on my 4K screen in full-screen mode, it didn't seem to matter what resolution setting I was using in OS X; performance seemed dictated by the game settings (the only exception being if I set the display setting below the resolution of the game settings).
     
  10. leman macrumors 604

    Joined:
    Oct 14, 2008
    #10
    So much confusing and wrong information in this thread, starting with the expression 'to drive a screen'.

    I'll try to make it simple. The LCD always runs at its native resolution. The GPU can draw at whatever resolution its hardware supports. That image is then rescaled to fit the native resolution of the screen. Whether this is done using some specialized hardware rescaler or via the 'normal' texture functionality of the GPU, only Apple knows.

    The fact is, if your game follows some basic rules (such as drawing to a full-screen borderless window, etc.), the GPU will only need to work at the resolution the game sets for its backing buffer, which for most games is FAR lower than the Retina res. Is there a hidden penalty for rescaling to native resolution? Maybe. However, I haven't seen any clear benchmarks on this, and my own are also inconclusive. At any rate, this penalty would be so low in most cases that it wouldn't matter much anyway.
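
    To put rough numbers on that, here is a quick sketch (an illustration, not a benchmark; it assumes the GPU's per-frame shading work scales roughly with the pixel count of the backing buffer):

    ```swift
    // The GPU shades the pixels of the buffer the game renders into,
    // not the panel's native 2880x1800 grid.
    let nativePixels = 2880 * 1800   // 5,184,000 pixels shown by the panel
    let gamePixels   = 1440 * 900    // 1,296,000 pixels shaded by the game
    print(Double(nativePixels) / Double(gamePixels))   // 4.0: a quarter of the work
    ```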
     
  11. dave343 thread starter macrumors member

    Joined:
    May 11, 2014
    #11
    Thank you! This is the exact answer I've been seeking, and it's explained great.
     
  12. dusk007 macrumors 68040

    dusk007

    Joined:
    Dec 5, 2009
    #12
    Rescaling takes so little work that it is basically inconsequential on today's hardware.

    Unfortunately, to be fast it doesn't use the best possible scaling algorithm. I also found that 1440x900 is not directly mapped as it should be, but looks somewhat washed out.
    1680x1050 and 1920x1080, though, are so sharp as to be almost indistinguishable from a screen with such a native resolution at normal viewing distances for gaming. Personally I find that 1920x1080 is the best resolution to play at, and I usually turn down details so this res works. Going higher nets no visible benefit in sharpness or anything. Going lower noticeably reduces sharpness on faraway objects.

    Generally, a game renders at whichever resolution you set. There is specific hardware that then rescales it to whatever is needed. Many display controllers can do that work; it might not even be done by the GPU at all.
     
  13. ixxx69 macrumors 6502a

    Joined:
    Jul 31, 2009
    Location:
    United States
    #13
    First, it is a perfectly correct expression to suggest that the GPU is driving the screen. When we're discussing whether a computer/GPU can "drive a display", we're asking whether the GPU is capable of outputting at the LCD's native resolution with reasonable performance.

    Second, yes, the LCD has a fixed number of pixels (its native resolution), and all of those pixels are used regardless of the OS/GPU's output resolution. But there's an actual practical meaning to using the term "native" resolution when discussing the OS's resolution setting.

    On a traditional non-Retina/HiDPI screen, if the system's resolution is set to something other than the "native" resolution of the display, the results are often not exactly optimal. That's because the display itself is doing the resolution scaling - not the GPU... i.e. the GPU outputs the pixels at the OS's desktop resolution to the display (thereby "driving the display"), and the display then scales them to fit the screen (at native resolution). That scaling isn't particularly sophisticated and does simple pixel interpolation to get it to fit right. This has nothing to do with Apple or the OS. You can see this in action by forcing a screen resolution in OS X with a different aspect ratio than the display's - the GPU will output the desktop resolution, and the display will simply scale the desktop image to fit, resulting in distorted output on the screen.

    It's only when it comes to Retina/4K+ HiDPI displays that OS X gets around this issue, by doing the HiDPI scaling on the OS/GPU side and then outputting that scaled desktop at "native" resolution, so the display doesn't have to do any scaling. And even then, this works best when the scaling is 2:1 of the display's "native" resolution. Regardless, the results are generally very impressive.
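
    A rough sketch of that pipeline for a scaled HiDPI setting (the render-at-2x-then-downsample model described above; the numbers are illustrative):

    ```swift
    // Assumed model of an OS X scaled HiDPI mode: a "looks like 1680x1050"
    // setting on a 2880x1800 panel renders the desktop at 2x the logical
    // size, then downsamples once to the panel's native grid, so the
    // display itself never has to scale anything.
    let logical = (w: 1680, h: 1050)                      // what the user picks
    let backing = (w: logical.w * 2, h: logical.h * 2)    // 3360x2100 rendered
    let panel   = (w: 2880, h: 1800)                      // native signal out
    let factor  = Double(panel.w) / Double(backing.w)     // ~0.857 downsample
    print(backing, factor)
    ```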
     
  14. leman macrumors 604

    Joined:
    Oct 14, 2008
    #14
    Yeah, they seem to be using linear filtering (which IMO is a hint that it's done on the GPU). However, filtering of that kind is basically 'free', which means that the performance cost consists essentially of memory copies (read the backing buffer in, write to the display buffer). For a 1440x900 backing buffer and a 2880x1800 display buffer this is around 24 MB worth of data in the worst case (not counting color compression and other tricks). Given that even the 650M's VRAM bandwidth is 80 GB/s, it's really quite cheap. Of course, there are scenarios where drawing is already bandwidth-starved, and there the cost of rescaling should be more visible.
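
    A quick sanity check of those numbers (a sketch assuming 4 bytes per pixel and no compression):

    ```swift
    // Worst-case traffic for one rescale pass: read the game's backing
    // buffer in, write the full-resolution display buffer out.
    let bpp      = 4
    let readIn   = 1440 * 900  * bpp             // ~5.2 MB in
    let writeOut = 2880 * 1800 * bpp             // ~20.7 MB out
    let totalMiB = Double(readIn + writeOut) / 1_048_576.0    // ~24.7 MiB moved
    let ms       = Double(readIn + writeOut) / 80e9 * 1000.0  // ~0.32 ms at 80 GB/s
    print(totalMiB, ms)
    ```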

    No, in fact 'drive a display' in the context of video cards has always meant 'is able to output a signal at that resolution'. And this makes sense, because not so long ago cards were quite limited in that regard. Instead of asking 'can it drive the display?' (because yes, it can), people should be asking for performance estimates. Otherwise it's just like asking 'can I use this bike to commute to work?' (yes, you most surely can, but it might not be a good choice, depending on what your commute looks like). At any rate, there are so many myths about 'driving displays' around that I think we should just stop using that notion altogether.

    Yeah, this is what I mean by 'confusing information'. What you write here is kind of correct, but also kind of beside the point. Fact is: non-native resolutions on LCDs look bad because you are trying to map pixel data onto a grid of different granularity. No matter which scaling algorithm you use, it will not look good, because the physical pixels are BIG and there is no way to adjust the image so that it fits well into those pixels. Of course, you are correct in saying that linear interpolation done by the GPU's texturing units will most likely result in better quality than the simpler filtering employed by hardware scalers in monitors. But this is certainly not the reason why traditional LCDs suck at non-native resolutions.

    CRTs used much more primitive scaling hardware, and they didn't have any issues with quality at different resolutions. Why? Because their 'pixel granularity' (yes, they have it) is so small that you can't distinguish the details with the naked eye anyway. This is the same reason why resolution does not matter much on a HiDPI screen like the Retina machines': the pixels are small enough that scaling will not introduce additional discrete noise. On a Retina machine, running the display at a non-native resolution produces quality that is more or less comparable to an LCD with that native resolution. This is a 'CRT' effect. So what you write about scaling and Apple's HiDPI implementation does not really make that much sense.
     
  15. Natzoo macrumors 65816

    Natzoo

    Joined:
    Sep 16, 2014
    Location:
    Not sure where i am
    #15
    Mine doesn't. It formats to the usual 1440x900 when playing games. Right now, browsing in Safari, I get 1440x900. So no, the Retina doesn't use the 2880x1800.
     
  16. Freyqq macrumors 68040

    Joined:
    Dec 13, 2004
    #16
    On the desktop, it renders at 2880x1800 for 1440x900 HiDPI. In a game, you can set the resolution to whatever you want, even 1440x900 non-HiDPI (it will stretch to fill up the screen, but it is only rendering 1440x900 pixels). You can also do this on the desktop if you install a third-party program to force the resolution. So, it's all software.
     
  17. TheIguana macrumors 6502a

    TheIguana

    Joined:
    Sep 26, 2004
    Location:
    Canada
    #17
  18. ixxx69 macrumors 6502a

    Joined:
    Jul 31, 2009
    Location:
    United States
    #18
    You seem to be suggesting that I'm wrong, but you never exactly say about what, and then you say the same thing as I did in a very roundabout way, with some weird analogy involving a bike thrown in for good measure. I still don't know what these "myths" about driving displays are that you're referring to. It's a pretty well understood phrase.

    Again, you're implying that I'm not quite correct without ever stating what you think I'm incorrect about, and then you pretty much repeat what I said in a very roundabout way. I never mentioned anything about "linear interpolation done by the GPU texturing units" (not sure even you know what you're talking about there), but I did refer to pixel interpolation on the display, i.e. "and there is no way to adjust the image so that it fits well into those pixels." This is why we need commonly understood phrases like "driving the display", so that we don't have to have a two-page debate about this basic stuff (honestly, you're the first person I'm aware of who has ever had an issue with that phrase).
    CRTs have absolutely nothing to do with HiDPI implementations; where you're getting this info, I'd be really curious to know. You're going to have to be a lot more specific about what I wrote regarding Apple's HiDPI implementation that doesn't make sense.
     
  19. bigpoppamac31 macrumors 68000

    Joined:
    Aug 16, 2007
    Location:
    Canada
    #19
    Where can I find this third-party app? I'm running my 13" rMBP at a "scaled" resolution of 1440x900, so it's pushing 2880x1800 pixels.
     
  20. dave343, Feb 15, 2015
    Last edited: Feb 15, 2015

    dave343 thread starter macrumors member

    Joined:
    May 11, 2014
    #20
    This article is actually what drove my question, because I don't quite understand how the GPU is taxed.
    Anandtech mentioned that the Retina's display has a native resolution of 2880x1800, however OS X will only scale from 1024 up to 1920x1200. By default, OS X displays at something like 1440x900. It also mentioned that the Retina display pushes 4 physical pixels for every pixel displayed at 1440x900.
    So going forward, you have the MacBook Retina, and let's say a regular PC laptop, both displaying 1440x900. The PC laptop's LCD is native 1440x900; the MacBook Retina is native 2880x1800 but has the resolution scaled down to 1440x900.

    So, what I couldn't quite understand is: if you are in a game playing at the default OS X resolution of 1440x900, is the graphics card actually having to render 2880x1800, or is it rendering 1440x900 and the LCD upscales it?
    On the PC laptop side, since its native resolution is 1440x900, I know that's all the graphics card has to render to the LCD. No upscaling, etc.

    So that's what I originally wanted to know: does the MacBook Pro Retina take a performance hit playing games at the same resolution as any other laptop, since the native res of the Retina's LCD is 2880x1800? Either the GPU only has to render 1440x900 and "something" handles the upscale, or the GPU has to render the native Retina res of 2880x1800 regardless, and it is then downscaled to the game's resolution of 1440x900. If the GPU has to render everything at the Retina's native 2880x1800 and then downscale, the performance will be lower playing a game at 1440x900 than playing the same game on a laptop with a native res of 1440x900.
     
  21. Freyqq macrumors 68040

    Joined:
    Dec 13, 2004
    #21
    1440x900 HiDPI is a resolution setting that pushes 2880x1800. That's all there is to it. Running 1440x900 in a non-HiDPI mode pushes 1440x900 pixels and stretches them to fit the screen. It's the same process as if you were using an external 1920x1200 screen and decided to set the resolution to 1440x900: it would stretch 1440x900 to fit the whole screen, but the computer is only rendering 1440x900 pixels in that instance. HiDPI (Retina) just means that HiDPI-aware programs will scale the UI and content to be readable as if it were at 1/4 the pixel density.

    To summarize: if you have a 15" Retina MacBook Pro and you set a game to run at 1440x900, the computer will render it at 1440x900. If you set the game to render at 2880x1800, it will render at 2880x1800. If you were in a windowed mode and set it to 1440x900, it would fill up 1/4 of the screen. Example: I play StarCraft 2 at 1920x1200 in fullscreen mode, which sets the monitor resolution to 1920x1200. It runs appreciably better than if I ran it at 2880x1800, which is also a selectable option.
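
    You can see those logical-vs-pixel sizes for yourself. Here's a small sketch using the public CoreGraphics display-mode API (the output naturally varies by machine):

    ```swift
    import CoreGraphics

    // List the main display's modes. On a Retina panel, HiDPI modes report
    // a logical size (width x height) smaller than the pixel size actually
    // pushed to the panel (pixelWidth x pixelHeight), e.g. 1440x900 -> 2880x1800.
    if let modes = CGDisplayCopyAllDisplayModes(CGMainDisplayID(), nil) as? [CGDisplayMode] {
        for mode in modes {
            let hidpi = mode.pixelWidth != mode.width ? " (HiDPI)" : ""
            print("\(mode.width)x\(mode.height) logical -> \(mode.pixelWidth)x\(mode.pixelHeight) pixels\(hidpi)")
        }
    }
    ```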
     
  22. ixxx69 macrumors 6502a

    Joined:
    Jul 31, 2009
    Location:
    United States
    #22
    That might be the way it works with some games, but testing the Unigine Valley benchmark, if I set the game resolution to 2560x1440 in windowed mode, it takes up the whole screen on my 4K display set to 2560x1440 HiDPI mode. So in that case, either the game or OS X knows to scale the window as well, rather than literally drawing 2560x1440 pixels.

    Furthermore, whether the display is in 2560x1440 HiDPI mode or low-res 2560x1440 mode (i.e. no HiDPI), the FPS appears to be the same. I don't know if that means the game's window is bypassed by OS X's HiDPI scaling and there's therefore no performance hit from using a HiDPI screen?
     
  23. Freyqq macrumors 68040

    Joined:
    Dec 13, 2004
    #23
    To clarify: if you go into Get Info on the application and check "low resolution," it will be at 1/4 size. If you uncheck low resolution, it will stretch to fill the screen - still rendering at 1440x900.
     
  24. ixxx69 macrumors 6502a

    Joined:
    Jul 31, 2009
    Location:
    United States
    #24
    The Get Info window indicates that "low resolution" is already checked and greyed out, yet it exhibits the behavior I previously described.

    Do you have the Unigine Valley benchmark app (I only have the free version)?
     
  25. leman, Feb 15, 2015
    Last edited: Feb 15, 2015

    leman macrumors 604

    Joined:
    Oct 14, 2008
    #25
    Sorry if I was not clear enough. The phrase 'to drive a display' is ambiguous between 'is able to output a video signal at a specific resolution' and 'is able to deliver reasonable performance at a specific resolution'. The second statement cannot be easily generalised, because performance depends on the usage scenario. This is why the phrase 'to drive a display' is often confusing and misleading; e.g. an Intel IGP will happily run a 4K monitor, but it will obviously struggle if you attempt to run a game at full 4K resolution.

    To illustrate the confusion a bit better, take the OP's original question: is the GPU always driving the 2880x1800 resolution? It is, because it will always output the video signal at that resolution, but that is absolutely orthogonal to the amount of work the GPU needs to perform when, say, drawing a game. It is entirely possible for it to draw a game at 1024x768 and still output the video signal at 2880x1800. The crucial thing to understand is that the GPU is not drawing directly to the display. It is drawing to a series of memory buffers of different resolutions, which are then combined by the OS in a complicated way to produce the final picture.

    Again, sorry if I wasn't clear enough. Your post suggests that image scaling is the main reason for suboptimal quality when drawing at a non-native resolution on a classical LCD. I wanted to point out that this is not entirely correct.

    Frankly, if you are unfamiliar with linear interpolation or texturing hardware, then maybe talking about image rescaling is not such a good idea, especially since you are clearly suggesting that scaling done on the GPU is higher quality than scaling done by a specialised DSP chip. To make statements like these you should at least understand how rescaling is performed in hardware and what the difference is between scaling done on the GPU and scaling done by a dedicated DSP.

    I never said that CRTs have anything to do with the HiDPI implementation. I was merely stating that color CRTs and hi-res LCDs share a similar hardware feature: small-granularity pixels. This reduces distortion from image rescaling and ultimately allows these displays to work with a wide range of resolutions without severe quality degradation.

    It's actually quite simple. When you set your system to 2560x1440 HiDPI mode, the OS (and the games) 'see' the display as having a resolution of 2560x1440 logical pixels. For non-GPU-intensive applications, the OS will back each of these logical pixels with a 2x2 grid of physical pixels; this happens in a completely transparent fashion for the application, which still thinks that it is drawing to a single pixel. Namely: if the app asks for a 100x100 window, the OS will allocate a 200x200 buffer but present it as a 100x100 one to the app.
    However, if the application requests GPU-intensive features (e.g. an OpenGL context), the OS will attempt to optimise and reduce the resolution of the buffer. So when asking for a 100x100 window with OpenGL acceleration, you will actually get a 100x100 pixel buffer. The OS will then take care of all the rescaling so that the image still appears at the correct size (200x200) on a HiDPI display.

    Of course, the application can use specific APIs to realise that it is actually dealing with a HiDPI display and ask the OS to adjust its behaviour. For instance, a game could ask for a high-res OpenGL buffer (that is essentially what SC2 does).
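
    On the AppKit side, that opt-in looks roughly like this (a minimal sketch; the view class is hypothetical, while wantsBestResolutionOpenGLSurface and convertToBacking are the relevant AppKit API of this era):

    ```swift
    import AppKit

    // Hypothetical game view. By default, an OpenGL-backed view on a Retina
    // Mac gets a 1x backing buffer (one buffer pixel per logical point) and
    // the OS upscales the result; a game opts in to full panel resolution.
    final class GameView: NSOpenGLView {
        override func viewDidMoveToWindow() {
            super.viewDidMoveToWindow()
            // Request a Retina-resolution backing buffer (what SC2 effectively does):
            wantsBestResolutionOpenGLSurface = true
            // Compare the logical size (points) with the buffer actually rendered (pixels):
            let backing = convertToBacking(bounds)
            print("logical \(bounds.size) -> backing \(backing.size)")
        }
    }
    ```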

    OS X is able to recognise and optimise certain drawing scenarios. Performance-wise, it would make sense for it to step back from the default super-sampled drawing if a game is drawing to the entire screen; however, I am not aware whether it actually does that kind of optimisation. In your case, the FPS might be the same because (as mentioned above) the additional rescaling step is fairly cheap on modern hardware. At any rate, the game is always rendering to a 2560x1440 buffer (with the OS optionally doing one or two rescaling steps afterwards).
     
