Is the GPU always driving 2880x1800 on retina mbp?

dave343

macrumors member
Original poster
May 11, 2014
31
0
Hi All,

As I understand it (correct me if I'm wrong), the Retina MBP pushes 4 physical pixels for every 1 pixel on screen.
Since its native resolution is 2880x1800 and most games can't be played at that (for performance reasons), how is the upscaling handled if, say, you're playing at 1440x900? Is the graphics card driving 1440x900, or does the GPU still have to drive 2880x1800 regardless of what resolution you have set in the game? Or does the LCD handle the upscaling regardless of what resolution you have set in game?

I apologize in advance if this has been asked a million times; however, I've tried searching and can't find anything that answers my question definitively.
 

afhstingray

macrumors newbie
Feb 9, 2015
26
0
It doesn't work that way; it's the same reason that on a Windows PC you get better frame rates when you lower the res. On my 13" rMBP I can play TF2 at native res on max settings with no issues. Another game I like to play is Serious Sam BFE; at HD res (not native) I can play it on high settings. Of course it will never look as good as native, but it's still very good, and if you have an external HD monitor and play at its native res it's perfect.
 

dave343

macrumors member
Original poster
May 11, 2014
31
0
If you lower your resolution on a PC, obviously the performance goes up in games. But on the Retina, if everything is upscaled to 2880x1800 (which can't be altered, from my understanding), then when you lower the resolution in your game, on a hardware level it's still being upscaled to 2880x1800... yes/no? And if yes, is the graphics card still having to process the Retina resolution of 2880x1800, or is the LCD doing the upscaling? Hopefully I explained my question right; sorry if it's confusing. Thanks for the reply.

I'm asking because if, regardless of what resolution you set in game, it's being upscaled at the hardware level, then your performance will always be worse than a non-Retina MBP (since the graphics card is always having to process 2880x1800). Correct me if I'm wrong, since this is what I'm trying to understand.
 
Last edited:

afhstingray

macrumors newbie
Feb 9, 2015
26
0
I honestly can't answer your question as I don't know the nitty-gritty of it, but on both the Mac and the PC, when I set the game resolution lower, I get better performance.

Perhaps one of the gaming gurus here can explain what actually happens behind the scenes, but (I assume you've played demanding games before) even the benchmark and review sites will tell you that reducing the game's resolution works.
 

snaky69

macrumors 603
Mar 14, 2008
5,903
480
But even when you lower your res in game, it's still being displayed on your LCD at 2880x1800...? I thought that no matter what resolution you set in game, or what scaling you set in the OS X display properties, it'll always be 2880x1800 on screen, with everything upscaled to Retina res. And if this is the case, which I thought it was, is the graphics card always pushing 2880x1800, or is the upscale being handled by something else, so the GPU only has to push the in-game res you set? I hope I'm explaining my question right...
It's being rendered at twice whatever resolution you're using, then properly downscaled for the screen.
 

afhstingray

macrumors newbie
Feb 9, 2015
26
0
And your question in your initial post was about playing games at lower res (than native). I play Serious Sam BFE on high at HD res with no issues; it's smooth. At native res it's unplayable.

TF2, since it's a less demanding, older game, I play at native res on my rMBP. Smooth as butter.
 

dave343

macrumors member
Original poster
May 11, 2014
31
0
It's being rendered at twice whatever resolution you're using, then properly downscaled for the screen.
OK, hopefully I understand this right: if it's being rendered at twice the resolution I have it set to in game, then the GPU is being taxed more than, say, a regular non-Retina Mac running at the same in-game res... since the GPU in the Retina is always taxed with 2880x1800 and then downscaling to whatever res you chose in game.
 

afhstingray

macrumors newbie
Feb 9, 2015
26
0
OK, hopefully I understand this right: if it's being rendered at twice the resolution I have it set to in game, then the GPU is being taxed more than, say, a regular non-Retina Mac running at the same in-game res... since the GPU in the Retina is always taxed with 2880x1800 and then downscaling to whatever res you chose in game.
Using the normal UI, it's at the native res. When you pick a game res, the GPU only does the res you pick. This is why some of the windows you had open might have resized after you exit the game.
 

ixxx69

macrumors 65816
Jul 31, 2009
1,119
635
United States
For something like games, doesn't it maybe depend on the game, and also on whether the game is in "full screen" or "windowed" mode?

I was playing around with the Unigine Valley Benchmark app, and on my 4K screen in full-screen mode it didn't seem to matter what resolution setting I was using in OS X; performance seemed dictated by the game settings (the only exception being if I set the display setting below the resolution of the game settings).
 

leman

macrumors G3
Oct 14, 2008
9,975
4,555
So much confusing and wrong information in this thread, starting with the expression 'to drive a screen'.

I'll try to make it simple. The LCD always runs at its native resolution. The GPU can draw at whatever resolution its hardware supports. That image is then rescaled to fit the native resolution of the screen. Whether this is done using some specialized hardware rescaler or via the 'normal' texture functionality of the GPU - only Apple knows.

The fact is - if your game follows some basic rules (such as drawing to a full-screen borderless window, etc.), the GPU will only need to work at the resolution the game sets for its backing buffer, which for most games is FAR lower than the Retina res. Is there a hidden penalty for rescaling to native resolution? Maybe. However, I haven't seen any clear benchmarks on this and my own are also inconclusive. At any rate, this penalty would be so low in most cases that it wouldn't matter much anyway.
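As a rough illustration from the application's side (an AppKit/OpenGL sketch; wantsBestResolutionOpenGLSurface is the real opt-in property on the SDKs of that era, while the surrounding setup is simplified and purely illustrative):

import AppKit

// Minimal sketch of an OpenGL-backed game view. On 2012-2015 era SDKs a GL
// view gets a 1x (non-Retina) backing buffer unless it explicitly opts in,
// so a "1440x900" game really does render 1440x900 pixels.
final class GameView: NSOpenGLView {
    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        // false (the old default): the GL backing buffer matches the view's
        // size in points, e.g. 1440x900 -- the case described above.
        // true: the buffer matches the panel's physical pixels (2880x1800).
        wantsBestResolutionOpenGLSurface = false

        let points = bounds.size                        // logical size the game targets
        let panelPixels = convertToBacking(bounds).size // what opting in would give
        print("drawing \(points) logical; a full Retina surface would be \(panelPixels)")
    }
}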
 

dave343

macrumors member
Original poster
May 11, 2014
31
0
So much confusing and wrong information in this thread, starting with the expression 'to drive a screen'.

I'll try to make it simple. The LCD always runs at its native resolution. The GPU can draw at whatever resolution its hardware supports. That image is then rescaled to fit the native resolution of the screen. Whether this is done using some specialized hardware rescaler or via the 'normal' texture functionality of the GPU - only Apple knows.

The fact is - if your game follows some basic rules (such as drawing to a full-screen borderless window, etc.), the GPU will only need to work at the resolution the game sets for its backing buffer, which for most games is FAR lower than the Retina res. Is there a hidden penalty for rescaling to native resolution? Maybe. However, I haven't seen any clear benchmarks on this and my own are also inconclusive. At any rate, this penalty would be so low in most cases that it wouldn't matter much anyway.
Thank you! This is the exact answer I've been seeking, and it's explained really well.
 

dusk007

macrumors 68040
Dec 5, 2009
3,383
61
Is there a hidden penalty for rescaling to native resolution? Maybe.
Rescaling takes so little work that it is basically inconsequential on today's hardware.

Unfortunately, to work fast it is not the best possible scaling algorithm. I also found that 1440x900 is not directly mapped as it should be, but looks somewhat washed out.
1680x1050 and 1920x1080, though, are so sharp as to be almost indistinguishable from a screen with that native resolution at normal viewing distances for gaming. Personally I find that 1920x1080 is the best resolution to play at, and I usually turn down details so this res works. Going higher nets no visible benefit in sharpness or anything. Going lower does noticeably reduce sharpness on far-away objects.

Generally a game renders at whichever resolution you set. There is specific hardware that then rescales it to the panel. Many display controllers can do that work; it might not even be done by the GPU at all.
 

ixxx69

macrumors 65816
Jul 31, 2009
1,119
635
United States
So much confusing and wrong information in this thread, starting with the expression 'to drive a screen'.

I'll try to make it simple. The LCD always runs at its native resolution. The GPU can draw at whatever resolution its hardware supports. That image is then rescaled to fit the native resolution of the screen. Whether this is done using some specialized hardware rescaler or via the 'normal' texture functionality of the GPU - only Apple knows.
First, it is a perfectly correct expression to suggest that the GPU is driving the screen. When we're discussing whether a computer/GPU can "drive a display" we're asking whether the GPU is capable of outputting at the LCD's native resolution with reasonable performance.

Second, yes, the LCD has a fixed number of pixels = native resolution, and all of those pixels are used regardless of the OS/GPU's output resolution. But there's an actual practical meaning to using the term "native" resolution when discussing the OS's resolution setting.

On a traditional non-retina/HiDPI screen, if the system's resolution is set to something other than the "native" resolution of the display, the results are often not exactly optimal. That's because the display itself is doing the resolution scaling - not the GPU... i.e. the GPU composites the pixel output at the OS's desktop resolution to the display (thereby "driving the display"), and the display then scales it to make it fit on the screen (at native resolution). That scaling isn't particularly sophisticated and does simple pixel interpolation to get it to fit right. This has nothing to do with Apple or the OS. You can see this in action by forcing a screen resolution in OS X that has a different aspect ratio than the display - the GPU will output the desktop resolution and the display will simply scale the desktop image to fit, resulting in distorted output on the screen.

It's only when it comes to retina/4K+ HiDPI displays that OS X gets around this issue by using HiDPI scaling from the OS/GPU side, and then outputting that scaled desktop at "native" resolution, so the display doesn't have to do any scaling. And even then, this works best when the scaling is 2:1 of the display's "native" resolution. Regardless, the results are generally very impressive.
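As a rough worked example of that 2:1 sweet spot on a 2880x1800 rMBP (the in-between "scaled" modes are drawn at 2x the logical size and then downsampled, as mentioned earlier in the thread; the numbers below are just illustrative):

// Back-of-envelope pixel counts for desktop (non-game) rendering on a 2880x1800 panel.
let panel = 2880 * 1800                       // 5,184,000 physical pixels

// "Best for Retina" (1440x900 HiDPI): drawn at exactly 2x = the panel size.
let bestForRetina = (1440 * 2) * (900 * 2)    // 5,184,000 -- a clean 2:1 mapping, no resample

// "Looks like 1680x1050": drawn at 2x (3360x2100), then downsampled to fit the panel.
let scaled1680 = (1680 * 2) * (1050 * 2)      // 7,056,000

print(Double(scaled1680) / Double(panel))     // ~1.36x the panel's pixels, plus a resample step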
 

leman

macrumors G3
Oct 14, 2008
9,975
4,555
Unfortunately, to work fast it is not the best possible scaling algorithm. I also found that 1440x900 is not directly mapped as it should be, but looks somewhat washed out.
Yeah, they seem to be using linear filtering (which IMO is a hint that it's done on the GPU). However, filtering of that kind is basically 'free', which means that the performance cost consists essentially of memory copies (read the backing buffer in -> write to the display buffer). For a 1440x900 backing buffer and a 2880x1800 display buffer this is around 24 MB worth of data in the worst case (not counting color compression and other tricks). Given that even the 650M's VRAM bandwidth is around 80 GB/s, it's really quite cheap. Of course, there are scenarios where drawing is already bandwidth-starved; there the cost of rescaling would be more visible.
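To put rough numbers on that (a back-of-envelope sketch, assuming 4 bytes per pixel and no compression):

let bytesPerPixel = 4
let readBytes  = 1440 * 900  * bytesPerPixel      // backing buffer in:  ~5.2 MB
let writeBytes = 2880 * 1800 * bytesPerPixel      // display buffer out: ~20.7 MB
let perFrameMiB = Double(readBytes + writeBytes) / (1024 * 1024)
print(perFrameMiB)                                // ~24.7 MiB per rescale pass

// At 60 fps that's roughly 1.5 GB/s of extra traffic -- a small slice of the
// bandwidth a 650M-class GPU has available.
print(Double(readBytes + writeBytes) * 60 / 1e9)  // ~1.56 GB/s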

First, it is a perfectly correct expression to suggest that the GPU is driving the screen. When we're discussing whether a computer/GPU can "drive a display" we're asking whether the GPU is capable of outputting at the LCD's native resolution with reasonable performance.
No, in fact 'drive a display' in the context of video cards has always meant 'is able to output a signal at that resolution'. And this makes sense, because not so long ago cards were quite limited in that regard. Instead of asking 'can it drive the display?' (because, yes, it can), people should be asking about performance estimates. Otherwise it's just like asking 'can I use this bike to commute to work?' (yes, you most surely can, but it might not be a good choice depending on how your commute actually works). At any rate, there are so many myths about 'driving displays' around that I think we should just stop using that notion altogether.

Second, yes, the LCD has a fixed number of pixels = native resolution, and all of those pixels are used regardless of the OS/GPU's output resolution. But there's an actual practical meaning to using the term "native" resolution when discussing the OS's resolution setting.

On a traditional non-retina/HiDPI screen, if the system's resolution is set to something other than the "native" resolution of the display, the results are often not exactly optimal. That's because the display itself is doing the resolution scaling - not the GPU... i.e. the GPU composites the pixel output at the OS's desktop resolution to the display (thereby "driving the display"), and the display then scales it to make it fit on the screen (at native resolution).
That scaling isn't particularly sophisticated and does simple pixel interpolation to get it to fit right
Yeah, this is what I mean by 'confusing information'. What you write here is kind of correct, but also kind of beside the point. Fact is: non-native resolutions on LCDs look bad because you are trying to map pixel data onto a grid of different granularity. No matter which scaling algorithm you use, it will not look good, because the physical pixels are BIG and there is no way to adjust the image so that it fits well into those pixels. Of course, you are correct in saying that linear interpolation done by the GPU's texturing units will most likely result in better quality than the simpler filtering employed by hardware scalers in monitors. But this is certainly not the reason why traditional LCDs suck with non-native resolutions.

CRTs used much more primitive scaling hardware back in the day, and they didn't have any issues with quality at different resolutions. Why? Because their 'pixel granularity' (yes, they have it) is so small that you can't distinguish the details with the naked eye anyway. This is the same reason why resolution does not matter with a HiDPI screen like in the retina machines — pixels are small enough that scaling will not introduce additional discrete noise. On a retina machine, running the display at a non-native resolution will produce quality that is more or less comparable with an LCD of that native resolution. This is a 'CRT' effect. So what you write about scaling and Apple's HiDPI implementation does not really make that much sense.
 

Natzoo

macrumors 65816
Sep 16, 2014
1,199
84
Not sure where i am
Mine doesn't. It formats to the usual 1440x900 when playing games. Right now, when I'm browsing Safari, I get 1440x900. So no, the Retina doesn't use the 2880x1800.
 

Freyqq

macrumors 601
Dec 13, 2004
4,014
166
If you lower your resolution on a PC, obviously the performance goes up in games. But on the Retina, if everything is upscaled to 2880x1800 (which can't be altered, from my understanding), then when you lower the resolution in your game, on a hardware level it's still being upscaled to 2880x1800... yes/no? And if yes, is the graphics card still having to process the Retina resolution of 2880x1800, or is the LCD doing the upscaling? Hopefully I explained my question right; sorry if it's confusing. Thanks for the reply.

I'm asking because if, regardless of what resolution you set in game, it's being upscaled at the hardware level, then your performance will always be worse than a non-Retina MBP (since the graphics card is always having to process 2880x1800). Correct me if I'm wrong, since this is what I'm trying to understand.
On the desktop, it renders at 2880x1800 for 1440x900 hidpi. In a game, you can set the resolution to whatever, even 1440x900 non-hidpi (it will stretch it to fill up the screen, but it is only rendering at 1440x900). You can also do this on the desktop if you install a third party program to force the resolution. So, it's all software.
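(For what it's worth, here is a rough sketch of the kind of list such third-party tools work from - Core Graphics exposes the available modes; this is purely illustrative and not any specific utility's code, and depending on the OS version extra option flags may be needed to see every HiDPI variant:)

import CoreGraphics

// Enumerate the display modes the window server exposes for the main display.
// "Force the resolution" utilities are essentially pickers over this list.
let display = CGMainDisplayID()
if let modes = CGDisplayCopyAllDisplayModes(display, nil) as? [CGDisplayMode] {
    for mode in modes {
        // width/height are the logical size; pixelWidth/pixelHeight reveal
        // whether a mode is HiDPI (pixels = 2x logical) or not.
        print("\(mode.width)x\(mode.height) -> \(mode.pixelWidth)x\(mode.pixelHeight) px")
    }
}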
 

ixxx69

macrumors 65816
Jul 31, 2009
1,119
635
United States
No, in fact 'drive a display' in the context of video cards has always meant 'is able to output a signal at that resolution'. And this makes sense, because not so long ago cards were quite limited in that regard. Instead of asking 'can it drive the display?' (because, yes, it can), people should be asking about performance estimates. Otherwise it's just like asking 'can I use this bike to commute to work?' (yes, you most surely can, but it might not be a good choice depending on how your commute actually works). At any rate, there are so many myths about 'driving displays' around that I think we should just stop using that notion altogether.
You seem to be suggesting that I'm wrong, but you never exactly say what about, and then you say the same thing as I did in a very roundabout way, with some weird analogy involving a bike thrown in for good measure. I still don't know what these "myths" about driving displays are that you're referring to. It's a pretty well understood phrase.

Yeah, this is what I mean by 'confusing information'. What you write here is kind of correct, but also kind of beside the point. Fact is: non-native resolutions on LCDs look bad because you are trying to map pixel data onto a grid of different granularity. No matter which scaling algorithm you use, it will not look good, because the physical pixels are BIG and there is no way to adjust the image so that it fits well into those pixels. Of course, you are correct in saying that linear interpolation done by the GPU's texturing units will most likely result in better quality than the simpler filtering employed by hardware scalers in monitors. But this is certainly not the reason why traditional LCDs suck with non-native resolutions.
Again, you're implying that I'm not quite correct without ever stating what you think I'm incorrect about, and then you pretty much repeat what I said in a very roundabout way. I never mentioned anything about "linear interpolation done by the GPU texturing units" (not sure even you know what you're talking about there), but I did refer to pixel interpolation on the display, i.e. "and there is no way to adjust the image so that it fits well into those pixels." This is why we need commonly understood phrases like "driving the display", so that we don't have to have a two-page debate about this basic stuff (honestly, you're the first person I'm aware of who has ever had an issue with that phrase).
CRTs used much more primitive scaling hardware back in the day, and they didn't have any issues with quality at different resolutions. Why? Because their 'pixel granularity' (yes, they have it) is so small that you can't distinguish the details with the naked eye anyway. This is the same reason why resolution does not matter with a HiDPI screen like in the retina machines — pixels are small enough that scaling will not introduce additional discrete noise. On a retina machine, running the display at a non-native resolution will produce quality that is more or less comparable with an LCD of that native resolution. This is a 'CRT' effect. So what you write about scaling and Apple's HiDPI implementation does not really make that much sense.
CRTs have absolutely nothing to do with HiDPI implementations - where you're getting this info, I'd be really curious to know. You're going to have to be a lot more specific on what I wrote about Apple's HiDPI implementation that doesn't make sense.
 

bigpoppamac31

macrumors 68020
Aug 16, 2007
2,172
301
Canada
On the desktop, it renders at 2880x1800 for 1440x900 hidpi. In a game, you can set the resolution to whatever, even 1440x900 non-hidpi (it will stretch it to fill up the screen, but it is only rendering at 1440x900). You can also do this on the desktop if you install a third party program to force the resolution. So, it's all software.
Where can I find this third-party app? I'm running my 13" rMBP at a "scaled" resolution of 1440x900, so it's pushing 2880x1800 pixels.
 

dave343

macrumors member
Original poster
May 11, 2014
31
0
I think everyone would be wise to have a look at Anand's review of the original retina MacBook Pro where he explained in pretty good detail how the technology and the software handles a retina display.

http://www.anandtech.com/show/6023/the-nextgen-macbook-pro-with-retina-display-review/6
This article is actually what drove my question, because I don't quite understand how the GPU is taxed.
Anandtech mentioned that the Retina's display has a native resolution of 2880x1800; however, OS X will only scale from 1024 up to 1920x1200. By default, OS X displays at something like 1440x900. It also mentioned that the Retina display pushes 4 physical pixels for every pixel displayed at 1440x900.
So going forward, you have the MacBook Retina, and let's say a regular PC laptop also displaying 1440x900. The PC laptop's LCD is native 1440x900; the MacBook Retina is native 2880x1800, but has the resolution scaled down to 1440x900.

So, what I couldn't quite understand is: if you are in a game playing at the default OS X resolution of 1440x900, is the graphics card actually having to render 2880x1800, or is it rendering 1440x900 and the LCD upscales it?
On the PC laptop side, since its native resolution is 1440x900, I know that's all the graphics card has to render to the LCD. No upscaling, etc.

So, that's what I originally wanted to know: whether the MacBook Pro Retina takes a performance hit playing games at the same resolution as any other laptop, since the native res of the LCD on the Retina is 2880x1800. Either the GPU only has to render 1440x900 and "something" handles the upscale, or the GPU has to render the native Retina res of 2880x1800 regardless, and it is then just downscaled for the game you're playing at OS X's resolution of 1440x900. If the GPU has to render everything at the Retina's native 2880x1800 and then downscale, the performance will be lower playing a game at 1440x900 than playing the same game on a laptop with a native res of 1440x900.
 
Last edited:

Freyqq

macrumors 601
Dec 13, 2004
4,014
166
This article is actually what drove my question, because I don't quite understand how the GPU is taxed.
Anandtech mentioned that the Retina's display has a native resolution of 2880x1800; however, OS X will only scale from 1024 up to 1920x1200. By default, OS X displays at something like 1440x900. It also mentioned that the Retina display pushes 4 physical pixels for every pixel displayed at 1440x900.
So going forward, you have the MacBook Retina, and let's say a regular PC laptop also displaying 1440x900. The PC laptop's LCD is native 1440x900; the MacBook Retina is native 2880x1800, but has the resolution scaled down to 1440x900.

So, what I couldn't quite understand is: if you are in a game playing at the default OS X resolution of 1440x900, is the graphics card actually having to render 2880x1800, or is it rendering 1440x900 and the LCD upscales it?
On the PC laptop side, since its native resolution is 1440x900, I know that's all the graphics card has to render to the LCD. No upscaling, etc.

So, that's what I originally wanted to know: whether the MacBook Pro Retina takes a performance hit playing games at the same resolution as any other laptop, since the native res of the LCD on the Retina is 2880x1800. Either the GPU only has to render 1440x900 and "something" handles the upscale, or the GPU has to render the native Retina res of 2880x1800 regardless, and it is then just downscaled for the game you're playing at OS X's resolution of 1440x900. If the GPU has to render everything at the Retina's native 2880x1800 and then downscale, the performance will be lower playing a game at 1440x900 than playing the same game on a laptop with a native res of 1440x900.
1440x900 hidpi is a resolution setting that pushes 2880x1800. That's all there is to it. Running 1440x900 in a non-hidpi mode pushes 1440x900 pixels and stretches to fit the screen. It's the same process as if you were using an external 1920x1200 screen and decided to set the resolution to 1440x900. It would stretch 1440x900 to fit the whole screen. The computer is only rendering the 1440x900 pixels in that instance. Hi-dpi (retina) just means that hi-dpi aware programs will scale the UI and content to be readable as if it were at 1/4 the pixel density.

To summarize, if you have a 15" retina macbook pro and you set a game to run at 1440x900, the computer will render it at 1440x900. If you set the game to render at 2880x1800, it would render at 2880x1800. If you were in a windowed mode and set it at 1440x900, it will fill up 1/4 the screen. Example: I play starcraft 2 at 1920x1200 in fullscreen mode, which sets the monitor resolution to 1920x1200. It runs appreciably better than if I ran it at 2880x1800, which is also a selectable option.
 

ixxx69

macrumors 65816
Jul 31, 2009
1,119
635
United States
If you were in a windowed mode and set it at 1440x900, it will fill up 1/4 the screen.
That might be the way it works with some games, but testing the Unigine Valley Benchmark, if I set the game resolution to 2560x1440 in windowed mode, it takes up the whole screen on my 4K display set to 2560x1440 HiDPI mode. So in that case, either the game or OS X knows to scale the window as well rather than literally drawing 2560x1440 pixels.

Furthermore, whether the display is in 2560x1440 HiDPI mode or low-res 2560x1440 mode (i.e. no HiDPI), the FPS appears to be the same. I don't know if that means the game application window is bypassed by OS X's HiDPI scaling and therefore there's no hit on performance while using a HiDPI screen?
 

Freyqq

macrumors 601
Dec 13, 2004
4,014
166
That might be the way it works with some games, but testing the Unigine Valley Benchmark, if I set the game resolution to 2560x1440 in windowed mode, it takes up the whole screen on my 4K display set to 2560x1440 HiDPI mode. So in that case, either the game or OS X knows to scale the window as well rather than literally drawing 2560x1440 pixels.

Furthermore, whether the display is in 2560x1440 HiDPI mode or low-res 2560x1440 mode (i.e. no HiDPI), the FPS appears to be the same. I don't know if that means the game application window is bypassed by OS X's HiDPI scaling and therefore there's no hit on performance while using a HiDPI screen?
To clarify, if you go into info on the application and hit "low resolution," it will be at 1/4 size. If you uncheck low resolution, it will stretch to fill the screen - still rendering at 1440x900.
 

ixxx69

macrumors 65816
Jul 31, 2009
1,119
635
United States
To clarify, if you go into info on the application and hit "low resolution," it will be at 1/4 size. If you uncheck low resolution, it will stretch to fill the screen - still rendering at 1440x900.
The info indicates that the "low resolution" is already checked and greyed out, yet it exhibits the behavior I previously described.

Do you have that Unigine Valley benchmark app (I only have the free version)?
 

leman

macrumors G3
Oct 14, 2008
9,975
4,555
You seem to be suggesting that I'm wrong, but you never exactly say what about, and then you say the same thing as I did in a very roundabout way, with some weird analogy involving a bike thrown in for good measure. I still don't know what these "myths" about driving displays are that you're referring to. It's a pretty well understood phrase.
Sorry if I was not clear enough. The phrase 'to drive a display' is ambiguous between 'is able to output a video signal at a specific resolution' and 'is able to deliver reasonable performance at a specific resolution'. The second statement cannot be easily generalised, because performance depends on the usage scenario. This is why the phrase 'to drive a display' is often confusing and misleading — e.g. an Intel IGP will happily run a 4K monitor, but it will obviously struggle if you attempt to run a game at full 4K resolution.

To illustrate the confusion a bit better, take the OP's original question: is the GPU always driving the 2880x1800 resolution? It is, because it will always output the video signal at that resolution, but that is absolutely orthogonal to the amount of work the GPU needs to perform when, say, drawing a game. It is entirely possible for it to draw a game at 1024x768 and still output the video signal at 2880x1800. The crucial thing to understand is that the GPU is not drawing directly to the display. It is drawing to a series of memory buffers of different resolutions, which are then combined by the OS in a complicated way so that a final picture can be produced.

Again, you're implying that I'm not quite correct without ever stating what you think I'm incorrect about, and then you pretty much repeat what I said in a very roundabout way.
Again, sorry if I wasn't clear enough. Your post suggests that image scaling is the main reason for suboptimal quality when drawing at a non-native resolution on a classical LCD. I wanted to point out that this is not entirely correct.

I never mentioned anything about "linear interpolation done by the GPU texturing units" (not sure even you know what you're talking about there)
Frankly, if you are unfamiliar with linear interpolation or texturing hardware, then maybe talking about image rescaling is not such a good idea. Especially since you are clearly suggesting that doing the scaling on the GPU gives higher quality than using a specialised DSP chip. To make statements like these you should at least understand how rescaling is performed in hardware and what the difference is between scaling done on the GPU and scaling done by a dedicated DSP.

CRTs have absolutely nothing to do with HiDPI implementations - where you're getting this info, I'd be really curious to know. You're going to have to be a lot more specific on what I wrote about Apple's HiDPI implementation that doesn't make sense.
I never said that CRTs have anything to do with the HiDPI implementation. I was merely stating that color CRTs and hi-res LCDs share a similar hardware feature — small-granularity pixels. This reduces distortion from image rescaling and ultimately allows these displays to work with a wide range of resolutions without severe quality degradation.

That might be the way it works with some games, but testing the Unigine Valley Benchmark, if I set the game resolution to 2560x1440 in windowed mode, it takes up the whole screen on my 4K display set to 2560x1440 HiDPI mode. So in that case, either the game or OS X knows to scale the window as well rather than literally drawing 2560x1440 pixels.
It's actually quite simple. When you set your system to 2560x1440 HiDPI mode, the OS (and the games) 'see' the display as having a resolution of 2560x1440 logical pixels. For non-GPU-intensive applications, the OS will back each of these logical pixels with a 2x2 grid of physical pixels — this happens in a completely transparent fashion for the application, which still thinks that it is drawing to a single pixel. Namely: if the app asks for a 100x100 window, the OS will allocate a 200x200 buffer but present it as a 100x100 one to the app.
However, if the application requests GPU-intensive features (e.g. an OpenGL context), the OS will attempt to optimise and reduce the resolution of the buffer. So when asking for a 100x100 window with OpenGL acceleration, you will actually get a 100x100 pixel buffer. The OS will then take care of all the rescaling so that the image still appears at the correct size (200x200 physical pixels) on a HiDPI display.

Of course, the application can use specific APIs to realise that it is actually dealing with a HiDPI display and ask the OS to adjust its behaviour. For instance, a game could ask for a high-res OpenGL buffer (that is essentially what SC2 does).
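To make 'specific APIs' a bit more concrete, here is a small illustrative AppKit sketch (the calls are real, the surrounding context is simplified) of how a HiDPI-aware app can see both the logical and the physical size of what it draws into:

import AppKit

// Sketch: peeking behind the logical-pixel illusion for some view that is
// already installed in a window.
func logBackingInfo(of view: NSView) {
    let points = view.bounds.size                         // logical pixels, e.g. 100x100
    let pixels = view.convertToBacking(view.bounds).size  // physical pixels, e.g. 200x200
    let scale  = view.window?.backingScaleFactor ?? 1.0   // 2.0 on a Retina panel
    print("logical: \(points), physical: \(pixels), scale: \(scale)")
}
// A game that wants the full-resolution path would combine this with an
// opt-in such as wantsBestResolutionOpenGLSurface (see earlier in the thread).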

Furthermore, whether the display is in 2560x1440 HiDPI mode or low-res 2560x1440 mode (i.e. no HiDPI), the FPS appears to be the same. I don't know if that means the game application window is bypassed by OS X's HiDPI scaling and therefore there's no hit on performance while using a HiDPI screen?
OS X is able to recognise and optimise certain drawing scenarios. Performance-wise, it would make sense for it to step back from the default super-sampled drawing when a game is drawing to the entire screen. However, I am not aware of whether they actually do that kind of optimisation. In your case, the FPS might be the same because (as mentioned above) the additional rescaling step is fairly cheap on modern hardware. At any rate, the game is always rendering to a 2560x1440 buffer (with the OS optionally doing one or two rescaling steps afterwards).
 
Last edited: