Sorry if I was not clear enough. The phrase 'to drive a display' is ambiguous between 'is able to output a video signal at a specific resolution' and 'is able to deliver reasonable performance at a specific resolution'. The second statement cannot be easily generalised because performance depends on the usage scenario. This is why the phrase 'to drive a display' is often confusing and misleading: an Intel IGP will happily run a 4K monitor, but it will obviously struggle if you attempt to run a game at full 4K resolution.
That makes your point much clearer. I still think you're being pedantic for no reason, but we can just agree to disagree.

Again, sorry if I wasn't clear enough. Your post suggests that image scaling is the main reason for suboptimal quality when rendering at a non-native resolution on a classic LCD. I wanted to point out that this is not entirely correct.
Again, your point is much clearer stated that way. I'm well aware that the low resolution of traditional displays is a key reason pixel interpolation is so noticeable on them. Honestly, that's so obvious it didn't even occur to me to point it out; it seemed kind of self-evident. I don't think you know what "not correct" means. I just didn't expound on the topic fully to your satisfaction.

Frankly, if you are unfamiliar with linear interpolation or texturing hardware, then maybe talking about image rescaling is not such a good idea.
Seriously? Man, you're full of yourself. You don't need to know the intricacies of graphics subsystems to understand the basic concept of how OS X's HiDPI works. It's explained in pretty simple terms by Apple itself. Again, you're conflating topics into something I wasn't suggesting. It's like you're overthinking everything being said. OS X's implementation of HiDPI is totally different from "DSP" scaling... it's a totally different concept, and that's what I was pointing out. You've so confused the issue that I'm not sure you understand how HiDPI works.

Leman, I do appreciate your reply, but here's an observation: you're acting like someone whose sole reason for posting is to impress on everyone how knowledgeable you are by suggesting everyone else is wrong as an excuse to demonstrate your own expertise. It's ironic that you accuse others of confusing the issue or bringing up irrelevant information. Maybe don't make so many assumptions about what everyone else doesn't understand or shouldn't be talking about based on your exceptional standards. :rolleyes:
 
Seriously? Man, you're full of yourself. You don't need to know the intricacies of graphics subsystems to understand the basic concept of how OS X's HiDPI works. It's explained in pretty simple terms by Apple itself. Again, you're conflating topics into something I wasn't suggesting. It's like you're overthinking everything being said. OS X's implementation of HiDPI is totally different from "DSP" scaling... it's a totally different concept, and that's what I was pointing out. You've so confused the issue that I'm not sure you understand how HiDPI works.

That is all well and nice, but you were not talking about how HiDPI works at all in the post I was originally quoting. I understood you to be talking about resolution scaling as performed by the display (in 'hardware') in contrast to scaling performed on the GPU (in 'software'), and I merely wanted to point out some issues with your reply that, again in my opinion, could be potentially confusing for non-tech-savvy readers.

Leman, I do appreciate your reply, but here's an observation: you're acting like someone whose sole reason for posting is to impress on everyone how knowledgeable you are by suggesting everyone else is wrong as an excuse to demonstrate your own expertise.

I do hear that often. Maybe I am truly a horrible human being who just wants to bore everyone with his know-it-all attitude ;) But honestly, I am just attempting to define and point out separate topics. In my observation, users on these forums tend to conflate multiple orthogonal issues into one big mess of facts, fiction and wishful thinking, which then drags on and is spread around. As a result, harmful myths are born. I make it my personal hobby to fight those myths.

Maybe don't make so many assumptions about what everyone else doesn't understand or shouldn't be talking about based on your exceptional standards. :rolleyes:

You are certainly right, and I apologise if my previous post sounded confrontational. But again, to explain where I am coming from: the purpose of my post was to clarify some details that will hopefully reduce the chance of someone reading your post getting confused. For instance, after reading what you wrote, someone could be left with the impression that retina mode looks good on Macs just because the OS does the scaling 'manually' (on the GPU) rather than letting the display hardware do it. But this is clearly not correct. So what I wanted to do was clarify some things to make such an impression less likely. Sorry if it escalated somewhat.
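For anyone following along, the 'manual' GPU scaling being discussed can be sketched with some back-of-the-envelope arithmetic. This is only an illustration of the general idea; the mode name and numbers are assumptions, not measurements:

```python
# Rough sketch of an OS X HiDPI 'scaled' mode (illustrative numbers):
# to show a "looks like 1920x1200" desktop on a 2880x1800 retina panel,
# the OS renders everything at 2x the logical size, and the GPU then
# downsamples that buffer to the panel's native resolution.
looks_like = (1920, 1200)                          # logical desktop size the user picks
backing = (looks_like[0] * 2, looks_like[1] * 2)   # rendered at 2x: (3840, 2400)
panel = (2880, 1800)                               # physical pixels of the LCD

# The GPU shrinks the 3840x2400 buffer down to 2880x1800,
# a downscale factor of about 1.33 in each dimension.
factor = backing[0] / panel[0]
print(backing, factor)
```

The point is that the downscale starts from a buffer with more pixels than the panel, which is why the result looks much better than a classic LCD stretching a smaller image up.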
 
Thanks for all the replies.

I hope I'm not coming across as obtuse; I've been trying to digest all the information provided, although maybe there is no simple answer to my questions.

I currently have an MBA, which I'm planning on giving to my sister, and then picking up the retina MBP. I game casually on the MBA, and for the most part games play pretty well at the native res of 1440x900.

I don't know if anyone can break all the information down for me to a simple answer, but here's the best I can explain it.

Purely as an example: I play Half-Life 2 on the Air and get, say, 50 FPS at the default 1440x900. This resolution is set in the game, and it's also the res OS X is set to and, obviously, the native res of the LCD. So now I'm moving to the Retina, which has a default resolution of 2880x1800. Now, I understand this is not the resolution OS X displays on screen; it's just the native res of the physical LCD, and whatever res (or rather, scaling) you choose in OS X is just stretched to fit the LCD.

From the information you've all provided, I'm still not completely clear on things.
Let's say on the Retina I play Half-Life 2 and, in the game settings, set it to 1440x900, the same as I play on the Air. Putting aside whatever is doing the upscaling (the video buffer, the LCD, or OS X in the background), what resolution is the Intel GPU having to render? The 1440x900 that I set in the game, or the 2880x1800 native res of the LCD, which is then downscaled (by whatever) to the res I actually see (the 1440 set by the game)?

And again, I apologize; I'm trying to understand, in simpler terms, what resolution the GPU has to render, so I get a better idea of the performance impact that moving to a retina and gaming on it will have. I know these questions don't always have simple answers, but thanks again for all the explanations and replies.
 
It renders at the resolution you set.
So if your current MacBook Air can run Half-Life 2 at 50 fps, your new retina MacBook can run it at the same fps, if not better, at 1440x900 resolution.
 
From the information you've all provided, I'm still not completely clear on things.
Let's say on the Retina I play Half-Life 2 and, in the game settings, set it to 1440x900, the same as I play on the Air. Putting aside whatever is doing the upscaling (the video buffer, the LCD, or OS X in the background), what resolution is the Intel GPU having to render? The 1440x900 that I set in the game, or the 2880x1800 native res of the LCD, which is then downscaled (by whatever) to the res I actually see (the 1440 set by the game)?

In this case the GPU will render to a 1440x900 target, just as chrizzz09 says above, so it should give more or less the same performance as if the screen were natively 1440x900.
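If it helps to see the arithmetic behind that, here is a tiny sketch. It's illustrative only; real frame rates don't scale perfectly with pixel count, but it shows why the panel's native resolution isn't what costs you performance:

```python
# Illustrative pixel-count arithmetic: 3D rendering cost scales roughly with
# the number of pixels in the render target, so a 1440x900 target is about
# the same amount of work whether the panel behind it is 1440x900 or 2880x1800.
def pixels(width, height):
    return width * height

game_target = pixels(1440, 900)    # resolution set in the game's settings
retina_panel = pixels(2880, 1800)  # native resolution of the retina LCD

# The panel has 4x the pixels, but the game still renders only a quarter
# of them; stretching the finished frame to the panel is cheap scaling,
# not extra 3D rendering.
print(retina_panel / game_target)  # -> 4.0
```

So setting the game to 1440x900 on the retina machine renders the same 1.3 million pixels per frame as on the Air; only the final stretch to the screen differs.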
 