Libertine Lush

Hello all,

As explained so well in the AnandTech review of the rMBP that many have read, the rMBP renders a scaled resolution (1680x1050 or 1920x1200) at 2x the selected resolution, then downscales it to fit the screen, which is why a scaled 1680x1050 on the rMBP looks nicer than the native 1680x1050 on the Hi-Res "classic" MBP.
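
To put rough numbers on what the review describes, here is a quick Python sketch of the arithmetic (the names are made up for illustration; only the resolutions come from the review):

# The "scaled" modes: render into a backing store at 2x the selected
# resolution, then downsample that to the 2880x1800 panel.
PANEL_WIDTH = 2880  # native 15" rMBP panel is 2880x1800

def scaled_mode(width, height):
    """Return the backing-store size and the shrink factor for a scaled mode."""
    backing = (width * 2, height * 2)
    shrink = PANEL_WIDTH / backing[0]  # how much the backing store is scaled down
    return backing, shrink

for w, h in [(1680, 1050), (1920, 1200)]:
    backing, shrink = scaled_mode(w, h)
    print((w, h), "-> rendered at", backing, "then shrunk by a factor of", round(shrink, 3))

# (1680, 1050) -> rendered at (3360, 2100) then shrunk by a factor of 0.857
# (1920, 1200) -> rendered at (3840, 2400) then shrunk by a factor of 0.75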

However, does this also happen when in Windows?

I was having a discussion elsewhere about how impressive it is that the rMBP can play a certain game in Boot Camp at almost all Ultra settings at 1920x1200, and that it is even more impressive considering the resolution being processed is actually 3840x2400. But am I wrong? Is that only happening in OS X?

Thanks.
 
Only happens in OSX
 
I don't think so (but I don't know).

What I found interesting was that the resolution can lie. I was playing Half-Life 2 (under OSX Steam) and it's slow at 2880x1800, under 30fps in some situations. I reset it to run at 1440x900, AA 2x. Now, certain things are lower resolution. However, I am almost certain that the rendered textures in the distance are actually rendering at 2880x1800 (so with tiny pixels). The actual shapes being displayed are at 1440x900, and the pixellation on small distant items is clear. The textures, however, are detailed. Some of this scaling-down-scaling-up is being done by the graphics hardware automagically.

In my experience (and coming back to the question), Windows in Boot Camp scales down from 2880x1800, and hence the scaled resolutions aren't as nice and crisp as the same resolutions in HiDPI OS X. 2880x1800 is perfect, of course, as is 1440x900.
 
wtf are you talking about? none of this was accurate at all.
 
I don't think the OP understands much about scaling.
OS X renders its UI, all the 2D stuff, at double size because it only has two scaling modes: normal and HiDPI (which is double).
In order to keep proportions proper, they render higher and scale down.

Windows has no such thing, but instead a range of scaling modes from 100-150%, and in theory you can add even more DPI modes if you do not rely on the wizard.
That requires a bit more thought on the side of the programmer, which is why apps often aren't perfectly resolution independent.

Windows renders the standard resolution and later scales up to whatever resolution the monitor has, but that is not actually an OS feature. If you plug in an external HDTV you can send it a 1280x720p signal and it will scale up in the TV, no GPU or OS required. All external screens can run a bunch of resolutions and have their own scaling.
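
To make the contrast concrete, here is a minimal Python sketch (the function names and values are purely illustrative, not any real API):

def windows_scaled_width(logical_width, dpi_percent):
    # Windows-style DPI scaling: the element is drawn at logical * scale,
    # so any percentage (100, 125, 150, ...) is possible.
    return logical_width * dpi_percent / 100

def osx_scaled_width(logical_width, hidpi):
    # OS X-style scaling: only 1x or 2x exist; the in-between "looks like"
    # resolutions come from rendering at 2x and shrinking the whole frame.
    return logical_width * (2 if hidpi else 1)

print([windows_scaled_width(100, p) for p in (100, 125, 150)])  # [100.0, 125.0, 150.0]
print([osx_scaled_width(100, h) for h in (False, True)])        # [100, 200]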

I was having a discussion elsewhere about how impressive it is that the rMBP can play a certain game in Boot Camp at almost all Ultra settings at 1920x1200, and that it is even more impressive considering the resolution being processed is actually 3840x2400. But am I wrong? Is that only happening in OS X?
You are completely wrong, because even in OS X games render at whatever resolution you set inside the game. Games know full well how to scale to different resolutions and need no tricks to show the right proportions.
Rendering them at anything higher than demanded would be a huge waste of resources.
Supersampling in games also has nothing to do with proportions but with detail: from a higher-resolution picture it is easier to tell whether something ought to look the way it does or whether it should actually be a clean straight line rather than an aliased one.
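
As a toy sketch of what supersampling does, assuming nothing about any real engine (render() below is just a stand-in scene):

def render(width, height):
    # Pretend scene: a diagonal edge, 1.0 where x > y, 0.0 elsewhere.
    return [[1.0 if x > y else 0.0 for x in range(width)] for y in range(height)]

def downsample_2x(img):
    # Average each 2x2 block of the high-res render into one output pixel.
    h, w = len(img), len(img[0])
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
             for x in range(w // 2)] for y in range(h // 2)]

aliased = render(4, 4)                  # direct render: hard 0/1 edge
smoothed = downsample_2x(render(8, 8))  # 2x supersampled, then averaged down
print(aliased[1])   # [0.0, 0.0, 1.0, 1.0]  -- hard stair-step
print(smoothed[1])  # [0.0, 0.25, 1.0, 1.0] -- edge pixel becomes an in-between grey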
 
Basically, OS X is stupid: it can only draw a line in two ways.
To cover 10% of the screen, it either paints 144 or 288 pixels.
In no situation can it paint 168. It just doesn't know how to do that. You cannot tell it that 10% of the screen is anything other than 144 or 288.
So you ignore that, and if you want higher than 144 you paint in 288 and later shrink the whole picture by 40%.

BTW, I don't think that is much of a future-proof solution. It seems more like a "we didn't find a better way for now, so we did this" emergency solution. It might be that they stick with it because they don't deem the people who want to run something other than the prescribed resolution important enough. That seems to be Apple practice more and more.
 
I don't think so (but I don't know).

What I found interesting was that the resolution can lie. I was playing Half-Life 2 (under OSX Steam) and it's slow at 2880x1800, under 30fps in some situations. I reset it to run at 1440x900, AA 2x. Now, certain things are lower resolution. However, I am almost certain that the rendered textures in the distance are actually rendering at 2880x1800 (so with tiny pixels). The actual shapes being displayed are at 1440x900, and the pixellation on small distant items is clear. The textures, however, are detailed. Some of this scaling-down-scaling-up is being done by the graphics hardware automagically.

In my experience (and coming back to the question), Windows in Boot Camp scales down from 2880x1800, and hence the scaled resolutions aren't as nice and crisp as the same resolutions in HiDPI OS X. 2880x1800 is perfect, of course, as is 1440x900.

Unless you selected supersampling AA, I don't think what you're describing is reality.
 
Windows doesn't do this, only OS X.

----------

What I found interesting was that the resolution can lie. I was playing Half-Life 2 (under OSX Steam) and it's slow at 2880x1800, under 30fps in some situations. I reset it to run at 1440x900, AA 2x. Now, certain things are lower resolution. However, I am almost certain that the rendered textures in the distance are actually rendering at 2880x1800 (so with tiny pixels). The actual shapes being displayed are at 1440x900, and the pixellation on small distant items is clear. The textures, however, are detailed. Some of this scaling-down-scaling-up is being done by the graphics hardware automagically.

:eek:

You know, when I was a kid, I used to believe that cars were powered by little horses inside the engine (why would they call it horsepower otherwise?).

Bottom line: sometimes it's really better to remain silent...

----------

Basically, OS X is stupid: it can only draw a line in two ways.
To cover 10% of the screen, it either paints 144 or 288 pixels.
In no situation can it paint 168. It just doesn't know how to do that. You cannot tell it that 10% of the screen is anything other than 144 or 288.
So you ignore that, and if you want higher than 144 you paint in 288 and later shrink the whole picture by 40%.

BTW, I don't think that is much of a future-proof solution. It seems more like a "we didn't find a better way for now, so we did this" emergency solution. It might be that they stick with it because they don't deem the people who want to run something other than the prescribed resolution important enough. That seems to be Apple practice more and more.

No, you got it all wrong, alas.

This was a deliberate choice by Apple engineers. In earlier versions of OS X you had a flexible scale factor (akin to the current Windows DPI settings) which would map device pixels and logical units in a flexible way. However, due to imperfect scaling, visual artefacts arise. This is why Apple built supersampling AA into the OS. When rendering at a non-native resolution, the OS renders to a higher-resolution buffer and then downscales it to the native resolution. This also allows for some optimisations when dealing with images and UI rendering.
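
A rough sketch of the alignment problem with arbitrary scale factors (purely illustrative; this is not how the compositor is actually written):

def device_x(logical_x, scale):
    # Map a logical coordinate to a device-pixel coordinate.
    return logical_x * scale

for scale in (1.5, 2.0):
    mapped = [device_x(x, scale) for x in range(5)]
    off_grid = [p for p in mapped if p != int(p)]  # coordinates that fall between pixels
    print(scale, mapped, "off-grid:", off_grid)

# 1.5 [0.0, 1.5, 3.0, 4.5, 6.0] off-grid: [1.5, 4.5]
# 2.0 [0.0, 2.0, 4.0, 6.0, 8.0] off-grid: []

With a fixed 2x factor everything lands on whole pixels, and the only resampling left is the single downscale of the finished frame.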

The system is still able to display any custom resolution, and the supersampling is there to enhance the quality. Apple's way (with a 4x supersampling buffer) is the superior way, and other OSes will undoubtedly implement this technique as well.

And BTW, 10% of the screen in the 1650x1080 mode is rendered as 330 pixels ;)
 
You are right, the example is way off. It is always shrunk to 50%, rendered at whatever 200% will get you the desired 100%.
It is 336, to be picky.
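
For anyone following along, the corrected arithmetic as a quick sketch (illustrative only):

looks_like_width = 1680
backing_width = looks_like_width * 2           # 3360-pixel-wide backing store
ten_percent_backing = backing_width // 10      # 336 backing pixels
panel_width = 2880
ten_percent_panel = panel_width // 10          # 288 physical pixels after the downscale
print(ten_percent_backing, ten_percent_panel)  # 336 288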

I don't doubt it is deliberate, but I don't think it is the best way. If it is, it will end up as a function of future graphics drivers that allow higher input than output resolutions; this software solution is an emergency measure. That doesn't change what it is, though.
I also haven't read about any insurmountable upscaling problems that make it really worth it to render everything higher, as opposed to just doing special scaling for bitmaps. Most GUI elements work quite well and are easily scaled up without any problems. This solution is about compatibility, but I doubt it is really worth it given how much higher they still want to go in resolutions. Here we are at 4K, but on a 27" with some lower settings it gets a bit ridiculous.
Hardware will get faster, but is it really worth doing all this drawing work when most of the time it is unnecessary?
I think the Windows 8 approach is more the future than this.
It is okay when you expect that all people will want to run the same scaling factor anyway. I know that my dad and I have vastly different opinions on what is ideal. He runs his 19" 1440x900 display at 125%; I prefer a 15" at 1680x1050 or 1920x1200.
 
There's something a bit weird going on that I don't quite understand when running games on OSX. Bear with me here...

My "normal" resolution I boot with is 1920x1200. I then start up Half-Life 2 and open up a screen, setting Video options to 1440x900. If I shift-command-3 I can take a screenshot.

Then I change the fullscreen video options to 2880x1800. Of course, everything is sharper and slower (fps down to about 25-30). Take the same screenshot.

Now, exit the game and view the screenshots in Pixelmator. BOTH images are 3840x2400 (?!). The one taken at (notionally) 1440x900 is slightly blurrier, but not by a huge amount. Both pictures contain detail across all 3840x2400 pixels; there is no duplication of pixel data anywhere.

What's going on here? Is it just the screen-capture program doing this? Where is it getting the data? HL2 thinks it's sending out 1440x900, but it appears that the graphics card is storing detail at a higher resolution than this. If a 1440x900 image were being presented at 3840x2400 you would expect a lot more doubling -- suddenly the number of pixels increases by over seven times.
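
Putting numbers on that (plain arithmetic, nothing OS-specific; the 2x relationship to the desktop resolution is just an observation):

desktop_mode = (1920, 1200)                           # the resolution I boot with
capture = (desktop_mode[0] * 2, desktop_mode[1] * 2)  # (3840, 2400) -- exactly 2x the desktop mode

game_render = (1440, 900)
ratio = (capture[0] * capture[1]) / (game_render[0] * game_render[1])
print(capture, round(ratio, 2))                       # (3840, 2400) 7.11 -- "over seven times" the pixels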

I'm not even sure why OSX is taking the snapshot at 3840x2400...
 