
macbook123

macrumors 68000
Original poster
Feb 11, 2006
1,869
85
I have this basic question: which aspects of a retina-optimized app are "scaled" to the resolution I pick in System Preferences (e.g., 1920x1200), and which elements are not? When I look at Google Maps in Safari (a retina-optimized app), the maps look very crisp and not "downgraded" to 1920x1200. Same with photos in Aperture. So my interpretation of "retina-optimized" became that the control elements and fonts belonging to an app's controls are scaled to the downgraded resolution, but the graphical elements are not.

What I then wonder is why a different approach wouldn't make more sense: one that always uses the full 2880x1800 but sets a minimum font size and control-element size, instead of scaling the actual resolution. Or is that the same thing? Sorry if these questions are stupid.
 

macbook123

macrumors 68000
Original poster
Feb 11, 2006
1,869
85
Could somebody please provide some input on this?

For example, in Aperture I noticed that pictures and videos are indeed shown at the *reduced* resolution of the current setting: a Full HD movie takes up the entire screen width, and vertically about 120 pixels less than the full screen height, in the Aperture window. Only when I switch the resolution to the full 2880x1800 using SetResX does the Full HD video take up less than the full screen.

This suggests that even Retina-optimized apps like Aperture don't really use the full resolution of the display, but present a downgraded, interpolated version of the real images. If the same holds for Safari, I'm confused overall, because it would seem that what Apple really should have done is increase the size of control elements and fonts while leaving the resolution of graphics untouched.

Now, what I don't understand about my observations is that during the introduction of the Retina Pros, Apple touted an FCP (Final Cut Pro) window with a small sub-window allegedly showing a Full HD movie. Does this mean that FCP is really the only Retina-optimized app that shows you media at full resolution?
 

leman

macrumors Core
Oct 14, 2008
19,202
19,061
Basically, it works as follows. The core idea is to differentiate between 'points' (which are the logical pixels all applications are aware of) and native pixels. On a traditional screen, in a traditional OS 1 point = 1 pixel. That is, if you render a 60x20 points button, it will have the dimensions of 60x20 pixels on the screen and if you render a full-sized Full HD video it will have the dimensions of 1920 × 1080 pixels on the screen.

Now, on a HiDPI (retina) screen, each point is 2x2 pixels. The applications still work with points, though. So the same button stays 60x20 points as far as the application is concerned, but the OS renders it as 120x40 pixels. Because pixels are so small, this retina button actually has the same size as a non-retina 60x20 button. Something very similar happens to the video: the application just tells the OS "hey, I want to display 1080p content, so give me a 1920x1080 window". But because dimensions are points and not pixels, this window will be 3840x2160 pixels in the retina mode and the video will be upscaled (in the scaled 1920x1200 resolution, OS X renders into a '4x supersampled' internal buffer of 3840x2400 and then downscales it to the native 2880x1800).

This video upscaling happens because the application is not 'retina' (HiDPI)-aware, or just does not care, and thus asks for a 1920x1080-point display area. A HiDPI-aware application can detect that point ≠ pixel and ask for actual 1920x1080 pixels (which would be 960x540 points). This is what Apple demonstrated with Final Cut Pro.
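If you want to see what "asking for pixels" looks like in code, here is a minimal sketch, assuming a Swift/AppKit setting (the thread predates Swift, but the AppKit property is the same idea; the function name is just for illustration): a HiDPI-aware app reads the window's backing scale factor and sizes its video view in points so that the backing store is exactly 1920x1080 pixels.

```swift
import AppKit

// Minimal sketch: compute the point size whose backing store is exactly
// 1920x1080 pixels. In a Retina (HiDPI) mode backingScaleFactor is 2.0,
// so this returns 960x540 points.
func pointSizeForPixelExactHD(in window: NSWindow) -> NSSize {
    let scale = window.backingScaleFactor   // 1.0 on a normal screen, 2.0 on Retina
    return NSSize(width: 1920.0 / scale, height: 1080.0 / scale)
}
```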

BTW, by default, OpenGL 3D content is rendered pixel-doubled to improve performance. So if the application asks for a 500x500-point OpenGL window, the OS will always give it a 500x500-pixel surface and then upscale (pixel-double) it to 1000x1000 when the window is drawn. Again, HiDPI-aware applications can control this as well.
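As a rough illustration of how an application opts out of that OpenGL pixel-doubling, here is a hedged Swift sketch; wantsBestResolutionOpenGLSurface, convertToBacking and reshape are the real AppKit names from that era, while the subclass itself is hypothetical:

```swift
import AppKit
import OpenGL.GL

// Sketch of a HiDPI-aware OpenGL view: request a full-resolution backing
// store instead of the default pixel-doubled one, and size the GL viewport
// in pixels rather than points.
final class RetinaGLView: NSOpenGLView {
    override func awakeFromNib() {
        super.awakeFromNib()
        wantsBestResolutionOpenGLSurface = true      // opt out of pixel-doubling
    }

    override func reshape() {
        super.reshape()
        let pixels = convertToBacking(bounds)        // view bounds converted from points to pixels
        glViewport(0, 0, GLsizei(pixels.width), GLsizei(pixels.height))
    }
}
```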

Now, as to the question of how many applications out there are already HiDPI-aware: I have no idea.

Hope this clears things up for you a bit.
 

DiGriz

macrumors newbie
Jul 4, 2012
8
0
The Idiot's Guide to UI Elements on a Retina Screen:
User interface (UI) elements in a non-Retina application are scaled once in "Best for Retina" mode, and twice in the "Scaled" modes.

Let's take a hypothetical UI element which is 10 pixels tall and 10 pixels wide. In "Best for Retina" mode this element would be stretched to 20 by 20. Each individual pixel in the original element will be interpolated into four pixels on the stretched version. Interpolation introduces "blurriness", but because of the super-high resolution of the Retina screen, this is offset to some degree.

In Scaled mode the situation is somewhat worse. The non-Retina UI element undergoes its doubling (to 20 by 20), and then undergoes a non-integer scaling to convert the virtual 1680 screen to the real 2880 screen. Our doubled 20 by 20 element becomes a rather unwieldy 17.143 by 17.143, along with inheriting another level of blurriness. 1 pixel of the original UI element (or any non-Retina image - web content, for instance) is now mapped to 1.7143 pixels of the screen, through two blur-inducing resampling stages.
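If you want to check those figures yourself, here is the arithmetic spelled out (a few lines of Swift used purely as a calculator; the numbers are the same ones quoted above):

```swift
// The 1680x1050 "Scaled" mode: legacy content is first pixel-doubled into a
// 3360x2100 offscreen buffer, which is then squeezed onto the 2880x1800 panel.
let doubled   = 10.0 * 2.0             // 10-px element -> 20 px in the HiDPI buffer
let downscale = 2880.0 / (1680.0 * 2)  // ~0.857: buffer width -> panel width
print(doubled * downscale)             // ~17.14 px on the panel
print(2.0 * downscale)                 // ~1.714 panel pixels per original pixel
```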

Again, this is offset somewhat by the Retina effect of the screen (where individual pixels matter less than they used to). Some people will notice this added blurriness, some will not.

The thing that worries me is that the 2880 Retina screen is not a standard size, so Apple cannot take advantage of the specialist hardware built into the rMBP's GPUs. All of this resampling is done in software. At the moment this software resampling is - supposedly - done on the CPU, but Mountain Lion will move this load onto the GPUs.

Rendering the original 10 by 10 UI element directly on the 2880 screen would make it unusably small, and Apple is using the Retina effect to give "virtual" resolutions to avoid this problem.

Retina-aware applications are displayed in a 1-to-1 fashion on the 2880 screen, but they start off twice the size of non-Retina applications, so they appear at a usable size on the screen (just much clearer). However, under the scaled modes even the Retina-aware applications go through the same (second) non-integer scaling as the non-Retina applications to convert from the virtual resolution to the real 2880 screen.
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,202
19,061
The Idiot's Guide to UI Elements on a Retina Screen:

Do you call it 'Idiot's Guide' because it is full of misconceptions or is there another reason for this?

Let's take a hypothetical UI element which is 10 pixels tall and 10 pixels wide. In "Best for Retina" mode this element would be stretched to 20 by 20. Each individual pixel in the original element will be interpolated into four pixels on the stretched version. Interpolation introduces "blurriness", but because of the super-high resolution of the Retina screen, this is offset to some degree.

Wrong (or, mostly wrong). Usually, UI elements are vector-rendered (for example, buttons or text). Vector rendering is always crisp; there is no interpolation and no blurriness. Only bitmap images, OpenGL views and custom-drawn views are interpolated (pixel-doubled), and even then only for non-retina-aware applications. As I mentioned before, retina-aware applications can choose finer levels of control.
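To make the bitmap case concrete, here is a small sketch of the @2x convention that avoids pixel-doubled images; the naming scheme is the one AppKit actually uses, while the asset name itself is made up for illustration:

```swift
import AppKit

// If the bundle ships both "toolbarIcon.png" (1x) and "toolbarIcon@2x.png" (2x),
// AppKit picks the 2x representation on a Retina backing store, so the bitmap
// is drawn from real pixels instead of being pixel-doubled.
let icon = NSImage(named: "toolbarIcon")   // hypothetical asset name
let imageView = NSImageView()
imageView.image = icon
```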

In Scaled mode the situation is somewhat worse. The non-Retina UI element undergoes its doubling (to 20 by 20), and then undergoes a non-integer scaling to convert the virtual 1680 screen to the real 2880 screen. Our doubled 20 by 20 element becomes a rather unwieldy 17.143 by 17.143, along with inheriting another level of blurriness. 1 pixel of the original UI element (or any non-Retina image - web content, for instance) is now mapped to 1.7143 pixels of the screen, through two blur-inducing resampling stages.

Again, this is offset somewhat by the Retina effect of the screen (where individual pixels matter less than they used to). Some people will notice this added blurriness, some will not.

Again, this is not entirely correct. On the 1680x1050 HiDPI resolution, the OS renders everything into a big offscreen 3360x2100 buffer. So your 20x20 UI element will first be drawn to this buffer. Then, the buffer is linearly interpolated (downscaled) to the native 2880x1800 screen. Since downscaling preserves more detail than upscaling, the quality stays very good, although, as you correctly point out, some blurriness may be perceived as a result of the interpolation.

The thing that worries me is that the 2880 Retina screen is not a standard size, so Apple cannot take advantage of the specialist hardware built into the rMBP's GPUs. All of this resampling is done in software. At the moment this software resampling is - supposedly - done on the CPU, but Mountain Lion will move this load onto the GPUs.

No idea what this is supposed to mean. There have been no "standard sizes" in GPUs since, I don't know, the year 2000 or so? The GPU will happily interpolate non-power-of-two textures for you, and this is what OS X uses it for. This interpolation is also very fast; even the Intel IGP can do it at very high rates (several hundred times per second). If the interpolation were done on the CPU, you'd see much higher CPU usage values in Activity Monitor.


Retina-aware applications are displayed in a 1-to-1 fashion on the 2880 screen, but they start off twice the size of non-Retina applications, so they appear at a usable size on the screen (just much clearer). However, under the scaled modes even the Retina-aware applications go through the same (second) non-integer scaling as the non-Retina applications to convert from the virtual resolution to the real 2880 screen.

Here I again have no idea what you are talking about. If you are talking about 'native' 2880x1800 mode (which you can't select via standard pref pane), the retina-aware and non-retina aware apps behave exactly the same way under it - they are both unusably small. Here, logical pixels = real pixels and both are very small.

If you are talking about 'best for retina' mode, this is HiDPI 1440x900 (where each logical pixel is represented via 2x2 real pixels). Again, retina and non-retina aware apps are both 2x2 scaled, to appear the same size as they would on a 1440x900 monitor. The only thing that retina-aware (we really should start calling them HiDPI-aware, btw) applications can do is recognize when they are run on a HiDPI display and adjust accordingly (e.g. render custom UI at a higher internal resolution). If the application does not adjust explicitly, the OS will pixel-double the content that needs adjusting (again, these are bitmap images, OpenGL views and custom-drawn views).
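In code, "adjusting accordingly" boils down to something like the following sketch; viewDidChangeBackingProperties and backingScaleFactor are the real AppKit hooks, while the view class itself is hypothetical:

```swift
import AppKit
import QuartzCore

// A custom view that re-renders its own content when it moves between a
// Retina and a non-Retina backing store.
final class ScaleAwareView: NSView {
    override func viewDidChangeBackingProperties() {
        super.viewDidChangeBackingProperties()
        let scale = window?.backingScaleFactor ?? 1.0  // 2.0 on a HiDPI display
        layer?.contentsScale = scale                   // keep layer-backed content crisp
        needsDisplay = true                            // redraw custom content at the new scale
    }
}
```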
 

DiGriz

macrumors newbie
Jul 4, 2012
8
0
Do you call it 'Idiot's Guide' because it is full of misconceptions or is there another reason for this?
Yes. Specifically aimed at people who think they know more than they do...

Wrong (or, mostly wrong). Usually, UI elements are vector-rendered (for example, buttons or text).
Last time I checked, all UI elements were still bitmaps. The screen grabs of non-Retina applications with blurry UI elements would lend credence to this. Surely if Apple had moved everything to vector then all applications would be - by default - Retina ready; but they're not, ergo you are talking out of an orifice not usually used for such.

Again, this is not entirely correct. On the 1680x1050 HiDPI resolution, the OS renders everything into a big offscreen 3360x2100 buffer.
...which is the "virtual 1680" screen. Are you being deliberately obtuse, or is this genuine?

Then, the buffer is linearly interpolated (downscaled) to the native 2880x1800 screen.
...which is exactly what I said.

Since downscaling preserves more detail than upscaling, the quality stays very good...
...and a punch in the arm is better than a kick in the balls; it's still better to have neither. Anyway, you're just plain wrong; downscaling loses more detail than upscaling, it's just not as obvious. And the net result is still a 1.714x upscale, so you've still lost out twice.

I hate to think how badly you're being fiddled by your accountant.

No idea what this is supposed to mean. There have been no "standard sizes" in GPUs since, I don't know, the year 2000 or so?
Really, so the fact that everyone else conforms to the standard resolutions is just a mere fluke of probability?


Here I again have no idea what you are talking about.
I know, I clearly pitched the "idiot's guide" too high.

If you are talking about 'native' 2880x1800 mode (which you can't select via standard pref pane)...
Why would I be talking about that in a simplified guide? Seriously?

If you are talking about 'best for retina' mode, this is HiDPI 1440x900 (where each logical pixel is represented via 2x2 real pixels).
I think there's an echo in here.

Again, retina and non-retina aware apps are both 2x2 scaled...
Um, no they're not. Non-Retina applications are pixel-doubled; Retina applications are already rendered at twice the size. That's why non-Retina applications are blurry and Retina ones are sharp. If you had to upscale the Retina applications, they too would be blurry. You can't invent detail; you've been watching too much CSI. E n h a n c e !
 

leman

macrumors Core
Oct 14, 2008
19,202
19,061
Last time I checked, all UI elements were still bitmaps. The screen grabs of non-Retina applications with blurry UI elements would lend credence to this. Surely if Apple had moved everything to vector then all applications would be - by default - Retina ready; but they're not, ergo you are talking out of an orifice not usually used for such.

Then stop 'checking' and start reading Apple's developer docs. Buttons, frames, text, etc. are all vector-rendered.

This page might help for starters: http://tinyurl.com/d7kr95d



...and a punch in the arm is better than a kick in the balls; it's still better to have neither. Anyway, you're just plain wrong; downscaling loses more detail than upscaling, it's just not as obvious. And the net result is still a 1.714x upscale, so you've still lost out twice.

No it is not - and the reason is exactly because the (vector) UI is rendered at a higher resolution in the first place (see above). It does make a difference whether you upscale a low-res image or downscale a high-res one - the latter will usually look better. Of course, you are right about the bitmap case.


Really, so the fact that everyone else conforms to the standard resolutions is just a mere fluke of probability?

Modern RAMDAC modules can work with custom resolutions. The RAMDAC on the rMBP must be a particularly fast one to be able to output that many pixels. But I am sure that you don't want to say that the rMBP bypasses the RAMDAC and uses the CPU to send the data directly to the display.



Um, no they're not. Non-Retina applications are pixel-doubled; Retina applications are already rendered at twice the size. That's why non-Retina applications are blurry and Retina ones are sharp. If you had to upscale the Retina applications, they too would be blurry. You can't invent detail; you've been watching too much CSI. E n h a n c e !

Again, see the first part. If the application uses Cocoa-provided controls and does not do anything exotic like drawing its own controls or using offscreen rendering, it is already retina-enabled and no pixel-doubling will occur.

Overall, I see that you tried to keep things simple, which is a good thing. Unfortunately, some of your explanations can be understood the wrong way. Basically, you make it sound as if any application that is not specially compiled with some sort of retina-specific code will be pixel-doubled. This is plainly wrong. Most applications will look perfect on the retina screen, because the OS manages the HiDPI rendering automatically. You only need to adjust the software if a) you use bitmaps (i.e., images), or b) you do something exotic like custom control drawing.
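As a last illustration (same Swift/AppKit assumptions as the earlier snippets; the view class is hypothetical): standard vector drawing is expressed in points, so it needs no retina-specific code at all - only bitmaps and exotic offscreen work need extra care.

```swift
import AppKit

// Vector drawing in draw(_:) is expressed in points; AppKit rasterizes it at
// the backing store's resolution, so this view is crisp on both 1x and 2x
// displays without any retina-specific code.
final class BadgeView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        NSColor.blue.setFill()
        NSBezierPath(ovalIn: bounds.insetBy(dx: 2, dy: 2)).fill()
    }
}
```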
 

macbook123

macrumors 68000
Original poster
Feb 11, 2006
1,869
85
Thanks for the explanation. That is the way I had understood it myself, but the Aperture example above contradicts it: in Aperture, which should be a retina-aware app according to Apple, Full HD content is always stretched to the full screen width when I'm at the 1920x1200 setting. It really should show in a small sub-window like it does in Final Cut.

 