
Windows has had DPI scaling for years, but it has mostly sucked and had bad software support. With the latest Windows 8.1 iteration, is it functionally identical to how Macs use high DPI displays?

Content (movies, photos, etc.) is rendered 1:1, with the UI scaled up to be usable on high-resolution displays.

Am I missing something that makes the Mac implementation better?
 
Yes. Retina is completely different from Windows scaling.

OSX/Retina starts with the "looks like" resolution (by default 2560x1440, but it can be higher, such as 3200x1800, depending on what you select as the scaling). It tells all software that the screen is this big, so it all works. But it creates a video buffer exactly twice the size in both directions (5120x2880, or 6400x3600 in the examples above). It renders text at twice the size at the higher resolution, without telling the application it has done this. The application's measurements and built-in images are doubled and rendered onto the higher-resolution buffer.
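A minimal sketch of that arithmetic (the function name is hypothetical; the 2.0 backing scale is the doubling described above):

```swift
// Apps lay out against the "looks like" size; the OS allocates a buffer
// exactly twice as large in each direction and renders everything into it.
let backingScale = 2.0

func backingBufferSize(width: Double, height: Double) -> (width: Double, height: Double) {
    (width * backingScale, height * backingScale)
}

print(backingBufferSize(width: 2560, height: 1440))  // (width: 5120.0, height: 2880.0)
print(backingBufferSize(width: 3200, height: 1800))  // (width: 6400.0, height: 3600.0)
```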

If the application is retina-aware, it is allowed to serve up double-resolution images, which are put into the buffer instead of the non-retina images.
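A toy illustration of that hand-off (the helper is hypothetical, though the @2x file-name suffix is Apple's real convention for double-resolution assets):

```swift
// A Retina-aware app ships 1x and 2x versions of each bitmap; the
// double-resolution variant is used when the backing scale is 2.0.
func assetVariant(base: String, scale: Double) -> String {
    scale >= 2.0 ? "\(base)@2x.png" : "\(base).png"
}

print(assetVariant(base: "toolbar-icon", scale: 2.0))  // toolbar-icon@2x.png
print(assetVariant(base: "toolbar-icon", scale: 1.0))  // toolbar-icon.png
```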

Once complete, the final screen is scaled down to the screen's physical resolution (or just displayed, if 5120x2880 was the buffer resolution).

In this way, all existing software works exactly as before, except that text is super-sharp. Retina-aware applications can use the maximum resolution possible. And even at higher "looks like" resolutions the text remains sharp, because the buffer is always down-scaled rather than pixel-doubled.

*This* is what retina is. Apple haters will tell you it just means high resolution screens, but only because they don't understand what the difference is.
 
Lol, Windows is nowhere near as good as OS X's implementation.

With OS X I'm always seeing the benefit of the HiDPI display; in Windows everything looks small or the icons look pixelated. It's just bad.
 
That's what I thought too, but are you SURE? Check out Windows 8.1's HiDPI features. It sounds the same to me, except for the render-offscreen/scale-down part.
 
I've had my Dell 4K connected to my Windows PC, gaming at 4K at 60 Hz.

Believe me, it doesn't look good.
 
From what I read, it's just how Windows has always done HiDPI... a DPI slider controls how pixels are rendered (e.g., if an app wants to draw 1 pixel, the OS may draw 1-2 pixels instead).

With OS X it's very different: if an app wants to draw 1 pixel, the OS will ALWAYS draw 2 pixels instead. The trick is that the drawing happens in an offscreen buffer, which is then downscaled and copied to the screen.

The OS X method is more resource-intensive (as it always needs to draw a 2x2-bigger image), but it results in a higher-quality image. And it's much friendlier for programmers.
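A rough sketch of the contrast being drawn here (both functions are hypothetical simplifications; pixel counts are per direction):

```swift
// Windows-style DPI scaling as described above: a slider-controlled factor
// multiplies the requested size, so fractional factors force rounding.
func windowsStylePixels(requested: Int, dpiFactor: Double) -> Int {
    Int((Double(requested) * dpiFactor).rounded())
}

// OS X-style: always an exact 2x per direction into the offscreen buffer;
// the finished frame is then downscaled to the panel as a whole.
func osxStylePixels(requested: Int) -> Int {
    requested * 2
}

print(windowsStylePixels(requested: 15, dpiFactor: 1.5))  // 23 (rounded from 22.5)
print(osxStylePixels(requested: 15))                      // 30 (always exact)
```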
 
So the Retina iMac, when set to "Best for Retina" (providing an equivalent 2560x1440 resolution), is rendering offscreen at 10,240x5,760 before scaling down? (Or natively at 5120x2880, before scaling down to 2560x1440?)

I promise I'm not trying to be dense or difficult, I'm trying to understand.
 
In "Best for retina" (i.e. looks like 2560x1440) it's rendering at 5120x1880.

AnandTech's Retina MacBook Pro review explains it well:
In the default “best for Retina Display” setting, the desktop, menu bar, icons and Finder windows are drawn at 2880 x 1800, but they are drawn larger than they would normally be at 2880. Apple draws everything at 4x the size to make the desktop behave exactly as it would on a 15.4-inch 1440 x 900 display - this is the backing scale factor (2.0) at work. This approach provides the best image quality as there’s integer mapping from pixels on the panel to pixels on the desktop. No interpolation or filtering is necessary... ...the screen is drawn at 2880 x 1800 but everything is scaled up to be the same size it would be at 1440 x 900
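On an actual Mac you can see the backing scale factor the quote mentions; AppKit exposes it directly (macOS-only sketch, prints 2.0 on a Retina display):

```swift
import AppKit

if let screen = NSScreen.main {
    // 2.0 on a Retina display, 1.0 otherwise.
    print("backingScaleFactor:", screen.backingScaleFactor)
    // The "looks like" size, in points.
    print("points:", screen.frame.size)
    // The backing-buffer size, in pixels.
    print("pixels:", screen.convertRectToBacking(screen.frame).size)
}
```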
 
Is your gaming PC Windows 8.1?

Yep. The problem with Windows is that HiDPI support depends on the application. OS X scales everything (whether you get Retina assets depends on the application), but in Windows, if an app doesn't support HiDPI, all of its UI elements look horrible. Steam, Chrome, etc. all have poor HiDPI support.
 
Read my earlier post. If you're using "best for retina" it renders a 2560x1440 screen at double resolution (5120x2880), and then (since that's also the screen's native resolution) it just shoves it at the screen.

Other "looks like" resolutions will render at higher resolutions and then scale down to fit the screen.
 
Lol, Windows is nowhere near as good as OS X's implementation.

With OS X I'm always seeing the benefit of the HiDPI display; in Windows everything looks small or the icons look pixelated. It's just bad.
Not in Windows 8.1. It looks perfect at 200%: the same shape and size on the rMBP as 1440x900, but twice as sharp. Very nice.

This is the first time pixel doubling has worked in the Windows environment.

Prior to that your comments were valid; not anymore.
 