No, it's not. Resolution independence means, for example, that font sizes are user-selectable and that text is rendered directly at the desired target size and resolution, not that the whole screen is rendered off-screen and then scaled down.
It doesn't have anything to do with font size being user selectable. Apple doesn't want people changing all the UI elements around.
 
Because OS X scaling is just a cheap cop-out instead of the real thing, which would be a properly resolution-independent interface.
So the "cheap" solution that actually works is supposed to be worse than "the real thing" that doesn't?
As of today, there's no other desktop OS that is capable of decently supporting high resolution screens. You can praise scaling in Windows or whatever all you want, but today, many applications look like crap. This includes several Microsoft applications and even parts of Windows. Good grief.
Besides, have you actually used OSX scaling? Rendering at double the resolution might sound wasteful, but given how insignificant the hardware impact of 2D rendering is, it isn't really a problem. I use a 4K and a 5K 27" screen right next to each other, both with a virtual resolution of 1440p, and the unavoidable difference due to the different hardware resolutions is minor at best. I can notice it; many people actually don't. Things like blurred edges simply don't exist, and downscaling seems to work well enough.
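For illustration, the arithmetic behind this approach is simple: at a "looks like 2560x1440" virtual resolution, the desktop is rendered off-screen at double that in each dimension and then downsampled to the panel's native pixels. A rough sketch (the panel figures are assumed typical values, not anything queried from the OS):

```python
# Sketch of the HiDPI scaling arithmetic described above:
# render at 2x the virtual resolution, then scale to native pixels.

def backing_and_downscale(virtual, native):
    """Return the off-screen backing resolution and the final downscale factor."""
    backing = (virtual[0] * 2, virtual[1] * 2)  # off-screen buffer, 2x in each dimension
    factor = native[0] / backing[0]             # scale applied when blitting to the panel
    return backing, factor

virtual = (2560, 1440)  # "looks like 1440p"
for name, native in [("4K", (3840, 2160)), ("5K", (5120, 2880))]:
    backing, factor = backing_and_downscale(virtual, native)
    print(f"{name}: render at {backing[0]}x{backing[1]}, "
          f"downscale by {factor:.2f} to {native[0]}x{native[1]}")
```

On the 5K panel the factor comes out to exactly 1.0, i.e. no downsampling at all; on the 4K panel every backing pixel is scaled by 0.75, which is the "minor difference" between the two screens mentioned above.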
It doesn't have anything to do with font size being user selectable. Apple doesn't want people changing all the UI elements around.
Indeed, a different topic altogether. The only way it's related to scaling is that you cannot use a high resolution and compensate by increasing the font size instead of UI scaling. But why would anyone want that? Everything would look disproportionate.
Making the OSX GUI more customizable in general is a different topic, but that should definitely not be the mechanism for handling higher resolutions.
 
So the "cheap" solution that actually works is supposed to be worse than "the real thing" that doesn't?
As of today, there's no other desktop OS that is capable of decently supporting high resolution screens. You can praise scaling in Windows or whatever all you want, but today, many applications look like crap. This includes several Microsoft applications and even parts of Windows. Good grief.
Well, I didn't even mention Windows, much less praise it... I could mention that 20 or so years ago there were some promising approaches on the Amiga, and surely Apple could have built on those to come up with something really nice, but from what we have seen they aren't even trying. The Display PostScript heritage inherited from NeXT a long time ago will hopefully eventually grow into a proper vector-based, resolution-independent GUI. Until then, as long as the pixel still rules, Apple's scaling is a good and simple trick, and it's definitely better than half-assing it the way Windows does, but it shouldn't be the end of the development.
 
Well, I didn't even mention Windows, much less praise it... I could mention that 20 or so years ago there were some promising approaches on the Amiga, and surely Apple could have built on those to come up with something really nice, but from what we have seen they aren't even trying. The Display PostScript heritage inherited from NeXT a long time ago will hopefully eventually grow into a proper vector-based, resolution-independent GUI. Until then, as long as the pixel still rules, Apple's scaling is a good and simple trick, and it's definitely better than half-assing it the way Windows does, but it shouldn't be the end of the development.
I didn't intend to suggest that you praise Windows, but I see how it came off like that. By "Windows or whatever" I just meant any desktop OS other than OSX.
But if what you're saying is not that OSX scaling is bad compared to other existing systems, but rather that it isn't perfect, I wouldn't disagree. The main problem, though, is legacy, both in the OS GUI and in applications. The situation would be different if all applications had been programmed with geometry independent of resolution, but as of today, that is unrealistic.
However, the way I see it, the need for purely coordinate-based geometry may have been outlived by improved screen resolutions before it was ever actually implemented. Once you throw enough hardware pixels at the problem, i.e. more than the eye can resolve (which isn't necessarily achieved by retina resolution yet), the need for pixel precision disappears.
 