Finally! Maybe the "resolution independence" rumors that started years ago can now be put to rest? Anyone remember that?
Yeah,
the only positive thing about this topic is that maybe Apple will finally put some effort into resolution independence in OS X. Maybe even in 10.9, in 2017? At least if they haven't dumped the whole of OS X by then.
Sadly, Windows 7 has had 10-bit color support since 2009. I guess OS X will never again be "the most advanced OS in the world", at least in graphics tech.
If you can't see a difference between image two and image three in my post, that's fine.
Wow, lots of confusion, outdated misconceptions, oversimplifications and overcomplications...
Although this isn't exactly rocket science, here are a few opinions (I'm not really a professional expert on the subject, but I usually want to understand the ideas behind certain decisions):
Myth#1: Native resolution
An LCD can look sharp at resolutions other than native.
If the other resolution is an exact integer division of the native one, there's no interpolation between pixels and therefore no additional blur.
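To illustrate (just my own back-of-the-envelope sketch; the panel width and target resolutions are only example numbers): with an exact integer division, every logical pixel lands on a whole block of physical pixels, so nothing has to be blended.

```python
# Rough illustration (not anything Apple ships): why an exact integer division
# of the native resolution needs no interpolation.
# On a hypothetical 2880-pixel-wide panel, a 1440-wide image maps each logical
# pixel onto a clean 2x2 block of physical pixels; 1680 or 1920 wide does not.

def physical_per_logical(native_px, target_px):
    """How many physical pixels one logical pixel covers along one axis."""
    return native_px / target_px

for target in (1440, 1680, 1920):
    ratio = physical_per_logical(2880, target)
    if ratio.is_integer():
        print(f"{target}: each logical pixel -> exact {int(ratio)}x{int(ratio)} block, no blending")
    else:
        print(f"{target}: ratio {ratio:.3f}, logical pixels straddle physical ones -> interpolation blur")
```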
Myth#2: Screen PPI defines how sharp it will look
Just like you can't tell a camera sensor's sensitivity from its pixel density, you can't judge a screen's acuity by PPI alone. The thickness of the grid relative to the pixel size varies. A pixel's edges can be sharp or blurred. Screens can have different coatings. All of these variables affect how sharp something will look. If the grid between pixels is thin enough or has soft enough edges, it doesn't matter how big the pixels are. In that case exact multiples will look identical (1x1 vs. 2x2).
With retina displays, you shouldn't be able to see individual pixels, so you certainly shouldn't be able to see the grid between the pixels or sub-pixels.
Myth#3: Scaling/interpolation is different on a Retina display
In the CRT era, tubes had a shadow mask / aperture grille that was finer than the displayed resolution. Because of this and the analog beam, every resolution was "scaled" and therefore "equal".
Fixed-pixel displays, on the other hand, have only been sharp at exact integer divisions of native.
This changes when we go beyond "retina display" densities (what will Apple call this; Retina+, Retina Pro or what?), where a line pair (2 pixels) looks like one dot (hmm, should we include the Kell factor here?). Then it doesn't matter whether the dot is 1, 1.5 or 2 pixels wide; they all look the same.
At that point a fixed-pixel display starts to behave like a CRT, and you no longer need exact multiples of the last-generation display.
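Here's the rough math behind that claim (my own sketch: it assumes the usual ~1 arcminute rule of thumb for visual acuity, ignores the Kell factor, and the PPI figures and 22" viewing distance are just ballpark numbers):

```python
# Back-of-the-envelope check of the "you can't resolve the pixels anymore" idea.
# Assumes ~1 arcminute of visual acuity (a rule of thumb, not a hard limit).
import math

def retina_ppi_limit(viewing_distance_in, arcmin=1.0):
    """PPI at which one pixel subtends the given visual angle at that distance."""
    angle_rad = math.radians(arcmin / 60.0)
    pixel_size_in = 2 * viewing_distance_in * math.tan(angle_rad / 2)
    return 1 / pixel_size_in

# Approximate panel densities, viewed from roughly 22 inches.
limit = retina_ppi_limit(22)
for name, ppi in [('15" 1440x900 (~110 PPI)', 110), ('15" 2880x1800 (~220 PPI)', 220)]:
    verdict = "beyond" if ppi > limit else "below"
    print(f"{name}: {verdict} the ~{limit:.0f} PPI limit at 22 inches")
```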
The output bus and bandwidth to the monitor have nothing to do with the actual processing unit's ability to push out the pixels. It's one bottleneck (which we solved around 2008 with DisplayPort on Macs).
Was dual-link DVI a bottleneck (before DP)?
Was 8 Gbit/s a bottleneck?
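For reference, the napkin math (my own sketch; the link figures are the commonly quoted effective data rates, and the ~10% blanking overhead is a rough estimate for reduced-blanking timings):

```python
# Quick sanity check on whether ~8 Gbit/s links were really the bottleneck.
# Raw video rate = width * height * bits per pixel * refresh rate; reduced-
# blanking timings add very roughly another 10% on top of the visible pixels.

def gbit_per_s(width, height, bpp=24, hz=60, blanking=0.10):
    return width * height * bpp * hz * (1 + blanking) / 1e9

# Commonly quoted effective data rates in Gbit/s (approximate).
links = {"dual-link DVI": 7.92, "DisplayPort 1.1": 8.64, "DisplayPort 1.2": 17.28}

for name, w, h in [("2560x1600", 2560, 1600), ("2880x1800 'Retina'", 2880, 1800)]:
    need = gbit_per_s(w, h)
    fits = ", ".join(link for link, rate in links.items() if rate >= need) or "none of these"
    print(f"{name} @ 60 Hz, 24 bpp: needs ~{need:.1f} Gbit/s -> fits on: {fits}")
```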
The practicality issue is not with the consumers, but with the manufacturers.
Apple would have to pay gigantic premiums for these custom panels to be made, and at a non-standard resolution too.
It's much simpler for Apple to buy 1080p or 1600p 15" panels that are already in production than to use this Retina Display standard.
Yep, I can imagine the loud voices on Apple's executive floor: "At last we're making record-breaking profits on the MBP, with a display that costs us only $30, and now they're demanding a $300 display! Let's drop the Macs and focus on iOS devices!"