So Retina is the output of an equation that looks like: Retina = PPI/Distance from face? The reason it's BS is because designers can't know exactly how far you're holding the object from your eyes.
No, but they can determine the typical range(s) at which the display is viewed, and calculate the required PPI based on that and typical visual acuity. Strangely enough, that's exactly how 'retina display' has been defined since Steve Jobs first described it on stage at the iPhone 4 announcement.
----------
Ok, so if that is the case, explain how things will be the same size on a "retina" MBA screen that is 11" at 2732x1536 versus a "retina" 15" MBP at 2880x1800? They both have different DPI, but the elements are just pixel doubled.
HiDPI mode is just pixel doubling, just like on iPad and iPhone. The difference is that iPad and iPhone only have two display sizes. Macs have 6 display configurations among just the laptops (11", 13" Air (1440), 13" Pro (1280), 15" (1440), 15" (1680), 17" (1920)). The DPI on all of those is different, which means pixel doubling will make an element larger on one screen than it is on another, just like we have right now with non-HiDPI.
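You can check the "different physical size per panel" point with some quick arithmetic. The diagonals below are the approximate real panel specs, and the 44-pixel element is just a made-up stand-in for a typical widget drawn at 2x:

```python
import math

# Approximate 1x panel specs for the laptop configurations mentioned above.
panels = [
    ('11" Air',        11.6, 1366,  768),
    ('13" Air',        13.3, 1440,  900),
    ('13" Pro',        13.3, 1280,  800),
    ('15" Pro',        15.4, 1440,  900),
    ('15" hi-res Pro', 15.4, 1680, 1050),
    ('17" Pro',        17.0, 1920, 1200),
]

for name, diag, w, h in panels:
    ppi = math.hypot(w, h) / diag      # pixel density of the 1x panel
    hidpi_ppi = ppi * 2                # pixel doubling doubles the density
    element_in = 44 / hidpi_ppi        # hypothetical 44-physical-px widget
    print(f'{name:14s} {ppi:6.1f} ppi (1x) -> element is {element_in:.3f} inches')
```

The same 44-pixel element comes out noticeably larger on the 15" (1440) panel than on the 11" Air, which is exactly the complaint.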
My point is that on a 1440x900 screen at 15", interface elements are HUGE. Why should that become the "standard"? On iOS, having standard element sizes is great because they are based on touch-target size, and most people have similarly sized fingertips. On a Mac you use a mouse, so preferred screen resolution varies from person to person, since a mouse pointer is always a 1px target.
True "resolution independence" would mean that the user can scale interface elements to whatever size they want. So if they have bad eyesight, they can make toolbars 1" tall. If they don't, then they don't have to. HiDPI does not give us this, it just gives us crisp text and interface elements that are the same size as they are now, still dependent on DPI.
HiDPI is a mid-point compromise on the road to resolution independence.
Once you hit the point where a user can't distinguish individual pixels, it becomes easier to manage resolution independence, because you can have settings which control how big a 'display unit' is (in pixels) and have software specify sizes in 'display units' (rather than pixels). Unfortunately, software has historically been written with widget sizes specified in pixels, so there will be a moderately painful transition period, which the HiDPI stage will ease.
Now that we're seeing pixels too small to visibly distinguish, we have more size options for any given widget where it remains readable or otherwise visually clear. This means that you can configure a system such that your 'display unit' is 1, 2, 3 or more pixels in size, and the system can do the necessary math to convert your buttons/text/etc to pixel dimensions.
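A minimal sketch of that 'display unit' idea, with made-up names (no real toolkit API implied): software asks for sizes in abstract units, and a per-display scale setting does the conversion to physical pixels.

```python
# Hypothetical illustration of the 'display unit' scheme described above:
# the application never touches physical pixels directly.

def units_to_pixels(size_in_units: float, pixels_per_unit: int) -> int:
    """Convert an abstract display-unit size to physical pixels."""
    return round(size_in_units * pixels_per_unit)

# The same 20-unit button at three density settings:
for scale in (1, 2, 3):
    print(f"scale {scale}x: 20 units -> {units_to_pixels(20, scale)} px")
    # -> 20 px, 40 px, 60 px
```

The point is that only the scale setting changes per display; the application's "20 units" stays the same everywhere.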
I'm guessing 2015-2018 is when we'll actually hit the resolution independence stage, largely due to the inertia of legacy software not being updated to take advantage of it.
----------
Yeah, anyone else notice that for the larger screens, Intel has recommended too high of a DPI?
I mean, if 300 dpi is good for viewing from 12-16", shouldn't 150 dpi be sufficient for large displays viewed from 24-30", instead of the recommended 220 dpi Intel lists?
I suspect they are confusing things by combining retina capability with the desire for "more elements on a screen". Either way, people have a minimum desired element size which is based primarily (?) on a couple of factors - viewing distance and their own eyesight - and to a lesser extent on dpi and contrast. (Assuming "decent" resolution, not old-style VGA.)
When the physical size of the screen is introduced as a variable, people can fit more stuff on a larger screen, but if the elements get too small or too large, they're not happy.
There should be a notion of a minimum font size or minimum element size for applications and OS elements, and items get larger from there. The complication is that some other items should scale, but perhaps not all.
I don't think they're over-specifying the resolution. I think they're simply acknowledging that while most people 'typically' sit within that range of their desktop displays, they also often lean in more closely. For example, at work, I generally find myself at one of 3 ranges (roughly 16", 24" or 36") depending on exactly what I'm doing (examining something, typing normally, reading/planning code). Specifying a higher-than-absolutely-necessary resolution for a desktop means that people who lean in will still get the benefit of the 'retina' resolution.
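Both points can be sanity-checked from visual acuity directly. Assuming 20/20 vision resolves roughly 1 arcminute, a pixel is indistinguishable once it subtends less than that angle, which gives a required density of about 3438/distance (in inches):

```python
import math

# Required pixel density for a pixel to subtend <= 1 arcminute
# (a common stand-in for 20/20 acuity) at a given viewing distance.
def retina_ppi(distance_inches: float) -> float:
    one_arcmin = math.radians(1 / 60)
    return 1 / (distance_inches * math.tan(one_arcmin))

for d in (12, 16, 24, 30, 36):
    print(f'{d}": {retina_ppi(d):.0f} ppi')
```

This comes out to roughly 287 ppi at 12" and 143 ppi at 24", so 150 dpi really would suffice for someone sitting back at 24-30" - but leaning in to 16" calls for about 215 ppi, which is close to the 220 dpi figure.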
----------
As much as I would love for these to be in the 2012 lineup, I don't know how Retina displays will work out for my design process.
...
What happens if I design my graphics or website on the new retina display, and everything looks great? How will I know how it looks on regular screens?
That's easy. Just change your resolution from 'WxH HiDPI' to 'WxH' and check out your design. (Or keep a regular, non-HiDPI, display around for testing until it becomes unnecessary.)
Another concern is that GPUs don't seem to be technologically advanced enough to drive these resolutions. How does Apple intend to maintain the current graphics performance with these new displays?
Even the latest and greatest from ATI/nVidia won't be able to handle a 30" Retina display.
It'll be interesting to see how these all get implemented. I hope GPUs won't become the next bottleneck and further set back the enthusiast/gaming community in the Mac environment.
It's really only a strain for a modern GPU to push these kinds of pixels around when it also has to do 3D rendering. GPUs have been capable of pushing these sorts of resolutions for 'normal' desktop applications for about 15 years at this point. At one point, I had an 8MB Matrox video card which was capable of these resolutions, if I'd had a display that could do it back then. My first computer had a 14" monitor (12.7" visible) that could push 1280x1024 at 50Hz; with later systems, the monitors became both physically larger and capable of higher resolutions. It wasn't until the HDTV craze really hit that 1920x1080 was considered 'high resolution' for a desktop system.
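For what it's worth, the raw bandwidth to scan out a 2D desktop at these resolutions is modest by modern GPU standards. A rough back-of-the-envelope at 32-bit color and 60 Hz (and assuming a "30-inch retina" means 2560x1600 pixel-doubled to 5120x3200):

```python
# Bytes per second just to scan out a framebuffer - this ignores 3D
# rendering entirely, which is where the real GPU load lives.
def scanout_bytes_per_sec(width: int, height: int, hz: int = 60,
                          bytes_per_px: int = 4) -> int:
    return width * height * bytes_per_px * hz

displays = [
    ("1080p", 1920, 1080),
    ('15" retina (2880x1800)', 2880, 1800),
    ('hypothetical 30" retina (5120x3200)', 5120, 3200),
]
for name, w, h in displays:
    gb = scanout_bytes_per_sec(w, h) / 1e9
    print(f"{name}: {gb:.2f} GB/s")
```

Even the hypothetical 30" case needs only a few GB/s of scan-out, a small fraction of the memory bandwidth of a contemporary discrete GPU; the real question is 3D performance at those pixel counts.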