Exactly. Not to mention that the 27" is already better than HD quality. How much more do you really need? I'm sure we'll see a retina display iMac someday, when it's cost-effective. But that probably won't be for a long time, and it's hardly necessary. So why even wait? I think being productive now is worth much more. But hey, that's just me.
I'd like to see the power draw on the 21.5" with Iris. I really like seeing the Iris Pro in the base model. That is plenty for most people.
Looks like a nice setup. With 4K monitors still costing $3K+, it will probably be a year or so before we see a 27" retina monitor. I've been using the 1440p 27" size for a while and love it. I doubt I would need 4K, especially since I sit ~3-4 feet away.
Yes, it all depends on how far away from your computer you sit (I sit 40-50 cm from a laptop and 60-70 cm from a desktop).
http://isthisretina.com/ has a helpful calculator that you can use to see if a device fits Apple's retina definition of being unable to discern an individual pixel (a rough sketch of the math follows the list below).
iMac:
1920x1080 at 21.5" requires 86 cm to be considered retina
2560x1440 at 27" requires 81 cm to be considered retina

Theoretical Retina iMac:
3840x2160 at 21.5" requires 43 cm to be considered retina
5120x2880 at 27" requires 41 cm to be considered retina

MBP:
1280x800 at 13.3" requires 71 cm to be considered retina
1440x900 at 15.4" requires 79 cm to be considered retina

rMBP:
2560x1600 at 13.3" requires 38 cm to be considered retina
2880x1800 at 15.4" requires 41 cm to be considered retina
So a rMBP makes sense; a Retina iMac, however, may be over the top: because you sit further back when using a desktop, a slight resolution increase would be sufficient.
But this definition of retina is something Apple came up with ("300 ppi at 10-12 inches is retina"); I believe if you do the math from one arc minute it comes out to around 320-360 ppi, which would increase the distances listed above.
Furthermore, these calculations are all based on 20/20 vision being normal, but studies have shown that normal vision is closer to 20/16 or 20/12, which would increase the distances listed above even further.
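You can run the same small-angle math in reverse to see how the criterion moves the numbers (again my own rough sketch; I'm taking 20/12 vision as roughly 0.6 arc minutes, per the Wikipedia excerpt quoted below, so the exact figures are only illustrative):

    import math

    def required_ppi(distance_in, arcmin=1.0):
        # PPI at which one pixel subtends `arcmin` arc minutes
        # when viewed from `distance_in` inches away.
        return 1.0 / (distance_in * math.tan(math.radians(arcmin / 60.0)))

    print(required_ppi(12), required_ppi(10))  # ~287 and ~344 ppi for 20/20 at 12" and 10"
    print(required_ppi(10, arcmin=0.6))        # ~573 ppi for ~20/12 vision at 10"

Sharper acuity means a smaller arc-minute threshold, which raises the required PPI, or equivalently pushes the retina distances in the list further out.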
Additionally, consider this paragraph from Wikipedia:
Vernier acuity measures the ability to align two line segments. Humans can do this with remarkable accuracy, it is a hyperacuity. Under optimal conditions of good illumination, high contrast, and long line segments, the limit to vernier acuity is about 8 arc seconds or 0.13 arc minutes, compared to about 0.6 arc minutes (20/12) for normal visual acuity or the 0.4 arc minute diameter of a foveal cone. Because the limit of vernier acuity is well below that imposed on regular visual acuity by the "retinal grain" or size of the foveal cones, it is thought to be a process of the visual cortex rather than the retina. Supporting this idea, vernier acuity seems to correspond very closely (and may have the same underlying mechanism) enabling one to discern very slight differences in the orientations of two lines, where orientation is known to be processed in the visual cortex.
So since humans are extremely sensitive to misaligned lines, aliasing remains noticeable even past the point where you can't differentiate individual pixels.
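To put that sensitivity in perspective, here's a rough sketch comparing the ~8 arc-second vernier limit from the excerpt above to the pixel size of a 27" 1440p panel; the 80 cm viewing distance and ~109 ppi are my own assumed figures:

    import math

    def subtended_mm(distance_cm, arcsec):
        # Physical size subtended by `arcsec` arc seconds at `distance_cm`.
        return distance_cm * 10.0 * math.tan(math.radians(arcsec / 3600.0))

    print(subtended_mm(80, 8))  # ~0.031 mm vernier offset detectable at 80 cm
    # One pixel on a 27" 1440p panel is 25.4 / ~109 ppi ~= 0.23 mm,
    # so a detectable misalignment is roughly 7x smaller than a pixel.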
Here is a post that will help you get a grasp of the sizes of arc seconds and arc minutes:
http://darkskydiary.wordpress.com/2010/04/06/arcminutes-and-arcseconds/
(20/20 means: at 20 feet you see what the 'average' person sees at 20 feet.)