Discussion in 'iMac' started by FooArk, May 16, 2012.
Last year Apple released the 1080p iMacs. From my understanding that's the highest resolution, right?
No, that's just full HD. Displays now go above and beyond 1080p: last year's 27-inch iMac is 1440p, and the rumored 'retina' display would be almost double that.
1080p on a 21.5 inch screen is 103 pixels per inch. Pretty far from retina, I would say.
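For anyone curious how that figure falls out, here's a quick sketch of the arithmetic (assuming a standard 16:9 panel; the function name is just for illustration):

```python
import math

def ppi(h_px, v_px, diagonal_in):
    """Pixels per inch, from the pixel dimensions and the diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

print(ppi(1920, 1080, 21.5))  # ~102.5, i.e. roughly the 103 ppi quoted above
```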
How would you notice the difference between 1080p on a 21.5-inch iMac and Retina?
1080p is the highest resolution currently for HDTVs, but far from the highest resolution for computer displays. Even the 3rd gen iPad has a higher resolution than 1080p.
Isn't retina supposed to factor in your typical viewing distance for the display, as well as roughly double the resolution?
First, some background. The RETINA is a light-sensitive tissue lining the inner surface of the eye. It is on the retina that the eye creates an image of the world for you. The retina in the eye serves the same purpose as the film in a camera.
Second, the term. Retina DISPLAY is a term that Apple introduced to say that the screen is so packed with pixels that your retina cannot perceive the individual pixels, only the picture.
Third, the definition. (Someone else might be more accurate, but I'll try to keep it in layman's terms.) For a display to be a retina display you have to consider the typical viewing distance. The closer you typically sit to the screen, the more pixels are needed per inch of screen (referred to as ppi). Simply put, you need more ppi for an iPhone than for a PC because you typically use the phone at a much shorter distance. The iPhone, for example, has 326 ppi and is meant to be used from about 12 inches away.
Hope this helps.
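To put numbers on the distance point above: a common rule of thumb is that 20/20 vision resolves about 1 arcminute, so the ppi needed for "retina" falls directly out of the viewing distance. A rough sketch (the 1-arcminute acuity figure is a textbook assumption, not Apple's published spec):

```python
import math

def retina_ppi(distance_in, arcmin=1.0):
    """PPI at which one pixel subtends `arcmin` arcminutes at `distance_in` inches."""
    pixel_in = distance_in * math.tan(math.radians(arcmin / 60))
    return 1 / pixel_in

print(retina_ppi(12))  # ~286 ppi needed at 12"; the iPhone's 326 ppi clears it
print(retina_ppi(20))  # ~172 ppi needed at a 20" desktop distance
```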
Can anyone link me to a PC with a display (or just any consumer display) that has a resolution double that of the 27-inch iMac?
No, because there is none.
Sharp is said to have started production of high resolution panels. If there will be retina displays, these are the panels that Apple is most likely to use.
My post on 'retina'.
Nice graph. I like it a lot.
Excellent, well put. It shows that Macs are not retina, but not massively far off. Doubling the resolution would put them well beyond.
Doubling the resolution is one way to achieve retina. The other is resolution independence, where you can scale the content to whatever size you want and it still works. This would be cheaper hardware-wise, but much more expensive in terms of software development.
But some of the rumours seem to suggest there could be an intermediate solution. By that I mean the graphics could be scaled up in steps, rather than going all the way to double. A bit like you can enlarge the text in Safari in steps. This allows you to make your own size / area trade offs.
I am likely wrong though. I was reading between the lines.
Can I just poke my head in to say I do not want retina.
Complete waste of time in my view. Unnecessary load on the CPU and graphics subsystems for zero or near zero gain and definitely increased cost. Why bother?
I vastly prefer the non-retina AMOLED display on my Samsung Galaxy to the retina display on my iPhone, for example.
And on a 27" screen, for me, pixels are already well small enough at 2560x1440. I hope they do not add any more pixels.
[Image comparison: 1080p 21.5" iMac vs. 4K (four times the pixels of 1080p) / Retina 21.5" iMac]
As retina means that the pixels are so small that your eye wouldn't see a difference if you made them even smaller, no, 1080p on a 21.5" display with a viewing distance of about 3 feet isn't retina. However, 1080p is retina on a 40" TV with a viewing distance of 15-20 feet.
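Those two cases can be sanity-checked against the 1-arcminute acuity rule of thumb (the distances here are my own assumptions for "typical", not anything official):

```python
import math

def ppi(h_px, v_px, diag_in):
    # Actual pixel density of the panel
    return math.hypot(h_px, v_px) / diag_in

def retina_ppi(distance_in):
    # Density needed for one pixel to subtend 1 arcminute (~20/20 acuity)
    return 1 / (distance_in * math.tan(math.radians(1 / 60)))

# 21.5" 1080p at a 24" desktop distance: ~102 ppi vs ~143 needed -> not retina
print(ppi(1920, 1080, 21.5), retina_ppi(24))
# 40" 1080p TV at 15 feet (180"): ~55 ppi vs ~19 needed -> comfortably retina
print(ppi(1920, 1080, 40), retina_ppi(180))
```

Interestingly, at a full 3 feet the simple 1-arcminute rule says the 21.5" panel would just about squeak by (~95 ppi needed), so the "not retina" verdict really hinges on sitting closer than that.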
It really depends on which horizontal viewing angle you "use". The closer you are to the display, the more of your (horizontal) field of view it occupies, in degrees. Now, the 27" iMac has a display that is 23.54" wide. I'd guess the closest one will sit to it is 24" (2 feet).
In that case, the display occupies 52 (52.248) degrees of your horizontal field of view. Simplifying things a bit, on today's iMac you get 49 pixels per degree.
Now, if someone can tell me what is the closest viewing distance for iPad, we'd know how many pixels per degree you get with iPad at that distance.
Sammich, I'm looking at you.
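The pixels-per-degree figure above can be reproduced like this (the 15" iPad viewing distance and the 7.76" panel width at the end are just my guesses, since that distance is exactly what the post is asking for):

```python
import math

def pixels_per_degree(h_px, width_in, distance_in):
    """Horizontal pixels per degree of visual angle at a given distance."""
    h_angle_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_px / h_angle_deg

print(pixels_per_degree(2560, 23.54, 24))  # ~49 px/deg, matching the post
print(pixels_per_degree(2048, 7.76, 15))   # ~71 px/deg, assuming a 15" iPad distance
```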
In the case of the iMac, doubling the resolution would take you well beyond what's necessary for a "retina" display. However, they could simply double the resolution in a sort of "virtual" desktop using HiDPI, and then scale it down to the actual display resolution using subpixel smoothing.
That will work for anything that doesn't require pixel accuracy; but since the display is at the "retina" limit, pixel accuracy doesn't have much meaning anyway.
A little math to illustrate what I mean...
current 27" iMac is 2560x1440.
Let's say a Retina iMac increases the DPI by 50% (not double).
So Retina 27" iMac = 3840x2160.
Now, HiDPI on the Retina iMac would make all the screen elements look larger than they do on the regular iMac, but without HiDPI they'd be tiny!
Here's how you keep the screen elements the same absolute size: basically output to a screen buffer that is resolution doubled at 5120x2880.
Then scale it down to 3840x2160 using subpixel rendering.
The result: HiDPI graphical elements are the same physical size, but as clear as allowed by the human retina. In theory it won't look any different than the actual double-DPI mode because the pixels are already too small anyway.
You still need all the graphics horsepower to render double DPI to the offscreen buffer, but you relax the hardware requirements of the LCD panel a bit.
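The buffer and panel sizes in that walkthrough, as plain arithmetic (the 1.5x panel is the post's hypothetical, not an announced spec):

```python
native = (2560, 1440)                        # current 27" iMac panel
panel = tuple(x * 3 // 2 for x in native)    # hypothetical +50% DPI Retina panel
buffer = tuple(x * 2 for x in native)        # HiDPI renders at exactly 2x
scale = buffer[0] / panel[0]                 # per-frame downscale factor

print(panel)   # (3840, 2160)
print(buffer)  # (5120, 2880)
print(scale)   # 1.333..., i.e. a 4:3 downscale to the panel
```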
I'd say the only thing that comes close is the photo monitors used by pro photographers or printers for image clarity, but they're unbelievably expensive, very rare, and not really for practical purposes.
The other main thing that has held this back is video card power. Driving 4,200 x 1,900 pixels while crunching 3D imagery at 24-bit colour is no easy task, and the two are only now converging in terms of power.
I have posted this before but it can't hurt to do it again. I have used the IBM "Big Bertha" display - was available about 10 years ago - out of production now, I'm sure. It was a 22" LCD display with 200 dpi (3840x2400). For me, the most impressive feature was the ability to display very small fonts. If you write a lot of code, this is a big advantage as you can display more code without having to scroll. At the time, IBM described it as matching the acuity of the human eye at typical viewing distances. The display was very expensive, as you can imagine. Going to 200 dpi is definitely worthwhile for computer monitors.
Why not skip the intermediate steps and just render resolution-independent images directly? The sizes of the HiDPI OS X icons look like mipmap chains and could indeed indicate that Apple is moving towards resolution independence. Actually, I'd like to see the entire OS API move to real-world dimensions instead of pixels. That would be a true revolution in software UI. Maybe in OS XI.
Retina isn't about any particular pixel density; it's about whether your eyes can perceive individual pixels at your viewing distance. For what it's worth, from my average distance of 2-3 feet I really can't make out individual pixels.
A small resolution increase would be nice, I suppose, but 4x the amount is just overkill. It's an unneeded luxury, and will tax the GPU more than it's worth.
A much better idea would have been to just implement resolution independence. I would have liked that feature a whole lot more, considering how much I use Command-zoom with my iMac.