Quoted for truth. Not to mention, when Apple first came up with the name Retina display, they already said that it's about the distance to the eye. They never defined it using a fixed ppi value.
Can you verify this? I distinctly remember Apple quoting a ppi figure of over 300 and saying the limit of the human eye is 300 ppi. I just googled it, and the iPad 2 announcement is the only place I can find a mention of perceived resolution being a function of viewing distance. This makes sense of course, and lots of us knew this to be true right from the beginning, but Apple never explicitly stated it before. If you look in the other sections of the forum you will see people throwing the "Retina" word around to go with their speculations of a resolution bump for their MacBook Air, Pro or what not, because the definition was never made absolutely clear. Not putting blame on Apple, the product delivers for sure, but the "Retina Display" as a brand took on a life of its own.
My theory is that Apple needed to reveal the truth about viewing distance to explain why the iPad can get away with a ppi figure below 300 when the average consumer thinks he/she needs over 300 ppi for crisp text. No big deal, but I think they should have been more specific from the beginning.
Steve Jobs said: It turns out there’s a magic number right around 300 pixels per inch, that when you hold something around to 10 to 12 inches away from your eyes, is the limit of the human retina to differentiate the pixels.
So, yes, right from the very beginning, viewing distance was incorporated as part of Apple's definition of "retina display".
Which is exactly the definition of Retina display. That's the mathematics (I'm sure just a small part of it) behind determining what density is needed to achieve a retina effect from a given distance.
In practical terms, "looking the same as the iPhone" IS retina. If you can't see the pixels, it's a retina display.
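For reference, here's a sketch of the geometry the thread keeps circling around (the symbols a, h and d follow the later posts; treating one arc minute as the 20/20 limit is the assumption being argued over below, not an Apple specification):

$$ a = 2\arctan\!\left(\frac{h}{2d}\right) $$

where h is the pixel pitch, d the viewing distance, and a the visual angle one pixel subtends. Set a to the acuity limit and solve for h, and 1/h is then the pixel density at which individual pixels should stop being resolvable.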
Are you serious? LOL.
That has nothing to do with any definition. You don't define a screen based on the distance you see it from. A 30x30 screen is Retina if you look at it from 20 meters.
Nothing, also, tells you that you can't see the pixels. How's that? Is there some magic there, or mere speculation without scientific proof?
Notice also that no value besides the distance is given. You know, arctan changes with h, and so does that "a" in the formula.
So, that's totally without context and utterly useless. What it does is astonish the uneducated masses with a nonsensical formula.
Apple created the term "Retina display"; of course you can define a screen as "Retina" if you create the definition yourself!
Yes, I know this. I was telling the guy who forgot that the "new" definition of Retina display takes viewing distance into account and was basically explaining to me that, simply, "more pixels equals better... duh."
You and I both know that more pixels equals AMAZING at a close distance, as do HD televisions when viewed at an appropriate distance. The other guy didn't. Go tell him.
There's no definition of Retina, so I can apply that to anything I want.
Did I say something different?
Apple has always made the case that they call the displays "Retina" because pixel size is smaller than the minimal discernable distance for most people at the typical viewing distance. They didn't need to put a formula that most of us already know / could find in a textbook in thirty seconds up on a presentation screen to go any farther with that.
Why would you define a screen without taking viewing distance into consideration? Designing a Jumbotron billboard or building-side display at 300DPI would be a complete waste of time and energy, because no one is going to climb on the side of the building so they can look at it from a meter away.
I don't necessarily think that 260DPI or 300DPI is "truly" the highest resolution one can see, simply based on experiencing things like the difference in visual quality between text printed at 300 (first gen laser printers), 600 (more modern lasers), and 1200+ DPI (e.g., linotype / magazine print). There's clearly a difference that's visible there above 300DPI. OTOH it's subtle, and as a consumer, I probably wouldn't make it high on my priority list.
I like the high-res displays. I'm excited that Apple's expanding their use (I think the widely spread conjecture that Apple will start rolling out Macs with "Retina" displays is true). Will there be a good reason to go to an even higher DPI later? Maybe. I don't see why it's such a big deal.
If it were irrelevant, they wouldn't need to define retina at different distances and they would use the old definition of 300dpi.
They have defined the term based on typical usage of the screen, which is not wrong. And btw, 20/20 vision is defined by being able to differentiate two objects (pixels) separated by 1 arc minute; the further away you get from the screen, the bigger those pixels can be. And that's with 20/20 vision.
Notice how a jumbotron uses big LEDs that could be the size of your fist up close, but looked at from a distance it looks fine.
The fact is that just because Retina isn't a beep boop measurement of a specific technical quantity doesn't make it a "meaningless marketing term for the masses".
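To put rough numbers on that (a back-of-the-envelope sketch using the one-arc-minute figure above, not anything Apple published), the largest pixel that still blends in at viewing distance d is about

$$ h \approx 2d\,\tan\!\left(\tfrac{1'}{2}\right) \approx 0.00029\,d $$

so at 30 m a "pixel" can be roughly 9 mm across and at 100 m about 3 cm, while at 15 inches it has to shrink to about 0.11 mm, i.e. roughly 230 ppi.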
That has nothing to do with the "Retina" display.
Notice that perfect vision 20/20 has nothing to do with pixels (they are not lines, so that doesn't relate).
You are also quite confused: two objects are not separated by an arc minute; they can at most be separated by an angle (or visual angle, as you wish) of an arc minute.
Add to this that absolutely no value in the formula is provided (so how can you say anything relates to 20/20?), and there you have the nonsensical mass attraction.
20/20 has to do with the fact that you can discern features separated by the visual angle of one arc minute.
I'm not confused at all; it's implied they're separated by the angle of an arc minute at the distance Apple has used (10" for the iPhone, 15" for the iPad). Notice how, from the point of origin of two lines separated by that angle, the further away you go the bigger the separation between the lines becomes.
How can we say it relates to 20/20? Because they said so when introducing the iPhone 4.
For someone with 20/20 vision you have a = 1 arc minute and you have d (15 inches); solve for h. How is that hard to understand?
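Just to make that arithmetic concrete, here is a quick sketch (not Apple's published method; the function and variable names are mine, and the one-arc-minute / 20/20 assumption is exactly the one being argued about above):

```python
import math

ARC_MINUTE = math.radians(1 / 60)  # one arc minute, in radians

def retina_ppi(distance_in, acuity_arcmin=1.0):
    """Smallest pixel density (ppi) at which one pixel subtends
    `acuity_arcmin` minutes of arc when viewed from `distance_in` inches."""
    a = acuity_arcmin * ARC_MINUTE          # visual angle per pixel
    h = 2 * distance_in * math.tan(a / 2)   # pixel pitch in inches (the formula solved for h)
    return 1 / h

for label, d in [("iPhone held at 10 in", 10),
                 ("iPhone held at 12 in", 12),
                 ("iPad held at 15 in", 15)]:
    print(f"{label}: ~{retina_ppi(d):.0f} ppi")
# iPhone held at 10 in: ~344 ppi   (the iPhone 4 ships at 326 ppi)
# iPhone held at 12 in: ~286 ppi
# iPad held at 15 in: ~229 ppi     (the third-generation iPad ships at 264 ppi)
```

The threshold scales linearly with viewing distance, which is why the iPad can sit below 300 ppi and still clear the bar at its larger typical distance.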
Apparently when someone uses the term "retina", to LostSoul80's ears, it's the same as saying "qzxwerty" or "blaplabla" because it's a meaningless term (oh, it's meaningless technically speaking - which just means it doesn't meet LostSoul80's arbitrary semantic standards).
Or maybe LostSoul80 thinks it's just a buzzword Apple has concocted to mean "this thing is good, you should like it" (like a Manchurian Candidate phrase for consumers), but ultimately "retina" doesn't really indicate any conceivable attributes whatsoever about the product in question.
Neither interpretation is very satisfying, so who knows what he's going on about.
Do five rolleyes in one post indicate "take me seriously"? I'm not sure of the merit of continuing a discussion with someone who approaches conversation that way.
Not features but letters, and letters are not composed of dots (or pixels), so you can't apply that logic in any way.
Unless you re-define the meaning of 20/20 vision.
That wasn't true, of course. First, the definition of 20/20 doesn't include dots. Second, 20/20 does not refer to perfect vision, because perfect vision doesn't exist: it just states that a person has a relatively high acuity compared to others. 20/20 is not in fact the maximum.
Third: Apple's statement was totally false. Imprecise, and the calculation doesn't match either.
You don't "solve" that formula in that way: its meaning is to return the visual angle of two objects when you know the distance between them and the distance of the observer. You don't do the opposite -- clearly you're confused.
Also, try to "solve" it and you'll find no arc minute.
You're just being obtuse.
Yes, "Retina" is like "uh no pixels". That applies to most tech gadgets if viewed at the right distance (hence why we can't have "Retina displays"). It's a market term to astonish the mass, but indicates in reality a relatively high pixel density.
Let's see.
From 1862: Hermann Snellen published his famous letter chart. His most significant decision was not to use existing typefaces but to design special targets, which he called optotypes. He based them on a 5x5 grid. This was crucial because it provided a physical standard measure for reproducing the chart. Snellen defined standard vision as the ability to recognize one of his optotypes when it subtended 5 minutes of arc; thus the optotype can be recognized only if the person viewing it can discriminate a spatial pattern separated by a visual angle of 1 minute of arc (one element of the grid).
From 1888: Edmund Landolt proposed the Landolt C, a symbol that has only one element of detail and varies only in its orientation. The broken ring symbol is made with a "C"-like figure in a 5x5 grid that, in the 20/20 optotype, subtends 5 minutes of arc and has an opening (oriented to the top, bottom, right or left) measuring 1 minute of arc. This proposal was based on the fact that not all of Snellen's optotypes were equally recognizable. This chart is actually the preferred visual acuity measurement symbol for laboratory experiments but has gained only limited acceptance in clinical use.
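For a concrete figure (my own arithmetic, assuming the standard 20-foot test distance): a 20/20 optotype subtending 5 minutes of arc at 20 ft (6096 mm) is about

$$ 6096\ \text{mm} \times \tan(5') \approx 8.9\ \text{mm} $$

tall, so each 1-arc-minute element of its 5x5 grid is roughly 1.8 mm.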
Clearly, 1 minute of arc came out of my ass. I'm not redefining the definition of 20/20 vision.
No one is talking about perfect vision; people with better than 20/20 vision would need a higher ppi to not notice the pixels.
If *YOU* can't solve that formula for h with the parameters for a and d that Apple has given you, then that's *YOUR* problem, not mine. You know what it'll give you? The height of the pixel. Pixels are square, so it'll also give you the width of the pixel. And what does that give you? A means to calculate ppi.
The definition of a Retina display *is* a moving target. Since we don't all have the same vision, it depends on many factors, so Apple decided to reduce most of them and tell you they're assuming 20/20 vision and a specific distance of use.
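To illustrate how those assumptions move the number (a sketch under the same formula, not an Apple figure): at the 15-inch distance, halving the resolvable angle doubles the required density,

$$ \mathrm{ppi}(a,d) = \frac{1}{2d\,\tan(a/2)}, \qquad \mathrm{ppi}(1',\,15\ \text{in}) \approx 229 \quad\text{vs.}\quad \mathrm{ppi}(0.5',\,15\ \text{in}) \approx 458 $$

so someone who resolves half an arc minute would want roughly twice the pixel density before the screen looks "Retina" to them.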