Not to mention, when Apple first came up with the name Retina display, they already said that it's about the distance to the eye. They never defined it using a fixed ppi value.
 
Quoted for truth.

Apple always disclosed that their definition of a "retina display" is tied to the angular limit of typical human visual acuity, and that the corresponding pixel density is therefore necessarily dependent upon the distance the display is held away from the eye.

If Apple had arbitrarily decided that the iPad was "intended" to be held 30 inches away from the eye, then they could have used a screen with an even lower ppi, and it still could have met their originally disclosed definition of a "retina display".

There was no "new" definition of "retina display" presented at yesterday's announcement.
 
Not to mention, when Apple first came up with the name Retina display, they already said that it's about the distance to the eye. They never defined it using a fixed ppi value.


Can you verify this? I distinctly remember Apple quoting a ppi figure of over 300 and saying the limit of the human eye is 300 ppi. I just googled it, and the iPad 2 announcement is the only place I can find a mention of perceived resolution being a function of viewing distance. This makes sense of course, and lots of us knew it to be true right from the beginning, but Apple never explicitly stated it before. If you look in the other sections of the forum you will see people throwing the "Retina" word around to go with their speculations of a resolution bump for their MacBook Air, Pro or what not, because the definition was never made absolutely clear. Not putting blame on Apple, the product delivers for sure, but "Retina Display" as a brand took on a life of its own.

My theory is that Apple needed to reveal the truth about viewing distance to explain why the iPad can get away with a ppi figure below 300 when the average consumer thinks he/she needs over 300 for crisp text. No big deal, but I think they should have been more specific from the beginning.
 

Jobs' exact statement, at the keynote when the iPhone 4 was announced, was:
Steve Jobs said:
It turns out there’s a magic number right around 300 pixels per inch, that when you hold something around 10 to 12 inches away from your eyes, is the limit of the human retina to differentiate the pixels.

(emphasis mine)

So, yes, right from the very beginning, viewing distance was incorporated as part of Apple's definition of "retina display".
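For what it's worth, the quoted numbers are easy to sanity-check. A minimal sketch, assuming the 1-arc-minute acuity threshold discussed later in the thread and treating a pixel as a square of side 1/ppi inches (the viewing distances here are just commonly cited figures, not Apple-official):

```python
import math

def pixel_arcmin(ppi, distance_in):
    """Visual angle, in arc minutes, subtended by one pixel of a `ppi`
    display viewed from `distance_in` inches: a = 2*atan(h / (2*d))."""
    h = 1.0 / ppi                             # pixel pitch in inches
    a = 2 * math.atan(h / (2 * distance_in))  # visual angle in radians
    return math.degrees(a) * 60               # convert to arc minutes

print(round(pixel_arcmin(326, 12), 2))  # iPhone 4 at 12 inches
print(round(pixel_arcmin(264, 15), 2))  # iPad (3rd gen) at 15 inches
```

Both come out to roughly 0.88 and 0.87 arc minutes respectively, i.e. just under the one arc minute that 20/20 vision is conventionally said to resolve, which is consistent with Jobs' claim at those distances.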
 
Retina Display is an Apple marketing term. It's just an easier way to say high pixel density. Resolution has nothing to do with Retina Display except for being a component in the math used to come up with ppi/dpi. The new iPad has a higher resolution than a 1080p TV by a little over 1 million pixels. The definition of Retina Display is fuzzy at best. Ideally, to have something as good as print, you'd need 300 dpi. At 264 dpi, the iPad is better than all the Kindles produced, and most current 8+ inch Android tablets/RIM PlayBook/HP Slate etc... There have been higher pixel density devices made in the past, mostly by mobile phone manufacturers. Specifically, HTC, Nokia, and Samsung have all produced 300 ppi or higher devices. The significance of the Retina Display on the iPad is that it is the first display larger than 3.8 inches made with this kind of pixel density. It's what makes the iPad so special, and I'm surprised that it didn't raise the cost of the iPad.
 

I stand corrected, thanks for taking the time to reply. Still doesn't explain all the marketing confusion, I just wish things were simpler. Slapping a name on to it probably didn't help. Not that any of this matters really.
 
which is exactly the definition of retina display. That's the mathematics (I'm sure just a small part of it) behind determining what density is needed to achieve a retina effect from a given distance.

In practical terms, "looking the same as the iPhone" IS retina. If you can't see the pixels, it's a retina display.

Are you serious? LOL.

That has nothing to do with any definition. You don't define a screen based on the distance you view it from. A 30x30 screen is Retina if you look at it from 20 meters. :rolleyes:

Nothing, also, tells you that you can't see the pixels. How's that? Is there some magic there, or mere speculation without scientific proof? :rolleyes:

Notice also that no value besides the distance is given. You know, arctan changes with h, and so does that "a" in the formula. :rolleyes:

So, that's totally without context and utterly useless. What it does is astonish the uneducated masses with a nonsensical formula.
 
Are you serious? LOL.

That has nothing to do with any definition. You don't define a screen based on the distance you view it from. A 30x30 screen is Retina if you look at it from 20 meters. :rolleyes:

Apple created the term "Retina display"; of course you can define a screen as "Retina" if you create the definition yourself!

Nothing, also, tells you that you can't see the pixels. How's that? Is there some magic there, or mere speculation without scientific proof? :rolleyes:

http://blogs.discovermagazine.com/badastronomy/2010/06/10/resolving-the-iphone-resolution/

So, that's totally without context and utterly useless. What it does is astonish the uneducated masses with a nonsensical formula.

http://en.wikipedia.org/wiki/Visual_angle
 
Apple has always made the case that they call the displays "Retina" because the pixel size is smaller than the minimum separation discernible by most people at the typical viewing distance. They didn't need to put a formula that most of us already know / could find in a textbook in thirty seconds up on a presentation screen to go any farther with that.

That has nothing to do with any definition. You don't define a screen based on the distance you view it from. A 30x30 screen is Retina if you look at it from 20 meters. :rolleyes:

Why would you define a screen without taking viewing distance into consideration? Designing a Jumbotron billboard or building-side display at 300 DPI would be a complete waste of time and energy, because no one is going to climb the side of the building so they can look at it from a meter away.

I don't necessarily think that 260DPI or 300DPI is "truly" the highest resolution one can see, simply based on experiencing things like the difference in visual quality between text printed at 300 (first gen laser printers), 600 (more modern lasers), and 1200+ DPI (e.g., linotype / magazine print). There's clearly a difference that's visible there above 300DPI. OTOH it's subtle, and as a consumer, I probably wouldn't make it high on my priority list.

I like the high-res displays. I'm excited that Apple's expanding their use (I think the widely spread conjecture that Apple will start rolling out Macs with "Retina" displays is true). Will there be a good reason to go to an even higher DPI later? Maybe. I don't see why it's such a big deal.
 
Apple created the term "Retina display"; of course you can define a screen as "Retina" if you create the definition yourself!

There's no definition of Retina, so I can apply that to anything I want.
Did I say something different? :rolleyes:


Nothing to see there. I want scientific proof, not an undocumented article claiming otherwise. Utterly useless. :rolleyes:

Also: as I said, "a" means nothing, as nothing is given. You can call it visual angle (explain what Apple intended by naming it "a", please) or dog; it doesn't change a bit of what I said in my post before.

Next time you attempt to reply, avoid inserting links and nothing else. Especially if they turn out to be useless and not to support what you might be wanting to say. :rolleyes:
 
Yes, I know this. I was telling the guy who forgot that the "new" definition of retina display takes viewing distance into account, and who was basically explaining to me that "more pixels equals better... duh."

You and I both know that more pixels equals AMAZING at a close distance, as do HD televisions when viewed at an appropriate distance. The other guy didn't. Go tell him.

I was sleepy, my mistake :p

Are you serious? LOL.

That has nothing to do with any definition. You don't define a screen based on the distance you view it from. A 30x30 screen is Retina if you look at it from 20 meters. :rolleyes:

Nothing, also, tells you that you can't see the pixels. How's that? Is there some magic there, or mere speculation without scientific proof? :rolleyes:

Notice also that no value besides the distance is given. You know, arctan changes with h, and so does that "a" in the formula. :rolleyes:

So, that's totally without context and utterly useless. What it does is astonish the uneducated masses with a nonsensical formula.

They have defined the term based on typical usage of the screen, which is not wrong. By the way, 20/20 vision is defined by being able to differentiate two objects (pixels) separated by 1 arc minute; the further away you get from the screen, the bigger those pixels can be. And that's with 20/20 vision.

Notice how a jumbotron uses big LEDs, which could be the size of your fist up close, but when viewed from a distance it looks fine.
 
There's no definition of Retina, so I can apply that to anything I want.
Did I say something different? :rolleyes:

What are you going on about? "Retina" isn't a meaningless arbitrary term. Your calculator's screen doesn't warrant the name "retina" and the iPad's does. People have given plenty of adequate definitions throughout the thread; they roughly summarize as 'a screen with a dpi which, when viewed in a normal use case, makes individual pixels indiscernible'. You can quibble over the details of such a definition, but the fact is that just because Retina isn't a beep boop measurement of a specific technical quantity doesn't make it a 'meaningless marketing term for the masses' or whatever your brain is trying to gin up.
 
Apple has always made the case that they call the displays "Retina" because the pixel size is smaller than the minimum separation discernible by most people at the typical viewing distance. They didn't need to put a formula that most of us already know / could find in a textbook in thirty seconds up on a presentation screen to go any farther with that.

I don't see an application for that formula in the sciences, so I assume only a few know it (it is, by the way, really elementary).

That's not a definition, that's vagueness. As I said, you can call anything a Retina display as long as you don't see the pixels. That's by no means a technical specification; it's more of a marketing term.


Why would you define a screen without taking viewing distance into consideration? Designing a Jumbotron billboard or building-side display at 300 DPI would be a complete waste of time and energy, because no one is going to climb the side of the building so they can look at it from a meter away.

Stating "I can't see the pixels from a certain distance" is not a specification. Saying X PPI is, because you can relate that to precise quantities.

I don't necessarily think that 260DPI or 300DPI is "truly" the highest resolution one can see, simply based on experiencing things like the difference in visual quality between text printed at 300 (first gen laser printers), 600 (more modern lasers), and 1200+ DPI (e.g., linotype / magazine print). There's clearly a difference that's visible there above 300DPI. OTOH it's subtle, and as a consumer, I probably wouldn't make it high on my priority list.

I like the high-res displays. I'm excited that Apple's expanding their use (I think the widely spread conjecture that Apple will start rolling out Macs with "Retina" displays is true). Will there be a good reason to go to an even higher DPI later? Maybe. I don't see why it's such a big deal.

Agreed.
 
I wonder what it's like to live in a world where words are either "marketing terms" warranting disparagement OR "scientifically-supported technical specifications" warranting assent, and there's no other way to refer to things.
 
If it were irrelevant, they wouldn't need to define retina at different distances and they would use the old definition of 300dpi.

It is relevant in the sense that most HDTVs are already retina displays and therefore look as good as the iPad screen at normal viewing distance.

A 50" 1080p HDTV is retina at 6 feet away.
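That figure can be checked with the same small-angle geometry. A rough sketch, assuming a 16:9 panel and the 1-arc-minute (20/20) threshold used elsewhere in the thread:

```python
import math

def retina_distance_ft(diag_in, h_px, aspect=(16, 9), arcmin=1.0):
    """Distance (feet) beyond which one pixel subtends less than `arcmin`."""
    width_in = diag_in * aspect[0] / math.hypot(*aspect)  # panel width, inches
    pitch = width_in / h_px                               # pixel size, inches
    a = math.radians(arcmin / 60.0)                       # threshold angle
    return pitch / (2 * math.tan(a / 2)) / 12             # d = h / (2*tan(a/2))

print(round(retina_distance_ft(50, 1920), 1))  # 50-inch 1080p set
```

This comes out to about 6.5 feet for a 50" 1080p panel, so "retina at 6 feet away" is in the right ballpark under these assumptions.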
 
They have defined the term based on typical usage of the screen, which is not wrong. By the way, 20/20 vision is defined by being able to differentiate two objects (pixels) separated by 1 arc minute; the further away you get from the screen, the bigger those pixels can be. And that's with 20/20 vision.

Notice how a jumbotron uses big LEDs, which could be the size of your fist up close, but when viewed from a distance it looks fine.

That has nothing to do with the "Retina" display.
Notice that perfect 20/20 vision has nothing to do with pixels (they are not lines, so that doesn't relate).
You are also quite confused: two objects are not separated by an arc minute; they can at most be separated by an angle (or visual angle, if you wish) of an arc minute.

Add to this that absolutely no value in the formula is provided (so how can you say anything relates to 20/20?), and there you have the nonsensical mass attraction.

the fact is that just because Retina isn't a beep boop measurement of a specific technical quantity doesn't make it a 'meaningless marketing term for the masses'

On the contrary, it does. If you don't specify what the term means, and don't prove it, it's a meaningless term, technically speaking.

"The hyper AMOLED is better than anything else." There you go: a statement equivalent to the Retina display one (which of course requires the minimal effort of putting it in context; I say so because someone's probably going to argue the inarguable :rolleyes:). Oh, and it's defined by the "beauty" given by

b = √pi*∂U/∂I^2-45.
 
Are you serious? LOL.

That has nothing to do with any definition. You don't define a screen based on the distance you view it from. A 30x30 screen is Retina if you look at it from 20 meters. :rolleyes:

Nothing, also, tells you that you can't see the pixels. How's that? Is there some magic there, or mere speculation without scientific proof? :rolleyes:

Notice also that no value besides the distance is given. You know, arctan changes with h, and so does that "a" in the formula. :rolleyes:

So, that's totally without context and utterly useless. What it does is astonish the uneducated masses with a nonsensical formula.

Retina display is a term that means you cannot see the individual pixels at the viewing distance considered average or normal for it. Seeing as Apple made up the term, they can define it however they want. You don't like the definition, and that's your opinion, but it doesn't change the fact that what they said yesterday is exactly what they said when they first introduced retina displays with the iPhone 4.

Yes, none of us have used it yet. We can't say for sure that you can't see the pixels. Be patient; wait until you can judge for yourself. In the meantime, read all the reviews from yesterday's hands-ons... everything I've read has been overwhelmingly positive and does in fact say you can't see the pixels.
 
That has nothing to do with the "Retina" display.
Notice that perfect 20/20 vision has nothing to do with pixels (they are not lines, so that doesn't relate).
You are also quite confused: two objects are not separated by an arc minute; they can at most be separated by an angle (or visual angle, if you wish) of an arc minute.

Add to this that absolutely no value in the formula is provided (so how can you say anything relates to 20/20?), and there you have the nonsensical mass attraction.

20/20 has to do with the fact that you can discern features separated by the visual angle of one arc minute.

I'm not confused at all; it's implied they're separated by the angle of an arc minute at the distance Apple has used (10" for iPhone, 15" for iPad). Notice how, from the point of origin of the two lines separated by the angle, the further away you go, the bigger the separation of the lines becomes.

How can we say it relates to 20/20? Because they said so when introducing the iPhone 4.

For someone with 20/20 vision you have a = 1 arc minute and you have d (15 inches); solve for h. How is that hard to understand?

You're just being obtuse
 
Apparently when someone uses the term "retina", to LostSoul80's ears, it's the same as saying "qzxwerty" or "blaplabla" because it's a meaningless term (oh, it's meaningless technically speaking - which just means it doesn't meet LostSoul80's arbitrary semantic standards).

Or maybe LostSoul80 thinks it's just a buzzword Apple has concocted to mean "this thing is good, you should like it" (like a Manchurian Candidate phrase for consumers) but ultimately "retina" doesn't really indicate any conceivable attributes whatsoever about the product in question.

Neither interpretation is very satisfying, so who knows what he's going on about.
 
20/20 has to do with the fact that you can discern features separated by the visual angle of one arc minute.

Not features but letters, and letters are not composed of dots (or pixels), so you can't apply that logic in any way.
Unless you re-define the meaning of 20/20 vision. :rolleyes:

I'm not confused at all; it's implied they're separated by the angle of an arc minute at the distance Apple has used (10" for iPhone, 15" for iPad). Notice how, from the point of origin of the two lines separated by the angle, the further away you go, the bigger the separation of the lines becomes.

Obvious (read: useless) statement.

How can we say it relates to 20/20? Because they said so when introducing the iPhone 4.

That wasn't true, of course. First, the definition of 20/20 doesn't include dots. Second, 20/20 does not refer to perfect vision, because perfect vision doesn't exist: it just states that a person has relatively high acuity compared to others. 20/20 is not in fact the maximum.
Third: Apple's statement was totally false. Imprecise, and the calculation doesn't match either. :rolleyes:

For someone with 20/20 vision you have a = 1 arc minute and you have d (15 inches); solve for h. How is that hard to understand?

You don't "solve" that formula that way: its purpose is to return the visual angle of two objects whose separation you know, given the distance of the observer. You don't do the opposite -- clearly you're confused.

Also, try to "solve" it and you'll find no arc minute. :rolleyes:


You're just being obtuse.


Apparently when someone uses the term "retina", to LostSoul80's ears, it's the same as saying "qzxwerty" or "blaplabla" because it's a meaningless term (oh, it's meaningless technically speaking - which just means it doesn't meet LostSoul80's arbitrary semantic standards).

Or maybe LostSoul80 thinks it's just a buzzword Apple has concocted to mean "this thing is good, you should like it" (like a Manchurian Candidate phrase for consumers) but ultimately "retina" doesn't really indicate any conceivable attributes whatsoever about the product in question.

Neither interpretation is very satisfying, so who knows what he's going on about.

Yes, "Retina" is like "uh, no pixels". That applies to most tech gadgets if viewed at the right distance (hence why we can't have "Retina displays" :rolleyes:). It's a marketing term to astonish the masses, but in reality it indicates a relatively high pixel density. :rolleyes:
 
Do five rolleyes in one post indicate "take me seriously"? I'm not sure of the merit of continuing a discussion with someone who approaches conversation that way.
 

It's open to interpretation.
Notice that no one forces you to continue anything, so if in doubt you're free to leave. :rolleyes:
 
Not features but letters, and letters are not composed of dots (or pixels), so you can't apply that logic in any way.
Unless you re-define the meaning of 20/20 vision. :rolleyes:

That wasn't true, of course. First, the definition of 20/20 doesn't include dots. Second, 20/20 does not refer to perfect vision, because perfect vision doesn't exist: it just states that a person has relatively high acuity compared to others. 20/20 is not in fact the maximum.
Third: Apple's statement was totally false. Imprecise, and the calculation doesn't match either. :rolleyes:

You don't "solve" that formula that way: its purpose is to return the visual angle of two objects whose separation you know, given the distance of the observer. You don't do the opposite -- clearly you're confused.

Also, try to "solve" it and you'll find no arc minute. :rolleyes:

You're just being obtuse.

Yes, "Retina" is like "uh, no pixels". That applies to most tech gadgets if viewed at the right distance (hence why we can't have "Retina displays" :rolleyes:). It's a marketing term to astonish the masses, but in reality it indicates a relatively high pixel density. :rolleyes:

let's see

From 1862
Hermann Snellen published his famous letter chart. His most significant decision was not to use existing typefaces but to design special targets, which he called optotypes. He based it on a 5x5 grid. This was crucial because it was a physical standard measure to reproduce the chart. Snellen defined “standard vision” as the ability to recognize one of his optotypes when it subtended 5 minutes of arc, thus the optotype can be recognized only if the person viewing it can discriminate a spatial pattern separated by a visual angle of 1 minute of arc (one element of the grid).

From 1888
Edmund Landolt proposed the Landolt C, a symbol that has only one element of detail and varies only in its orientation. The broken ring symbol is made with a "C" like figure in a 5 x 5 grid that, in the 20/20 optotype, subtends 5 minutes of arc and has an opening (oriented in the top, bottom, right or left) measuring 1 minute of arc. This proposal was based in the fact that not all of Snellen's optotypes were equally recognizable. This chart is actually the preferred visual acuity measurement symbol for laboratory experiments but gained only limited acceptance in clinical use.

Clearly 1 minute of arc came out of my ass; I'm not redefining the definition of 20/20 vision.

Do you know what a grid is?

No one is talking about perfect vision, people with better than 20/20 vision would need higher ppi to not notice the pixels.

If *YOU* can't solve that formula for h with the parameters for a and d that Apple has given you, then that's *YOUR* problem, not mine. You know what it'll give you? The height of the pixel. Pixels are square, so it'll also give you the width of the pixel, and what does that give you? A means to calculate ppi.

The definition of retina display *is* a moving target, since we don't all have the same vision; it depends on many factors, so Apple decided to reduce most of them and tell you they're assuming 20/20 vision and a specific distance of use.

PS: here's a hint, h = 2d tan(a/2)
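Taking that hint literally, in Python (a sketch under the thread's assumptions: a = 1 arc minute for 20/20 vision, and the cited distances of 10 inches for the iPhone and 15 inches for the iPad):

```python
import math

def solve_h(d_in, arcmin=1.0):
    """Pixel size h in inches, from h = 2*d*tan(a/2)."""
    a = math.radians(arcmin / 60.0)  # visual angle in radians
    return 2 * d_in * math.tan(a / 2)

for name, d in (("iPhone, 10 in", 10), ("iPad, 15 in", 15)):
    h = solve_h(d)
    # ppi = 1/h, since pixels are square
    print(name, "->", round(1 / h), "ppi")
```

This works out to roughly 344 ppi at 10 inches and 229 ppi at 15 inches: the iPad's 264 ppi clears the 15-inch threshold comfortably, while at 10 inches the threshold sits slightly above the iPhone 4's 326 ppi, which is essentially the discrepancy the two posters are arguing over.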
 
let's see

From 1862


From 1888


Clearly 1 minute of arc came out of my ass; I'm not redefining the definition of 20/20 vision.

The definition (can't you read? :eek:) explicitly considers figures composed of lines, and by no means as small as a pixel.

No one is talking about perfect vision, people with better than 20/20 vision would need higher ppi to not notice the pixels.

Oh, 20/20 is called "perfect vision" by definition. Don't you know?

If *YOU* can't solve that formula for h with the parameters for a and d that Apple has given you, then that's *YOUR* problem, not mine. You know what it'll give you? The height of the pixel. Pixels are square, so it'll also give you the width of the pixel, and what does that give you? A means to calculate ppi.

Hint: I've obviously calculated the visual angle for both the iPad and the iPhone.
Guess what? As I stated in my previous post (can't you read? :eek: #2 :rolleyes:), the angle is greater than an arc minute.
Moreover, it's different between the iPad and the iPhone.

If you can't apply such an elementary calculation, that's totally your problem.

Edit: LOL, are you serious? So you think "h" refers to the width or height of a pixel?

Oh boy.


The definition of retina display *is* a moving target, since we don't all have the same vision; it depends on many factors, so Apple decided to reduce most of them and tell you they're assuming 20/20 vision and a specific distance of use.

Absolutely incorrect.
The visual angle doesn't match up with anything you can associate with 20/20 vision.
A technical specification is missing, and all you can tell me is "uh, no pixels, wow".

Moreover, apparently you're just being really obtuse. :rolleyes:
 