
jsolares said:
You seem to have missed what I wrote, but that doesn't surprise me, given your inability to oppose anything.
To sum up quickly: you can't apply the logic of 20/20 to pixels, and I've also written why. :rolleyes: If you're unable to read or understand, that's your problem.

Also, proof of this is Apple itself: no documentation proving anything, anywhere. They know better than to go into too much depth.

A "Retina" display is any given display: perhaps it's a quality, and any screen can qualify as "Retina". It is not defined, and you certainly won't define it in any way. Only Apple can, and it hasn't. :rolleyes:

1. I think you mean ability, not inability. Perhaps so; I've been told I'm a contrarian. I don't go for the popular, I go for what I understand, though :)

2. Do you have lines in your eyes? I have rods and cones. They're what's responsible for what your brain interprets and, in turn, what you see.

Why do you think you can't apply 20/20 logic to pixels? You don't have lines in your eyes interpreting what you see; you have rods and cones. Or maybe you do have lines :eek:.

[image: dist2.jpg]


Funny how the rods and cones in our eyes look like an LCD screen.

What more documentation do you need?

A "Retina" display, as defined by Apple, is any given display where you can't see the pixels at its intended usage distance.

My 27" display is not a Retina display just because I won't notice pixels if I sit 5 feet from it; it would be one if I couldn't see them at all sitting at 18-20". It wouldn't take that many more pixels, though (damn astigmatism).

Heck, I have an SGSII and I can't see its pixels with all of its 218 ppi. I had to take off my glasses to be able to focus when putting it 5 inches in front of me, and then I could see the pixels.

I will be the first to tell Apple to cram it if they go and change their definition, though.

So would you consider the Galaxy S II retina to your eyes? I only have the S 1 and it's beautiful, but it lacks my 4S's sharpness, of course.
 
"Edit: LOL, are you serious? So you think "h" refers to the width or height of a pixel?

Oh boy." from LostSoul80.

Seriously?
Then I guess we're done here.
What do h and a stand for in the formula a = 2 arctan(h/2d)?
 
What do h and a stand for in the formula a = 2 arctan(h/2d)?

Look at it as a triangle: d is the distance from your eye to the screen, and h is the height of the square projected on the screen, the square being a pixel.

a is the angle your eye resolves. Plug in 1 arc minute, which is what 20/20 vision lets you resolve, and you get an h bigger than the size of the pixels on either screen; so for the actual pixels, a has to be smaller than 1 arc minute, meaning your eye can't resolve them.
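That triangle can be sketched in a few lines of Python (my own illustration, not from the thread): h and d are in the same length units, and the function returns the subtended angle in degrees.

```python
import math

# Visual angle subtended by a pixel of size h viewed from distance d,
# from the formula in the post: a = 2 * arctan(h / (2d)).
# h and d must share the same length unit; the result is in degrees.
def visual_angle_deg(h, d):
    return math.degrees(2 * math.atan(h / (2 * d)))

ONE_ARCMIN_DEG = 1 / 60  # the detail a 20/20 eye is said to resolve

# Example: an iPhone 4 pixel (~1/326 inch) viewed from 12 inches
# subtends less than one arc minute.
print(visual_angle_deg(1 / 326, 12) < ONE_ARCMIN_DEG)  # True
```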
 
Look at it as a triangle: d is the distance from your eye to the screen, and h is the height of the square projected on the screen, the square being a pixel.

a is the angle your eye resolves. Plug in 1 arc minute, which is what 20/20 vision lets you resolve, and you get an h bigger than the size of the pixels on either screen; so for the actual pixels, a has to be smaller than 1 arc minute, meaning your eye can't resolve them.

What are the units for a, h, and d, then?
Is a in radians, and do h and d just have to be in equivalent units?
 
What are the units for a, h, and d, then?
Is a in radians, and do h and d just have to be in equivalent units?

a will be in whatever mode you have the calculator in, degrees or radians.

1 arc minute is 1/60 of a degree, or 290.8882087 microradians.

So for the iPhone 4 at 12 inches, a = 2 * atan(0.0031 / 24), which gives a = 0.0148014096 degrees; that's less than 1/60, or 0.0166666667, degrees. At 10 inches it gives 0.0177616915, which is higher.

For the new iPad at 15 inches, a = 2 * atan(0.0037841796875 / 30) = 0.0144545016 degrees; at 14 inches it gives 0.0154869660. Both are lower than 1/60 of a degree.
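For anyone who wants to check the arithmetic, here's a short Python sketch (my own, using the post's pixel sizes in inches) that reproduces those four numbers:

```python
import math

# a = 2 * atan(h / 2d), in degrees; pixel size and distance in inches.
def visual_angle_deg(pixel_in, dist_in):
    return math.degrees(2 * math.atan(pixel_in / (2 * dist_in)))

ONE_ARCMIN = 1 / 60  # degrees

# iPhone 4, pixel ~0.0031 in (the post's value):
print(visual_angle_deg(0.0031, 12))  # ~0.01480 deg, under one arc minute
print(visual_angle_deg(0.0031, 10))  # ~0.01776 deg, over one arc minute

# new iPad, pixel ~0.0037841796875 in:
print(visual_angle_deg(0.0037841796875, 15))  # ~0.01445 deg
print(visual_angle_deg(0.0037841796875, 14))  # ~0.01549 deg
```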
 
I just thought of this. How are videos and movies going to look any better if the standard is 1080p?

What are you going to see that's different?
 
I just thought of this. How are videos and movies going to look any better if the standard is 1080p?

What are you going to see that's different?

Because the previous screen couldn't show native 1080p, or even uncropped 720p.

I think 1920x1080 scaled up a bit to 2048 across will look nicer than 1366x768 scaled down to 1024 across.
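A rough sketch of why (my arithmetic, not from the post): fitting a video's width to the screen gives these scale factors, and a mild upscale from 1080p preserves far more detail than the heavy downscale the old screen forced.

```python
# Horizontal scale factor when a video is fit to the screen's width.
def scale_factor(video_w, screen_w):
    return screen_w / video_w

# iPad 2 (1024 px wide): even 720p-class sources get shrunk.
print(scale_factor(1366, 1024))  # ~0.75, the downscale the post mentions
print(scale_factor(1920, 1024))  # ~0.53, 1080p loses nearly half its width

# new iPad (2048 px wide): 1080p is only gently upscaled.
print(scale_factor(1920, 2048))  # ~1.07
```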
 
Galaxy Note a Retina Display?

Is the Galaxy Note 5.3 phone a Retina Display?

Hmm, doing the 'math', it looks like the Samsung Galaxy Note would fit the wishy-washy definition of Retina display:

iPhone 4S: 3.5-inch, 960 x 640 pixels, 326 ppi
Galaxy Note: 5.3-inch, 800 x 1280 pixels, 285 ppi
new iPad: 9.7-inch, 2048 x 1536 pixels, 264 ppi

Basically, take the two distances Apple suggests for the iPhone and iPad, use the Note's size to get its distance, and you'll see that, given the Note's resolution/ppi, it is a Retina Display. That, and you get a simple Wacom tablet digitizer under the screen surface (sorry, that was a bit of a troll remark; I have a Note as well as an iPod touch that I use every day, and I do like the pressure-sensitive, pixel-accurate pen).

Food for thought. But as has been said, RD is a bit of a marketing term anyway.
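One way to do that 'math' concretely (a sketch under my own assumptions, using the 1-arc-minute criterion from earlier in the thread rather than anything Apple has published): compute, for each ppi, the distance beyond which a single pixel subtends less than one arc minute.

```python
import math

ONE_ARCMIN_RAD = math.radians(1 / 60)

# Distance (inches) at which one pixel subtends exactly one arc minute;
# farther than this, a 20/20 eye shouldn't resolve individual pixels.
# Inverts a = 2 * atan(h / 2d) at a = 1 arc minute, with h = 1/ppi.
def retina_distance_in(ppi):
    return (1 / ppi) / (2 * math.tan(ONE_ARCMIN_RAD / 2))

for name, ppi in [("iPhone 4S", 326), ("Galaxy Note", 285), ("new iPad", 264)]:
    print(f"{name}: {retina_distance_in(ppi):.1f} in")
# iPhone 4S ~10.5 in, Galaxy Note ~12.1 in, new iPad ~13.0 in -- so the Note
# clears the bar at a typical phone/phablet viewing distance.
```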
 
To me, 'Retina Display' means a screen that when held at the typical usage distance, I can't see the individual pixels on the display. It's a really easy way for Apple to say 'this is a high quality screen' without quoting technical jargon.
 
I just thought of this. How are videos and movies going to look any better if the standard is 1080p?

What are you going to see that's different?

You've never seen a 1080p video on an iPad; the iPad 2 has a 1024x768 (right?) display. The new display now allows 1080p videos to be played, which is why Apple made a big deal about them being offered now.

I still prefer buying Blu-ray Discs by a long shot: there's not as much compression, the audio codecs are better, and it's much easier to share BD movies. But this is a nice move, and maybe one day we BD owners will be able to rip video in 1080p with the ease of iTunes ripping CDs. (Hint: all that DRM is not stopping the people who want to pirate movies!)
 
So it should be able to play high-definition YouTube videos, right? How can I do that? Because there isn't an option.

Does the new iPad do HTML5? Or is it an app thing?
There's not an option right now, no, but Apple may update the YouTube app with the ability to choose what resolution video you want to watch.

That aside, if you visit YouTube with the YouTube HTML5 Beta enabled, you should be able to use the HTML5 videos, iOS does run HTML5 fine. (I believe iAds are actually made with HTML5, funnily enough.)
I just thought of this. How are videos and movies going to look any better if the standard is 1080p?

What are you going to see that's different?
As has been said, you previously couldn't watch 1080p movies on the iPad 2. The benefit will cap at 1920x1080 for 1080p content, which is really all that exists right now.

Unless you go and download one of the 4K videos from YouTube and put it on it :D
 
Why do you think you can't apply 20/20 logic to pixels? You don't have lines in your eyes interpreting what you see; you have rods and cones. Or maybe you do have lines :eek:.

This shows your inability to understand a definition, and what a definition is.
Acuity is measured using optotypes (or similar means): black symbols drawn on white paper. The definition assumes normal light conditions and refers to lines, which are sums of contiguous squares, along with a series of other factors.
Pixels, for instance, are not comparable to black ink on white paper in normal light conditions (as a start).
As a consequence, you can apply this logic only to contexts similar to the one in which it is defined; otherwise you're applying a precise definition to something the definition itself doesn't guarantee to be valid for (and in fact it is not). :rolleyes:
When something is defined objectively, as in science, there are always precise contexts with precise variables (or ranges of them), and if you step outside that logic, you haven't understood a single thing in science. You can't, for example, apply a theorem if some of its hypotheses aren't met. :rolleyes:


Funny how the rods and cones in our eyes look like an LCD screen.

What more documentation do you need?

A "Retina" display, as defined by Apple, is any given display where you can't see the pixels at its intended usage distance.

That's absolutely not documentation about the ability to resolve pixels, nor does it show Apple's complete definition of a Retina display. :rolleyes: Moreover, that's not even documentation (scientific research runs to hundreds of pages, to begin with; you need evidence to prove anything).
 
This shows your inability to understand a definition, and what a definition is.
Acuity is measured using optotypes (or a similar means): black symbols drawn on white paper. This definition involves normal light conditions and refers to lines which are the sum of contiguous squares, along with a series of factors.
For instance, pixels are not comparable to black ink on white paper in normal light conditions (as a start).
As a consequence, you can apply this logic only to contexts similar to the way it is defined, otherwise you're applying a precise definition to something the definition itself doesn't grant to be valid (and in fact it is not). :rolleyes:
When someone defines objectively something, as in science, there are always precise contexts with precise variables (or range of), and if you exit that logic that means you haven't understood a single thing in science. You can't, for example, apply a theorem if some hypothesis aren't met. :rolleyes:

It's easier for the eye to resolve higher contrast, right? Black text on a white screen is the highest contrast you can get on an LCD; the iPad 2 had 920:1, which is higher than paper, meaning it will be easier for the eye to resolve the text.

With colors it's a completely different thing, but it actually means you can see less of the pixels, not more, due to contrast.

Why do you think they're not comparable? The contrast on paper is lower than on an IPS LCD.

But enough about that. People smarter than me say you can.
http://www.eyenetra.com/test2Connect.html
http://web.media.mit.edu/~raskar/

It might be easy to prove as well: when the new iPad comes out, load a Snellen test chart on it at full resolution, put it at the appropriate distance, and test whether it's comparable to the Snellen chart on paper. You could use a DSLR camera and compare the pictures, or you could test with people.

That's absolutely not documentation about the ability to resolve pixels, nor does it show Apple's complete definition of a Retina display. :rolleyes: Moreover, that's not even documentation (scientific research runs to hundreds of pages, to begin with; you need evidence to prove anything).

What do you want them to prove? You keep harping on about hundreds of pages, but of what, exactly? Maybe these: http://web.media.mit.edu/~pamplona/NETRA/ ?

----------

So would you consider the Galaxy S II retina to your eyes? I only have the S 1 and it's beautiful, but it lacks my 4S's sharpness, of course.

To my eyes, yes. The S1 has the regular AMOLED screen, which is PenTile and not really RGB, so despite having a higher ppi it has fewer "real" pixels than the S2. I haven't seen an S1, though.

Super AMOLED Plus, first introduced with the Samsung Galaxy S II and Samsung Droid Charge smartphones, is a further development in which the PenTile RGBG pixel matrix (2 subpixels) is replaced with Samsung's "Real Stripe" (3 subpixels) RGB RGB subpixel arrangement. This goes from eight to twelve subpixels per group, resulting in finer detail. The screens are also brighter and thinner, with AMOLED Plus displays being 18% more energy efficient than the old Super AMOLED displays.
 
It's easier for the eye to resolve higher contrast, right? Black text on a white screen is the highest contrast you can get on an LCD; the iPad 2 had 920:1, which is higher than paper, meaning it will be easier for the eye to resolve the text.

With colors it's a completely different thing, but it actually means you can see less of the pixels, not more, due to contrast.

:eek:
We are talking about Apple's claim about its "Retina" display, not about your own conclusions or thoughts about it.
Snellen's 20/20 says that a person (under precise conditions) can resolve an optotype (which features particular distances) at a particular distance. It doesn't say anything else. The distance of one arc minute has to do with the optotypes themselves; it's used as a common reference. Well-defined conditions are necessary to conform to the method of examination.

20/20 absolutely does not mean you won't be able to resolve anything subtending less than one arc minute. I've repeated this quite a few times now. :rolleyes:
Also, the real retina of the human eye has a much higher "resolution" in terms of angular distance.


Who cares if someone comes up with an application which can't strictly follow Snellen's definition but is used as a cheap diagnostic tool for poor countries that can't afford the real, expensive tools? Do you think that means anything in this matter? (Rhetorical question. The answer is: no.)

It might be easy to prove as well: when the new iPad comes out, load a Snellen test chart on it at full resolution, put it at the appropriate distance, and test whether it's comparable to the Snellen chart on paper. You could use a DSLR camera and compare the pictures, or you could test with people.

If you want to define a new scale of acuity you're free to do it. Bear in mind no one will even remotely care or look at it.

What do you want them to prove? You keep harping on about hundreds of pages, but of what, exactly? Maybe these: http://web.media.mit.edu/~pamplona/NETRA/ ?

:eek:
You're still asking this question? Are you able to read? (Again, a rhetorical question :rolleyes:.)
That has nothing to do with resolving pixels. And again, here's the same conclusion: Apple's "Retina" display is a marketing term without any definition behind it, indicating, perhaps, a quality any screen can have if considered at some distance.
Notice also that you still haven't linked any Apple documentation. Why? Because Apple has none, maybe? :rolleyes: And everything you're saying is based on vagueness you heard at a company's product launch event?
 
Is the Galaxy Note 5.3 phone a Retina Display?

Hmm, doing the 'math', it looks like the Samsung Galaxy Note would fit the wishy-washy definition of Retina display:

iPhone 4S: 3.5-inch, 960 x 640 pixels, 326 ppi
Galaxy Note: 5.3-inch, 800 x 1280 pixels, 285 ppi
new iPad: 9.7-inch, 2048 x 1536 pixels, 264 ppi

Basically, take the two distances Apple suggests for the iPhone and iPad, use the Note's size to get its distance, and you'll see that, given the Note's resolution/ppi, it is a Retina Display. That, and you get a simple Wacom tablet digitizer under the screen surface (sorry, that was a bit of a troll remark; I have a Note as well as an iPod touch that I use every day, and I do like the pressure-sensitive, pixel-accurate pen).

Food for thought. But as has been said, RD is a bit of a marketing term anyway.

I assumed so. I've been thinking about this a lot lately.
 
:eek:
We are talking about Apple's claim about its "Retina" display, not about your own conclusions or thoughts about it.
Snellen's 20/20 says that a person (under precise conditions) can resolve an optotype (which features particular distances) at a particular distance. It doesn't say anything else. The distance of one arc minute has to do with the optotypes themselves; it's used as a common reference. Well-defined conditions are necessary to conform to the method of examination.

No, you were adamant that Snellen's premise, that the eye can resolve at 1 minute of arc for people with 20/20 vision, cannot be applied to pixels in any way or form because, at first, they weren't lines.

The minute of arc is not used as a common reference; it's there because Snellen believed the eye can resolve it. If he had believed it could resolve half a minute of arc, he would have used that. He defined 20/20 based on the minute of arc.


20/20 absolutely does not mean you won't be able to resolve anything subtending less than one arc minute. I've repeated this quite a few times now. :rolleyes:
Also, the real retina of the human eye has a much higher "resolution" in terms of angular distance.

If you're able to resolve smaller angles, then you don't have 20/20 vision but better vision: 20/15, 20/10, etc.

Can you provide a paper that says all human eyes have higher resolution than 1 minute of arc? I've been googling and can't find anything. The only thing I've been able to find is this:
"The other consideration is the density of retinal receptors, which nature seems to have planted no denser than is likely to be needed. With a perfect cornea, no internal reflection and no diffraction limit, you'd be physically limited to half an arc-minute."
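For what it's worth, the physics behind that quote can be sketched numerically (my assumptions: the Rayleigh criterion for a circular aperture, green light at 555 nm, a 3 mm daylight pupil; real eyes also have aberrations the formula ignores):

```python
import math

# Rayleigh diffraction limit for a circular aperture: theta = 1.22 * lambda / D.
# Returns the limit in arc minutes.
def rayleigh_limit_arcmin(wavelength_m, pupil_diameter_m):
    theta_rad = 1.22 * wavelength_m / pupil_diameter_m
    return math.degrees(theta_rad) * 60

print(rayleigh_limit_arcmin(555e-9, 3e-3))  # ~0.78 arcmin
```

So diffraction alone already puts a typical eye near the 1-arc-minute scale, consistent with the quote's point that only with diffraction removed would receptor density leave you at about half an arc minute.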

Who cares if someone comes up with an application which can't strictly follow Snellen's definition but is used as a cheap diagnostic tool for poor countries that can't afford the real, expensive tools? Do you think that means anything in this matter? (Rhetorical question. The answer is: no.)

Because you said pixels can't be used at all in eye tests, because they're not black lines on paper.

How expensive is a Snellen chart? With the Snellen chart you can only tell that a person can't actually see at a given acuity (20/20, etc.), but you can't measure what correction is needed. Yet you can with an LCD screen with a high ppi. But wait, you said you can't use pixels...

If you want to define a new scale of acuity you're free to do it. Bear in mind no one will even remotely care or look at it.

It's not a new scale; it's using the current scale with a high-resolution display to see whether there's a real difference between the paper chart and the display.

:eek:
You're still asking this question? Are you able to read? (Again, a rhetorical question :rolleyes:.)
That has nothing to do with resolving pixels. And again, here's the same conclusion: Apple's "Retina" display is a marketing term without any definition behind it, indicating, perhaps, a quality any screen can have if considered at some distance.
Notice also that you still haven't linked any Apple documentation. Why? Because Apple has none, maybe? :rolleyes: And everything you're saying is based on vagueness you heard at a company's product launch event?

I'm still asking because you haven't said what, exactly, you want them to prove.

And as I said before, of course it's a marketing term; they're all about selling things.

But here we have people from MIT proving that the screens do have a high enough resolution to be used as eye-test tools. Except they're pixels, not lines, so they're probably just wrong...

And BTW.

CRAM IT, APPLE: at 10" the iPhone 4/4S does not have a Retina display. They should have stuck with 12", as Steve said when introducing it, instead of the 10" in the chart with the formulae.
 
No, you were adamant that Snellen's premise, that the eye can resolve at 1 minute of arc for people with 20/20 vision, cannot be applied to pixels in any way or form because, at first, they weren't lines.

You're being so obtuse. :eek:
Optotypes are composed of contiguous lines, and only optotypes (or very similar means) are used to determine acuity under Snellen's definition.
Only optotypes are used to determine acuity, and pixels certainly can't be compared to optotypes, by their very nature as separate entities.
This is the third time I've repeated this.

The minute of arc is not used as a common reference; it's there because Snellen believed the eye can resolve it. If he had believed it could resolve half a minute of arc, he would have used that. He defined 20/20 based on the minute of arc.

Visual angle takes into account the distance of the observer from the optotypes: that's why I said one arc minute (the related distance is 6 metres) is a common reference. Also, by common reference I actually meant it's a precise value given for any test to be declared valid, not a totally random value, as you're supposing (based on nothing). :rolleyes:

If you're able to resolve smaller angles, then you don't have 20/20 vision but better vision: 20/15, 20/10, etc.

Can you provide a paper that says all human eyes have higher resolution than 1 minute of arc? I've been googling and can't find anything. The only thing I've been able to find is this:

I didn't say that, but I forgot you're unable to read. The retina (not vision, that is) doesn't entirely relate to Snellen's acuity. :rolleyes:

Because you said pixels can't be used at all in eye tests, because they're not black lines on paper.

How expensive is a Snellen chart? With the Snellen chart you can only tell that a person can't actually see at a given acuity (20/20, etc.), but you can't measure what correction is needed. Yet you can with an LCD screen with a high ppi. But wait, you said you can't use pixels...

Snellen clearly designed an operational way to find out one's visual acuity.
I think (but who knows) that in 1862 they didn't have iPhones or screens. :rolleyes:

It's not a new scale; it's using the current scale with a high-resolution display to see whether there's a real difference between the paper chart and the display.

That has nothing to do with Snellen's definition (again :eek:).

I'm still asking because you haven't said what exactly you want them to prove.

And as i said before, of course it's a marketing term, they're all about selling things.

:eek:
A marketing term means it doesn't refer to reality; it's just a marketing means of attracting potential customers. And obviously you didn't get that.

But here we have people from MIT proving that the screens do have a high enough resolution to be used as eye test tools. except they're pixels not lines, so they're probably just wrong...

Of course, you're misrepresenting the goal of that project.
First: you might not have noticed, but there's a sensor to attach to the screen.
Second: it doesn't use optotypes.
So it's totally unrelated to acuity in Snellen's definition and has nothing to do with optotypes linked to pixels (separate entities).

:rolleyes:
 
You're being so obtuse. :eek:
Optotypes are composed by contiguous lines and only optotypes (or very similar means) are used to determine acuity based on Snellen's definition.
Only optotypes are used to determine acuity and certainly pixels can't be compared to optotypes by their very nature of separated entities.
Third time I've repeated this.

For the 1000th time: Snellen crafted the optotypes on the assumption that the eye can resolve at 1 minute of arc. That's why they subtend 5 minutes of arc and are crafted on a 5x5 grid.

Take a look at other optotypes, especially the Landolt C: the only gap in the C is 1 minute of arc, and if you can't resolve 1 minute of arc you will see a circle.

So how can you say resolving at 1 minute of arc has nothing at all to do with Snellen's definition of 20/20?

Visual angle takes into account the distance of the observer from the optotypes: that's why I said one arc minute (the related distance is 6 metres) is a common reference. Also, by common reference I actually meant it's a precise value given for any test to be declared valid, not a totally random value, as you're supposing (based on nothing). :rolleyes:

What random value? What are you on? I'm saying Snellen based 20/20 vision on being able to resolve 1 minute of arc. That's scientific fact, not something random based on nothing.

I didn't say that, but I forgot you're unable to read. The retina (not vision, that is) doesn't entirely relate to Snellen's acuity. :rolleyes:

The retina has the ability to resolve, or not, at a certain angular distance; Snellen proposed that for 20/20 vision that angular distance be 1 minute of arc.

HOW does the retina not relate entirely to the Snellen test?

You said exactly that. Let's see:
Also, the real retina of the human eye has a much higher "resolution" in terms of angular distance.

From what I've found, with a perfect cornea, no internal reflection and no diffraction limit, you'd be physically limited to half an arc-minute.
Is that half an arc-minute that much higher?

Snellen clearly designed an operational way to find out one's visual acuity.
I think (but who knows) that in 1862 they didn't have iPhones or screens. :rolleyes:

And the MIT test not only manages to find your visual acuity, but also the prescription needed to correct it. All with pixels, which are not lines, so they're probably wrong.

That has nothing to do with Snellen's definition (again :eek:).

The Snellen chart is supposed to be lit at 480 lux. If you could manage to configure a new iPad to the same brightness, how would it not have anything to do with Snellen's definition?
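Whether the brightness can actually be matched is checkable (a rough sketch under my own assumptions: the chart is approximately a Lambertian surface with white-paper reflectance around 0.85, so its luminance is E * rho / pi, while an LCD's brightness is quoted directly in cd/m^2):

```python
import math

# Luminance (cd/m^2) of a matte chart under illuminance E (lux),
# assuming a Lambertian surface with reflectance rho: L = E * rho / pi.
def chart_luminance_cd_m2(illuminance_lux, reflectance=0.85):
    return illuminance_lux * reflectance / math.pi

print(chart_luminance_cd_m2(480))  # ~130 cd/m^2
```

A tablet LCD's rated brightness is typically several hundred cd/m^2, so dimming it to roughly 130 cd/m^2 is plausible.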

:eek:
A marketing term means it doesn't refer to reality; it's just a marketing means of attracting potential customers. And obviously you didn't get that.

Do you think Gorilla Glass is made out of gorillas?

Of course, you're misrepresenting the goal of that project.
First: you might not have noticed, but there's a sensor to attach to the screen.
Second: it doesn't use optotypes.
So it's totally unrelated to acuity in Snellen's definition and has nothing to do with optotypes linked to pixels (separate entities).

:rolleyes:

That's not a sensor; it's a $1 piece of plastic. It's based on the premise that the eye resolves at 1 minute of arc with 20/20 vision, and it corrects based on that.

And what did Snellen propose when he crafted his optotypes? That the eye can resolve at 1 minute of arc for what he called 20/20 vision.

Yeah, they have nothing at all to do with one another.

I guess you're right, after all they're not lines but pixels...
 
For the 1000th time: Snellen crafted the optotypes on the assumption that the eye can resolve at 1 minute of arc. That's why they subtend 5 minutes of arc and are crafted on a 5x5 grid.

Take a look at other optotypes, especially the Landolt C: the only gap in the C is 1 minute of arc, and if you can't resolve 1 minute of arc you will see a circle.

All you can see are contiguous lines, not separate entities such as pixels. :rolleyes:
That doesn't contradict what I said in any way.

So how can you say resolving at 1 minute of arc has nothing at all to do with Snellen's definition of 20/20?

In fact, I never said that. :eek:
That's just you trying to fool yourself. :rolleyes:

What random value? What are you on? I'm saying Snellen based 20/20 vision on being able to resolve 1 minute of arc. That's scientific fact, not something random based on nothing.

Are you nuts? What part of "I actually meant it's a precise value given for any test to be declared as valid, and not that it's a totally random value", which I just stated in my previous post, isn't clear? :eek:

The retina has an ability to resolve or not at a certain angular distance, snellen proposed that for 20/20 vision that angular distance be 1 minute of arc.

Absolutely not: the retina is only one part of the human eye, and acuity is also determined by the outer surfaces, which can lower it. :rolleyes:


HOW does the retina not entirely relate to snellen test?

You said exactly that. lets see :

That's obvious if you have basic knowledge of the human eye: the retina is only an inner tissue, and acuity is not determined by it alone. :rolleyes:
Also, enjoy: "Visual acuity is limited by diffraction, aberrations and photoreceptor density in the eye (Smith and Atchison, 1997)".

From what i've found, With a perfect cornea, no internal reflection and no diffraction limit, you'd be physically limited to half an arc-minute.
Is that half an arc minute that much higher?

Yes, it is, and it proves how obtuse you're being. Also, it should be even less than 0.4 arc minutes.


And the MIT test not only manages to find out your visual acuity but the prescription needed to correct it. all with pixels, which are not lines, so they're probably wrong.

That's almost ridiculous.
You found a random page, linked it without knowing what was being illustrated, and now pretend it magically states what you're saying.

That project does not use optotypes and does not measure acuity, but refractive errors, which affect visual acuity.
So it has nothing to do with 20/20 acuity or "Retina" display at all. :rolleyes:


The Snellen chart is supposed to be lit at 480 lux. If you could manage to configure a new iPad to the same brightness, how would it not have anything to do with Snellen's definition?

Even more ridiculous.
I've repeated this quite a few times. :rolleyes: Go read one of my last two posts and you'll find out.

Do you think Gorilla Glass is made out of gorillas?

Good joke. Actually, no: quite a bad and sad one. That has nothing to do with "Retina" as a marketing term, because it doesn't rely on or suggest a false characteristic in order to attract the masses.

That's not a sensor; it's a $1 piece of plastic. It's based on the premise that the eye resolves at 1 minute of arc with 20/20 vision, and it corrects based on that.

Absolutely not. You haven't even spent two minutes reading what that project is about.
It doesn't use Snellen's definition at all, and it does not measure visual acuity, but refractive errors.

And what did Snellen propose when he crafted his optotypes? That the eye can resolve at 1 minute of arc for what he called 20/20 vision.

You've repeated this twice now, needlessly, as no one has claimed it's false. :rolleyes:


I'm now ready for another series of random, unsupported and false statements. Take your time. :rolleyes:
 
Is the Galaxy Note 5.3 phone a Retina Display?

Hmm, doing the 'math', it looks like the Samsung Galaxy Note would fit the wishy-washy definition of Retina display:

iPhone 4S: 3.5-inch, 960 x 640 pixels, 326 ppi
Galaxy Note: 5.3-inch, 800 x 1280 pixels, 285 ppi
new iPad: 9.7-inch, 2048 x 1536 pixels, 264 ppi

Basically, take the two distances Apple suggests for the iPhone and iPad, use the Note's size to get its distance, and you'll see that, given the Note's resolution/ppi, it is a Retina Display. That, and you get a simple Wacom tablet digitizer under the screen surface (sorry, that was a bit of a troll remark; I have a Note as well as an iPod touch that I use every day, and I do like the pressure-sensitive, pixel-accurate pen).

Food for thought. But as has been said, RD is a bit of a marketing term anyway.

A bit of a marketing term? It's totally a marketing term, though clearly there's a principle behind it (if you can't see the dots individually at a normal usage distance, then it's "Retina").

I will go as far as saying no Samsung product (or LG, or Motorola, or HTC, etc.) will ever be called a "Retina" (registered trademark of Apple, Inc.) display...

http://www.patentlyapple.com/patent...-files-two-retina-trademark-applications.html
 
All you can see are contiguous lines, not separate entities such as pixels. :rolleyes:
That doesn't contradict what I said in any way.

In fact, I never said that. :eek:
That's just you trying to fool yourself. :rolleyes:

Are you nuts? What part of "I actually meant it's a precise value given for any test to be declared as valid, and not that it's a totally random value", which I just stated in my previous post, isn't clear? :eek:

Absolutely not: the retina is only one part of the human eye, and acuity is also determined by the outer surfaces, which can lower it. :rolleyes:

That's obvious if you have basic knowledge of the human eye: the retina is only an inner tissue, and acuity is not determined by it alone. :rolleyes:
Also, enjoy: "Visual acuity is limited by diffraction, aberrations and photoreceptor density in the eye (Smith and Atchison, 1997)".

Yeah you're right, being able to discern patterns at 1 minute of arc at 6 meters has nothing to do at all with the retina or your eyes for that matter.

Take a Snellen chart and, in the 20/20 line, replace one of the letters with a checkerboard pattern; do the same for the two lines below it. If you have 20/20 vision you will see the first checkerboard, and only a grey spot on the others.

Is that so hard to imagine? It's how dithering works.
Dithering_example_red_blue.png
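The checkerboard claim can be put in numbers: a square of side s blends into uniform grey once it subtends less than about 1 arc minute, i.e. from further away than s / tan(1'). A minimal sketch (the millimetre square size is just an illustration):

```python
import math

def blend_distance(side):
    """Distance (same units as side) beyond which a checkerboard square
    subtends less than 1 arc minute and the pattern averages to grey."""
    return side / math.tan(math.radians(1.0 / 60.0))

# A checkerboard of 1 mm squares washes out to grey from ~3.4 m away.
print(blend_distance(1.0))  # ~3437.7 (mm)
```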


Yes, it is, and it proves how obtuse you're being. Also, it should be even less than 0.4 arc minutes.

I'm not being obtuse; I've searched and found nothing to support what you say. Can you link something that does?

That's almost ridiculous.
You found a random page, linked it without knowing what was being illustrated, and now pretend that it magically states what you're saying.

I actually read it a while ago; the fact that the displays in devices since the iPhone 4 have such high resolution that they can be used to correct eye defects is quite telling.

That project does not use optotypes and does not measure acuity, but refractive errors, which affect visual acuity.
So it has nothing to do with 20/20 acuity or "Retina" display at all. :rolleyes:

So you're saying the corrections they give so that you can get glasses and actually see, have nothing to do with visual acuity at all?

You say they correct refractive errors, which affect visual acuity, yet what they do has nothing to do with 20/20 acuity or Retina display?

The fact remains, if those screens didn't have such a high ppi they wouldn't be able to find the refractive errors at all.

Even more ridiculous.
I've repeated this quite a few times. :rolleyes: Go read one of my last two posts and you'll find out.

So if I put a Snellen chart and the new iPad side by side, and someone with 20/20 vision can't see a difference, it's ridiculous??

*IF*, and it is a big if, they look and function the same, wouldn't that prove that the display is in fact a Retina display, and that it's in fact based on the ability of your eye to resolve at 1 arc minute??

Do you know what analogies are at all? Do you have any imagination? Or are you the kind of person who gets a product and, if it doesn't say you can do X with it, never tries?

Good joke. Actually, no, quite bad and sad. That has nothing to do with "Retina" as a marketing term, because it doesn't rely on or suggest a false characteristic in order to attract the masses.

Lighten up, Francis.

Absolutely not, you haven't even spent 2 minutes reading what that project is about.
It doesn't use Snellen's definition at all and does not measure visual acuity, but refractive errors.

You've repeated this twice, without need, as no one has stated that's false. :rolleyes:

I'm now ready for another series of random, unsupported and false statements. Take your time. :rolleyes:

And how do they measure refractive errors?

The fact is, if those display screens didn't have as high a ppi as they do, they couldn't be used to measure or fix your eyesight.

Why do they need such a high ppi?
Because of the 1 arc minute someone with 20/20 vision can resolve.
 
Yeah you're right, being able to discern patterns at 1 minute of arc at 6 meters has nothing to do at all with the retina or your eyes for that matter.

I never said it. :eek: You're saying it now.

Take a Snellen chart and, in the 20/20 line, replace one of the letters with a checkerboard pattern; do the same for the two lines below it. If you have 20/20 vision you will see the first checkerboard, and only a grey spot on the others.

Snellen thought about optotypes and not checkerboard patterns. :rolleyes:
You're free to re-define whatever you want, but don't expect others to care or look at it.

I'm not being obtuse; I've searched and found nothing to support what you say. Can you link something that does?

I'll search something online, since I read it on a book.

I actually read it a while ago; the fact that the displays in devices since the iPhone 4 have such high resolution that they can be used to correct eye defects is quite telling.

:eek:
So you're saying that if I keep looking at a "Retina" display my vision will get better. That's genius.

So you're saying the corrections they give so that you can get glasses and actually see, have nothing to do with visual acuity at all?

Again, I've never said it. :rolleyes:
It's a bad habit trying to put words into another's mouth.
I've clearly stated that the aim of the project is to give, or try to give, the refractive errors of one's eye, and not to give any acuity.

Also: no one states that's a good/suggested/best way to determine one's eyes' refractive errors and consequently get glasses based on that. It's a tool of arguable precision, as you can learn from the page. :rolleyes: The official methods are others, based on far more expensive machines.

You say they correct refractive errors, which affect visual acuity, yet what they do has nothing to do with 20/20 acuity or Retina display?

:eek:
I've never said that refractive errors have nothing to do with visual acuity. Yet you're trying to fool yourself. :rolleyes:
What I've said (can't you really read?) is that acuity can be limited by refractive errors. :rolleyes:

The fact remains, if those screens didn't have such a high ppi they wouldn't be able to find the refractive errors at all.

That proves absolutely nothing.
And one more time you're displaying your inability to read. Refractive errors are found through a test based on alignment, and this has nothing to do with determining one's visual acuity. :rolleyes:

So if I put a Snellen chart and the new iPad side by side, and someone with 20/20 vision can't see a difference, it's ridiculous??

:eek:
You completely miss the meaning of Snellen's acuity. Completely. Go read my last posts, since I've repeated why you can't draw any conclusion on pixels a few times now. :rolleyes:

*IF*, and it is a big if, they look and function the same, wouldn't that prove that the display is in fact a Retina display, and that it's in fact based on the ability of your eye to resolve at 1 arc minute??

Once again: :eek:.
Absolutely not. With Snellen's optotypes you have to resolve a series of lines (which compose an optotype), whereas with a screen you have single entities (pixels), and it's logically wrong even to think you could compare an optotype to a sum of pixels, since what you want to determine concerns the pixels themselves and not the optotype they'd compose from a distance. This shows your clear limit in understanding a definition.

Do you know what analogies are at all? Do you have any imagination? Or are you the kind of person who gets a product and, if it doesn't say you can do X with it, never tries?

That has nothing to do with "Retina" display, and yet you don't realize it.
This shows your level of understanding anything, even from a written text. :rolleyes:
But I understand: you must be surprised that almost everything you try to express is a big fail.

And how do they measure refractive errors?

http://en.wikipedia.org/wiki/Shack–Hartmann_wavefront_sensor
Read the page of the project you linked and you'll find out. :rolleyes:
"The subject looks into this display at a very close range and aligns (overlaps) displayed patterns".
Just to let you understand this: that doesn't involve optotypes. :eek:

The fact is, if those display screens didn't have as high a ppi as they do, they couldn't be used to measure or fix your eyesight.

See reply above.

Why do they need such a high ppi?
Because of the 1 arc minute someone with 20/20 vision can resolve.

No, that's what the marketing term has taught you. In order to define 20/20 vision, you need a precise way to test it. And it has nothing to do with pixels, but with optotypes, so that's false and can't relate to what you're claiming.


Come on, now try to create new and original statements and say I've written them, uh. :rolleyes:
 

You're very hung up on the lines.

Can the eye resolve at 1 minute of arc or can't it?
Is that resolution only in the horizontal or vertical plane?

Snellen took the eye resolving 1 minute of arc in both planes as perfect vision; to test that theory he made optotypes of letters in which the gap between black and white, and the width of each, subtend 1 minute of arc at the 20/20 distance.

Look at the Landolt C: if you can only resolve lines at 1 minute of arc, how can you tell where the 1-arc-minute opening is on the C?

thus the optotype can be recognized only if the person viewing it can discriminate a spatial pattern separated by a visual angle of 1 minute of arc (one element of the grid).

One element of the grid: not a line, but one element of the grid.
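For reference, the optotype geometry described above is straightforward to compute: a 20/20 letter is built on a 5x5 grid whose strokes and gaps each subtend 1 arc minute, so the whole letter subtends 5 arc minutes. A sketch of the numbers at the standard 6 m test distance (assuming the usual 5x5 construction):

```python
import math

def subtended_size(distance_mm, arc_minutes):
    """Physical size subtended by a given visual angle at a given distance."""
    return distance_mm * math.tan(math.radians(arc_minutes / 60.0))

print(subtended_size(6000, 1))  # ~1.75 mm: one stroke or gap of a 20/20 letter
print(subtended_size(6000, 5))  # ~8.73 mm: full height of a 20/20 letter
```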

You're confusing Visual Acuity with this perhaps?

Vernier acuity measures the ability to align two line segments. Humans can do this with remarkable accuracy, it is a hyperacuity (scientific term). Under optimal conditions of good illumination, high contrast, and long line segments, the limit to vernier acuity is about 8 arc seconds or 0.13 arc minutes, compared to about 0.6 arc minutes (20/12) for normal visual acuity or the 0.4 arc minute diameter of a foveal cone. Because the limit of vernier acuity is well below that imposed on regular visual acuity by the "retinal grain" or size of the foveal cones, it is thought to be a process of the visual cortex rather than the retina. Supporting this idea, vernier acuity seems to correspond very closely (and may have the same underlying mechanism) enabling one to discern very slight differences in the orientations of two lines, where orientation is known to be processed in the visual cortex.

Now that has everything to do with lines. That also says the diameter of a foveal cone is 0.4 arc minutes, not less.
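Those angular limits map directly onto Snellen notation: "20/X" means the finest resolvable detail subtends X/20 arc minutes at the test distance, so the 0.6' figure in the quoted passage is exactly 20/12. A minimal conversion sketch:

```python
def snellen_from_arcmin(arcmin):
    """Snellen fraction for a minimum resolvable detail of `arcmin` arc minutes."""
    return f"20/{20 * arcmin:g}"

print(snellen_from_arcmin(1.0))   # 20/20, standard acuity
print(snellen_from_arcmin(0.6))   # 20/12, the normal-acuity figure quoted above
print(snellen_from_arcmin(0.13))  # 20/2.6, vernier hyperacuity, not true resolution
```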
 
Can the eye resolve at 1 minute of arc or can't it?

The answer is: it depends.
In the case of 20/20 vision, with well defined light conditions and with well defined optotypes (composed not by spaced entities) at a well defined distance, no.
Snellen's acuity proves valid only in those well defined conditions.
It doesn't absolutely state one with 20/20 vision can't resolve lines (and not pixels, by definition) at one minute of arc. :rolleyes:

Go out at night, and I bet you can't resolve some lines separated by 20 arc minutes.
Set up proper lighting and perhaps your eyes will distinguish lines separated by 0.4 arc minutes.

You see, there are a lot of conditions, and Snellen's acuity applies only to optotypes within certain conditions. Everything else is speculation, and you have to prove its validity, that is to say its correlation with Snellen's acuity. :rolleyes:

Snellen took the eye resolving 1 minute of arc in both planes as perfect vision; to test that theory he made optotypes of letters in which the gap between black and white, and the width of each, subtend 1 minute of arc at the 20/20 distance.

That's not correct. In order to replicate the same result you need very well defined conditions to work in, otherwise it becomes totally non-scientific. :rolleyes:
Snellen has defined how to measure acuity, and obviously there's no such thing as an "absolute acuity". Acuity changes even with psychological events, I've read. :rolleyes:

Look at the Landolt C: if you can only resolve lines at 1 minute of arc, how can you tell where the 1-arc-minute opening is on the C?

That doesn't contradict anything I said.
It's a continuous line, it's an optotype, and acuity can be determined with that if the patient is placed in well defined conditions. :rolleyes:

One element of the grid: not a line, but one element of the grid.

You're confusing the whole story: even in a 5x5 regular optotype you see empty elements of the grid, and your ability to distinguish them from the lines tells something about your visual acuity.
Instead, that proves how wrong you are. All you can see is a line, and not separated entities. :rolleyes:

You're confusing Visual Acuity with this perhaps?



Now that has everything to do with lines. that also says the diameter of a foveal cone is only 0.4 arc minutes, not less

Not really. I knew that definition exists, but I'm actually referring to visual acuity.
 