Anyway, hopefully Apple engineers will know better, until then, it's just a waste of time :)

Yeah apple engineers are the greatest geniuses ever born. They can achieve god like innovative capability. Without them the world would have evolved back into the stone age.
 
Yep, pretty much...

I hope these geniuses come out with more super ideas like centralized settings, crippled email, one big oversized home button, copying others' work and claiming it as their own, or wiping your data when you uninstall apps or change accounts.
 
Yeah apple engineers are the greatest geniuses ever born. They can achieve god like innovative capability. Without them the world would have evolved back into the stone age.

The word you're looking for is "devolved".
 
Anyone complaining about no 1080p is just a spec-whore.

Seriously? There is a lot more to 1080p than just specs. For one, you can watch native 1080p content without it being upscaled or downscaled, which spares the GPU and thus saves battery life. Oh, never mind, I guess you want bad battery life.
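To put rough numbers on the scaling point, here is a minimal Python sketch: only a panel with the same 1920x1080 grid shows 1080p video 1:1, anything else has to be resampled up or down. The panel resolutions below are assumed purely for illustration, not taken from the thread.

VIDEO = (1920, 1080)  # 1080p source

# Example panel resolutions (assumptions for illustration only)
panels = {
    "1136x640": (1136, 640),
    "1334x750": (1334, 750),
    "1920x1080": (1920, 1080),
}

def fit_scale(video, panel):
    """Largest scale factor that fits the video inside the panel, preserving aspect ratio."""
    return min(panel[0] / video[0], panel[1] / video[1])

for name, panel in panels.items():
    s = fit_scale(VIDEO, panel)
    verdict = "native, no resampling" if abs(s - 1.0) < 1e-9 else "resampled"
    print(f"{name} panel: scale {s:.3f} ({verdict})")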
 
So you look at your phone 4 inches from your eye?

No? Or have you seen me asking for 2000+ dpi displays?

Not the maximum? Isn't that why it's called the maximum resolving distance for a given resolution?

Maximum for seeing the individual pixels. Which would be the equivalent of being able to read the bottom line of the visual acuity chart. But even if I can't read the bottom line at some given distance (i.e. tell the individual pixels apart), I can still see that the indiscernible blobs are different from one another. And I can do that for quite a bit longer after I lose the ability to tell exactly what they are.

Then there is also the fact that you don't actually see in pixels, but your eyes and brain effectively oversample everything using minute movements of the eye, which would make you able to "resolve" detail far, far beyond that mathematically calculated maximum. Or are you trying to tell me that you can't see the difference in "sharpness" between a picture of an object on a 5" FullHD display versus the actual object itself?
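For reference, the "maximum resolving distance" figure both sides keep invoking comes from a very simple model: assume roughly one arcminute of angular resolution for 20/20 vision and work out the pixel density at which one pixel subtends that angle. A minimal Python sketch of that calculation (the model itself, not an endorsement of it as a hard ceiling, which is exactly what the oversampling argument above disputes):

import math

def ppi_limit(distance_in, arcmin=1.0):
    """Pixel density at which one pixel subtends `arcmin` arcminutes at `distance_in` inches."""
    pixel_size_in = 2 * distance_in * math.tan(math.radians(arcmin / 60.0) / 2)
    return 1.0 / pixel_size_in

for d in (10, 12, 18):  # viewing distances in inches
    print(f'{d}" viewing distance -> ~{ppi_limit(d):.0f} PPI')

At 12 inches this comes out to roughly 290 PPI, which is where the familiar "300 DPI" figure comes from.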

----------

I hope these geniuses come out with more super ideas like centralized settings.

I especially like the centralized settings, when some of the settings are still done in the individual apps.
 
Maximum for seeing the individual pixels. Which would be the equivalent of being able to read the bottom line of the visual acuity chart.

No, it wouldn't. An eye chart has nothing to do with the resolving limits of the human eye. Eye charts measure your ability to read text of a given size at a given distance. It's a test of SHARPNESS, or your eye's focal point. Resolving power would be your ability to distinguish two dots from each other on that chart, and that would be FAR FAR smaller than the smallest line on the chart, assuming your eyes are normal. In other words, visual acuity is not the same thing as resolving power. Resolution has everything to do with resolving power and nothing to do with your ability to focus on an object that is going to be just a foot or two in front of you (i.e. well within most people's ability to see without corrective lenses, and thus applicable to anyone who can correct to sharp vision). The point being, visual acuity varies by the individual quite a lot (even a change in your sinus pressure can affect it quite a bit), but resolving power is pretty much a constant for all human beings that can see.
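For scale, a minimal sketch of the sizes involved, using textbook-ish angular benchmarks brought in for illustration (approximations, not figures from the thread): a standard 20/20 Snellen letter subtends about 5 arcminutes overall with about 1 arcminute of stroke detail, while Vernier-type hyperacuity, the finest positional discrimination usually cited, is on the order of a few arcseconds. At a 300 mm viewing distance those angles correspond to roughly the following physical sizes.

import math

def size_mm(arc_seconds, distance_mm=300.0):
    """Physical size subtended by a given angle at a given viewing distance."""
    return distance_mm * math.tan(math.radians(arc_seconds / 3600.0))

# Approximate angular benchmarks (assumed values for illustration)
benchmarks = {
    "20/20 Snellen letter height (~5 arcmin)": 300,
    "20/20 stroke/gap detail (~1 arcmin)": 60,
    "Vernier hyperacuity (~5 arcsec)": 5,
}

for label, arcsec in benchmarks.items():
    print(f"{label}: ~{size_mm(arcsec) * 1000:.0f} micrometres at 300 mm")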

But even if I can't read the bottom line at some given distance (i.e. tell the individual pixels apart), I can still see that the indiscernible blobs are different from one another. And I can do that for quite a bit longer after I lose the ability to tell exactly what they are.

Once again, you seem to have confused resolving power with visual acuity. They are not the same thing. You are describing resolving power and again it would not be the bottom line on the chart. Not even close. It would be MUCH smaller.

Then there is also the fact that you don't actually see in pixels, but your eyes and brain effectively oversample everything using minute movements of the eye, which would make you able to "resolve" detail far, far beyond that mathematically calculated maximum. Or are you trying to tell me that you can't see the difference in "sharpness" between a picture of an object on a 5" FullHD display versus the actual object itself?

I'm telling you that a two dimensional IMAGE of a three dimensional object does NOT contain the spatial data needed to accurately represent the object, no matter how high the resolution of the image might be. You can't compare a two dimensional representation to a three dimensional object. It's like saying a circle in Flatland is the same as a sphere. It's not. What you think you're seeing is merely PERSPECTIVE. It's a visual illusion of three dimensions in two dimensional space, and this occurs because the human eye actually only sees two dimensions at any given moment. The brain combines the output of two eyes at slightly different angles to create three dimensional cues as to distance. There are monocular and binocular cues. But ultimately you are seeing two dimensions, not three. This is why the brain can be easily fooled into thinking it's seeing three dimensional objects on two dimensional surfaces, or even in 3D movies, which recreate at least SOME of the binocular (as in 3D movies) or monocular (as in perspective on a sheet of paper) cues, so the brain thinks it's seeing something that it's not actually seeing.
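A minimal sketch of the binocular-cue point (the interpupillary distance and the distances below are assumed values for illustration): a real object sitting in front of a screen is converged on at a measurably different angle by the two eyes than the screen itself, whereas a picture of that object lies in the plane of the screen and produces no such difference.

import math

IPD_MM = 63.0  # assumed interpupillary distance

def vergence_deg(distance_mm, ipd_mm=IPD_MM):
    """Angle between the two eyes' lines of sight when converged on a point at distance_mm."""
    return math.degrees(2 * math.atan((ipd_mm / 2) / distance_mm))

screen_mm, marble_mm = 300.0, 280.0  # screen at 300 mm, real marble 20 mm in front of it

print(f"vergence on the screen:        {vergence_deg(screen_mm):.2f} deg")
print(f"vergence on a real marble:     {vergence_deg(marble_mm):.2f} deg")
print(f"difference (a real depth cue): {vergence_deg(marble_mm) - vergence_deg(screen_mm):.2f} deg")
print("a picture of the marble on the screen: 0.00 deg difference (it lies in the screen plane)")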

Thus, while a hologram of sufficient quality (e.g. that Michael Jackson hologram performance recently) CAN be mistaken for reality, few would ever mistake a 3D movie for reality, since it only contains SOME of the information and begins to fall apart as soon as you move your head: the lack of any change in perspective immediately tells your brain you're looking at an image, even if it appears to be three dimensional. A hologram changes perspective data relative to your own movement and thus can fool the brain more completely. The RESOLUTION of the hologram isn't as important as the spatial data. In other words, a low resolution hologram might not look "sharp", but it will still appear to be REAL. It would simply appear to be a blurry or even a blocky object sitting in real space, but it would still appear to be real. Thus, it's the spatial data that truly tricks your brain into believing it's seeing a 3D object, not the resolution of the image itself.

So no, even a 10,000 DPI iPhone displaying a picture of a marble could never be confused with an actual marble, since it lacks the spatial data needed to appear to be a real 3D object. A hologram of a marble COULD fool you quite easily. A low resolution hologram would simply appear to be an opaque or blurry marble. It might not accurately represent the real marble it was taken from due to the blur, but it would still appear to be a real object, just not the exact object in question. Obviously, the brain might deduce that a low resolution hologram isn't real (people aren't normally blocky looking), but it would still be confusing to the brain, since it would still appear to actually exist (assuming full accurate color and full spatial data, etc., not those cheap toy holograms).
 
Here we go: http://onlinelibrary.wiley.com/doi/10.1002/jsid.186/abstract

There has been a rapid increase in the resolution of small-sized and medium-sized displays. This study determines an upper discernible limit for display resolution. A range of resolutions varying from 254–1016 PPI were evaluated using simulated display by 49 subjects at 300 mm viewing distance. The results of the study conclusively show that users can discriminate between 339 and 508 PPI and in many cases between 508 and 1016 PPI.

Now, can we please stop spouting that "no one can see beyond 300 DPI"-nonsense already?
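Converting the study's density levels into angles at its 300 mm viewing distance makes the comparison with the usual one-arcminute "20/20" yardstick explicit. A minimal sketch (the PPI values are the ones quoted in the abstract above; the conversion is just geometry):

import math

VIEWING_MM = 300.0  # viewing distance used in the study

def pixel_pitch_arcmin(ppi, distance_mm=VIEWING_MM):
    """Angle subtended by one pixel, in arcminutes."""
    pitch_mm = 25.4 / ppi
    return math.degrees(math.atan(pitch_mm / distance_mm)) * 60

for ppi in (254, 339, 508, 1016):  # densities evaluated in the study
    print(f"{ppi:>4} PPI -> {pixel_pitch_arcmin(ppi):.2f} arcmin per pixel")

Even at 508 PPI a pixel still subtends roughly half an arcminute, so the reported discrimination is at least not obviously at odds with the geometry.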
 
Here we go: http://onlinelibrary.wiley.com/doi/10.1002/jsid.186/abstract

Now, can we please stop spouting that "no one can see beyond 300 DPI"-nonsense already?

Being able to discern the difference and having it matter at all to the user are completely different things. I think people probably know they're speaking colloquially when they use "no one" here. Some people can hear the difference between a 24-bit and a 32-bit recording. I'm still not going to re-buy all of my music to get that tiny perceived difference.
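For what the audio analogy is worth, the gap being shrugged off can be roughly quantified with the standard ~6 dB-per-bit rule of thumb for linear PCM (a minimal sketch, not a claim about any particular recording):

def dynamic_range_db(bits):
    """Theoretical dynamic range of linear PCM, roughly 6.02 dB per bit."""
    return 6.02 * bits

for bits in (16, 24, 32):
    print(f"{bits}-bit PCM: ~{dynamic_range_db(bits):.0f} dB theoretical dynamic range")

Both of the larger figures sit far beyond what typical playback chains and listening rooms can reproduce, which is roughly the point being made.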
 