Maximum for seeing the individual pixels, which would be the equivalent of being able to read the bottom line of the visual acuity chart.
No, it wouldn't. An eye chart has nothing to do with the resolving limits of the human eye. Eye charts measure your ability to read text of a given size at a given distance; it's a test of SHARPNESS, or your eye's focus. Resolving power would be your ability to distinguish two dots from each other on that chart, and that threshold is FAR, FAR smaller than the smallest line on the chart, assuming your eyes are normal.

In other words, visual acuity is not the same thing as resolving power. Resolution has everything to do with resolving power and nothing to do with your ability to focus on an object that is going to be just a foot or two in front of you (i.e. well within most people's ability to see without corrective lenses, and thus applicable to anyone who can correct to sharp vision). The point is that visual acuity varies quite a lot from person to person (even a change in your sinus pressure can affect it quite a bit), but resolving power is pretty much a constant for every human being that can see.
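To put rough numbers on that distinction, here is a minimal back-of-the-envelope sketch in Python. The ~1 arcminute two-point limit, the 2-foot viewing distance, and the 5 arcminute height of a 20/20 chart letter are textbook figures used purely as illustrative assumptions; real thresholds vary with contrast and lighting.

```python
import math

def min_resolvable_separation(distance_mm, limit_arcmin=1.0):
    """Physical separation two dots need at a given viewing distance
    for an eye with the given angular limit to tell them apart
    (small-angle approximation)."""
    limit_rad = math.radians(limit_arcmin / 60.0)
    return distance_mm * limit_rad  # millimetres

# Assumed viewing distance of roughly 2 feet (610 mm), typical for a phone or monitor.
d = 610.0
sep = min_resolvable_separation(d)   # two-dot limit, ~0.18 mm
ppi = 25.4 / sep                     # pixel density at which adjacent pixels merge
print(f"At {d:.0f} mm: two-dot limit ~ {sep:.2f} mm (~ {ppi:.0f} PPI)")

# For comparison, a 20/20 chart letter is defined to subtend 5 arcminutes
# overall, so the letter you "read" is a much larger target than two dots
# at the resolving limit.
letter = min_resolvable_separation(d, limit_arcmin=5.0)
print(f"20/20 letter height at {d:.0f} mm ~ {letter:.2f} mm")
```

The point of the arithmetic is only that reading a whole letter and separating two adjacent dots are measured against very different target sizes, which is the acuity-versus-resolving-power distinction being made here.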
But even if I can't read the bottom line at some given distance (i.e. tell the individual pixels apart), I can still see that the indiscernible blobs are different from one another. And I can do that for quite a bit longer after I lose the ability to tell exactly what they are.
Once again, you seem to have confused resolving power with visual acuity. They are not the same thing. You are describing resolving power and again it would not be the bottom line on the chart. Not even close. It would be MUCH smaller.
Then there is also the fact that you don't actually see in pixels, but your eyes and brain effectively oversample everything using minute movements of the eye, which would make you able to "resolve" detail far, far beyond that mathematically calculated maximum. Or are you trying to tell me that you can't see the difference in "sharpness" between a picture of an object on a 5" FullHD display versus the actual object itself?
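For a sense of the numbers behind that claim, here's a quick sketch. The 300 mm viewing distance and the ~5 arcsecond vernier (alignment) threshold are assumptions chosen only for illustration; published hyperacuity figures vary with the task.

```python
import math

def pixel_angle_arcmin(diag_in, res_x, res_y, distance_mm):
    """Angular size of one pixel, in arcminutes, for a display of the
    given diagonal and resolution viewed at the given distance."""
    diag_px = math.hypot(res_x, res_y)
    pitch_mm = diag_in * 25.4 / diag_px                 # physical pixel pitch
    return math.degrees(pitch_mm / distance_mm) * 60.0  # small-angle approx.

# Assumed: 5" FullHD (1920x1080) phone held at about 300 mm.
angle = pixel_angle_arcmin(5.0, 1920, 1080, 300.0)
print(f"One pixel subtends ~ {angle:.2f} arcmin")  # roughly 0.66 arcmin

# Vernier acuity is often quoted in the single-arcsecond range, i.e. well
# below the classic ~1 arcminute two-point limit; that gap is the
# "oversampling" effect described above.
vernier_arcmin = 5.0 / 60.0  # assumed 5 arcsecond threshold
print(f"Pixel angle / vernier threshold ~ {angle / vernier_arcmin:.0f}x")
```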
I'm telling you that a two-dimensional IMAGE of a three-dimensional object does NOT contain the spatial data needed to accurately represent the object, no matter how high the resolution of the image might be. You can't compare a two-dimensional representation to a three-dimensional object; it's like saying a circle in a flat, two-dimensional world is the same as a sphere. It's not. What you think you're seeing is merely PERSPECTIVE. It's a visual illusion of three dimensions in two-dimensional space, and it works because the human eye actually only sees two dimensions at any given moment. The brain combines the output of two eyes at slightly different angles to create three-dimensional cues as to distance. There are monocular and binocular cues, but ultimately you are seeing two dimensions, not three. This is why the brain can be easily fooled into thinking it's seeing three-dimensional objects on two-dimensional surfaces, or even in 3D movies, by recreating at least SOME of the binocular (as in 3D movies) or monocular (as in perspective on a sheet of paper) cues, so the brain thinks it's seeing something that it's not actually seeing.
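To make the binocular point concrete, here's a minimal numeric sketch of the vergence cue. The 63 mm inter-pupillary distance and the example distances are assumptions for illustration only.

```python
import math

IPD_MM = 63.0  # assumed average inter-pupillary distance

def vergence_angle_deg(distance_mm):
    """Angle between the two eyes' lines of sight when both fixate a
    point at the given distance (a primary binocular depth cue)."""
    return 2.0 * math.degrees(math.atan(IPD_MM / (2.0 * distance_mm)))

# A real scene: a marble at 0.5 m in front of a wall at 2 m.
marble = vergence_angle_deg(500.0)
wall = vergence_angle_deg(2000.0)
print(f"Real scene: marble {marble:.2f} deg, wall {wall:.2f} deg "
      f"(difference {marble - wall:.2f} deg)")

# A photo of the same scene on a screen at 0.3 m: every point in the
# image, marble and wall alike, sits at the screen's distance, so the
# binocular cue reports one flat depth no matter how sharp the image is.
screen = vergence_angle_deg(300.0)
print(f"Photo on screen: everything at {screen:.2f} deg -> no depth contrast")
```

The contrast between the two printouts is the "spatial data" argument in numeric form: the flat image gives the two eyes nothing that distinguishes foreground from background, regardless of its resolution.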
Thus, while a hologram of sufficient quality (e.g. that recent Michael Jackson hologram performance) CAN be mistaken for reality, few would ever mistake a 3D movie for reality, since it only contains SOME of the information and begins to fall apart as soon as you move your head: the lack of any change in perspective immediately tells your brain you're looking at an image, even if it appears to be three-dimensional. A hologram changes perspective relative to your own movement and thus can fool the brain more completely. The RESOLUTION of the hologram isn't as important as the spatial data. In other words, a low-resolution hologram might not look "sharp", but it will still appear to be REAL; it would simply look like a blurry or even blocky object sitting in real space. It's the spatial data that truly tricks your brain into believing it's seeing a 3D object, not the resolution of the image itself.
So no, even a 10,000 DPI iPhone displaying a picture of a marble could never be confused with an actual marble, since it lacks the spatial data needed to appear to be a real 3D object. A hologram of a marble COULD fool you quite easily. A low-resolution hologram would simply appear to be an opaque or blurry marble; it might not accurately represent the real marble it was taken from due to the blur, but it would still appear to be a real object, just not the exact object in question. Obviously, the brain might deduce that a low-resolution hologram isn't real, in that people aren't normally blocky looking, but it would still be confusing to the brain since it would still appear to actually exist (assuming fully accurate color, full spatial data, etc., not those cheap toy holograms).