This thread is about a completely speculative yet entirely possible feature for a future iPhone. If you're not familiar with augmented reality (AR), think of it as virtual imagery overlaid onto a live view of the real world in real time.
A simpler explanation is an example. Imagine you're standing at an empty table. You point your iPhone's camera at the table, and on the display you see an apple sitting in its center. As you walk around the table with the camera aimed at the center, you see the apple from every side, as a full three-dimensional object. Looking through the display, you'd swear it was really there, yet the moment you lower the phone, the table is obviously empty. That's the concept of AR.
For more info, just Google augmented reality and watch a few videos. It's impressive in action.
Some demos have already been created to show off AR on the iPhone. These use printed markers: the camera analyzes the image on a paper marker and compares it against an internal library of marker patterns, then draws a three-dimensional object on top of or around the marker, positioned relative to where the marker sits in the camera's view.
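For the curious, here's roughly what that marker loop could look like. This is only a sketch in Swift under my own assumptions; `Marker`, `matchMarker`, and `drawModel` are invented stand-ins for the image-analysis and rendering pieces, not real iPhone APIs.

```swift
import simd

// Invented types for the sketch: a stored marker pattern and the pose
// (position + orientation) of that marker relative to the camera.
struct Marker {
    let id: Int
    let pattern: [UInt8] // reference image of the printed marker
}

struct MarkerPose {
    let transform: simd_float4x4
}

// Compare the current camera frame against the library of known markers.
// A real implementation would do corner detection and template matching here.
func matchMarker(in frame: [UInt8], library: [Marker]) -> (Marker, MarkerPose)? {
    // ... image analysis omitted ...
    return nil
}

func renderFrame(_ frame: [UInt8], library: [Marker]) {
    guard let (marker, pose) = matchMarker(in: frame, library: library) else {
        return // no marker visible: just show the plain camera feed
    }
    // Anchor the 3-D model to the marker's pose so the object stays
    // fixed to the paper as you move the camera around it.
    drawModel(markerID: marker.id, at: pose.transform)
}

func drawModel(markerID: Int, at transform: simd_float4x4) {
    // rendering stub
}
```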
This is more of a fun demo than a useful application. The next step would be a large-scale outdoor implementation that uses GPS data instead of paper markers.
The first step toward this would be a built-in digital compass in the iPhone. Imagine this:
You enter directions on your iPhone. In the Maps app there's a new button, something like "Show Destination Marker in Camera." When you tap it, the Camera app opens. Arrows at the left or right edge of the screen show which way to pan the camera to bring the destination into view, and through the display you see a marker floating in the air above the destination, off in the distance.
This can all be done with GPS and a compass. GPS provides your coordinates, and the compass tells the phone which direction the camera is facing. Combined with the destination's coordinates from Maps, that's enough to compute the bearing to the destination and decide where in the frame to draw the marker.
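To make that concrete, here's a rough Swift sketch of the math, assuming we already have our own coordinates, the destination's coordinates from Maps, and a compass heading in degrees from true north. The function names and the 60° field-of-view figure are placeholders of mine, not anything Apple has announced.

```swift
import Foundation

func radians(_ deg: Double) -> Double { deg * .pi / 180 }
func degrees(_ rad: Double) -> Double { rad * 180 / .pi }

// Initial great-circle bearing from our position to the destination,
// in degrees clockwise from true north.
func bearing(fromLat lat1: Double, lon lon1: Double,
             toLat lat2: Double, lon lon2: Double) -> Double {
    let p1 = radians(lat1), p2 = radians(lat2)
    let dLon = radians(lon2 - lon1)
    let y = sin(dLon) * cos(p2)
    let x = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dLon)
    return (degrees(atan2(y, x)) + 360).truncatingRemainder(dividingBy: 360)
}

// Decide which arrow to show, given the compass heading and the camera's
// horizontal field of view (roughly 60 degrees on a phone camera).
func arrowDirection(heading: Double, bearingToDest: Double,
                    fieldOfView: Double = 60) -> String {
    // normalize the difference to -180...180 so we turn the short way
    let diff = (bearingToDest - heading + 540)
        .truncatingRemainder(dividingBy: 360) - 180
    if abs(diff) <= fieldOfView / 2 { return "marker on screen" }
    return diff > 0 ? "turn right ->" : "<- turn left"
}
```

If you're facing 350° and the destination bears 10°, the difference is only 20°, so the marker lands on screen; facing 180°, you'd be told to turn left.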
Beyond that, we'd need LIDAR and a stereo camera to judge distance, so augmented images can be drawn in proper relation to their real-world surroundings. Depth sensing wouldn't measure the distance to the destination itself; GPS handles that. Instead, it would measure the distance to objects standing between you and the destination.
Imagine this in a close-range scenario:
You're at one end of a small park and you place a destination marker at the other end. Let's say the marker is just a big red ball, about 10 feet tall if you were standing right next to it. First, the GPS would have to be precise enough to place the marker not only by latitude and longitude but also by altitude, so it sits at ground level relative to where you're standing. The accelerometer would also need to be sensitive enough to compensate for the tilt of your iPhone. Assuming it can, you'd see the marker as a small globe on your display, fixed at a point in the distance.
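To give a sense of what "precise enough" means, here's one way the phone could turn the two GPS fixes into a local position for the marker. This is a flat-earth approximation in Swift, fine over park-sized distances; `GeoPoint`, `localOffset`, and the constants are my own illustrative choices.

```swift
import Foundation

struct GeoPoint {
    let latitude: Double  // degrees
    let longitude: Double // degrees
    let altitude: Double  // meters above sea level
}

// Offset of `target` from `origin` in meters: east, north, and up.
// One degree of latitude is ~111,320 m; longitude shrinks with cos(latitude).
func localOffset(from origin: GeoPoint, to target: GeoPoint)
    -> (east: Double, north: Double, up: Double) {
    let metersPerDegLat = 111_320.0
    let metersPerDegLon = metersPerDegLat * cos(origin.latitude * .pi / 180)
    return (
        east:  (target.longitude - origin.longitude) * metersPerDegLon,
        north: (target.latitude  - origin.latitude)  * metersPerDegLat,
        up:    target.altitude - origin.altitude // keeps the ball at ground level
    )
}
```

The tilt reading from the accelerometer would then rotate this east/north/up offset into camera space before the marker is projected onto the screen.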
The problem is that this marker would be drawn over the trees in the foreground, so your only distance cue would be the ball growing larger on screen as you approach it. To remedy this, a combination of LIDAR and a stereo camera could measure the depth of objects in the foreground and let anything closer than the marker occlude it, so the ball would appear partially hidden behind the trees. Looking through the camera display, it would then really seem as if a big red ball were standing at the other end of the park.
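The occlusion step itself is easy to express if you assume the depth hardware hands you a per-pixel depth map. A minimal sketch, with all names hypothetical:

```swift
// For each pixel, draw the ball only where the measured real-world depth
// is farther away than the marker, so foreground trees hide parts of it.
func visibleMarkerPixels(depthMap: [Float],   // measured depth per pixel, meters
                         markerMask: [Bool],  // true where the ball would be drawn
                         markerDistance: Float) -> [Bool] {
    precondition(depthMap.count == markerMask.count)
    return zip(depthMap, markerMask).map { realDepth, ballPixel in
        ballPixel && realDepth > markerDistance
    }
}
```

This is essentially the same depth test a 3-D renderer already performs between virtual objects, just fed with measured depths instead of computed ones.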
I'm interested in hearing other people's thoughts and ideas about this topic.