OK, so if the new iPhone has a magnetometer that can precisely record the direction and angle the phone is being held, and you combine that with GPS, could Apple and Google have some new feature cooked up where you take pictures of things and "flesh out" Google Maps or Google Earth?
For example, say you are in Street View. Right now you only have the pictures taken by the cars. But with everyone taking photos and submitting them to Google through some application, Street View could be enhanced: the angle and location information could be used to paint the photographs onto objects, turning Street View into a kind of "walk-view."
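Just to make the idea concrete, here's a rough sketch of the kind of data an iPhone app could capture alongside each photo, using Core Location for GPS and compass heading and Core Motion for tilt. The class and method names are mine for illustration; there's no actual Apple or Google submission API behind this.

```swift
import CoreLocation
import CoreMotion

// Sketch: gather the position/orientation data a crowdsourced "walk-view"
// photo would need so a server could project it onto street-view geometry.
final class PhotoTagger: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private let motionManager = CMMotionManager()

    private var lastLocation: CLLocation?
    private var lastHeading: CLHeading?

    func start() {
        locationManager.delegate = self
        locationManager.desiredAccuracy = kCLLocationAccuracyBest
        locationManager.requestWhenInUseAuthorization()
        locationManager.startUpdatingLocation()
        locationManager.startUpdatingHeading()   // magnetometer-based compass heading
        motionManager.startDeviceMotionUpdates() // pitch/roll so we know the camera tilt
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        lastLocation = locations.last
    }

    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        lastHeading = newHeading
    }

    /// Bundle what a server would need to place the photo: where it was taken,
    /// which way the camera was pointing, and how far it was tilted up or down.
    func currentTag() -> (latitude: Double, longitude: Double,
                          headingDegrees: Double, pitchDegrees: Double)? {
        guard let loc = lastLocation,
              let heading = lastHeading,
              let motion = motionManager.deviceMotion else { return nil }
        return (latitude: loc.coordinate.latitude,
                longitude: loc.coordinate.longitude,
                headingDegrees: heading.trueHeading,              // direction the phone faces
                pitchDegrees: motion.attitude.pitch * 180 / .pi)  // tilt from device motion
    }
}
```

With that tag attached to each submitted photo, the server side would in principle have enough information to figure out which building face or object the picture shows and texture it onto the existing Street View models.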
Anyone have thoughts on this idea?