So I am forcing myself to consider the iPhone X because my current phone lacks AR support. ARKit really delivers some amazing experiences on the iPhone. Just today, while looking at the Pixel 2, I stumbled across Google's ARCore, along with Project Tango, which compete directly with Apple's ARKit.
ARCore
https://developers.google.com/ar/
Project Tango
https://get.google.com/tango/
Project Tango-equipped phones have a depth sensor on the rear that gives the phone a sense of depth, something not normally possible with just a camera, to create more immersive experiences for the user.
Not knowing much about this, does anyone know whether the augmented reality in ARKit delivers as much as, if not more than, what Google is providing on Android?
The big takeaway I got was that a "depth sensor in the rear of the phone" helps create more accurate AR. I assume the iPhone does not have this, or is there some way the camera, or the combination of dual lenses, helps the iPhone detect depth? If not... I am making the prediction right now that next year's iPhone X successor is going to include a depth sensor on the back of the device.
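For reference, here is a minimal sketch of what an ARKit plane-detection setup looks like in Swift (the class and property names are just illustrative, not from any real project). As far as I understand, ARKit's world tracking works from the single rear camera plus the motion sensors, with no rear depth sensor involved, which is part of why I'm wondering how much a dedicated depth sensor would actually add.

```swift
import UIKit
import SceneKit
import ARKit

// Minimal sketch: plane detection on a standard iPhone using ARKit.
// World tracking fuses the rear camera feed with the motion sensors;
// no dedicated rear depth sensor is required for this.
class ARPlaneViewController: UIViewController, ARSCNViewDelegate {

    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking with horizontal plane detection (camera + IMU only).
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // Called whenever ARKit detects and anchors a new plane in the scene.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected plane with extent: \(planeAnchor.extent)")
    }
}
```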