Apple today announced ARKit 4 alongside iOS 14 and iPadOS 14. The new version of ARKit introduces Location Anchors, a new Depth API, and improved face tracking.
Location Anchors allow developers to place AR experiences, such as life-size art installations or navigational directions, at fixed destinations. Location Anchoring leverages the higher-resolution data in Apple Maps to pin AR experiences to specific points in the world, such as throughout cities or alongside famous landmarks. Users can move around virtual objects and view them from different perspectives, exactly as real objects are seen through a camera lens.
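For developers curious what this looks like in code, below is a minimal sketch using ARKit's geotracking APIs (ARGeoTrackingConfiguration and ARGeoAnchor). The coordinate and function name are illustrative placeholders, not part of Apple's announcement, and geotracking availability varies by device and region:

```swift
import ARKit
import CoreLocation

// Sketch: anchoring AR content to a real-world coordinate with ARKit 4 geotracking.
func startGeoTracking(in session: ARSession) {
    // Geotracking is only available on supported devices and in supported regions.
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geotracking unavailable: \(error?.localizedDescription ?? "unsupported")")
            return
        }
        let configuration = ARGeoTrackingConfiguration()
        session.run(configuration)

        // Place an anchor at a fixed latitude/longitude (placeholder coordinate);
        // ARKit resolves it against Apple Maps data so the content stays put in the world.
        let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3936)
        let geoAnchor = ARGeoAnchor(coordinate: coordinate)
        session.add(anchor: geoAnchor)
    }
}
```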
ARKit 4 also takes advantage of the iPad Pro's LiDAR Scanner with a brand-new Depth API, creating a new way to access detailed per-pixel depth information with advanced scene understanding. When combined with 3D mesh data, this depth information makes virtual object occlusion more realistic by enabling instant placement of virtual objects within their physical surroundings. This opens up new capabilities for developers, such as taking more precise measurements and applying effects to the environment.
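In practice, apps opt into scene depth through a frame semantic and then read an ARDepthData value from each frame. The following sketch assumes a LiDAR-equipped device such as the 2020 iPad Pro; the class and method names other than the ARKit APIs themselves are hypothetical:

```swift
import ARKit

// Sketch: enabling ARKit 4 scene depth and reading per-pixel depth from each frame.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires the LiDAR Scanner.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        // Scene reconstruction supplies the 3D mesh the depth data can be combined with.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap          // per-pixel depth, in meters
        let confidence: CVPixelBuffer? = depth.confidenceMap  // per-pixel confidence level
        _ = (depthMap, confidence)
    }
}
```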
Finally, face tracking is expanded in ARKit 4 to support the front-facing camera on all devices with the A12 Bionic chip or newer. Up to three faces can now be tracked at once using the TrueDepth camera, powering front-facing camera experiences like Memoji and Snapchat.
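A minimal sketch of multi-face tracking, assuming a device that supports ARFaceTrackingConfiguration (the function name is illustrative):

```swift
import ARKit

// Sketch: tracking up to three faces at once with the TrueDepth camera.
func runFaceTracking(on session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }
    let configuration = ARFaceTrackingConfiguration()
    // supportedNumberOfTrackedFaces reports the device's limit.
    configuration.maximumNumberOfTrackedFaces =
        min(3, ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces)
    session.run(configuration)
}
```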
Article Link: Apple Announces ARKit 4 with Location Anchors, Depth API, and Improved Face Tracking