Apple Announces ARKit 4 with Location Anchors, Depth API, and Improved Face Tracking

MacRumors

macrumors bot
Original poster
Apr 12, 2001


Apple today announced ARKit 4 alongside iOS 14 and iPadOS 14. The new version of ARKit introduces Location Anchors, a new Depth API, and improved face tracking.


Location Anchors allow developers to place AR experiences, such as life-size art installations or navigational directions, at a fixed destination. Location Anchoring leverages the higher-resolution data in Apple Maps to pin AR experiences to a specific point in the world, such as throughout cities or alongside famous landmarks. Users can move around virtual objects and observe them from different perspectives, exactly as real objects are seen through a camera lens.
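In code, this is exposed through ARKit 4's geotracking APIs. A minimal sketch of placing a geo anchor follows; the coordinate used is an illustrative placeholder, and the session is assumed to be running an ARGeoTrackingConfiguration:

```swift
import ARKit
import CoreLocation

// Sketch: anchoring AR content to a real-world coordinate with ARKit 4's
// ARGeoAnchor. The coordinate below is an illustrative placeholder.
func placeGeoAnchor(in session: ARSession) {
    // Geotracking is only supported on certain devices and in certain regions.
    guard ARGeoTrackingConfiguration.isSupported else { return }

    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return }

        let coordinate = CLLocationCoordinate2D(latitude: 37.7956,
                                                longitude: -122.3934)
        // Omitting the altitude lets ARKit resolve it from its own
        // ground-level estimate for that location.
        let anchor = ARGeoAnchor(coordinate: coordinate)
        session.add(anchor: anchor)
    }
}
```

Because availability depends on Apple Maps coverage, checkAvailability is queried at runtime rather than assuming support everywhere.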

ARKit 4 also takes advantage of the iPad Pro's LiDAR Scanner through a brand-new Depth API that gives developers access to detailed per-pixel depth information. When combined with 3D mesh data, this depth information makes virtual object occlusion more realistic by enabling instant placement of virtual objects within their physical surroundings. It also opens up new capabilities for developers, such as taking more precise measurements and applying effects to the environment.
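Developers opt into this depth data through frame semantics. A sketch, assuming the app runs on a LiDAR-equipped device and the depth is read from an ARSessionDelegate callback:

```swift
import ARKit

// Sketch: enabling ARKit 4's scene depth and reading per-pixel depth
// from a frame on LiDAR-equipped devices.
func configureSceneDepth(for session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // Scene depth is only available where the hardware supports it.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }
    session.run(configuration)
}

// Called from ARSessionDelegate's session(_:didUpdate:) for each frame.
func readDepth(from frame: ARFrame) {
    guard let sceneDepth = frame.sceneDepth else { return }
    let depthMap: CVPixelBuffer = sceneDepth.depthMap   // per-pixel depth in metres
    let confidenceMap = sceneDepth.confidenceMap        // per-pixel confidence, when available
    _ = (depthMap, confidenceMap)
}
```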

Finally, face tracking is expanded in ARKit 4 to support the front-facing camera on all devices with the A12 Bionic chip or newer. Up to three faces may now be tracked at once using the TrueDepth camera to power front-facing camera experiences like Memoji and Snapchat.
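A sketch of multi-face tracking with ARKit 4; the supported face count is queried at runtime rather than hard-coded, since it depends on the hardware:

```swift
import ARKit

// Sketch: tracking multiple faces at once with the TrueDepth camera.
func runMultiFaceTracking(on session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    // Up to 3 on supporting hardware, per the announcement above.
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    session.run(configuration)
}
```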

Article Link: Apple Announces ARKit 4 with Location Anchors, Depth API, and Improved Face Tracking
 

mazz0

macrumors 68000
Mar 23, 2011
Leeds, UK
AR will be great when the hardware can be implanted in our eyes.

(or brains I suppose)
 

MEJHarrison

macrumors 65816
Feb 2, 2009
At the risk of dragging this conversation too far into the political realm, this technology could be useful to those upset with the removal of whatever statues or monuments they hold dear. You want to see your statue of Mr. Unpopular Guy? Just use your phone, enjoy it all you want and it won't bother those around you.

Anyway, pretty neat idea from an absolutely technical point of view. But I suspect any conversation would go off the rails in about 3 posts.
 

Philip_S

macrumors newbie
Feb 6, 2020
…and if the unpopular guy isn't unpopular enough to remove, you could hold your phone up and see sky instead.

Another neat use for it, especially when an equivalent becomes available on enough Android devices too: proposed buildings could be posted in AR sketch-ups when planning permission is sought, both for advertising and to provide a realistic impression of what the building will look like rather than the helicopter view impressions typically used.
 

Nicky G

macrumors 6502a
Mar 24, 2002
Baltimore
It's funny that this "small" piece of WWDC news will go down, I reckon, as some of the most revolutionary stuff Apple announced this year, well beyond switching to ARM. It will take a few more years before it becomes obvious, but when the "AR kit" that "ARKit" was designed for from the get-go eventually drops, it is going to have some very well-fleshed-out tech baked into it, stuff Apple has been "testing" out in the open for years now. Both via ARKit, and lots of other little things, such as Ultra Wideband and an early embrace of Bluetooth beacon technology. Science fiction has been describing this stuff (in terms of "fully realized AR") since at least as far back as the early '90s, in Snow Crash. We're getting very close!
 