I would guess most apps. For example, I just downloaded an AR Beer Pong game, last updated 4 months ago. The first thing you are supposed to do is scan your room to find the surface to place the Beer Pong table. On an older device, I had to sweep the camera back and forth a bit looking at the floor before it could figure out that the floor was a flat surface that could fit the table. On my iPad 2020, the table popped up on the floor instantly.
Surface detection and human detection/occlusion are improved automatically with the LiDAR.
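For the curious, the "automatic" part is roughly this: apps that already ask ARKit for plane detection and people occlusion get the LiDAR speedup without code changes. A rough sketch of such a session setup, using ARKit names (the wrapper function is just for illustration):

```swift
import ARKit

// Hypothetical configuration for an AR session that benefits from LiDAR.
// On LiDAR-equipped devices ARKit returns detected planes almost instantly;
// on older devices the same code works, just slower.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()

    // Detect horizontal surfaces (floors, tables) to place content on.
    config.planeDetection = [.horizontal]

    // Opt into the LiDAR-backed scene mesh where the hardware supports it.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // People occlusion: virtual content is hidden behind real people.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    return config
}
```

The capability checks are the point: the app ships one code path, and the LiDAR devices simply say yes to more of it.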
TLDR: so AR beer pong and measuring people’s height are some of its uses lol. I always wanted to know how tall my nephew is. (I know the future potential is there, I just find it funny that the most immediate uses of the technology are silly things.)
I haven’t tried this on my new 2020 iPad, but now I’m gonna try...
I play Pokémon Go. The AR+ feature can be a pain. You have to look around until yellow footprints show up on the ground, then tap a footprint to place the Pokémon into the environment. On my iPhone X, getting the footprints to show up can be annoying.
I suspect the LiDAR will make this a lot less annoying, as the device will pick up surfaces much faster and without my having to get into uncomfortable positions, especially in tighter spaces. For this game, I’ll be happy when LiDAR comes to the iPhone.
You know all those AR apps you use and get frustrated with because they aren’t as precise as you’d like? LiDAR will make those run better.
Seriously though, Apple is working so hard to make AR mainstream, but every time they show it off it’s just that same stale 2016 demo of placing furniture into empty rooms. I kinda wish Apple had more examples than that, because right now it seems like Apple is following their usual trend of creating amazing hardware and then relying on third parties to figure out what to do with it.