
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,549
30,869


Apple today informed developers that ARKit 3.5 is now available, with the update adding support for the LiDAR Scanner and depth-sensing system included in the new 11-inch and 12.9-inch iPad Pro models.


According to Apple, the new LiDAR Scanner will allow for a "new generation of AR apps" that use Scene Geometry for enhanced scene understanding and object occlusion.

Existing AR experiences on iPad Pro can also be improved with instant AR placement and improved Motion Capture and People Occlusion.

The LiDAR Scanner uses reflected light to measure the distance from the sensor to surrounding objects up to five meters away, indoors and outdoors. Depth frameworks in iPadOS combine depth points measured by the LiDAR Scanner, data from the two cameras, and data from motion sensors with computer vision algorithms handled by the A12Z Bionic to create a detailed and complete understanding of a scene.
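The "reflected light" measurement above is the classic time-of-flight principle. As a minimal sketch of that idea (an illustration of the physics, not Apple's implementation), the sensor times a light pulse's round trip and converts it to distance:

```python
# Illustrative time-of-flight distance calculation (not Apple's
# implementation): a LiDAR emits a light pulse and times how long
# the reflection takes to return.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface: the pulse travels out
    and back, so the one-way distance is half the round trip."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A surface 5 meters away (the scanner's stated maximum range)
# returns the pulse in roughly 33 nanoseconds.
round_trip = 2 * 5.0 / SPEED_OF_LIGHT
print(distance_from_round_trip(round_trip))  # ≈ 5.0
```

The tiny round-trip times involved (tens of nanoseconds at room scale) are why this needs dedicated hardware rather than camera-plus-software alone.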

At the current time, the new LiDAR Scanner with its enhanced augmented reality capabilities is limited to the new iPad Pro models, but Apple is also expected to include the feature in the 2020 iPhone models set to be released this fall.

Article Link: Apple Releases ARKit 3.5 for Developers With Support for iPad Pro's LiDAR Scanner
 

[AUT] Thomas

macrumors 6502a
Mar 13, 2016
774
972
Graz [Austria]
At the current time, the new LiDAR Scanner with its enhanced augmented reality capabilities is limited to the new iPad Pro models
But I guess the LiDAR should still help in normal AR applications as well, e.g. when it comes to detecting a surface. Currently this is done using the cameras and AI only. So the LiDAR should be able to transparently assist that AI in current AR applications as well, right?
The question there would be: will it remove lag in AR applications? Last time I tried an AR app, the stabilization of AR objects was imho too slow. For objects in space it doesn't really matter, but for objects supposed to be attached to or sitting on a surface, it didn't really meet my expectations...
 

chucker23n1

macrumors G3
Dec 7, 2014
8,564
11,307
But I guess the LiDAR should still help in normal AR applications as well, e.g. when it comes to detecting a surface. Currently this is done using the cameras and AI only. So the LiDAR should be able to transparently assist that AI in current AR applications as well, right?
The question there would be: will it remove lag in AR applications? Last time I tried an AR app, the stabilization of AR objects was imho too slow. For objects in space it doesn't really matter, but for objects supposed to be attached to or sitting on a surface, it didn't really meet my expectations...

Yes. The calibration step goes away.
Instant AR
The LiDAR Scanner on iPad Pro enables incredibly quick plane detection, allowing for the instant placement of AR objects in the real world without scanning. Instant AR placement is automatically enabled on iPad Pro for all apps built with ARKit, without any code changes.
 

vmistery

macrumors 6502a
Apr 6, 2010
942
688
UK
Presumably, if it does come to the new iPhone, it will only be on the non-Pro models? We'll have to wait and see, I guess. It will be interesting to find out whether the A13 can make any additional use of it over the A12Z.
 

Pman17

macrumors 6502
Mar 12, 2011
335
256
Galveston, TX
Presumably, if it does come to the new iPhone, it will only be on the non-Pro models? We'll have to wait and see, I guess. It will be interesting to find out whether the A13 can make any additional use of it over the A12Z.

I'm thinking we'll see LiDAR on the iPhone 12 Pro models, then it'll be on all of them next year. The A12Z is so odd that I think Apple decided not to develop an A13X to have more time to work on an A14X. They must have decided to release this iPad Pro anyway, since it's due for an update, and to get the LiDAR out in the wild for developers so some apps will be LiDAR-ready when the iPhone 12 comes out and they can market the hell out of it. It's also a way to get the content rolling for the AR glasses, so once those are announced, the apps are already here. We could see another iPad Pro in the fall with a huge bump in features.
 
Another use for LiDAR could be to help jam the fixed and mobile police speed cameras that are used for revenue generation in Australia. It could also track police speeding and report those officers' abuse of power to the CJC. This is a business opportunity for an app developer. An interesting fact, unknown except among police, is that the Nokia 8110 (the Matrix mobile phone) would trigger police radar detectors, making them think you had a radar detector in your car.
 

iPadified

macrumors 68000
Apr 25, 2017
1,860
2,051
Can the LiDAR be used as a 3D scanner? That would be very useful. Scan your foot and get the correct pair of shoes. Scan your head and try glasses in 3D.

Will photographs be improved by getting distance measurements to objects?

It's far from a gimmick, but it's for professionals other than developers. 3D representations of objects on 2D screens are definitely useful right now. Think engineers, medical doctors, real estate agents, architects, teachers, and probably many more that I cannot think of at the moment, but please fill in.
 

chucker23n1

macrumors G3
Dec 7, 2014
8,564
11,307
Can the LiDAR be used as a 3D scanner? That would be very useful. Scan your foot and get the correct pair of shoes. Scan your head and try glasses in 3D.

Will photographs be improved by getting distance measurements to objects?

It's far from a gimmick, but it's for professionals other than developers. 3D representations of objects on 2D screens are definitely useful right now. Think engineers, medical doctors, real estate agents, architects, teachers, and probably many more that I cannot think of at the moment, but please fill in.

My understanding is this LIDAR has good z (depth), but not very good x and y. So the point cloud you get is rather limited.
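The "good z, limited x/y" point can be illustrated with a back-of-the-envelope calculation: a sparse grid of LiDAR points spread across the field of view gets farther apart laterally the farther away the surface is. The field of view and grid size below are made-up illustrative numbers, not Apple's specifications:

```python
import math

# Why a sparse LiDAR gives good depth but coarse x/y resolution:
# the sample points fan out across the field of view, so the gap
# between neighboring points grows with distance. The 60-degree FOV
# and 32-point grid are illustrative assumptions, not Apple's specs.

def lateral_spacing_m(fov_deg: float, points_across: int,
                      distance_m: float) -> float:
    """Approximate gap between neighboring sample points at a distance."""
    fov_rad = math.radians(fov_deg)
    scene_width_m = 2 * distance_m * math.tan(fov_rad / 2)
    return scene_width_m / points_across

# Under these assumptions, neighboring samples on a wall 3 meters
# away are about 11 cm apart: fine for finding planes, far too
# coarse for a detailed 3D scan of a foot or a face.
print(round(lateral_spacing_m(60.0, 32, 3.0), 3))  # → 0.108
```

This is why a handful of accurate depth samples is enough for plane detection and occlusion, yet still yields "a rather limited point cloud" for scanning purposes.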
 

chucker23n1

macrumors G3
Dec 7, 2014
8,564
11,307
Do you have any source? I mean, if it helps the Measure app, x and y should be reliable as well.

I don't have a source.

But as far as the Measure app is concerned: that's helped by the z. The camera lenses are primes, so previously you were prompted to move your device through three-dimensional space, which creates a parallax, which helps extrapolate the z. With the LIDAR, it gets the actual z directly. Much faster and more accurate. The x and y are then good enough for measurements if you move the device around slightly.
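The parallax approach described above can be sketched with the textbook triangulation relation (a generic stereo formula, not ARKit's actual pipeline): moving the camera sideways by a known baseline shifts a feature across the image, and the smaller the shift, the farther the point.

```python
# Generic depth-from-parallax estimate (the standard stereo
# triangulation relation, not ARKit's actual algorithm): depth
# z = focal_length * baseline / disparity.

def depth_from_parallax(focal_length_px: float,
                        baseline_m: float,
                        disparity_px: float) -> float:
    """Depth of a point from how far it shifted between two views.
    Raises if the feature did not move at all."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# E.g. a 1500 px focal length, a 10 cm sideways movement, and a
# 50 px feature shift put the point at 3 meters.
print(depth_from_parallax(1500.0, 0.10, 50.0))  # → 3.0
```

The formula also shows why the old calibration wave was needed: with zero baseline (no device movement) the disparity is zero and depth is undefined, whereas a LiDAR sample gives z immediately.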

LIDARs that produce decent point clouds are much bigger (think several AirPods stacked on top of each other). LIDARs that produce excellent point clouds are massive (about the form factor of a wastebasket), heavy, and expensive (five figures). [I'm involved in a research project on using LIDAR in a maritime context.]

But the one they put in here already helps a ton. Just, no, I don't think it's very suitable as a 3D scanner. Maybe if you combine that with machine learning, you can extrapolate something OK.
 

MandiMac

macrumors 65816
Feb 25, 2012
1,431
882
Isn't LIDAR in theory the same as the Face ID combination of sensors on the front?
 

otternonsense

Suspended
Jul 25, 2016
2,213
6,303
Berlin
Nineteen years ago that would have been: I'll take Who Needs 1,000 Songs in Your Pocket for $100, Alex.

Wrong equivalence is wrong. Almost all of us had Walkmans, Discmans, ill-designed early MP3 players etc. in hand, and could indeed use something better. The opportunity was there for the taking. LiDAR on the iPad also counts as first-time innovation, sure, but it's still fantabulously niche no matter how hard you try to put it in context. To go back to your example, that would be like Apple releasing the iPod... within the iMac. It doesn't scale. Not like that.

A great deal of people already walk around with depth-sensing equipment: it's called Portrait mode, and that's more than they care about. Other than Pokémon Go, AR is yet to be proven as a thing. It's been the cringiest part of Apple's keynotes, watching those wonky AR Lego demos with people crowding around an empty table, or the forced "what's a computer" tone-deaf ads.

AR has potential, but a better iPad camera (a rear camera that gets used maybe once a year, or invoked by accident) is kind of an odd route to fully democratising this tech.

And yea, for a 2-year update of a flagship device from the richest company in the world, that was kinda weak.
 

citysnaps

macrumors G4
Oct 10, 2011
11,884
25,802
Wrong equivalence is wrong. Almost all of us had Walkmans, Discmans, ill-designed early MP3 players etc. in hand, and could indeed use something better. The opportunity was there for the taking. LiDAR on the iPad also counts as first-time innovation, sure, but it's still fantabulously niche no matter how hard you try to put it in context. To go back to your example, that would be like Apple releasing the iPod... within the iMac. It doesn't scale. Not like that.

A great deal of people already walk around with depth-sensing equipment: it's called Portrait mode, and that's more than they care about. Other than Pokémon Go, AR is yet to be proven as a thing. It's been the cringiest part of Apple's keynotes, watching those wonky AR Lego demos with people crowding around an empty table, or the forced "what's a computer" tone-deaf ads.

AR has potential, but a better iPad camera (a rear camera that gets used maybe once a year, or invoked by accident) is kind of an odd route to fully democratising this tech.

And yea, for a 2-year update of a flagship device from the richest company in the world, that was kinda weak.

You missed my point. Back in 2001, when Jobs announced the iPod with his "1,000 songs in your pocket" catch-phrase, the cynical and imagination-less were crying, "Who asked for 1,000 songs in your pocket?" Much like people here do today, your post above (along with others) included, whenever some new and game-changing tech is brought out.

The new iPad with 3D depth-sensing camera is a great example of that.
True, but I don’t think ARKit is the runaway success just yet. There’s some other shoe yet to drop.

Of course not. ARKit needs a rear-facing 3D depth-sensing camera in order for that to happen.
 