I’m sure it will include LiDAR, but that can’t be the only technology. Otherwise, how would it know to overlay images onto the real world in exactly the right spots? LiDAR works in tandem with the camera.
You don't need a camera; the screen is transparent, so you're still looking at the real world. The LiDAR just finds the depth of your surroundings so it can understand the area around you better.
 
It doesn't have to be used in tandem with an optical camera. There's plenty of LiDAR scanner tech out there that has no optical camera, yet scans an entire landscape and creates a detailed enough 3D model.
Sure, it can get a landscape to project things onto semi-accurately, but what about overlaying things like business names or GPS directions with arrows on the street? LiDAR can only reach so far accurately, 5 meters to be exact.
 
The tech is just too early to really expect anything before 2023, at the earliest. VR could be even further out than AR.
 
I really hope this is a product that can supplement, or replace, my prescription eyewear. I'd love to throw the glasses on, slip on the watch, toss my phone in my pocket, and have them all work in unison! It would also be beneficial if they could interact with other Apple products or services (e.g., the glasses bring up certain menus or options when facing my MacBook Pro or Apple TV). No matter how "beta" or advanced the initial product is, I will jump on board.
I remember a rumour a while back that said Apple is looking to offer prescriptions with Apple Glass. If true, I cannot imagine Apple not doing this, and then yes, you'll be able to replace your prescription glasses with these.
 
It doesn’t matter that LiDAR only reaches 5m. The glasses will still be able to place virtual objects in real space beyond that. The glasses only need to know their approximate location (probably GPS from the phone, which is accurate to what, 1-2m?). Exact orientation is no problem with a compass and accelerometer. Once the glasses have an approximate location, they can then accurately track their position relative to it using LiDAR even as the glasses move, in a similar way to how an optical mouse works. With accurate relative positioning, virtual objects can be anchored in real space at any distance.
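In fact, ARKit on iPhone already does something like this with geo anchors. A minimal sketch, assuming the glasses would expose the same framework (the storefront coordinate is made up for illustration):

```swift
import ARKit
import CoreLocation

// Anchor a virtual object to a real-world coordinate far beyond LiDAR range.
// GPS gets the device within a metre or two; relative tracking then keeps
// the anchor stable in real space as the device moves.
func placeDistantAnchor(in session: ARSession) {
    // Geotracking is only available in regions Apple has mapped.
    let config = ARGeoTrackingConfiguration()
    session.run(config)

    // Hypothetical storefront a couple of hundred metres away.
    let storefront = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
    session.add(anchor: ARGeoAnchor(coordinate: storefront))
}
```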

VR has opened my eyes to what could be possible with AR and I think it’s going to go well beyond what any of us expect, even when it’s first launched. You just have to look at what’s possible with VR passthrough and hand tracking. I could easily see us placing virtual screens that are anchored in real space in our houses, set up using just our hands. Existing hand tracking allows surprisingly precise control.

I do think AR glasses will be a bit of a “closet” product for the first year or two in the same way that we know millions of VR headsets have been sold but we don’t see anyone with them.

I think anyone who uses more than one monitor can see the appeal of using glasses as a second monitor instead; monitors of any size, or even any number. We might not have the same contrast or fidelity, but I think it will be surprising how well they could work. This is just one application of them, but an important one, because if people can justify them as a productivity tool then they can also justify buying them.

The extended screen doesn’t need to be limited to your laptop, you could just as well extend your watch, tablet, or phone screen (I don’t think folding phones will be around for long). So we’ll probably have some screens that are anchored to static objects in real space such as your walls, tables, fridges, and then some screens that are anchored to and move with other objects.
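With today’s ARKit, that kind of anchoring already looks roughly like this; a sketch using plane detection as a stand-in for whatever API the glasses might expose:

```swift
import ARKit

// Detect walls and anchor a virtual "screen" to one, so it stays put
// in real space. A renderer (RealityKit, SceneKit, ...) would attach
// a textured quad to the anchor once it appears.
final class ScreenPlacer: NSObject, ARSessionDelegate {
    func start(_ session: ARSession) {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.vertical] // walls, fridge doors, etc.
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors where plane.alignment == .vertical {
            print("Wall detected at \(plane.center); place the screen here")
        }
    }
}
```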

I think the analogy of a HUD is misleading and limits our imagination around how these can be applied. Sure, the image will look like it’s beyond the glass, but images in a HUD aren’t anchored in real space.

This extended screen stuff is just one application that would be enough for me to buy them. But then there’s games, GPS navigation, or data that’s relevant to all kinds of activities: sports, flying, cycling, skiing, professional uses. In a darkened room you could easily watch a movie on them.

It’s not a given that I’ll buy the first gen and even if I did, I wouldn’t be surprised if I didn’t want to wear them in public and that’s ok.
 
The range of LiDAR depends on the sensor used. We don’t know what Apple are using in this first-gen product.

Since the glasses will be lightweight, I’d assume the LiDAR hardware would have a fairly limited range.

But there are LiDAR scanners I’ve seen that can map out a 300 meter bubble accurately.

I don’t expect the AR glasses to do that kinda stuff just yet. I expect it to be more or less a HUD with some close (3-5 meter) mapping.
 
We don’t even need LiDAR to overlay stuff like directions and business names accurately enough, to within a metre or two. This was already done with some accuracy by various apps that you could use, for example, to show landmarks or sky maps. That was like ten years ago!

I can’t see the glasses just being a HUD; again, a HUD has been possible for years. Why do you think Apple have been investing so heavily into AR with their frameworks? You wouldn’t need LiDAR at all for a HUD. What you do need LiDAR for is very accurate, close-range positioning indoors, so that you can anchor virtual objects in real space convincingly, and for the purposes of occlusion. And what about competitor products that are already doing AR, just in a less desirable package?
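That old landmark-app technique is just GPS plus compass maths. A rough sketch (the coordinates and heading are made up for illustration):

```swift
import Foundation
import CoreLocation

// Initial bearing from the user to a point of interest, in degrees 0-360.
func bearing(from user: CLLocationCoordinate2D, to poi: CLLocationCoordinate2D) -> Double {
    let lat1 = user.latitude * .pi / 180
    let lat2 = poi.latitude * .pi / 180
    let dLon = (poi.longitude - user.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    return (atan2(y, x) * 180 / .pi + 360).truncatingRemainder(dividingBy: 360)
}

// Compare against the wearer's heading (from CLLocationManager's CLHeading
// in a real app): anything within the field of view gets a label drawn at
// the matching horizontal offset. No depth sensing involved.
let user = CLLocationCoordinate2D(latitude: 51.5007, longitude: -0.1246)
let coffeeShop = CLLocationCoordinate2D(latitude: 51.5033, longitude: -0.1196)
let headingDegrees = 40.0
let offset = bearing(from: user, to: coffeeShop) - headingDegrees
print("Draw the label \(offset)° from the centre of view")
```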
 

I think you have completely missed the point.

It’s been possible to accurately map things without LiDAR by using a camera.

To address privacy concerns glasses might not have a camera.

So the glasses could use LiDAR instead. This would still allow them to use the AR frameworks that Apple has created.
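The LiDAR-specific parts of those frameworks already look like this on iPhone and iPad; a sketch, with the caveat that today’s ARKit still fuses in camera data, so camera-free glasses are speculation:

```swift
import ARKit

// Use LiDAR-derived depth to build a mesh of the room and enable occlusion.
func startDepthMapping(_ session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh        // room geometry as a mesh
    }
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth) // per-pixel depth
    }
    session.run(config)
}
```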
 
No, you completely missed the point. I wasn’t talking about privacy concerns, and I know the glasses are likely to have no camera in favour of LiDAR, which is more accurate at depth sensing. Oh, and by the way, there may be privacy concerns with LiDAR too. Face ID works in a similar way; at the moment Face ID is a lot finer and more accurate, but it’s the same principle, so in theory you could identify who someone is with LiDAR.

The point I was making is that you don’t need LiDAR or even a camera to place virtual objects in real 3D space fairly accurately (within 1-2m), for the same reasons current GPS navigation works. Sure, you can’t just turn on a system, get a GPS position, query a compass and accelerometer, and expect it to be accurate immediately, since compasses may not be that accurate. But once the system has moved a couple of metres, it can calibrate the compass against the new GPS position.
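That calibration step is simple enough to sketch, assuming CoreLocation-style inputs (the movement threshold and the helper itself are illustrative):

```swift
import CoreLocation

// Once the wearer has moved a few metres, the GPS-derived course gives a
// ground-truth heading to correct the magnetometer against.
func compassCorrection(previous: CLLocation, current: CLLocation,
                       compassHeading: CLLocationDirection) -> CLLocationDirection? {
    // Need real movement for the course to mean anything; course is -1 when invalid.
    guard current.distance(from: previous) > 3, current.course >= 0 else { return nil }
    // Positive result means the compass is reading ahead of the true course.
    var delta = compassHeading - current.course
    if delta > 180 { delta -= 360 }
    if delta < -180 { delta += 360 }
    return delta
}
```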

Oh, and “mapping”? Again, you missed the point: we’re talking about positioning, not mapping. Mapping only applies to a limited area where data points are gathered that are *relative* to one another; there isn’t necessarily GPS data to tie those points to real space on Earth. But anyway, that’s for reconstruction, not for projecting objects into the real world.

The point is: You don’t need to know what something looks like to know where it is.
 
One really cool use case for these glasses is NFTs. Artists can create NFTs to place inside this AR world, and animated NFTs are coming. VeVe is the leader in licensed NFTs, and I can see them utilizing the Apple glasses; they already use NFTs in AR mode in their iOS app. I can see a whole world of NFTs placed strategically, with landowners charging to place NFTs on their physical property. Spam controls will need to be put in place; maybe Apple can approve them. This is exciting tech to me.
 
I would love for them to be similar to what Google had shown as "Google Glass", not some massive bulky frames. I think there will be something like that in the future.
 