
Would you use AR glasses with the features in the OP?

  • Yes

  • No


I can do one better: I can show you my AR idea, because I'm already developing it with ARKit :) I intend a version of it to be ready when the device, whatever it ends up being, launches. It lets you customize how you see your home (and hopefully how others see it too, if they're wearing their own devices).
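For the curious, here's roughly what the plumbing looks like - a minimal sketch (not my actual app) of how ARKit's plane detection can drape a custom texture over detected walls and floors. "wood.jpg" is a placeholder asset name:

```swift
import UIKit
import ARKit
import SceneKit

// Minimal sketch: detect walls/floors and cover each one with a texture.
final class RetextureViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]  // floors and walls
        sceneView.session.run(config)
    }

    // ARKit calls this whenever it discovers a new plane.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }

        // Lay a textured overlay across the detected plane.
        let overlay = SCNPlane(width: CGFloat(plane.extent.x),
                               height: CGFloat(plane.extent.z))
        overlay.firstMaterial?.diffuse.contents = UIImage(named: "wood.jpg")  // placeholder asset

        let overlayNode = SCNNode(geometry: overlay)
        overlayNode.simdPosition = plane.center
        overlayNode.eulerAngles.x = -.pi / 2  // SCNPlane stands upright by default
        node.addChildNode(overlayNode)
    }
}
```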

This has me pretty excited for AR. Looks awesome!
 
I can do one better: I can show you my AR idea, because I'm already developing it with ARKit :) I intend a version of it to be ready when the device, whatever it ends up being, launches. It lets you customize how you see your home (and hopefully how others see it too, if they're wearing their own devices).

Great showcase, thank you for sharing! If you're using that app hands-free, it will be interesting to see how you cycle through the different wall and floor textures. Maybe with voice, like "next floor" or "next wall"?
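Something like that could be hacked together with Apple's Speech framework today. A rough sketch - cycleWallTexture() / cycleFloorTexture() are made-up helpers from the app above, and you'd still need to request speech-recognition permission first:

```swift
import Speech
import AVFoundation

// Rough sketch: listen to the mic and fire a handler on "next wall" / "next floor".
// Call SFSpeechRecognizer.requestAuthorization(_:) before starting.
final class VoiceCommands {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private let audioEngine = AVAudioEngine()

    func start(onCommand: @escaping (String) -> Void) throws {
        let input = audioEngine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { [weak self] buffer, _ in
            self?.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString.lowercased() else { return }
            if text.hasSuffix("next wall")  { onCommand("wall")  }  // e.g. cycleWallTexture()
            if text.hasSuffix("next floor") { onCommand("floor") }  // e.g. cycleFloorTexture()
        }
    }
}
```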
 
I've had this idea since before AR even existed, but I didn't know any better as I was a kid. I still think the idea is cool.

I'd love to see a way to "write" in AR on any surface with either your fingertip or a special pen; the handwritten notes would show up in your AR view, and you'd get notified whenever you're near one.

Example: with your AR glasses on, write a note on the front door that says "get milk." The next time you walk by the front door (glasses on or off), your glasses and/or your phone would get a notification with the text (or image) of your written message.

I lack the know-how to make this happen so feel free to steal my idea, anyone.
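For what it's worth, the core of the idea doesn't look far off with today's ARKit. A rough, hypothetical sketch: pin a note where the user taps, then fire a local notification once the camera comes within a meter of it. Persisting notes across sessions (ARWorldMap) and the glasses-off case are left out:

```swift
import ARKit
import UserNotifications
import simd

// Sketch of the "note on the front door" idea. Assign an instance as the
// ARSession's delegate; notification permission must be granted first.
final class NoteTracker: NSObject, ARSessionDelegate {
    struct Note {
        let anchor: ARAnchor
        let text: String
        var notified = false
    }
    private var notes: [Note] = []

    func pin(_ text: String, at transform: simd_float4x4, in session: ARSession) {
        let anchor = ARAnchor(name: "note", transform: transform)
        session.add(anchor: anchor)
        notes.append(Note(anchor: anchor, text: text))
    }

    // ARSessionDelegate: runs every frame with the current camera pose.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let cameraPos = frame.camera.transform.columns.3
        for i in notes.indices where !notes[i].notified {
            let notePos = notes[i].anchor.transform.columns.3
            guard simd_distance(cameraPos, notePos) < 1.0 else { continue }  // within 1 m
            notes[i].notified = true

            let content = UNMutableNotificationContent()
            content.body = notes[i].text  // e.g. "get milk"
            UNUserNotificationCenter.current().add(
                UNNotificationRequest(identifier: UUID().uuidString,
                                      content: content, trigger: nil))
        }
    }
}
```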
 
Cool idea.
 
...and notifies you when you're near a note.
I guess this would be a great cornerstone of AR. Just think of museums, for example: you're next to an aquarium, and you automatically see all the info about the fish in it. Such beacons could really sell AR for traveling and/or learning, IMHO.
 
Looks like these glasses won't be coming out in 2022 like some rumors suggested. Maybe we get a sneak peek at this year's fall event, with a launch in 2023?
 
Shamelessly reposted from user @DogDogDogFood and their thread "Are NeRFs going to be Apple's secret AR/MR weapon?" because it's so good and fitting:

There is an opportunity for RealityOS to be a Machine Learning platform. With a headset you have visual, audio, and spatial inputs - those can all be enhanced with ML to provide a novel AR/MR experience.

It doesn't have to be all consumption, either. Creating and sharing NeRFs (or whatever Apple ends up calling them) is the simplest case. Apple also has their no/low-code systems like SwiftUI and Trinity AI. It should be possible to create AI tools right within RealityOS - either by training object recognition through Object Capture + NeRFs, or with audio or spatial data. Someone working in this system could create an ML-driven solution and then share it with colleagues and friends to help them with various tasks.

Imagine being a researcher counting migratory birds in the field. You could take some recordings of your subject and train a model to recognize them. Then carry the Apple headset with you, and the system would alert you on your Apple Watch when it hears the audio you are interested in. You then put on the headset, and it uses the mic array to display on-screen indicators showing the direction of the sound. Using optical zoom, you could get a better look; maybe even drop in a pre-trained image enhancement model. You could then log the results using the data service on your iPhone and put the headset away.
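(Aside: the listening half of that bird scenario maps pretty directly onto Apple's SoundAnalysis framework today. A minimal sketch, assuming a hypothetical "BirdCallClassifier" Core ML model - e.g. one trained with Create ML's sound-classification template:)

```swift
import SoundAnalysis
import AVFoundation
import CoreML

// Sketch: run a trained sound classifier over live mic audio and
// surface confident detections. The model itself is hypothetical.
final class BirdListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer!

    func start(model: MLModel) throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        analyzer = SNAudioStreamAnalyzer(format: format)
        try analyzer.add(SNClassifySoundRequest(mlModel: model), withObserver: self)

        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) {
            [weak self] buffer, time in
            self?.analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }

    // Called whenever the classifier produces a result.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        print("Heard \(top.identifier) (\(Int(top.confidence * 100))%)")  // would ping the Watch instead
    }
}
```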

There are tons of uses around the home for this, too.
- My kid makes a sculpture for a school project, and I use the headset to create a virtual model I can share with the grandparents. This can even use Apple's subject-isolation technology to remove the background.
- I want to watch a movie on Apple TV+ with some friends - I can use SharePlay to create a shared viewing experience, while offloading the streaming and decoding to my Apple TV to save on power.
- We get done painting a room, and want to share what it looks like before and after.
- I want to do something on my Mac, but it's in a different part of the house and I'm outside on the porch. I can bring up a virtual screen on the headset and do desktop-style work anywhere.

If Apple does this right, the eternal question of "What is AR really useful for?" will quickly become "What is AR not useful for?". Apple is the only company in the world that has competence in every field needed to bring a platform like this together. They can do great hardware, and have custom silicon to power it. They have many other devices and services that can be tied into the headset (Mac, iPhone, AirPods, Apple TV, Apple Music, App Store, etc.). They have all the pieces needed for a breakthrough Machine Learning environment. They have mature developer tools and APIs, and a developer community eager to adopt new form factors. They have a world-class ecosystem not just for apps, but for sharing and communication. Microsoft, Meta, Google, and NVIDIA aren't on this level.
 
Thanks... Maybe that conversation is better suited to this thread, but I didn't want to cross post. :)

The other thing I mentioned was streaming live or recorded events in VR to the headset. Apple likely purchased NextVR for exactly that reason, and certainly others have been talking about it. What's new is fast NeRF training technology, which was developed just last year (NVIDIA calls these "Instant NeRFs"). The heavy lifting would have to be done in an Apple data center, but real-time viewing would require a powerful Neural Engine on-device. Definitely a technical challenge, but the upshot sounds amazing: not only would Apple be able to offer VR views from fixed camera locations, but you could conceivably experience an event from any position. This could be a third-person view following your favorite player around the football field. You could watch a concert right up on stage with the performers, or from down in the mosh pit - anywhere you want. Spatial Audio would make it even more immersive.

It would be pretty extraordinary if Apple could make it *more* compelling to watch an event in VR than in person.
 
Just thought about this: while cooking, one could see timers and the step-by-step instructions of the recipe right in view. Exciting; even the simplest of things could help out so much when it comes to doing daily tasks or learning new things...
 
I really think AR features will be very helpful alongside other Apple products. The problem is that I always need to look at my iPhone to check the map, music, notifications, and more; if I ever have Apple Glass, I no longer will. I keep checking the map over and over again, and it's very inconvenient.

The real problems for Apple glasses are...
1. Size
2. Battery
3. Camera

I really am waiting for it.
 
I'll start:

- non-obtrusive notifications (think Mac banners flying in on the side)
- always-on Shazam (like Now Playing notification banners; see the ShazamKit sketch after this list)
- subtitles for people speaking (bonus: highlight them with colors and color the subtitles accordingly)
- navigation, like an Apple Maps extension
- ViewTime: like FaceTime, but the person you call sees what you're seeing (e.g. for support purposes)
- and of course games like Pokémon Go

What are your ideas? :)
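On the always-on Shazam bullet: ShazamKit gets you most of the way there already. A minimal sketch that streams the mic into an SHSession (format handling and error cases omitted):

```swift
import ShazamKit
import AVFoundation

// Sketch of "always-on Shazam": stream mic audio into an SHSession
// and surface matches as they arrive.
final class AlwaysOnShazam: NSObject, SHSessionDelegate {
    private let session = SHSession()
    private let engine = AVAudioEngine()

    func start() throws {
        session.delegate = self
        let input = engine.inputNode
        input.installTap(onBus: 0, bufferSize: 2048,
                         format: input.outputFormat(forBus: 0)) { [weak self] buffer, time in
            self?.session.matchStreamingBuffer(buffer, at: time)
        }
        try engine.start()
    }

    // SHSessionDelegate: a match was found in Shazam's catalog.
    func session(_ session: SHSession, didFind match: SHMatch) {
        guard let item = match.mediaItems.first else { return }
        print("Now Playing: \(item.title ?? "?") by \(item.artist ?? "?")")  // banner on glasses
    }
}
```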

How do you point and click with AR glasses? Using your eyes? Does it depend on where your eyes are looking?
 
The real problems for Apple glasses are...
1. Size
2. Battery
3. Camera
1. I think if there is a company that gets miniaturization done right, it's Apple. Just look at iPhone mini, Apple Watch or AirPods.
2. Battery could be solved with very, very efficient processors. Again, if there is a company that can get this done right, it's Apple. Plus there are some patents flying around with truly OTA charging systems for batteries. Exciting times.
3. Are you sure a camera is needed? A good LIDAR sensor coupled with a Neural Engine could recognize most things anyway, without all the privacy concerns an outward-facing camera would introduce.
 
How do you point and click with AR glasses? Using your eyes? Does it depend on where your eyes are looking?
That's a problem that needs solving, for sure. But maybe there is no point and click: for navigation, you can use Siri to start a route from X to Y - no further need for pointing and/or clicking. For timers, notifications, instructions, subtitles and the like, there's no need for interaction either. I guess if you need to point/click/scroll in an AR glasses use case, you blew it.

But for simple interactions, there could be touch-sensitive areas on the side pieces (the temples) of the glasses, for example.
 
1. I think if there is a company that gets miniaturization done right, it's Apple. Just look at iPhone mini, Apple Watch or AirPods.
2. Battery could be solved with very, very efficient processors. Again, if there is a company that can get this done right, it's Apple. Plus there are some patents flying around with truly OTA charging systems for batteries. Exciting times.
3. Are you sure a camera is needed? A good LIDAR sensor coupled with a Neural Engine could recognize most things anyway, without all the privacy concerns an outward-facing camera would introduce.
Camera sensors ARE essential. A LIDAR sensor consumes a lot of power, and the one Apple is using isn't really a LIDAR sensor. And how do you even capture long-distance objects with LIDAR? You just can't. I guess you don't know anything about the sensor.
 
If you're thinking Google Glass, of course they are essential. But why should Apple go down the same privacy-problem-riddled way as Google did? And you're right, I don't know anything about the sensor, so you're very welcome to enlighten us about the limitations. Please do share more information about the sensor not being really LIDAR and why long distance objects (what is "long distance", really? 5m? 10m? 1000m?) can't be captured with a high-resolution LIDAR sensor.
 
Again, a true LIDAR sensor consumes a lot of power, and the iPhone and iPad aren't really using a real one. They are using a dToF sensor, not a LIDAR sensor. This is why cars use camera sensors instead of LIDAR sensors; Tesla cars are a great example.

So how do you even use a high-resolution LIDAR sensor if the power consumption is so high that cars can't even afford to use it? Even if Apple uses a dToF sensor, it's just another IR sensor, which isn't really suitable for AR; that contradicts your claim.

You still need a camera sensor in order to visualize AR. Neither LIDAR nor dToF can do that. Both the iPhone and iPad already use cameras for AR, so I don't see your point. OpenCV is another great example: it's widely used for AR and requires video and photo processing power. LIDAR has nothing to do with it.
 
Even if Apple uses a dToF sensor, it's just another IR sensor, which isn't really suitable for AR; that contradicts your claim.
Let's wait and see what Apple has up its sleeves. It's gonna be interesting, to say the least!
 
With smart glasses and AirPods, it would be IRL audio descriptions. That could be a really incredible breakthrough usage.

Got these sent to me from a friend in China
Prototype!
 
One of the coolest (and impossibly futuristic) technologies depicted in Star Trek is the Holodeck, where you could describe an environment to the computer and it would create it for you.

Seeing the recent development of AI systems like Midjourney and DALL-E suggests that Star Trek-like AI content creation might actually be possible with today's technology. A month ago Apple announced Gaudi, which can create and navigate 3D scenes based on text prompts. Pair that up with Siri, a powerful Neural Engine, and an MR headset... we are going to have an honest-to-god Holodeck.
 
How do you point and click with AR glasses? Using your eyes? Does it depend where your eyes stare at?

I've used HoloLens and it can do eye tracking and hand tracking.

There are a few hand gestures. I'm sure it could detect a blink, but the calibration when you first start using it asks you to look at things and confirms you've focused on them.

Otherwise, using the hand gesture, a click is an index finger movement.
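There's no public eye-tracking API on Apple platforms for this (yet), but the same "gaze then commit" pattern can be approximated in RealityKit with a head-gaze pointer: whatever entity sits under the center of the view is the current target, and a tap stands in for the index-finger click. A rough sketch:

```swift
import UIKit
import RealityKit
import Combine

// Sketch: head-gaze pointer + tap as a stand-in for HoloLens-style
// "gaze then commit" selection.
final class GazeSelector: NSObject {
    private var updateSub: Cancellable?
    private weak var target: Entity?

    func attach(to arView: ARView) {
        // Re-evaluate the gaze target every frame.
        updateSub = arView.scene.subscribe(to: SceneEvents.Update.self) { [weak self, weak arView] _ in
            guard let self, let arView else { return }
            let center = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)
            self.target = arView.entity(at: center)  // entity under the "gaze"
        }
        // Stand-in for the index-finger gesture.
        arView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(commit)))
    }

    @objc private func commit() {
        guard let target else { return }
        print("Clicked \(target.name)")  // trigger the entity's action here
    }
}
```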
 