
MacRumors

macrumors bot
Original poster
Apr 12, 2001
62,073
28,007


Apple on Tuesday seeded a sixth beta of visionOS, the software designed to run on the Vision Pro headset, and it includes two new tutorial videos shown to the user during the setup process.

apple-vision-pro-eye.jpg

The first 36-second onboarding video, shared by @M1Astra, is called "Input Training" and demonstrates how the user interacts with visionOS by looking at UI elements and selecting them using a double-tap gesture. The narrator says the following:
Your eyes and hands are how you navigate Apple Vision Pro. You browse the system by looking, and it responds to your eyes. Simply look at an element and tap your fingers together to select it. It's like a click on your Mac. To scroll, pinch your fingers together, and gently flick. You can keep your hands where they're comfortable, such as resting on your lap.

With the Apple Vision Pro headset, the built-in cameras create a customized "Persona" that resembles the user, and this Persona is used in video chat apps like FaceTime. In a second, 50-second video also shared by @M1Astra, "Persona Enrollment," Apple demonstrates how Personas are set up by using the EyeSight display to guide the user:
To set up your Persona, you'll remove Apple Vision Pro to capture your appearance. Take your time getting ready, and ensure nothing is covering your face. To start capturing, hold Apple Vision Pro at eye level. Keep your arms and shoulders relaxed. Then, follow the instructions.

Turn your head to the right, to the left, and tilt up, and down. Then you'll capture your facial expressions. Smile with your mouth closed, smile showing your teeth, raise your eyebrows, and close your eyes. When you're done, put Apple Vision Pro back on to see your Persona.

As we previously reported, the 3D capture process requires users to remove anything that covers the face, such as glasses. There does not appear to be any kind of secondary scanning mechanism to separately capture a user's glasses, and instead, Apple will allow users to "Select Eyewear" from a variety of options.

Personas are one aspect of ‌visionOS‌ that Apple is working on perfecting before the launch of the Vision Pro headset. In September, Apple began surveying developers who have the Vision Pro about their Personas, soliciting opinions on facial expressions, appearance matching, and more.

The Vision Pro headset is set to launch in early 2024 in the United States, and it will be priced at $3,500. The headset will likely be released in the UK and Canada later the same year.

Article Link: Apple Vision Pro Onboarding Videos Unearthed in visionOS Beta 6
 

Ctrlos

macrumors 6502
Sep 19, 2022
394
918
I know it's first gen and everything, but I was hoping for something more than 'floating iPad windows' strapped to your face. With proper AR I want the room around me to become the interface: every surface becomes an infinite whiteboard and monitor, and I can pin things to physical spaces, e.g. my shopping list just naturally floating on my fridge for me to add to. That would be proper spatial computing, because it uses actual space.
 

MudHolland

macrumors newbie
Jan 15, 2022
2
11
I know it's first gen and everything, but I was hoping for something more than 'floating iPad windows' strapped to your face. With proper AR I want the room around me to become the interface: every surface becomes an infinite whiteboard and monitor, and I can pin things to physical spaces, e.g. my shopping list just naturally floating on my fridge for me to add to. That would be proper spatial computing, because it uses actual space.
I'm not saying this product is going to be revolutionary, but your comment feels like the comments when the iPad was introduced: "It's just a big iPhone".
As an interaction designer, I'm looking more at the Vision Pro ushering in the next interaction paradigm: spatial computing. Right now it's a clunky Quest 3, a slightly less clunky Vision Pro, or an awkward pair of sunglasses. This might evolve into every device tapping into the spatial UI, like your oven knowing you're looking at it and floating a window 'in space', outside the confines of the oven, literally connecting the world around you.
 

EmotionalSnow

macrumors 6502
Nov 1, 2019
345
1,223
Linz, Austria
I know it's first gen and everything, but I was hoping for something more than 'floating iPad windows' strapped to your face. With proper AR I want the room around me to become the interface: every surface becomes an infinite whiteboard and monitor, and I can pin things to physical spaces, e.g. my shopping list just naturally floating on my fridge for me to add to. That would be proper spatial computing, because it uses actual space.
You're not gonna walk to your fridge every time to read your shopping list though? That eliminates the entire advantage the digital world has over the physical world.

And even if you're in your kitchen, you still need to be wearing the Vision Pro. The only benefit over physical notes is that you could have photos and interactive content, and that's hardly necessary for a shopping list.
 

WiiDSmoker

macrumors 68000
Sep 15, 2009
1,846
6,937
Dallas, TX
I know it's first gen and everything, but I was hoping for something more than 'floating iPad windows' strapped to your face. With proper AR I want the room around me to become the interface: every surface becomes an infinite whiteboard and monitor, and I can pin things to physical spaces, e.g. my shopping list just naturally floating on my fridge for me to add to. That would be proper spatial computing, because it uses actual space.
That’s likely when it becomes glasses / contacts. Hard to do that with it strapped to a cord
 

timber

macrumors 6502a
Aug 30, 2006
992
1,770
Lisbon
One of the few somewhat recent Apple products that manages to once again evoke that old-fashioned awe and amazement.

On a more rational note, I'm still a bit unsure what to really do with it beyond the usual VR use cases.
 

Ctrlos

macrumors 6502
Sep 19, 2022
394
918
You're not gonna walk to your fridge every time to read your shopping list though? That eliminates the entire advantage the digital world has over the physical world.

And even if you're in your kitchen, you still need to be wearing the Vision Pro. The only benefit over physical notes is that you could have photos and interactive content, and that's hardly necessary for a shopping list.
That's just a very basic use case though. As and when this product gets refined down to glasses level, it needs to be less a device that pushes out data from its own apps and more one that pulls data in from the IoT objects around it. We want AR to use the digital to enhance the physical space, not the other way around. That is to say, I want a reason to get off my ass and gaze at the fridge so it can tell me what's going off and what needs replacing, rather than that just popping up as a notification in my field of view. From a software design standpoint, we don't want to overload the user's physical viewpoint; the system should only react to things being physically gazed at.

Kind of like how the Focus system works in Horizon Zero Dawn.
 

Ctrlos

macrumors 6502
Sep 19, 2022
394
918
I'm not saying this product is going to be revolutionary, but your comment feels like the comments when the iPad was introduced: "It's just a big iPhone".
As an interaction designer, I'm looking more at the Vision Pro ushering in the next interaction paradigm: spatial computing. Right now it's a clunky Quest 3, a slightly less clunky Vision Pro, or an awkward pair of sunglasses. This might evolve into every device tapping into the spatial UI, like your oven knowing you're looking at it and floating a window 'in space', outside the confines of the oven, literally connecting the world around you.
The iPad was obviously revolutionary to me because it launched with a keyboard and Pages; it was a great writing machine! From day one Apple marketed it as a line of computers rather than an accessory.

I reckon the endgame product is actually nothing like Apple's centrally controlled system at all, but rather an open-source system that pulls in and parses data from the IoT sensors all around you. The Find My network and Amazon's Sidewalk are already all around us and don't need a centralised server to run from. An open platform to parse the data, the way WebKit can be used for different web browsers, is the answer.
 

HobeSoundDarryl

macrumors G4
These tutorials feel… weird? I don’t know how to explain it, it just gives me an odd feeling.

Probably what that first guy said when handed a cellular phone and told to roam around the area to make and take calls. Where's the cord? Where do I put in the quarter? There's no way this big hunk of tech is ever going to be popular with the masses. Solution in search of a problem. If I want to make phone calls, I'll use the phones I already have... or a pay phone on every corner when out and about.

Probably what the first guy said when he sat down at a PC. Where's the room full of hardware? Where do the punch cards go in this thing?

Probably what most said when they took their first flight. Cars, trains & boats get me to my destinations just fine. If God wanted man to fly...

Probably what many people said when they encountered that first automobile. How do you hook the horses to this thing? Where are the reins? Wait, you want me to pour highly flammable liquid into this area right in front of where me and the family sit, and then this- what do you call it- 'engine' is going to trigger explosions over and over to make it go? I've seen explosions, sonny...

Imagine all of the people in their first exposure to a keyboard: "ummmm, this is 'merica... and in these parts, the alphabet starts with A, not Q. How is anyone supposed to find the right letter when they're all mixed up like this? At least you got the numbers mostly right... except zero comes before 1, not after 9. The funny thing is, over here on the keypad to the right, the zero is below the 1 instead of above the 9. So they got it right in one place and wrong in another. Must be some kind of metric keyboard."

Probably what many thought of the first lightbulb. So where does the kerosene go in this thing? How does the match get inside this chamber to light that little wick? My candles and lanterns work perfectly fine and don't require a monthly subscription to this electricity nonsense.

All "brand new" anything triggers "odd feelings." Fear of change is as old as mankind. It's perhaps the most normal feeling humans can feel short of the fundamentals: hunger, fear, lust, greed, anger, love, etc.

This thing is new and very different from the Apple tech we know. We should feel "odd" about it. We just had months of people quite familiar with USB-C on their other Apple devices ranting, raving, freaking out, etc. about USB-C replacing Lightning in the iPhone. I can't even count how many posts I read with the words lint magnet, wobbly, broken tongues, etc. in them. And then Apple actually launched it, and that whole wall of trepidation seemed to evaporate.

And there's still some lint. And we're not tripping over all of the broken tongues everywhere. And a massive industry of USB-C repair shops hasn't popped up to handle all of the anticipated wobbly/broken port repairs.
 

XboxEvolved

macrumors 6502a
Aug 22, 2004
614
556
I like how the person they use in the videos looks incredibly smug, like someone who would smugly let you know they're one of the first people to experience the "spatial computing(tm) revolution" while looking at you with creepy dead fake eyes and wearing their stupid VR space goggles.
 