They said we can use a PS5 controller. Like, analog sticks and buttons and whatnot.

VR controllers like the PS Sense, Valve Index, Quest, Vive, etc. are an entirely different type of technology. Which Apple would HAVE to make themselves.

(You cannot pair a PlayStation Sense controller with Bluetooth to a computer - or the Vision Pro - and call it a day. There are specialized sensors required to track those controllers in 3D space.)

Controllers like these:


Anyway, I’m out. Take care!
Ok, this feels like something Apple might work on a few generations in, like how originally iPads didn't have a stylus but eventually Apple came out with the Pencil.

Apple doesn't try to do everything at once. It releases a product that's good enough, then adds features and accessories gradually as the product gets adopted, and they get feedback on how people are using them in real life.
 
VR isn’t “doomed.” It’s niche. And it will remain niche until they can make it as easy as wearing a pair of sunglasses.

We’ll see how it does!
I wait for the day when it IS as comfy and light as a pair of sunglasses.
Not interested in being a test pilot for these, but those of us who won’t be appreciate all of you who want to.
I will follow the evolution of this 😌
 
Hilarious. Because wires coming out of people’s ears was less weird? Some people just can’t adapt. What they use is the only thing anyone should use.

I have a friend who constantly bombards people with how foolish their stuff is. Inevitably, someone talks him into using something, and he jumps on the bandwagon after whining for years. Pretty frustrating.

I can’t give their opinions much weight in these cases.
People with wires coming out of their ears don't look great either, but people who look like they have tampons coming out of their ears look beyond ridiculous. Time will never change that. I don't think anyone else should wear what I wear, in fact I prefer to wear things other people don't or can't wear. I definitely don't jump on any bandwagons. Your assessment of me is really poor, but then again you don't know me so I'm not sure why you tried to assess me in the first place.
 
The only truly “new” features are the smooth, user-scalable transition between AR and VR (which is neat). And the taking of 3D photos and video (which looks creepy AF in their presentation).

Honest question… What AR? What is augmented? All I’ve seen so far is more of a “pass through”. Like ‘transparency mode’ on NC headsets.
 
AR stands for “Augmented Reality.” It’s when you see the actual, physical space around you, but virtual elements (like a 3D object, or a web browser window, or a movie screen, as shown in Apple’s presentation) “float” within that space and can be interacted with digitally.

Think of it like in Minority Report when Tom Cruise is using his hands to move digital elements around. Those digital objects aren’t “real,” but they’re in the physical room with him.

(So yeah, you’re sort of right about what you called “transparency mode.”)

VR - or Virtual Reality - is an entire space where the environment itself and everything in it is digital. None of it is real. But you can suddenly (virtually) be in places like outer space, or on a mountaintop, or in the middle of the ocean (while you’re actually standing in your living room or whatever).

Typically, headsets are either ONLY AR or VR. Not both. (Or you have options on VR headsets that let you “pass through” video to see your actual surroundings. But that’s mostly just used so that you can periodically check your bearings or talk to someone without taking off the headset. The pass-through capability in VR headsets is not AR. There’s nothing virtual or augmented in that video pass-through.)

What the Apple Vision Pro does is let you toggle whether you’re interacting with apps in AR - where the apps sort of “float” in the actual space around you - OR, you can turn a dial and switch the space to be entirely virtual (so it looks like you’re using those apps while standing on top of a 3D, virtually-generated mountaintop.) You can even do a mix, so that half of the space around you is real (and AR), and the other half is virtual.

It’s a neat feature. And one of the few unique features on the Apple Vision Pro that you can’t do on any other single headset.
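To make the dial idea concrete, you can think of it as a single immersion parameter: 0 is pure passthrough (AR), 1 is fully virtual, and values in between carve a virtual region out of your field of view. A toy Python model of that idea (entirely my own sketch, not Apple’s implementation):

```python
def render_mode(azimuth_deg, immersion):
    """Decide, for a viewing direction, whether to show camera passthrough
    (the real room) or the virtual environment. immersion in [0, 1] grows
    the virtual region outward from straight ahead (azimuth 0 degrees)."""
    half_width = immersion * 180.0  # immersion 1.0 covers the full 360 degrees
    return "virtual" if abs(azimuth_deg) <= half_width else "passthrough"

print(render_mode(0.0, 0.5))    # virtual -- straight ahead, dial at halfway
print(render_mode(120.0, 0.5))  # passthrough -- off to the side, still the real room
print(render_mode(120.0, 1.0))  # virtual -- dial turned all the way up
```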
 
All I’ve heard from anyone who’s used the Vision Pro is about swiping and selecting. I’m sure that’s great.

And I certainly think the way you described how you’d crop a photo with your finger could work… in theory.

But when Apple can’t even demonstrate a virtual keyboard in their presentation, or to members of the press in demos… it makes me skeptical about how accurate it is.

I’m sure it’ll be more accurate hand tracking than anyone’s done before. But will it be good enough?

I have my reservations. But we’ll see.

(Meanwhile, a controller solves all those problems. If Apple had optional controllers - in the same way an Apple Pencil is optional for the iPad - I wouldn’t be concerned. But my hands and eyes are the ONLY input method for this thing when manipulating items in a 3D space.)
They also provide us developer kits to make our own gestures, so it’s not limited to just pinch and slide. But you’re right, you need to experience it yourself to answer your own question.
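For anyone curious what a custom gesture boils down to: it’s basically a predicate over tracked joint positions. A toy sketch in Python (made-up coordinates and threshold, not any real Apple API):

```python
import math

PINCH_THRESHOLD_M = 0.015  # fingertips within ~1.5 cm counts as a pinch

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold=PINCH_THRESHOLD_M):
    """Fire the 'pinch' gesture when thumb and index fingertips are close."""
    return distance(thumb_tip, index_tip) < threshold

# Coordinates in meters: 1 cm apart pinches, 8 cm apart doesn't.
print(is_pinching((0.0, 0.0, 0.0), (0.01, 0.0, 0.0)))  # True
print(is_pinching((0.0, 0.0, 0.0), (0.08, 0.0, 0.0)))  # False
```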
 
Daring Fireball impressions


He thought it sucked… no, of course not. His mind was blown. He said he would buy it just for the sports experiences if they were actually as good as demoed. He said the software environment was perfect. He is a pretty big Apple fanboy, but lots of themes are consistent.
 
They said we can use a PS5 controller. Like, analog sticks and buttons and whatnot.

VR controllers like the PS Sense, Valve Index, Quest, Vive, etc. are an entirely different type of technology. Which Apple would HAVE to make themselves.

(You cannot pair a PlayStation Sense controller with Bluetooth to a computer - or the Vision Pro - and call it a day. There are specialized sensors required to track those controllers in 3D space.)

Controllers like these:


Anyway, I’m out. Take care!

I understand that you've left, but I'll continue the conversation you started.

Let's look at the Valve Index controller you pointed to.



What is it? It's essentially an IMU with buttons. It measures the rotation and movement of your hands.

Why is it needed? Because most headsets don't have sufficiently good AI for hand tracking, but also because they don't have the field of view on the cameras to see anything that isn't in front of your face. The point of these little guys is so you can keep your hands in a comfortable position and not hold your hands at 10 and 2 like you're driving a big rig all day.

Does it provide any additional precision? Almost certainly not. How could it? Your hands are holding it, so however imprecise your hands are you also need to add the noise of the gyros and accelerometers. Why not simply track your hand directly?

It has some force sensors to detect squeeze force, but they're advertised to help simulate grabbing and throwing-- so while I think it would be hard for AVP to detect how hard you're squeezing, I think it would be easy to detect throwing and grasping, because you aren't limited to holding a stick and you can fully articulate all 10 fingers.
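To put a number on that noise-stacking point: an IMU gets position by double-integrating acceleration, so even zero-mean sensor noise random-walks into growing positional drift. A toy simulation (made-up noise level, purely illustrative):

```python
import random

def drift_after(seconds, dt=0.01, accel_noise_std=0.05, seed=1):
    """Dead-reckon position from accelerometer noise alone (true accel = 0).
    Integrating twice turns zero-mean noise into an ever-growing position
    error -- the 'noise of the gyros and accelerometers' in question."""
    rng = random.Random(seed)
    velocity = 0.0
    position = 0.0
    for _ in range(round(seconds / dt)):
        noisy_accel = rng.gauss(0.0, accel_noise_std)  # the hand never moved
        velocity += noisy_accel * dt
        position += velocity * dt
    return abs(position)

def mean_drift(seconds, trials=50):
    """Average the drift over many runs to smooth out lucky seeds."""
    return sum(drift_after(seconds, seed=s) for s in range(trials)) / trials

# Averaged drift grows with elapsed time, even though nothing actually moved.
print(mean_drift(10.0) > mean_drift(1.0))  # True
```

(Real controllers correct this drift with outside-in tracking or camera constellations, which is exactly why they need those specialized sensors.)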
 
Daring Fireball impressions


He thought it sucked… no, of course not. His mind was blown. He said he would buy it just for the sports experiences if they were actually as good as demoed. He said the software environment was perfect. He is a pretty big Apple fanboy, but lots of themes are consistent.
Yes I read that, pretty much the same thing as other reviewers said but more in depth about his experience with the spatial, gestures, eye tracking, stability, latency and a lot more. Pretty good read.
 
Does it provide any additional precision? Almost certainly not. How could it? Your hands are holding it, so however imprecise your hands are you also need to add the noise of the gyros and accelerometers. Why not simply track your hand directly?
Couldn't this be calibrated, so for example when you are doing precision work, when your hand moves an inch, the virtual object you are manipulating moves a tenth of an inch?
 
I understand that you've left, but I'll continue the conversation you started.

Let's look at the Valve Index controller you pointed to.



What is it? It's essentially an IMU with buttons. It measures the rotation and movement of your hands.

Why is it needed? Because most headsets don't have sufficiently good AI for hand tracking, but also because they don't have the field of view on the cameras to see anything that isn't in front of your face. The point of these little guys is so you can keep your hands in a comfortable position and not hold your hands at 10 and 2 like you're driving a big rig all day.

Does it provide any additional precision? Almost certainly not. How could it? Your hands are holding it, so however imprecise your hands are you also need to add the noise of the gyros and accelerometers. Why not simply track your hand directly?

It has some force sensors to detect squeeze force, but they're advertised to help simulate grabbing and throwing-- so while I think it would be hard for AVP to detect how hard you're squeezing, I think it would be easy to detect throwing and grasping, because you aren't limited to holding a stick and you can fully articulate all 10 fingers.
🤦‍♂️ I’m not explaining anymore. Do more research. Or try VR. You’re thinking about it wrong.
 
But - based on my experience - a controller can give you far more precision.

An artist who uses oil paints uses a paint brush. (I don’t think DaVinci was a finger painter.) Same goes for calligraphy.

Or using an Apple Pencil on an iPad. Or a Wacom tablet on a computer.

These are all variations of "fat fingers" in the physical world. The reason you need a brush or a pencil is that the pad of your finger is bigger than the spot your eye wants to hit-- but your eye is directing the action, and there's no reason the virtual "brush size" can't be finer than your physical fingertip. Any drawing app has a brush that you control through additional controls. DaVinci selected different physical brushes.

I don't see how holding your hand in a fist makes anything more precise in this case.

Sometimes, you need something in your hand that gives you more accuracy. It’s just basic logic.

Basic logic would explain where that accuracy derives from. It's still your eyes and your hands. What does the tool provide that adds precision, and why is that the only way to achieve it?
 
Couldn't this be calibrated, so for example when you are doing precision work, when your hand moves an inch, the virtual object you are manipulating moves a tenth of an inch?

Exactly. Presumably this is true if you use a controller though too. If you need more precision than your hands allow, zoom in, essentially, or apply a sensitivity multiplier as you do with pencil pressure. Again, that's what we do in the physical world-- a jeweler uses a microscope.
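That sensitivity multiplier is literally just a gain applied to the tracked motion, the same trick as pointer acceleration on a desktop. A minimal sketch (the numbers are made up for illustration):

```python
def apply_gain(hand_delta, gain):
    """Scale a tracked hand movement (dx, dy, dz) before applying it to the
    virtual object. gain < 1.0 trades reach for precision, like turning down
    mouse sensitivity; gain > 1.0 does the opposite."""
    return tuple(component * gain for component in hand_delta)

# Precision mode: the hand moves one inch, the object moves a tenth of an inch.
print(apply_gain((1.0, 0.0, 0.0), gain=0.1))  # (0.1, 0.0, 0.0)
```

Nothing about this requires a controller in your fist; it works identically on a directly tracked hand.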
 
🤦‍♂️ I’m not explaining anymore. Do more research. Or try VR. You’re thinking about it wrong.
You're not required to respond, I assumed you were already gone. I'll mention, though, that you still haven't actually explained anything; all I've seen is "everyone else does this, it must be for a reason." I haven't seen an explanation of what that reason is, only a presumption that there's magic in the controller.
 
I can imagine that for certain types of tasks, pushing buttons or joysticks might be easier for your brain to "translate" into motions in the virtual world. I think your theory that everyone else may be using controllers because they weren't able to achieve the same degree of hand-tracking precision as Apple's Vision Pro also sounds plausible. We'll just have to wait until we can actually use the Vision Pro to find out.
 

Joysticks and buttons are supported-- we saw game controllers demo'd. What is easier for the brain to translate than using hands like you do in the real world? I'm not saying there's nothing, but I haven't seen an example given yet.

The virtual keyboard keeps coming up, but I don't see how the Valve Index controllers in your fists make that any easier than, well, using a Bluetooth keyboard (supported).

Adding, cropping and rotating an image in a Word document came up, but that's just grabbing and manipulating handles by hand, which we've seen demo'd. As you suggest, use a motion multiplier if you want arbitrary precision. But I don't see how a controller in your fist helps in that case either.

The one thing a controller can do that hands in the air can't is provide haptic feedback. But that's not what's being discussed, what's being discussed is precision.

There may be something to aid precision, a sensor that helps, or a physical limitation that can be overcome with a controller, or whatever-- I'd be curious to learn what it is, but I haven't seen one suggested yet.
 
VisionOS removes a layer of abstraction spatially. Using a Mac, you are in a physical place, there is a display in front of you in that place, and on that display are application windows. Using VisionOS, there are just application windows in the physical place in which you are.

...making application windows [as large as himself] and arranging them in a wider carousel, not merely in front of him but around him. The constraints of even the largest physical display simply do not exist with VisionOS.

Eloquently said, Mr. Gruber. He summarized this experience into 3 parts:
1. The first, for lack of a better term, is simply “computing”. Work, as it were. Reading web pages, talking via FaceTime, reading email or messages. Using apps.
2. The second type of experience is the consumption of 2D content, like photos, videos and movies.
3. The third type of experience is fully immersive. Full immersion, like transporting you to a cliff’s edge atop a mountain, or lakeside on a beautiful spring day.

He also describes the device’s VR abilities, to dispel the claim that it is only an AR device.


 
Oh please. If you really believed that, you wouldn't be staring at a screen and posting here. And what is so terrible about sitting in front of a screen and watching a movie? As life and society get more complicated and dangerous, a little escapism can be a good and necessary thing.
Just wait until your little bubble of escapism becomes filled with obnoxious ads. I think there’s a big difference between a physical screen in front of you amongst an otherwise physical world and a screen sitting right up against your eyes. Human beings are analogue creatures and can only evolve and adapt to change so much. To each their own, of course; by all means go for it if it suits you, but I just find all of this very unappealing.
 
AR stands for “Augmented Reality.” It’s when you see the actual, physical space around you, but virtual elements (like a 3D object, or a web browser window, or a movie screen, as shown in Apple’s presentation) “float” within that space and can be interacted with digitally.

Think of it like in Minority Report when Tom Cruise is using his hands to move digital elements around. Those digital objects aren’t “real,” but they’re in the physical room with him.

(So yeah, you’re sort of right about what you called “transparency mode.”)

VR - or Virtual Reality - is an entire space where the environment itself and everything in it is digital. None of it is real. But you can suddenly (virtually) be in places like outer space, or on a mountaintop, or in the middle of the ocean (while you’re actually standing in your living room or whatever).

Typically, headsets are either ONLY AR or VR. Not both. (Or you have options on VR headsets that let you “pass through” video to see your actual surroundings. But that’s mostly just used so that you can periodically check your bearings or talk to someone without taking off the headset. The pass-through capability in VR headsets is not AR. There’s nothing virtual or augmented in that video pass-through.)

What the Apple Vision Pro does is let you toggle whether you’re interacting with apps in AR - where the apps sort of “float” in the actual space around you - OR, you can turn a dial and switch the space to be entirely virtual (so it looks like you’re using those apps while standing on top of a 3D, virtually-generated mountaintop.) You can even do a mix, so that half of the space around you is real (and AR), and the other half is virtual.

It’s a neat feature. And one of the few unique features on the Apple Vision Pro that you can’t do on any other single headset.
I appreciate the lengthy effort, but that just sounds like app windows floating over a camera feed.

When I think of AR, it’s adding things to (the augmentation) the environment that are not really there. There are many examples that work on our phones. Real time map overlays, adding store front names to buildings, games that interact with real objects, etc.

But I hear you. Perhaps in an ultra minimalist way, it’s an AR experience. The app windows do cast shadows that aren’t really there. 🤪
 
It’s incredible how passionate the debate is; it reminds me of the first iPhone after the launch. I personally think that Apple has done it again and made a giant leap into the future. Thank you, I will enjoy it…
 
Joysticks and buttons are supported-- we saw game controllers demo'd. What is easier for the brain to translate than using hands like you do in the real world? I'm not saying there's nothing, but I haven't seen an example given yet.

The virtual keyboard keeps coming up, but I don't see how the Valve Index controllers in your fists make that any easier than, well, using a Bluetooth keyboard (supported).

Adding, cropping and rotating an image in a Word document came up, but that's just grabbing and manipulating handles by hand, which we've seen demo'd. As you suggest, use a motion multiplier if you want arbitrary precision. But I don't see how a controller in your fist helps in that case either.

The one thing a controller can do that hands in the air can't is provide haptic feedback. But that's not what's being discussed, what's being discussed is precision.

There may be something to aid precision, a sensor that helps, or a physical limitation that can be overcome with a controller, or whatever-- I'd be curious to learn what it is, but I haven't seen one suggested yet.
I think a lot of this discussion is based on the fact that each of us is making different assumptions about how precise the Vision Pro is at interpreting hand motions, and how it will feel to manipulate virtual objects in this environment. To me, it feels like haptic feedback from a controller, even just the resistance provided by a physical button or joystick, might help me more precisely calibrate how far I'm moving the virtual objects, as opposed to moving my hand in the air with no resistance. But of course, in a virtual world, normal rules of physics don't apply. Apple can implement things in a variety of different ways. So we won't know whether controllers would be helpful for any tasks, and if so which ones, until we get our hands on the Vision Pro. But it's fun to speculate!
 
I literally had no interest in this product until your review @Dan Barbera. You seem really wowed by it which has made me think twice. Now I’m completely looking forward to it. That’s a turn around I never expected
 
I think a lot of this discussion is based on the fact that each of us is making different assumptions about how precise the Vision Pro is at interpreting hand motions, and how it will feel to manipulate virtual objects in this environment. To me, it feels like haptic feedback from a controller, even just the resistance provided by a physical button or joystick, might help me more precisely calibrate how far I'm moving the virtual objects, as opposed to moving my hand in the air with no resistance. But of course, in a virtual world, normal rules of physics don't apply. Apple can implement things in a variety of different ways. So we won't know whether controllers would be helpful for any tasks, and if so which ones, until we get our hands on the Vision Pro. But it's fun to speculate!
Yeah, agreed.

So far I haven't seen a single review indicating the interaction is anything short of extraordinary. These are people who went in with high expectations and could potentially differentiate themselves by finding flaws, but I haven't seen any yet-- certainly not when it comes to the quality of the experience. Obviously we'll start to find limitations as developers show what it can and can't do, and we'll start to see where it gets flaky as more kinds of people interact with it, but I think it's premature to say either that it lacks precision or that there's no means to use controllers.

The haptics limitation is real. We've done a lot without haptics in the computing world, though. We can pause or add some other perceived resistance to snap to reference points, like we do with mice, for example. I think "tap to click" might in part be because we provide our own haptic feedback in that instance.
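For what it's worth, that snap-to-reference-points trick is usually implemented as a capture radius: once the dragged value gets close enough to a detent, it locks on, and the momentary stickiness reads as resistance even with no haptics at all. A quick sketch:

```python
def snap(value, reference_points, capture_radius=0.25):
    """Snap a dragged value to the nearest reference point when it comes
    within capture_radius; otherwise let it move freely. The brief
    stickiness around each point reads as a detent, without haptics."""
    nearest = min(reference_points, key=lambda p: abs(p - value))
    return nearest if abs(nearest - value) <= capture_radius else value

grid = [0.0, 1.0, 2.0, 3.0]
print(snap(1.1, grid))  # 1.0 -- close enough to the detent, locks on
print(snap(1.6, grid))  # 1.6 -- between detents, moves freely
```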
 