You know when you wake up in the middle of the night, and just stumble around in the dark to go to the bathroom because you don’t want to turn on a light and hurt your eyes?
That’s basically what the pass-through is good for. I actually thought it would be pretty good. For example, open the video recorder on your iPhone 13 or later and just view your room through the screen. It looks fine, right? Objects are crisp and details are clear. When you look around you see some motion blur as the image smears a bit in the direction you’re turning (a bit like a micro time-lapse photo capture). But if you had to walk around using nothing but the screen, you’d be completely satisfied.
The Vision Pro pass-through video isn’t even close to that. It’s on the level of an iPhone 5 or 6, maybe even the iPhone 4, honestly. Everything is blurry and lacks detail. Quality degrades with distance and edges get hazy. You can see what’s there, but you really don’t want to look at it. The immersive environments are actually more photorealistic than your room. I’m not kidding.
In contrast, virtual content is laser sharp, a juxtaposition that makes the blurry pass-through image look even worse. The pass-through does serve well to keep you from getting disoriented in your physical space via your peripheral vision. It’s good enough to make sure you don’t bang your shin on a table, knock your glass over, and so on. But it’s unpleasant to actually look at things in the space around you. I really don’t know how Apple published the scene of a woman packing her luggage while wearing it with a straight face; that is patently absurd.
All of this is to say that the “reality” half of this “mixed reality” experience is a non-starter. While wearing the Apple Vision Pro, it’s only worth interacting with the virtual content. The moment your focus turns away from that, you’re better off just taking the headset off. Your space is just kind of there as a backdrop for the floating virtual content. It’s almost hard to call it an AR device at all. This probably could have been a purely VR product and might have even been better for it.