
mattspace

macrumors 68040
Original poster
Jun 5, 2013
3,599
3,134
Australia
I’ve watched the video, and while it’s very slick, the thing that most strikes me is that it’s almost entirely untextured 2D flats in an empty (video passthrough) space.

The gaming mention - again, a 2D game stream, mapped onto a flat. 2D video files, mapped onto a flat screen floating in the space.

This is an order of magnitude less demanding than the experiences a Vive or any other “real” VR headset displays.

I’d like to be proven wrong, but at this stage I don’t see any evidence to make me believe this device is capable of doing actual VR, by which I mean immersive 3D environments with 3D objects you can walk around and interact with. Or running a game like Half-Life: Alyx.

I think if the system could do it, they’d have shown one “create in 3D” spot. They haven’t even shown the “collaborative 3D design in AR” that was literally everyone’s primary justification for what Apple would do with AR.
 
This is not really a VR headset, but more of an AR headset.

I don’t really think it’s much of an AR headset either. There was no demonstration of any real interaction between virtual objects, and the real world. They didn’t show creating virtual objects, and then placing them on real world objects, and having them become persistent etc.

The more I think about it, the more I think this was VERY smoke & mirrors.
 
Seems very much to be an office/professional tool to work on ‘things’ placed into a real space rather than an immersive experience.
 
As was said, it's an AR headset, not VR. But it has an M2 inside of it so I expect it'll be very capable.

Apple silicon is not actually very capable at 3D (that’s why consumer AMD GPUs blow the doors off them in 3D benchmarks), and in a headset you’re rendering 2 distinct views of any object’s geometry & textures.

What I saw in that video seemed very carefully to avoid the things that other AR headsets do to demonstrate their capabilities.
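The "2 distinct views" point can be made concrete with a toy cost model (all numbers below are invented for illustration; real engines cut the CPU side with single-pass instanced stereo, but the GPU still has to shade two viewpoints):

```python
# Toy model: naive stereo rendering submits the scene once per eye,
# roughly doubling view-dependent work (culling, vertex transforms, draw calls).
# Scene sizes below are made-up, illustrative numbers.

def frame_cost(draw_calls: int, vertices: int, eyes: int) -> dict:
    """View-dependent work scales with the number of rendered viewpoints."""
    return {
        "draw_calls": draw_calls * eyes,
        "vertices_transformed": vertices * eyes,
    }

mono = frame_cost(draw_calls=1_500, vertices=4_000_000, eyes=1)
stereo = frame_cost(draw_calls=1_500, vertices=4_000_000, eyes=2)

print("mono:  ", mono)
print("stereo:", stereo)  # twice the view-dependent work per frame
```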
 
Apple silicon is not actually very capable at 3D (that’s why consumer AMD GPUs blow the doors off them in 3D benchmarks), and in a headset you’re rendering 2 distinct views of any object’s geometry & textures.

What I saw in that video seemed very carefully to avoid the things that other AR headsets do to demonstrate their capabilities.
There are a bunch of apps on iPhone with LiDAR etc. that you can use if you want to demonstrate it to yourself.
 
I’ve watched the video, and while it’s very slick, the thing that most strikes me is that it’s almost entirely untextured 2D flats in an empty (video passthrough) space.

The gaming mention - again, a 2D game stream, mapped onto a flat. 2D video files, mapped onto a flat screen floating in the space.

This is an order of magnitude less demanding than the experiences a Vive or any other “real” VR headset displays.

I’d like to be proven wrong, but at this stage I don’t see any evidence to make me believe this device is capable of doing actual VR, by which I mean immersive 3D environments with 3D objects you can walk around and interact with. Or running a game like Half-Life: Alyx.

I think if the system could do it, they’d have shown one “create in 3D” spot. They haven’t even shown the “collaborative 3D design in AR” that was literally everyone’s primary justification for what Apple would do with AR.

Well, the Disney section showed the most actual AR in 3D space. A live basketball game on a tabletop in AR would be awesome. I am not that surprised, as they have kept it away from developers, who just love to blab. And Apple has no actual VR games in the store to show off. I am sure they will come, but yes, I am surprised they didn't have one on show. Perhaps it's the lack of any physical controllers? I am shocked they didn't have an Apple Pencil for art stuff.

The M2 is a beast, right? So there is no reason it can't be done. The M2 is way faster than even the upcoming XR2 Gen 2 in the Quest 3.

 
Apple silicon is not actually very capable at 3D (that’s why consumer AMD GPUs blow the doors off them in 3D benchmarks), and in a headset you’re rendering 2 distinct views of any object’s geometry & textures.

What I saw in that video seemed very carefully to avoid the things that other AR headsets do to demonstrate their capabilities.

The M2 is about 6 times faster than the Meta Quest 2 chip in single-core and blows it out of the water in multicore. The M2 GPU is also way faster. It's a lot faster than the upcoming Quest 3 chip too.

Sure, it's not going to be discrete GPU / 4090 fast... but it should be able to handle 3D really well. I think they avoided it because they haven't created any demos. But RealityKit and ARKit can do this stuff.

They did hint at stuff with the Disney section and even the weird Breathe calm petals in space.
 
The M2 is about 6 times faster than the Meta Quest 2 chip in single-core and blows it out of the water in multicore. The M2 GPU is also way faster. It's a lot faster than the upcoming Quest 3 chip too.

Sure, it's not going to be discrete GPU / 4090 fast... but it should be able to handle 3D really well. I think they avoided it because they haven't created any demos. But RealityKit and ARKit can do this stuff.

They did hint at stuff with the Disney section and even the weird Breathe calm petals in space.
If they can show OpenBrush or similar doing 3D content creation, in a 3D space, and keeping the frame rate up over 90, I’ll be more convinced, but what they showed seemed to be conspicuously missing the things that are actually hard for an AR system to do.
 
Apple silicon is not actually very capable at 3D (that’s why consumer AMD GPUs blow the doors off them in 3D benchmarks), and in a headset you’re rendering 2 distinct views of any object’s geometry & textures.

I think the fact that this thing is self contained indicates that Apple isn't as interested with the VR side - obviously you're not going to be able to shoehorn a discrete AMD solution into something like this. But when it comes to GPU capability given the constraints of the form factor, does M2 have any peers?
 
I’ve watched the video, and while it’s very slick, the thing that most strikes me is that it’s almost entirely untextured 2D flats in an empty (video passthrough) space.

The gaming mention - again, a 2D game stream, mapped onto a flat. 2D video files, mapped onto a flat screen floating in the space.

This is an order of magnitude less demanding than the experiences a Vive or any other “real” VR headset displays.

I’d like to be proven wrong, but at this stage I don’t see any evidence to make me believe this device is capable of doing actual VR, by which I mean immersive 3D environments with 3D objects you can walk around and interact with. Or running a game like Half-Life: Alyx.

I think if the system could do it, they’d have shown one “create in 3D” spot. They haven’t even shown the “collaborative 3D design in AR” that was literally everyone’s primary justification for what Apple would do with AR.
Did you miss the part at the end where they talked about Unity integration? This is EXACTLY what they meant. 3D games in a 3D environment, and that can be either VR or AR. I'm pretty sure we'll see RE8 in VR, as I'm sure Apple didn't get the dev kit to any game company with enough time to show examples. Who knows, maybe Death Stranding 2 or something else from Kojima will have the full VR treatment. It has long been rumored that Kojima is working on a VR experience.
 
I’ve watched the video, and while it’s very slick, the thing that most strikes me is that it’s almost entirely untextured 2D flats in an empty (video passthrough) space.

The gaming mention - again, a 2D game stream, mapped onto a flat. 2D video files, mapped onto a flat screen floating in the space.

This is an order of magnitude less demanding than the experiences a Vive or any other “real” VR headset displays.

I’d like to be proven wrong, but at this stage I don’t see any evidence to make me believe this device is capable of doing actual VR, by which I mean immersive 3D environments with 3D objects you can walk around and interact with. Or running a game like Half-Life: Alyx.

I think if the system could do it, they’d have shown one “create in 3D” spot. They haven’t even shown the “collaborative 3D design in AR” that was literally everyone’s primary justification for what Apple would do with AR.
But why do I want "immersive 3D VR"? I want to put a giant monitor or Omnimax-style screen in front of me wherever I am. That's the actual use case for an HMD, not pretending I'm a video game character who is constrained to my living room. I want to be able to change positions at my desk and not have to hunch over, crane my neck, move around displays or squint. I want to watch a movie on a plane or train with a theatrical-quality experience.

Also, there's no doubt it's capable of 3D VR; the M2 alone is an order of magnitude more powerful than whatever Snapdragon chip is in a Meta Quest, which millions of people use for 3D VR gaming every day.

VR is really cool, for something like a lasertag game where you can make a warehouse look like a martian battlefield for cheap. But outside of that kind of experience, MR and ideally AR is what will actually be useful. Providing large screens, useful info, or great content playback to my life exactly when and where I need it!
 
Apple silicon is not actually very capable at 3D (that’s why consumer AMD GPUs blow the doors off them in 3D benchmarks), and in a headset you’re rendering 2 distinct views of any object’s geometry & textures.

What I saw in that video seemed very carefully to avoid the things that other AR headsets do to demonstrate their capabilities.
Other VR headsets have been basically flops though. Why say "we do what the HTC Vive does" when no one knows what that device is? Show reasons people might actually want it, like replacing my external monitors, iPad, etc. To be able to turn wherever I am into a great workspace or immersive theater. That's actually useful. Gaming is a fun add-on, but not a reason many people will spend $3500. It's just not what this hardware is designed for or capable of. I'd take input from my ~$1500 gaming PC on a virtual IMAX screen over any home VR game I've ever experienced.
 
I have a similar concern. We need to wait for all the videos/dev sessions on how to create different experiences on visionOS (https://developer.apple.com/visionos/learn/). If you take a look at some of the video thumbnails it looks like they will demonstrate building more complex 'full 3D with world interaction' style apps instead of these floating windows. What you're describing right now can be accomplished on the iPhone using ARKit and RealityKit but I don't know how complex and detailed those scenes can be.

So why didn't they demo more of what you're describing? I have two theories:

#1: Vision pro is a computer, not a gaming or "3D Experience" device
I think Apple are focused on making people believe that this is a new paradigm of computing rather than a VR 'experience' headset. VR goggles right now have the reputation of being 3D game focused and AR goggles have a reputation for being shoddy.

Apple correctly identified that waving your arms in the air is not conducive to a comfortable computing experience, so that's not the primary method of interacting with the device. Yes, the device CAN detect you waving your arms around, which is useful for a game, but that's not what people would want to do with a generalized computing platform, so Apple didn't focus on it (it got a brief mention).

We will see 3D-first novel productivity platforms in the future, a lot of them built on visionOS no doubt, but Apple need to make the experience familiar, at least initially, so people understand its utility and can relate it to how they already use their Apple devices. People liked the iPad because it was just a bigger version of what they already loved about their iPhone -- visionOS is certainly a bigger leap than that but to some degree Apple are intelligently leaning on that iPad strategy for the introduction.

They're picking their battles. They're pitching the practicality of the paradigm (infinite canvas, how the experience scales *literally*), not trying to reinvent the process of productivity (introducing a 3D native asset akin to the document for productivity, a new task tracking system) at the same time because that way you run the risk of losing the average person.

#2: Vision Pro can't compete on VR scene fidelity
If you take a look at some of the leading VR games like Half Life Alyx or a detailed racing sim on a headset like Valve Index, you need a powerful discrete GPU to run the games at a sufficiently high FPS and those headsets do not have anywhere near the same resolution as Vision Pro (Vision Pro has approx 5x the number of pixels vs. Valve Index).

So think about running complex 3D environments at 120+ fps across 23 million pixels using an onboard M2, the same chip that ships in a MacBook Air. Oh, and the chip can't get too hot because it's right next to the user's face. Last I checked, the MacBook Air cannot run 3D games very well (outside of simple arcade-style games), and its display has fewer pixels than a single eye in Vision Pro.
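The pixel arithmetic in that comparison is easy to check; here is a quick sketch using the publicly quoted display specs (roughly 23 million pixels total for Vision Pro, 1440x1600 per eye for the Valve Index):

```python
# Back-of-envelope pixel comparison, using publicly quoted display specs.
VISION_PRO_TOTAL_PIXELS = 23_000_000   # Apple's "23 million pixels" figure (both eyes)
INDEX_PER_EYE = 1440 * 1600            # Valve Index per-eye panel resolution
INDEX_TOTAL_PIXELS = 2 * INDEX_PER_EYE

ratio = VISION_PRO_TOTAL_PIXELS / INDEX_TOTAL_PIXELS
print(f"Valve Index total pixels: {INDEX_TOTAL_PIXELS:,}")
print(f"Vision Pro / Index pixel ratio: {ratio:.1f}x")  # roughly the quoted ~5x

# Fill-rate demand at a VR-typical 90 Hz refresh:
pixels_per_second = VISION_PRO_TOTAL_PIXELS * 90
print(f"Pixels shaded per second at 90 Hz: {pixels_per_second / 1e9:.2f} billion")
```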

Apple probably avoided showing 3D scenes because the device simply can't handle them. I don't think Apple necessarily care about that right now because they don't think gaming and AR gimmicks (physics and projecting stuff onto the world) are how you convert people; Oculus have already tried and failed. After trying to use Magic Leap for productivity and various other VR headsets, I'm inclined to agree.

I can't bloody wait for this thing.
 
Did you miss the part at the end where they talked about Unity integration? This is EXACTLY what they meant. 3D games in a 3D environment, and that can be either VR or AR. I'm pretty sure we'll see RE8 in VR, as I'm sure Apple didn't get the dev kit to any game company with enough time to show examples. Who knows, maybe Death Stranding 2 or something else from Kojima will have the full VR treatment. It has long been rumored that Kojima is working on a VR experience.

It wasn’t in the video - hence my questions. I’ll take it a lot more seriously if Unreal starts supporting it, because that’s the ascendant platform in 3D & content production.

But I remain unconvinced - nothing in the demo video showed any interactions with the real environment, nor any three dimensional content.

Also no mention of IPD calculation / adjustment, which again is odd because it’s a major part of headset tech requirements.
 
But why do I want "immersive 3d VR"?

Once you’ve worked on 3D content, in a stereoscopic 3D environment, you’ll understand. All the time wasted on viewport manipulation is gone. It’s the most fundamental change in work technology since the GUI.


I want to put a giant monitor or Omnimax style screen in front of me wherever I am. That's the actual use case for a HMD, not pretending I'm a video game character who is constrained to my living room.

If I want to see a slightly different angle on the thing I’m creating, I can:
  1. Change my current cursor tool to a viewport manipulation tool
  2. Use the mouse etc to drag and move the viewport to see the new angle.
  3. Change back to the work tool to continue working
  4. Switch back to the viewport moving tool. Etc.
Or:

  1. Move my head.
I want to be able to change positions at my desk and not have to hunch over, crane my neck, move around displays or squint. I want to watch a movie on a plane or train with a theatrical-quality experience.

It’s definitely targeted at office workers and people doing 2D tasks. But that’s the thing, I’m not seeing anything of actual AR in this headset.

Where are the instructional guides showing the procedure for maintaining a mechanical system, identifying parts, providing guidance, etc.?

Also, there's no doubt it's capable of 3D VR; the M2 alone is an order of magnitude more powerful than whatever Snapdragon chip is in a Meta Quest, which millions of people use for 3D VR gaming every day.

But not 3D work. Most of those self-contained headsets are like cheap Android tablets: novelties whose destiny is to end up in a drawer.

Again, it’s unsettling to me that they didn’t show a single “making 3D content in a 3D tool, in a 3D space” demo.

VR is really cool, for something like a lasertag game where you can make a warehouse look like a martian battlefield for cheap. But outside of that kind of experience, MR and ideally AR is what will actually be useful. Providing large screens, useful info, or great content playback to my life exactly when and where I need it!

Yeah, see, that misses the entire point of having stereoscopically separated content. The best analogy I can make is that what Apple showed is the iPad pre-stylus.

The Apple Pencil is the most important transformative part of the iPad as a paradigm. Otherwise it’s just a big iPhone.
 
Once you’ve worked on 3D content, in a stereoscopic 3D environment, you’ll understand. All the time wasted on viewport manipulation is gone. It’s the most fundamental change in work technology since the GUI.




If I want to see a slightly different angle on the thing I’m creating, I can:
  1. Change my current cursor tool to a viewport manipulation tool
  2. Use the mouse etc to drag and move the viewport to see the new angle.
  3. Change back to the work tool to continue working
  4. Switch back to the viewport moving tool. Etc.
Or:

  1. Move my head.


It’s definitely targeted at office workers and people doing 2D tasks. But that’s the thing, I’m not seeing anything of actual AR in this headset.

Where are the instructional guides showing the procedure for maintaining a mechanical system, identifying parts, providing guidance, etc.?



But not 3D work. Most of those self-contained headsets are like cheap Android tablets: novelties whose destiny is to end up in a drawer.

Again, it’s unsettling to me that they didn’t show a single “making 3D content in a 3D tool, in a 3D space” demo.



Yeah, see, that misses the entire point of having stereoscopically separated content. The best analogy I can make is that what Apple showed is the iPad pre-stylus.

The Apple Pencil is the most important transformative part of the iPad as a paradigm. Otherwise it’s just a big iPhone.
Um, the iPad is a video screen. I have a pencil and a keyboard/trackpad for mine; they see 5% of the use the iPad does. Vision Pro is also a video screen. The 3D use cases are great, and transformative, and something that this device can do without a doubt. But they're also not the point of this device. That's what things like HoloLens are for: commercial/professional use cases. Vision Pro is a consumer video playback device. There will be so many apps that allow you to move your workpiece around, or to move around your workpiece, in visionOS, but like the Pencil for iPad, most users will not use them much. They'll use this thing to watch YouTube or Marvel movies on an airplane. To check sports scores on the toilet. To send texts or snapshots to friends. That's what people use computers for. The actual serious work is a tiny niche of the computer industry, and not one Apple is really interested in.
 
My first thought after yesterday's keynote: finally, there's a headset with a great user interface. A few months ago, John Carmack (former Oculus CTO) said that the current VR hardware is sufficient, yet the software and UX/UI are still lacking.

Apple is concentrating on the latter.
Great UI and user experience, combined with seamless connectivity to Apple's entire ecosystem = faster mass adoption.

Of course, at $3500, most people will wait. But they'll hear and read about Vision Pro from somewhere. Friends, colleagues, news, Apple Store etc...
And what they'll hear is that the headset is easy to use and work with. So when the price comes down, they'll know what to buy.

I own an Oculus Quest, and I'm fairly happy with it for gaming.
Yes, it's the best-selling headset. But from a UX/UI point of view, the Oculus Quest is the BlackBerry. The Vision Pro (from what I've seen yesterday) is the iPhone 1.
 
After trying to use Magic Leap for productivity and various other VR headsets, I'm inclined to agree.

A bunch of valid points, but I'm going to zero in on this one because it's important: Magic Leap was a Ponzi scheme, created by the same types of people who pushed blockchain, NFTs, cryptocurrency, and "AI". Its only purpose was to sucker investors and drain their money.

I've done significant work in VR - architectural modelling in the VR environment, 3D sculpting to 3D printing workflows, designed large objects for theatre productions that would hang over the cast and audience, and talked with people using it for medical rehab, particularly PTSD and phantom limb issues.

If the story is primarily "putting 2D flats in the space around you", I'm just not convinced, because that's not a compelling long-term reason to strap something heavier and bulkier than a welding helmet (I weld) onto your face to do office work.
 
I'm also unconvinced by putting the graphical horsepower in the headset, as opposed to tethering - that's a lot of expense to become obsolete REALLY quickly while we're still in the acceleration phase of GPUs and immersive 3D complexity. I want to see a story of how large models and environments can be persistent on the Mac, with the on-device processing focussed on driving the screens.
 
I’ve watched the video, and while it’s very slick, the thing that most strikes me is that it’s almost entirely untextured 2D flats in an empty (video passthrough) space.

The gaming mention - again, a 2D game stream, mapped onto a flat. 2D video files, mapped onto a flat screen floating in the space.

This is an order of magnitude less demanding than the experiences a Vive or any other “real” VR headset displays.

I’d like to be proven wrong, but at this stage I don’t see any evidence to make me believe this device is capable of doing actual VR, by which I mean immersive 3D environments with 3D objects you can walk around and interact with. Or running a game like Half-Life: Alyx.

I think if the system could do it, they’d have shown one “create in 3D” spot. They haven’t even shown the “collaborative 3D design in AR” that was literally everyone’s primary justification for what Apple would do with AR.
Just because it doesn't seem to fit your use case doesn't mean it's ********. That would be like saying the Mac Pro is ******** just because you don't have a use case for it, or saying a server is ******** because, again, you have no use case for it.
 
It wasn’t in the video - hence my questions. I’ll take it a lot more seriously if Unreal starts supporting it, because that’s the ascendant platform in 3D & content production.
Unity is probably the top game engine used for creating VR experiences/games, especially for mobile-class devices. Lack of Unreal support is probably because of the not-so-great relationship between Apple and Epic right now.
Also no mention of IPD calculation / adjustment, which again is odd because it’s a major part of headset tech requirements.
It's automatic.
The lenses' positions adjust automatically to match the user's IPD, according to people who have been given demos of the Apple Vision Pro.
 