Unity is probably the top game engine used for creating VR experiences/games, especially for mobile-class devices. Lack of Unreal support is probably because of the not-so-great relationship between Apple and Epic right now.

I wouldn't invest in Unity if I wanted to get my money back one day. I don't think they have the trajectory that Unreal has. I suspect they're on a path to losing more users than they gain, given how easily Unreal can lower their prices - and price was why people chose Unity.

There's a lot of stuff in the Apple world (like big games Apple pays to have made as platform exclusives) that is reliant on Unreal.

It's automatic.
The lenses' positions adjust automatically to match the user's IPD, according to people who have been given demos of the Apple Vision Pro.

OK so the lenses are on a motorised track? That's good.
 
I don’t really think it’s much of an AR headset either. There was no demonstration of any real interaction between virtual objects and the real world. They didn’t show creating virtual objects, placing them on real-world objects, and having them become persistent, etc.

The more I think about it, the more I think this was VERY smoke & mirrors.
Marques spoke a little about the AR experience here:
Pretty funny that mkbhd got to demo it but the primary writer at "macrumors" didn't.
 
And that is such a sad outcome.
Agree to disagree. Waiting for a good HMD to watch movies and have a virtual multi-monitor setup for my PowerBook / C2D MacBook Pro / Retina MacBook Pro / M1 Max MacBook Pro is the thing I've been waiting for longest in my life. Terrible implementations have existed for so long, and now there's finally a headset that hopefully won't be a miserably bad experience for those use cases? Where do I sign? I assume over time other uses will become relevant to me and other paradigms will develop, but the reason this will sell is that it's a better way to watch videos or do spreadsheets on the go.

All the fancy architecture or AAA gaming or virtual machine-shop helper or 3D modeling or whatever apps are niche cases that will not move consumer hardware. One of those style use cases may very well be the next trillion-dollar idea, but it will happen because there is a critical mass of people with hardware to handle it, and watching movies sells hardware. Google exists because people had web browsers and modems, Facebook exists because people had PCs, Instagram exists because we had phones with cameras, YouTube exists because of DV camcorders (then cameraphones, of course). The software has to follow the hardware, and the hardware only sells if there is an existing use case it makes better (DV sold because it was better than analog video, which sold because it was better than 8mm, which sold because it was better than nothing). It's a chicken-and-egg game, and working with 3D content in a 3D environment is very much the chicken; it can't hatch until there are a bunch of eggs out in the wild.
 
Just because it doesn't seem to fit your use case doesn't make it ********. That would be like saying the Mac Pro is ******** just because you don't have a use case for it, or saying a server is ******** because, again, you have no use case for it.

No, my point is that the demos don't show any evidence of the actual compelling thing that people largely claim AR is "for".

Prior to this preview, everyone talking about what an Apple headset would be was talking about real and 3D content intermingling. Let's be clear here - what Apple demoed didn't even show the level of AR you can do with a cellphone - where was the demo of seeing what a new piece of furniture would look like in the space? That was literally the primary use case for AR before this "hang 2D flats in the space around you" version was demoed.
 
It's a nerd toy.. and a very early prototype at that.. wait for a mainstream version for $999 in 2025.. it will be much more useful.
 
No, my point is that the demos don't show any evidence of the actual compelling thing that people largely claim AR is "for".

Prior to this preview, everyone talking about what an Apple headset would be was talking about real and 3D content intermingling. Let's be clear here - what Apple demoed didn't even show the level of AR you can do with a cellphone - where was the demo of seeing what a new piece of furniture would look like in the space? That was literally the primary use case for AR before this "hang 2D flats in the space around you" version was demoed.
Seriously, shut up. This has the M2 plus the R1 processor, and you're out here screaming about how a cellphone with less than half the processing power of just one of those chips can do more with AR. That's absurd. On specs alone, this is the most powerful standalone AR/VR headset in the world. By a large margin. It can do the AR **** you want it to. But no one buys based on use cases they don't understand. They buy things to make tasks they already do easier or more enjoyable. "Give your laptop a huge screen" is a much easier pitch than "someday there will be an app that figures out a reason you'll want to manipulate virtual objects within real space, so buy this now while we figure out what that use case will be".
 
My first thought after yesterday's keynote: finally, there's a headset with a great user interface.
It remains to be seen how well the hand-gesture control will work. It’s inherently limited in precision. It’s hard to imagine, for example, that the virtual keyboard they showed would work well. Simple menu selection and sliders will be fine, but anything much beyond that will probably be difficult to control reliably.
 
Seriously, shut up.

No, but the door to the thread is over there; you're welcome to leave.

That's absurd. On specs alone, this is the most powerful standalone AR/VR headset in the world. By a large margin. It can do the AR **** you want it to.

WHEN I see it do the AR stuff, I'll believe it. Right now, I can buy a Vive XR Elite, and make 3D content in it and place that content in my real world space in a persistent fashion.

That's a "you can do this off-the-shelf" capability.

But no one buys based on use cases they don't understand.

Apple have literally spent the past ~5+ years pitching AR on the iPhone and iPad as being about putting virtual 3D objects into a view of the real world, and now that they have a device literally designed for providing a 3D view of the world, suddenly no one understands what they've been pitching all this time?

Where was Anki Drive enabling cars to race around on your floor or dining table? Where was "fitting a new IKEA bookshelf in that nook", etc.?

They've literally shown this stuff previously, on phone and tablet. THAT was the entire reason for them to hype AR previously.

They buy things to make tasks they already do easier or more enjoyable. "give your laptop a huge screen" is a much easier pitch than "someday there will be an app that figures out a reason you'll want to manipulate virtual objects within real space, so buy this now while we figure out what that use case will be"

Sure, but again, they have already pitched AR as a solution on other devices for years, so why didn't they show it?
 
It remains to be seen how well the hand-gesture control will work. It’s inherently limited in precision. It’s hard to imagine, for example, that the virtual keyboard they showed would work well. Simple menu selection and sliders will be fine, but anything much beyond that will probably be difficult to control reliably.

If you look at a hand tracking demo from Ultraleap, the amount of detail you can pull is ridiculously precise - actually the bigger problem is too much data - there's a lot of frame-dumping and tweening to smooth the motions, because people are a lot more jittery than their proprioceptive experience tells them.
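The "tweening to smooth the motions" described above is, at its simplest, a low-pass filter over the raw tracking samples. Here's a minimal illustrative sketch; the `alpha` value and the sample data are invented for illustration, not taken from any real tracking SDK:

```python
# Illustrative exponential low-pass filter for jittery 1D tracking samples.
# alpha near 0 = heavy smoothing (more lag); alpha near 1 = raw, jittery data.
def smooth(samples, alpha=0.3):
    filtered = []
    estimate = samples[0]
    for s in samples:
        # Move the estimate a fraction of the way toward each new sample.
        estimate = alpha * s + (1 - alpha) * estimate
        filtered.append(estimate)
    return filtered

raw = [0.0, 1.0, 0.2, 0.9, 0.1, 1.1]  # made-up noisy fingertip positions
print(smooth(raw))
```

Lower `alpha` trades jitter for lag, which is exactly the tuning problem the post describes: shipping trackers use fancier adaptive filters, but the underlying trade-off is the same.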

The bigger problem with a keyboard is a lack of tactile feedback, but that's in common with iPads etc.

Another thing that stands out is the limited use of hands - the interaction model was very conservative. It's just iPads floating in space, whereas most players in the field are doing things like mapping controls onto your hands - turn your palm over and palettes & controls are attached to it / bloom from it, etc.
 
No, but the door to the thread is over there; you're welcome to leave.



WHEN I see it do the AR stuff, I'll believe it. Right now, I can buy a Vive XR Elite, and make 3D content in it and place that content in my real world space in a persistent fashion.

That's a "you can do this off-the-shelf" capability.



Apple have literally spent the past ~5+ years pitching AR on the iPhone and iPad as being about putting virtual 3D objects into a view of the real world, and now that they have a device literally designed for providing a 3D view of the world, suddenly no one understands what they've been pitching all this time?

Where was Anki Drive enabling cars to race around on your floor or dining table? Where was "fitting a new IKEA bookshelf in that nook", etc.?

They've literally shown this stuff previously, on phone and tablet. THAT was the entire reason for them to hype AR previously.



Sure, but again, they have already pitched AR as a solution on other devices for years, so why didn't they show it?
Because NO ONE uses the "what does this coffee table look like in my room?" feature for more than 5 minutes, ever. And, more importantly, because "this $3500 device does what your iPhone already does" is not a selling point. What makes people interested is improving the experiences they already enjoy. And the simplest way to show that is "big screen, small device". It's not that deep.

And seriously, please stop being obtuse. How can you on one hand praise the AR capabilities of chips as old as the A12, and on the other doubt that the M2 can do even the same AR tasks? It's just beyond idiotic as an argument. There are tons of things to question and doubt about this device. But "can it do what my iPhone XS could do AR-wise" is not one of them to any serious person, because the answer to that question is self-evident.
 
And seriously, please stop being obtuse. How can you on one hand praise the AR capabilities of chips as old as the A12, and on the other doubt that the M2 can do even the same AR tasks? It's just beyond idiotic as an argument. There are tons of things to question and doubt about this device. But "can it do what my iPhone XS could do AR-wise" is not one of them to any serious person, because the answer to that question is self-evident.

What was demoed very carefully avoided demonstrating any workflows that would tax a 3D system.

How about YOU shut up about how amazing the M2 is until the headset shows an example of it actually doing something you can't do on a cheap Snapdragon headset, or that you couldn't already do for the past 5+ years on iOS devices. Again, hanging flat 2D planes and mapping video files onto them is not a difficult task for a system.
 
The main selling point of Vision Pro is how convincing the 3D visuals and 3D audio actually are.

They mentioned several times that this unprecedented (supposed) high level of “spatial” fidelity cannot be conveyed on other devices or displays other than the Vision Pro.

So given that we can't actually experience this audio-visual combo on the phones and computers we watched the presentation on, most people have obviously just latched onto the "two great 4K displays for singles living in tiny apartments wanting that home-theater experience on Netflix and Disney+" aspect, which is clearly just one of many use cases, and far from the main selling point.

Vision Pro is about combining the best qualities of touch interfaces, like iPad and iPhone, with multi-tasking, multi-window desktop computing, all in a completely hands-free VR/AR experience with virtually endless 4K screen real estate that is also portable.

Long term this will replace both your laptop and your desktop.

It’s just a matter of getting these sensors to track your body, fingers, eyes, voice, etc., well enough that you get the same precision of input a mouse and keyboard can offer.

Not what this version offers quite yet. But in 5-10 years we’ll get very close.
 
If they can show OpenBrush or something similar doing 3D content creation, in a 3D space, while keeping the frame rate above 90 fps, I’ll be more convinced, but what they showed seemed to be conspicuously missing the things that are actually hard for an AR system to do.

I agree they missed it - however AR on an iPhone is pretty rock solid.
 
Um, the iPad is a video screen. I have a Pencil and a keyboard/trackpad for mine; they see 5% of the use the iPad does. Vision Pro is also a video screen. The 3D use cases are great, and transformative, and something this device can do without a doubt. But they're also not the point of this device. That's what things like HoloLens are for - commercial/professional use cases. Vision Pro is a consumer video-playback device. There will be plenty of apps in visionOS that let you move your workpiece around, or move around your workpiece, but like the Pencil for iPad, most users will not use them much. They'll use this thing to watch YouTube or Marvel movies on an airplane. To check sports scores on the toilet. To send texts or snapshots to friends. That's what people use computers for. The actual serious work is a tiny niche of the computer industry, and not one Apple is really interested in.

Disagree completely. The compute power is way ahead of Magic Leap, HoloLens, and Quest Pro / 3 - about 6x, in fact. They have an M2 AND a custom sensor chip (which explains the battery life).

It's entirely up to the developers to use that power.

I agree they should have shown more VR-style use cases - but some of the Disney stuff and the Dino demo we've heard about show its true AR potential.

They actually showed an industrial application of a production line simulation.
 
But it's not - see above

WHEN they can demo a 3D content creation tool, that has a 3D work environment, not just flat screens hanging in space, it will have achieved what HTC and Steam were doing in 2017.

Office workers doing screen-based computing in a welding helmet is going to lose its novelty very quickly. There has to be something inherently three-dimensional about the task you're doing for it to have a long-term advantage over just having screens, beyond an "I can imagine me or someone would like this" sci-fi MacGuffin fantasy.
 
I’ve watched the video, and while it’s very slick, the thing that most strikes me is that it’s almost entirely untextured 2D flats in an empty (video passthrough) space.

The gaming mention - again, a 2D game stream, mapped onto a flat. 2D video files, mapped onto a flat screen floating in the space.

This is an order of magnitude less demanding than the experiences a Vive or any other “real” VR headset displays.

I’d like to be proven wrong, but at this stage I don’t see any evidence to make me believe this device is capable of doing actual VR, by which I mean immersive 3D environments with 3D objects you can walk around and interact with. Or running a game like Half-Life: Alyx.

I think if the system could do it, they’d have shown one “create in 3D” spot. They haven’t even shown the “collaborative 3D design in AR” that was literally everyone’s primary justification for what Apple would do with AR.
Hey…maybe watch the State of the Union…
 
WHEN they can demo a 3D content creation tool, that has a 3D work environment, not just flat screens hanging in space, it will have achieved what HTC and Steam were doing in 2017.

Office workers doing screen-based computing in a welding helmet is going to lose its novelty very quickly. There has to be something inherently three-dimensional about the task you're doing for it to have a long-term advantage over just having screens, beyond an "I can imagine me or someone would like this" sci-fi MacGuffin fantasy.

I think the issue is they didn't show much of that... but it exists. RealityKit / ARKit and Unity will allow for it.
Things shown in full AR/VR:

Butterfly
Breath app
Disney stuff - basketball game, train, Mickey Mouse, the What If...? "game"
Dino demo
and the record decks / mixing console.

I think they have been so secretive with it that Disney is the only one they have let loose with it so far.

The M2's power is leaps ahead of the Meta Quest Pro's, so it should be very capable.



Windows

You can create one or more windows in your visionOS app. They’re built with SwiftUI and contain traditional views and controls, and you can add depth to your experience by adding 3D content.

Volumes

Add depth to your app with a 3D volume. Volumes are SwiftUI scenes that can showcase 3D content using RealityKit or Unity, creating experiences that are viewable from any angle in the Shared Space or an app’s Full Space.

Spaces

By default, apps launch into the Shared Space, where they exist side-by-side — much like multiple apps on a Mac desktop. Apps can use windows and volumes to show content, and the user can reposition these elements wherever they like. For a more immersive experience, an app can open a dedicated Full Space where only that app’s content will appear. Inside a Full Space, an app can use windows and volumes, create unbounded 3D content, open a portal to a different world, or even fully immerse someone in an environment.





Reality Composer Pro

Discover the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps. Available with Xcode, Reality Composer Pro can help you import and organize assets, such as 3D models, materials, and sounds. Best of all, it integrates tightly with the Xcode build process to preview and optimize your visionOS assets.

Unity

Now, you can use Unity’s robust, familiar authoring tools to create new apps and games or reimagine your existing Unity-created projects for visionOS. Your apps get access to all the benefits of visionOS, like passthrough and Dynamically Foveated Rendering, in addition to familiar Unity features like AR Foundation. By combining Unity’s authoring and simulation capabilities with RealityKit-managed app rendering, content created with Unity looks and feels at home on visionOS.
 
Yoda said:
Fear is the path to the dark side … fear leads to anger … anger leads to hate … hate leads to suffering.

OP, don't fear how good the Vision Pro could potentially be... don't be angry Apple is going to be the one to prove AR/VR is a huge market... don't hate Apple for leading a brand new market segment yet again.... it'll lead to the dark side and suffering... just embrace it...
 
I think the issue is they didn't show much of that... but it exists. RealityKit / ARKit and Unity will allow for it.
Things shown in full AR/VR:

*If* they can pull it off, it will be interesting. I'm still unconvinced about the claims for the M2 - in proper tethered VR land even a 4090 has trouble keeping up with the demands of a high-end headset, and the M2 has to drive the screens and 3D, AND run the apps.
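A rough back-of-the-envelope on the pixel load shows why this is a real concern. Assuming Apple's stated figure of roughly 23 million pixels across both displays and a 90 Hz refresh (both from public materials, not measured), versus a typical tethered headset like the Valve Index (1440×1600 per eye, up to 144 Hz, per its public spec sheet):

```python
# Back-of-the-envelope pixel throughput comparison.
# Assumptions: ~23 million pixels total (Apple's stated figure) at 90 Hz,
# vs. Valve Index at 1440x1600 per eye, 144 Hz max refresh.
vision_pro_pixels = 23_000_000
vision_pro_hz = 90

index_pixels = 2 * 1440 * 1600  # both eyes
index_hz = 144

vp_throughput = vision_pro_pixels * vision_pro_hz      # pixels per second
index_throughput = index_pixels * index_hz

print(f"Vision Pro:  {vp_throughput / 1e9:.2f} Gpx/s")
print(f"Valve Index: {index_throughput / 1e9:.2f} Gpx/s")
```

That works out to roughly 3x the raw pixel throughput of an Index, which is presumably why the developer materials lean so heavily on Dynamically Foveated Rendering: only the region you're actually looking at gets shaded at full resolution.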
 