
terminator-jq

Original poster
Nov 25, 2012
With the Vision Pro launch just hours away (and now that I can finally relax a little after getting my app approved), I wanted to do a little brain dump on my experience developing for Vision Pro. If there are any other developers who would like to chime in, I’d definitely be interested in reading your experiences. This may also be interesting for those of you who are considering a Vision Pro and want to hear the developer’s side of things.

My app FLOW INFINITY: Dreams (check it out here: https://vizmystech.com/flow-infinity-dreams.php) will be available for Vision Pro at launch (I actually just submitted the first update for the app, so hopefully that version gets approved in time). It’s an immersive meditation app that was developed from the ground up for visionOS. We also developed an iOS version of the app simultaneously, which gave us an interesting contrast and highlighted some of the opportunities… and limitations of visionOS vs iOS.




There’s a lot I could say, but this post would be way too long, so I’ll just give you the main Good and Challenge points, and I’d be happy to add more details if you’d like.

The Good:

• I LOVED creating the immersive spaces for the app. As I shared in another post, the ability to create and share worlds is one of the most intimate experiences I have had as a developer. The ability to (temporarily) change someone’s reality offers so many opportunities and I am already flooded with ideas for my next project. This is the area of Vision Pro that really has some potential and will offer a new creative challenge.

• I was actually really surprised how easy it was to develop for visionOS (once you get past the quirks). Apple has carried over a lot of the same building blocks from iOS, which makes development for visionOS feel immediately familiar.

• The chipset makes things MUCH easier as far as performance goes. One of the biggest problems developing for VR / AR headsets is balancing features with performance. Since the R1 takes a lot of the background processing weight off the M2, it leaves a lot more headroom to push things without worrying about dropped frame rates.
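To give a sense of how familiar those building blocks feel, here’s a rough sketch of what a minimal visionOS app structure looks like. This is not our actual source; names like "DreamsApp" and "CloudScene" are placeholders I made up for illustration:

```swift
import SwiftUI
import RealityKit

// Minimal illustrative visionOS app -- names are placeholders, not
// FLOW INFINITY's real code.
@main
struct DreamsApp: App {
    var body: some Scene {
        // A regular 2D window, declared exactly like on iOS/macOS.
        WindowGroup {
            ContentView()
        }

        // The visionOS-specific part: an immersive space the user
        // can enter from the window above.
        ImmersiveSpace(id: "dreamscape") {
            RealityView { content in
                // Load and place RealityKit content here.
                if let scene = try? await Entity(named: "CloudScene") {
                    content.add(scene)
                }
            }
        }
        .immersionStyle(selection: .constant(.full), in: .full)
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Begin Meditation") {
            Task { await openImmersiveSpace(id: "dreamscape") }
        }
    }
}
```

If you’ve written any SwiftUI for iOS, the `WindowGroup` half of this is literally identical; the `ImmersiveSpace` scene type is the main new concept.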


The Challenges:

• There are some definite design limitations unique to Vision Pro. When Apple first showed the Vision Pro, it was obvious they weren’t intending this to be used like a VR headset, and on the development side, that intention becomes even more obvious… The very nature of visionOS pushes you to design experiences around the assumption that the user will be stationary.


• Having to use eye and hand tracking as the main source of input also brings some limitations. There are certain design elements that work just fine for touch input but would be a pain in the butt to use with eye tracking. The app really must be designed in a way that makes it easy for the user to “focus” on UI elements.

• Then there’s the screens… Now this is another thing that caught me off guard a little during the WWDC reveal. To be honest I was a little disappointed at first with how much Vision OS relied on virtual screens. On the development side, you do have some amount of control to create smaller “screens” but the overall design pattern of the OS is focused on having all major interactions done through a virtual screen.
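For the eye-tracking point above, here’s a sketch of what a gaze-friendly control ends up looking like in practice. The sizes are my own assumptions (roughly in line with Apple’s guidance of generous targets for eye input), and the view names are illustrative:

```swift
import SwiftUI

// Illustrative sketch of gaze-friendly controls. Sizes and names are
// assumptions, not values from a shipping app.
struct MeditationMenu: View {
    let sessions = ["Clouds", "Ocean", "Stars"]

    var body: some View {
        VStack(spacing: 16) {
            ForEach(sessions, id: \.self) { name in
                Button(name) {
                    // start the selected session here
                }
                // Generous hit target so the user's gaze "focus"
                // lands on the control easily.
                .frame(minWidth: 200, minHeight: 60)
                // System highlight when the user looks at the control.
                .hoverEffect(.highlight)
            }
        }
        .padding()
        .glassBackgroundEffect() // standard visionOS window material
    }
}
```

The `.hoverEffect` modifier is what gives the user the visual feedback that their gaze has landed, and it’s surprisingly important; tiny, tightly packed controls that are fine for touch become frustrating fast with eyes.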


All of these factors together definitely pushed my designs and ideas in ways that might have been freer with a traditional VR-focused headset. That being said… and as Apple has said, this is not a VR headset… this is spatial computing… On one hand it seems like a buzzword that Apple uses to separate themselves, but as a developer it really started to sink in that this truly is spatial computing. Something different than VR. The more I began to understand that, the more our app took on a different shape.


Reality Composer Pro:

The other big challenge with development has to do with Reality Composer Pro. Now I wasn’t expecting Unreal Engine 5 levels of quality… or even Unity for that matter, but Apple seriously needs to put more work into this.

• One of the biggest issues I ran into was the way Reality Composer Pro processes lighting, and its overall lack of modern rendering features for lighting. For example, there’s no real global illumination system, no ability to bake light maps, and obviously no newer features like ray tracing.

Since our app Flow Infinity is so centered on the immersive experiences, the lack of lighting features forced us to design our spaces within the limitations of a rendering engine that doesn’t have a proper lighting system.

The biggest setback from this was having to make all of our immersive meditation experiences open-air experiences where you’re floating above the clouds. While these look great, we had originally intended to have some immersive experiences that would place the user within an indoor environment. Unfortunately, without a proper global illumination system, baked light maps, or any sort of light probe system, we could not get the indoor environments to meet our standards of graphical quality.

• There’s also a lack of features for common graphical effects like lens flares, glow effects, and other elements that can add some extra eye candy.
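For anyone hitting the same lighting wall: one partial workaround we’re aware of in RealityKit is image-based lighting, where an HDR environment image stands in for the ambient light a GI system would give you. This is a hedged sketch, not a full substitute for baked lighting, and "SkyEnvironment" is an assumed resource name:

```swift
import RealityKit

// Sketch of an image-based lighting workaround. "SkyEnvironment" is a
// hypothetical HDR image resource name; the function is illustrative.
func applyImageBasedLighting(to root: Entity) async {
    // Load an HDR environment image to act as the light source.
    guard let resource = try? await EnvironmentResource(named: "SkyEnvironment") else {
        return
    }

    // Attach the image as a light, then make the same hierarchy
    // receive that light.
    var ibl = ImageBasedLightComponent(source: .single(resource),
                                       intensityExponent: 1.0)
    ibl.inheritsRotation = true
    root.components.set(ibl)
    root.components.set(ImageBasedLightReceiverComponent(imageBasedLight: root))
}
```

It helps sell outdoor scenes (which is partly why our cloud environments work), but it can’t fake the bounced, occluded light you’d want for a convincing interior.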

On a side note, another slight annoyance that came up, which is more an issue with Xcode than the Vision Pro, is that there’s no way to really preview fully immersive spaces in Xcode’s preview window. This means you have to do a full build each time you want to test the immersive spaces. It’s not a big issue, but what I said about the lighting definitely comes into play here: the way lighting appears for a scene in Reality Composer Pro may not match how the lighting appears in the build. This can make setting up scenes a little tedious.

I’d imagine these issues are more specific to developers like myself who are planning to build apps that are really focused on immersive experiences; for developers who just want to show a 3D object in the user’s room, these issues will probably not be as big of a deal.

Hopefully, Apple will update this as time goes on, and we will definitely update our immersive spaces to take advantage of any new features they add.

• I also ran into another unexpected limit on the number of immersive spaces an app can have. Luckily this didn’t impact our release build, but it may impact how we handle future updates. I really hope I’m completely wrong about this, so hopefully another developer can chime in, but from our experience it seems like each Vision Pro app can only support up to 9 individual fully immersive spaces. Now again, I could be (and hope I am) wrong about this. We may need to do something different in our “App Main” setup.
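One thing we may try in a future update, and I’d be curious whether other developers have gone this route: instead of declaring one `ImmersiveSpace` scene per environment in the app’s main body, declare a single space that takes a value and loads whichever environment you pass in. This is speculative on my part, and all the names here are made up:

```swift
import SwiftUI
import RealityKit

// Speculative workaround sketch for the apparent per-app cap on
// ImmersiveSpace scene declarations: one parameterized space instead of
// one declaration per environment. All names are illustrative.
struct EnvironmentID: Codable, Hashable {
    let sceneName: String
}

@main
struct ManySpacesApp: App {
    var body: some Scene {
        WindowGroup { LauncherView() }

        // A single scene declaration that can host any environment.
        ImmersiveSpace(id: "space", for: EnvironmentID.self) { $env in
            RealityView { content in
                if let name = env?.sceneName,
                   let scene = try? await Entity(named: name) {
                    content.add(scene)
                }
            }
        }
    }
}

struct LauncherView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Open Clouds") {
            Task {
                await openImmersiveSpace(
                    id: "space",
                    value: EnvironmentID(sceneName: "Clouds"))
            }
        }
    }
}
```

If the limit really is on scene declarations rather than on loadable content, this kind of setup would sidestep it; if it’s a deeper limit, it won’t, which is why I’d love confirmation either way.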

Other than those issues, developing for Vision Pro was truly a blast. It’s obvious that Apple has a certain template they want people to follow when designing their apps, and the OS definitely pushes you to follow that template. That said, even with the boundaries that exist in spatial computing that may not have existed with traditional virtual reality, there is still plenty of room for potential and plenty of room to create some really amazing things.

As I said before, my mind is already flooded with so many ideas, although some of those ideas will definitely require Apple to update their rendering system. So there’s my mind dump, fresh after developing my first Vision Pro app.
 
Did you use the RealityKit engine for your rendering engine? Or Metal? And Swift/SwiftUI or C/C++?
I used RealityKit. We originally planned to use Unity, but they’ve adopted some… interesting business practices lately. Not to mention we wanted to stick to Apple’s native resources as much as possible.
 
This was incredibly fascinating and gives us a behind the scenes glimpse. When I think of the best in MR I think of posts like this. So refreshing. Can’t wait to try your App.
 
I need this app in my life, especially on my Vision Pro! I know it is going to look INSANE in an immersive environment. One of my main VP uses will be for meditation purposes. It’s my way to escape reality and free my mind.

FYI: This is one of my favorites: Harmony:
 
Thank you!!! Meditation definitely has a lot of potential with immersive applications. We are really looking forward to seeing how we can use Vision Pro for other mental health applications as well.
 