
dvorak0 (Original poster) — macrumors newbie — Jan 17, 2024
I attended the AVP App Lab and tried the device there for more than five hours. Afterward, I have concerns about the game ecosystem on the new platform.

I admit Apple's gesture input is great. But I still believe some games will need a joystick, for the following reasons:

  1. The gesture input provided by Vision Pro has significant delay, as shown in the Reddit thread I linked.
  2. Playing Beat Saber with bare hands feels strange, right? I want something in my hand when playing a rhythm game.
  3. I know Xbox/PS controllers work out of the box, but that doesn't help the games already in the Quest store.
Will there be a solution to this? Or maybe third-party accessories will help. I would definitely pay for one, once I successfully order the Vision Pro.
 
No offense, but as a game developer, shouldn't you be able to answer these questions already? Especially since you attended the developer lab session. I believe the point of those labs is to test your apps and ask precisely the kinds of questions you've posted here.

There are also the Unity forums for developers working in Unity for visionOS.

For 1) The delay in the Reddit link you posted has been reported by some Twitter users to be a result of the simulator/Unity. Actual gameplay won't have the delay.

From the post's OP: "According to multiple sources this delay is NOT present whilst in native apps, only whilst in the simulated development environment in Unity (which makes sense)"

If you're already working with Unity or Reality Composer, you should know this already.

For 2) Apple makes suggestions on how to create and use custom gestures on its Developer site (example: https://developer.apple.com/documentation/visionos/happybeam). Again, as a developer, shouldn't you know how to use the documentation?
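For what it's worth, the HappyBeam sample builds its custom gesture on top of ARKit hand tracking. A minimal sketch of the same idea on visionOS — detecting a simple pinch from joint positions (the 2 cm threshold and the overall setup here are my own assumptions, not Apple's sample code):

```swift
import ARKit
import simd

// Hedged sketch: run a hand-tracking session and watch the distance
// between the thumb tip and index finger tip to detect a "pinch".
let session = ARKitSession()
let handTracking = HandTrackingProvider()

Task {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked,
              let skeleton = anchor.handSkeleton else { continue }

        // Joint positions in the hand anchor's coordinate space.
        let thumb = skeleton.joint(.thumbTip).anchorFromJointTransform.columns.3
        let index = skeleton.joint(.indexFingerTip).anchorFromJointTransform.columns.3
        let d = simd_length(SIMD3(thumb.x, thumb.y, thumb.z)
                          - SIMD3(index.x, index.y, index.z))

        if d < 0.02 {
            // Assumed threshold (~2 cm): treat as a custom pinch gesture.
        }
    }
}
```

The same joint data can drive any custom gesture you like; the sample's approach is essentially this loop plus more elaborate pose matching.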

For 3) Yes, you can use 3rd-party controllers. Incorporating them into VR games is, I guess, up to the developer, but you can see an example of them being used in the link I posted. Again, I'm sorry, but you should know this already as a game developer.

I'm not a programmer, software engineer, or game developer, and I easily managed to find the aforementioned info.
 
I think the OP asked a relevant question about two noticeably different HCI paradigms, gesture-only versus mediated interaction, and their relationship to user feel and satisfaction. That question in no way justified the pedantic response you just blurted out, knowing full well that you are woefully unqualified to formulate it.

I completely agree that rhythm games gain from the grounding offered by contact with a substrate. Who knows, we may see a new standard come out of all of this, considering that the range of interactions presented in the guidelines is quite shallow. À la SpaceMouse 3D?
 
Right. A SpaceMouse in 3D space with decent joystick vibration is what I need. I hope, but don't believe, Apple will deliver that; it depends on how Apple thinks about "spatial computing." I would pay for it for sure, solely for playing a ping-pong game or Beat Saber.
 
Thanks for the reply and the additional material you provided. I did observe hand-tracking delay with a native Swift app. I might be wrong, as I'm still learning the new platform. I shared it, and asked the question here, just to see whether it's a common request from a developer's perspective.
 