
Almost60

macrumors newbie
Original poster
Nov 24, 2025
Denver
Purchased M5 AVP after release.



I purchased the AVP because, in my mind, I needed a new computer. I have an M1 Max MacBook Pro with 64 GB of RAM.



The AVP certainly sparks that novelty excitement of getting a new tool that takes a touch of learning but then just works. I have multiple injuries to one of my eyes from 30 years ago, and I was expecting it wouldn't work for me: the hole/scar in my macula makes the vision in that eye, for practical purposes, a black hole, and my depth perception has been compromised for more than thirty years. I don't need glasses to see better, far or near, with my other eye. I think the AVP has the sharpest and clearest image I have ever seen from a device. I suppose that's expected from a new device, but the text is phenomenal and crisp. Apple's 3D dinosaur video seemed to come out of the screen. I'm not sure how the device compensated for my eye, but it looked 3D and I have not noticed any stuttering.



Sound: if I cup my hands over the top of my ears, the sound from the AVP becomes louder and more dynamic. I can turn the volume up and get more depth to the sound without cupping, but I am surprised at how much better the sound is when I cup my hands above my ears. Just to note, I have tested my hearing with AirPods Pro 3 and the results were above the threshold for hearing loss.





It seems like the Apple Watch could be used as an input for the AVP, so that when a hand is not visible to the AVP (hands under a blanket or a desk) the Apple Watch could detect finger tapping. I wonder whether the accelerometer in the watch could detect motion well enough to play games. The Sony PSVR2 controllers are not ideal because holding your hand in a fist is not as natural as using your hand and fingers. I have the controllers and a subscription to fit fun, and the controllers seem to work well with the game, but it seems like the Apple Watch has the same 6DoF tracking and can recognize taps…
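Just thinking out loud about how that might work: below is a minimal watchOS sketch, assuming you could read the watch's accelerometer with Core Motion and treat short spikes as taps. The TapDetector name, the 1.8 g threshold, and the debounce window are all my own guesses, and as far as I know there is no Apple-provided way to feed watch taps to the AVP as input, so relaying the tap to the headset would need its own plumbing (e.g. WatchConnectivity).

```swift
import CoreMotion

// Rough sketch: sample the watch's accelerometer and flag short, sharp
// spikes in acceleration as finger taps. Threshold and debounce values
// are guesses and would need real tuning.
final class TapDetector {
    private let motion = CMMotionManager()
    private var lastTap = Date.distantPast

    /// spikeThreshold is in g's; the reading at rest is roughly 1.0 g.
    func start(spikeThreshold: Double = 1.8, onTap: @escaping () -> Void) {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 100.0   // sample at ~100 Hz
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self, let a = data?.acceleration else { return }
            // Magnitude of the acceleration vector.
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            // Debounce so one physical tap doesn't fire more than once.
            if magnitude > spikeThreshold,
               Date().timeIntervalSince(self.lastTap) > 0.25 {
                self.lastTap = Date()
                onTap()   // here you'd relay the tap to the headset somehow
            }
        }
    }

    func stop() {
        motion.stopAccelerometerUpdates()
    }
}
```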





The hand tracking seems to work well, but when it starts to stop reacting (especially in the App Store app), force quitting all apps gets tracking working again with a less accurate response; a full restart is necessary for hand tracking to respond as expected. I have played Asphalt 8 (an Apple Arcade game) on the AVP with a Sony PS5 controller and with the PSVR2 controllers, and it works great. I can switch between the Sony controller and hand gestures for the game menus (after a race ends I may grab a drink from a glass and can use hand input immediately), and it feels more responsive and more reliable than the OS when I suddenly use my hand to click a game menu after using the controller. It's so quick to click through things; it feels snappy.



Sometimes I see glare, almost like light is shining on my eyeballs and that shine is reflecting off the lenses. I'm not sure what is happening because it does not happen every time, it isn't associated with light coming in around the seal, and I can't reproduce it on demand; it just happens. I don't think it's a dirty or smeared lens, but there are too many variables for me to decipher.



Just like with many new things, it takes a bit to get your head around workflow optimization. Tonight, I watched PBS News on the TV in my living room while I tweaked an image in Affinity on my MacBook Pro, and it seemed just as good as or better than working directly on my computer. It was also the first time I used the trackpad on the computer, and it worked in the AVP too. Again, it's difficult to optimize because it's so novel, but I didn't need to depend on the AVP to recognize my gestures, because the computer's trackpad works the same in the AVP OS while connected to the computer.
 