I'm on the fence here. Even as a developer who would love to get full access to all the capabilities of the hardware.
The issue is that a Mac takes in only the input you give it directly: via the keyboard, mouse, the files you store on it, and so on. The Vision Pro, by contrast, collects a metric ton of incidental input. Because the cameras and eye tracking are always on, it can (and has to) see everything you look at, down to exactly what you are focusing on. It can see the financial records out on your desk, your family members walking around the house, your children, your pets, and the products on your shelves. It can also hear everything you say. On top of that, it has a 3D model of your face and potentially the ability to recreate your voice (assuming the iPhone feature comes over).
Every single part of that would be abused if unleashed to third-party developers.
That said, Apple should at least give us a developer mode with access to a terminal. I don't have high hopes for that, given the rate at which iPadOS gets developer-focused features.