The AR demo Apple did got me thinking that an iPhone + a cheap headset might be the future. They moved the phone around and we saw the AR adjust (things getting bigger, the perspective changing as they moved closer to the table), so the iPhone's hardware seems capable of delivering a good-enough experience. Everyone already owns their phone, so all they'd need is an inexpensive headset and external controller(s) to bring your hands into the experience. And because the phone has a rear-mounted camera, they could implement something like I mentioned above to let you easily see the outside world.
Anyways, that's my hope...that we see a headset and external controllers, not just AR apps/games designed around holding the phone (like the ones Apple demoed). Those can be great, too, but the immersive experience of being inside the VR world is another thing altogether.
The HTC Vive has a front-facing camera that lets you see what's going on around you, such as spectators watching you, and lets you keep an eye on your play area so you don't wander out of bounds. It's extremely handy.
Apple is supposedly working on a headset, but I worry about them lazily slapping a phone into it, which could be problematic for people who have astigmatism or wear glasses. That headset should've been mentioned or revealed at their recent keynote to assure people they have this under control.
My other concern is that they're encouraging people to hold up an iPad for AR for long periods of time, which could make you look like a jacka$$ outside of the house or in the street. The last thing you want is spectators coming up to you and asking what's going on with your screen while you're doing something private. See the problem?
Secondly, they were anti-touchscreen for the iMac, thinking it would create 'gorilla arm'. Well, look at their demo of the guy holding up the iPad to play the game. To me, it's hypocritical, and it tells me they didn't keep the ergonomics of interaction in mind.
This is why I'm an advocate for using headsets for ergonomic and privacy reasons.
I'm quite sure it's possible to have the iOS device point at the area while streaming live to your goggles. The goggles or glasses would have to be transparent so you can see what's going on around you, to avoid bumping into things or missing people trying to talk to you. The problem is, the goggles would need some kind of mirror or combiner to hold the graphical overlays.
Also, the goggles/glasses would need a camera built in so they can stream to the iOS device, which then sends the graphics back to you, and so on. You would need TWO cameras: one to view the surface(s) or surroundings so you can interact with the AR program, and a second to provide a live feed of your surroundings (i.e., people, fences, walls, etc.).
The HTC Vive has already solved that part.
I don't think you get it either. ARKit will improve AR and make it more widely available. Accessible means a low barrier to entry: you don't have to buy additional hardware, which is what Apple is doing with the iPhone.
Oh, I do get it. But using the iOS device is not going to be the only way. Watch. You may want AR glasses or goggles so you don't have to hold an iPad or iOS device for long periods of time.
And if you don't have to buy additional hardware, then why is Apple making a headset so late in the game?