
izzy0242mr

macrumors 6502a
Original poster
Jul 24, 2009
I had this idea over a decade ago, before AR or VR were really viable with existing technology: you point your phone camera at physical objects, use your finger to "write" on them, and the phone detects the motion and virtually overlays your written text onto the real objects. When you point the camera at those objects later, you'd see the note. For example, you could write out a list of ingredients on your fridge as a reminder of what to buy from the store.

Of course, that was not only technologically infeasible, but also not very practical (why write virtual notes you have to scan with your phone instead of just putting them on your phone or physically on your fridge?).

But that's how the Vision Pro works: it lets you pin windows in physical space. Maybe that ability itself renders my idea pointless…but I wonder if you could easily do something like use an Apple Pencil (recognized by visionOS) to write on physical surfaces in floating, pinned, handwriting-style text. Eventually (when the Vision Pro shrinks to a normal glasses form factor and/or battery life is a lot better), it's something that might actually make sense to use regularly.
 
The app would have to remember where every window was placed and what was written on every window so that when the app starts back up, the user could load the windows from a saved file or resource.
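As a rough sketch of that persistence idea, the app could serialize each note's text together with the world-space position where its window was pinned, then reload that file on launch. This is a minimal, hedged example, not a real visionOS API: the names `PinnedNote` and `NoteStore` are hypothetical, and a real app would store an ARKit world anchor rather than raw coordinates.

```swift
import Foundation

// Hypothetical model: one pinned note with its text and where it was placed.
// A real visionOS app would persist a world anchor ID instead of x/y/z.
struct PinnedNote: Codable, Equatable {
    var text: String
    // World-space position where the window was pinned (metres).
    var x: Float
    var y: Float
    var z: Float
}

// Hypothetical store: saves and loads all notes as JSON at a file URL,
// so the windows can be restored the next time the app starts.
struct NoteStore {
    let url: URL

    func save(_ notes: [PinnedNote]) throws {
        let data = try JSONEncoder().encode(notes)
        try data.write(to: url)
    }

    func load() throws -> [PinnedNote] {
        let data = try Data(contentsOf: url)
        return try JSONDecoder().decode([PinnedNote].self, from: data)
    }
}
```

On launch the app would call `load()` and re-pin each note at its saved position; any edit or new note triggers another `save(_:)`.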
 
The really dumb part of the VP’s initial experience is that it does nothing to actually augment reality.
 
A good idea, and I'd like to add this: the ability to scan the instructions/receipts of those items and attach them. It would be a fantastic way to inventory my tools, camera gear, etc., for future reference, warranty claims, and insurance claims.
 
The app would have to remember where every window was placed and what was written on every window so that when the app starts back up, the user could load the windows from a saved file or resource.
visionOS already remembers where windows are placed within large buildings (see MKBHD's video on this). So that's entirely feasible.
 
The really dumb part of the VP’s initial experience is that it does nothing to actually augment reality.
I don't think we'll have true AR until we have usable devices that are true glasses: not camera passthrough, but actual glass lenses with a digital overlay (something more like a high-powered Google Glass) rather than a world replicated by cameras onto screens (like the Vision Pro).
 