I agree that the Apple Watch is a relatively unnecessary accessory.
It is convenient for many and brilliant in some aspects (like health), but make no mistake, Apple Glasses will change EVERYTHING if Apple delivers.
Imagine:
- NOT having to look over at your dashboard-mounted phone when driving, or even better, getting non-obstructive walking directions while in motion.
- Getting notifications for emails, texts, or calls in your peripheral vision, and being able to respond or swipe them away with a hand gesture or a light shake of the head.
- Being on a train watching Netflix on your glasses, and having it pause and move out of your field of vision the moment you stand up.
- Having the device call 911 when it detects a life-threatening sudden stop while registering that your eyes are close to the glass.
- Being able to see a new paint color on the walls, or how furniture fits in a room, with greater realism and without having to hold a device in front of you.
- Not needing a computer monitor at all when stationary (for light tasks, at least).
I could go on, but this is the FUTURE. IDK if all or ANY of the above can/will happen, but I'm genuinely curious to see where Apple goes with this.
These are some great examples and I think at least some if not all of these are exactly where Apple will take this. Plus no doubt some other examples we haven’t even thought of yet.
Your computer monitor one is one I’ve thought about a lot. Right now I can have a large and/or multi-monitor setup at home or at the office, but when I’m out and about I have to settle for the smaller laptop display, or the iPhone/iPad/watch display depending on what I’m doing. These glasses could change all that: I could set them up to emulate (in AR) any monitor arrangement I like, anywhere, for any of my devices. A 32-inch 6K display for my watch? No problem. All these devices have varying levels of computing power and different screen sizes; the glasses could give me any virtual screen size(s) I want for any of them. The only limitation would be the resolution of the glasses themselves.
Perhaps even that’s limiting it too much, though: imposing today’s paradigms on the new tech. Instead, these glasses could potentially redefine how we do everything we do on computers today, in ways we can’t even imagine yet.
For now, these plus some AirPods Pro, say, will certainly redefine how we see and hear (i.e. receive) a computer’s output, which is what the monitor and speakers currently do. To further revolutionize computing, though, we need to revolutionize how we input into the computer. For decades it’s been keyboard and mouse. More recently it’s touch and speech, with gestures on the rise. What’s next for input?
The ultimate is when we figure out how to send our thoughts directly to these things. That’s a way off yet, of course, and even when we figure that out there’ll still be work to do to make sure it’s safe and always accurate. In the meantime...
Well, if these can track our eye movements and focus (tech exists that can do that, but can it fit into something like a pair of glasses?), then that, either alone or combined with gestures or touch, could potentially replace the mouse for a computer and touch-screen input for a phone or tablet. For example: look at something on my virtual monitor to emulate mouse movement, and tap finger and thumb together by my side to do what a mouse click does, or something like that. But we’d still need a more convenient way to get text into a computer. Speech cuts it only to a point: it’s pretty hard to dictate code (e.g. a web developer inputting HTML, CSS, JavaScript, etc.). So some kind of text input is still required, until perhaps we get to a point where we’re redefining programming languages to be more compatible with dictated language.
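Just to make the gaze-plus-pinch idea concrete, here’s a toy sketch of the event mapping I have in mind. This is purely illustrative: the event names (`GazeSample`, `Pinch`) and the whole pipeline are made up, since no such glasses API exists yet.

```python
from dataclasses import dataclass

# Hypothetical input events; a real AR glasses API would look different.
@dataclass
class GazeSample:
    x: float  # where the eyes are looking, in virtual-screen coordinates
    y: float

@dataclass
class Pinch:
    pass  # finger-and-thumb tap detected by a hand-tracking camera

def map_input(events):
    """Translate a stream of gaze + pinch events into mouse-style actions."""
    cursor = (0.0, 0.0)
    actions = []
    for ev in events:
        if isinstance(ev, GazeSample):
            cursor = (ev.x, ev.y)              # gaze moves the pointer
            actions.append(("move", cursor))
        elif isinstance(ev, Pinch):
            actions.append(("click", cursor))  # pinch clicks wherever you're looking
    return actions

# A glance at (120, 45) followed by a pinch becomes a move there, then a click there.
print(map_input([GazeSample(120, 45), Pinch()]))
```

The nice property of this split is that the eyes do the fast, low-effort part (pointing) and the hand does the deliberate part (committing), so you don’t accidentally “click” everything you look at.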
Needless to say, these glasses are the start of some (more) huge changes. We’ll look back in 10 years at how they’ve changed the world in ways similar to how we look back at what the iPhone has done.
Exciting stuff if you ask me. 😊