WARNING: this is a bit of an incoherent ramble of a post:
I've been thinking about this for the past couple of days and want to revise my position to be more in line with OP's. I think they are fundamentally correct on most of their points except the one about smartphone technology plateauing. I'm still uncertain about the future, but there's more nuance to it than I initially thought, and some of what I said in my first post is wrong.
I still believe AR glasses will be very useful for productivity in some contexts, but I agree they won't necessarily see mass adoption by the general public UNLESS there is a significant reason to wear AR glasses all day, every day. As of right now there's no evidence for that; in fact there's evidence for the opposite: the Apple Watch (and Google Glass). At the very least, my notion that AR glasses reduce the friction of interacting with the digital world is wrong; the opposite is true. Additionally, I'm 100% convinced AR glasses will neither replace smartphones nor be used more than smartphones (I know very few people argue this, but I thought it was worth mentioning anyway since Apple's glasses are talked about in the media as "the thing to replace the iPhone" or "the next iPhone").
Let's start with the problem of the Apple Watch and how intrusive it is in social contexts, be it work meetings or conversing at the pub. The reason I gave for predicting mass adoption of AR glasses is that they reduce the friction of dipping in and out of the real world and the digital world. Given how many people stare at their phone all day, I thought it was only logical that people would want to immerse themselves further in the digital world whilst retaining a connection to the real world by blurring the line between the two (AR glasses, right?). Wrong, that point is foolish.
In reality, the hard line between looking down at your phone and looking up to engage with the world is a positive of the smartphone experience. The Apple Watch is proof of this, because constantly looking down at it when talking with someone is very rude to the other person and often distracting for the wearer. Not everyone has this experience of course, but I certainly can't help but flick my wrist when it buzzes, and it seems a lot of Apple Watch users I interact with do the same. Meanwhile, when my phone is in my pocket I don't bother checking it until there's a break in whatever conversation or work I'm doing at the moment.
The Apple Watch is hostile to sociability and productivity, arguably a lot more so than the iPhone because it interrupts work and social flow. It blurs that hard line between the two worlds just enough to cause a nuisance, so much so that I now only wear mine when working out.
Now can you imagine how much more irritating it would be to have a conversation with someone and watch their eyes lose contact with yours before bouncing around, staring into the void mid-conversation? That's what seamless connectivity looks like in the AR glasses world. Sure, Apple could solve this by only engaging the display and showing notifications after you actively press a button on the remote (the rumored control device for the glasses), but at that point you've somewhat destroyed the idea of seamless connectivity by reintroducing the hard line between the two worlds. Maybe AI will decide when to interrupt you based on context, but even that will be hit and miss.
It seems that even if you like wearing your AR glasses all the time, the people interacting with you won't, to the point that it becomes a faux pas to wear them outside of solo use or the rare in-person multi-user experiences that I think will be few and far between. If that's the case, AR glasses will just be another thing to carry around with you rather than something you put on at the beginning of the day and take off when you get home.
It's also worth thinking about what the AR UX will be in general, and in particular how existing devices might be superior for most daily use cases. Will apps project into thin air on a virtual canvas, or onto the environment and objects around you? A hybrid of both? If apps project onto real 3D space around you, how do you guarantee a consistent user experience in different environments? If apps occupy a virtual canvas, how different is that experience from a regular display? As for using AR glasses as a replacement for multi-monitor workstations, you sacrifice the ability for other people to watch what you're doing. Also, do people want to use a virtual 80-inch display 20 inches in front of them? Most of that content will be in your peripheral vision anyway, and will AR micro-displays ever provide the same fidelity and picture quality that an HDR display provides? Will the glasses be operated via touch remote or hand tracking? A touch remote and flicking a virtual cursor around is a lot more difficult than tapping things on a physical display (use an Oculus or Magic Leap and you'll see what I mean). So maybe hand and finger controls in 3D space instead? Well, that's a lot more complex than a touchscreen and requires both more energy and more physical space than a phone (people are lazy and want to flick their thumbs, not wave their arms around in thin air). I guess contextual info bubbles when shopping would be a use case, but would that cause more irritation than usefulness? It would for me; I would feel like the glasses are constantly nagging me to buy something.
My experience testing a Magic Leap headset and various VR headsets has shown me that 3D experiences often introduce more hassles and variables to process than anything else (even in the obvious use case of gaming immersion, hence my previous prediction that VR will be an optional accessory for peripheral immersion in gaming rather than an attempt to emulate physical reality via hand remotes, etc.). In some ways the restriction of apps to a fixed-size 2D plane is actually liberating vs. the added navigational complexity of 3D. We've had ARKit for a while now, and we had the ability to render 3D scenes long before the iPhone came about... have there been many core apps, productivity or otherwise, that use 3D as the primary mode of interaction? No. If that holds, AR will find itself limited to stuff that necessitates 3D rather than porting all existing experiences to AR... and therefore we're back to AR glasses being a niche accessory rather than an evolution of the smartphone.
The truth is AR glasses will ultimately end up being an optional accessory for smartphones and desktops, both in the consumer and professional worlds, but they will never replace either of them because:
1) Glasses don't provide anything fundamentally new to the connected consumer experience except convenience in some very limited scenarios.
2) Most people not only get by with smartphones but, I think, will ultimately prefer them because of the familiarity, reduced complexity, and inherent benefits of the design (chiefly the screen and cameras).
3) The new experiences possible with AR won't be compelling and broad enough for everyone to want to buy them, let alone use them all day the way we use smartphones.
4) Wearing AR glasses all the time is distracting for the wearer and hostile to the people interacting with them.
It would seem Apple already knows this, which is why they're rumored to be focusing on productivity with their bulky AR goggles first vs. immediately going after the average consumer with pointless games and "here's how cool your day will be with AR Glasses on!" video reels a la Google Glass's first iteration (Google also learnt the hard way and quickly abandoned the regular consumer use case, choosing instead to focus on factory and warehouse use cases where you need information whilst assembling a car or something).
tl;dr Ultimately the smartphone introduced a way for the average person to be connected to the digital world anytime and anywhere. AR glasses will not have anywhere near the same societal impact because they fundamentally don't innovate on that paradigm beyond layering a 3D experience of limited usefulness on top. Both 3D and the accompanying AR glasses introduce more hassle and friction than they do compelling use cases for daily use vs. a smartphone. The 3D metaverse shows no signs of supplanting the traditional 2D experiences that can be accessed on a smartphone. Even if most of the UX issues are solved, that doesn't address the issue of the glasses being 100x more socially hostile and invasive than existing tech like smartwatches and smartphones.
You can make the argument that 'nobody' saw the impact of smartphones coming (as I did in my first post), but actually that's wrong: technologists DID see it coming, because the true innovation being delivered was the portable internet and smartphones just enabled that. Much of the tech world saw the use cases for mass adoption of smartphones before smartphones were even a thing. Maybe it took everyone else a little longer to see it back then because the initial tech was rough around the edges (hence the mobile email story), but has the tech world shown us what the impact of AR will be, despite AR experiences being around long enough for core use cases to arise? It doesn't seem like it.
Hell, the Apple Watch is more useful than AR glasses because people will always be a lot more interested in their health than they will be in 3D experiences, metaverse nonsense, and intrusive notifications/info bubbles popping up in their vision.
Is it possible that the current form of iPhone really is the best possible portable device we will ever invent?
I initially said no, but now I think YES... sorta. Yes, I'm convinced that as of right now the smartphone is the best portable connectivity device we've invented and will remain so for the foreseeable future, BUT there is much room for improvement. I think foldables will be mass adopted because they solve the problem of how to maintain portability (or 'one-handed walk-ability', I suppose) whilst satisfying the demand for bigger smartphones fit for media consumption. A phone the size of the original iPhone that unfolds into the size of an iPad Mini is realistically possible.
Wetware brain implants are a whole different kettle of fish. I think brain-computer interfaces are so far off we might as well call them impossible, given the current lack of understanding of how the brain functions, let alone how to attach a seamless computer experience on top of it. Neuralink's end goal is a pipe dream at the moment.
“AR will enable a new mode of interaction with technology by taking a massive step towards the seamless unification of the digital and physical realm.” Ummmm, okay . . . . so our Homo sapiens species evolves over MILLIONS of years with a profoundly magnificent sensory-motor system based on 3 dimensional space and time (you know, gravity . . . time . . . concrete object space, visual, auditory, spatial and kinesthetic orienting memory systems of perception etc etc; otherwise known as “the real world”); and we are to believe that the “seamless unification of digital and physical” will be a “massive step”? Towards what??? Forwards or backwards?? To purchase or sell what??? To better play army soldier or remodel my kitchen? To be sure, I am not a Luddite . . . but neither am I a digital sucker (born every minute . . .).
Yeah, you're basically on the money here. Initially my long-term obsession with AR productivity led me to think this was a reactionary take, but you are very correct: until a compelling use case comes along for AR glasses they won't be mass adopted to the degree smartphones are, and so far all of my experiences lead me to believe there won't EVER be a compelling use case similar to that of the smartphone. Many players in the tech world are working on AR glasses, but so far none of them have even hinted at why the average person should buy AR glasses and use them every day.
"Towards what???" is the crux of it. There is no obvious benefit to AR like there was for internet connected smartphones so what exactly are we working towards in the development of AR glasses outside of niche use cases?
The impression I get is that AR glasses are being hyped up because of a sci-fi fantasy and an obsession with the technology, rather than because anyone has presented a vision of how much better daily life will be with them.
Digital suckers indeed...