Don’t agree with everything you said, but you make some interesting points about the Apple Watch. I wouldn’t include AirPods and fashion in the same sentence though, as they do look daft!
Unfortunately, you made the point about Apple controlling the software and hardware. They do with all their devices, but they’ve proven that isn’t always an advantage; iOS 11 proved that beyond doubt!

But the Apple Watch has become a geeky yet fashionable wearable tech accessory. Still, I cannot see these glasses getting anywhere beyond the workplace.

Truth be told, I can’t see myself wearing AR glasses in public either, but then again, I recall being very self-conscious when using my AirPods initially.

I recognise that by catering to the enterprise market, Google doesn’t have to care about aesthetics as much, but I also see that as a concession that they have pretty much lost the consumer market.

Essentially, my main point is that when it comes to something as personal as wearables, the user must first and foremost be willing not only to wear it, but to be seen wearing it. Otherwise, all the best tech in the world is moot. And I believe that Apple has many advantages here. They are still seen as fashionable, so I believe consumers will be more amenable to trying out a pair of smart glasses if they are released by Apple than by any other company.

Smart glasses won’t be cheap either. I can see a pair with prescription lenses maybe going all the way to $1,000 and beyond. Again, the people who are willing to splurge on Apple Watches are iPhone users who have the disposable income to spend on said accessory as well.

It’s a variation of the old “Apple could release a turd with an Apple logo on it and it would sell” trope, but I have come to see it as more of a compliment than anything else.
 
I’ve personally seen two good uses.

The first was a noise locator. It uses an array of microphones and measures the time delay to each microphone to pinpoint where a sound is coming from. It also has a live camera feed displayed on a monitor. You simply move the camera around and it highlights the component making the noise in real time.
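
For what it’s worth, the “measures the time delay to each microphone” part is essentially time-difference-of-arrival (TDOA) localisation. Here’s a minimal Python sketch of the idea with just two microphones; the sample rate, mic spacing, and simulated delay are my own illustrative assumptions, not details of the actual product:

```python
import numpy as np

# Minimal two-microphone TDOA sketch: estimate the delay between the two
# signals via cross-correlation, then convert it to a bearing. All values
# here (sample rate, spacing, delay) are illustrative assumptions.

FS = 48_000           # sample rate in Hz
MIC_SPACING = 0.2     # distance between the two microphones in metres
SPEED_OF_SOUND = 343  # metres per second

def estimate_delay(sig_a, sig_b):
    """Delay (in samples) of sig_b relative to sig_a, from the
    peak of their cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    return np.argmax(corr) - (len(sig_a) - 1)

# Simulate a noise burst that reaches mic B a few samples after mic A,
# as it would if the source sat closer to mic A.
rng = np.random.default_rng(0)
burst = rng.standard_normal(1024)
true_delay = 7  # samples
sig_a = np.concatenate([burst, np.zeros(true_delay)])
sig_b = np.concatenate([np.zeros(true_delay), burst])

delay_samples = estimate_delay(sig_a, sig_b)
delay_seconds = delay_samples / FS

# Far-field approximation: the delay maps to an angle of arrival.
angle = np.degrees(np.arcsin(np.clip(delay_seconds * SPEED_OF_SOUND / MIC_SPACING, -1, 1)))
print(f"estimated delay: {delay_samples} samples, bearing ~ {angle:.1f} degrees")
```

The real device presumably does this with a whole array of microphones and fuses the pairwise delays, which is how it can pin the source to a spot in the camera image rather than just a bearing.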

The second is for technicians working on an engine. They can point a tablet at the engine and it can identify the components seen on the screen. Selecting a component can bring up additional information or show proper steps to remove/replace the component. When connected to a diagnostic system it can also highlight the most likely causes for a stored fault code.
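
Under the hood, that second use case is mostly a lookup problem once the component has been recognised. A rough sketch of what that mapping might look like is below; every component name, step, and fault code here is made up purely for illustration:

```python
# Hypothetical service-data lookup behind the engine AR app described above.
# All identifiers and codes are invented for illustration.

COMPONENT_INFO = {
    "alternator": {
        "removal_steps": [
            "Disconnect the battery",
            "Remove the drive belt",
            "Unbolt and lift out the alternator",
        ],
    },
    "coolant_temp_sensor": {
        "removal_steps": [
            "Let the engine cool and relieve coolant pressure",
            "Unplug the connector and unscrew the sensor",
        ],
    },
}

# Fault code -> components most likely responsible (illustrative only).
FAULT_CODE_SUSPECTS = {
    "P0118": ["coolant_temp_sensor"],
    "P0562": ["alternator", "battery"],
}

def on_component_selected(component_id: str) -> None:
    """What the tablet might show when the technician taps a highlighted part."""
    info = COMPONENT_INFO.get(component_id)
    if info is None:
        print(f"No service data for {component_id}")
        return
    print(f"Removal steps for {component_id}:")
    for step in info["removal_steps"]:
        print(f"  - {step}")

def highlight_suspects(fault_code: str) -> list[str]:
    """Components the overlay would highlight for a stored fault code."""
    return FAULT_CODE_SUSPECTS.get(fault_code, [])

on_component_selected("alternator")
print("Highlight for P0118:", highlight_suspects("P0118"))
```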
OK, but you don't need AR to solve either of those problems. In fact, with the service use case on an engine, it may not be as intuitive. First, you need to be holding a tablet, which ties up your hands from working on the engine (headsets don't have the accuracy or graphical capabilities to display over very complex machinery, even HoloLens 2). It would be cheaper to have an interactive manual (which you can do with any CAD or PLM system) and record a video of someone performing an actual fix on the engine. It would be a lot cheaper, faster, and more accurate to share service knowledge that way.
 
Ah... good point. In that case, scratch that. This isn’t AR, so it’s not going to be good enough. This is just a ‘heads-up’ display for *one* eye in 2D.

My new opinion is that it will fail and AR glasses will be better suited to industry.

I like the idea of Google Glass (even if it is just a heads-up display), but the implementation is terrible. Even current Glass devices are crap.

Glass feels like using the Palm VII’s web clipping versus the HoloLens as a Gen 1 iPhone. Or, it’s like trying to build enterprise VR content by betting the farm on Google Cardboard.

I’d rather use AR on mobile devices and wait for HoloLens to advance and drop in price than try to use Glass.
 