You lack imagination, my friend.

I for one would love a HUD that gives me driving or walking directions overlaid onto the road, the ability to answer calls and see and dismiss notifications with a tap of the air (the lidar sensor registers my hand position), or a giant virtual monitor.

I believe we're on the brink of a revolution in human/computer interfaces.
I have lots to say to that, so forgive me if I seem a bit ranty! Mostly I'm just avoiding work :p
1) It's not imagination, it's faith in the current tech. Those things are great in principle, but if the tech existed to do it well we'd have seen parts of it already. Remember, Apple's strength has never been inventing great new breakthrough technologies, it's always been about bringing together existing technologies in a great new way.

To overlay useful information on the real world, Apple need to know a) where stuff is, and b) useful information about it, and they clearly don't, or it'd be in Maps already. Given they've been working on this for years, I find it borderline unbelievable that they're suddenly going to go from crap** to great* at exactly the time they announce their AR glasses.

*and it needs to be great
**and it currently is crap, and it's no good people saying "their POI data is good where I live" - it needs to be reliably good in most places to be trustable enough for you to rely on the product.


I suspect they could do a HUD to give decent driving directions, that's pretty easy. But walking directions? No chance. If I followed Apple's walking directions I'd have been run over multiple times, and I strongly suspect having these overlaid on your vision would make you considerably less likely to think critically about the instructions you've received and far more likely to blindly follow them.

2) That's the obvious and commonly cited use-case example, but how often does it actually come up? I don't think most people need directions very often, even outside of lockdown, certainly outside of a car (where the directions could be overlaid on the windscreen with no need to wear glasses).

3) It's not just POI data that's the problem, it's accurately positioning stuff in AR so it's perfectly placed and looks static when necessary. I feel that because of its position as a kind of augment to your own senses rather than a clearly standalone device (like a phone), an AR product needs to be closer to perfect than any other product category Apple makes. If stuff floats loosely round in your view, or wobbles, or flickers, or clips, it'll ruin the experience and you won't want it attached to your face. Nothing I've seen suggests Apple are close to perfect AR.

An example: one use case I've heard suggested is to give people instructions on how to repair or assemble things, but to do that well it would need to be able to identify particular components, even when they're partially (or fully) obscured by other components, or by the fact that I've picked them up. Current AR tech isn't even close to good enough to do that.

4) Same goes for the Minority Report style input you describe. User input recognition needs to be perfect or it's frustrating. Button pushes are essentially perfect: they almost always work. Touch screens are somewhat less so - sometimes they get it wrong and sometimes they're frustrating - but I think they get away with it. I've never experienced any sort of motion capture input system that's close to perfect enough to not be frustrating, and again I've seen no indication that anybody is close to a breakthrough here despite lots of effort in the field (eg in VR). I'd like to be wrong, but given how poorly secrets are kept these days I doubt it.

5) What are the use cases other than directions and floating tooltips showing me user ratings of the Eiffel Tower? It's not just my imagination here: I've seen plenty of sci-fi TV, films and games with such technology, as well as material from the likes of Google and Facebook imagining such technologies, and none of it looks compelling for day-to-day use.

I can see professional use cases where essentially it's less about AR and more about just having a screen that's always in your field of view that you don't need to hold (useful, admittedly), or even more AR-ish use-cases like the repairing/assembling one I mentioned earlier. So sure, once the technology is good enough for those use-cases it may be of great interest to people in certain jobs, but I still don't see what makes it interesting to the ordinary iPhone-buying, MacRumors-reading public like myself.

6) Battery life and weight. Well, this is an obvious one, but unless there've been significant technological breakthroughs that somehow none of us have heard about, then these things must be either chunky and heavy, or have highly reduced display quality and/or battery life.

TL;DR:
1) The perfection required for AR to be usable is far higher than for other product categories, and the tech isn't there
2) Most use-cases people put forward just aren't actually that useful

I agree that 100 years from now nobody will want to go out without their AR (though I doubt it'll be in the form of a frame hanging over their face), but right now I think it's like fusion power or self driving cars.
 
I love the animation. What this makes me think about:

New software that allows us to create AR art thanks to a combination of iPad and Pencil + AirTags for the spatial rendering, all of that using no code. It would be the 'killer' feature for AirTags vs. Tiles (AirTags as iPad accessory). It would also make AR more 'funky' and 'cool' and pave the way to Apple Glass (if Apple Watch is a good indicator, they are willing to go to great lengths - gold version and celeb endorsement - to make their wearable tech seem cool and not dorky).

New software that allows us to collaborate live with Pencils (just like Google Docs, where two people can be in the same document making changes: two people with two iPads drawing in the same document at the same time).

Apple Pencil integration with iMac (transformation into a 24" digital canvas when the screen is tilted).

one can dream :)
 
Most AR implementations are dumb and gimmicky. Maybe Apple will do to Apple Maps what Google has done to Google Maps with indoor and outdoor AR navigation cues.

https://blog.google/products/maps/redefining-what-map-can-be-new-information-and-ai

[Animated GIF: Google Maps Indoor Live View demo]
UniFi AR is quite useful to me, but I agree, it has not developed at the pace I thought.
 
> UniFi AR is quite useful to me, but I agree, it has not developed at the pace I thought.

That's still gimmicky, and also proprietary. The standard for data centers is labeling the switch port on the switch and labeling the patch cable ends with the hostname. No expensive proprietary device needed, or to drop and break. Switch port assignments rarely change anyway.
 