
Apple is developing a new augmented reality app codenamed "Gobi," according to leaked iOS 14 code reported today by 9to5Mac.

The report claims the app would revolve around QR code-like tags that trigger an augmented reality experience, adding that Apple appears to be testing integrations with Apple Stores and Starbucks. "For instance, users would be able to hold up their phone in an Apple Store and view information about the products on display, get pricing, and compare features."

[Image: Apple AR glyph]

The report adds that Apple plans to make an SDK or API available to third-party companies to "provide their own tag identifiers, which would load up custom assets and scenery for that company," noting that this would be based on extensions built into App Store apps. It is unclear if the API would be widely available or limited.
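The leaked Gobi SDK itself isn't public, but a comparable tag-triggered overlay can already be put together with ARKit's existing image-detection API. The snippet below is only a sketch of that approach: the "GobiTags" reference-image group and the plain blue overlay are made-up placeholders, not anything from the leak.

```swift
import ARKit
import SceneKit
import UIKit

// Sketch only: recognise a known tag image and overlay content on it.
// "GobiTags" is an assumed asset-catalog group of AR reference images.
final class TagOverlayViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        let configuration = ARWorldTrackingConfiguration()
        // The reference images stand in for the QR code-like tags the report describes.
        configuration.detectionImages =
            ARReferenceImage.referenceImages(inGroupNamed: "GobiTags", bundle: .main) ?? []
        sceneView.session.run(configuration)
    }

    // ARKit calls this when one of the reference images is spotted in the camera feed.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        let size = imageAnchor.referenceImage.physicalSize

        // Cover the detected tag with a translucent plane; a real app would load
        // product details or custom scenery keyed to the tag's identifier instead.
        let plane = SCNPlane(width: size.width, height: size.height)
        plane.firstMaterial?.diffuse.contents = UIColor.systemBlue.withAlphaComponent(0.4)
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2
        node.addChildNode(planeNode)
    }
}
```

The third-party extension mechanism the report describes would presumably replace the hard-coded overlay with assets supplied by the tag's owner, but nothing about that API is known beyond the quoted description.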

For more new features and changes expected in iOS 14, keep an eye on our roundup.

Article Link: iOS 14 Leak Reveals Apple Developing New Augmented Reality App Codenamed 'Gobi' With QR Code-Like Tags
 
This sounds a bit like Qualcomm's Vuforia library. It's an object recognition platform, and you can attach actions to recognised objects in your own app. It worked pretty well at the time (about five years ago, when I worked on such an app).

You could, for example, organise a scavenger hunt and have people scan objects they need to visit in order to get the next clue, along the lines of the sketch below. I think this is quite fun, and if Apple delivers it without issues it could be a big success.
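To make the "attach actions to recognised objects" idea concrete, here is a rough Swift sketch: a plain dictionary that maps whatever identifier the recognition layer reports (Vuforia targets, ARKit image anchors, scanned tags) to a closure. The tag names and clues are invented for illustration.

```swift
// Invented example: wire recognised tag identifiers to scavenger-hunt actions.
struct TagActions {
    private var actions: [String: () -> Void] = [:]

    // Register what should happen when a given tag is recognised.
    mutating func register(_ tagID: String, action: @escaping () -> Void) {
        actions[tagID] = action
    }

    // Called by the recognition layer whenever a known tag shows up in the camera feed.
    func handle(_ tagID: String) {
        actions[tagID]?()
    }
}

var hunt = TagActions()
hunt.register("statue-in-the-park") { print("Clue 2: head to the old library.") }
hunt.register("library-entrance")   { print("Clue 3: find the red bench.") }
hunt.handle("statue-in-the-park")   // prints the next clue
```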
 
What happened to the whole Tim-trying-to-prevent-leaks thing? (Not that I'm complaining.)
 
What happened to the whole Tim-trying-to-prevent-leaks thing? (Not that I'm complaining.)
It's the developers internally testing iOS 14 who are selling this information (or perhaps a device running it) for money. Tim can't vet every developer to find out who's doing it; it would be hard to catch.
 
Apple is leaking quite a lot these days :)
That's what I thought; this whole week has been a leak-fest.
Would it be crazy to think it might have something to do with the 'work from home' program Apple has just initiated?
How could they stop someone from snapping a picture of their computer screen while working on unreleased OS code, future product glyphs, and so on?
 
The whole issue I have with AR is that, so far, it tries to fix non-existent problems, makes you look like an idiot, and is simply inconvenient. Maybe it's great for something like education or research, but I have yet to see a satisfying use case in everyday life.

The last time I tried it was with the easyJet app, to measure my carry-on bag. Needless to say, I went back to a ruler shortly after. I just didn't trust its result, and it was cumbersome.
 
Sounds pretty neat! I can see lots of applications for both personal and business use.
 
Well, AR will be huge, as it will extend human abilities, but you're right: the implementation right now is still almost non-existent. It will take a while before it's usable and, more importantly, useful. :)

The whole issue I have with AR is that, so far, it tries to fix non-existent problems, makes you look like an idiot, and is simply inconvenient. Maybe it's great for something like education or research, but I have yet to see a satisfying use case in everyday life.

The last time I tried it was with the easyJet app, to measure my carry-on bag. Needless to say, I went back to a ruler shortly after. I just didn't trust its result, and it was cumbersome.
 
I’m not being pessimistic, but this doesn’t sound like a compelling use case for AR.

Apple Stores, for instance, already have iPads next to most products on display that are used to browse pricing and information.

And as for Starbucks, they already have an app and information in store.
 
The whole issue I have with AR is that, so far, it tries to fix non-existent problems, makes you look like an idiot, and is simply inconvenient. Maybe it's great for something like education or research, but I have yet to see a satisfying use case in everyday life.

Well, there's this. (Sorry for the sideways image; $&#@%! iOS image library orientation problem... it's right side up in the library!)

[Image attachment]


Coming to a Home Depot near you.

Lots of great applications for workers. Will it be that useful for everyday life? I dunno; it might result in a rise in injuries due to inattention. I live on the edge of downtown San Diego. Just yesterday I saw tourists crossing the street holding phones in front of them, paying little to no attention to traffic.
 
The whole issue I have with AR is that, so far, it tries to fix non-existent problems, makes you look like an idiot, and is simply inconvenient. Maybe it's great for something like education or research, but I have yet to see a satisfying use case in everyday life.

The last time I tried it was with the easyJet app, to measure my carry-on bag. Needless to say, I went back to a ruler shortly after. I just didn't trust its result, and it was cumbersome.


The only good, real-world AR implementation I've used was a few years ago with the app WordLens. You could hold your phone up in front of text in a different language, and it would translate it for you. I'm not sure I'd call that something you'd use in everyday life, but I used to travel internationally for work a lot, and it made things easier for me.
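For what it's worth, the text-recognition half of that flow is straightforward with Apple's public Vision framework. A minimal sketch of just that half is below; the translation step is deliberately left out, since no particular translation service is assumed.

```swift
import UIKit
import Vision

// Sketch only: pull recognisable text lines out of a camera frame with Vision.
// A WordLens-style app would feed these strings to a translator and draw the
// translations back over the original text regions.
func recognizeText(in image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the top candidate string for each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```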
 
What happened to the whole Tim-trying-to-prevent-leaks thing? (Not that I'm complaining.)

He has. In terms of hardware, I think Apple has actually improved in containing leaks considerably. The Apple Watch has hardly any identifiable hardware leaks, iPhone leaks have dropped significantly (aside from logic board and camera module leaks), and the iPad had a chassis-type leak, but for the most part it's not nearly as bad as it was in 2016.

Furthermore, software leaks are far more difficult to contain, because the information is embedded in code that is released and beta-tested, and can be 'deconstructed,' versus, say, someone at a Foxconn factory leaking a camera module.
 
The only good, real-world AR implementation I've used was a few years ago with the app WordLens. You could hold your phone up in front of text in a different language, and it would translate it for you. I'm not sure I'd call that something you'd use in everyday life, but I used to travel internationally for work a lot, and it made things easier for me.
This indicates how clueless the whole AR strategy is.
But hopefully AirTag is so brilliant it can retrieve the AppleCar/AirTile/AirPower/... that got lost somewhere along his endless product pipeline.
 
Apple appears to be testing integrations with Apple Stores and Starbucks.
It'd be good if you could hold your phone up to the different aspiring-actor baristas at Starbucks and see who can, and can't, make a decent coffee.

(Just waiting for the first "if you can see them working in a Starbucks, they can't make a decent coffee" comment. They're not all bad in Oz.)
 
What I'm looking for is a way to have a QR code on a poster, point your phone at it without having to install a specific app, and have an overlay of information "anchored" to the poster.
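The "no dedicated app" part would need something at the OS level, but the anchoring half can be sketched with public APIs today: Vision finds the QR code in an ARKit camera frame, and a raycast pins an anchor roughly where the poster is. This assumes an ARSCNView-based session and simplifies the coordinate handling; it is not anything from the leak.

```swift
import ARKit
import Vision

// Rough sketch: look for a QR code in the current ARKit camera frame and drop a
// world anchor roughly where the poster is. The coordinate conversion below
// ignores the camera-to-view display transform for brevity; a real app would use
// ARFrame.displayTransform(for:viewportSize:) to handle device orientation.
func anchorPoster(in sceneView: ARSCNView) {
    guard let frame = sceneView.session.currentFrame else { return }

    let request = VNDetectBarcodesRequest { request, _ in
        // Take the first detected barcode that carries a readable payload (e.g. a QR code).
        guard let code = (request.results as? [VNBarcodeObservation])?
                .first(where: { $0.payloadStringValue != nil }) else { return }

        DispatchQueue.main.async {
            // Vision's boundingBox is normalised with the origin at the bottom-left.
            let box = code.boundingBox
            let point = CGPoint(x: box.midX * sceneView.bounds.width,
                                y: (1 - box.midY) * sceneView.bounds.height)
            // Raycast through the code's centre and anchor content at the hit.
            if let query = sceneView.raycastQuery(from: point,
                                                  allowing: .estimatedPlane,
                                                  alignment: .any),
               let hit = sceneView.session.raycast(query).first {
                let anchor = ARAnchor(name: code.payloadStringValue ?? "poster",
                                      transform: hit.worldTransform)
                sceneView.session.add(anchor: anchor)
                // An ARSCNViewDelegate would then attach the overlay content for this anchor.
            }
        }
    }

    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```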
 