Oh, I've watched plenty of sci-fi movies in my life. I can think of millions of ways we could use AR. But fortunately for everyone, we don't live in my mind. We live in the real world, where actual applications of it don't really exist. So while it's cool that Apple is enhancing a feature, they're enhancing something a majority of people don't use.
If I asked you about the iPhone in 2006, you would say we didn't need it.
 
No. Each phone simply needs to agree on the physical location and orientation of the virtual object. Then each phone renders it the same way it renders any other virtual object.
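One way to get that agreement with ARKit's world-map sharing, as a minimal sketch (it assumes a MultipeerConnectivity session is already set up, and skips peer discovery and error handling):

[CODE=swift]
import ARKit
import MultipeerConnectivity

// Phone A: capture the current world map (including any ARAnchors we added
// for the virtual object) and send it to the connected peers.
func shareWorldMap(from session: ARSession, over mcSession: MCSession) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
    }
}

// Phone B: relocalize against the received map so both phones share one
// coordinate space; the anchors in the map then render at the same pose.
func receiveWorldMap(_ data: Data, into session: ARSession) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
[/CODE]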

Not just location, orientation too. If someone on one phone moves or modifies the object, the info about what happened has to reach the other phone fast, or the effect of being in the same space is broken. And if you block off the sun and cast a shadow on the virtual object from one side, it won't look the same from 90 degrees away, so you have to know not just about the object itself but about how the rest of the scene interacts with it as seen from every orientation.
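Roughly what I mean, as a sketch; the ObjectUpdate payload and the multipeer session here are my own placeholders, just to show a moved object's new transform being pushed out with low latency:

[CODE=swift]
import ARKit
import MultipeerConnectivity
import simd

// Hypothetical payload for "someone moved/modified the shared object".
struct ObjectUpdate: Codable {
    var objectID: UUID
    var transform: [Float]        // 4x4 matrix flattened column-major
    var timestamp: TimeInterval
}

// Broadcast the new pose as soon as the local user moves the object.
func broadcast(_ transform: simd_float4x4, for objectID: UUID, over mcSession: MCSession) {
    let flat = [transform.columns.0, transform.columns.1,
                transform.columns.2, transform.columns.3]
        .flatMap { [$0.x, $0.y, $0.z, $0.w] }
    let update = ObjectUpdate(objectID: objectID, transform: flat,
                              timestamp: Date().timeIntervalSince1970)
    guard let data = try? JSONEncoder().encode(update) else { return }
    // .unreliable keeps latency low; a stale update can simply be dropped
    // when a newer timestamp has already been applied.
    try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .unreliable)
}
[/CODE]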

I wonder if it wouldn't be more efficient in such a case to split the work across the devices rather than have each device do all of it. A kind of computational grid would be interesting, since a device at the back has a different view of the real-world scene than one at the front. Both devices could share their depth maps and other scene info to create something really intriguing. For example, if sound arrives at one phone before the other, you get a much better idea of its direction than you could from a single phone; or, reflections and shadows seen from both views can be used to reconstruct the sun's position even when the sun isn't in view.
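As a toy illustration of the sound part: with two phones a known distance apart and synchronized clocks (a big assumption), the arrival-time difference alone gives a bearing to the source:

[CODE=swift]
import Foundation

// Bearing of a sound source from the arrival-time difference between two phones,
// measured off the perpendicular to the line joining them.
func soundBearing(arrivalDelta dt: TimeInterval, phoneSeparation d: Double) -> Double? {
    let c = 343.0                              // speed of sound in air, m/s
    let ratio = c * dt / d
    guard abs(ratio) <= 1 else { return nil }  // physically inconsistent measurement
    return asin(ratio) * 180 / .pi             // degrees
}

// Example: phones 2 m apart, sound arrives 3 ms earlier at one of them
// -> a bearing of roughly 31 degrees toward that phone.
let bearing = soundBearing(arrivalDelta: 0.003, phoneSeparation: 2.0)
[/CODE]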

The idea of all the devices with a view of a scene sharing data about it, and sharing processing, is something that goes beyond just AR. Many people see the "augmented" part as merely visual, but it could also be any type of sensor data originating not only from your device but from every linked device around you. These pervasive sensors (vision, motion, orientation, heat, chemicals, wind, sound, etc.) would create immense flows of real-time info.

Anyway, all this sounds simpler than it actually is, and the consequences are deeper than you would think at first.

That's why I think AR, and the sharing and analysis of this flow of real-world info and acting upon it, will lead to some pretty crazy things, and it won't take that long to happen.

Many people tend to only see the limitations of new tech, or see tech in isolation. It's when you integrate one tech with all the other emerging and established ones that you get the true measure of what can happen.
 
Not just location, orientation too. If someone on one phone moves or modifies the object, the info about what happened has to reach the other phone fast, or the effect of being in the same space is broken. And if you block off the sun and cast a shadow on the virtual object from one side, it won't look the same from 90 degrees away, so you have to know not just about the object itself but about how the rest of the scene interacts with it as seen from every orientation.

I wonder if it wouldn't be more efficient in such a case to split the work across the devices rather than have each device do all of it. A kind of computational grid would be interesting, since a device at the back has a different view of the real-world scene than one at the front. Both devices could share their depth maps and other scene info to create something really intriguing. For example, if sound arrives at one phone before the other, you get a much better idea of its direction than you could from a single phone; or, reflections and shadows seen from both views can be used to reconstruct the sun's position even when the sun isn't in view.

The idea of all the devices with a view of a scene sharing data about it, and sharing processing, is something that goes beyond just AR. Many people see the "augmented" part as merely visual, but it could also be any type of sensor data originating not only from your device but from every linked device around you. These pervasive sensors (vision, motion, orientation, heat, chemicals, wind, sound, etc.) would create immense flows of real-time info.

Anyway, all this sounds simpler than it actually is, and the consequences are deeper than you would think at first.

That's why I think AR, and the sharing and analysis of this flow of real-world info and acting upon it, will lead to some pretty crazy things, and it won't take that long to happen.

Many people tend to only see the limitations of new tech, or see tech in isolation. It's when you integrate one tech with all the other emerging and established ones that you get the true measure of what can happen.

Lol. “Not just location, orientation too.” Did you read the first sentence you quoted?
 
Did you read the rest of what I wrote? You should watch what the hell you say, bud.
I did. It’s just funny that I specifically said you need to track location and orientation, and you respond “not just location, but orientation!” as if you just discovered that.
 
But it doesn't exist. I still haven't seen anyone deploy a useful AR application for consumers. Industry, maybe. I don't doubt that there will be some use, but it's surely not there now.
Fair enough. It's been something I've been wanting to do for ages. I feel I'm a good enough developer to tackle it now, so who knows.
 
My fearless predictions for WWDC '18:

Lotsa new emojis and animojis. Not a hell of a lot else.

Great job, Tim.
 
I did. It’s just funny that I specifically said you need to track location and orientation, and you respond “not just location, but orientation!” as if you just discovered that.

Not reading you anymore. Go talk to yourself.
My fearless predictions for WWDC '18:

Lotsa new emojis and animojis. Not a hell of a lot else.

Great job, Tim.

You want applause for a 2016 meme? Maybe try harder.
 
Great tech demo. Meanwhile, still trying to find practical uses for AR in the real world.

Ok, not just me then. I try really hard to get excited about this and see the possibilities, but just can't.
 
Adding signs for every piece of information one might want to know about a business is highly impractical. With such an app, there is no limit to the amount of information that can be exposed simply by pointing the camera at the business.

That's very rudimentary though. Not really AR as much as matching a photo with a database entry.

The GPS/mapping utilization is somewhat more exciting, but still not a killer app.

Don't see games having the killer app either, other than the silly, trivial Pokémon kind.

AR is really a solution in search of a problem currently.
 
If I asked you about the iPhone in 2006, you would say we didn't need it.

That's an old/flawed argument tactic. "It'll be awesome and everyone's going to use it because people said the same thing about a completely different item". But hey if you're going to use it, so can I:

AR is going to fail cause 3D TVs failed ... or cause the Newton failed ... or cause the lasagna I attempted last night failed!

Also, you need to get your ESP checked, cause that's NOT what I would have said considering I bought an original iPhone on launch day (you know, when it was $600 WITH a 2-year contract).
 
That's an old/flawed argument tactic. "It'll be awesome and everyone's going to use it because people said the same thing about a completely different item". But hey if you're going to use it, so can I:

AR is going to fail cause 3D TVs failed ... or cause the Newton failed ... or cause the lasagna I attempted last night failed!

Also, you need to get your ESP checked, cause that's NOT what I would have said considering I bought an original iPhone on launch day (you know, when it was $600 WITH a 2-year contract).
AR isn't a product...it's a concept with clear value. My point is anyone can see the conceptual benefits but it's difficult to discern exactly what they will be in practice. That's why Apple didn't ask you what you'd like to see in a phone. Consumers don't know until they see it. Then it's obvious.
 
That's very rudimentary though. Not really AR as much as matching a photo with a database entry.

The GPS/mapping utilization is somewhat more exciting, but still not a killer app.

Don't see games having the killer app either, other than the silly, trivial Pokémon kind.

AR is really a solution in search of a problem currently.
Okay, then let me rephrase this according to your reasoning.

Problem: Finding business information requires either entering the shop and locating an appropriate sign, asking someone (assuming they know), or searching around for their website, whether it's posted somewhere in the store or found via a Google/Yelp search.

Solution: Point the camera at the business and find everything you need to know, in the blink of an eye.
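As a rough sketch of how a developer might wire that up with ARKit's image detection (the "Storefronts" reference image group and the little lookup table are made up here purely to illustrate):

[CODE=swift]
import ARKit
import SceneKit

// Detect known storefront images in the camera feed and float the matched
// business details above them. The image group and database are placeholders.
class BusinessInfoDelegate: NSObject, ARSCNViewDelegate {
    let businessInfo = [                       // hypothetical local database
        "coffee-shop-logo": "Open 7-19 · Free Wi-Fi · Cards accepted"
    ]

    func start(in sceneView: ARSCNView) {
        sceneView.delegate = self
        let config = ARWorldTrackingConfiguration()
        config.detectionImages = ARReferenceImage.referenceImages(
            inGroupNamed: "Storefronts", bundle: nil) ?? []
        sceneView.session.run(config)
    }

    // Called when ARKit recognizes one of the reference images.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor,
              let name = imageAnchor.referenceImage.name,
              let info = businessInfo[name] else { return }
        let text = SCNText(string: info, extrusionDepth: 0.2)
        let textNode = SCNNode(geometry: text)
        textNode.scale = SCNVector3(0.002, 0.002, 0.002)   // SCNText units are large
        textNode.position = SCNVector3(0, 0.1, 0)          // hover above the sign
        node.addChildNode(textNode)
    }
}
[/CODE]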
 
After all of this “AR is the future” hype, it has turned out to be pretty useless. I guess kids are having some fun but for the most part it’s pretty lame. I guess some of those apps that let you “furnish” your home are cool.
 
After all of this “AR is the future” hype, it has turned out to be pretty useless. I guess kids are having some fun but for the most part it’s pretty lame. I guess some of those apps that let you “furnish” your home are cool.

It's not Apple's fault; it's that developers are not fully utilizing the ideas behind the software.

For example, imagine a diet program that could show you the number of calories in a certain food when you focus the camera on it. Or walking around town looking at historical landmarks, and being able to see tons of information about them in AR. Or perhaps showing you a picture of what the site used to look like hundreds of years ago, for comparison. What about a live retail application that would allow you to compare prices across multiple vendors by just showing it the product? Or one that, when you show it a peach at a supermarket, tells you what to look for to ensure it's properly ripened? Or a plant or wildlife recognition program? Or a drug identification program in case you find pills in your children's pockets at home?

There are a lot more ideas than that, of course. But I agree that right now we're mostly just seeing play toys.
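As one rough sketch, the plant/food recognition idea could feed ARKit camera frames through a Core ML classifier via Vision; the classifier model itself is hypothetical, nothing that ships with iOS:

[CODE=swift]
import ARKit
import Vision

// Classify the current camera frame with a (hypothetical) Core ML model and
// report the top label, which an AR overlay could then display near the object.
func classify(frame: ARFrame, using model: VNCoreMLModel,
              completion: @escaping (String, Float) -> Void) {
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let best = (request.results as? [VNClassificationObservation])?.first
        else { return }
        completion(best.identifier, best.confidence)   // e.g. ("ripe peach", 0.91)
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
    try? handler.perform([request])
}
[/CODE]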
 
If only someone could invent something even better than this. Something tangible, in the real world, without requiring special hardware and software. Maybe we could take a board or paper and write those things on it? Then somehow attach it to the building with nails or adhesive. We could call it a "sign" and anyone could use it! Will probably never work though because people's phones are always blocking their view of the world.
I hope that was written as sarcasm. I live in Indiana but I am currently in Lima, Peru on business and last year I was in Valencia, Spain. I would LOVE to be able to walk down the street with my iPhone (or preferably AR glasses) and read street signs and business signs in English. I would also like to walk into stores and see prices on products in US dollars. I can think of a LOT of practical uses for AR.
 
AR isn't a product...it's a concept with clear value. My point is anyone can see the conceptual benefits but it's difficult to discern exactly what they will be in practice. That's why Apple didn't ask you what you'd like to see in a phone. Consumers don't know until they see it. Then it's obvious.

I'd call it a feature, and one that's not worth investing in at the moment. Just as multi-touch is the feature that made today's smartphones what they are, AR is a feature that's ideal for wearable glasses. Holding up a phone just won't work outside of specialized industries and, of course, keynote demos.
 
I hope that was written as sarcasm. I live in Indiana but I am currently in Lima, Peru on business and last year I was in Valencia, Spain. I would LOVE to be able to walk down the street with my iPhone (or preferably AR glasses) and read street signs and business signs in English. I would also like to walk into stores and see prices on products in US dollars. I can think of a LOT of practical uses for AR.
Yes.
 
Or walking around town looking at historical landmarks, and being able to see tons of information about them in AR.

Now I'm picturing hordes of tour groups with their phones in the air, not aware of the traffic coming at them or the phone thieves carefully selecting their easiest target.
 
Now I'm picturing hordes of tour groups with their phones in the air, not aware of the traffic coming at them or the phone thieves carefully selecting their easiest target.

It's already been that way for years. They are waving their iPhones and iPads to take pictures all the time. Or using Facebook. Or texting. Etc. I live near Washington DC, so I'm pretty used to seeing it constantly.
I would LOVE to be able to walk down the street with my iPhone (or preferably AR glasses) and read street signs and business signs in English.

You can already do that right now. Just tap the camera button in Google Translate and you get real-time AR translation.
 
Ok, not just me then. I try really hard to get excited about this and see the possibilities, but just can't.

With an iPhone I see limited possibilities. I have an app (and some more in the works), similar to IKEA's, for selling furniture; it's something fun, but not mind-blowing and not a game changer.
I'm not a fan of the idea of AR glasses, but they would be much more useful for AR than holding up a phone or a tablet and moving it around.
 