Like I said, my old device does not support AR, so I can’t test. The demo you show with the coffee is just sitting there at the table. I want to know: if that user goes outside of the house and points the phone back at where the coffee table is, will they see the coffee suspended in front of the home where the table was? If the user gets behind a wall that partially covers up the coffee cup, would it cover the cup, or would the cup still show in its entirety? Even more simply, could the user move the phone (and their head) under the table, point up at it, and not see the coffee cup because the phone knows the cup is on top of the table?

Videos like the one you link show things right in the room, where the phone isn’t really moving around much within the confines of the space. I want to know if you can leave the confines of that room, and whether the world tracking and understanding of horizontal surfaces knows the wall is covering up the AR object when you walk outside of the room.

Pokémon Go is a great example of bad AR (though it doesn’t have any real AR capabilities built in). Wherever you go, you point your phone in the right direction and that Pikachu is there. To me, that is just not accurate and not the way AR should really work. The Pikachu you see on your screen should appear smaller as you move farther away from it, and if there is a trash can that you walk behind and then point your phone at that Pikachu, it should be either partially covered up or not visible at all.

Those are the kinds of tests I want to see. I can’t do them in the Apple Store because the phone is tied to the table. Too many AR videos show the phone right in the same general vicinity where they spawn the objects.

Show me better videos or somebody with AR supported phones please do this test!
I didn’t realize you didn’t have the hardware. That said, is the video the person above posted about the dragon not sufficient?

I feel like people are trying to be helpful and you’re just telling them they’re not at this point.

Regardless, best of luck in finding the answer you seek. Despite our back and forth I still can’t quite seem to figure out how the videos you were provided don’t clear up your questions on the subject.
 
I’ve done some quick experimentation with the Ikea app. If you place a piece of furniture in a room, then move to a point where the item should be blocked by a wall or other objects, you will still be able to see the furniture ‘through’ the obstacle. So the system is not smart enough to detect when the virtual object should be obscured. It is good at tracking location and relative size, however.

This might be a limitation in the Ikea app or in the underlying ARKit, or it might actually be an app design decision (e.g. you can’t see your furniture through the wall anyway, so why would people be interested in experiencing that; don’t bother implementing this feature).
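For what it’s worth, ARKit itself doesn’t occlude virtual objects behind real-world geometry; apps that want this have to fake it. A common SceneKit trick is to cover known real geometry (a detected plane, a wall you’ve measured) with an "occluder" that writes to the depth buffer but draws no color. This is only a sketch of the general technique, assuming an ARKit/SceneKit app; the function name and sizes are made up for illustration:

```swift
import ARKit
import SceneKit

// Build an invisible "occluder" plane. Anything rendered behind it is hidden,
// but the plane itself draws nothing, so the camera feed shows through.
func makeOccluder(width: CGFloat, height: CGFloat) -> SCNNode {
    let plane = SCNPlane(width: width, height: height)
    let material = SCNMaterial()
    material.colorBufferWriteMask = []   // write no color: the plane is invisible
    material.writesToDepthBuffer = true  // but it still blocks whatever is behind it
    plane.materials = [material]
    let node = SCNNode(geometry: plane)
    node.renderingOrder = -1             // render before the visible virtual content
    return node
}
```

You would position such a node where a real wall or tabletop is (e.g. on an `ARPlaneAnchor`), and virtual objects behind it would then be hidden correctly. The catch is that the app has to know where the real geometry is in the first place, which is exactly what ARKit 1.x can’t do beyond horizontal planes.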
 
OP:

Based on the amount of threads you've started and questions you've asked and how much attention you pay to unbelievably minor details, it's clear you're going to buy one of these phones regardless - so since you're going to buy one anyway, does it really matter how the AR works? Just test it yourself when you (finally) pull the trigger and buy one.
 
I just made you a quick video with that dragon app, leaving the room.

Persistence isn’t bad. This is on a 6S+.



Edit: haha wrong video, give me a minute!

Thank you so much. This is the only video in the thread that attempts to answer my question with a live video.
I’ve done some quick experimentation with the Ikea app. If you place a piece of furniture in a room, then move to a point where the item should be blocked by a wall or other objects, you will still be able to see the furniture ‘through’ the obstacle. So the system is not smart enough to detect when the virtual object should be obscured. It is good at tracking location and relative size, however.

This might be a limitation in the Ikea app or in the underlying ARKit, or it might actually be an app design decision (e.g. you can’t see your furniture through the wall anyway, so why would people be interested in experiencing that; don’t bother implementing this feature).

I remember hearing about how Apple has world tracking and how they are able to detect horizontal surfaces... seems like, in reality, it really can’t handle occlusion after all. Your post/experience is exactly what I am looking for.
 