Should be 3.1 (Sorry, double post)
I searched the app in the App Store and a new version was there. I updated the app and I now have the new AR icon.
I used it and it is quite cool. Why doesn't Apple put that scan feature in the camera app?
Thanks for the help.

Apple seems to want to focus on quick, intuitive operations with their built-in Camera app. So it’s doing a scan in order to properly render bokeh or separate foreground from background, for example, but it’s doing so quickly behind the scenes so you just see the result. Other camera apps do it differently.

Sure, but the AR scan is actually pretty good at showing the 3D space. It would work in the dark.

You have to move the phone around to model the whole space instead of what’s right in front of the lens, which would take away from the simple, straightforward implementation they appear to be going for.

I could scan the whole room and have a model of the whole space.
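For context, the whole-room scanning those other apps expose is available to third-party developers through ARKit's scene reconstruction. A minimal Swift sketch, assuming a LiDAR-equipped device and an `ARSession` your app already owns (the `session` name here is hypothetical):

```swift
import ARKit

// Minimal sketch: ask ARKit to build a 3D mesh of the surroundings.
// Scene reconstruction requires a LiDAR-equipped iPhone or iPad.
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // As you move the phone around, ARKit delivers ARMeshAnchor objects
    // that together model the whole space, not just what's in front of the lens.
    config.sceneReconstruction = .mesh
}
session.run(config)  // `session` is an ARSession owned by your app (hypothetical)
```

The stock Camera app uses the same depth hardware only behind the scenes (for bokeh and subject separation), while scanner apps surface the reconstructed mesh directly.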
Clips has so much potential, I wish Apple invested more into it. I think a good next step would be to add a Clips button to the iMessage apps bar, so there's direct integration with iOS's most-used app and a halo effect on Clips.

One thing they’ve already done is take your Memoji and insert them into Clips. So you can create an animation longer than 30 seconds.

I thought this looked pretty neat in the M1 iPad Pro announcement. Now that I’ve had a chance to play around with it, it actually works better than I thought. I think Prism, Disco, and Dance Floor look the coolest.

I wish Dance Floor had some sort of beat detection in it.