> Just move one of the cameras to the other end of the iPhone for true depth.

that’ll be a fun phone to carry around/hold without constantly touching the other camera 😂
Yep, I’m thinking LiDAR is playing a huge role in all of this. I was excited for the added hardware when it was first announced, but it seems as if it’s been under the radar ever since. (Pun intended. 😉)
> They've gotta be using LiDAR for spatial video, right?

Sounds cool, but I’m not sure how that would work or how good it would look. You need two separated cameras to capture two unique perspectives. The LiDAR is very low-res, so what could it add?
Two barely separated cameras don't give you a lot to work with, but when your phone can "see" the 3D positioning of everything in front of you, that seems like all the data you need for 3D video.
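For what it's worth, here's a minimal sketch of how a dense depth map could stand in for a second, widely separated camera: shift each pixel by a disparity derived from its depth to synthesize the other eye's view. Everything here (the virtual baseline, focal length, forward-warping approach) is an illustrative assumption, not Apple's actual pipeline.

```swift
// Illustrative only: synthesize a second-eye view from one image plus a dense
// depth map (e.g., LiDAR upsampled to pixel resolution). Baseline and focal
// length are made-up numbers; a real pipeline would also fill disocclusions.
struct VirtualRig {
    let focalLengthPx: Float  // focal length expressed in pixels (assumed)
    let baselineM: Float      // virtual eye separation in meters (assumed)
}

/// pixels[y][x] is a grayscale sample; depth[y][x] is distance in meters.
func synthesizeRightEye(pixels: [[Float]], depth: [[Float]], rig: VirtualRig) -> [[Float]] {
    let h = pixels.count, w = pixels[0].count
    var out = Array(repeating: [Float](repeating: 0, count: w), count: h)
    for y in 0..<h {
        for x in 0..<w {
            // Closer objects shift more between the eyes: disparity = b * f / z.
            let z = max(depth[y][x], 0.1)
            let disparity = rig.baselineM * rig.focalLengthPx / z
            let xr = x - Int(disparity.rounded())
            if xr >= 0 && xr < w {
                out[y][xr] = pixels[y][x]  // forward warp; unfilled pixels are holes
            }
        }
    }
    return out
}
```

The point of the sketch is just that depth plus one good image can fake a wider baseline than the hardware physically has, at the cost of holes where the synthesized eye can see around objects.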
Millions of iPhone 15 Pros will support recording spatial video even before the Vision Pro goes on sale. I have only one question: why aren't more people congratulating Tim Cook on this amazing product strategy?
> I would honestly be more impressed to finally be able to shoot video with both cameras (front and back) at the same time, like Samsung. Or is there at least a 3rd-party app that can do this?

Yes, there is definitely an app that does this. I have used it.
> Maybe because there is not enough detail on “How”?

Apple has a real chance to be very influential on the trajectory of AR/VR aka spatial computing going forward…
One aspect that has me wondering: what is the frame rate? 4K60 already maxes out the pipeline, so what happens when you do it in spatial?
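Just to put rough numbers on that worry, here's some back-of-the-envelope arithmetic, assuming (purely for illustration) that each eye were captured at 1080p30:

```swift
// Back-of-the-envelope pixel throughput; the stereo resolution/frame rate
// below is an assumption for illustration, not a confirmed spec.
func pixelsPerSecond(width: Int, height: Int, fps: Int) -> Int {
    width * height * fps
}

let mono4K60 = pixelsPerSecond(width: 3840, height: 2160, fps: 60)           // ~498M px/s
let stereo1080p30 = 2 * pixelsPerSecond(width: 1920, height: 1080, fps: 30)  // ~124M px/s
print(Double(stereo1080p30) / Double(mono4K60))  // ~0.25: well under the 4K60 ceiling
```

So two modest streams wouldn't necessarily blow past what the pipeline already handles for 4K60; the open question is the extra cost of encoding and aligning two streams at once.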
This will be interesting.
> They will also be trained to use hand gestures with the new Apple Watch feature. That was my first thought when they presented it.

🤔 Hm, there is also potential for combining Watch and camera data to better track this gesture. The Watch could become an input device for the Apple Vision Pro. When you double tap while the line of sight to your hand is blocked, the Watch could still register a click.
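Apple's actual gesture recognition is a trained model over sensor data, but as a toy illustration of the idea, you could imagine the Watch flagging a "double tap" as two short accelerometer spikes close together; every rate and threshold below is a made-up assumption.

```swift
// Toy sketch only: treat a "double tap" as two accelerometer-magnitude spikes
// within a short window. Sample rate, thresholds, and debounce are invented.
func looksLikeDoubleTap(magnitudes: [Float],
                        sampleRateHz: Float = 50,
                        spikeThreshold: Float = 1.8,
                        maxGapSeconds: Float = 0.4) -> Bool {
    var spikes: [Int] = []
    for (i, m) in magnitudes.enumerated() where m > spikeThreshold {
        // Debounce: ignore samples that belong to the same spike.
        if spikes.last.map({ i - $0 > 3 }) ?? true {
            spikes.append(i)
        }
    }
    guard spikes.count >= 2 else { return false }
    return Float(spikes[1] - spikes[0]) / sampleRateHz < maxGapSeconds
}
```

A click like this could then be fused with whatever the headset's cameras can still see of your arm, which is the "combining watch and camera data" part.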
I think this is more than just stereoscopic photography. In the demo, it looked like you could move around and see people from different angles. It would definitely be better if the cameras were further apart, but I'm guessing they can use the two cameras to get textures from both sides of an object and use the depth sensor to make 3D models of objects in the shot.
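If they really do build rough geometry from the depth sensor, the first step is presumably something like unprojecting the depth map into a point cloud. A minimal sketch of that, with made-up intrinsics where a real pipeline would use calibrated ones:

```swift
// Sketch: unproject a depth map into camera-space 3D points with a pinhole
// model. Intrinsics are illustrative stand-ins for calibrated values.
struct Intrinsics {
    let fx: Float, fy: Float  // focal lengths in pixels
    let cx: Float, cy: Float  // principal point
}

func pointCloud(depth: [[Float]], k: Intrinsics) -> [(x: Float, y: Float, z: Float)] {
    var points: [(x: Float, y: Float, z: Float)] = []
    for (v, row) in depth.enumerated() {
        for (u, z) in row.enumerated() where z > 0 {
            // Pinhole unprojection: pixel (u, v) at depth z -> camera space.
            points.append((x: (Float(u) - k.cx) * z / k.fx,
                           y: (Float(v) - k.cy) * z / k.fy,
                           z: z))
        }
    }
    return points
}
```

Textures from the two camera views could then be projected onto that geometry, which would explain the limited ability to move around the subject in the demo.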
"You are holding it wrong"that’ll be a fun phone to carry around/hold without constantly touching the other camera 😂
> Also, why is it not possible to smoothly transition between lenses when filming 4K60?

Yup, another major problem with the iPhones. (Dunno if the 15P has fixed this.) Basically, BEFORE shooting, you have to decide whether zooming via camera switching in hardware (not just digital cropping) is more important to you than fluidity (60p), and select between 30p and 60p accordingly. Sometimes a REALLY hard decision, as 30p really makes panning a chore.
Apple has a real chance to be very influential on the trajectory of AR/VR aka spatial computing going forward…
Facebook was never a contender, but maybe they can become the "Microsoft" of spatial computing. However, considering they are relying on Android for their OS… they don't really have a chance. Google/Alphabet will eat their lunch and dip into the rich data from AR/VR for their business model.
Considering Apple… they are a 100% vertically integrated company with a multi-year lead in the mobile computing space with silicon. Apple will have a good stretch of time where they can leverage their ARM advantage to put out crazy AR products until at least 2027, when TSMC will feel the consequences of not getting the angstrom-era EUV lithography tech. BUT of course, TSMC is building fabs in the US to avoid that scenario…
Apple Vision is going to be a crazy technology around 2026, when nanolenses should be hitting the market… After that, all bets are off, I guess, but Apple is positioning itself very well to succeed.
> I would honestly be more impressed to finally be able to shoot video with both cameras (front and back) at the same time, like Samsung. Or is there at least a 3rd-party app that can do this?

You already can, since the iPhone 11 Pro, with apps like DoubleTake.
> Since these special videos and photos are obtained thanks to the wide camera and the regular camera, I wonder: why is this feature limited to iPhone Pros?

Because that’s what gets people to pay more for the Pros.
> Sounds cool, but I’m not sure how that would work or how good it would look. You need two separated cameras to capture two unique perspectives. The LiDAR is very low-res, so what could it add?

I was thinking about that as well - but maybe they compute the 3D differently. After all, it's also different fields of view. The lens that's more tele "compresses" the image more - maybe there's a way to compute a stereoscopic image from that data?
> Could be easily done via SW, losing some resolution.

No.
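The resolution cost of that software route is easy to quantify in the pinhole model: matching a longer equivalent focal length from a wider lens means cropping by the ratio of focal lengths. A quick sketch with approximate iPhone marketing equivalents (13 mm ultra-wide, 24 mm main; both numbers assumed for illustration):

```swift
// In the pinhole model, linear field of view scales as 1/focal length, so the
// crop that matches a longer lens keeps sourceFocal/targetFocal of each dimension.
func croppedWidth(sourceWidthPx: Int, sourceFocalMM: Double, targetFocalMM: Double) -> Int {
    Int(Double(sourceWidthPx) * sourceFocalMM / targetFocalMM)
}

let ultraWide4K = 3840
print(croppedWidth(sourceWidthPx: ultraWide4K, sourceFocalMM: 13, targetFocalMM: 24))
// ≈ 2080 px: matching the main lens costs roughly half the linear resolution
```

So "losing some resolution" is real but quantifiable: the cropped ultra-wide frame keeps only about a quarter of its pixels when matched to the main lens's field of view.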