Tim’s focus on Vision Pro is making for boring updates for those of us who don’t plan to purchase a Vision Pro headset.
I think this can be huge for the porn industry.
Recording 3D/Spatial Videos, then watching them as if you're really there in the scene.
Similar to VR porn, but I think it'd be a more seamless experience with the iPhone/Vision Pro/spatial video combination.
I don’t think a unique file format is needed when 3D-HEVC already exists.
Edit: This is most likely what they are using: https://developer.apple.com/av-foundation/HEVC-Stereo-Video-Profile.pdf
This is pretty cool, IMO! I remember some Android manufacturers tried to make 3D a “thing” in mobile back in 2011/2012 with the HTC Evo 3D and LG Optimus 3D; both failed horribly. In fact, 3D in general has never really taken off, even with the TVs that tried it.
During Apple's "Wonderlust" event earlier this week, Apple claimed that the iPhone 15 Pro camera system "pushes the limits of what you can capture with a smartphone." It was a reference to the device's support for "spatial computing video," but the mention was brief and lacking detail.
For those who missed it, spatial video is Apple's name for what is essentially 3D video. The iPhone 15 Pro is able to record spatial videos natively by combining video captured by the sensors of the Main and Ultra Wide cameras, which results in something akin to stereoscopic video.
This 3D video can then be played back using Apple's "spatial computing" Vision Pro headset when it's released early next year, allowing users to relive these memories in a fully immersive way. Apple didn't say, but presumably the iPhone 15 Pro can also capture spatial photo stills using similar camera processing.
Other questions remain that Apple's senior VP of worldwide marketing Greg "Joz" Joswiak didn't answer during his whirlwind tour of the iPhone 15 Pro's new camera capabilities, which include the ability to record 4K ProRes video at 60 frames per second, as well as 5x optical zoom on the iPhone 15 Pro Max.
It's not clear, for example, what kind of unique file format these spatial videos will use, nor did Apple mention how large they will be. Presumably, iPhone 15 Pro users will need to select a special new mode in the Camera app to capture spatial video. How these videos will be saved in the Photos app, and whether they will be playable on the iPhone in any way, is also unknown.
Apple will undoubtedly reveal the answers to these questions soon enough. The ability to shoot spatial video using an iPhone 15 Pro isn't available yet, but it's "coming later this year," according to the company.
Article Link: iPhone 15 Pro Cameras to Support Spatial Video Later This Year, But Key Questions Remain
"Don't the lenses have to be separated, like the eye distance on the Vision Pro? How do they achieve the same with the lenses next to each other?"
Kids have a smaller eye distance than adults, and yet they see in 3D. 👀
"Tim's focus on Vision Pro is making for boring updates for those of us who don't plan to purchase a Vision Pro headset."
I have zero interest in a Vision Pro, but I would absolutely use the 15 Pro to shoot spatial video experiences to sell to people who own one.
I think this is more than just stereoscopic photography. In the demo, it looked like you could move around and see people from different angles. It would definitely work better if the cameras were separated further, but I'm guessing they can use the two cameras to get textures from both sides of an object and use the depth sensor to build 3D models of objects in the shot.
This could easily be done in software, at the cost of some resolution.
That is my thinking. The cameras have some separation, but you likely need more to get proper 3D. There may be some math tricks to increase the perceived interocular distance, which would have been computationally expensive years ago but not so much with today's chips. Perhaps there will be some way to use two phones, synced and in a special holder, to get closer to human eye spacing. I've seen phone holders that do that.
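One toy version of the "math trick" mentioned above: for a rectified stereo pair, the horizontal pixel disparity of a point at depth Z is d = f·B/Z (f is focal length in pixels, B the camera baseline). Scaling every captured disparity by the ratio of a virtual baseline to the real one mimics a wider eye spacing. The numbers below (focal length, ~14 mm camera spacing) are illustrative assumptions, not Apple's actual values, and real view synthesis has to re-render pixels, not just scale numbers:

```python
F_PX = 1500.0        # assumed focal length in pixels (illustrative)
B_CAM = 0.014        # ~14 mm: rough spacing of two phone lenses (assumption)
B_EYE = 0.063        # ~63 mm: average human interpupillary distance

def disparity(depth_m: float, baseline_m: float) -> float:
    """Horizontal pixel shift between two views for a point at depth_m."""
    return F_PX * baseline_m / depth_m

# Disparity the phone actually records for a subject 2 m away:
d_captured = disparity(2.0, B_CAM)

# Scale it up to simulate human eye spacing (the "wider baseline" trick):
d_widened = d_captured * (B_EYE / B_CAM)

print(d_captured, d_widened)  # widened value matches a true 63 mm baseline
```

Note that scaled disparities reveal parts of the scene neither lens saw, which is where the heavy per-frame computation (hole filling, inpainting) would come in.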
"They've gotta be using LiDAR for spatial video, right?"
Yep, I'm thinking LiDAR is playing a huge role in all of this. I was excited for the added hardware when it was first announced, but it seems as if it's been under the radar ever since. (Pun intended. 😉)
Two barely separated cameras don't give you a lot to work with, but when your phone can "see" the 3D position of everything in front of you, that seems like all the data you need for 3D video.
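The idea above is essentially depth-image-based rendering: given one camera image plus a per-pixel depth map (e.g. from a LiDAR-style sensor), you can reproject pixels to synthesize a second eye's view. This is an assumption about how such a pipeline *could* work; Apple hasn't documented its actual approach, and the focal length and baseline below are made-up illustrative values:

```python
F_PX = 1500.0        # assumed focal length in pixels (illustrative)
BASELINE = 0.063     # virtual eye separation in metres (assumption)

def render_right_eye(left, depth):
    """Splat each left-eye pixel into a right-eye image by its disparity.

    left and depth are row-major 2D lists of equal size. A real pipeline
    would also fill the disocclusion holes this naive splatting leaves.
    """
    h, w = len(left), len(left[0])
    right = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            shift = round(F_PX * BASELINE / depth[y][x])  # d = f*B/Z
            nx = x - shift                # nearer objects shift further
            if 0 <= nx < w:
                right[y][nx] = left[y][x]
    return right

# Tiny demo: a 4x4 "image" of a flat scene 50 m away shifts uniformly.
left = [[float(4 * y + x) for x in range(4)] for y in range(4)]
depth = [[50.0] * 4 for _ in range(4)]
print(render_right_eye(left, depth))
```

The black (zero) columns left behind after the shift are the disoccluded regions the first camera never saw, which is why shipping products pair the depth map with a second camera or inpainting.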
"Millions of iPhone 15 Pros will support the recording of spatial video even before the Vision Pro goes on sale. I have only one question: why aren't more people congratulating Tim Cook on this amazing product strategy?"
They will also be trained to use hand gestures with the new Apple Watch feature. That was my first thought when they presented it.
I would honestly be more impressed if we could finally shoot video with both cameras (front and back) at the same time, like Samsung phones can. Or is there at least a third-party app that can do this?