I can't rule out that Apple may come out later this year or next saying: "Yeah, you know, in order to capture spatial videos your phone must have at least 512 GB of storage."
 
Will the 16 Ultra get a 4th lens on the other side to capture "true spatial videos" so it better matches the distance between human eyes? It could also fill the gap up to the 5x zoom of the 16 Pro Max.
 
Don’t the lenses have to be separated like the eye distance on the Vision Pro? How do they achieve the same thing with the lenses right next to each other?
That is my thinking. They have some separation, but you likely need more to get proper 3D. There may be some math tricks to increase the perceived interocular distance, which would have been computationally expensive years ago but not so much with today’s chips. Perhaps there will be some way to use two phones synced in a special holder to get closer to human eye spacing. I’ve seen phone holders that do that.
 
The centers of the lenses are about 19.5mm apart. The average human IPD is about 63mm.
So the layout definitely isn’t optimal for 3D capture.
Canon has a special dual fisheye lens for their interchangeable lens camera system. It has a 60mm offset (probably best to go a bit lower than average; IPD is measured at a distance of infinity, and the effective spacing is lower when you are looking at a closer object).
It will be a compromise. No software magic will make it look as good as an offset that matches human IPD, sort of like how portrait mode on the iPhone is a poor substitute for a big lens with a wide aperture.
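To put rough numbers on why the spacing matters: under a simple pinhole-camera model, horizontal disparity scales linearly with the baseline between the two lenses. A minimal sketch using the figures quoted above (19.5mm lens spacing vs. ~63mm average human IPD); the focal length in pixels is a made-up illustrative value, and the exact number only scales both results equally:

```python
# Back-of-the-envelope stereo parallax comparison using the numbers
# quoted in the thread (19.5 mm lens spacing vs. ~63 mm average IPD).
# Simple pinhole model: disparity = focal_length_px * baseline / depth.

def disparity_px(baseline_m: float, depth_m: float, focal_px: float = 1500.0) -> float:
    """Horizontal disparity in pixels for a point at depth_m.

    focal_px is a hypothetical smartphone-like focal length in pixels;
    it cancels out when comparing the two baselines.
    """
    return focal_px * baseline_m / depth_m

iphone = disparity_px(0.0195, depth_m=2.0)  # lens spacing from the thread
human = disparity_px(0.063, depth_m=2.0)    # average human IPD

print(f"iPhone baseline: {iphone:.2f} px, human IPD: {human:.2f} px")
print(f"ratio: {iphone / human:.2f}")  # ~0.31, i.e. roughly 3x weaker parallax
```

At any given depth, the 19.5mm baseline produces only about a third of the disparity the eyes would see, which is the gap software would have to synthesize.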

I was thinking that if Apple arranged 4 cameras—ultra-wide, wide, telephoto, 2nd ultra-wide—vertically with the current spacing, the 2 ultra-wides would be almost perfectly spaced. The phone would look goofy without a case, though, and it would be a waste of money for most iPhone Pro purchasers because a large majority of them wouldn’t have the Vision Pro. Maybe a 4th lens could work for an iPhone Ultra, and maybe even having a 3D screen on the phone. I had fun taking stereoscopic photos on my Nintendo 3DS, even though the camera quality is awful.
 
Apple on Tuesday: “We have a phone that plays PS5 games!”

Also Apple on Tuesday: “we finally added a function from the 3DS you can’t use without spending an extra four grand.”
 
Tim’s focus on Vision Pro is making for boring updates for those of us who don’t plan to purchase a Vision Pro headset.

All of Apple's other platforms are very mature - the youngest of them, Apple Watch, is pushing 10 years already.

Mature industries, for better or worse, tend to be boring because a dominant form factor is in place and all the low-hanging fruit was plucked long ago.
 
I think this can be huge for the porn industry.

Recording 3D/Spatial Videos, then watching them as if you're really there in the scene.

Similar to VR porn, but I think it'd be a more seamless experience with iPhone/Vision Pro/spatial video.

You will need another add-on device that is placed close to the waist to really make it big
 


During Apple's "Wonderlust" event earlier this week, Apple claimed that the ‌iPhone 15 Pro‌ camera system "pushes the limits of what you can capture with a smartphone." It was a reference to the device's support for "spatial computing video," but the mention was brief and lacking detail.

spatial-video-iphone-15-pro.jpg

For those who missed it, spatial video is Apple's name for what is essentially 3D video. The iPhone 15 Pro is able to record spatial videos natively by combining video captured by the sensors of the Main and Ultra Wide cameras, which results in something akin to stereoscopic video.

This 3D video can then be played back using Apple's "spatial computing" Vision Pro headset when it's released early next year, allowing users to relive these memories in a fully immersive way. Apple didn't say, but presumably the iPhone 15 Pro can also capture spatial photo stills using similar camera processing.

Other questions remain that Apple's senior VP of worldwide marketing Greg "Joz" Joswiak didn't answer during his whirlwind tour of the iPhone 15 Pro's new camera capabilities, which include the ability to record 4K ProRes video at 60 frames per second, as well as 5x optical zoom on the iPhone 15 Pro Max.

spatial-video-apple-visioin-pro.jpg

It's not clear, for example, what kind of unique file format these spatial videos will use, nor did Apple mention how large they will be. Presumably, iPhone 15 Pro users will need to select a special new mode in the Camera app to capture spatial video. How these videos will be saved in the Photos app, and whether they will be playable on the iPhone in any way, is also unknown.

Apple will undoubtedly reveal the answers to these questions soon enough. The ability to shoot spatial video using an iPhone 15 Pro isn't available yet, but it's "coming later this year," according to the company.

Article Link: iPhone 15 Pro Cameras to Support Spatial Video Later This Year, But Key Questions Remain
This is pretty cool, IMO! I remember some Android manufacturers tried to make 3D a “thing” in mobile back in 2011/2012 (the HTC Evo 3D and LG Optimus 3D), but both failed horribly. In fact, 3D in general has never really taken off, even with the TVs that tried it.

I hope Apple has cracked it here; it sounds like they have found a pretty solid way to carry this with Vision Pro. But I guess this all comes down to Vision Pro being a success, and that requires an ecosystem, and where better to begin than the iPhone?
 
I think this is more than just stereoscopic photography. In the demo, it looked like you could move around and see people from different angles. It would definitely be better if the cameras were separated more, though, but I'm guessing they can use the two cameras to get textures from both sides of an object and use the depth sensor to make 3D models of objects in the shot.

Could easily be done via software, at the cost of some resolution.


They've gotta be using LiDAR for spatial video, right?

Two barely separated cameras don't give you a lot to work with, but when your phone can "see" the 3D position of everything in front of you, that seems like all the data you need for 3D video.
Yep, I’m thinking LiDAR is playing a huge role in all of this. I was excited for the added hardware when it was first announced, but it seems as if it’s been under the radar ever since. (Pun intended. 😉)
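None of us knows Apple's actual pipeline, but the basic idea of turning a depth map (the kind of data LiDAR provides) into 3D points is textbook pinhole back-projection. A minimal sketch; the camera intrinsics below (fx, fy, cx, cy) are made-up illustrative values, not real iPhone parameters:

```python
# Minimal sketch of pinhole back-projection: mapping a pixel with a
# measured depth to a 3D point in camera space. Standard textbook
# geometry, not Apple's pipeline; intrinsics are hypothetical.

def backproject(u: int, v: int, depth_m: float,
                fx: float = 500.0, fy: float = 500.0,
                cx: float = 320.0, cy: float = 240.0):
    """Return the camera-space (x, y, z) point seen at pixel (u, v)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the principal point maps straight down the optical axis:
print(backproject(320, 240, 1.5))  # -> (0.0, 0.0, 1.5)
```

Doing this for every pixel in a depth frame gives a point cloud, which is what would let software re-render the scene from a slightly different viewpoint, e.g. to simulate a wider stereo baseline.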
 
Millions of iPhone 15 Pros will support the recording of spatial video even before the Vision Pro goes on sale. I have only one question: Why aren't more people congratulating Tim Cook on this amazing product strategy?
Those users will also be trained to use hand gestures by the new Apple Watch feature. That was my first thought when they presented it.
 
I just want to be able to set my phone on a desk, table, or any flat surface and not have it bobble and wobble as I use it....
 