I have a horrible feeling it will be locked down to some proprietary format, but I'd love to be able to record and then view it on a Reverb G2.
 
Spatial Video is an exciting feature. Once it was introduced, it made me instantly change my decision from skipping this generation to "must buy." I'm also getting the Vision Pro, which was already the plan.

But I still have many questions. As a video editor, I wonder how it will work with Final Cut Pro. And will Spatial Video be an industry standard that other platforms will also support, such as YouTube?
 
Millions of iPhone 15 Pros will support the recording of spatial video even before the Vision Pro goes on sale. I have only one question: Why aren't more people congratulating Tim Cook on this amazing product strategy?
Because they prefer to rag on Cook, his leadership, and Apple in general? Although it won't always work out, Cook and his senior leadership team seem to know what they are doing most of the time.

Most technophiles, though not all, are not strategic-planning whizzes. I see a lot of it here and on other reader-participation, tech-oriented forums: technical brilliance and creativity, but not so much when it comes to understanding business management, finance, and strategic planning.
 
Spatial Video is an exciting feature. Once it was introduced, it made me instantly change my decision from skipping this generation to "must buy." I'm also getting the Vision Pro, which was already the plan.

But I still have many questions. As a video editor, I wonder how it will work with Final Cut Pro. And will Spatial Video be an industry standard that other platforms will also support, such as YouTube?
I had already decided to buy, but this is a great bonus. My worry is that I will want to use it often between now and when the AVP is released, so that I have a library of personal spatial videos to view with my new headset.

But that means I will be choosing to shoot spatial videos at times when I would otherwise shoot regular videos. I hope that I will be able to extract regular 2D video from the file so I can share them on Facebook.

Also, will I be able to share the spatial videos on Facebook, so that people using Facebook on the AVP or a Meta Quest 3 headset will be able to view them as intended?
 
None of us can truly judge this, other than the select few who've experienced it. Those who've been lucky enough have, as far as I'm aware, unanimously voiced approval of the experience, describing it as if they were there. I have yet to hear bad reviews of the spatial video/photo experience.

I have a feeling this is going to be one of those things where we'll wish we had video of past events in our own and our kids' lives. The sooner we all start recording moments in spatial video, the better, even if we don't have any immediate plans for buying a Vision Pro.
 
Don't the lenses have to be separated like on the Vision Pro, matching the eye distance? How do they achieve the same thing with the lenses right next to each other?

Not necessarily. The sensor size to lens distance ratio is probably comparable to our retina to eye distance ratio.

Distance to the subject also comes into play and depth perception is dependent primarily on foreground/background separation in addition to two different perspectives that stereo vision provides.

A Vision Pro would allow you to get closer to a subject while maintaining detail on both sides of the subject, but a camera a few metres away from the subject would get similar detail, enough to sell the idea of stereo vision.
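To put rough numbers on the foreground/background point, here's a quick Swift scratchpad. Everything here is approximate, and the ~19.5 mm baseline is an assumption borrowed from the figures quoted further down the thread:

```swift
import Foundation

// Relative disparity between a subject and its background is what mostly sells the depth:
// for a baseline b, it's roughly b/d_subject - b/d_background (radians, small angles).
// The ~19.5 mm baseline is an illustrative assumption, not a confirmed spec.
let baselineMeters = 0.0195

func relativeDisparityDegrees(subject: Double, background: Double) -> Double {
    (baselineMeters / subject - baselineMeters / background) * 180.0 / Double.pi
}

// A subject a couple of metres away with the background a few metres behind it
// still produces a measurable left/right offset, even with a small baseline.
for (subject, background) in [(1.0, 4.0), (2.0, 6.0), (3.0, 10.0)] {
    let d = relativeDisparityDegrees(subject: subject, background: background)
    print(String(format: "subject %.0f m, background %.0f m: ≈ %.2f° relative disparity", subject, background, d))
}
```

Even with lenses only a couple of centimetres apart, the subject/background offset doesn't vanish at everyday distances, which seems to be the point about separation above.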
 
Not necessarily. The sensor size to lens distance ratio is probably comparable to our retina to eye distance ratio.

Distance to the subject also comes into play and depth perception is dependent primarily on foreground/background separation in addition to two different perspectives that stereo vision provides.

A Vision Pro would allow you to get closer to a subject while maintaining detail on both sides of the subject, but a camera a few metres away from the subject would get similar detail, enough to sell the idea of stereo vision.
I wonder if the Vision Pro will let you get up and walk around while viewing spatial video. How will it look if you look at the video from the side rather than from the front? Or just at an angle other than straight on.

I'm thinking of the holograms from Minority Report. When they're viewed from the side they don't model the contours of the faces. There are just two planes (subject and background). Our brains fill in the solidness of the subject from other cues, like the lighting and shadows. And there's a hole in the background, because the camera taking the picture didn't capture what was behind the subject.

Or is the image just stereoscopic? We get two angles, one for each eye. And if we try to walk around, the spatial video we're watching will move to stay the same distance from us and facing us head-on. You won't be able to go around to the back of the video any more than you can get to the end of a rainbow. In that case, the iPhone may still create that multi-plane model, masking the foreground subject, then snapping two images at the proper eye separation from the shadow-box it created from the original frame. iOS has been doing that kind of thing for years.
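If it is the masked-foreground trick, a toy version might look something like this (Swift; the Layer type, the two flat planes, and the disparity values are all made up for illustration, not anything Apple has described):

```swift
// Toy "shadow-box" sketch: two flat layers (background + masked foreground), each shifted
// horizontally by its own disparity to fake the second eye's viewpoint. The Layer type,
// the grayscale representation, and the numbers are invented purely for illustration.
struct Layer {
    var pixels: [[UInt8]]   // grayscale plane, row-major
    var mask: [[Bool]]      // true where this layer is opaque
    var disparity: Int      // horizontal shift (in pixels) applied for the synthesized eye
}

// Composite layers back-to-front. Spots nothing lands on (the hole behind the subject) stay 0.
func synthesizeView(width: Int, height: Int, layers: [Layer]) -> [[UInt8]] {
    var out = [[UInt8]](repeating: [UInt8](repeating: 0, count: width), count: height)
    for layer in layers {                               // assume ordered background → foreground
        for y in 0..<height {
            for x in 0..<width where layer.mask[y][x] {
                let shifted = x + layer.disparity
                if shifted >= 0 && shifted < width {
                    out[y][shifted] = layer.pixels[y][x]
                }
            }
        }
    }
    return out
}
// Usage idea: keep the captured frame as one eye, and build the other by giving the
// foreground layer a few pixels of disparity and the background little or none.
```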

I think there will eventually be utilities to take some 2D movies and make them appear 3D. FaceTime has had a feature that does this to still photos. That effect will likely range from uncanny to comedic. Remember the melting bridges in Google Maps?
 
Millions of iPhone 15 Pros will support the recording of spatial video even before the Vision Pro goes on sale. I have only one question: Why aren't more people congratulating Tim Cook on this amazing product strategy?
I think it's especially impressive to look back now and see how Apple has been slowly but steadily laying the foundation for the Vision Pro. Animoji from the iPhone X, design cues found in the Apple Watch and AirPods Max, Stage Manager on the iPad, and now spatial video recording in the latest iPhones: it's all coming together in a manner you simply won't see from the competition.
 
Spatial Video is an exciting feature. Once it was introduced, it made me instantly change my decision from skipping this generation to "must buy." I'm also getting the Vision Pro, which was already the plan.

But I still have many questions. As a video editor, I wonder how it will work with Final Cut Pro. And will Spatial Video be an industry standard that other platforms will also support, such as YouTube?
The 3D video format is almost certainly based on MV-HEVC, which is new and was announced at this year's WWDC. There's nothing public for Final Cut Pro yet, but I will certainly be asking the FCP team about that at the FCP Creative Summit this November.
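For anyone who wants to poke at the files before FCP support shows up, here's a minimal sketch of checking for the stereo MV-HEVC characteristic that AVFoundation appears to expose in iOS 17 / macOS 14. Treat the exact property name and the details as my assumption until the spatial-video workflow is properly documented:

```swift
import AVFoundation

// Minimal sketch: does a movie file carry stereo MV-HEVC video?
// Assumes the .containsStereoMultiviewVideo media characteristic added in iOS 17 / macOS 14;
// treat the specifics as a guess until Apple documents the spatial-video tooling.
func looksLikeSpatialVideo(at url: URL) async throws -> Bool {
    let asset = AVURLAsset(url: url)
    let videoTracks = try await asset.loadTracks(withMediaType: .video)
    // MV-HEVC packs both eye views into a single video track rather than two separate tracks.
    return videoTracks.contains { $0.hasMediaCharacteristic(.containsStereoMultiviewVideo) }
}
```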

If you've ever wanted to visit Apple in person, this is a great way to do it: fcpcreativesummits.com
 
So on an iPhone screen, it will still be available as a normal 2D video. Maybe it would be wise to record every video in spatial format. It sounds like it will use up massive amounts of data, which is probably why they increased the iCloud storage tiers.
 
Remember all the people slagging on the Vision Pro about the idea of Spatial Video? The idea of how awkward it would be for a dad to wear a Vision Pro to take video at their kid's birthday party? :)

I said then that Apple would roll this 3D photography capability into iPhone. Done.
It's done when I say it's done.

We've yet to see how good these spatial videos look when taken by the iPhone versus the Vision Pro (versus whatever they used to record the demo). It's still early days. I'm excited, and curious about future versions of spatial video that will make this first iteration (that I haven't even seen yet) look crude and primitive.
 
Since these spatial videos and photos are obtained thanks to the wide camera and the regular camera, I wonder: why is this feature limited to the iPhone Pros?
People are like 'cause they want to sell you the Pro', but to me it seems obvious that it's because the Pros have LiDAR: they can get the two video streams from the two cameras, which are really close to each other, and still add the depth information thanks to the LiDAR. It's probably a more complex process than we think, not just two videos.
 
I wonder if the Vision Pro will let you get up and walk around while viewing spatial video. How will it look if you look at the video from the side rather than from the front? Or just at an angle other than straight on.
Emerging technology uses generative AI to construct a 3D model from a 2D image. Here it is for products:

You could see how an adapter layer could be trained on images of the people who regularly appear in your library, giving it enough detail to make the generated 3D image realistic.

I don't know the model size of Alpha3D, but it's possible that this plus the LoRA could run entirely on-device.
 
People are like 'cause they want to sell you the Pro', but to me it seems obvious that it's because the Pros have LiDAR: they can get the two video streams from the two cameras, which are really close to each other, and still add the depth information thanks to the LiDAR. It's probably a more complex process than we think, not just two videos.
At first I thought that too. But then it occurred to me that 3D cameras don't have a LiDAR sensor, so two points of view are enough to recreate a 3D image digitally (duh!). So I'm back where I started: why can the Pro version do it while the regular can't?
 
I just don't get how the iPhone could record spatial video very well when the separation between its lenses isn't anywhere close to the separation between human eyes required for stereo vision. Is there some AI trick at play here?

And won't the video look kind of strange since one camera won't be as good as the other? For instance, the main camera is extremely high quality and low noise compared to the ultra-wide, which they are likely using as the secondary; wouldn't they have to crop in on that noisier 12MP sensor to get the same field of view as the normal wide? And won't there be distortion on the ultra-wide that would need to be corrected in real time to match? No way they're using the telephoto, unless that is feeding some additional perspective data to an algorithm? But the 5x on the Max makes me think no, because that's way too much peripheral context lost.

This whole thing thoroughly confuses me as a photographer who has a decent grasp of a lot of the math and science behind optics.
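For what it's worth, some rough napkin math on the crop question, using the published 24 mm (main) and 13 mm (ultra-wide) equivalent focal lengths. Everything here is approximate:

```swift
import Foundation

// Napkin math: how much of the 12 MP ultra-wide survives a crop to match the main
// camera's field of view. Focal lengths are the published 35mm-equivalent figures;
// everything else here is rough.
let mainEquivFocal = 24.0        // mm, main camera
let ultraWideEquivFocal = 13.0   // mm, ultra-wide camera
let ultraWideMegapixels = 12.0

let linearCrop = mainEquivFocal / ultraWideEquivFocal             // ≈ 1.85×
let usableMegapixels = ultraWideMegapixels / (linearCrop * linearCrop)

print(String(format: "Linear crop ≈ %.2f×, usable ultra-wide ≈ %.1f MP", linearCrop, usableMegapixels))
// ≈ 1.85× crop, leaving roughly 3.5 MP of the ultra-wide to cover the main camera's framing.
```

So the second eye really would be working from roughly a quarter of the ultra-wide's pixels, before any distortion correction, which lines up with the suspicion that it's softer and noisier than the main camera's view.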
 
I'd be more interested in other applications, like shooting 3D footage to be post-produced.
Not judging whoever feels excited about it, but I personally found the idea of recording the precious moments in your life with a headset on to be most disturbing.
 
I would honestly be more impressed to finally be able to shoot video with both cameras (front and back) at the same time, like Samsung. Or is there at least a third-party app that can do this?
There is an app called DoubleTake that does this.
 
The centers of the lenses are about 19.5mm apart. The average human IPD is about 63mm.
So the layout definitely isn’t optimal for 3D capture.
Canon has a special dual-fisheye lens for their interchangeable-lens camera system. It has a 60mm offset (probably best to go a bit lower than average; IPD is measured at a distance of infinity, and it is lower when you are looking at a closer object).
It will be a compromise. No software magic will make it look as good as an offset that matches human IPD, sort of like how portrait mode on the iPhone is a poor substitute for a big lens with a wide aperture.
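One way to feel the size of that compromise: with a 19.5 mm baseline, a subject at a given distance produces the parallax our ~63 mm-spaced eyes would see from roughly three times farther away. A quick Swift scratchpad (the arithmetic is straightforward; the interpretation is only a rough intuition of mine):

```swift
import Foundation

// Scratchpad: with a 19.5 mm capture baseline viewed with ~63 mm-spaced eyes, a subject at
// distance d produces the parallax a natural viewer would get at d × (63 / 19.5) ≈ 3.2 × d.
// Figures are the ones quoted above; the flattening interpretation is an assumption.
let phoneBaselineMM = 19.5
let humanIPDMM = 63.0
let stretch = humanIPDMM / phoneBaselineMM   // ≈ 3.2

for subjectDistance in [0.5, 1.0, 2.0, 3.0] {   // metres
    print(String(format: "Shot at %.1f m ≈ natural parallax of a subject at %.1f m", subjectDistance, subjectDistance * stretch))
}
```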

I was thinking that if Apple arranged four cameras (ultra-wide, wide, telephoto, second ultra-wide) vertically with the current spacing, the two ultra-wides would be almost perfectly spaced. The phone would look goofy without a case, though, and it would be a waste of money for most iPhone Pro purchasers because a large majority of them wouldn't have the Vision Pro. Maybe a fourth lens could work for an iPhone Ultra, and maybe even a 3D screen on the phone. I had fun taking stereoscopic photos on my Nintendo 3DS, even though the camera quality is awful.
I seeeeriously believe they are going to include LiDAR Scanner readings for distance measurements....
 
Spatial Video is an exciting feature. Once it was introduced, it made me instantly change my decision from skipping this generation to "must buy." I'm also getting the Vision Pro, which was already the plan.

But I still have many questions. As a video editor, I wonder how it will work with Final Cut Pro. And will Spatial Video be an industry standard that other platforms will also support, such as YouTube?
I am praying FCP gets really kick-ass 3D editing features so I can fully edit in 3D, or maybe a custom FCP3D app, who knows... but with Apple's App Store pricing I would be down 100%. – Fan Boy (me)
 