I have a horrible feeling it will be locked down to some proprietary format but I’d love to be able to record and then view it on a Reverb G2
> Millions of iPhone 15 Pros will support the recording of spatial video even before the Vision Pro goes on sale. I have only one question: why aren't more people congratulating Tim Cook on this amazing product strategy?

Because they prefer to rag on Cook, his leadership, and Apple in general? Although it won't always work out, Cook and his senior leadership team seem to know what they are doing most of the time.
> First of all, it’s the first time I’ve seen the word “hitherto”.

I think I've heard "hitherto" most of my life, but not very often.
> Spatial Video is an exciting feature. Once it was introduced, it made me instantly change my decision from skipping this generation to "must buy." I'm also getting the Vision Pro, which was already the plan.
>
> But I still have many questions. As a video editor, I wonder how it will work with Final Cut Pro. And will Spatial Video be an industry standard that other platforms, such as YouTube, will also support?

I had already decided to buy, but this is a great bonus. My worry is that I will want to use it often between now and when the AVP is released, so that I have a library of personal spatial videos to view with my new headset.
Don’t the lenses have to be separated like on the Vision Pro, roughly at eye distance? How do they achieve the same effect with the lenses right next to each other?
> Because that’s what gets people to pay more for the Pros.

I doubt that's the reason. I'm pretty sure Apple is trying to build as much momentum for the Apple Vision as possible.
> Not necessarily. The sensor-size-to-lens-distance ratio is probably comparable to our retina-to-eye distance ratio.
>
> Distance to the subject also comes into play, and depth perception depends primarily on foreground/background separation, in addition to the two different perspectives that stereo vision provides.
>
> A Vision Pro would allow you to get closer to a subject while maintaining detail on both sides of the subject, but a camera a few metres away from the subject would get similar detail, enough to sell the idea of stereo vision.

I wonder if the Vision Pro will let you get up and walk around while viewing spatial video. How will it look if you view the video from the side rather than from the front? Or just at an angle other than straight on?
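To put rough numbers on the foreground/background separation point above, here is a quick back-of-envelope sketch. The IPD figure is the commonly cited human average; the subject and background distances are made-up illustrative values, not anything stated in this thread:

```python
import math

# Illustrative numbers only: average human IPD, plus made-up subject and
# background distances -- not measurements from this thread.
IPD_M = 0.063

def disparity_rad(baseline_m, distance_m):
    # Angle subtended by the two viewpoints at a point distance_m away.
    return 2 * math.atan(baseline_m / (2 * distance_m))

subject_m, background_m = 2.0, 10.0
# The stereo depth cue is the *difference* in disparity between the
# foreground subject and the background behind it.
cue_deg = math.degrees(disparity_rad(IPD_M, subject_m)
                       - disparity_rad(IPD_M, background_m))
print(f"relative disparity: {cue_deg:.2f} deg")
```

The takeaway matches the comment: the cue shrinks as both subject and background move farther away, which is why a camera a few metres back still "sells" stereo as long as there is foreground/background separation.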
> I would honestly be more impressed to finally be able to shoot video with both cameras (front and back) at the same time, like Samsung. Or is there at least a third-party app that can do this?

I use DoubleTake.
> Millions of iPhone 15 Pros will support the recording of spatial video even before the Vision Pro goes on sale. I have only one question: why aren't more people congratulating Tim Cook on this amazing product strategy?

I think it's especially impressive to now look back and see how Apple has been slowly but steadily laying the foundation for the Vision Pro. Animoji from the iPhone X, design cues found in the Apple Watch and AirPods Max, Stage Manager on the iPad, and now spatial video recording in the latest iPhones: it's all coming together in a manner you simply won't see from the competition.
> Spatial Video is an exciting feature. Once it was introduced, it made me instantly change my decision from skipping this generation to "must buy." I'm also getting the Vision Pro, which was already the plan.
>
> But I still have many questions. As a video editor, I wonder how it will work with Final Cut Pro. And will Spatial Video be an industry standard that other platforms, such as YouTube, will also support?

The 3D video format is almost certainly based on MV-HEVC, which is new and was announced at this year’s WWDC. There’s nothing public for Final Cut Pro yet, but I will certainly be asking the FCP team about that at the FCP Creative Summit this November.
> Remember all the people slagging on the Vision Pro over the idea of Spatial Video? The idea of how awkward it would be for a dad to wear a Vision Pro to take video at their kid's birthday party?
>
> I said then that Apple would roll this 3D photography capability into the iPhone. Done.

It's done when I say it's done.
> Since these spatial videos and photos are obtained thanks to the wide camera and the regular camera, I wonder: why is this feature limited to iPhone Pros?

People being like “’cause they want to sell you the Pro,” but to me it seems obvious that it’s because the Pros have LiDAR, so they can get the two video streams from the two cameras, which are really close to each other, and still add the depth information thanks to the LiDAR. It’s probably a more complex process than we think, not just two videos.
> I wonder if the Vision Pro will let you get up and walk around while viewing spatial video. How will it look if you view the video from the side rather than from the front? Or just at an angle other than straight on?

Emerging technology uses generative AI to construct a 3D model from a 2D image. Here it is for products:
> People being like “’cause they want to sell you the Pro,” but to me it seems obvious that it’s because the Pros have LiDAR, so they can get the two video streams from the two cameras, which are really close to each other, and still add the depth information thanks to the LiDAR. It’s probably a more complex process than we think, not just two videos.

At first I thought that too. But then it occurred to me that 3D cameras don't have a LiDAR sensor, so two points of view are enough to recreate a 3D image digitally (duh!). So I'm back where I started: why can the Pro version do it, and the regular can't?
> I would honestly be more impressed to finally be able to shoot video with both cameras (front and back) at the same time, like Samsung. Or is there at least a third-party app that can do this?

There is an app called DoubleTake that does this.
> The centers of the lenses are about 19.5mm apart. The average human IPD is about 63mm.
>
> So the layout definitely isn’t optimal for 3D capture.
>
> Canon has a special dual-fisheye lens for their interchangeable-lens camera system. It has a 60mm offset (probably best to go a bit lower than average; IPD is measured at a distance of infinity and is lower when you are looking at a closer object).
>
> It will be a compromise. No software magic will make it look as good as an offset that matches human IPD, sort of like how portrait mode on the iPhone is a poor substitute for a big lens with a wide aperture.

I seeeeriously believe they are going to include LiDAR Scanner readings for distance measurements....
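For a sense of how much the 19.5mm lens spacing undershoots the 63mm human IPD mentioned above, here is a rough sketch. The 2m subject distance is an assumption for illustration; the geometry is the standard small-baseline disparity calculation, not anything Apple has published:

```python
import math

# Baselines taken from the comment above: iPhone lens centers ~19.5 mm apart,
# average human IPD ~63 mm. The 2 m subject distance is an assumption.
IPHONE_BASELINE_M = 0.0195
HUMAN_IPD_M = 0.063
DISTANCE_M = 2.0

def disparity_deg(baseline_m, distance_m):
    # Angle subtended by the two lens centers at a point distance_m away.
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

iphone = disparity_deg(IPHONE_BASELINE_M, DISTANCE_M)
human = disparity_deg(HUMAN_IPD_M, DISTANCE_M)
print(f"iPhone: {iphone:.2f} deg  human: {human:.2f} deg  "
      f"ratio: {iphone / human:.2f}")
```

Since the angles are small, the ratio is essentially just 19.5/63: the iPhone's baseline yields only about a third of the raw angular disparity human eyes would see at the same distance, which is the geometric gap any software processing would have to compensate for.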
I was thinking that if Apple arranged 4 cameras—ultra-wide, wide, telephoto, 2nd ultra-wide—vertically with the current spacing, the 2 ultra-wides would be almost perfectly spaced. The phone would look goofy without a case, though, and it would be a waste of money for most iPhone Pro purchasers because a large majority of them wouldn’t have the Vision Pro. Maybe a 4th lens could work for an iPhone Ultra, and maybe even having a 3D screen on the phone. I had fun taking stereoscopic photos on my Nintendo 3DS, even though the camera quality is awful.
> Spatial Video is an exciting feature. Once it was introduced, it made me instantly change my decision from skipping this generation to "must buy." I'm also getting the Vision Pro, which was already the plan.
>
> But I still have many questions. As a video editor, I wonder how it will work with Final Cut Pro. And will Spatial Video be an industry standard that other platforms, such as YouTube, will also support?

I am praying FCP gets really kick-ass 3D editing features so I can fully edit in 3D, or maybe a custom FCP3D app, who knows... but even with Apple's App Store pricing I would be down 100%. – Fan Boy (me)