Aren't the lenses too close together for proper 3D cinematography? The distance should be about the same as the average distance between our own eyes, I believe.

Is that even far enough apart for there to be enough parallax for a proper stereoscopic view?


Canon's new stereoscopic RF-S7.8mm F4 STM DUAL camera lens for spatial video recording recently became available for pre-order. In the U.S., pricing is set at $449.99, and orders are estimated to be delivered in mid-November.

Canon-Vision-Pro-Lens.jpg

Apple and Canon announced the lens at WWDC in June. The lens attaches to Canon's EOS R7, enabling the mirrorless camera to record 3D videos for playback on AR/VR headsets like Apple's Vision Pro and Meta's Quest 3. More details about the lens are available on Canon's website, and in our coverage of the WWDC announcement.

After recording spatial videos with the Canon EOS R7 and this lens, Apple said users would be able to edit the videos in Final Cut Pro on the Mac, and upload them to Vimeo. Final Cut Pro will likely be updated with spatial video editing capabilities in mid-November, and Vimeo released a Vision Pro app with spatial video support last month.

Spatial video can also be recorded on both iPhone 15 Pro models and all iPhone 16 models, with no additional hardware required.

Article Link: Canon Now Accepting Orders for Spatial Video Lens Previewed at WWDC


This is a stereo camera that records MV-HEVC video. Yes, the Apple Vision Pro can play back this kind of video, but there are many other devices that can, including the "$8 Google Cardboard" headset.


Yes, it costs eight bucks. It is not quite as good as the $3,500 Apple headset. I have one of these, though mine is made of plastic with better optics. You slide an iPhone into the slot and the software places two images on the screen. The resolution is only half that of your iPhone, but the 3D effect looks decent. An iPhone already has the hardware to do head motion tracking, so in theory you could do actual VR.

And then there are the other, better devices like the old Oculus headsets. The Canon camera simply creates MV-HEVC files; how you edit them and watch the video is up to you. Viewer prices range from $8 to $3,500, with many price points in between.

Some people have said the lenses are too close together to do stereo, but all that requires is finer pixels. The two images only need to be different enough to compute stereo disparity. MV-HEVC is not simply two video streams; it is one image plus stereo prediction data, a little like the normal interframe prediction that HEVC already does.

I think stereo disparity can be adjusted computationally to give the effect the filmmaker wants (but I have no good references on that).
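Here is a rough back-of-the-envelope sketch of the geometry I mean, in plain Python. The focal length and lens spacing numbers are my own assumptions for illustration, not Canon's published specs:

```python
# Toy stereo geometry (illustrative numbers only, not measured specs).
# A point at distance Z seen by two viewpoints separated by baseline B,
# through optics with a focal length of f pixels, shows a horizontal
# disparity of roughly:  disparity_px = f_px * B / Z

def disparity_px(f_px, baseline_m, distance_m):
    """Approximate horizontal disparity, in pixels, of a point at distance_m."""
    return f_px * baseline_m / distance_m

f_px = 3000.0       # assumed focal length expressed in pixels
lens_sep = 0.012    # ~12 mm lens spacing -- my guess, not a spec
human_ipd = 0.063   # ~63 mm average eye-to-eye distance

for z in (1.0, 3.0, 10.0):
    d_lens = disparity_px(f_px, lens_sep, z)
    d_eyes = disparity_px(f_px, human_ipd, z)
    # In principle, playback software could rescale the recorded disparity
    # toward the eye-spacing value; that is the computational adjustment I mean.
    print(f"{z:>4} m: lens {d_lens:6.1f} px   eyes {d_eyes:6.1f} px")
```

As long as the lens disparity stays comfortably above a pixel, the software has something to work with; scaling it up or down is then a creative choice.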


Note that MV-HEVC is not limited to only two layers. I hope some day they will make true six-degrees-of-freedom video using MANY layers taken from many points of view.

The required distance between the lenses depends on two things:

1) the size of the pixels and,
2) the range of shooting distances you want.

Even human stereo vision is subject to the above rules. With our eye resolution and eye-to-eye distance, we are "tuned" to see stereo from about the distance of our hands out to roughly 8 or 10 meters away. Beyond that we judge distance by other means.

All you need is for the two viewpoints to be far enough apart that the two images are measurably different; with very small pixels, the images don't need to be very different.
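To put rough numbers on rules (1) and (2), here is a small Python sketch. The pixel pitch, focal length, and lens spacing are my own ballpark assumptions, not measured specs:

```python
# How far out do two views stay usefully different, i.e. keep at least
# about one pixel of disparity?
#   disparity_px ~= (focal_length / pixel_pitch) * baseline / distance
# so the usable range is roughly:
#   Z_max ~= (focal_length / pixel_pitch) * baseline / min_disparity_px

def max_stereo_range_m(focal_mm, pixel_pitch_um, baseline_mm, min_disparity_px=1.0):
    f_px = focal_mm / (pixel_pitch_um / 1000.0)   # focal length in pixels
    return f_px * (baseline_mm / 1000.0) / min_disparity_px

# Ballpark guesses for illustration only:
print(max_stereo_range_m(7.8, 3.2, 12))   # ~29 m with ~3.2 um pixels, 12 mm spacing
print(max_stereo_range_m(7.8, 1.6, 12))   # finer pixels roughly double the range
print(max_stereo_range_m(7.8, 3.2, 63))   # eye-like spacing reaches much farther
```

Finer pixels or a wider baseline both extend the usable range; that is the whole trade-off.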
 
love people downvoting this for just being an accurate observation. pls downvote me too, i find you hilarious :)
Yes, the observation that the lenses are closer together than human eyes is right. The conclusion that it cannot create realistic stereo is not correct. All it needs is enough separation that, given the pixel size, the software can compute stereo disparity.

There is enough metadata in the MV-HEVC file to allow good playback.
 
Tell me you’re ignorant about lens physics without telling me you’re ignorant about lens physics.
Care to point out where @zorinlynx's post is ignorant?

Your eyes depend primarily on parallax/stereo correlation for depth perception, and the distance your eyes are apart determines the parallax. This camera's lenses are far closer together, meaning less parallax, meaning the angles of convergence will be way off compared to people's eyes, meaning the perceived size/distance of objects will be way off.

They can probably fix this with some computational photography magic, but it would be better to just have the lenses the correct distance apart.
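To put rough numbers on the convergence difference (plain Python; the 12 mm lens spacing is a guess, not a published spec):

```python
import math

# Convergence angle toward a point at distance Z for two viewpoints
# separated by baseline B:  angle = 2 * atan(B / (2 * Z))

def convergence_deg(baseline_m, distance_m):
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

human_ipd = 0.063   # ~63 mm average eye spacing
lens_sep = 0.012    # ~12 mm, my assumption for this lens

for z in (0.5, 2.0, 5.0):
    print(f"{z} m: eyes {convergence_deg(human_ipd, z):.2f} deg, "
          f"lens {convergence_deg(lens_sep, z):.2f} deg")
```

The two angles differ by roughly the ratio of the baselines, which is why playback would have to rescale disparity to fake eye-like depth.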
 
Noice.
Instantly turn your camera into Peppa Pig.

It's still a joke that you can't use this lens on any other camera. As with anamorphic lenses, it should just be a software problem to turn a double image into a stereoscopic one.

None of these issues are addressed.
Likely you can use the lens on any camera that has built-in MV-HEVC encoding, but as of today Canon only makes one such camera. Yes, it is a software thing, but the software runs in the camera.

If you put this lens on a normal camera then you would get what Apple calls "side by side" stereo video where there are two images on each frame. Apple has a tutorial on how to convert side-by-side to MV-HEVC. The process is not hard but it is NOT something most non-technical consumers could do.

https://developer.apple.com/documen..._3d_video_to_multiview_hevc_and_spatial_video

So it appears you COULD use this lens on any Canon camera, but you would need to convert the side-by-side format to something else using software that is not on the camera.
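If you are curious what the first step of that conversion looks like, here is a toy Python sketch that just splits a side-by-side frame into left and right views. The actual MV-HEVC packaging needs the AVFoundation APIs covered in the linked Apple doc; this only shows the easy part:

```python
import numpy as np

def split_side_by_side(frame: np.ndarray):
    """Split a side-by-side frame (H x W x 3) into left/right eye views."""
    h, w, _ = frame.shape
    return frame[:, : w // 2], frame[:, w // 2 :]

# Example: a fake 4K-wide side-by-side frame becomes two 2048-wide eye views.
frame = np.zeros((2160, 4096, 3), dtype=np.uint8)
left, right = split_side_by_side(frame)
print(left.shape, right.shape)   # (2160, 2048, 3) (2160, 2048, 3)
```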
 
Yes, the observation that the lenses are closer together than human eyes is right. The conclusion that it cannot create realistic stereo is not correct. All it needs is enough separation that, given the pixel size, the software can compute stereo disparity.

There is enough metadata in the MV-HEVC file to allow good playback.
does the vision pro 'compute' anything though? i thought it just delivers each of those feeds to each eye. and yes, you can get a stereo effect from less, but it is, nonetheless, less. also, while the eos r7 shoots 4k, this lens will no doubt produce side-by-side 2k content, which is what we've been using in vr for a decade. i'm sure it's enough of a gimmick to sell some people, but personally i wouldn't find 1-2cm of parallax very rewarding.
 
Care to point out where @zorinlynx's post is ignorant?
Where is the post ignorant?

He seems not to know how the data are recorded. Yes, the images land side-by-side on the image sensor, but the camera records ONE image and then computes stereo disparity from it; it does not record two images.

Notice that an iPhone can record spatial video for use with Vision Pro. The iPhone uses two DIFFERENT lenses: one is a wide angle or zoom, the other a standard. The two lenses are even closer together and have different resolutions. The iPhone also does not record two images; that is not how spatial video works.

It is OK to be ignorant of technical details. But BEFORE you post a complaint it is best to become educated; you can look up and read how this all works.
 
Where is the post ignorant?

He seems not to know how the data are recorded. Yes, the images land side-by-side on the image sensor, but the camera records ONE image and then computes stereo disparity from it; it does not record two images.

Notice that an iPhone can record spatial video for use with Vision Pro. The iPhone uses two DIFFERENT lenses: one is a wide angle or zoom, the other a standard. The two lenses are even closer together and have different resolutions. The iPhone also does not record two images; that is not how spatial video works.

It is OK to be ignorant of technical details. But BEFORE you post a complaint it is best to become educated; you can look up and read how this all works.
i did look this up and found no consistent info either way. some people said spatial video is just apple's marketing term for stereo 3d video. others said "you can walk freely through your videos", which sounds like a stretch and the hallmark of some tech journalist misinterpreting what 3d video is. if it computes a depth map and generates a stereo image on the fly, i could see that allowing for slight dynamic parallax and compensating for the minuscule interocular distance of two iphone cameras. but also... i'd be wary of how well that works without creating some blobby, smeary mess in challenging scenarios where it struggles to compute depth (esp if one goes off the portrait mode/cinematic mode on their cameras as indicative of that depth computation).

my assumption is that it is stereo video and any 'processing' done on iphone recordings is to compensate for the colour balance and fov differences between the lenses. open to being shown any tech paper that shows otherwise.
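to be clear, this next bit is just me speculating along the same lines: if the processing really is fov/colour matching, the fov part would be conceptually as simple as this toy python sketch (made-up fov and resolution numbers, not actual iphone specs):

```python
import numpy as np

# toy sketch: crop the wider camera's frame so its field of view roughly
# matches the narrower camera's, before treating the pair as a stereo pair.
def crop_to_match_fov(wide: np.ndarray, wide_fov_deg: float, tele_fov_deg: float):
    h, w, _ = wide.shape
    scale = np.tan(np.radians(tele_fov_deg / 2)) / np.tan(np.radians(wide_fov_deg / 2))
    new_w, new_h = int(w * scale), int(h * scale)
    x0, y0 = (w - new_w) // 2, (h - new_h) // 2
    return wide[y0 : y0 + new_h, x0 : x0 + new_w]

wide_frame = np.zeros((3024, 4032, 3), dtype=np.uint8)   # made-up resolution
matched = crop_to_match_fov(wide_frame, wide_fov_deg=120.0, tele_fov_deg=70.0)
print(matched.shape)   # centre crop covering roughly the narrower fov
```

(the colour matching and any depth-map cleverness would be the hard part, which is exactly where i'd expect the smearing.)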
 
Because Canon already makes a dual-fisheye lens for their full frame cameras. It costs $2000. This lens costs less than a quarter of that, and it works on their also less expensive APS-C cameras. It opens up the market to more creators.
Can you provide a link to the dual-fisheye lens for the Canon full frame camera? I'd be interested in checking that out. Thanks!
 
Here it is, and ooo the price dropped $200 today. Amazon reflects that new price too.
Thanks! That's a really interesting product. Shame it has no path to create content for the AVP, and that you need to pay for two software subscriptions to use it. I'll keep an eye on it; hopefully Canon will figure out there is a wider market for this lens. Ideally I'd like to record 3D 180° pics & video to export to the yet-to-be-updated Final Cut X.
 
Thanks! That's a really interesting product. Shame it has no path to create content for the AVP, and that you need to pay for two software subscriptions to use it. I'll keep an eye on it; hopefully Canon will figure out there is a wider market for this lens. Ideally I'd like to record 3D 180° pics & video to export to the yet-to-be-updated Final Cut X.
The one just like it that already exists for the Canon R7 is also very interesting and seems more "pro" and capable of full 180°. The new little one only has a 60° field of view.
 
It would be much better if people could edit a 3D video with a 3D headset, namely Apple Vision Pro.
I've said it already, but I truly think apple should rethink and release a version of all its apps for Vision Pro, including the professional ones.
Yes, they should, but since AFAIK Final Cut Pro still doesn't support Spatial Video directly, it would be incredibly frustrating to have it on the AVP without that feature. It sounds like that support is supposed to be added to FCP in the next update, so I wouldn't be surprised if they add an AVP version of FCP at the same time, as that would kind of explain the delay in adding Spatial support. That's what I'm hoping, anyway.
 
Aren't the lenses too close together for proper 3D cinematography? The distance should be about the same as the average distance between our own eyes, I believe.

Is that even far enough apart for there to be enough parallax for a proper stereoscopic view?
I have a Sony 3D Handycam with similar lens spacing and it works fine for 3D.
 
The one just like it that already exists for the Canon R7 is also very interesting and seems more "pro" and capable of full 180°. The new little one only has a 60° field of view.
Also interesting, finally there is this:
Yet anything with Canon & VR requires their subscription software, which leaves me ill at ease. 🤬
 
Also interesting, finally there is this:
Yet anything with Canon & VR requires their subscription software, which leaves me ill at ease. 🤬
$50/year when they already have you for anywhere between $1,700 and $5,000 for one of the lenses and a compatible camera seems so petty. My bet would be Apple will license the Canon software and just bake it into Final Cut. I am hoping they choose to support more than just Spatial Video as an output format though. If I am a creator investing in creating this content, I want to be able to publish to more than one platform.
 
Who is making content for the 5 people that own and actually still use an AVP?

If the AVP isn't a consumer-level product, then why is Canon making consumer-level lenses for consumer-level cameras to produce consumer-level content for Vimeo that can only be viewed on an AVP?

Meta users aren't going to care about this unless it's for porn. They buy VR headsets for gaming.
FYI the Meta Quest can also play the Apple immersive videos so the market is much larger than just AVP.
 
Are they getting the Vimeo app too? A simple universal way to distribute and embed content on the web is critical for people adopting this.
 
FYI the Meta Quest can also play the Apple immersive videos so the market is much larger than just AVP.

Presumably 3DTVs can also play them. That doesn't really increase the market. These are also not 'immersive' videos. They're not 360VR. They're stereoscopic videos filmed with lenses closer together than human eyes, so they're not even an effective 3D format.

Ultimately this concept will fail for a really simple reason: these videos cannot be played back on the devices used to shoot them. People want to shoot it and share it. It's always going to be extremely niche adoption. Who is going to invest in producing videos for an audience so small? Nothing about this concept makes any sense.
 
They're stereoscopic videos filmed with lenses closer together than human eyes, so they're not even an effective 3D format.

Comparing the lens separation of the 2.8 dual fisheye with the distance between my eyes in a mirror, the distances look about the same, or very close.

Not sure why you say they are not an effective 3D format.

these videos cannot be played back on the devices used to shoot them. People want to shoot it and share it.

Sharing is quite simple. Just put it in Photos and anyone can see it. In 2D if they don't have a Vision Pro.
 
Sharing is quite simple. Just put it in Photos and anyone can see it. In 2D if they don't have a Vision Pro.

So, people can't share a 'spatial video' with someone that has the same device as was used to film it. They can just share a low-resolution video with someone. If they don't have a Vision Pro (the few people that do don't use them, so effectively no one has one), then Spatial Videos cannot be viewed on any other Apple product, so sharing them is pointless.
 
people can't share a 'spatial video' with someone that has the same device as was used to film it.

Don't understand this sentence.

Spatial Videos cannot be viewed on any other Apple product, so sharing them is pointless.

They can be viewed on most Apple products, only in 2D rather than 3D. I'm watching one that I just created on my iPad, in a shared album so others can see it.
 