They’re gonna sell like 8 of these total at that price.
Every decent camera rental shop will have these. They are not made with individuals in mind, but professionals who rent. If someone rents this for $1,000 a week (and charges it to the client) and uses it two or three times a year, it's a no-brainer compared to buying one that sits on the shelf week after week between shoots and is outdated in two years.
 
Too bad the corn crop is so poor these days. In the end, quantity always comes at the expense of quality, as commodification demands the cheapening of every good thing. :(
Torrent 🏴‍☠️ There's always a guy with a maxed-out Mastercard helping the bros.
 
Is it just me, or are all these consumer cameras putting the sensors way too close to each other? What are the 180-degree stereoscopic production studios using?
Human pupils are, on average, around 2.5 inches apart (62-64 mm, depending on gender), so this very-definitely-not-consumer camera looks right about what you'd expect for human eyes. Keep in mind what matters is the distance between the centers of the lenses, not the edges.

As far as consumer lenses like the recently released Canon one: they need to attach to a single body with an existing, relatively small sensor, and both lenses need to share the small surface area of that sensor. Without doing some crazy periscope thing to get the lenses farther apart, there's a pretty tight physical constraint on what you can do, so I guess manufacturers just go with "eh, it's far enough to get some stereo separation".

I'm actually a bit curious about the results from cameras with too-close lenses: I would think distance would look subtly compressed, since there would be less parallax than your eyes naturally have, so closer objects would appear to be farther away than when they were shot. I'm going to guess that's why you can get away with it--most people shoot things that aren't really close to the lens, so the reduction in depth is less noticeable when you view it. There are probably also postprocessing tricks you can do to simulate some of what's lost.
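
If you want to put rough numbers on that intuition, here's a quick back-of-envelope in Python. It uses a simple pinhole model, and the 20 mm baseline is just a made-up stand-in for a tightly spaced consumer camera, not any specific product:

```python
import math

# Simple pinhole back-of-envelope: how much angular disparity does a
# point at a given distance produce for a given lens separation?
def disparity_deg(baseline_m: float, distance_m: float) -> float:
    """Convergence angle between the two views of a point."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))

HUMAN_IPD = 0.063        # ~63 mm average interpupillary distance
NARROW_BASELINE = 0.020  # hypothetical ~20 mm consumer-camera spacing

for d in (0.5, 2.0, 10.0):
    # Textbook scaling result: shoot with baseline b, view with IPD e,
    # and the scene reads as pushed back by roughly a factor of e/b.
    apparent = (HUMAN_IPD / NARROW_BASELINE) * d
    print(f"{d:5.1f} m: human {disparity_deg(HUMAN_IPD, d):.3f} deg, "
          f"narrow {disparity_deg(NARROW_BASELINE, d):.3f} deg, "
          f"apparent ~{apparent:.1f} m")
```

The "apparent" column is the compressed-depth effect described above: with a baseline narrower than your IPD, everything reads as pushed back, and the closer the subject, the more noticeable it gets.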
 
You obviously don’t own a Vision Pro. If you want to view 2D content and interact with someone else in the same room, that’s fine. But if you want to feel like you’re actually in the action or at an event, there is nothing like Immersive Video in AVP.

No I don't, but I have used one a few times. I'll spend the money to go to the actual event.

I'm too social to spend time behind a pair of goggles.
 
It is designed to capture content with a resolution of 8,160 x 7,200 per eye...
Obviously there's future-proofing and movie-theater-sized 8K projection built into a pro camera like this, but I'm really curious how (or if) cropping works (given that Apple Vision Pro is on the order of half the pixel dimensions of these sensors).

With a still or single-lens video camera if you're shooting at 8K+ and displaying in 4K, you have the option of downscaling or cropping to the part of the frame you actually want to use, so it buys you a bunch of flexibility in post-production if your shot wasn't framed quite right.

With a stereo camera I would assume that cropping in the middle of the frame works fine, but without really knowing the details of the optics and playback my gut feeling is you might get some kind of weird, nausea-inducing parallax distortion if you cropped off center.

Maybe not, though--anybody with better optics/VR knowledge have an idea of whether this works in practice?
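
For what it's worth, here's the pixel arithmetic for a centered vs. off-center crop. The per-eye display resolution below is a rough community estimate for the Vision Pro, not an official spec, and this ignores the fisheye projection (a rectangular pixel crop isn't exactly a rectilinear reframe):

```python
# Pixel arithmetic for cropping an 8160x7200-per-eye frame down to a
# headset-sized window.
SRC_W, SRC_H = 8160, 7200
DST_W, DST_H = 3660, 3200  # assumed per-eye panel resolution

def crop_origin(offset_x: int = 0, offset_y: int = 0):
    """Top-left corner of a DST-sized crop, offset from frame center.
    The same offset must go to BOTH eyes' frames: shifting the windows
    together just re-aims the view and leaves relative left/right
    disparities alone, while shifting them by different amounts changes
    the convergence -- the likely source of any nausea."""
    x = (SRC_W - DST_W) // 2 + offset_x
    y = (SRC_H - DST_H) // 2 + offset_y
    assert 0 <= x <= SRC_W - DST_W and 0 <= y <= SRC_H - DST_H
    return x, y

print(crop_origin())               # centered: (2250, 2000)
print(crop_origin(offset_x=1500))  # re-framed toward one side: (3750, 2000)
```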
 
Mhm... mixed feelings about that one...
So is it gonna be "fixed direction with 180° FOV", i.e. immersive video, or look around inside the 180° with a smaller FOV? The AVP is 100-110° iirc, so I guess a little bit of look-around?
Anyway, ultra-wide lenses are not known for their superior optical qualities. Also, focus might be a problem. At the least, working with depth of field could be tricky/awkward, as when looking around it doesn't refocus on the area you're looking at (the way the human eye does when you look somewhere else).
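
The look-around headroom is easy to ballpark (the 100° figure is a rough estimate for the AVP, not an official spec):

```python
CAPTURE_FOV = 180  # degrees, per the Blackmagic lens spec
HEADSET_FOV = 100  # rough figure for the AVP; estimates vary

# With the footage locked to the world, you can turn your head until the
# edge of the headset's view reaches the edge of the captured hemisphere.
headroom = (CAPTURE_FOV - HEADSET_FOV) / 2
print(f"~{headroom:.0f} degrees of look-around to each side")  # ~40
```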

Meanwhile, for actual VR content, there are 360° cameras that allow full look-around. Naturally, these are fixed-focus:

If you want actual look-around VR content you need to capture a few more angles... like this device from 2017:
[image attachment]

or https://www.insta360.com/product/insta360-pro

Or, for capturing more than just the horizontal, a rig looks like this:
[image attachment]


In a nutshell, I don't see the Blackmagic being used for movies or actual VR content...
My guess is this camera will primarily be useful in very specific scenarios like documentaries that want to capture the subject but also capture the scenery in detail (and preferably don't require much post-processing).

The Field of View makes ALL the difference.

All these failed 3D attempts on flat screens. The biggest jump in camera innovation has come only after the AVP was produced. Haters gonna hate, but Apple have done it again because everyone is getting on board.
FOV alone doesn't get you a stereo image... stereoscopy is what delivers the perceived depth.
As for innovation: what happened is that the attention of the mass media jumped in that direction, so the perception changed.
The AVP can be a game-changer, yes... but stereoscopic video is not it, simply because stereoscopic videos, no matter how good, don't justify a VR headset priced at $3k+. So that's not what's going to make the big impact.
 
I'm actually a bit curious about the results from cameras with too-close lenses: I would think distance would look subtly compressed, since there would be less parallax than your eyes naturally have, so closer objects would appear to be farther away than when they were shot. I'm going to guess that's why you can get away with it--most people shoot things that aren't really close to the lens, so the reduction in depth is less noticeable when you view it. There are probably also postprocessing tricks you can do to simulate some of what's lost.
It's the opposite of what you're saying. You want the lenses farther apart when shooting things farther away. The only reason I'd want the lenses closer together than average human eyes would be to do close up photography of smaller items.
Obviously there's future-proofing and movie-theater-sized 8K projection built into a pro camera like this, but I'm really curious how (or if) cropping works (given that Apple Vision Pro is on the order of half the pixel dimensions of these sensors).

With a still or single-lens video camera if you're shooting at 8K+ and displaying in 4K, you have the option of downscaling or cropping to the part of the frame you actually want to use, so it buys you a bunch of flexibility in post-production if your shot wasn't framed quite right.

With a stereo camera I would assume that cropping in the middle of the frame works fine, but without really knowing the details of the optics and playback my gut feeling is you might get some kind of weird, nausea-inducing parallax distortion if you cropped off center.

Maybe not, though--anybody with better optics/VR knowledge have an idea of whether this works in practice?
This Blackmagic camera has lenses with a 180° FOV. The Vision Pro FOV is closer to 100°. Also, the Vision Pro has higher PPD in the center than the edges of the image. So the PPD will be similar between the two if you are displaying things at a 1:1 scale.
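
Putting rough numbers on that (the headset figures are community estimates, not Apple specs):

```python
# Rough pixels-per-degree (PPD) comparison.
def avg_ppd(pixels: int, fov_deg: float) -> float:
    """Average pixels per degree across the FOV (ignores lens
    distortion and any center-weighted optical design)."""
    return pixels / fov_deg

camera = avg_ppd(8160, 180)   # Blackmagic: 8160 px across 180 degrees
headset = avg_ppd(3660, 100)  # assumed ~3660 px across ~100 degrees
print(f"camera ~{camera:.0f} PPD, headset ~{headset:.0f} PPD average")
# -> camera ~45 PPD, headset ~37 PPD average; with the Vision Pro's
#    center-weighted optics, center PPD lands near the source footage.
```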
 
It's the opposite of what you're saying. You want the lenses farther apart when shooting things farther away. The only reason I'd want the lenses closer together than average human eyes would be to do close up photography of smaller items.

This Blackmagic camera has lenses with a 180° FOV. The Vision Pro FOV is closer to 100°. Also, the Vision Pro has higher PPD in the center than the edges of the image. So the PPD will be similar between the two if you are displaying things at a 1:1 scale.

So there are a lot of people here who know the physics better than I do (like you, for example), but at the end of the day I am an empirical scientist, not a theoretician, and all I know is that it makes sense you would need reasonable separation to get good stereo... but (and here comes the but): Exhibit A, the iPhone 15 Pro Max can take good spatial photos with small lenses that are close together, so Apple must be doing something computationally. My only problem with spatial photos is that they are relatively small. Exhibit B, in the visionOS 2 beta Apple has introduced converting old 2D photos into amazing spatial photographs that really pop... so again, they are doing something computationally.

So in the end, and I am not meaning to argue with you at all, recognizing your knowledge, but this seemed as good a place to comment as any: I am just going to wait and test out these new camera systems designed for Apple Spatial before reaching any conclusions.
 
If only Apple allowed you to consume adult material on the Vision Pro…

Apple can't and doesn't block what you download to your Vision Pro. Or so a friend told me. They don't allow out-and-out pornography on the App Store, but the same friend says: so what? No shortage of sources. So I hear.
 
Is it just me, or are all these consumer cameras putting the sensors way too close to each other? What are the 180-degree stereoscopic production studios using?
It might look like they are too close together, but remember those lenses are much bigger than your eyeballs, so they 'look' closer.
The distance between people's eye CENTERS is between 60-65mm (about 2-1/3" to 2-1/2").
 
I'd love to bolt this into a race car...

Yes, this camera inside the most popular teams' cars at Le Mans, Daytona, Indy... short tracks like North Wilkesboro, Five Flags... the Chili Bowl...

Watching the race feeling like you're inside the car... yes... this has real potential.
3D vision would finally make racing games feel realistically like driving a car. 2D vision is a massive limitation of all FPV games.
 
2 hours of video on the integrated 8TB drive. Whew. I thought Apple ProRes was a lot to deal with.
Well damn, shame about the price and size limitations of Mac SSDs... I guess you'd need a bunch of cheap external drives to store your library, huh.
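
A quick sanity check on what "2 hours on 8 TB" implies:

```python
# What does "2 hours on 8 TB" imply for sustained data rate?
TB = 1e12                    # drive vendors use decimal terabytes
capacity_bytes = 8 * TB
seconds = 2 * 60 * 60

gb_per_s = capacity_bytes / seconds / 1e9
print(f"~{gb_per_s:.1f} GB/s (~{gb_per_s * 8:.0f} Gbit/s) sustained")
# -> ~1.1 GB/s (~9 Gbit/s): beyond SATA SSD speeds and close to
#    saturating a 10 GbE link, so external storage takes planning.
```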
 
That's a huge bet someone is making. Unless you already have a plan for how you're recouping your investment, that's a risky spec purchase, especially with anything tech-related.
They could possibly repurpose it for regular 3D content using different lenses.
 
So there are a lot of people here who know the physics better than I do (like you, for example), but at the end of the day I am an empirical scientist, not a theoretician, and all I know is that it makes sense you would need reasonable separation to get good stereo... but (and here comes the but): Exhibit A, the iPhone 15 Pro Max can take good spatial photos with small lenses that are close together, so Apple must be doing something computationally. My only problem with spatial photos is that they are relatively small. Exhibit B, in the visionOS 2 beta Apple has introduced converting old 2D photos into amazing spatial photographs that really pop... so again, they are doing something computationally.

So in the end, and I am not meaning to argue with you at all, recognizing your knowledge, but this seemed as good a place to comment as any: I am just going to wait and test out these new camera systems designed for Apple Spatial before reaching any conclusions.
I viewed some spatial video I captured on an iPhone 15 Pro and I was not impressed unless the subject was within a couple feet of the camera (though I didn't use the Vision Pro to view them, I used a different VR setup). Maybe for still photos they are handling it differently.

Even if Apple is doing some computation in post, well, AI can also do a decent job of colorizing black and white photos, but it's still better to capture the original color.

I understand Apple doing the best they can with what they have (a preexisting iPhone with closely-spaced cameras), but I just don't understand making a purpose built tool with the same limitations.

I haven't seen the conversions in visionOS 2, but I'd imagine it would have trouble with foliage, light shafts, floating particles, transparency, chain-link fences, etc.
It would be fun to compare a stereo image shot with correct spacing and a 3D conversion of one of those images.
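
Apple hasn't said how the visionOS 2 conversion works, but the generic textbook technique is monocular depth estimation followed by depth-image-based rendering (DIBR): shift each pixel sideways by a disparity derived from its estimated depth, then inpaint the holes. A toy NumPy sketch (assumes you already have a normalized depth map from some estimator; this is the generic method, not Apple's actual pipeline):

```python
import numpy as np

def synthesize_right_eye(image: np.ndarray, depth: np.ndarray,
                         max_disp_px: int = 24):
    """Toy DIBR warp: shift each pixel left by a disparity proportional
    to its normalized depth (0 = far, 1 = near). Ignores proper
    occlusion ordering and does no inpainting."""
    h, w = depth.shape
    right = np.zeros_like(image)
    hole = np.ones((h, w), dtype=bool)   # True where no pixel landed
    disp = (depth * max_disp_px).astype(int)
    xs = np.arange(w)
    for y in range(h):
        tx = xs - disp[y]                # nearer pixels shift farther
        ok = (tx >= 0) & (tx < w)
        right[y, tx[ok]] = image[y, ok]
        hole[y, tx[ok]] = False
    # The unfilled holes are disocclusions: spots the second eye should
    # see but the original photo never captured -- exactly where foliage,
    # fences, hair, and transparency break down.
    return right, hole
```

That hole mask is why the subjects listed above are the hard cases: the synthesized view needs pixels that simply aren't in the original photo.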

I'm not an expert or professional with any of this, but I just have a strong interest in vision and how it works, and displays, cameras, VR, optical illusions, etc.
 
That Alicia Keys part of the Vision Pro store demo, where she stands in front of you and sings right at you. That was the first time my brain was ever tricked into thinking that a virtual object in front of me is actually real. The video resolution, the stereoscopic effect with object separation, and spatial audio just made it too real.

I can't imagine the kind of camera and mic setup that was needed to capture it, but I hope that this Blackmagic spatial cam is close to it.
The first for me was when the dinosaur comes up to sniff you... I actually took a couple of steps back.
 
This ain’t no 3D.
If sports, events, etc. could be streamed with this technology, it would make even the current-gen Vision Pro an order of magnitude more attractive.

It would be the famous killer app.
There are some new sports samples in the Immersive Video section on Apple TV+ (basketball, football).
They are only a couple seconds long, but pretty spectacular. For football, I don't know how well it would work for a whole game, since the experience is so much more impressive up close. Maybe more for sideline cams showing replays. Could be pretty good using the skycam, but... vertigo?
 