


All four iPhone 13 models feature a new Cinematic mode that lets users record video with a shallow depth of field and automatic focus changes between subjects, and TechCrunch's Matthew Panzarino spoke with Apple marketing executive Kaiann Drance and designer Johnnie Manzari to learn more about how the feature was developed.

[Image: iPhone 13 Cinematic Mode]

Drance said Cinematic mode was more challenging to implement than Portrait mode for photos given that rendering autofocus changes in real time is a heavy computational workload. The feature is powered by the A15 Bionic chip and the Neural Engine.
We knew that bringing a high quality depth of field to video would be magnitudes more challenging [than Portrait Mode]. Unlike photos, video is designed to move as the person filming, including hand shake. And that meant we would need even higher quality depth data so Cinematic Mode could work across subjects, people, pets, and objects, and we needed that depth data continuously to keep up with every frame. Rendering these autofocus changes in real time is a heavy computational workload.
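The per-frame depth requirement Drance describes can be pictured with a toy model (this is not Apple's pipeline; the function names and the aperture scaling are made up for illustration): each pixel is assigned a blur radius that grows with its distance from the focus plane, roughly like a lens's circle of confusion, and that whole map has to be recomputed for every frame as subjects and the camera move.

```python
# Toy sketch of depth-based synthetic blur (illustrative only, not
# Apple's implementation). A real renderer would then apply a
# variable-width blur per pixel using these radii.

def blur_radius(depth, focus_depth, aperture=2.0, max_radius=8.0):
    """Blur radius in pixels for one depth sample (metres):
    proportional to distance from the focus plane, clamped."""
    return min(max_radius, aperture * abs(depth - focus_depth))

def depth_to_blur_map(depth_map, focus_depth):
    """depth_map: 2-D list of per-pixel depths for ONE frame.
    Returns the per-pixel blur radii; video needs this every frame."""
    return [[blur_radius(d, focus_depth) for d in row] for row in depth_map]
```

Even this toy version makes the quote's point concrete: a 30 fps clip needs a fresh depth map and blur map 30 times a second, which is why the real-time version is a heavy computational workload.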
Manzari added that Apple's design team spent time researching the history of filmmaking and cinematography techniques for realistic focus transitions.
When you look at the design process, we begin with a deep reverence and respect for image and filmmaking through history. We're fascinated with questions like what principles of image and filmmaking are timeless? What craft has endured culturally and why?

Manzari said Apple observed directors of photography, camera operators, and other filmmaking professionals on sets to learn about the purpose of shallow depth of field in storytelling, which led Apple to realize the importance of guiding the viewer's attention.

The full interview goes into more detail about the work that went into Cinematic mode and highlights Panzarino's testing of the feature at Disneyland.

Article Link: Apple Discusses How it Created the iPhone 13's Cinematic Mode
 
It still doesn’t look great; watch Joanna Stern's review. All that blur around the subject. It has a way to go. Just consider it the new Animoji, but one with potential. Hopefully Apple will perfect it with software and not require a new iPhone to get better quality. Also, watching Zollotech's review, I still don’t see the 35 mm film quality.
 
This is all marketing BS anyway. Apple, Google, and the other smartphone manufacturers all do this: they take their lackluster smartphone cameras, put them in the hands of professional videographers in expensive studio locations, then post-process the bejeezus out of the video with a battery of expensive computers and software. First, those videos look crappy in the TV commercials; almost every dark scene has the subject front-lit with studio lighting, which is cheating IMO. Second, the average person will never be able to grab their smartphone and produce anything like that. It's total BS, putting lipstick on a pig. If you want to create TV-quality video, mortgage your house and shell out a small fortune for good equipment; don't buy a smartphone to attempt it. Or you can drink their Kool-Aid and be disappointed. A smartphone means low-quality images and video for social media, and smart advertising won't change that.
 
"It still doesn’t look great, watch Joanna Stern's review. … I still don’t see the 35 mm film quality."
It appears they need the new chip for the processing power. It is a software effect, so it will not work on older models. In any case, these small lenses cannot create depth of field on their own the way real cameras do.
 
"It still doesn’t look great, watch Joanna Stern's review. … I still don’t see the 35 mm film quality."
Watch the Julia Wolf music video. Cinematic mode is new software, but you can get controllable depth of field in practice. Once more talented people get a handle on it, you'll see it used very well. It'll never be perfect (it's a thin smartphone controlled by software), but Portrait mode also had a rocky start.
 
"This is all marketing BS anyway. Apple, Google, and other smartphone manufacturers all do this. … Smart advertising won’t change that."
More or less the truth... There is one real advantage to a smartphone, though: you always have it with you, period.
 
I'm impressed that the focus data in Cinematic mode is also stored within the video file, allowing you to create rack-focus effects in post-production. IMO, that's a lot more useful than trying to nail them on the fly. Outside of better zoom, this is what I've been waiting for on the iPhone. When I watched Marques demonstrate it, I wasn't impressed, but he was really close to the objects he was testing, and according to this article Cinematic mode works best from several feet away. I'm eager to try it out for myself.
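A rough sketch of why stored focus data makes rack focusing in post easy (hypothetical data layout and function, not Apple's actual file format or API): if each frame carries a focus distance, re-racking after the fact amounts to replacing that track with an interpolation between new keyframes, then re-rendering the blur from the stored depth.

```python
# Hypothetical post-production rack focus: rebuild a per-frame focus
# track by linearly interpolating between user-chosen keyframes.

def rack_focus(n_frames, keyframes):
    """keyframes: {frame_index: focus_distance_in_metres}.
    Returns one focus distance per frame, clamped outside the
    first/last keyframe and linearly interpolated between them."""
    ks = sorted(keyframes.items())
    track = []
    for f in range(n_frames):
        if f <= ks[0][0]:
            track.append(ks[0][1])      # before first keyframe
        elif f >= ks[-1][0]:
            track.append(ks[-1][1])     # after last keyframe
        else:
            for (f0, d0), (f1, d1) in zip(ks, ks[1:]):
                if f0 <= f <= f1:
                    t = (f - f0) / (f1 - f0)
                    track.append(d0 + t * (d1 - d0))
                    break
    return track

# e.g. a 5-frame rack from a subject at 1 m to one at 3 m:
# rack_focus(5, {0: 1.0, 4: 3.0})
```

Because the depth and focus data travel with the clip, none of this requires reshooting; the blur is simply re-rendered against the new track.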
 
The background blur is computed by the chip; it is not "true" optical blur, just a special effect, so it could be done on the iPhone 12 if Apple wanted to enable it. The new chip is not 100% faster than the previous one, not even 30% faster. What is the excuse?
Even if it is not natural, it remains to be seen how good it is. Maybe they implemented some sort of new ranging system that works alongside it to recreate the blur effect.
 
"The CPU is making the background blur, is not 'true', just a special fx, so it could be done in iPhone 12 … What is the excuse?"
Not saying the A14 can't do it, but it's handled by the Neural Engine, not the CPU, and the A15 claims to have an "all-new 16-core Neural Engine" over the previous SoC.
 
"It still doesn’t look great, watch Joanna Stern's review. … I still don’t see the 35 mm film quality."
That's because they advertise edited footage that people assume they can pull straight off the phone, and that's almost never the case.
 
So all this simply means that the depth of field is "artificial", or at least it sounds that way. It is more an effect than the product of a wide-open lens creating a very tight depth of field.
Yes, it is artificial, but anything but simple. The footage I've seen is impressive for its ability to mimic the focus effect, but it doesn't convince me yet. I'm sure future phones will ramp up to 4K with even more convincing depth of field. One thing's for certain: computational videography and photography are here to stay, and they will allow for creative uses we cannot even imagine yet.
 
"This is all marketing BS anyway. Apple, Google, and other smartphone manufacturers all do this. … Smart advertising won’t change that."
So what.

Does anyone really believe that buying that sports car will make them a better driver, or that that kitchen gadget will make them a chef? Of course not, but professional drivers and chefs will take their money and help the marketing people give us what we want. And we want to be seduced....

We don't have our own TV studio, so the fact it's not the same as pro gear makes no difference. Most people have no want or need for pro gear anyway. What people want is a good enough image to share with their friends, so they can see it on the 6" screen of their phone. This particular feature will appeal to some and will allow them to justify the purchase to themselves. For others, it may be longer battery life or whatever. None of us need this stuff, but we do want it.

It's been going on since the dawn of advertising and this is no different.

Look what happened to Nike once they got Michael Jordan on board. It was another shoe, just like all the others they had made, but suddenly the target market was seduced and Nike went massive.
 
Ok, but WHY did they create this?

My best guess: to add a new feature that makes the 13 sound like less of a negligible update over the 12. Sometimes it'd be great if Apple took a year out and came back with more impressive tech instead of all these gimmicky increments.
 
"This is all marketing BS anyway. Apple, Google, and other smartphone manufacturers all do this. … Smart advertising won’t change that."
Actually, Moment sells accessories that will help an amateur get better lighting and produce videos that look good, and it occasionally runs programs you can donate to that help artists afford equipment. Would they help someone produce results that pass as what you think of as TV quality? I highly doubt it.

But if you’ve got kids who are in film classes and they need to at least practice applying some of the principles that pros with the house-mortgage expensive equipment use, it’s not a bad start. They may even be able to produce sellable results that will help them on their way to true professional grade stuff.

Not that any of this stuff is cheap, and Cinematic mode is far from polished, but it's more accessible than pro-grade equipment. And that's important for learning the principles and processes that will help make the PERSON a pro someday.
 