The look became a bit of a fad after some TV shows started recording with DSLRs and incorporating the effect. It takes big, expensive lenses to do well, so the public was convinced it was a "professional" look. I agree with you for the most part: it's a look that has its place in a small percentage of shots. But this cheap, second-rate imitation is going to be abused to death and become a flash-in-the-pan fad. When every trashy photo uses it, the "professional" reputation isn't going to last.

You need two cameras because there is parallax information between the two images that can be used to judge depth and isolate the foreground from the background.
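For the curious, here's a minimal sketch of that parallax-to-depth relation, using the simple pinhole stereo model. The focal length and baseline below are made-up illustrative values, not Apple's actual camera parameters:

```python
# Minimal pinhole-stereo sketch: parallax (disparity) -> depth.
# All numbers are illustrative assumptions, not Apple's camera specs.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Two parallel cameras a baseline B apart: depth Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

FOCAL_PX = 3000.0    # assumed focal length, expressed in pixels
BASELINE_M = 0.010   # assumed ~10 mm spacing between the two lenses

# A nearby subject shifts more between the two views than the background,
# which is exactly the cue used to separate foreground from background:
for disparity in (60.0, 15.0, 3.0):
    z = depth_from_disparity(FOCAL_PX, BASELINE_M, disparity)
    print(f"parallax shift {disparity:4.0f} px -> depth {z:5.2f} m")
```

Once you have per-pixel disparity, you effectively have a depth map, and the foreground mask falls out of it.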

LOL

Shallow depth of field has been around much longer than television has even existed! And television (and movie) cameras have certainly had the ability to produce shallow depth of field since long before any of them started using DSLRs for shooting.

It is not a "gimmick" or a "fad". It's a photographic _technique_ that, when used properly, can enhance the artistry of a photograph.

Most commonly it can be used to isolate a subject. This can obviously be used to call attention to the subject and, naturally, bring the viewer's eye to it. It can also be used in many more subtle ways, like giving a _feeling_ of isolation to the viewer, or even a feeling of dread by obscuring some foreboding presence in the background.

Photography is NOT just about "capturing a scene". It's about creating art. And just as painters have many techniques for conveying emotion/mood/feeling or emphasizing a subject, etc., so do photographers.

A short depth of field is just one of those techniques. It will continue to be used (and I'm sure, at times, abused) by anyone creating art with a lens and sensor.

Personally, I'm happy to have this option on the device in my pocket.

Just because I don't have my huge camera with me... doesn't mean that I don't see things that I want to capture artistically... and, if it would enhance the photo, it's nice to have the option of a "short depth of field".
 
As a consumer I think this is a nice feature.

As a photographer I think this is taking some of the 'art' out of photography.
 
As a photographer, the art of creating a compelling image comes from a photographer's life experiences, imagination, curiosity, skill, judgment, the ability to see and read light, and ultimately the ability to take the scene in front of you and create a composition that stirs a viewer's mind, evoking a reaction. There are many decisions involved, most having little to do with gear technicals.

Employing shallow depth of field (or not) is just one of many decisions to be made in the creation process. It's a "tool" to draw upon when desired.

During several long-term periods, including the present, my daily camera has been my phone. Having shallow-DOF ability in a phone is great, even though it doesn't have the same strength as an f/1.4 lens on a large-sensor camera.
 
You are right. Although I am not a photography expert, here are some pictures:

http://www.windowsphonearea.com/tune-focus-using-nokia-camera/

Yeah, the key here is that the blur effect you can get from a sensor/lens combo depends on multiple factors. One is distance to your subject: the closer you are, the stronger the blur you can achieve. The Nokia camera there doesn't really get a strong blur, and to get what it does, the front subject is basically inches away from the camera. With a good lens on a DSLR, the blur would be far more pronounced in the same situation. Move the camera back to the distance you would use for portraits, and you will get little to no bokeh.

That example really just shows that you can find the limits of the depth of field of a smartphone camera, but the depth is still huge, even when trying to create a macro shot. Here's a better example of the bokeh you can get with very close focus on a larger sensor. Note how about the only way you can tell the snake's own body from the background is the color difference; if this were B&W, it would be even harder. The distance between the in-focus head and the body is maybe 1-2 feet.

[Image: DSC05923.jpg]
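To put rough numbers on the distance effect described above, here's a back-of-the-envelope sketch using the standard thin-lens approximation for the blur disc of a background at infinity. Every camera spec below is an illustrative assumption (a phone-style tele module vs. a fast lens on full frame), not measured data:

```python
# Blur-disc diameter on the sensor for a background at infinity, when
# focused on a subject at distance s:  b ~= f^2 / (N * (s - f)).
# Illustrative camera specs; minimum focus distances are ignored.

def blur_disc_mm(focal_mm: float, f_number: float, subject_m: float) -> float:
    s_mm = subject_m * 1000.0
    return focal_mm ** 2 / (f_number * (s_mm - focal_mm))

CAMERAS = {
    # name: (focal length mm, f-number, sensor width mm)
    "phone tele, f/2.8":      (6.6, 2.8, 4.8),
    "full frame 56mm, f/1.8": (56.0, 1.8, 36.0),
}

for subject_m in (0.3, 2.0):  # inches-away macro vs. portrait distance
    for name, (f, N, width) in CAMERAS.items():
        b = blur_disc_mm(f, N, subject_m)
        # express blur relative to frame width so the formats compare fairly
        print(f"{name:24s} subject at {subject_m} m: "
              f"background blur ~ {100 * b / width:.2f}% of frame width")
```

Run it and the phone's background blur nearly vanishes by portrait distance, while the larger format keeps a strong effect: the same pattern as the snake shot.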

Why can't people just be chill? As a photographer, I think it's amazing that these small-sensor cameras on mobile phones are pushing the limits of what can be done by something that fits in a jeans pocket.

Apple has done a great job of creating a really nice out-of-focus area when it gets it right. Obviously not every instance in which this "mode" is used will work well, but in the hands of people who understand photography and depth of field and can appreciate bokeh, this is exciting AF.

I sure am gonna have loads of fun with this.

Yup, it is impressive; it's the best imitation of bokeh I've seen from a smartphone to date. And it is a good feature to use when it is applicable. But there are still important gaps that make it different enough from optical bokeh to be almost uncanny valley for me in certain situations, even when it works. It's just very good at not making it obvious, which I like.
 
LOL

Shallow depth of field has been around much longer than television has even existed! And television (and movie) cameras have certainly had the ability to produce shallow depth of field since long before any of them started using DSLRs for shooting.

It is not a "gimmick" or a "fad". It's a photographic _technique_ that, when used properly, can enhance the artistry of a photograph.

Most commonly it can be used to isolate a subject. This can obviously be used to call attention to the subject and, naturally, bring the viewer's eye to it. It can also be used in many more subtle ways, like giving a _feeling_ of isolation to the viewer, or even a feeling of dread by obscuring some foreboding presence in the background.

Photography is NOT just about "capturing a scene". It's about creating art. And just as painters have many techniques for conveying emotion/mood/feeling or emphasizing a subject, etc., so do photographers.

A short depth of field is just one of those techniques. It will continue to be used (and I'm sure, at times, abused) by anyone creating art with a lens and sensor.

Personally, I'm happy to have this option on the device in my pocket.

Just because I don't have my huge camera with me... doesn't mean that I don't see things that I want to capture artistically... and, if it would enhance the photo, it's nice to have the option of a "short depth of field".

ROFL

Historically it was hard to get a deep enough depth of field. Film used to be much lower resolution, which forced larger sheet sizes and longer lenses, and it was far less sensitive, which forced wider apertures. Shallow depth of field is how photography started; decades of advances in better film are what led to usable depth of field.

Now, fast forward to more recent times: everyone was used to a nice deep depth of field most of the time. Shallow portraits were always a niche thing for professionals, but at no point in photographic history did consumer-oriented cameras do it well. The "look" only took off big time recently, when certain TV shows started using it, and to do it they were shooting with DSLR cameras.

As far as a gimmick or a fad goes, can you name a single digital filter "look" that was ever not a fad? They all get very popular, very overdone, and then a backlash knocks them out of fashion insanely fast.

Remember when we'd desaturate most of a picture, leaving a small area oversaturated? Remember the filters that turned every point source of light into a star? The list is endless. And every single one was a passing fad once it became easy to do. But I guess you know best, and this will be the one time that's different from every other "look" that made it huge for a couple of years. Or can you name one single digital photo enhancement filter that became easily and widely available, became hugely popular, and didn't fizzle away as a passing fad? :rolleyes:
 
Color management is not possible? Everything you said, except that the A10 is the highest-performing CPU, has been ********, including color management. http://spyder.datacolor.com/blog/2013/07/26/datacolor-announces-spydergallery-for-android/

The issue with that example is that it is about calibrating and then using that calibration for management in one app. Starting with iOS 9 and the iPad Pro 9.7", Apple is enabling color management system-wide. So when building an app UI, if I want, I can design it such that the colors are all mapped using sRGB and will be displayed correctly on wider-gamut displays. With the latest update, Lightroom Mobile is also using this to display photos more accurately during editing on wide-gamut devices (iPad Pro 9.7 & iPhone 7).

This stuff isn't baked into Android. Samsung's display modes get close, but apps can't seem to tap into it to make it seamless. The user has to switch into the appropriate mode and assume the app doesn't try to undermine it in some way by clipping everything to sRGB.
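As a rough illustration of what system-wide management has to do, here's a sketch of the underlying color math: decode sRGB content to linear light, pass through XYZ, and re-encode in Display P3 coordinates, so sRGB content looks correct on the wider-gamut panel. The matrices are the standard sRGB and Display P3 (D65) primaries, rounded; this is the general technique, not Apple's actual implementation:

```python
# Sketch: mapping sRGB content into Display P3 so it displays correctly
# on a wide-gamut screen, rather than being reinterpreted and oversaturated.
import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])

P3_TO_XYZ = np.array([[0.4866, 0.2657, 0.1982],
                      [0.2290, 0.6917, 0.0793],
                      [0.0000, 0.0451, 1.0439]])

def to_linear(v):
    """sRGB and Display P3 share the same transfer curve."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def to_encoded(v):
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def srgb_to_display_p3(rgb):
    xyz = SRGB_TO_XYZ @ to_linear(rgb)
    p3_linear = np.linalg.inv(P3_TO_XYZ) @ xyz
    return to_encoded(np.clip(p3_linear, 0.0, 1.0))

# Pure sRGB red: fed to a P3 panel unmanaged it reads as "full red" and
# looks oversaturated; managed, it maps to a partial P3 value that matches.
print(srgb_to_display_p3([1.0, 0.0, 0.0]))  # ~[0.917, 0.200, 0.139]
```

The point of doing this system-wide is that every app gets the mapping for free instead of each one reimplementing (or ignoring) it.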
 
Or, in the settings, they should let us customize which options are visible in the camera app. For example, I only use Photo, Video, and rarely Slo-Mo, so 3 of them are just pointless to me and make me swipe unnecessarily.

Same thing for the Control Centre: I only use the Wi-Fi and Bluetooth switches. The other 3 are just sitting there, wasting space. And now there's also Night Shift, which I also don't use and which takes up even more space.

I still find it strange that Apple hasn't opened up these customizable features to give us more control over what we want and don't want to see on our phones.

You can remove the calculator icon from Control Center by uninstalling the built-in calculator app. Doesn't gain you anything, but you can do it.

I think Apple tries to avoid having customization features like that, because sometimes tech-savvy people will take the device of a non-tech-savvy person and change settings as a prank. Then it behaves oddly and the pranked person blames it on Apple.

I saw that kind of thing all the time in high school - I never did it myself.
 
ROFL

Historically it was hard to get a deep enough depth of field. Film used to be much lower resolution, which forced larger sheet sizes and longer lenses, and it was far less sensitive, which forced wider apertures. Shallow depth of field is how photography started; decades of advances in better film are what led to usable depth of field.

Now, fast forward to more recent times: everyone was used to a nice deep depth of field most of the time. Shallow portraits were always a niche thing for professionals, but at no point in photographic history did consumer-oriented cameras do it well. The "look" only took off big time recently, when certain TV shows started using it, and to do it they were shooting with DSLR cameras.

As far as a gimmick or a fad goes, can you name a single digital filter "look" that was ever not a fad? They all get very popular, very overdone, and then a backlash knocks them out of fashion insanely fast.

Remember when we'd desaturate most of a picture, leaving a small area oversaturated? Remember the filters that turned every point source of light into a star? The list is endless. And every single one was a passing fad once it became easy to do. But I guess you know best, and this will be the one time that's different from every other "look" that made it huge for a couple of years. Or can you name one single digital photo enhancement filter that became easily and widely available, became hugely popular, and didn't fizzle away as a passing fad? :rolleyes:

The thing is: this technique has already withstood the test of time. As you point out, it's been with us since the beginning (indeed, even our eyes generate good bokeh) and it's still very popular today. There are still raging debates on any photography forum about which lenses generate the most pleasing bokeh, and every time a new fast lens comes out, reviewers write huge posts about the quality of its bokeh compared to the old lenses.

Will _this_ form of shallow depth-of-field simulation withstand the test of time? No idea. But the general idea of sharp focus on subjects with a gradually increasing blur into the background isn't going out of style any time soon.

EDIT: BTW... my own work is generally landscapes, where I'm looking to maximize depth of field without running into the diffraction limit. So I definitely understand that depth of field is a deliberate choice....
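Since the diffraction limit came up, here's a quick sketch of that landscape trade-off, using the common Airy-disc approximation and a conventional full-frame circle-of-confusion criterion (both illustrative assumptions):

```python
# Stopping down deepens depth of field, but past some f-number the
# diffraction (Airy) disc outgrows the acceptable circle of confusion.

WAVELENGTH_MM = 0.00055   # ~550 nm green light
COC_MM = 0.030            # common full-frame sharpness criterion

def airy_disc_mm(f_number: float) -> float:
    return 2.44 * WAVELENGTH_MM * f_number

for N in (5.6, 8, 11, 16, 22, 32):
    d = airy_disc_mm(N)
    verdict = "diffraction-limited" if d > COC_MM else "ok"
    print(f"f/{N:<4} Airy disc {d:.4f} mm  {verdict}")
```

With these assumptions the crossover lands just past f/22; stopping down further buys depth of field at the cost of overall sharpness.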
 
You can remove the calculator icon from Control Center by uninstalling the built-in calculator app. Doesn't gain you anything, but you can do it.

I think Apple tries to avoid having customization features like that, because sometimes tech-savvy people will take the device of a non-tech-savvy person and change settings as a prank. Then it behaves oddly and the pranked person blames it on Apple.

I saw that kind of thing all the time in high school - I never did it myself.

Hm, not a good reason tbh, as most phones are passcode-locked. If they didn't want people messing with other people's phones, they would also make you put in your code before you could use the camera.

About the calculator, it's not going anywhere, since I use that a lot. :p But yeah, I still wish they opened up the settings a bit more so we could configure things. It doesn't have to be Android-like customization, which is huge, but some control for the user would be very welcome.
 

Well, it's close, but not perfect. No, Apple isn't applying a Gaussian blur. The camera assesses the portrait subject and focuses the appropriate lens on that. Then the background is photographed intentionally out of focus (mechanically), and the two images are merged. Both versions are optical, and the only trickery involved is feathering the two images together.

Where it fails to be perfect is noticeable to a professional eye: the out-of-focus area doesn't have varying degrees of bokeh depending on the depth. This sample photograph demonstrates it well; there is one defined area of focus for the portrait subject, and then one consistent bokeh for all remaining areas. In other words, the fence should be more in focus where it is closer, and then fade to extreme blurriness in the far field. But with this trickery it doesn't do so, nor is that physically possible without software tricks.

This is a pleasant trick, but it still cannot replace a proper large-aperture camera for serious photography.
 
As nifty as the new "portrait mode" is.... it doesn't even kind of look like real DoF. It looks like simple post-processing blur.... which, of course, it is....
 
Well, it's close, but not perfect. No, Apple isn't applying a Gaussian blur. The camera assesses the portrait subject and focuses the appropriate lens on that. Then the background is photographed intentionally out of focus (mechanically), and the two images are merged. Both versions are optical, and the only trickery involved is feathering the two images together.

Where it fails to be perfect is noticeable to a professional eye: the out-of-focus area doesn't have varying degrees of bokeh depending on the depth. This sample photograph demonstrates it well; there is one defined area of focus for the portrait subject, and then one consistent bokeh for all remaining areas. In other words, the fence should be more in focus where it is closer, and then fade to extreme blurriness in the far field. But with this trickery it doesn't do so, nor is that physically possible without software tricks.

This is a pleasant trick, but it still cannot replace a proper large-aperture camera for serious photography.

You act like you know. But you don't.

That's not at all what Apple is doing.

And the blur is depth-dependent.

Here's a good writeup with examples showing the depth-dependent blur: https://techcrunch.com/2016/09/21/hands-on-with-the-iphone-7-plus-crazy-new-portrait-mode/3/
 
As the TechCrunch article states, software blurring is being used for the preview, and lens blur is being used for the background, just as I said.

About the 'greater and greater' amounts of blur being applied to layers, take a look at the image I referenced. There is only one layer of blur for anything other than the portrait. Regardless of what TechCrunch says, it is either not currently working, or it just doesn't work very well.

Regardless, Apple has done some very cool stuff here. I can't wait to see how it improves over time.
 
As a consumer I think this is a nice feature.

As a photographer I think this is taking some of the 'art' out of photography.
100% agreed. While this is a feature that sold me on the 7+, I also look at all of my camera equipment and think... well, ****.
 
As the TechCrunch article states, software blurring is being used for the preview, and lens blur is being used for the background, just as I said.

About the 'greater and greater' amounts of blur being applied to layers, take a look at the image I referenced. There is only one layer of blur for anything other than the portrait. Regardless of what TechCrunch says, it is either not currently working, or it just doesn't work very well.

Regardless, Apple has done some very cool stuff here. I can't wait to see how it improves over time.

Where do you see that?

The article states that:

"So, for instance, if the camera analyzes the scene and pins your subject at 8 feet away, it will slice the image and apply a blur effect on a progressive gradient scale across the other layers. Things that are very close to your subject may be sharp — included in that variable-width slice of the in-focus area. Once they get further away they get a little blur, then more, then more — until things in the far foreground or far background are blurred to the “maximum” level."

And from the photos it's obvious that it's a gradual blur as things move further away.

Look at the fence in this one...

[Image: scale.jpg]


And how do you explain the slowly increasing blur in this one:

[Image: bird1.jpg]



I don't see anywhere in there that it says "The iPhone takes a second blurry picture with the other camera and then the two are combined". That's not what's happening. It's using the two cameras to sense depth... and then applying progressive blur in up to 9 different levels based on the distance objects are from the focal plane.
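For anyone curious what the article's "slice the image and apply blur on a progressive gradient scale" idea looks like in code, here's a rough sketch of the general layered-blur technique. This is not Apple's actual pipeline; all inputs are hypothetical, it uses a Gaussian for simplicity, and a real implementation would feather the layer masks rather than hard-masking:

```python
# Depth-sliced progressive blur: quantize a depth map into bands and blur
# each band more the further it sits from the in-focus plane.
import numpy as np
from scipy.ndimage import gaussian_filter

def portrait_blur(image, depth_m, subject_m, n_layers=9, max_sigma=12.0):
    """image: HxWx3 float array; depth_m: HxW depth estimate in meters."""
    dist = np.abs(depth_m - subject_m)          # distance from focal plane
    dist = dist / max(dist.max(), 1e-6)         # normalize to [0, 1]
    layer = np.minimum((dist * n_layers).astype(int), n_layers - 1)

    out = np.zeros_like(image)
    for i in range(n_layers):
        sigma = max_sigma * i / (n_layers - 1)  # layer 0 stays sharp
        blurred = gaussian_filter(image, sigma=(sigma, sigma, 0)) if sigma else image
        out = np.where((layer == i)[..., None], blurred, out)
    return out

# Toy demo: random texture with a vertical depth ramp, subject at 1.5 m
img = np.random.rand(240, 320, 3)
depth = np.linspace(10.0, 1.0, 240)[:, None] * np.ones((1, 320))
result = portrait_blur(img, depth, subject_m=1.5)
```

Pixels near the subject's band stay sharp, and each band further away picks up a larger blur radius, which is consistent with the gradual falloff visible along the fence.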
 
I do not understand this feature. Why would you want a blurry background, and how does it require two cameras?

Blurred-out backgrounds are a visually pleasing effect used in photography all the time. This can only be achieved on professional cameras, because of something called aperture equivalency. This simulates that effect with a software blur.
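A quick sketch of the equivalency idea being referenced: for the same framing, depth of field scales with the crop factor, so a small sensor behaves like a much slower lens on full frame. The phone crop factor below is an illustrative guess, not a spec:

```python
# "Equivalent aperture": depth of field for the same framing scales with
# crop factor. ~8.5 is a guess for a 6.6 mm phone tele framed like 56 mm.

def equivalent_f_number(f_number: float, crop_factor: float) -> float:
    return f_number * crop_factor

print(equivalent_f_number(2.8, 8.5))  # phone f/2.8 ~ f/24 full-frame DoF
print(equivalent_f_number(1.8, 1.0))  # 50mm f/1.8 on full frame stays f/1.8
```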
Incorrect:
[Image: 500px-Faux-bokeh-comparison.jpg]


1. No bokeh
2. Synthetic bokeh using lens blur (what this function is doing)
3. Gaussian blur, Photoshop (see the sketch below for the difference)
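To see the difference between items 2 and 3 concretely, here's a small illustrative sketch: real defocus acts roughly like convolution with a uniform disc, so point lights become hard-edged "bokeh balls", while a Gaussian smears a point into a soft, edgeless blob:

```python
# A single point light blurred two ways: uniform disc (lens-like defocus)
# vs. Gaussian (Photoshop-style). Purely illustrative.
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

def disc_kernel(radius: int) -> np.ndarray:
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    k = (x ** 2 + y ** 2 <= radius ** 2).astype(float)
    return k / k.sum()

img = np.zeros((41, 41))
img[20, 20] = 1.0          # a distant point light on a dark field

lens_like = convolve(img, disc_kernel(8))   # flat plateau with a hard edge
soft = gaussian_filter(img, sigma=4.0)      # peaked blob, no defined edge

print("disc blur peak:    ", round(lens_like.max(), 4))
print("gaussian blur peak:", round(soft.max(), 4))
```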

That effect is actually pretty good. Better than I thought, but I would have to see full pics and use it myself to really see how accurate it is.
I do see fringing around him in the effect, though.
 
No, it's not; it uses the two images to compose a single image.

Huawei's cameras don't have different focal lengths.
You don't need different focal lengths to see depth. Ever heard of stereo photography? Both phones are doing the same thing under the hood: calculating depth from two images and applying a blur.
 
You need them for better depth mapping.
 
Yeah I agree. Amazing new tool that will only get better with time.

Messing about with the beta now; it needs work, but when it gets it right, it's pretty damn good.

Not gonna give you Leica bokeh, not yet anyway


 