
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,555
30,882



When Apple announced the iPhone 7 Plus, one major feature it focused on was a new "Portrait" mode that allows the device's camera to simulate a shallow depth of field effect, similar to what can be achieved with a high-end DSLR.

Portrait mode wasn't ready to go when the iPhone 7 Plus shipped, but Apple promised to introduce it at a later date and did so today, with the release of iOS 10.1. Available as a beta feature, Portrait mode is built into iOS 10.1, and we went hands-on with it to see how well it performs.


Portrait mode uses the 56mm telephoto lens to capture the image and the wider 28mm lens to generate a depth map of the scene. Using the small differences in perspective between the 28mm and 56mm lenses, the iPhone separates an image into layers and uses machine learning techniques to apply a blur that simulates a shallow depth of field.
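Apple hasn't published the details of its pipeline, but the general idea can be sketched with a toy example: given an image and a per-pixel depth map, blend each pixel between a sharp copy and a blurred copy according to how far its depth sits from the focal plane. Everything below (the naive box blur, the synthetic depth map, the linear weighting) is an illustrative assumption, not Apple's actual algorithm.

```python
import numpy as np

def box_blur(img, radius):
    """Naive box blur; edges wrap around for simplicity (np.roll)."""
    if radius == 0:
        return img.astype(float)
    acc = np.zeros_like(img, dtype=float)
    count = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            count += 1
    return acc / count

def synthetic_bokeh(img, depth, focus_depth, max_radius=4):
    """Per-pixel blend of sharp and blurred copies: the further a pixel's
    depth is from the focal plane, the more of the blurred copy it gets."""
    blurred = box_blur(img, max_radius)
    weight = np.clip(np.abs(depth - focus_depth), 0.0, 1.0)  # 0 = in focus
    return (1.0 - weight) * img + weight * blurred

# Toy scene: one bright pixel, with depth increasing down the frame.
img = np.zeros((10, 10))
img[5, 5] = 1.0
depth = np.tile(np.linspace(0.0, 1.0, 10)[:, None], (1, 10))
out = synthetic_bokeh(img, depth, focus_depth=depth[5, 5])
```

Pixels on the focal plane come through untouched, while pixels at other depths are progressively mixed with the blurred copy, which is the basic shape of any depth-map-driven fake bokeh.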

When shooting, Portrait is similar to other modes in the camera app, with a timer to take an image and a Tap to Focus tool to set the focus. One helpful feature is the ability to see the depth effect live before snapping a photo.

In order for the Portrait effect to work properly, you need good lighting and a subject that's properly placed -- it can't be too close or too far away.

Portrait mode is in beta and is currently only available to developers running iOS 10.1. This Friday, Apple will also make the iOS 10.1 beta available to public beta testers, so Portrait mode will become more widely available. There are some issues and quirks that still need to be worked out during the beta testing process, but as a first effort, Portrait mode can produce some impressive images.

Article Link: Hands-On With the New 'Portrait' Mode Beta Feature in the iPhone 7 Plus
 
Reactions: jjm3

Telos101

macrumors regular
Apr 29, 2016
219
886
Ireland
I don't think of it as a gimmick, but it does feel like another one of those developments where an effect that used to take a bit of photography know-how is now available to the masses. Makes me wonder what professionals in the field think.
 
Reactions: obamtl and 2010mini

adamneer

macrumors 6502
Apr 18, 2013
420
747
Chicago, IL
I don't understand why I keep reading/hearing that DSLRs can't show depth of field live. Of course they can - it's not an effect that requires processing using a DSLR, it's just the natural characteristics of the lens optics and aperture. If this were true, you'd never be able to manually focus an image using live view mode on a DSLR.
 

RDeckard

macrumors regular
Sep 23, 2013
188
572
I don't understand why I keep reading/hearing that DSLRs can't show depth of field live. Of course they can - it's not an effect that requires processing using a DSLR, it's just the natural characteristics of the lens optics and aperture. If this were true, you'd never be able to manually focus an image using live view mode on a DSLR.

Agreed - most DSLRs have a button called "DOF (Depth of Field) preview." Just press it and you'll see a preview in the viewfinder.

Mirrorless cameras, like the Sony A series, Fuji X-series (I have an X-T1), Olympus OM series, etc., show the effect right in the digital viewfinder or on the screen on the back of the camera.
 

zed1291

macrumors regular
Jun 4, 2010
200
238
NYC
I don't think of it as a gimmick, but it does feel like another one of those developments where an effect that used to take a bit of photography knowhow is now available to the masses. Makes me wonder what professionals in the field think.

I think it's a neat feature. It definitely couldn't replace a DSLR for a shoot, but if I'm shooting for fun and not professionally, it would make my iPhone shots look a little better. Still, I always carry my Canon on me, so I wouldn't use it that much.

Shallow depth of field comes from wide apertures, and the real advantage of a wide aperture is better low-light performance. The artificial effect of a wide aperture won't help when an actual wide aperture is needed.
 

bluespark

macrumors 68040
Jul 11, 2009
3,098
4,010
Chicago
I'd bet money that this will be a widespread feature on most phone cameras in a year or so.

Except that to do it the right way, as Apple is doing, two cameras are required. The other approach, which other manufacturers have tried, simply isolates one subject and blurs everything else uniformly. As you saw in the video when he moved the plant, it was a middle-ground object, so it was blurred, but not as much as the background objects. That graduated blur is what you would see with a true portrait lens, and it's why Apple's approach is better than what has come before.
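The graduated blur described above can be sketched with a toy calculation: if blur radius scales with a pixel's distance from the focal plane, a middle-ground object naturally gets less blur than the far background, whereas a binary mask would give every out-of-focus pixel the same radius. The `blur_radius` function and the depth values here are made-up illustrations, not anything Apple has documented.

```python
def blur_radius(depth, focus, max_radius=8.0):
    """Blur radius grows with distance from the focal plane, clamped at max."""
    return max_radius * min(abs(depth - focus), 1.0)

# Hypothetical normalized scene depths (0 = camera): subject in focus,
# plant in the middle ground, wall in the far background.
subject, plant, wall = 0.2, 0.5, 1.0
radii = {name: blur_radius(d, focus=subject)
         for name, d in [("subject", subject), ("plant", plant), ("wall", wall)]}
# subject -> no blur, plant -> some blur, wall -> the most blur
```

A single-camera mask-based approach effectively collapses this to two radii (zero for the subject, one fixed value for everything else), which is why mid-ground objects look wrong there.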
 