When Apple announced the iPhone 7 Plus, one major feature it focused on was a new "Portrait" mode that allows the device's camera to simulate a shallow depth of field effect, similar to what can be achieved with a high-end DSLR.

Portrait mode wasn't ready to go when the iPhone 7 Plus shipped, but Apple promised to introduce it at a later date and did so today, with the release of iOS 10.1. Available as a beta feature, Portrait mode is built into iOS 10.1, and we went hands-on with it to see how well it performs.

Portrait mode uses the 56mm telephoto lens to capture the image and the wider 28mm lens to generate a depth map of the scene. Using the small differences between the 28mm and 56mm views, the iPhone separates the different layers of an image and uses machine learning techniques to apply a blur that simulates a shallow depth of field.
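The idea of blurring by depth can be sketched in miniature: given an image and a per-pixel depth map, blur each pixel in proportion to its distance from the chosen focal plane. The toy NumPy sketch below (the function names and the simple box blur standing in for Apple's machine-learned blur are my own illustrations, not Apple's actual pipeline) shows the principle:

```python
import numpy as np

def box_blur(img, radius):
    # Naive 2-D box blur: average each pixel over a (2r+1) x (2r+1) window,
    # using edge replication at the borders. radius 0 returns the image as-is.
    if radius == 0:
        return img.astype(float)
    k = 2 * radius + 1
    p = np.pad(img.astype(float), radius, mode="edge")
    out = np.zeros(img.shape)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def depth_blur(image, depth, focus, max_radius=3):
    # Blur radius per pixel: 0 at the focal plane, max_radius at the
    # depth extremes. depth and focus are assumed normalized to [0, 1].
    radii = np.round(np.abs(depth - focus) * max_radius).astype(int)
    result = np.zeros(image.shape)
    for r in range(max_radius + 1):
        blurred = box_blur(image, r)
        mask = radii == r
        result[mask] = blurred[mask]
    return result
```

Pixels at the focal depth are copied through untouched, while pixels farther away in the depth map receive progressively larger blurs; the real feature does the same thing with a far more sophisticated, edge-aware blur.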

When shooting, Portrait is similar to other modes in the camera app, with a timer to take an image and a Tap to Focus tool to set the focus. One helpful feature is the ability to see the depth effect live before snapping a photo.

In order for the Portrait effect to work properly, you need good lighting and a subject that's properly placed -- it can't be too close or too far away.

Portrait mode is in beta, and is currently only available for developers running iOS 10.1. This Friday, Apple will also make iOS 10.1 available for public beta testers, so Portrait mode will be more widely available. There are some issues and quirks that still need to be worked out during the beta testing process, but as a first effort, Portrait mode can produce some impressive images.

Article Link: Hands-On With the New 'Portrait' Mode Beta Feature in the iPhone 7 Plus
Great... how about you move the lens to the centre so we don't cover it with our finger? Basics, Apple. That's why I'm now a very happy Samsung user.
 
I never said Apple does EVERYTHING better. The list I provided is enough to show how consistently they implement tech better. If that's not a basis, then what is?

The ecosystem as a whole is an innovation. An individual product feature may not be innovative enough, but a collection of products working together can be a killer feature -- an innovation that makes Apple unique and can't be replaced.

Is that why you're hooked into this ecosystem? Or are you stuck with Apple's ecosystem because there's no alternative?

Remember, BlackBerry had a sort of ecosystem and they were the leaders in smartphones. I'm hoping Google does better, not because I hate Apple but because it's good for consumers. An ecosystem will only last as long as you innovate in every area of it. We know how much innovation has happened in the computer business.

So you're right, we choose the best from what is available (stuck), and I just demand more from a company that claims to be innovative. I just realized the digression from the topic :).
 
Mmm... those edges need work. People are also forgetting that real DoF blurs both the foreground and the background -- look at the hand in the Sony image.
Also, I wonder what the background bokeh will look like; it should appear soft and "creamy", with smooth round circles of light and no hard edges.

Not bad for a phone, but way off a DSLR.
 
Selective Focus is definitely not the same. The M8 tries to do something similar but is quite terrible at it.

The thing with the plethora of Asian phone manufacturers that use Android is that they're generally very quick off the mark when it comes to releasing features, but they're just not good at nailing or refining them because they're too busy working on the next half-baked feature.
 
It doesn't always get it right, but it does have gradation. Check out the chair compared to the laptop compared to outside the window:
Ugh, I don't know... that did weird things to his hair at the edges. Overall the effect looks like stuff I used to do in Photoshop and one of my Ulead programs years ago. I do like how the lens maintains the natural shape of his face, though. For too long, our wider-angle cell phone lenses have been making us look big-nosed and chipmunk-cheeked. This alone would make me want the larger slab of a phone. Very nice.
Ah...funny...because that would mean you're gay and no one would want that, right? :confused:

The size problem is easily solved by avoiding skinny jeans.
Lol, I took him to mean he's no Indiana Jones! Now Indy could rock a man purse!
 
They could implement something similar using a depth sensor like the Kinect uses, rather than a second camera. That would have the advantage of providing more accurate depth information for a better gradual focus fall-off.

The main issue for me on phone cameras is they are too wide angle. I'd happily have just the telephoto lens from the 7+ without the wide angle one, but I realise I may be in the minority.
 
They shouldn't have called it Portrait, though. It'll probably confuse some users into thinking it's only for when the phone is upright. What's wrong with "Bokeh"?

I don't have a 7+, so this is exactly what I thought it was. I was thinking to myself, "Seriously, the 7+ has to be in landscape to take a picture?" But yeah, this is pretty neat. I hope it makes it to the regular-sized phone some day.
 
To me the most impressive aspect is the real time preview of the blur effect. Yes it's artificial blur but no DSLR on the market does that.

I'm sure others have already said it: but you are 1000% incorrect.

In fact, EVERY DSLR "does that" by default.

It's actually the other way around from what you (and Apple!) say: shallow depth of field (DoF) (which is what this is simulating) is the default on DSLRs and you actually have to press a button to preview a deep DoF!!

It works like this:

1. You get the shallowest DoF with the lens "wide open" (i.e. using the widest aperture... which is a smaller f-number like f/1.4)

2. DSLRs are always in this "wide open" setting when "at rest" (i.e. when not taking a photo). So anytime you just look through the viewfinder (or preview on a screen) to frame a shot, you are ALWAYS seeing the shallowest DoF "preview" (with the most Bokeh).

3. This is done because you want as much light coming in as possible during this time to help with autofocus and metering.

4. If you want to shoot with a deep DoF (so everything is in focus / nothing is blurry) you would select a narrower aperture (say f/16). This would be closer to what a photo normally looks like from an iPhone.

5. To get a "preview" (again, either through the viewfinder or on the screen) of this deep DoF you have to press the "DoF Preview" button... that will CLOSE DOWN the aperture to let you see (in real time!) what the photo will look like.

6. Anytime you're not holding down the DoF preview button you're seeing (live!!) the shallowest DoF possible: the one with the most background blur with the most Bokeh.

7. When you actually press the shutter button... what happens is that the aperture is closed down to your desired setting (like f/16) and the photo is taken. This all happens so fast that it's essentially instantaneous on modern gear.

In summary: you (and Apple!!) are VERY wrong. A DSLR can (and does by default!) give you a preview of a shallow DoF.
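The aperture arithmetic behind points 1-7 can be checked with the standard thin-lens depth-of-field formulas. Here's a quick Python sketch; the 50mm focal length, 2m subject distance, and 0.03mm full-frame circle of confusion are illustrative choices of mine, not numbers from the post:

```python
def dof_limits(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Return (near, far) acceptable-focus limits in mm for a thin lens."""
    # Hyperfocal distance: focus here and everything to infinity is sharp.
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * subject_mm / (hyperfocal + (subject_mm - focal_mm))
    far = (hyperfocal * subject_mm / (hyperfocal - (subject_mm - focal_mm))
           if subject_mm < hyperfocal else float("inf"))
    return near, far

# 50mm lens, subject at 2m, full-frame circle of confusion
near_wide, far_wide = dof_limits(50, 1.4, 2000)        # wide open
near_stopped, far_stopped = dof_limits(50, 16, 2000)   # stopped down
print(f"f/1.4: {far_wide - near_wide:.0f} mm in focus")       # prints 131
print(f"f/16:  {far_stopped - near_stopped:.0f} mm in focus")  # prints 1720
```

At f/1.4 only about 13 cm around the subject is sharp (the shallow-DoF default you see through the finder), while stopping down to f/16 extends that to roughly 1.7 m -- exactly the wide-open vs. stopped-down contrast described above.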
 
The iPhone doesn't have an f-stop. That's why it can do this "live". Real cameras let you change the aperture.
Err, yes it does. It has an aperture, and a maximum aperture rating based on its widest opening.

Saying "the iPhone doesn't have an f-stop" is like saying a person doesn't have height.
 
Just at the store today. Had to check out the new Edition...

 
All mirrorless cameras do this by simply showing you what the sensor sees on the screen/view finder.

Every camera where you see a live view through the lens while framing (all DSLRs and all mirrorless) does this. No screen required: PHYSICS and the laws of nature give you the best Bokeh preview by default!
 
Yes, it's quite ridiculous how badly researched these claims are. With mirrorless cameras especially, you have no trouble fully previewing the actual final result. However, it is worth noting that a DSLR makes it impossible to see the actual depth of field for larger-aperture lenses, due to the fresnel effect of the viewfinder's focusing screen. Basically, what happens is that the maximum aperture size you can preview is limited -- specifically, it's about f/2.8 for a 35mm full-frame camera and f/4 for APS-C cameras.

So if you had it in your mind that you were shooting with an actual DSLR (not mirrorless, not rangefinders) with a lens at f/1.4 (or equivalent if using a non-135 format camera) then it's actually true that it's not possible to preview the final result through the viewfinder.

Of course, I haven't seen a DSLR that couldn't also shoot in live view mode if you wanted to do that. And the in-between design of the Sony SLT cameras has no such limit on maximum aperture that can be previewed at any time.

Basically, even with the caveat mentioned, this is an incredibly ignorant claim (and not one that Apple is making, afaik.)

Oh man. Now you've done it! :)

Through pedantry you've actually given some credence to Apple's silly claim. Well, I challenge your pedantry with more pedantry...

You can replace the focusing screen of your DSLR's viewfinder with one that will allow you to more closely see the current DoF. For instance: https://www.bhphotovideo.com/c/product/590458-REG/Canon_3357B001_Eg_S_Super_Precision_Matte.html

That one goes down to f/1.8.

Other camera systems without pentaprisms (like medium format) also aren't so limited. Or, like you say, just use the live view screen :)

But, to be even more pedantic, we actually don't know what aperture the iPhone is simulating... so we have no way to say whether or not a DSLR can show that depth of field in the viewfinder :)

At any rate: we're arguing minutiae. Apple's claim about not being able to see a shallow depth of field on a DSLR is completely bogus!
 
I think it's got definite potential, even at this early stage it's already looking pretty good.

We're a long, long way from a phone replacing my DSLR, but I don't always carry that, and you know what they say: the best camera is the one you have with you. I've taken some really nice shots with my iPhones over the years, and this is just one more nicety to add to the available options.
 
It was interesting when I saw it during the keynote. But when I saw this video, everything changed. Apple is claiming something about taking advantage of the dual-camera technology, but this can easily be done with any camera and a regular photo app. It can be done better with Pixelmator or Photoshop.
You know it's a Japanese word, right?
 