
MacRumors

macrumors bot
Original poster
Apr 12, 2001
Portrait Mode on the new iPhone 11 works not only with human faces, but also with objects and pets, according to Apple.

Apple's sample photos of Portrait Mode on the iPhone 11

From the iPhone 11 page:
Take portraits to new places. With new kinds of portraits and more lighting controls, the dual cameras in iPhone 11 work together to create stunning images. And Portrait mode now works with everything you love to shoot -- that includes your best friends, two-legged or four.
While this was already possible on the iPhone X, iPhone XS, and iPhone XS Max, Portrait Mode on the iPhone XR was only able to detect human faces in Apple's stock Camera app. (A few third-party apps like Halide did manage to enable Portrait Mode for objects and pets on the iPhone XR.)

There are also six Portrait Lighting effects available on the iPhone 11: Natural, Studio, Contour, Stage, Stage Mono, and High-Key Mono. This is up from three on the iPhone XR: Natural, Studio, and Contour.

Portrait Mode automatically creates a depth-of-field effect known as bokeh, allowing iPhone users to shoot a photo that keeps the subject sharp with a blurred background, while Portrait Lighting applies studio-quality lighting effects like black-and-white stage lighting to the Portrait Mode photos.
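Conceptually, this kind of software bokeh is a composite driven by a per-pixel depth map: pixels judged near the camera are kept sharp, while the rest are replaced with a blurred copy of the frame. Below is a minimal NumPy sketch of that idea. The depth map is assumed to already exist (producing it is the hard part, which Apple and Google solve with dual cameras or machine learning), and the box blur and fixed threshold are toy stand-ins, not either company's actual pipeline.

```python
import numpy as np

def box_blur(img, radius=2):
    """Naive box blur: average each pixel with its neighbours (edge-padded)."""
    padded = np.pad(img, ((radius, radius), (radius, radius), (0, 0)), mode="edge")
    out = np.zeros_like(img, dtype=float)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / size**2

def fake_portrait(img, depth, threshold=0.5, radius=2):
    """Keep pixels nearer than `threshold` sharp; blur the rest.

    `depth` is a per-pixel map in [0, 1], 0 = nearest to the camera.
    This illustrates the compositing idea only, not a real camera pipeline.
    """
    blurred = box_blur(img, radius)
    mask = (depth < threshold)[..., None]  # True = subject, kept sharp
    return np.where(mask, img, blurred)

# Tiny demo: left half is the "subject" (near), right half is "background" (far).
h, w = 8, 8
img = np.random.default_rng(0).uniform(0.0, 1.0, (h, w, 3))
depth = np.zeros((h, w))
depth[:, w // 2:] = 1.0
result = fake_portrait(img, depth)
```

The subject half of `result` is bit-identical to the input, while the background half is averaged with its neighbours; mask errors at the boundary are exactly why software bokeh sometimes produces artifacts around hair, whiskers, or foam.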

To use Portrait Mode, open the Camera app and swipe to Portrait mode. Portrait Lighting effects will appear at the bottom of the viewfinder.

iPhone 11 is the direct successor to the iPhone XR, with features including a dual-lens rear camera system with Ultra Wide and Night modes, faster A13 Bionic chip, improved water resistance, six new colors, up to one hour longer battery life, Dolby Atmos sound, 802.11ax Wi-Fi, Gigabit-class LTE, and more.

Article Link: Portrait Mode on iPhone 11 Works With Objects and Pets
 
Yeah, when they said this yesterday I was confused, as I know for a fact I've been taking Portraits of my dogs using Portrait mode since I got an X. Maybe they mean it's optimised to be easier? But I've never had issues
 
Yeah, when they said this yesterday I was confused, as I know for a fact I've been taking Portraits of my dogs using Portrait mode since I got an X. Maybe they mean it's optimised to be easier? But I've never had issues
On the XR it only worked for people, due to the lack of a second camera.
 
On the XR it only worked for people, due to the lack of a second camera.
I personally think it was only working for people due to software limitations, not the single lens. Google made it work with 1 lens and machine learning. I think Apple could have done the same thing, but figured it wasn't worth it because 1. desire to sell more flagships and 2. desire to sell more flagships.
 
This is a bit weird because I've been using portrait mode with objects for a long time with my iPhone 7 plus... :confused:

It has some trouble with certain things, like the foam on a glass of beer, and the whiskers on cats, but most of the time it's quite good. :)

Edit: ...or have objects been meant to work on all (earlier) phones with dual lenses for a while now...? ;)
 
Nice to see. Just to note, Google has had Portrait mode on objects other than people on their Pixels, with one camera...so the two-camera thing isn't the only way to do it (although Apple seems to be playing it that way).
 
This is a bit weird because I've been using portrait mode with objects for a long time with my iPhone 7 plus... :confused:

It has some trouble with certain things, like the foam on a glass of beer, and the whiskers on cats, but most of the time it's quite good. :)

Edit: ...or have objects been meant to work on all (earlier) phones with dual lenses for a while now...? ;)
Read the second paragraph again.
 
This is a bit weird because I've been using portrait mode with objects for a long time with my iPhone 7 plus... :confused:

It has some trouble with certain things, like the foam on a glass of beer, and the whiskers on cats, but most of the time it's quite good. :)

Edit: ...or have objects been meant to work on all (earlier) phones with dual lenses for a while now...? ;)
It’s always worked on dual lens cameras. In fact, Portrait Mode was the main selling point of the 7 Plus.
 
I personally think it was only working for people due to software limitations, not the single lens. Google made it work with 1 lens and machine learning. I think Apple could have done the same thing, but figured it wasn't worth it because 1. desire to sell more flagships and 2. desire to sell more flagships.
Bingo bango! Got to have something for next year!
 
Read the second paragraph again.
It’s always worked on dual lens cameras. In fact, Portrait Mode was the main selling point of the 7 Plus.
Yes, but the software was, at least in the beginning, best at recognising faces, and not much more...AFAIR. :)

Edit: Remember the iPhone 7 is 3 years old, while the referenced phones are newer and (much) more powerful... ;)
 
I personally think it was only working for people due to software limitations, not the single lens. Google made it work with 1 lens and machine learning. I think Apple could have done the same thing, but figured it wasn't worth it because 1. desire to sell more flagships and 2. desire to sell more flagships.
Yes, you’re right. I’m sometimes just too lazy to explain technical details. It had no true portrait mode due to the lack of a second camera, but they trained the software to recognize people and blur the background, so XR portrait mode was based on software.
They could teach it to recognize more things so in the end we can say it was due to software limitations.

I’m actually curious what will happen if all three models have the same camera setup. It seems some people will buy ‘Pro’ just because it’s ‘Pro’. The XR (and now the 11) is the true flagship and the best-selling iPhone model, so its camera limitation didn’t affect it much. And Apple kind of acknowledged it by calling it just iPhone 11 and the other two models ‘Pro’.
 
The Stage Light portrait mode doesn't currently work with pets, or I just can't get it to work.
 
I have shot tons of animals with my 8 Plus using portrait mode and got the bokeh effect. Why is this news?
 
I have shot tons of animals with my 8 Plus using portrait mode and got the bokeh effect. Why is this news?

All dual lens iPhones support the true bokeh effect:
iPhone 7 Plus
iPhone 8 Plus
iPhone X
iPhone XS
iPhone XS Max

Only one single-lens iPhone supports the bokeh effect (using software trickery):
iPhone XR

This article is pointing out that while the new iPhone 11 does have dual lenses, people might be wondering if the bokeh effect would be limited, since the second lens is an ultra wide angle instead of a telephoto. Therefore the software limitation of the iPhone XR has been resolved in the 11.

At least I think I’m understanding the point of the article? I don’t know. I can’t keep Apple products straight in my head anymore since sj isn’t there to rein in the product line.
 
What’s still unanswered, though: does Portrait mode on the Pro now work at the regular focal length too, and not just in telephoto mode?
 
I mean it would be great to have heard this from Tim yesterday. This is quite significant.

The whole presentation just seemed rushed.
 
I personally think it was only working for people due to software limitations, not the single lens. Google made it work with 1 lens and machine learning. I think Apple could have done the same thing, but figured it wasn't worth it because 1. desire to sell more flagships and 2. desire to sell more flagships.

Or
3. Desire not to mine user photos, robbing users of privacy, and then making money from it.
 
All dual lens iPhones support the true bokeh effect:
iPhone 7 Plus
iPhone 8 Plus
iPhone X
iPhone XS
iPhone XS Max

Only one single-lens iPhone supports the bokeh effect (using software trickery):
iPhone XR

This article is pointing out that while the new iPhone 11 does have dual lenses, people might be wondering if the bokeh effect would be limited, since the second lens is an ultra wide angle instead of a telephoto. Therefore the software limitation of the iPhone XR has been resolved in the 11.

At least I think I’m understanding the point of the article? I don’t know. I can’t keep Apple products straight in my head anymore since sj isn’t there to rein in the product line.

The answer has already been posted in this short thread several times.
All these iPhones achieve bokeh through software. None of them can support it natively, as they don’t have the real lens systems of bigger cameras. Therefore, you would sometimes see artifacts around some objects when the software miscalculates the effect. The XR was worse in that regard since it had only one camera to rely on, so it was more complicated for the chip to imitate bokeh.
 