Gaussian blur? What in the world? Apple, use lens blur, ffs. How else will they get the bokeh-like effect? Maybe someone misreported? Lens blur turns bright objects in the background into those beautiful bokeh balls, while Gaussian is just a regular blur, kind of like the kind used to censor things on television. Maybe Apple has its own in-house filter that'll do something like lens blur? Because Gaussian will not make it look "bokeh" at all …
Well, I think what it really boils down to is this: the iPhone 7 Plus portrait mode is simply using software to apply a fake blur to everything behind the subject. Simple as that. In high-end cameras, bokeh/blur is actually caused by the optics of the camera: light from the out-of-focus background is refracted onto the sensor differently than light from the in-focus subject. To my knowledge, real bokeh (at least right now) cannot be fully emulated in software, as it happens at the hardware level in high-end cameras.
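For what it's worth, the difference between the two blurs the posters above are arguing about is easy to see in the kernels themselves. Here's a toy numpy sketch (not Apple's actual filter, just an illustration): a Gaussian kernel smears a bright point into a soft falloff, while a flat disc kernel — the shape an out-of-focus lens aperture actually produces — turns it into a uniform bright circle, i.e. a "bokeh ball."

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Square Gaussian kernel, normalized to sum to 1."""
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def disc_kernel(size, radius):
    """Flat circular ("lens blur") kernel, normalized to sum to 1."""
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    k = ((xx**2 + yy**2) <= radius**2).astype(float)
    return k / k.sum()

# Convolving a single bright point with g gives a soft Gaussian falloff;
# convolving it with d gives a uniform bright disc -- the bokeh-ball look.
g = gaussian_kernel(15, sigma=3.0)
d = disc_kernel(15, radius=5.0)
```

The giveaway is that every nonzero value in the disc kernel is identical (a flat plateau), whereas the Gaussian peaks at the center and falls off smoothly.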
 
Were you actually expecting to get optical bokeh from lenses and sensors the size of your pinky nail? :)

This was always gonna be a digital effect applied in post.

The iPhone may do a billion calculations when you press the shutter... but this is still a special effect.

I was honestly hoping that you could use image data from both cameras combined to make one good picture... not this faux-blurred-Photoshop effect.

How do you know it doesn't use image data from both cameras? When I first heard them describe it, that's the impression I got. I assumed it would work something like this: the software analyzes the depth map to establish the delineation of the in-focus subject(s), and the wide angle camera grabs an intentionally-out-of-focus image to create the background to the subject(s) captured by the zoom lens. This would be like HDR, except that instead of successive over/under-exposed shots, you have simultaneous in/out-of-focus shots being combined by software.
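If the pipeline really does work that way, the final compositing step is conceptually simple. Here's a minimal numpy sketch of the idea, assuming the phone has already produced a depth map plus a sharp and a defocused frame (all names and shapes here are hypothetical, purely for illustration):

```python
import numpy as np

def composite(sharp, blurred, depth, subject_depth, tolerance):
    """Keep pixels near the subject's depth from the sharp frame;
    take everything else from the intentionally defocused frame."""
    mask = np.abs(depth - subject_depth) <= tolerance   # True = subject
    mask = mask[..., np.newaxis]                        # broadcast over RGB
    return np.where(mask, sharp, blurred)

# Toy frames: subject at depth 1.0 in the center, background at 5.0
depth = np.full((4, 4), 5.0)
depth[1:3, 1:3] = 1.0
sharp = np.ones((4, 4, 3))      # stand-in for the tele capture
blurred = np.zeros((4, 4, 3))   # stand-in for the defocused wide capture
out = composite(sharp, blurred, depth, subject_depth=1.0, tolerance=0.5)
```

The hard part, of course, is not this `np.where` — it's producing a clean depth map and a soft transition at the subject's edges, which is where the real engineering lives.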
 
Very neat, but I already felt like the mode switcher was getting crowded on my 6+ with:

  1. Time-Lapse
  2. Slo-Mo
  3. Video
  4. Photo
  5. Square
  6. Pano

I think Apple should figure out another UI for swapping between all these modes. Maybe bring back the straightforward toggle between photo and video from iOS 3, then make Time-Lapse and Slo-Mo sub-modes of Video, and Square, Pano, and Portrait sub-modes of Photo?
 
That's what I was hoping for, too.
 
Same! They need a button like the one they added in Mail to see all unread: just one small button or box somewhere on the screen. Tap it and it brings up a menu... heck, with 3D Touch you could easily do this.
 
How do you know it doesn't use image data from both cameras? When I first heard them describe it, that's the impression I got. I assumed it would work something like this: the software analyzes the depth map to establish the delineation of the in-focus subject(s), and the wide angle camera grabs an intentionally-out-of-focus image to create the background to the subject(s) captured by the zoom lens. This would be like HDR, except that instead of successive over/under-exposed shots, you have simultaneous in/out-of-focus shots being combined by software.

Sorry... let me explain. I was talking about another topic not related to the blurred background effect.

It's possible to combine two or more images to reduce noise. Other phones use one camera to capture a color image and the other to capture black-and-white to increase detail in the final image.

Apple's implementation uses both cameras to do depth mapping for the faux-blurred effect. Which is fine.

But I was hoping it would use both cameras to improve general image quality. I wonder if other camera apps can do it?
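The noise-reduction idea is real and easy to demonstrate: averaging two frames with independent sensor noise cuts the noise by roughly a factor of √2. A toy numpy sketch (simulated noise, not actual iPhone data):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((64, 64), 100.0)            # the "true" scene brightness

# Two simultaneous captures of the same scene, each with its own
# independent sensor noise (std dev 5.0 in arbitrary units)
frame_a = scene + rng.normal(0, 5.0, scene.shape)
frame_b = scene + rng.normal(0, 5.0, scene.shape)

merged = (frame_a + frame_b) / 2.0          # simple frame averaging

# Independent noise averages down by ~sqrt(2), so the merged frame's
# error should be noticeably smaller than a single frame's.
err_single = np.std(frame_a - scene)
err_merged = np.std(merged - scene)
```

The catch for a dual-camera phone is that the two frames first have to be aligned pixel-for-pixel, which is exactly where the differing lenses make trouble.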
 
Portraits of people only? What about animals and objects such as flowers etc?
I think blurred backgrounds for non-human subjects will come at a later time. Since the blurred background is simply done with software, it is easiest to start with human faces. The phone realizes "Okay, this is a face. On a human. Okay, now I know where the subject is and where the background is." The phone has a reference point.

With a flower, say, the phone just sees a mess of colors. It doesn't distinguish the objects in the photo. That's my theory, anyway.

EDIT: Portrait mode CAN be used for subjects that aren't people. @mcdj linked a TechCrunch article that shows it off.
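The "reference point" idea above can be sketched concretely: once a face detector hands you a bounding box, you can read the subject's depth there and keep everything at a similar depth (the rest of the body, hair, etc.). This is just a toy numpy illustration of the concept — the box coordinates, the 0.5 m band, and the whole approach are my assumptions, not Apple's actual algorithm:

```python
import numpy as np

def subject_mask(depth, face_box):
    """Use a detected face's median depth as the reference, then keep
    every pixel at a similar depth as part of the subject."""
    top, left, bottom, right = face_box
    face_depth = np.median(depth[top:bottom, left:right])
    return np.abs(depth - face_depth) < 0.5   # 0.5 m band, arbitrary

# Toy depth map: person (including face) at ~1.2 m, background at 4 m
depth = np.full((6, 6), 4.0)
depth[1:5, 2:4] = 1.2                         # the person
mask = subject_mask(depth, face_box=(1, 2, 3, 4))  # box from a detector
```

This also hints at why a flower is harder: without a detector to seed the reference depth, the phone has to guess which depth band is "the subject."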

Sorry... let me explain. I was talking about another topic not related to the blurred background effect.

It's possible to combine two or more images to reduce noise. Other phones use one camera to capture a color image and the other to capture black-and-white to increase detail in the final image.

Apple's implementation uses both cameras to do depth mapping for the faux-blurred effect. Which is fine.

But I was hoping it would use both cameras for general image quality. I wonder if other camera apps can do it?
Perhaps the different focal lengths are the reason why two images can't be combined into one? Each lens is seeing different things. Thoughts?
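The focal lengths do matter: a 56mm-equivalent lens sees roughly half the field of view of a 28mm-equivalent one, so the two frames only overlap in the central region of the wide shot. Quick back-of-the-envelope math (standard thin-lens FOV formula, 35mm-equivalent focal lengths assumed):

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view for a 35mm-equivalent focal length."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

wide = horizontal_fov_deg(28.0)   # iPhone 7 Plus wide camera, ~28mm equiv
tele = horizontal_fov_deg(56.0)   # telephoto, ~56mm equiv

# The tele frame covers only the central portion of the wide frame, so a
# pixel-for-pixel merge is only possible inside that overlap region --
# and even there the two views differ slightly due to parallax.
```

So it's not that the images can't be combined at all, just that the benefit would be limited to the overlapping crop, after alignment.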
 
Is there a way to turn it off? What if I don't want the blurred background? Please tell me there's a way to shut it off.
"Portrait mode is a new feature in the camera app that can be found alongside other video and photo taking options like "Video" and "Panorama." It even includes a Live Preview effect that lets you see what the image will look like before you take it, something that's unique to the iPhone."

Looks like it's just another choice alongside "Panorama" and "Slo-Mo."
 
Perhaps the different focal lengths are the reason why two images can't be combined into one? Each lens is seeing different things. Thoughts?

Yeah... that's probably the case.

Which is sad... since there are two cameras with two lenses and two sensors.

Sure we're getting two focal lengths and a faux-blurred background effect.

But I was hoping they could do some magic with improving the image quality by combining two image captures into one.
 
Wow, the portrait mode looks amazing. Makes me want to buy the plus iPhone next year (if Apple makes the dual camera exclusive to the plus model).

The phablet is too big for me. I hope they do something for the smaller phone, because I doubt I will ever move up to the monster size. My motto is: never let yourself go to where you need a plus size. :D I apply that motto to everything in life. :p

I was honestly hoping that you could use image data from both cameras combined to make one good picture... not this faux-blurred-Photoshop effect.
Totally agree. I was hoping for dual pictures that could be merged to either reduce noise or increase the overall pixel count (like they do with pano mode).
 
I don't like this at all. I want the opposite. I am a photographer with 50+ years of experience, and what I want is everything sharp in the image: full depth of field, not no depth of field. If they can do one, they should be able to do the other...
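What this poster is describing actually exists as a software technique too: focus stacking, where you take frames focused at different distances and keep, per pixel, whichever frame is locally sharpest. Here's a crude numpy sketch of the idea (a Laplacian as the sharpness measure, toy one-channel images — a real implementation would also align the frames and smooth the selection mask):

```python
import numpy as np

def sharpness(img):
    """Crude per-pixel sharpness: magnitude of the discrete Laplacian."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return np.abs(lap)

def focus_stack(near_focused, far_focused):
    """Per pixel, keep whichever frame is locally sharper."""
    pick_near = sharpness(near_focused) >= sharpness(far_focused)
    return np.where(pick_near, near_focused, far_focused)

# Toy example: an edge that is crisp in one frame, smeared in the other
near = np.zeros((8, 8)); near[:, 4:] = 1.0   # hard edge (in focus)
far = np.zeros((8, 8)); far[:, 3:6] = 0.5    # soft edge (defocused)
stacked = focus_stack(near, far)
```

In principle the same dual-camera depth data that drives the fake blur could also drive an everything-sharp mode; whether Apple ever ships one is another question.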
 
Yeah... that's probably the case.

Which is sad... since there are two cameras with two lenses and two sensors.

Sure we're getting two focal lengths and a faux-blurred background effect.

But I was hoping they could do some magic with improving the image quality by combining two image captures into one.
Same here. Noise, and the aggressive noise reduction, are still an issue once you start zooming in on iPhone photos. It'd be great if they could iron some of that out.
 
Presumably something about the Portrait mode requires both cameras on the 7+, otherwise wouldn't it be available on the 7 too?
 
But I was hoping they could do some magic with improving the image quality by combining two image captures into one.

That was my hope as well, ever since Apple bought LinX. But at least the 56mm lens will enhance image quality in some situations or at least broaden shot opportunities.

Also, consumers love zoom cameras, as evidenced by the popularity of superzoom cameras. The market for people who appreciate higher image quality is tiny, but mention "telephoto lens" and everyone instantly imagines all the "magical" photos they could take.
 