
predation

macrumors 65816
Apr 3, 2013
1,237
867
I think portrait mode doesn't require both cameras. It's just a marketing ploy to make you buy the more lucrative + model.

I understand why Apple gives all the bells and whistles to the +, but ffs, I'm not going up in size.
 

Nr123*123

macrumors 6502
Jul 1, 2014
296
1,014
Is this sort of like selective focus on Samsung phones? Two images are taken and you can select the focus you want, blurring the foreground or the background.
 

brickm

macrumors newbie
Sep 21, 2016
2
0
I agree they need to rework the UI. Sliding through the options is cumbersome. I wonder if they could create a ghosted 3x2 grid that you tap, and then it animates to a little floating button that you can tap when you want to change to another mode.

Very neat, but I already felt like the mode switcher was getting crowded on my 6+ with:

  1. Time-Lapse
  2. Slo-Mo
  3. Video
  4. Photo
  5. Square
  6. Pano

I think Apple should figure out another UI for swapping between all these modes. Maybe bring back the straightforward toggle between photo and video from iOS 3, then make Time-Lapse and Slo-Mo sub-modes of Video, and Square, Pano, and Portrait sub-modes of Photo?
 

M-5

macrumors 65816
Jan 4, 2008
1,100
93
First of all, it needs the "Portrait" toggle because the lenses have different focal lengths: it will use the 56mm lens as the primary, and it simply uses the wide-angle lens to extract depth information to apply the effect. Since this effect uses the telephoto lens, which has an f/2.8 aperture, I wonder whether it will work in low-light conditions.

I assume it primarily works with people, because Apple is also using face detection to identify the subject and what should be in focus, for more precision. The Huawei phone with a similar effect actually allows you to adjust the aperture and blur effect on the fly, as well as change the focal point after the photo has been taken. Apple probably wants to keep everything as simple as possible, so it won't offer these adjustments within the camera app.

I do hope that third-party apps will allow more manual control over all of this, as it would be cool to use this effect to take photos of objects as well.
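The two-lens depth extraction described above boils down to stereo disparity: the same point lands at slightly different pixel positions in each camera, and the shift encodes distance. A minimal sketch using the standard pinhole stereo relation Z = f·B/d; the focal length and baseline numbers here are made up for illustration, not Apple's actual camera parameters:

```python
# Hypothetical sketch of depth-from-stereo: two cameras a known
# baseline apart see the same point at different pixel positions
# (the disparity); nearer points produce larger disparities.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Standard pinhole stereo relation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example numbers (assumed): 2800 px focal length, 1 cm baseline.
for d in (40.0, 20.0, 10.0):
    z = depth_from_disparity(2800.0, 0.01, d)
    print(f"disparity {d:5.1f} px -> depth {z:.2f} m")
```

A real pipeline has to find those disparities by matching pixels between the two frames first, which is the hard part; the formula itself is the easy step at the end.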
 

brickm

macrumors newbie
Sep 21, 2016
2
0
I wonder if this will evolve to include manual control over the depth map through some UI. Then you could change the focal point and have a sort of digital ability to rack focus.
 

Kevin2055

macrumors 6502
Sep 22, 2015
393
537
It does need two lenses to draw a depth map. Close one eye and you won't judge distance as well as when both eyes are open; the same goes for a camera.

 

a.gomez

macrumors 6502a
Oct 10, 2008
924
726
There you go: get a smartphone that "mimics" a high-end DSLR, eat McDonald's, pack your bag with crayons as art supplies, and you're set. Next stop, mimic-ville, home of the lowest common denominator.

Or just have some self-respect and get an A99 II, eat at Bouley, and learn how to paint with oils :rolleyes:

Now that we're done with all this iOS/mobile garbage, can we get back to Macs/OS X? Not sure how much more "Pro"-labeled crap that "mimics" something else we need to live through before getting an actual creative tool from Apple.
 

WinstonRumfoord

macrumors 6502
Mar 27, 2014
482
1,174
Does it matter what they're calling it? They've already shown many examples of what it will look like. Judge based on that, not on what they're calling the software blur.

Because in software (Photoshop, at least) Gaussian blur is VERY different from Lens Blur or Field Blur. It's curious that they chose the nomenclature of a very dated blurring method.
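For anyone curious about the distinction being drawn here, the two blurs differ in kernel shape: a Gaussian falls off smoothly, while a lens-style blur is a uniform disc that mimics a circular aperture, which is why it preserves hard-edged bokeh circles. A minimal numpy illustration of the kernels only; this is not how Photoshop or Apple implements either filter:

```python
import numpy as np

def gaussian_kernel(radius, sigma):
    # Gaussian blur: smooth falloff, so point highlights smear
    # into soft blobs rather than crisp discs.
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def disc_kernel(radius):
    # "Lens"-style blur: uniform disc of equal weights, mimicking
    # a circular aperture, keeping hard-edged bokeh circles.
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    k = ((xx**2 + yy**2) <= radius**2).astype(float)
    return k / k.sum()

g, d = gaussian_kernel(5, 2.0), disc_kernel(5)
print("gaussian center/edge weight ratio:", g[5, 5] / g[5, 0])
print("disc center/edge weight ratio:   ", d[5, 5] / d[5, 0])
```

Convolving an image with the disc kernel turns blown-out highlights into circles; the Gaussian turns them into fuzzy glows, which is the "dated" look being criticized.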
 

macduke

macrumors G5
Jun 27, 2007
13,152
19,722
I'm surprised Apple got this beta out so quickly. Does anyone know if a sample gallery has been posted anywhere yet?
 

GQB

macrumors 65816
Sep 26, 2007
1,196
109
Is there a way to turn it off? What if I don't want the blurred background??????? Please tell me there's a way to shut it off.

Oh my god. You're kidding, right?

"Portrait mode is a new feature in the camera app that can be found alongside other video and photo-taking options like 'Video' and 'Panorama.'"
 

sniffies

macrumors 603
Jul 31, 2005
5,646
14,853
somewhere warm, dark, and cozy
The A99 II is so passé. GFX ftw. :p
 

fiveainone

macrumors 6502a
Sep 16, 2011
761
76
Androids have been blurring backgrounds for a while now. How does having a telephoto make the bokeh better? Or does it not? Does it separate the foreground and background better because it can detect depth better?
 

kingpushup

macrumors regular
Jun 24, 2013
222
234
Could Apple software simulate bokeh shapes in the future via the 7+?

I could see this being VERY popular with Instagrammers. Heart-shaped dots akin to the Apple event invitation.
 

Krevnik

macrumors 601
Sep 8, 2003
4,100
1,309
Yeah... that's probably the case.

Which is sad... since there are two cameras with two lenses and two sensors.

Sure we're getting two focal lengths and a faux-blurred background effect.

But I was hoping they could do some magic with improving the image quality by combining two image captures into one.

The issue is one of registration: aligning the two photos before you can combine them. Having two focal lengths makes that a lot more difficult, and a lot less useful, since the noise doesn't map correctly between the two frames. The end result is that you can make the noise worse rather than better. There are techniques that could be applied here, but there are a lot of bad trade-offs involved, to the point where it doesn't really help in many real low-light photography situations (portraits, scenes with motion, etc.).

Same here. Noise, and the intense noise reduction, is still an issue once you start zooming in on iPhone photos. It'd be great if they could iron some of that out.

The problem there is that you need to address the physics of it to make noise better. There are basically two categories of noise:

1) Shot Noise. The light you are trying to capture isn't perfectly uniform, so you get randomness in your signal that you capture.
2) Sensor Noise. This is erroneous signal generated by the sensor itself. This has been broken down into different categories, especially in astrophotography, where a lot of work needs to be done to weed it out.

The catch here is that shot noise can be a very big part of why your images are so noisy. Shooting faster and using a higher ISO (on cameras where you have that control) drive the noise up, since you are collecting fewer photons, so the randomness in how many photons strike a particular pixel over a given period becomes more pronounced. And really, the only way to address it is to capture more photons and reduce that variability. How do you do that? Shoot at a lower ISO, use longer exposure times, and use bigger pixels. Things like BSI sensors in phones are such a big deal because they let the individual pixels get bigger, as all the circuitry now sits behind the photosites rather than on the sensor surface that's also trying to collect light. But we then used that to cram more pixels onto the sensor, negating the benefit.

Not to mention that a lot of the easy improvements on the sensor-noise front are already done, and there are hard physical limits to what you can do about shot noise if you are unwilling to make the sensor itself bigger or put fewer pixels on it. Shot noise is a big reason why cameras with bigger sensors will always pull ahead in image quality over smaller sensors, assuming similar generations of technology are used in each to maximize the surface area of the pixels and minimize sensor noise for both.
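The shot-noise point above can be demonstrated numerically: photon arrivals are Poisson-distributed, so the standard deviation of a count of N photons is sqrt(N), and SNR grows only as sqrt(N). A toy simulation, not a model of any real sensor:

```python
import numpy as np

# Shot noise is Poisson: capturing N photons on average gives a
# standard deviation of sqrt(N), so SNR = N / sqrt(N) = sqrt(N).
# Bigger pixels / longer exposures collect more photons -> higher SNR.
rng = np.random.default_rng(42)
for mean_photons in (100, 1_000, 10_000):
    samples = rng.poisson(mean_photons, size=200_000)
    snr = samples.mean() / samples.std()
    print(f"{mean_photons:6d} photons: SNR ~= {snr:6.1f} "
          f"(theory {mean_photons ** 0.5:6.1f})")
```

Collecting 100x more photons buys only a 10x SNR improvement, which is why this noise floor is so hard to engineer around without a physically larger pixel.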
 

shanson27

macrumors 68020
Nov 27, 2011
2,199
20,648
[Attached: IMG_2017.JPG]
Looks great
 

kingpushup

macrumors regular
Jun 24, 2013
222
234
Sorry... let me explain. I was talking about another topic not related to the blurred background effect.

It's possible to combine two or more images to reduce noise. Other phones use one camera to capture a color image and the other to capture black-and-white to increase detail in the final image.

Apple's implementation uses both cameras to do depth mapping for the faux-blurred effect. Which is fine.

But I was hoping it would use both cameras to improve general image quality. I wonder if other camera apps can do it?
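The simplest version of combining captures to reduce noise is frame averaging: averaging N aligned frames cuts zero-mean noise by sqrt(N). A toy numpy sketch of that idea, which assumes the frames are already perfectly registered (the hard part with two different focal lengths, as noted earlier in the thread):

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.uniform(0, 255, size=(64, 64))      # the "real" scene

def noisy_frame():
    # Each capture = scene + zero-mean Gaussian read noise.
    return truth + rng.normal(0, 10.0, size=truth.shape)

single = noisy_frame()
stacked = np.mean([noisy_frame() for _ in range(16)], axis=0)

# Averaging 16 frames should cut the noise roughly 4x (sqrt(16)).
print("single-frame RMS error:", np.sqrt(np.mean((single - truth) ** 2)))
print("16-frame    RMS error:", np.sqrt(np.mean((stacked - truth) ** 2)))
```

With two mismatched lenses the frames can't simply be averaged like this; any misregistration blurs detail instead of suppressing noise, which is the trade-off described above.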

The Apple event noted that the dual camera improves color mapping, sharpness, low-light pics, etc., if I recall correctly.
 