
Attachments

  • JW.jpg
  • Street Portrait.jpg
  • SP.jpg
Here's how the underlying iPhone 7+ "portrait mode" works.

What I'd like to know is how point source lights in the distance render. For sure I'm expecting too much to get the effect that I'm used to in my street photos below, but I would like to see some tests with that in mind.

That's exactly what I want to see as well. Hopefully my 7+ arrives tomorrow and I can do some comparisons.
 

Excellent - looking forward to your results! Still on the fence till I see more photos from photographers. And I'm not particularly in a hurry to buy.
 
The phablet is too big for me.


Once you get a plus model and start using it, you'll probably never want to go back to a smaller phone. You get used to the size in no time and the benefits of the larger screen can be felt almost every time you use the phone. In saying that, if they get rid of some of the bezel for next year, I'll be a happy camper.
 


I had the Plus for 2 days, then returned it. It's cartoonishly huge.
 
You are aware the aperture on the iPhone is f1.8, which is pretty dang good. You aren't going to get that in any kit lens; you're going to have to spend some money on a new lens. Granted, you can pick up f1.8 50mm lenses for pretty dang cheap, but you know the old adage: you get what you pay for.
You are aware that the iPhone's lens is physically much, much smaller, right? And that f1.8 on an iPhone camera is not the same thing as f1.8 on a larger lens? I guess you weren't or you wouldn't have posted that irrelevant comparison.

The conversation here is about depth of field. Go ahead, take a normal photo with your iPhone and tell me how much the background is out of focus. And yeah, even a "kit lens" included with a regular old off-the-shelf DSLR has a physically much larger maximum aperture than any iPhone lens.
 
Intrigued to see what these beta users think of it, and what photographers who are in the beta think as well.
I like it, but it's totally faux-keh.

The effect is passable and I do like it, but I'm also slightly disappointed.
+ Does an okay job at separating subject from background
+ Works on People AND Objects
+ Nice effect to have built into a phone
+ Very clever use of dual camera system
- Lots of ugly artifacts that don't exist in the original photo
- Nagging instructions (Too far. Back up. Wait for it....)
- Several shots done properly ended up not having depth effect
- Requires very good lighting for the effect to look right

Note: I understand it's a beta, so no need to remind me. It's just that so many times we hear "Just wait, Apple's not done with it," but months or years (or a decade, eh, Time Machine) later nothing has changed. Maybe this will get better, but maybe this "Extra Credit" team has disbanded. You never know with Apple.
 
But the S7 isn't doing anything special. It's just taking multiple pics with different focus points, and it doesn't do it very fast. If you were trying to take the same pic of the little girl with the S7, she would probably have moved by the time the S7 finished taking all its pics.

So you're saying the advantage with the two cameras is faster captures? I don't remember the S7 taking long to do that. Genuinely curious here. Do you have an iPhone 7 yourself?
 
So many armchair photographers here. Same armchair QBs that are out on Sundays I suppose.
 

I was just commenting on the people remarking that the S7 did this first. But really, all the S7 is doing is taking multiple pics of the scene with different focus points.
Just watch this YouTube video (start at 1:29).
It's not fast at all, and any movement during the pics will be captured.
Might as well just take 2 pics, one focused on the foreground and one on the background.

The iPhone is obviously faster, as it's only taking one pic, plus in my opinion a 56mm focal length is much better for portraits.
 
Androids have been blurring backgrounds for a while now. How does having a telephoto make the bokeh better? Or does it not? Does it separate the foreground and background better because it can detect the depth better?

It makes it better by allowing Apple to have yet another feature/justification to upsell the 7+ over the 7 :)

Theoretically having both focal lengths allows image processing logic to help differentiate the foreground from the background, allowing it to apply the software blur filter more precisely. In practice the difference will be small and so again Apple is just using it as a way to upsell the 7+.
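To make that depth idea concrete: nothing in this thread describes Apple's actual pipeline, but the general trick of using two slightly offset views to tell foreground from background can be sketched with a basic block-matching disparity map. A rough, hypothetical example using OpenCV (the filenames and threshold are made up, and in reality the two frames would first have to be cropped and rescaled to matching fields of view):

Code:
# NOT Apple's pipeline -- just a toy illustration of using two offset views
# to get a rough depth (disparity) map and a crude foreground mask.
import cv2
import numpy as np

# Hypothetical input frames from the two cameras, already matched in
# field of view and converted to grayscale.
left = cv2.imread("wide.jpg", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("tele.jpg", cv2.IMREAD_GRAYSCALE)

# Block matching: nearer objects shift more between the two views,
# so a larger disparity value means "closer to the camera".
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Crude foreground mask: everything closer than a hand-picked threshold.
foreground_mask = (disparity > 20.0).astype(np.uint8) * 255
cv2.imwrite("foreground_mask.png", foreground_mask)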
 
Androids have been blurring backgrounds for a while now. How does having a telephoto make the bokeh better? Or does it not? Does it separate the foreground and background better because it can detect the depth better?

The 56mm camera identifies your foreground subject. Simultaneously, the 28mm camera quickly grabs nine shots of the background at different focus distances, and a gradient blur is then applied, resulting in a composite background whose blur varies with distance, simulating bokeh across the background. Pretty clever!
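If that description is roughly right, the final compositing step amounts to choosing a stronger blur the farther away a pixel is. A toy sketch of that "gradient blur" idea (again, not Apple's code; the depth map and blur strengths are invented):

Code:
# Toy "gradient blur": pick a blur strength per pixel from a depth map,
# leaving the subject sharp and blurring more with distance.
# Not Apple's implementation; depth values and sigmas are invented.
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_bokeh(image, depth, max_sigma=12.0, levels=5):
    """image: HxWx3 float array; depth: HxW array, 0.0 = subject, 1.0 = far background."""
    sigmas = np.linspace(0.0, max_sigma, levels)
    # Pre-blur the whole frame at each strength, per colour channel.
    stack = np.stack([
        np.dstack([gaussian_filter(image[..., c], s) for c in range(3)])
        for s in sigmas
    ])  # shape: (levels, H, W, 3)
    # Choose a blur level per pixel from its depth.
    idx = np.clip(np.rint(depth * (levels - 1)).astype(int), 0, levels - 1)
    out = np.empty_like(image)
    for level in range(levels):
        mask = idx == level
        out[mask] = stack[level][mask]
    return out

A real implementation would blend smoothly between levels rather than snapping to the nearest one, which is part of why hair and edges tend to show the halo artifacts mentioned earlier in the thread.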
 
Androids have been blurring backgrounds for a while now. How does having a telephoto make the bokeh better? Or does it not? Does it separate the foreground and background better because it can detect the depth better?

Yes.
 
Probably you should discard the Mac and OS X, since they are used to mimic the typewriter, the Moviola, optical film printing tables, and reel-to-reel music recording studios - amongst other old technologies.

Looking at pictures of Bouley on the web - it looks awful, a cheap disney-esque representation of a European restaurant. Please don't eat there, please move to France and eat only at Thoumieux or L'Astrance.
Furthermore, your digital Sony camera is just mimicking a "real 35mm SLR", and I expect many of us didn't realize that "learning to paint in oils" was the official definition of "having self-respect".

How far back do you want to go to escape the mimicry and the lowest common denominator?
You could request Apple to design a pro app for your specific needs. It would 3D print some tools from the stone age, and create daily schedules for you to go about the worthy business of hunting & gathering - an honest day's work before all this modern stuff happened and polluted everything.

I never knew a computer was just meant to mimic a typewriter; that seems like a silly under-representation of what a computer was actually intended to be. Well, at least it is a much superior typewriter in every possible way - which isn't really the case for what I was comparing.

Nice that you saw it on the web - you should actually visit it before making up your mind. Well, what a traditionalist you turned out to be - I'm more into exploring mom-and-pops in the 9th myself when in Paris, but to each their own. Perhaps Maison Harlem is more your taste - started by a very nice ex Alvin Ailey dancer.

And yet another example of one thing that manages to match or exceed its predecessor (you seem to want to make my point)... You must be a Brahms fan. "Crayons to oils" was the actual line - my angle was more about the progression; you're a bit too stuck on the destination. If you do not like oils - let's say some Unison Pastels.

Your examples are more like Rand to Beirut. Not just a mimic - but an actual progression (or at minimum a more convenient equal). A silly fixed double lens on a phone and "pro" items running iOS are not a progression from an SLR or a Mac - they are just an inferior imitation.

I did like your last part - except I consider "Pro App" to be a bit of an oxymoron :rolleyes:
 
I have found that the strength and power of a photograph rarely correlates or has much to do with the kind or cost of the camera used.

Rather, it has everything to do with the photographer. And his/her life experiences, curiosity, imagination, ability to see, ability to read light, and the ability to take what is before him/her and craft a compelling composition, ideally releasing narrative and stimulating a viewer's mind. I could go on, but I'll stop there.

True - but I feel the possibilities/freedom of the tool you choose serve not to limit the photographer, painter, cartoonist, or designer, but to let "his/her life experiences, curiosity, imagination, ability to see" shine more clearly - and perhaps, in the complexity that tool offers, capture something they never thought they could.

People should not draw with crayons because they are cheap - but because they are limiting. Regardless, good point.

3 good counters tonight - I guess not all is lost.
 
You can apply all the filters in the world. You still don't have phase-detection autofocus.

While that's true... I've never had a problem with iPhone auto-focus. I tap where I want it to focus... and it does.

My problems are sometimes with general image quality, low-light performance, etc.

But not auto-focus.
 
Androids have been blurring backgrounds for a while now. How does having a telephoto make the bokeh better? Or does it not? Does it separate the foreground and background better because it can detect the depth better?


Great in-depth and knowledgeable response. If you have the facts, share them. If you don't, don't make yourself sound like you do.

After reading this thread and doing some visual research, I can answer my own question - no, it doesn't separate the background better. The halo-blur around the subject looks terrible. You can do creamy bokeh with either a 28mm or a 56mm focal length on an SLR, so that doesn't make either one better or worse.

From the examples posted, and this video, the only advantage over existing methods is live preview and faster captures. Which are both cool in their own right - but this thread has been nothing but misinformed, biased comments about the image quality owed to the two cameras. There is no difference. Someone prove me wrong by showing side-by-side examples. My earlier post, with the iPhone 7 blurring out the hair too much vs. the S7's nicer blur, already shows it might actually be worse.
 
The issue is one of registration: the two photos have to be aligned before you can combine them. Having two focal lengths makes that a lot more difficult, and a lot less useful, since the noise doesn't map correctly between the two frames. The end result is that you can make the noise worse rather than better. There are techniques that could be applied here, but there are a lot of bad trade-offs involved, to the point where it doesn't really help in many real low-light photography situations (portraits, scenes with motion, etc.).



The problem there is that you need to address the physics of it to make noise better. There are basically two categories of noise:

1) Shot Noise. The light you are trying to capture isn't perfectly uniform, so you get randomness in your signal that you capture.
2) Sensor Noise. This is erroneous signal generated by the sensor itself. This has been broken down into different categories, especially in astrophotography, where a lot of work needs to be done to weed it out.

The catch here is that shot noise can be a very big part of why your images are so noisy. Shooting faster and using a higher ISO (on cameras where you have that control) drive the noise up, since you are collecting fewer photons, so the randomness in how many photons strike a particular pixel over a given period becomes more pronounced. And really, the only way to address it is to capture more photons and reduce that variability. How do you do that? Shoot at a lower ISO, use longer exposure times, and use bigger pixels. BSI sensors in phones are such a big deal because they let the individual pixels get bigger, as all the circuitry now sits behind the photosites rather than on the surface of the sensor that's also trying to collect light. But we then used that to cram more pixels onto the sensor, negating the benefit.

Not to mention, a lot of the easy improvements on the sensor-noise front have already been made, and there are hard physical limits to what you can do about shot noise if you are unwilling to make the sensor itself bigger or put fewer pixels on it. Shot noise is a big reason why cameras with bigger sensors will always pull ahead in IQ over smaller ones, assuming similar generations of technology are used in each to maximize the surface area of the pixels and minimize sensor noise for both.
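To put a rough number on the shot-noise point: photon arrivals are approximately Poisson, so the signal-to-noise ratio only grows with the square root of the photons you collect. A quick simulation (the photon counts are made up, purely to show the scaling from a tiny, high-ISO pixel to a big, low-ISO one):

Code:
# Shot-noise scaling: SNR ~ sqrt(number of photons collected).
# Photon counts below are invented, only to illustrate the trend.
import numpy as np

rng = np.random.default_rng(0)
for mean_photons in (10, 100, 1_000, 10_000):
    # Simulated photon count per exposure for a single pixel.
    samples = rng.poisson(mean_photons, size=100_000)
    snr = samples.mean() / samples.std()
    print(f"{mean_photons:>6} photons/pixel -> SNR ~ {snr:6.1f} (sqrt(N) = {np.sqrt(mean_photons):6.1f})")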

Yes to everything you just said.

On an unrelated note, Ergo Proxy.
 
Yes, it's an f1.8 lens, which will gather the same amount of light as a full-frame f1.8 prime lens. But when you are comparing the lenses for DOF, the iPhone's second camera has an equivalent aperture of around f16.

No...no...no. That's not what f-stop means.

It's the ratio between the focal length of the lens and the diameter of the opening in the lens (the entrance pupil). For a given lens opening, if you double the focal length, you double the f-stop number. For a given focal length, if you double the diameter of the opening, you halve the f-stop number, and because you quadruple the cross-sectional area, four times as much light gets through.

So a very, very short focal length at f1.8, like the iPhone's, lets in much, much less total light than a 50mm f1.8 prime lens would.

Then again, the light a 50mm f1.8 gathers focused down to the size of the iPhone sensor would burn the sensor like ants under a magnifying glass. And to gather the same amount of light would require the same diameter of the lens furthest from the sensor (basic physics). So what you said utterly fails the common sense test.
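For some back-of-the-envelope numbers on that point (the iPhone focal lengths below are rough figures, not official specs):

Code:
# f-number N = f / D, so the entrance pupil diameter is D = f / N, and the
# light-gathering area scales with D^2. The iPhone focal lengths here are
# approximate, not official specs.
import math

lenses = [
    ("iPhone 7+ wide, ~4mm f/1.8", 4.0, 1.8),
    ("iPhone 7+ tele, ~6.6mm f/2.8", 6.6, 2.8),
    ("50mm f/1.8 full-frame prime", 50.0, 1.8),
]

for name, focal_mm, n in lenses:
    d = focal_mm / n                   # entrance pupil diameter, mm
    area = math.pi * (d / 2.0) ** 2    # pupil area, mm^2
    print(f"{name}: pupil {d:5.1f} mm, area {area:7.1f} mm^2")

# The 50mm prime's pupil (~27.8 mm) dwarfs the iPhone wide camera's (~2.2 mm),
# roughly a 150x difference in area, hence far more total light and far
# shallower depth of field.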
 