It's built in and accessible to all iPhone XS and XS Max users, so millions more can make use of it without needing to download and launch a separate app. They can access it right from the lock screen, and it launches faster.
It'd be sad if a whole generation grows up thinking these deformed images are normal.

That's the downside of having it built-in and provided with Apple's blessing: Some will genuinely think it's okay. And you know, you can't really knock them, because they've never experienced better.
Just here for the kitty. :)
And a fine kitty it is too. Pity about the last photo.
 
I think The Verge did a review of this feature last week, and they also included a shot taken with a basic DSLR to compare what the software in the iPhone does against a real camera's bokeh (the out-of-focus area in the background when shooting with a wide-open lens). It wasn't even close; I can't imagine anyone using this feature on a phone for serious photography. I do think that in a few years the engineers will have a feature that really works, but that's not the current reality; for now it's really just a gimmick. Gotta start somewhere, though!

Right, Verge, the lying cherry picking Patel project huh, I so "trust" them...
If you're talking about the photo in this article, it's a sloppy comparison. Their iPhone photo isn't even in focus. I recommend looking at other sites for a better depiction of what the feature can do and what its limitations are.

The Verge are habitual liars and biased reviewers when it comes to Apple... It's not even funny these days.
They're a clickbait circus.
 
IMG_0815.jpg
I shot this with a 7 Plus. If you have the right lighting and distance, you can get good shots. I can only imagine even better results with the XS. Still, this kind of quality is crazy from a damn phone. It destroys point-and-shoot digital cameras from the last decade with much larger lenses.
The effect is pretty cool, and basically - you get to decide if you like them or not. That's the beauty here.

Beautiful dog and nice shot! These cameras are getting better and better!
 
It's funny that you cling to the notion that simulated bokeh is not real, yet refuse to accept that a two-dimensional photographic print of a three-dimensional space is similarly not real.

Try as I might, I have personally yet to see a ruler up close or a line of dominoes that resembles a print made from my camera and wide-aperture lens and the resulting shallow DOF. Maybe your eyes' apertures are abnormally huge, like some forest creature's at night.

With regard to photography being real or not, you may want to study the history of photography a bit.
You miss the point: bokeh, while not an effect immediately seen by the eyes, is nonetheless a physical effect of aperture size. It blurs things differently depending on distance, which is why Apple's version looks unnatural.
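As a rough sketch of that distance dependence, the standard thin-lens blur-circle formula fits in a few lines (the 50mm f/1.4 and the distances below are illustrative numbers, not any particular camera's):

```python
# Diameter of the blur disc (circle of confusion) on the sensor for a point at
# distance `point_mm` when the lens is focused at `focus_mm` -- thin-lens model.
def blur_disc_mm(f_mm, f_number, focus_mm, point_mm):
    """f_mm: focal length; f_number: aperture N; distances measured from the lens, in mm."""
    return (f_mm ** 2 / f_number) * abs(point_mm - focus_mm) / (point_mm * (focus_mm - f_mm))

# A 50mm f/1.4 focused at 2 m: the blur keeps growing as the background recedes.
for d in (2500, 5000, 10000):
    print(d, round(blur_disc_mm(50, 1.4, 2000, d), 3))
```

That steadily growing blur with distance is exactly the depth cue a single uniform software blur cannot reproduce.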
 
You miss the point: bokeh, while not an effect immediately seen by the eyes, is nonetheless a physical effect of aperture size. It blurs things differently depending on distance, which is why Apple's version looks unnatural.

It still looks good and puts all of the attention on the subject. Much better than portrait shots with deep depth of field.
 
You miss the point: bokeh, while not an effect immediately seen by the eyes, is nonetheless a physical effect of aperture size. It blurs things differently depending on distance, which is why Apple's version looks unnatural.

You're saying nothing new and intentionally/disingenuously missing the point, my original one: a failure to understand Apple's iPhone customer base. And the reason why some effects, such as shallow DOF, are simulated: it allows casual snappers a degree of artistic expression that would otherwise require an expensive camera and lens that won't fit in your pocket.

And...that while bokeh from a wide aperture lens is obviously a function of aperture size, it is something no one will ever see looking at the same scene with one's eyes. The same is true of making B&W prints (unless you are 100% color blind).

I shoot a lot with a 35mm f/1.4 lens wide open. I have yet to personally observe the level of shallow DOF it achieves looking at the same scene with my eyes. Not even close. But then I understand that photography is hardly real and most often reflects biases and decisions made by the photographer at exposure time.
 
View attachment 789784
I shot this with a 7 Plus.
The left-hand side of the cat's face/head is enormously distracting: the whole edge is crudely "smudged". There are blurry areas of hair immediately adjacent to others that are pin-sharp. It's crude butchery of a cute cat picture.

The ear on the left is just awful. It gets worse the more you look at it. A good photograph is about the subject, not post-processing goofs.
 
You're saying nothing new and intentionally/disingenuously missing the point, my original one: a failure to understand Apple's iPhone customer base. And the reason why some effects, such as shallow DOF, are simulated: it allows casual snappers a degree of artistic expression that would otherwise require an expensive camera and lens that won't fit in your pocket.

And...that while bokeh from a wide aperture lens is obviously a function of aperture size, it is something no one will ever see looking at the same scene with one's eyes. The same is true of making B&W prints (unless you are 100% color blind).

I shoot a lot with a 35mm f/1.4 lens wide open. I have yet to personally observe the level of shallow DOF it achieves looking at the same scene with my eyes. Not even close. But then I understand that photography is hardly real and most often reflects biases and decisions made by the photographer at exposure time.

Hi. Reading this thread, I'm sure you are a creative photographer. Photography does not defy physics in any way, though; it's just your perception/understanding of how physics works, and that is fine.

I have some amazing pics of the aurora. Just because I did not see the same image through the viewfinder does not mean physics was defied in any way; from human to human, we will see different interpretations of the same scene. But when you actually use software to simulate what the camera is seeing, that is a slippery slope: it doesn't push people to learn the art of photography, but makes decisions for them based on developers' algorithms, which, looking at these pictures, are poor.

I have a number of lenses that range from f/0.95 to f/1.2; one of the main reasons is the realistic and amazing bokeh they produce.

I understand where you are coming from, but we photographers, and everyone else, are bound by physics daily. How we individually understand physics is individual perception.

That last image of the cat is not realistic, plain and simple. If you are happy with images manipulated in this manner, good; I am sure millions will be. I personally want to control that myself, and not have software make the decisions for me.
 
I think the picture of the cat was a bad example of when to use the fake bokeh. It would look OK if taken with a DSLR, but on an iPhone the fake bokeh can't really do a great job in that scenario, as the cat is really furry and there isn't good contrast with the background.

Portrait mode is really hit or miss: sometimes you have a shot that looks pretty good on the iPhone or iPad, then you export the pic, view it on a big screen, and it sucks.
I'm attaching a couple of examples taken with my 8+ (still on iOS 11 at the time). I think that apart from some weird stuff happening near the edges of the subject (the nose of the dog is cut off), the fake bokeh is doing a good job: the background is blurred and there is a clean transition between the subject and the rest of the picture.
iOS 12 has improved the effect a little on my 8+, and I'll soon have an Xs and hope I can get even better pictures with it.
I really love my DSLR, but I can't have it with me all the time; in fact I can only seldom bring my camera, as I have a lot of stuff to carry when I go out with my daughter. I'd say 1 out of 3 portrait shots I take are pretty good, and as I didn't have the Nikon with me I'd have missed those moments, so it's better than nothing.
 

Attachments

  • IMG_2359.jpg
  • IMG_5585.jpg
You're saying nothing new and intentionally/disingenuously missing the point, my original one: a failure to understand Apple's iPhone customer base.
I am honestly not being disingenuous.
And the reason why some effects, such as shallow DOF, are simulated: it allows casual snappers a degree of artistic expression that would otherwise require an expensive camera and lens that won't fit in your pocket.
I completely agree that it allows artistic expression that would not be possible without an expensive/pocketable solution.
But I disagree that a shallow depth of field is being simulated. It is not. Depth of field, by definition, shows depth in an image through differing degrees of blur at differing distances.

Apple's current implementation does not do this but simply blurs the background evenly in most cases. That removes information from the image, which confuses the brain. Or at least it does mine.
Some images are better than others. But the higher the blur, the worse the image is.

There are many posts pointing this out all over the web.
And...that while bokeh from a wide aperture lens is obviously a function of aperture size, it is something no one will ever see looking at the same scene with one's eyes.
I don't disagree with that, but if you take all depth out of an image, that is not artistic expression but butchering a photo*. (*Depending on the photo: portraits with a background at the same angle as the sensor will probably not look too bad, so that's a great use of this feature.)
The same is true making B&W prints (unless you are 100% color blind).
Colour blindness has nothing to do with black and white.
I shoot a lot with a 35mm f/1.4 lens wide open. I have yet to personally observe the level of shallow DOF it achieves looking at the same scene with my eyes. Not even close.
I agree
But then I understand that photography is hardly real and most often reflects biases and decisions made by the photographer at exposure time.
At the end of the day, Apple's implementation will get better, and I predict it will almost replicate real glass.
But to do this Apple will either need to implement focus stacking, or use two lenses of the same focal length spaced the correct distance apart to extract depth information from the scene. Focus stacking will not replicate real glass, but could get close. Dual lenses would most likely replicate real glass.
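The dual-lens route boils down to standard stereo triangulation: depth = focal length × baseline / disparity. A minimal sketch, with made-up numbers for a hypothetical phone (the 1500 px focal length and 10 mm baseline are purely illustrative):

```python
# Standard stereo triangulation: two lenses a known baseline apart see the same
# point shifted by a disparity; depth falls out directly from that shift.
def depth_mm(focal_px, baseline_mm, disparity_px):
    """Points that shift less between the two views are farther from the camera."""
    return focal_px * baseline_mm / disparity_px

# Nearby points produce large disparities, distant points tiny ones.
for disp_px in (15.0, 7.5, 1.5):
    print(disp_px, depth_mm(1500, 10, disp_px))
```

Note the 1/disparity relationship: depth resolution collapses for distant objects, which is one reason phone depth maps tend to get coarse far behind the subject.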

TL;DR
I don't disagree with most of what you say, just that Apple's constant-blur implementation removes depth information from the image, which confuses the brain.
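The constant-blur vs. distance-dependent-blur distinction can be made concrete with a toy depth map (all numbers are illustrative; this is a sketch of the idea, not Apple's actual algorithm):

```python
# Blur radius per pixel from a depth map (distances in mm), contrasting a single
# uniform background blur with a physically motivated, distance-dependent one.
def coc_radius(depth_mm, focus_mm=2000, f_mm=50, f_number=1.4):
    # Thin-lens circle-of-confusion radius on the sensor, in mm.
    return (f_mm ** 2 / (2 * f_number)) * abs(depth_mm - focus_mm) / (depth_mm * (focus_mm - f_mm))

depth_map = [2000, 3000, 6000, 12000]   # subject, near wall, far wall, trees

uniform  = [0.0 if d == 2000 else 0.4 for d in depth_map]  # one blur for everything behind the subject
physical = [round(coc_radius(d), 3) for d in depth_map]    # blur grows with distance

print(uniform)   # every background pixel gets the same radius: depth cue lost
print(physical)  # radii increase with distance, preserving the depth cue
```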
It still looks good and puts all of the attention on the subject. Much better than portrait shots with deep depth of field.

In portrait shots where the background is parallel to the sensor, yes, I would agree that this is a great use of Apple's implementation. See the post above for where I am coming from and where I think Apple will eventually nail it.
 
I am honestly not being disingenuous.

I completely agree that it allows artistic expression that would not be possible without an expensive/pocketable solution.
But I disagree that a shallow depth of field is being simulated. It is not. Depth of field, by definition, shows depth in an image through differing degrees of blur at differing distances.

Apple's current implementation does not do this but simply blurs the background evenly in most cases. That removes information from the image, which confuses the brain. Or at least it does mine.
Some images are better than others. But the higher the blur, the worse the image is.

There are many posts pointing this out all over the web.

I don't disagree with that, but if you take all depth out of an image, that is not artistic expression but butchering a photo*. (*Depending on the photo: portraits with a background at the same angle as the sensor will probably not look too bad, so that's a great use of this feature.)

Colour blindness has nothing to do with black and white.

I agree

At the end of the day, Apple's implementation will get better, and I predict it will almost replicate real glass.
But to do this Apple will either need to implement focus stacking, or use two lenses of the same focal length spaced the correct distance apart to extract depth information from the scene. Focus stacking will not replicate real glass, but could get close. Dual lenses would most likely replicate real glass.

TL;DR
I don't disagree with most of what you say, just that Apple's constant-blur implementation removes depth information from the image, which confuses the brain.

In portrait shots where the background is parallel to the sensor, yes, I would agree that this is a great use of Apple's implementation. See the post above for where I am coming from and where I think Apple will eventually nail it.

You continue to ignore the fact that the feature works fine, for its intended purpose, for Apple's iPhone customer base, the majority of whom are not serious/professional photographers but casual snappers, with a few who may want to engage in a bit of artistic expression now and then.

Similarly, you are certainly free to engage in pedantry and expect it to produce the same result as a large-aperture lens, with accompanying indignation that it doesn't.
 
E71E89DB-6B8D-4EFE-856B-CE48F1BB0226.jpeg
The left-hand side of the cat's face/head is enormously distracting: the whole edge is crudely "smudged".

You’re right. I hadn’t noticed it. It is difficult to get proper separation around the edges. When I remove the portrait effect from the shot, it actually looks much, much better. Keep in mind that I’m not a pro photographer and have never taken any sort of classes.
 
You continue to ignore the fact that the feature works fine, for its intended purpose, for Apple's iPhone customer base, the majority of whom are not serious/professional photographers but casual snappers, with a few who may want to engage in a bit of artistic expression now and then.

Similarly, you are certainly free to engage in pedantry and expect it to produce the same result as a large-aperture lens, with accompanying indignation that it doesn't.

Can I ask how you can say the feature works fine for the intended customer base? Here we are discussing how the pictures look “fake”, and this is not a professional photographers’ site.

As much as you believe he is ignoring your point, you are also ignoring his.

Let’s see how this feature evolves; I hope it improves as time goes on. It looks like a beta release to me for now.
 
Why does it matter if it looks ‘fake’? I just like the fact that you’re able to actually control the amount of blur and highlight the actual subject in the photo; it really is an interesting addition to the iPhone camera, with manual control.

Yeah, it’s like any software manipulation of images. A nice feature for users, if they choose to use it; software bokeh is not new, it’s been around for ages. For the purists it looks like crap; for the average Joe, an awesome feature.
 
Can I ask how you can say the feature works fine for the intended customer base? Here we are discussing how the pictures look “fake”, and this is not a professional photographers’ site.

As much as you believe he is ignoring your point, you are also ignoring his.

Let’s see how this feature evolves; I hope it improves as time goes on. It looks like a beta release to me for now.

Sure. I think you said it best in the post right above (post #116): “For the purists it looks like crap; for the average Joe, an awesome feature.”

As an aside, I’ve often found some “purists” produce the most boring, ho-hum photos. Their heads are so wrapped up in having the best this or that, or the best specs, that in their quest for the best their photographs exhibit little power: perhaps technically accurate, but devoid of any emotion or heart.
 
Sure. I think you said it best in the post right above (post #116): “For the purists it looks like crap; for the average Joe, an awesome feature.”

As an aside, I’ve often found some “purists” produce the most boring, ho-hum photos. Their heads are so wrapped up in having the best this or that, or the best specs, that in their quest for the best their photographs exhibit little power: perhaps technically accurate, but devoid of any emotion or heart.

I know exactly what you mean. Equipment will never trump the creative skill of a good photographer.

One does not need the best of the best... though you do realise the irony of that statement when it comes to iPhones, right? :)
 
So you don't use Photoshop on your full frame images, then?

What does that even mean? Are you trying to catch me out?
I'll answer your question, but first: define 'full-frame image'. The full frame of what? Do you mean a whole image regardless of sensor size or device, or a 35mm-sized 'full-frame'-equivalent sensor? I'm assuming you mean the latter.

Yes, I sometimes use Photoshop on images that I make, regardless of the sensor size.
No, I don't apply depth-of-field adjustments (or blur) to pictures post-capture. Is that what you think Photoshop is for?

If you're implying that using digital processing tools is no different from this abhorrent, eye-scraping software 'depth of field' machine-learning AI BS, then you should probably bring yourself up to date with modern photographic practice. The assumption that using Photoshop is a shortcut or 'cheating' is about as 1997 as you can get in the context of this argument, and nowhere near relevant to what mobile cameras are having to do to try to beat the laws of physics.

I really don't get the sudden 'comeback' nature of your question. Do you want to elaborate?
 