It'd be sad if a whole generation grows up thinking these deformed images are normal.

It's built in and accessible to all iPhone XS and XS Max users. So basically millions more can make use of it without needing to download and launch a separate app. They can launch it right from the lock screen, and it starts faster.
And a fine kitty it is too. Pity about the last photo.

Just here for the kitty.
I think The Verge did a review of this feature last week, and they also included a shot taken by a basic DSLR to compare what the iPhone's software does with a real camera's bokeh (the out-of-focus area in the background when shooting with a wide-open lens). It wasn't even close; I can't imagine anyone using this feature on a phone for serious photography. I do think, though, that in a few years the engineers will have a feature that really works. But that is not the current reality; right now it's really just a gimmick. Gotta start somewhere though!
If you're talking about the photo in this article, it's a sloppy comparison. Their iPhone photo isn't even in focus. I recommend looking at other sites for a better depiction of what the feature can do and what its limitations are.
The effect is pretty cool, and basically you get to decide whether you like it or not. That's the beauty here.
You miss the point: bokeh, while not an effect immediately seen by the eyes, is nonetheless a physical effect of aperture size. It blurs things differently depending on distance, which is why Apple's version looks unnatural.

It's funny that you cling to the notion that simulated bokeh is not real, yet simultaneously reject that a two-dimensional photographic print of a three-dimensional space is similarly not real.
Try as I might, I have personally yet to see a ruler up close or a line of dominoes that resembles a print made from my camera and wide-aperture lens and the resulting shallow DOF. Maybe your eyes' apertures are abnormally huge, like some forest creature at night.
With regard to photography being real or not, you may want to study the history of photography a bit.
You miss the point: bokeh, while not an effect immediately seen by the eyes, is nonetheless a physical effect of aperture size. It blurs things differently depending on distance, which is why Apple's version looks unnatural.
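The physical claim here, that blur is a function of aperture size and grows with distance from the focal plane, can be sketched with the thin-lens circle-of-confusion formula. The lens and distances below (35mm f/1.4 focused at 2 m) are illustrative assumptions, not measurements:

```python
# Hedged sketch: circle-of-confusion (blur circle) diameter under the
# thin-lens approximation. All numeric values are illustrative assumptions.

def coc_diameter_mm(focal_mm, f_number, focus_dist_mm, subject_dist_mm):
    """Blur-circle diameter on the sensor for a point at subject_dist_mm
    when the lens is focused at focus_dist_mm (thin-lens model)."""
    aperture_mm = focal_mm / f_number
    return (aperture_mm * focal_mm * abs(subject_dist_mm - focus_dist_mm)
            / (subject_dist_mm * (focus_dist_mm - focal_mm)))

# 35mm f/1.4 focused at 2 m: blur grows the farther a point sits behind
# the subject, which is exactly the distance dependence being described.
for d_m in (2.0, 3.0, 5.0, 10.0):
    print(f"{d_m:>5.1f} m -> {coc_diameter_mm(35, 1.4, 2000, d_m * 1000):.3f} mm")
```

A point at the focus distance gets zero blur, and the diameter rises monotonically with distance behind it, which is why a flat, uniform blur reads as unnatural.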
The left hand side of the cat's face/head is enormously distracting: The whole edge is crudely "smudged". There are blurry areas of hair immediately adjacent to others that are pin-sharp. It's crude butchery of a cute cat picture.

View attachment 789784
I shot this with 7 Plus.
You're saying nothing new and intentionally/disingenuously missing my original point, which is that you fail to understand Apple's iPhone customer base, and the reason some effects, such as shallow DOF, are simulated. Simulation allows casual snappers a degree of artistic expression that would otherwise not be possible without an expensive camera and lens that could never fit in your pocket.
And...that while bokeh from a wide-aperture lens is obviously a function of aperture size, it is something no one will ever see looking at the same scene with one's eyes. The same is true of making B&W prints (unless you are 100% color blind).
I shoot a lot with a 35mm f/1.4 lens wide open. I have yet to personally observe the level of shallow DOF it achieves looking at the same scene with my eyes. Not even close. But then I understand that photography is hardly real and most often reflects biases and decisions made by the photographer at exposure time.
It still looks good and puts all of the attention on the subject. Much better than portrait shots with deep depth of field.
I am honestly not being disingenuous.
I completely agree that it allows artistic expression that would not be possible without an expensive/pocketable solution.
But I disagree that a shallow depth of field is being simulated. It is not. Depth of field, by definition, conveys depth in an image through differing degrees of blur at differing distances.
Apple's current implementation does not do this but simply blurs the background evenly in most cases. It is this that removes depth information from the image and confuses the brain. Or at least it does for me.
Some images are better than others. But the higher the blur, the worse the image is.
There are many posts pointing this out all over the web.
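The distinction being argued here, a single uniform background blur versus blur that scales with distance from the focal plane, can be sketched numerically. The toy depth map, focus distance, and scale factor below are all illustrative assumptions:

```python
# Hedged sketch: uniform background blur vs. depth-dependent blur.
# The depth map and numbers are toy assumptions for illustration only.
import numpy as np

depth_m = np.array([2.0, 2.0, 3.0, 5.0, 10.0])  # per-pixel depth map (toy)
focus_m = 2.0                                    # subject is in focus at 2 m

# "Constant blur": every pixel behind the subject gets the same radius,
# so a hedge at 3 m and a hill at 10 m look equally blurred.
uniform_radius = np.where(depth_m > focus_m, 8.0, 0.0)

# Depth-dependent blur: radius grows with distance from the focal plane,
# preserving the depth cue the post above says is being thrown away.
depth_radius = 8.0 * np.abs(depth_m - focus_m) / depth_m

print(uniform_radius)  # same radius for every background pixel
print(depth_radius)    # roughly [0, 0, 2.67, 4.8, 6.4]: blur encodes depth
```

In the uniform case the 3 m and 10 m pixels are indistinguishable, which is the flattening effect being complained about.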
I don't disagree with that, but if you take all depth out of an image, that is not artistic expression but butchering a photo.* (*Depending on the photo: portraits with a background parallel to the sensor will probably not look too bad, so that's a great use of this feature.)
Colour blindness has nothing to do with black and white.
I agree
At the end of the day, Apple's implementation will get better, and I predict it will come close to replicating real glass.
But to do this, Apple will either need to implement focus stacking or dual lenses of the same focal length, the correct distance apart, to extract the depth information in the image. Focus stacking will not replicate real glass, but could get close. Dual lenses would most likely replicate real glass.
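The dual-lens idea above can be sketched with basic stereo triangulation: two identical lenses a known baseline apart see each point with a pixel offset (disparity), and depth falls out of similar triangles. The focal length in pixels and the baseline below are illustrative assumptions:

```python
# Hedged sketch: stereo triangulation with two same-focal-length cameras.
# focal_px and baseline_mm are assumed values for illustration only.

def depth_mm(focal_px, baseline_mm, disparity_px):
    """Depth from disparity: nearer objects shift more between the views,
    so small disparities correspond to distant points."""
    return focal_px * baseline_mm / disparity_px

# Assumed rig: 2800 px focal length, 12 mm baseline.
for disp in (40.0, 20.0, 8.0):
    print(f"disparity {disp:>4.1f} px -> depth {depth_mm(2800, 12.0, disp):.0f} mm")
```

Because disparity shrinks as 1/depth, a wider baseline gives finer depth resolution for far objects, which is why the lens spacing ("the correct distance apart") matters for getting a usable depth map.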
TL;DR
I don't disagree with most of what you say, just that Apple's constant-blur implementation removes depth information from the image, which confuses the brain.
In portrait shots where the background is parallel to the sensor, yes, I would agree that this is a great use of Apple's implementation. See my post above for where I am coming from and where I think Apple will eventually nail it.
The left hand side of the cat's face/head is enormously distracting: The whole edge is crudely "smudged".
You continue to ignore the fact that the feature works fine for its intended purpose for Apple's iPhone customer base, the majority of whom are not serious/professional photographers, but casual snappers with a few who may want to engage in a bit of artistic expression now and then.
Similarly you are certainly free to engage in pedantry and expect that it should produce the same result as a large aperture lens, with accompanying indignation that it doesn't.
Why does it matter if it looks ‘fake’? I just like the fact that you’re able to actually control the amount of blur and highlight the actual subject in the photo. It really is an interesting addition to the iPhone camera with manual control.
Can I ask how you can say the feature works fine for the intended customer base? Here we are discussing how the pictures look “fake”, and this discussion is not on a professional photographers' site.
As much as you believe he is ignoring your point, you are also ignoring his.
Let’s see how this feature evolves; hopefully it improves as time goes on. It looks like a beta release to me for now.
Sure. I think you said it best in the post right above (post# 116): "For the purists it’s looks like crap, for average jo, awesome feature."
As an aside, I've often found some "purists" produce the most boring ho-hum photos. Their head is so wrapped up in having the best this or that, or best specs, that in their quest for the best, their photographs as a consequence exhibit little power. Perhaps technically accurate, but void of any emotion or heart.
So you don't use Photoshop on your full frame images, then?