It’s fine with sharp edges, but terrible with feathery/ill-defined edges. Kitty’s a great example of a subject where a DSLR is essential.

I guess professional pet photographers can breathe a sigh of relief.

Until next year, perhaps?
 
It looks awful.

I think The Verge did a review of this feature last week, and they also included a shot taken by a basic DSLR to compare what the software in the iPhone does with a real camera's bokeh (the out-of-focus area in the background when shooting with a wide-open lens). It wasn't even close; I can't imagine anyone using this feature on a phone for serious photography. I do think, though, that in a few years the engineers will have a feature that really works. But that is not the current reality; right now it's really just a gimmick. Gotta start somewhere though!
 
I agree. Much of the time it doesn't quite work... but it's *non-destructive*, so you can just remove it. You can't fix a focussing error in an SLR's f/1.4 image once it's taken.

This isn't fixing a focussing error either. It's just a built in slider for the kind of blur you could achieve with any basic image editor after the fact, which is also non-destructive.
 
The live-view post-processing of the shallow depth of field definitely isn't accurate enough, to me and maybe a few of you, to justify withholding this from other devices: at the very least the dual-camera 7 Plus, 8 Plus, and X, but hopefully the 7 and 8 too. The algorithm seems to do a very rough pass to give you an idea of the outcome; once the photo is taken, background processing does the real work and applies it to the photo. This is my first run with it (I'm using an XS), but that's what I'm gathering right now.

Edit: to clarify my comment about why it should be on other devices: those devices should be able to do the post-processing without a hitch, since it seems to be a background process. It might take a second longer on a 7-series, but I think it could pull it off no issue.
 

Attachments

  • 63A87BD3-5029-490A-B586-71C497968F57.jpeg (854.9 KB)
  • 04830F8B-EEE0-4789-9C04-61ED303396C9.jpeg (627.3 KB)
  • 86419F9E-FAE4-420E-8348-8A7EAF8D6052.jpeg (736.3 KB)
Has anyone seen the YouTube videos by Unbox Therapy where he shows how Apple has this weird filter on the iPhone XS Max that beautifies people's skin, with no way to turn it off? So very lame.
 
You will end up surrounded by a cloud of cat hairs forever. Your whole home will be covered in cat hairs, your car will be lined in cat hairs, your clothes will be covered in a protective and highly visible layer of cat hairs. I hope you don't have any interviews planned any time soon. They'll put a red line through your resumé (and say it's a cat hair).

The love and companionship that I would receive from having a cat in the family would make it worth it. The cherry on top would be being able to rub on the kitty's paw pillows.
 
It seems to me that the effect was done algorithmically in software after the images were captured. Look at the hair around the edges and the adjacent background, which drops out of focus almost immediately. It is not the same as true optical depth of field achieved by opening or closing the lens aperture, where the falloff would be gradual.

Would someone shoot a set of domino tiles lined up from foreground to background and play with it?

If that's the case, I consider this a gimmick. Any serious photographer will snicker at it.
Photog snobbery aside this is a nice feature.
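For anyone curious about the "gradual" optical falloff mentioned above: with a real lens, the blur disk (the circle of confusion) grows smoothly as a subject moves away from the focal plane, which is exactly what the domino-tile test would reveal. A minimal sketch using the standard thin-lens formula (the lens and distances below are just illustrative numbers, not anything from the article):

```python
def coc_diameter_mm(aperture_mm, focal_mm, focus_mm, subject_mm):
    """Blur-disk ("circle of confusion") diameter on the sensor, in mm,
    for a subject at subject_mm when the lens is focused at focus_mm.
    aperture_mm is the physical aperture diameter (focal length / f-number)."""
    return (aperture_mm * focal_mm * abs(subject_mm - focus_mm)
            / (subject_mm * (focus_mm - focal_mm)))

# A 50mm f/1.4 lens focused at 1 m: the blur disk grows gradually with
# distance behind the subject, rather than dropping out of focus at once.
aperture = 50 / 1.4  # ~35.7 mm physical aperture
for subject in (1000, 1100, 1500, 3000):  # subject distance in mm
    print(subject, round(coc_diameter_mm(aperture, 50, 1000, subject), 3))
```

A software pipeline that blurs everything past a depth cutoff by the same amount skips this gradual ramp, which is one reason the fake bokeh can look "off" along edges.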
 
There is something weird going on with this year's iPhone cameras, and I think it's all software based. Sometimes the images look really flat; sometimes people's faces look too smooth. Something is wrong, and I really believe that software shouldn't be as involved in creating images as it is. I know Google has had great success with its algorithms, and Samsung delivers great, though not true-to-life, images through software; I just don't think Apple should go there, especially when they are not very good at it. So far I am not happy with this year's iPhone camera. Too artificial for my taste. If I wanted photos that look Instagram-filtered, as Samsung's do, I would have gotten a Galaxy Note 9.
 
RE: "Right now, editing the depth of a photo is limited to images that have already been captured, but starting in iOS 12.1, it will work in real time too."

OMG, are you saying they don't already have that???

There's at least one third-party app out there, possibly more, that's had Live Preview Depth Control for 6+ months!

More concrete evidence that AAPL does indeed have a Complete & Total Stranglehold on App Discovery!

It's truly surprising that AAPL has been able to hold onto its iOS App Store "monopoly" for as long as it has!

BTW, Portrait Mode is simply based upon the capture & processing of "Depth Data Maps" in addition to the normal RGB camera processing steps ... the resolution & capture rate of such maps is currently fairly limited with the internal hardware.
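To the point above about depth data maps: the basic idea is to blend each pixel between a sharp and a blurred copy of the image, weighted by how far that pixel's depth value is from the chosen focus depth. Here is a pure-Python sketch of that blend (not Apple's actual pipeline; the function names and the toy data are mine):

```python
def box_blur(img, radius):
    """Box-blur a 2D grayscale image (list of lists of floats)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def depth_composite(img, depth, focus_depth, max_radius=2):
    """Blend sharp and blurred copies per pixel: the farther a pixel's
    depth is from focus_depth, the more of the blurred copy it gets."""
    blurred = box_blur(img, max_radius)
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            t = min(abs(depth[y][x] - focus_depth), 1.0)  # 0 = in focus
            out[y][x] = (1 - t) * img[y][x] + t * blurred[y][x]
    return out
```

A "Depth Control" slider then just re-runs the composite with a different `max_radius` (or focus depth), which is why the effect can be non-destructive: the sharp image and the depth map are both kept, and only the blend changes. The feathery-edge failures people describe come from the depth map being too coarse to follow individual hairs.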
 
RE: "Right now, editing the depth of a photo is limited to images that have already been captured, but starting in iOS 12.1, it will work in real time too."

This makes a little more sense, then; I missed that. See my post above about how the live view has an awful interpretation of depth. This could still be something that at least the 8, 8 Plus, and X could easily adopt, so it still raises the question of why they hold out like that.
 
What are you supposed to do with pictures of your cat? Show them to people? Look at them yourself? You can just look at the actual cat. It looks even more like a cat in real life than it does in pictures.
 
I think The Verge did a review of this feature last week ... I do think though, in a few years the engineers will have a feature that really works. But that is not the current reality, it's really just a gimmick. Gotta start somewhere though!
That’s what I thought about Siri.
 
I think The Verge did a review of this feature last week and they also included a shot taken by a basic DSLR to compare what the software in the iPhone does compared to a real camera's bokeh ... It was so not close, I can't imagine anyone using this feature on a phone for serious photography.

If you're talking about the photo in this article, it's a sloppy comparison. Their iPhone photo isn't even in focus. I recommend looking at other sites for a better depiction of what the feature can do and what its limitations are.
 
Apple, I find it really hard to believe that you need to upgrade to an iPhone XS or XS Max to get this Depth Control feature, when I could do the same easily with an app called "Focos"...

This can be done mostly in software; it's not about the hardware.

Yeah, that's been the case with iPhone upgrades for the last several years now: mostly software-based things that can be done with apps or through jailbreak.
 