Sure… Colour me impressed.
*sigh* Not sure you realize iOS is doing all this stuff in real time, through camera shake and subjects moving and all. Of course it won't be perfect, especially on very fine strands of hair. Hair is by far THE hardest thing to mask out. Even if you spend time in Photoshop after the image is taken (so a 100% still subject), it often has a hard time with hair and won't catch each and every stray strand. Can Photoshop do it better? Yes, because it's working on an image already taken, not moving and not in real time.

As for PS being hard to select around hairs, no it's not at all. I do it all the time on landscape and bird images, replacing their backgrounds without losing any fine feather or tree detail. In fact, there are many methods for making selections around fine details; a quick YouTube search will show you a myriad of techniques. It really wouldn't be that hard for the Apple development teams to write an algorithm that replicates one or two of them.
It doesn't do it in real time: it takes the frame and then calculates the best application of the algorithm it can and applies it. Or them, actually, as you can almost immediately switch through them all to choose which one you want to go with.
If you watch the process in action, you will see the one second delay going on.
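For anyone curious why switching between the looks feels instant even though nothing is being recomputed, here's a minimal sketch of that capture-once, precompute-the-variants flow. This is purely illustrative, not Apple's actual pipeline; the file name and the specific effects are made up, with Pillow standing in for the real processing.

```python
# Toy sketch of "take the frame, calculate the effects once, then switch":
# all the work happens in one up-front pass, so flipping between looks
# afterwards is a dictionary lookup, not a new computation.
# NOTE: illustrative only -- not Apple's actual implementation.
from PIL import Image, ImageFilter, ImageOps

def precompute_variants(frame):
    """Apply each effect once to the captured frame."""
    return {
        "natural": frame,
        "soft_focus": frame.filter(ImageFilter.GaussianBlur(radius=8)),
        "mono": ImageOps.grayscale(frame),
        "high_contrast": ImageOps.autocontrast(frame, cutoff=2),
    }

frame = Image.open("captured_frame.jpg")  # hypothetical captured still
variants = precompute_variants(frame)     # the roughly one-second pause
variants["mono"].show()                   # switching afterwards is instant
```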
I'm well aware of methods in Photoshop, I don't need to YouTube it. It's not always perfect. Also, feathers are drastically easier to mask than very, very fine hair strands that have practically zero contrast against the background.
Is Photoshop doing it all in real time? When slight camera shake can happen? When the subject may not be perfectly still? No, it is not.
Yes it is doing it in real time. Select portrait, focus on the subject and you're able to view portrait mode live in real time on screen, even as you move the camera around, and it will continue to track the subject and blur the background. That is real time.

You get an approximation of the effect being applied. Take a screenshot of the preview and compare it to the actual image and you will see quite a few differences!
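If anyone wants to run that screenshot test themselves, here's a quick sketch with Pillow. The file names are placeholders, and the resize is needed because a screen capture and the saved photo won't share a resolution.

```python
# Diff the live-preview screenshot against the final saved photo.
from PIL import Image, ImageChops

preview = Image.open("preview_screenshot.png").convert("RGB")  # placeholder name
final = Image.open("final_photo.jpg").convert("RGB")           # placeholder name

preview = preview.resize(final.size)          # match sizes before comparing

diff = ImageChops.difference(final, preview)  # per-pixel absolute difference
print("identical" if diff.getbbox() is None else "differences found")
diff.save("diff.png")                         # bright pixels mark disagreement
```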
That's not true at all; the fine feather details of many of my bird shots are equally tricky to mask correctly. If you look at a feather in close-up, it has fine strands to it, just the same as hair, and I manage to mask around those fine strands with no issues using various masking techniques. I can achieve this whether the feathers are backlit, front lit, or lit from above or below, and quite often they have very little contrast with their backgrounds because of being backlit.
That's because you know where the feather is and which details are associated with the object. The iPhone needs to figure it out by AI and has to calculate it in the moment you are snapping the picture, in millions of different situations. Show me how you isolate feathers or hair that is lit from the back. There is transparency. If you managed to succeed (which I doubt, and btw: I am a graphics designer with 15 years of experience, working for high-profile brands), put it on a black backdrop and let's see the fantastic result. It will look ugly, no doubt about it.
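That black-backdrop check is easy to script, for what it's worth. A minimal Pillow sketch, assuming the cut-out has already been exported as a PNG with transparency (the file name is hypothetical):

```python
# Drop a cut-out onto solid black to judge the mask: haloes, fringing
# and lost translucency in backlit strands show up immediately.
from PIL import Image

cutout = Image.open("feather_cutout.png").convert("RGBA")  # hypothetical file
backdrop = Image.new("RGBA", cutout.size, (0, 0, 0, 255))  # solid black
result = Image.alpha_composite(backdrop, cutout)
result.convert("RGB").save("cutout_on_black.jpg")
```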
If you truly are as pro at Photoshop as you seem to be coming off, and have been around photography for long enough, you really would understand how much is happening in the blink of an eye to make portrait mode work.
It's very simple for backlit birds when you have a close-up of them. One technique is to simply select the colour ranges of the background, refining that as needed till you get a clear and accurate selection; I would have thought that a graphics designer with 15 years of experience would know this method and use it! Since Photoshop CC dropped, the refinements to its myriad selection processes have made amazing leaps forward in the accuracy of fine-detail masking.
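For the curious, the gist of that colour-range approach can be sketched outside Photoshop too. This is not what Photoshop runs internally, just the same idea in OpenCV: select the background's colour band, refine the selection, then invert it to get the subject. The file name and the HSV bounds are placeholder values you would tune per image.

```python
# Rough stand-in for a Color Range selection: key out the background
# by its colour band, refine, then invert to mask the subject.
import cv2
import numpy as np

img = cv2.imread("backlit_bird.jpg")               # placeholder file name
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# "Select the colour ranges of the background" -- a bright-sky band here.
lower = np.array([90, 0, 180])                     # placeholder HSV lower bound
upper = np.array([130, 80, 255])                   # placeholder HSV upper bound
background = cv2.inRange(hsv, lower, upper)

# "Refining that as needed": contract the selection slightly, then
# feather it, before inverting to get the subject mask.
background = cv2.erode(background, np.ones((3, 3), np.uint8))
background = cv2.GaussianBlur(background, (5, 5), 0)
subject_mask = cv2.bitwise_not(background)
cv2.imwrite("subject_mask.png", subject_mask)
```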
Getting back to the thread and its focus: I'm impressed with the efforts of the Apple teams to get this beta to where it is, but it's still a beta, obviously. I do reckon they will get better (well, I hope they do) at differentiating fine details of clothes, hair, etc.
Here's a couple of images of one of our mutts using Portrait Mode. The funny thing is that it has managed to mask correctly around a few sections of fine hair; it's just not consistent in recognising them all yet.
[Two attached Portrait Mode photos of the dog]
I know of this method and many others, thank you, Mr. Photoshop. But I was talking about something different. Please read my reply again.
For a phone camera, the pictures are far better than what I have seen in the past. As someone with zero skill at taking or judging photos, they look good to me. I would rather see these photos on my FB/Instagram.
The main takeaway is that anyone can take what I think are really solid photos. The average Joe is going to think the photos in this thread look way better than previous camera phone photos.
Because the photos do look better than previous camera phone photos. It's not just average Joes who can see that. Computational photography is the future of smartphone camera technology, and it's pretty much mandatory.
It's a pity it took Apple three years to come to the party with nothing much better than what the HTC One M8 offered with its UFocus feature back in 2014, which could selectively blur backgrounds with similar hit-and-miss results. Instagram had a better tool at that same time, because it allowed you to select the region you wanted in focus and how much blur to apply to the rest.
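That select-a-region, choose-the-blur interaction is simple enough to sketch. Here's a minimal Pillow version, assuming a rectangular focus region; the coordinates and radius are arbitrary example values.

```python
# UFocus/Instagram-style selective blur: the user picks the region to
# keep sharp and how strong the blur outside it should be.
from PIL import Image, ImageDraw, ImageFilter

def selective_blur(img, focus_box, blur_radius):
    """Blur everything outside focus_box, feathering the boundary."""
    blurred = img.filter(ImageFilter.GaussianBlur(blur_radius))
    mask = Image.new("L", img.size, 0)                          # black = blurred
    ImageDraw.Draw(mask).rectangle(focus_box, fill=255)         # white = sharp
    mask = mask.filter(ImageFilter.GaussianBlur(blur_radius))   # feather the edge
    return Image.composite(img, blurred, mask)

photo = Image.open("photo.jpg")                      # placeholder file name
result = selective_blur(photo, (400, 300, 900, 1100), blur_radius=12)
result.save("photo_selective_blur.jpg")
```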
It's not like this hasn't been done before; the only difference is that Apple has added some light-sculpting features too.
Nonetheless, the Portrait Mode is still just a beta, and hopefully it will improve in consistency over time.