As for Photoshop being hard to select around hair: no, it isn't at all. I do it all the time on landscape and bird images, replacing their backgrounds without losing any fine feather or tree detail. In fact, there are many methods for making selections around fine details, and a quick YouTube search will show you a myriad of different techniques. It really wouldn't be that hard for the Apple development teams to write an algorithm that replicates one or two of them.

I'm well aware of the methods in Photoshop; I don't need to YouTube it. It's not always perfect. Also, feathers are drastically easier to mask than very, very fine hair strands that have practically zero contrast against the background.

Is Photoshop doing it all in real time? When slight camera shake can happen? When the subject may not be perfectly still? No, it is not.
 
Not sure you realize iOS is doing all this stuff in real time. Through camera shake and moving subjects and all. Of course it won't be perfect, especially on very fine strands of hair. Hair is by far THE hardest thing to mask out. Even if you spend time in Photoshop after the image is taken (so a 100% still subject), it often has a hard time with hair and won't catch each and every stray strand. Can Photoshop do it better? Yes, because it's working on an image that's already taken, not moving, and not in real time.

The fine details of hair are the giveaway. Also, I believe Apple is focusing on people, not pets, as it optimizes the software behind this. I'm sure the software will get better in time. Everyday objects can be the subject with no visible artifacting.

Heck, in next year's S models it may even be possible to use these features when recording video!
 
Not sure you realize iOS is doing all this stuff in real time.
It doesn't do it in real time; it takes the frame, then calculates the best application of the algorithm it can and applies it, or rather them, as you can almost immediately switch through them all to choose which one you want to go with.

If you watch the process in action, you will see the one-second delay.
 
I'm not gonna lie: some of the pictures here truly look absolutely fantastic. Heck, if I didn't know they were taken with an iPhone 8 Plus, I'd definitely assume they came from a DSLR!

I haven't really put Portrait Lighting to good use. During the only tests I've done, the results were pretty awful, to be honest. But then I see the pictures in this topic... Gonna give it another try later today! Portrait Mode, on the other hand, has already given me some nice results. This is one I took of my cat. Posted this one in another topic by the way, so you might've seen it already. :p

fullsizeoutput_a21.jpeg


It's a pretty simple one, and nothing compared to some others that I have seen. But still a pretty nice result.

I think the one below also shows how well Portrait Mode works, even in some trickier situations. The leaves are incredibly detailed, and at the same time the background has just enough blur that you can still see the leaves there. Yet it's still unlike the 'Focus Pixels' feature that was added with the iPhone 6 (Plus); this looks just so much better in my opinion.

fullsizeoutput_9ff.jpeg


Really love Portrait Mode. Hopefully I can get portrait lighting to work as well.
 
I don’t think it’s that serious of an issue/feature to bicker over.

Someone with the eye and hand for photography will be able to produce stunning images with a half-baked beta software trick and a phone.

Others can be given a professional studio with supermodels, photoshop and a reincarnated Herb Ritts as a mentor and still produce garbage that only belongs on Snapchat for 3 seconds.
 
It doesn't do it in real time; it takes the frame, then calculates the best application of the algorithm it can and applies it, or rather them, as you can almost immediately switch through them all to choose which one you want to go with.

If you watch the process in action, you will see the one-second delay.

Yes, it is doing it in real time. Select Portrait, focus on the subject, and you're able to view Portrait Mode live on screen in real time; even as you move the camera around, it will continue to track the subject and blur the background. That is real time.
 
I'm well aware of the methods in Photoshop; I don't need to YouTube it. It's not always perfect. Also, feathers are drastically easier to mask than very, very fine hair strands that have practically zero contrast against the background.

Is Photoshop doing it all in real time? When slight camera shake can happen? When the subject may not be perfectly still? No, it is not.
That's not true at all. The fine feather details of many of my bird shots are just as tricky to mask correctly; if you look at a feather in close-up, it has fine strands to it, just the same as hair, and I manage to mask around those fine strands with no issues using various masking techniques. I can achieve this whether the feathers are lit from the front, the back, above or below, and they quite often have very little contrast against their backgrounds because they are backlit.
 
Not sure you realize iOS is doing all this stuff in real time. Through camera shake and moving subjects and all. Of course it won't be perfect, especially on very fine strands of hair. Hair is by far THE hardest thing to mask out. Even if you spend time in Photoshop after the image is taken (so a 100% still subject), it often has a hard time with hair and won't catch each and every stray strand. Can Photoshop do it better? Yes, because it's working on an image that's already taken, not moving, and not in real time.

Exactly.
 
Yes, it is doing it in real time. Select Portrait, focus on the subject, and you're able to view Portrait Mode live on screen in real time; even as you move the camera around, it will continue to track the subject and blur the background. That is real time.
You get an approximation of the effect being applied. Take a screenshot of the preview and compare it to the actual image, and you will see quite a few differences!
 
That's not true at all. The fine feather details of many of my bird shots are just as tricky to mask correctly; if you look at a feather in close-up, it has fine strands to it, just the same as hair, and I manage to mask around those fine strands with no issues using various masking techniques. I can achieve this whether the feathers are lit from the front, the back, above or below, and they quite often have very little contrast against their backgrounds because they are backlit.

That's because you know where the feather is and which details belong to the object. The iPhone needs to figure it out with AI and has to calculate it the moment you snap the picture, in millions of different situations. Show me how you isolate feathers or hair that is lit from behind. There is transparency. If you manage to succeed (which I doubt, and by the way, I am a graphics designer with 15 years of experience, working for high-profile brands), put it on a black backdrop and let's see the fantastic result. It will look ugly, no doubt about it.
 
That's because you know where the feather is and which details belong to the object. The iPhone needs to figure it out with AI and has to calculate it the moment you snap the picture, in millions of different situations. Show me how you isolate feathers or hair that is lit from behind. There is transparency. If you manage to succeed (which I doubt, and by the way, I am a graphics designer with 15 years of experience, working for high-profile brands), put it on a black backdrop and let's see the fantastic result. It will look ugly, no doubt about it.
It's very simple for backlit birds when you have a close-up of them. One technique is simply to select the colour ranges of the background, refining the selection as needed until it is clear and accurate; I would have thought a graphics designer with 15 years of experience would know this method and use it! Since Photoshop CC dropped, the refinements to the myriad of selection processes have made amazing leaps forward in the accuracy of fine-detail masking.
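
To illustrate roughly what that colour-range approach is doing outside of Photoshop, here's a minimal sketch of the same idea in Python with OpenCV. The file names and the HSV thresholds are placeholders I've made up, not values from any particular shot; you'd tune them per image, much like the fuzziness slider in Color Range.

```python
# Rough sketch of a colour-range background selection, assuming OpenCV.
# "backlit_bird.jpg" and the HSV thresholds are made-up placeholders.
import cv2
import numpy as np

img = cv2.imread("backlit_bird.jpg")
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Select the background by its colour range (e.g. a bright sky behind a
# backlit bird); everything inside the range becomes the background mask.
lower = np.array([90, 40, 120], dtype=np.uint8)
upper = np.array([130, 255, 255], dtype=np.uint8)
background = cv2.inRange(hsv, lower, upper)

# Invert to get the subject mask, then clean it up slightly -- roughly
# the "refine as needed" step mentioned above.
subject = cv2.bitwise_not(background)
kernel = np.ones((3, 3), np.uint8)
subject = cv2.morphologyEx(subject, cv2.MORPH_CLOSE, kernel)

# Drop the masked subject onto a plain black backdrop to judge the edges.
composite = np.where(subject[..., None] > 0, img, np.zeros_like(img))
cv2.imwrite("composite.jpg", composite)
```

The point isn't that this is how Photoshop or the iPhone actually does it, only that a colour-range selection is a simple, automatable idea when the background colour is reasonably uniform.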

Getting back to the thread and its focus: I'm impressed with the efforts of the Apple teams to get this beta to where it is, but it's still a beta, obviously. I do reckon it will get better, or at least I hope it does, at differentiating the fine details of clothes, hair, etc.

Here are a couple of images of one of our mutts using Portrait Mode. The funny thing is that it has managed to mask correctly around a few sections of fine hair; it's just not consistent in recognising them all yet.




IMG_0747.jpg


IMG_0749.jpg
 
That's not true at all. The fine feather details of many of my bird shots are just as tricky to mask correctly; if you look at a feather in close-up, it has fine strands to it, just the same as hair, and I manage to mask around those fine strands with no issues using various masking techniques. I can achieve this whether the feathers are lit from the front, the back, above or below, and they quite often have very little contrast against their backgrounds because they are backlit.

Even if your Photoshop skills are super pro, you're also using a DSLR that probably has at least twice the megapixels to work with (therefore more detail retained), a much larger sensor, and 100% guaranteed far superior optics (aka much better glass). All of which results in a huge difference in the detail Photoshop has to work with versus what the iPhone has to work with (and the iPhone has to do it on a live subject, in just fractions of a second). Photoshop has all the time it needs to process it, with manual brush-tool work to refine the edges.

If you truly are as pro at Photoshop as you seem to be coming off, and have been around photography long enough, you really would understand how much is happening in the blink of an eye to make Portrait Mode work.
 
Some pictures look better without the effect anyway. IMHO it's just distracting to have an unfocused market with two people in the background.

"what are they doing? what kind of market is this? i can't tell, wuaaaaaaaa" :p
 
If you truly are as pro at Photoshop as you seem to be coming off, and have been around photography long enough, you really would understand how much is happening in the blink of an eye to make Portrait Mode work.

I did comment on this immediately above your response, but I'll repeat it for you here:
Getting back to the thread and its focus: I'm impressed with the efforts of the Apple teams to get this beta to where it is, but it's still a beta, obviously. I do reckon it will get better, or at least I hope it does, at differentiating the fine details of clothes, hair, etc.

I then went on to show a couple of examples of its current inconsistencies when masking hair details, which it actually does well in several places in the images and then completely stuffs up in other places.
 
It's very simple for backlit birds when you have a close-up of them. One technique is simply to select the colour ranges of the background, refining the selection as needed until it is clear and accurate; I would have thought a graphics designer with 15 years of experience would know this method and use it! Since Photoshop CC dropped, the refinements to the myriad of selection processes have made amazing leaps forward in the accuracy of fine-detail masking.

Getting back to the thread and its focus: I'm impressed with the efforts of the Apple teams to get this beta to where it is, but it's still a beta, obviously. I do reckon it will get better, or at least I hope it does, at differentiating the fine details of clothes, hair, etc.

Here are a couple of images of one of our mutts using Portrait Mode. The funny thing is that it has managed to mask correctly around a few sections of fine hair; it's just not consistent in recognising them all yet.




View attachment 722856

View attachment 722857

I know of this method and many others, thank you, Mr. Photoshop. But I was talking about something different. Please read my reply again.

Show me an example of feathers or blonde hair backlit by extreme sunlight, put against a black backdrop in b/w, and let's see how great it looks, even when the selection is more accurate (but then you are a human and not a CPU doing algorithmic calculations; you work with a much better photograph at higher resolution, and you probably haven't isolated fine hairs perfectly in the same time it took me to snap a photo with stage-light FX). That's my point.
 
For a phone camera, the pictures are far better than what I have seen in the past. As someone with zero photo-taking or photo-judging skills, they look good to me. I would rather see these photos on my FB/Instagram.

The main takeaway is that anyone can take what I think are really solid photos. The average Joe is going to think the photos in this thread look way better than previous camera phone photos.
 
I know of this method and many others, thank you, Mr. Photoshop. But I was talking about something different. Please read my reply again.

He's also comparing a lot of manual finessing in Photoshop to "make it perfect" versus fully automated, on-the-fly processing.

It's easy for a human to spend several minutes on a cat whisker he knows is there and needs to be masked out perfectly. A computer won't know it's a whisker to begin with; all it sees is some very faint pixels barely a shade darker than the background.

If you literally just push the button in Photoshop and don't touch any of the refine-edge controls to manually "perfect" it, it won't do a much better job at all. And, as we've both pointed out, it's also not having to do it in the moment, on a live subject.
For a phone camera, the pictures are far better than what I have seen in the past. As someone with zero photo-taking or photo-judging skills, they look good to me. I would rather see these photos on my FB/Instagram.

The main takeaway is that anyone can take what I think are really solid photos. The average Joe is going to think the photos in this thread look way better than previous camera phone photos.

Because the photos do look better than previous camera phone photos. It doesn't take just average joes to realize that. Computational photography is the future of smartphone camera technology, and pretty much mandatory. For a current smartphone camera to naturally achieve the same look that an 85mm f/1.8 on a full-frame DSLR can (for example), the f-stop on the smartphone lens would quite literally have to be around f/0.2, which I doubt will be possible in a smartphone anytime soon; it can therefore only be done via software.
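
If anyone wants to sanity-check that f/0.2 figure, here's a quick back-of-the-envelope sketch in Python. The crop factor is my own rough assumption (small phone telephoto sensors sit somewhere around 7-9x full frame); the principle is just that matching the depth-of-field look of a full-frame lens means dividing its f-number by the crop factor.

```python
# Equivalent-aperture estimate: to match the background blur of a
# full-frame 85mm f/1.8, a small-sensor phone lens needs roughly the
# full-frame f-number divided by the crop factor.
full_frame_f_number = 1.8   # the 85mm f/1.8 example above
crop_factor = 8.0           # assumed crop factor for a phone telephoto sensor

equivalent_f_number = full_frame_f_number / crop_factor
print(f"Phone lens would need roughly f/{equivalent_f_number:.2f}")
# -> roughly f/0.22, i.e. in the f/0.2 ballpark mentioned above
```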

Some people are just picky and/or on their DSLR high horse because they picked up a Canon Rebel two years ago.
 
Because the photos do look better than previous camera phone photos. It doesn't take just average joes to realize that. Computational photography is the future of smartphone camera technology, and pretty much mandatory.
It's a pity it took Apple three years to come to the party, with nothing much better than what the HTC One M8 had with its UFocus feature, which launched back in 2014 and could selectively blur backgrounds, with similarly hit-and-miss results. Instagram had a better tool at the same time, because it let you select the region you wanted in focus and how much blur to apply to the rest.

It's not like this hasn't been done before; the only difference is that Apple added some light-sculpting features too.

Nonetheless, the Portrait Mode is still just a beta, and hopefully it will improve in consistency over time.
 
It's a pity it took Apple three years to come to the party, with nothing much better than what the HTC One M8 had with its UFocus feature, which launched back in 2014 and could selectively blur backgrounds, with similarly hit-and-miss results. Instagram had a better tool at the same time, because it let you select the region you wanted in focus and how much blur to apply to the rest.

It's not like this hasn't been done before; the only difference is that Apple added some light-sculpting features too.

Nonetheless, the Portrait Mode is still just a beta, and hopefully it will improve in consistency over time.

The HTC M8's UFocus doesn't even come close to Apple's Portrait Mode. You'd have to be blind to think they are close in hit/miss results.
8096954E-F20E-477F-95CD-97011D089B97.jpeg


Half the head is blurred for no reason (ears, most of the hair, cheeks, basically every single edge of his body and most of the hands, while the space between the arms and the body isn't blurred at all); it's so bad it's laughable to even compare it to Portrait Mode. Yes, it's the same "feature", but done so horribly badly, and this is literally just one of many equally bad examples a quick HTC M8 UFocus Google search turns up.

Sure, they may have done it first, but Apple is doing it better (as with most things Apple does: never first, but generally better). This happens time and time again: someone releases a feature and it flops due to poor implementation; some time later Apple comes up with a more elegant way that just works so much better and becomes mainstream, and then others follow suit to get on the bandwagon of a feature that originally flopped. A classic example is Touch ID.

It will no doubt improve over time. Even just within the last year, from its launch in iOS 10.1 to now in 11.1 (beta), it has improved a fair amount.
 
In good lighting conditions the 8 can produce some incredibly detailed images. This is my first time owning a Plus model with the portrait lens, and the results vary quite a bit. Sometimes the software is very accurate; other times it makes the edges of subjects look very messy or greasy, with smeared lines or blurred areas. I have mixed feelings about the studio lights: sometimes they work, but a lot of the time the result looks rather bland.
Overall though, the iPhone 8 (and a lot of the older generations of iPhone as well) can produce stunning images, as long as you have sufficient light. For quick snapshots the iPhone 8 is a great camera most of the time, and the in-camera software is truly amazing, yet sometimes it'll ruin an image with the software bokeh.
I'm really happy with the camera; it's amazing what such a small sensor combined with Apple's software can do.

One of the portrait modes. The results are okay most of the time, especially for social media, where you and others don't look at the images on a large monitor or actually print them.
q2PXTDl.jpg
 
Quick question... When I open my camera app, go to Portrait mode, and then select "Stage Light Mono", the viewfinder shows black and white, but when I take the picture it's in color. Am I doing something wrong? It should be black and white and pretty much black out the background, right?
 