
sean000 (Original poster)
So the Apple Event presentation showed off the new Depth Control feature, but it was unclear to me whether it's unique to the new iPhone models or a new feature of iOS 12.

I currently use an app on my iPhone X called Focos that allows you to adjust depth of field, the shape of the bokeh, and even change the point of focus. It works with any iPhone that has dual lenses, and the results are impressive. You can also use it to edit photos taken in portrait mode using the native camera app.

Focos actually does more than the new iOS feature, so I guess I'm just curious: if Depth Control is only available on the newer models, is it using an upgraded dot projector to more accurately manage depth transitions? Sometimes portrait mode on the X, as well as the Focos app, gets things slightly wrong. Focos has a mask mode that allows you to tidy things up a bit, but if the new models are simply more accurate when it comes to mapping depth, that would be an interesting advancement in the hardware.
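For anyone curious about the mechanics, here's a rough sketch of the general idea behind post-capture refocusing as I understand it (just an illustration, not how Focos or Apple actually implement it): every pixel gets a blur radius based on how far its depth/disparity value sits from the focal plane you tap, and the blur is then applied with a bokeh-shaped kernel.

[CODE=swift]
import Foundation

/// Toy model of post-capture refocusing (not Focos's or Apple's actual code).
/// Given a per-pixel disparity map, pick a focal plane and map every pixel to a
/// blur radius that grows the further its disparity sits from that plane.
/// A real app would then apply the blur on the GPU with a bokeh-shaped kernel.
func blurRadii(disparity: [Float],        // normalized disparity, one value per pixel
               focusDisparity: Float,     // the tapped "point of focus"
               aperture: Float,           // larger = shallower simulated depth of field
               maxRadius: Float) -> [Float] {
    return disparity.map { d in
        let defocus = abs(d - focusDisparity) * aperture   // distance from the focal plane
        return min(defocus * maxRadius, maxRadius)         // clamp to a sane maximum blur
    }
}

// Example: refocus on something at disparity 0.7 with a wide simulated aperture.
let radii = blurRadii(disparity: [0.1, 0.4, 0.7, 0.9],
                      focusDisparity: 0.7,
                      aperture: 2.0,
                      maxRadius: 24)
// radii is [24.0, 14.4, 0.0, 9.6]: pixels far from the focal plane get the heaviest blur.
[/CODE]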

Not sure if this belongs in the iOS 12 forum, or the iPhone forum.
 
If they treat it the way they treated Portrait Lighting on the 8+/X (where it was only available on those models), then it will only be available on the new phones.

The funny thing is that the iPhone 7 Plus has the capability to edit Portrait Lighting: if you take a picture on an iPhone 8+/X and then open it up on a 7 Plus, you get the Portrait Lighting editing options.
 
If they treat it the way they treated Portrait Lighting on the 8+/X (where it was only available on those models), then it will only be available on the new phones.

The funny thing is that the iPhone 7 Plus has the capability to edit Portrait Lighting: if you take a picture on an iPhone 8+/X and then open it up on a 7 Plus, you get the Portrait Lighting editing options.

That’s a good point about Portrait Lighting.
They'll probably lock it to this year's phones to encourage people to upgrade, but the ability to refocus after shooting portrait mode shots has been around for years. I use an app called Focos (App Store link) but I'm sure others exist, as well.

Focos is the app I mention in my first post. It’s amazing! It even does much more than the new feature Apple is touting.

I guess I’m just curious to see if it will be a new iOS 12 feature for all dual lens devices, or if Apple will enable it only for the latest iPhones.
 
Depth Control is also available on the XR, which has a single-lens camera. It's a pretty darn neat feature, and as I understood it during the keynote it relies on the capabilities of the new A12 Bionic (especially since it doesn't require a second lens!), so it's likely not available on models other than the new ones.
 
If you check the phone comparison page on Apple's website (https://www.apple.com/iphone/compare/), it's explicitly stated that only the new phones have advanced bokeh and Depth Control. It feels like a marketing decision, but it might also be a choice related to computing power and battery consumption. We don't know if the feature works like Focos or takes a different approach. For example, a lot of these apps use AI and machine learning to recognize that something in the foreground is a person and adjust settings accordingly. This kind of live analysis can be very intense on the CPU (the phone gets hotter), so if the system isn't powerful enough, everything gets slower and the experience is bad, something Apple doesn't like.
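To put some numbers on what "live analysis" means, here's a rough sketch of my own (not Apple's pipeline, and the face detector is just a stand-in for whatever segmentation model a camera app might run): a Vision request executed on every preview frame, where the whole thing has to fit into the roughly 33 ms you get per frame at 30 fps.

[CODE=swift]
import AVFoundation
import QuartzCore
import Vision

/// Illustration only: run a Vision request on every preview frame. At 30 fps the
/// whole analysis has to finish in roughly 33 ms, which is where older chips can
/// start to drop frames, heat up, and drain the battery.
final class FrameAnalyzer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let faceRequest = VNDetectFaceRectanglesRequest()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        let start = CACurrentMediaTime()
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([faceRequest])   // "is there a person in the foreground?"
        let elapsedMs = (CACurrentMediaTime() - start) * 1000

        // If this regularly blows the ~33 ms frame budget, the live preview stutters,
        // which is exactly the kind of experience Apple tends to avoid shipping.
        print("frame analysis: \(elapsedMs) ms, faces found: \(faceRequest.results?.count ?? 0)")
    }
}
[/CODE]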
 
If you check the phone comparison page on Apple's website (https://www.apple.com/iphone/compare/), it's explicitly stated that only the new phones have advanced bokeh and Depth Control. It feels like a marketing decision, but it might also be a choice related to computing power and battery consumption. We don't know if the feature works like Focos or takes a different approach. For example, a lot of these apps use AI and machine learning to recognize that something in the foreground is a person and adjust settings accordingly. This kind of live analysis can be very intense on the CPU (the phone gets hotter), so if the system isn't powerful enough, everything gets slower and the experience is bad, something Apple doesn't like.

Thanks for finding that. I'm leaning towards a marketing decision until I see evidence that the new models are better at isolating the main subject and determining true depth. I never saw any mention of a dot projector for the outward-facing camera, so it's not TrueDepth the way the selfie camera with its dot projector is. Maybe they figured out a better way to get around that with software and processing power, which might indeed be too resource intensive for even the A11.

Portrait mode on my iPhone X already does a pretty impressive job of isolating the main subject (as does the Focos app), but of course it doesn't always get things right. What impressed me about Focos is the way you can adjust the mask after shooting to be more precise. You can even adjust the depth of field distance for front and back bokeh, the shape of the bokeh, lighting, etc.

As a photographer who uses dedicated cameras with larger sensors and fast (wide aperture) lenses, I’m just fascinated by this stuff. I’m also impressed at how well it usually works. With the portrait photos I can often find flaws if I examine them closely, but overall (and under decent lighting conditions) it’s getting harder to distinguish my iPhone photos from my Olympus and Nikon photos from a normal viewing distance and small to medium print size.
 
New phones only.
It never showed up in the betas for older devices.
Looks like a marketing decision.
 
It never showed up in the betas for older devices.
Well, that might as well mean it's not available on older phones because they aren't powerful enough. We should also define at what point something becomes a "marketing" decision. Is nixing a feature from a platform a "marketing decision" if it's technically possible but too slow, makes the phone hot, and drains the battery? Or is it a design decision?
It's a tricky evaluation, as Apple has always valued the substance of the experience as a differentiating element of its products. Of course they need something shiny to sell the phones, but since pretty much all of the real-time evaluation for Depth Control and advanced bokeh is software based, I'm inclined to think that processing power plays a role.
 
So I've read a correction on Daring Fireball saying that one can edit the background bokeh with the advanced controls on the iPhone X; of course the picture must have been shot on an XS. That lends credence to the theory that the "up to 9 times faster" neural engine is necessary to build the depth model quickly enough in real time, while modifying the bokeh is not so complicated calculation-wise (and in fact the Focos app already does it very well). So the new phones have a new, better way of building the depth information, and once the model is embedded in the picture, the editing is available to everyone because it's not so taxing in terms of calculations.
 
So if you take a photo with an 8+ or X and transfer it to a 7, the stock Photos app on the 7 shows the option for Portrait Lighting?
And if so, is it only for that photo, or does it then work on other photos shot with the 7 itself?
It looks like it must be checking EXIF data, and if the photo was shot with an 8+ or X it enables Portrait Lighting, since the stock Photos app is the same for all models.
If they treat it the way they treated Portrait Lighting on the 8+/X (where it was only available on those models), then it will only be available on the new phones.

The funny thing is that the iPhone 7 Plus has the capability to edit Portrait Lighting: if you take a picture on an iPhone 8+/X and then open it up on a 7 Plus, you get the Portrait Lighting editing options.
 
It looks like it must be checking EXIF data, and if the photo was shot with an 8+ or X it enables Portrait Lighting, since the stock Photos app is the same for all models.
It's not simply the information that a picture was shot with a particular model: the picture file itself must also include the depth map that's needed to edit it. I'm not sure how the iPhone stores it (whether it's an extra bit of info attached to the JPG file, or some other trickery), but if it's not included in the file, the picture can't be edited.
At the same time, if a phone isn't capable of creating the depth map, it won't be possible to have the effect on pictures shot on that phone. So the key is that editing pictures that have some sort of depth map can be done on many devices, but creating that depth map (a simpler one on the iPhone X, a more complex one on the XS) can only be done on certain models.
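For what it's worth, this matches how the depth data is exposed to third-party developers: a Portrait photo carries the depth/disparity map as auxiliary data inside the JPEG/HEIC container, and any device can read it back with Image I/O, regardless of which camera created it. A rough sketch (the file path is just a placeholder):

[CODE=swift]
import AVFoundation
import ImageIO

/// Sketch: check whether a photo file carries an embedded depth map and load it.
/// Editing effects like Depth Control only need this stored map; creating it in
/// the first place is what requires the dual camera / newer silicon at capture time.
func loadDepthData(from url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }

    // Portrait photos store the depth map as an auxiliary image (disparity or depth).
    let auxTypes = [kCGImageAuxiliaryDataTypeDisparity, kCGImageAuxiliaryDataTypeDepth]
    for type in auxTypes {
        if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, type) as? [AnyHashable: Any],
           let depth = try? AVDepthData(fromDictionaryRepresentation: info) {
            return depth
        }
    }
    return nil   // no depth map embedded, so nothing for a Depth Control-style editor to work with
}

// Hypothetical usage: gate the editing UI on the presence of depth data,
// not on which iPhone model shot the picture.
let canEditDepth = loadDepthData(from: URL(fileURLWithPath: "/tmp/IMG_0001.HEIC")) != nil
[/CODE]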
 
I am willing to bet that the A11 Bionic in the iPhone X is plenty capable of doing Depth Control, seeing as how the A12 Bionic is only about 15% faster. Yes, I understand the Neural Engine is WAY faster, able to handle 5 trillion operations per second vs. the A11's 600 billion operations per second. However, how much of the Neural Engine plus the CPU do you think is required to handle Depth Control? I am going to guess and say not all 5 trillion operations per second and the full power of the A12 CPU.

I think this was a decision to get people (especially photography enthusiasts) to upgrade from an iPhone X, since many people felt this "S" cycle was not really that great, especially if you have an iPhone X.

I have upgraded every year since the original iPhone in 2007; however, because of this mediocre "S" cycle update, this is the first time I have not upgraded.

:apple:
 
I am going to guess and say not all 5 trillion operations per second and the full power of the A12 CPU.
I'm not that deep into the technical side, and honestly everyone can think what they want, since this is clearly a grey area. My point is that a lot of the work that has to be done to build the depth model and decide what's background and foreground, and whether something in the foreground is a person or a dog or something else, is very often a specifically AI task executed by the neural engine. So in a scenario where you must do these calculations in real time at the moment of shooting, having a stuttering preview because the neural engine is too slow makes for a bad user experience, something Apple doesn't want to sell. The difference in performance between the old and new chips justifies this in my view.
Of course this is just my opinion and, as you remark at the end of your comment, the best choice is always to vote with your wallet, as Steve Jobs himself was fond of saying.
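That split between creating the depth model at capture time and merely reading it back later is also visible in the public camera API: whether a device can deliver depth data (and, on iOS 12, the portrait effects matte) while shooting is a per-device capability flag. A hedged sketch with AVFoundation, just to illustrate the distinction (not Apple's internal pipeline):

[CODE=swift]
import AVFoundation

/// Sketch: generating depth (and the iOS 12 portrait effects matte) is gated on
/// what the capture hardware/silicon supports, because it has to happen live at shutter time.
func configureDepthCapture(on output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    // These flags are only true on devices whose camera and chip can build the
    // depth model in real time while you're framing the shot.
    output.isDepthDataDeliveryEnabled = output.isDepthDataDeliverySupported
    output.isPortraitEffectsMatteDeliveryEnabled = output.isPortraitEffectsMatteDeliverySupported

    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
    settings.isPortraitEffectsMatteDeliveryEnabled = output.isPortraitEffectsMatteDeliveryEnabled
    return settings
}
[/CODE]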
 
I am willing to bet that the A11 Bionic in the iPhone X is plenty capable of doing Depth Control, seeing as how the A12 Bionic is only about 15% faster. Yes, I understand the Neural Engine is WAY faster, able to handle 5 trillion operations per second vs. the A11's 600 billion operations per second. However, how much of the Neural Engine plus the CPU do you think is required to handle Depth Control? I am going to guess and say not all 5 trillion operations per second and the full power of the A12 CPU.

I think this was a decision to get people (especially photography enthusiasts) to upgrade from an iPhone X, since many people felt this "S" cycle was not really that great, especially if you have an iPhone X.

I have upgraded every year since the original iPhone in 2007; however, because of this mediocre "S" cycle update, this is the first time I have not upgraded.

:apple:

I agree. It's a shame, although not one I'm too worried about since the Focos app does this and more. As a photographer I'm actually much more tempted to upgrade to the XS because it looks like it handles high contrast situations much better than the X in terms of increasing dynamic range with more intelligent HDR. It makes the XS look like it's on par with the Pixel 2 XL's HDR+ technology.
 
I agree. It's a shame, although not one I'm too worried about since the Focos app does this and more. As a photographer I'm actually much more tempted to upgrade to the XS because it looks like it handles high contrast situations much better than the X in terms of increasing dynamic range with more intelligent HDR. It makes the XS look like it's on par with the Pixel 2 XL's HDR+ technology.
Focos is amazing! The A11 in the X is plenty powerful to handle the adjustable bokeh feature in the new phones. It's just a way for them to give the new phones features not available to older models so people upgrade. Just like giving the iPhone 7 additional haptics that the 6s didn't get, even though it easily could have been added through a software update.
 
Focos is amazing! The A11 in the X is plenty powerful to handle the adjustable bokeh feature in the new phones. It's just a way for them to give the new phones features not available to older models so people upgrade. Just like giving the iPhone 7 additional haptics that the 6s didn't get, even though it easily could have been added through a software update.
Handling the bokeh is not the complicated part! Building the depth model in real time is what requires a high-performing neural engine!
 
Handling the bokeh is not the complicated part! Building the depth model in real time is what requires a high-performing neural engine!
The fact that you can do the same thing, and even more, using a third-party app compared to what's available on the new XS says it's just a choice Apple made. The A11 is more than capable of handling the feature, considering there's only about a 15% improvement in processing power between the two. Hardly a huge leap in performance. I'll be getting the XS because I'm on the iPhone Upgrade Program, but I'm almost positive I'll hardly notice a difference coming from my X.
 
I am willing to bet that the A11 Bionic in the iPhone X is plenty capable of doing Depth Control, seeing as how the A12 Bionic is only about 15% faster. Yes, I understand the Neural Engine is WAY faster, able to handle 5 trillion operations per second vs. the A11's 600 billion operations per second. However, how much of the Neural Engine plus the CPU do you think is required to handle Depth Control? I am going to guess and say not all 5 trillion operations per second and the full power of the A12 CPU.

I think this was a decision to get people (especially photography enthusiasts) to upgrade from an iPhone X, since many people felt this "S" cycle was not really that great, especially if you have an iPhone X.

I have upgraded every year since the original iPhone in 2007; however, because of this mediocre "S" cycle update, this is the first time I have not upgraded.

:apple:
Agreed, and I'm in the same boat: this is the first time since 2008 (the original iPhone wasn't sold here in Japan; I started with the iPhone 3G in 2008) that I'm not upgrading. Nothing truly compelling this time. :(
 
For any of you who think Depth Control needs the A12 Bionic... Recently I AirDropped myself a portrait photo from an iPhone XS in a local Apple Store. I have an iPhone X, and when I edited the imported photo the Depth Control slider appeared and... worked just as it does on the XS. So not only have they limited this feature to their newest devices, it's also hidden and disabled in older devices' software. A dick move from Apple.
 
For example, a lot of these apps use AI and machine learning to recognize that something in the foreground is a person and adjust settings accordingly. This kind of live analysis can be very intense on the CPU (the phone gets hotter), so if the system isn't powerful enough, everything gets slower and the experience is bad, something Apple doesn't like.
Very true. Even I have noticed this. Apple is/was perfectly OK with the UI stuttering on the iPhone 4/iPad 2 on iOS 7 and 8, but they can't tolerate the minor delay a new feature like adjustable bokeh might cause on the one-year-old hardware of the iPhone X.
 
Very true. Even I have noticed this. Apple is/was perfectly OK with the UI stuttering on the iPhone 4/iPad 2 on iOS 7 and 8, but they can't tolerate the minor delay a new feature like adjustable bokeh might cause on the one-year-old hardware of the iPhone X.

I would be very surprised if there were any performance issues on the iPhone X. I use the Focos app on my iPhone X all the time, and it does depth control and more. Of course it's possible that Apple's software for accomplishing depth control is more resource intensive, but I doubt it. This seems all about providing one more marketable difference for the iPhone XS.
 
That’s a good point about Portrait Lighting.

Focos is the app I mention in my first post. It’s amazing! It even does much more than the new feature Apple is touting.

I guess I’m just curious to see if it will be a new iOS 12 feature for all dual lens devices, or if Apple will enable it only for the latest iPhones.

Apple's depth control is a toy compared to Focos.
 