Developers behind iOS camera app Halide say they plan to ship a version of the app that enables Portrait Mode on the iPhone XR for "all sorts of things," not just people (via Reddit). Because the iPhone XR has a single-lens rear camera (rather than the dual-lens setup on the iPhone XS), the cheaper smartphone doesn't capture as much depth information, and the Portrait Mode bokeh effect in Apple's own Camera app only works on people.

[Embedded tweet] "Depth on iPhone XR vs iPhone XS. The XR has 1/4 the depth data. (Ignore color differences.)" pic.twitter.com/WMkDtznY5o - Ben Sandofsky (@sandofsky), October 27, 2018
As Wired explained in its review, if you try to take a Portrait image of a pet or object, the Camera app displays "No person detected" at the top of the screen. Now, Halide says it has already gotten the iPhone XR camera to work with Portrait Mode on pets and inanimate objects, but results haven't been consistent, and it can be harder to create a depth effect around some subjects.
We think with some more tooling, we'll be able to ship a version of our app that enables portrait mode for all sorts of things. It seems it'll be a bit more 'temperamental'; in some settings it won't work if there's not enough variance in relative distance of objects, but a can of soda water on my desk worked just fine.
Still, as Halide's Reddit post explains, this means that third-party camera apps on the iOS App Store will be able to provide users with a form of Portrait Mode on iPhone XR that enables bokeh effects around more than just people. Halide mentions that the iPhone XR's depth map is "way lower resolution" than the one produced by the dual cameras on the iPhone XS, "but it seems usable."
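For the curious, here's a rough sketch of what "accessible to third-party apps" looks like in practice: a simplified Swift example using standard AVFoundation calls, not Halide's actual code; the class name DepthCaptureController is made up. Whether the XR's single camera actually returns a depth map for a given scene is exactly the limitation Halide describes.

```swift
import AVFoundation

// Minimal sketch (not Halide's actual code) of how a third-party app asks iOS
// for depth data alongside a photo. The class name is hypothetical; the API
// calls are standard AVFoundation.
final class DepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() {
        session.beginConfiguration()
        session.sessionPreset = .photo

        // The iPhone XR only has the single wide-angle rear camera.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input), session.canAddOutput(photoOutput) else {
            session.commitConfiguration()
            return
        }
        session.addInput(input)
        session.addOutput(photoOutput)

        // Depth delivery must be enabled on the output before it can be
        // requested per photo; support varies by device and camera.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
        session.commitConfiguration()
        session.startRunning()
    }

    func capture() {
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        // On the XR this depth map is much smaller than on the XS,
        // which is the "way lower resolution" caveat mentioned above.
        if let depth = photo.depthData {
            let map = depth.depthDataMap
            print("Depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
        }
    }
}
```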

The iPhone XR launched a few days ago, on October 26, to positive reviews that praised its LCD display, bright colors, and iPhone XS-level performance. Thanks to its lower price tag compared with the iPhone XS and XS Max, most outlets agreed that the iPhone XR is the phone most people will want to choose from the new 2018 lineup of iPhones.

Update 12:00 p.m. PT: Halide has submitted version 1.11 to the iOS App Store, which unlocks the ability to take photos with Portrait Mode effects on pets and objects using iPhone XR. Now that it's been submitted, Halide says the update will be out soon, once it passes App Store review. More information can be found in the company's blog post.

Article Link: Halide Developers Enable Portrait Mode on iPhone XR for Objects and Pets [Updated]
 
Since the single-lens Pixel can do portrait mode for things other than people, it's good to see iOS getting this option as well - thanks, Halide. Maybe next year Apple will figure out how these guys do this for non-"people" subjects as well.
 
It would be good if Apple provided an option to use portrait mode with the XS's wide-angle camera only. Reviewers have pointed out that while Apple upgraded the wide-angle sensor, it conveniently avoided mentioning the telephoto sensor, and as a result, XR portraits come out better than the XS's in low light. It would have been better if both sensors were upgraded at the same time... but that could be its marketing strategy, saving it for next year's upgrade.
 
Since the single-lens Pixel can do portrait mode for things other than people, it's good to see iOS getting this option as well - thanks, Halide. Maybe next year Apple will figure out how these guys do this for non-"people" subjects as well.

Sure, they will figure this out. This is just a software feature. However, they will make you buy next year's phone in order to enable it ;)
 
Since the single-lens Pixel can do portrait mode for things other than people, it's good to see iOS getting this option as well - thanks, Halide. Maybe next year Apple will figure out how these guys do this for non-"people" subjects as well.
I'm sure Apple can do portrait mode for most objects on the XR but decided to limit it, because... you know why.
 
I'm sure Apple can do portrait mode for most objects on the XR but decided to limit it, because... you know why.
Yes, we do. The algorithm is optimized for people, and using it on other subjects can yield unpredictable results that Apple does not want to be measured against.
If there were any marketing aspect to it, Apple would advertise this as a distinguishing feature between the models.
 
No pet portrait mode was the nail in the coffin for the XR for me.

The XS is CRAZY expensive, so I'm just sticking with my 7+ for now... still hopeful for a good holiday season trade-in promotion.
 
Wait, so in essence Apple is holding back a feature that the XR is capable of?

Are they purposely limiting their own software from being able to achieve portrait mode on objects and animals?

Did I miss something here?
 
Wait, so in essence Apple is holding back a feature that the XR is capable of?

Are they purposely limiting their own software from being able to achieve portrait mode on objects and animals?

Did I miss something here?

You didn't miss anything. It's the usual thing:
Apple touts the massive AI power of the hardware, which should be able to run object recognition and classification algorithms just fine. They could have implemented it on the XR, but chose not to in order to push people to buy the XS.
Nothing new to see here, move along people, move along...
 
<speculation>Since the developer documentation states that depth data must be available to get the Portrait Matte effect, I assume Apple uses a so-called dual-pixel camera (like the Pixel), so depth data is always available during capture. The Portrait Matte effect uses that depth and refines it with the help of specifically trained ML. Apple possibly didn't enable the standard portrait function because of the very low-quality depth map (imagine the social media fuss over poor-quality portraits on an iPhone).</speculation>
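For what it's worth, iOS 12 does expose that ML-refined person mask to apps as a separate object from the raw depth data. A rough illustration of the distinction (the helper name inspect is made up, and receiving the matte also requires enabling its delivery in the capture settings):

```swift
import AVFoundation

// The Portrait Effects Matte (iOS 12+) is the ML-refined person mask; it is
// delivered separately from the raw depth map and only exists when a person
// was detected. Function name is hypothetical.
func inspect(photo: AVCapturePhoto) {
    if let matte = photo.portraitEffectsMatte {
        let buffer = matte.mattingImage
        print("Person matte: \(CVPixelBufferGetWidth(buffer)) x \(CVPixelBufferGetHeight(buffer))")
    } else {
        print("No portrait matte - no person detected in the shot.")
    }
    if let depth = photo.depthData {
        // The raw, low-resolution depth map can be present even without a matte.
        let map = depth.depthDataMap
        print("Depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }
}
```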

It seems like a similar story to Portrait Lighting and Depth Control on the iPhone 7 Plus, 8 Plus, and X. In Chromatica we enabled both effects for the Photos app on all dual-camera iPhones (and not just the XS/Max). More apps will add support for general portrait mode on the XR once it becomes more available.
 
Yes, we do. The algorithm is optimized for people, and using it on other subjects can yield unpredictable results that Apple does not want to be measured against.
If there were any marketing aspect to it, Apple would advertise this as a distinguishing feature between the models.

That makes perfect sense. I am disappointed the XR lacks this, but if apps will help achieve it, I'm okay with it. Regardless, I went to the Apple store to check out the XR and it by no means looks like a "budget" iPhone to me -- and I was looking at my X next to it. I plan on saving some money and getting the XR in Dec/Jan using the Apple upgrade program.
 
These are very poor portrait mode shots, especially from devs promoting a premium camera app. Come on! Most amateurs can do a better job.
Wait, so in essence Apple is holding back a feature that the XR is capable of?

Are they purposely limiting their own software from being able to achieve portrait mode on objects and animals?

Did I miss something here?

Apple always tends to limit stuff even when it may be possible. Siri on the iPad 2 was a great example. They had an explanation at the time, but many people got it working through a jailbreak, using a file from a 4S.
 
It would be good if Apple provided an option to use portrait mode with the XS's wide-angle camera only. Reviewers have pointed out that while Apple upgraded the wide-angle sensor, it conveniently avoided mentioning the telephoto sensor, and as a result, XR portraits come out better than the XS's in low light. It would have been better if both sensors were upgraded at the same time... but that could be its marketing strategy, saving it for next year's upgrade.
Does this app allow that?
 
Does this app allow that?

At least not yet... but technically it's possible, since both the XR and XS use the same wide-angle camera sensor. It depends on whether Apple provides access to the wide-angle and telephoto cameras independently - I will have to check on that.
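For reference, an app can probe at runtime which rear cameras the OS exposes; a minimal sketch using the standard discovery API (the function name rearCameras is made up):

```swift
import AVFoundation

// Lists the rear cameras the OS exposes. On an XS this can include the
// wide-angle, telephoto, and combined dual camera; on an XR only the
// wide-angle camera is present. Function name is hypothetical.
func rearCameras() -> [AVCaptureDevice] {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera, .builtInTelephotoCamera, .builtInDualCamera],
        mediaType: .video,
        position: .back)
    return discovery.devices
}
```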
 
Apple is going to kill this update. I don't see it passing the review team. Too many business issues that can cause real problems for Apple, with the XS and XR having strategic features.
 
Do you mean accessing the wide-angle and telephoto cameras independently on the XS, or getting the depth map on the XS?
By the way, the Halide developers have a good blog explaining how the depth map feature works and its limitations. It's really nice to see developers taking the time for a technical write-up.
It mentions that the XR cannot live-stream the depth map data, so you can't get a live preview of the portrait effect while shooting; it's only applied after the photo is taken, and the same is true for the Pixel.
I'd like to know whether that's down to Apple's implementation or hardware limitations, or whether it's a general limitation, since even the Pixel can't do it - and is it possible in future versions? Does the Pixel 3 show a live preview of the portrait shot?


Yep, we're waiting to see if we can do that. So far it's not looking good.
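For anyone wanting to check this on their own device: one way an app can probe whether the OS even offers streaming depth on a given camera (a prerequisite for a live depth-based preview) is to look at the active format. This is just a generic check, not necessarily how Halide investigates it, and the function name supportsLiveDepth is made up:

```swift
import AVFoundation

// Checks whether a camera's current format advertises any streaming depth
// formats - a prerequisite for feeding AVCaptureDepthDataOutput and showing
// a live depth-based preview.
func supportsLiveDepth(position: AVCaptureDevice.Position = .back) -> Bool {
    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video, position: position) else {
        return false
    }
    return !camera.activeFormat.supportedDepthDataFormats.isEmpty
}
```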
 
Apple is going to kill this update. I don't see it passing the review team. Too many business issues that can cause real problems for Apple, with the XS and XR having strategic features.
Take off the tin foil hat. There is no reason for this to be blocked. Not even imagined “strategic features”.
Edit: It was released today.
 
Apple is going to kill this update. I don't see it passing the review team. Too many business issues that can cause real problems for Apple, with the XS and XR having strategic features.

Fortunately, we made it through! Went live last night.
 
We wrote a bit about how it works here. The basic idea is that the XR creates a small depth map with its focus pixels, and we use that to apply our own blur.
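To make that concrete, here's a rough sketch of one standard way to turn a small depth map into a background blur with Core Image. It illustrates the general idea only - it is not Halide's actual pipeline, and the function name depthBlur is made up:

```swift
import AVFoundation
import CoreImage

// Rough illustration: scale the small depth map up to the photo's size and use
// it as the mask for a masked variable blur, so more distant pixels get more
// blur. A real pipeline would normalize/convert the depth values first.
func depthBlur(image: CIImage, depthData: AVDepthData) -> CIImage? {
    let depthImage = CIImage(cvPixelBuffer: depthData.depthDataMap)
    let scaleX = image.extent.width / depthImage.extent.width
    let scaleY = image.extent.height / depthImage.extent.height
    let mask = depthImage.transformed(by: CGAffineTransform(scaleX: scaleX, y: scaleY))

    guard let blur = CIFilter(name: "CIMaskedVariableBlur") else { return nil }
    blur.setValue(image, forKey: kCIInputImageKey)
    blur.setValue(mask, forKey: "inputMask")          // brighter mask = more blur
    blur.setValue(12.0, forKey: kCIInputRadiusKey)
    return blur.outputImage?.cropped(to: image.extent)
}
```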
 