
MacRumors

macrumors bot
Original poster
Apr 12, 2001
52,211
13,852


Though it is a budget device with a single-lens camera, the iPhone SE supports Portrait Mode, enabled by the powerful A13 chip inside.

iphonesehandson.jpg

It is the first of Apple's smartphones to offer Portrait Mode photos created entirely with software techniques rather than hardware, which prompted the developers behind the popular iOS camera app Halide to take a deep dive into how it works.

The iPhone SE is equipped with the same camera sensor as the iPhone 8, based on a recent teardown by iFixit, but its camera can do more because it uses "Single Image Monocular Depth Estimation," i.e., generating Portrait Mode effects from a single 2D image.
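At its core, single image monocular depth estimation is image-to-image regression: one RGB frame goes in, one per-pixel depth map comes out. A minimal sketch of that interface — the gradient "model" below is an invented placeholder, not Apple's network:

```python
import numpy as np

def estimate_depth(rgb):
    """Toy stand-in for a monocular depth network. A real estimator
    (Apple's network, or open models like MiDaS) is a trained CNN that
    maps an HxWx3 image to an HxW depth map; here we fake a plausible
    output with a vertical gradient, since scenes tend to be farther
    away toward the top of the frame."""
    h, w, _ = rgb.shape
    return np.linspace(1.0, 0.0, h).reshape(h, 1).repeat(w, axis=1)

frame = np.zeros((4, 6, 3), dtype=np.uint8)  # tiny dummy "photo"
depth = estimate_depth(frame)
print(depth.shape)  # (4, 6): one estimated depth value per pixel
```

The point is the shape of the problem: nothing about the input requires two lenses or focus pixels, which is why a picture of a picture works at all.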

As Halide developer Ben Sandofsky points out, the iPhone XR is also a single-lens phone with Portrait Mode support, but the XR gets its depth information through hardware. That's not possible on the iPhone SE because its older camera sensor doesn't support the feature.

Halide discovered that, unlike other iPhones, the iPhone SE can take a picture of another picture and attempt to derive a depth map from it. The app was even able to photograph an old slide film, adding depth effects to a 50-year-old photo.

halideoldphoto.jpg
A picture of a picture and the resulting depth map from the iPhone SE​

The iPhone SE's Portrait Mode is somewhat limited in that it only works with people, a constraint of the neural network that powers the feature. When a Portrait Mode image without a person is captured, it fails in various ways because the network can't estimate an accurate depth map.
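The people-only behavior can be pictured as the effect being gated on a person segmentation mask. A minimal sketch — the mask source and depth values are invented for illustration, not Apple's pipeline:

```python
import numpy as np

def portrait_depth(image, person_mask):
    """Sketch of a people-only Portrait Mode: the depth network was
    trained on people, so without a person mask the effect refuses to
    engage - roughly the failure behavior Halide describes."""
    if person_mask is None or not person_mask.any():
        return None  # no person found: no reliable depth map
    return np.where(person_mask, 0.3, 1.0)  # crude: person near, rest far

img = np.zeros((2, 2, 3), dtype=np.uint8)
print(portrait_depth(img, np.zeros((2, 2), dtype=bool)))  # None
```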

The iPhone XR likewise limits Portrait Mode to people; using Portrait Mode with other objects requires upgrading to one of Apple's more expensive phones.

According to Halide, depth maps on the iPhone SE (or any phone with Portrait Mode) can be viewed by using the Halide app and then shooting in Depth mode. Halide's full breakdown of the iPhone SE's Portrait Mode can be read over on the Halide website.

Article Link: Halide Does Deep Dive Into iPhone SE's Software-Based Portrait Mode
 

farewelwilliams

macrumors 601
Jun 18, 2014
4,472
16,990
Theoretically, you should be able to extract the frames from a video, apply the Portrait Mode neural network to each frame, and stitch them back into a video.

It's highly likely iOS 14 will have Portrait Mode for video, and it would only be enhanced with LiDAR. This will take cinematography on a smartphone to a whole new level.
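That extract-process-stitch idea, sketched minimally (`apply_portrait` is a hypothetical placeholder for the real per-frame effect, which this sketch does not implement):

```python
import numpy as np

def apply_portrait(frame):
    """Placeholder for the real per-frame work (depth estimation plus
    background blur); here it just returns the frame unchanged."""
    return frame

def portrait_video(frames):
    """The idea above: split a clip into frames, run the effect on
    each, and stitch the results back together in order."""
    return [apply_portrait(f) for f in frames]

clip = [np.zeros((2, 2, 3), dtype=np.uint8) for _ in range(5)]
print(len(portrait_video(clip)))  # 5: one processed frame per input
```

One catch with naive per-frame processing is flicker: each frame's depth estimate is independent, so a shipping video mode would also need some temporal smoothing across frames.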
 

Emanuel Rodriguez

macrumors 6502
Oct 17, 2018
361
584
As far as I know, the reason this is supported is the A13's improved ISP. I believe you misunderstood what "because of the improved processor" means: the ISP (image signal processor) is hardware, and it's what allowed Apple to bring these features to the phone.
 

NickName99

macrumors 6502a
Nov 8, 2018
946
2,752
This is wildly impressive.

Agreed - this is far more impressive than most people realize. The Pixel phones and the iPhone XR achieve this with a single lens by leveraging focus pixels to generate a depth map from available 3D information. The SE doesn’t have enough focus pixels for that, so it’s relying purely on machine learning, building a depth map from 2D information.

I’m surprised it was more cost-effective to do it this way rather than just make the changes necessary to fit the XR’s rear camera module. That really speaks to the expense of retooling manufacturing.
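The hardware route described above reduces to the classic stereo relation depth = f·B/d, with the focus (dual) pixels supplying a tiny baseline B. A worked sketch with made-up numbers, not actual iPhone specs:

```python
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Stereo relation: depth = f * B / d. Focus pixels act like a
    stereo pair with a tiny baseline, so phones with enough of them
    can measure depth instead of guessing it from a single 2D image."""
    return focal_px * baseline_mm / disparity_px

# Illustrative numbers only: 2800 px focal length, 1 mm baseline,
# 2 px measured disparity.
print(depth_from_disparity(2800, 1.0, 2.0))  # 1400.0 mm, i.e. 1.4 m
```

The tiny baseline is the catch: with B of a millimeter or less, disparity shrinks below the noise floor for distant subjects, which is why focus-pixel depth only works when there are enough of these pixel pairs to average over.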
 

DaveP

macrumors 6502
Mar 18, 2005
455
272
She's probably the author's (Halide's) grandmother. Take a break, you'll be old too, sooner than you can imagine.

I had assumed the criticism was because the image is low resolution and blurry, making it a poor test image.
 

kiensoy

macrumors regular
Feb 6, 2008
165
289
Theoretically, you should be able to extract the frames from a video, apply the Portrait Mode neural network to each frame, and stitch them back into a video.

It's highly likely iOS 14 will have Portrait Mode for video, and it would only be enhanced with LiDAR. This will take cinematography on a smartphone to a whole new level.

You mean iPhone 11S/12... no way they would make this available in iOS 14 for the iPhone 11.
 

Unami

macrumors 6502a
Jul 27, 2010
787
563
Austria
It‘s probably even more impressive on a low-quality pic.
As far as depth recognition goes, the image is a fail, because the fence on the right should be even brighter than the person - but it‘s impressive object/person detection.
 

alpi123

macrumors 68000
Jun 18, 2014
1,540
1,754
Theoretically, you should be able to extract the frames from a video, apply the Portrait Mode neural network to each frame, and stitch them back into a video.

It's highly likely iOS 14 will have Portrait Mode for video, and it would only be enhanced with LiDAR. This will take cinematography on a smartphone to a whole new level.
Portrait Mode on a video sounds like a bad idea... unless you're shooting very simple objects, not humans, pets or anything like that. I'd say Apple would rather strive for a bigger physical sensor for a shallower depth of field, plus more optical magnification.

Even with LiDAR - remember, all these lenses have some space between them, so even if you align their images they can't perfectly overlap, which sometimes causes small glitches.
 

JosephAW

macrumors 68040
May 14, 2012
3,732
4,437
I'm surprised, given the power of the A13, that there's no option to replace the background with another image in real time.

Also, you'd think that with a single lens the software would encourage a little camera shake before taking the photo, so the chip could build a 3D map of the closest object and help refine the edges for a better portrait photo.
 

jntdroid

macrumors 6502
Oct 12, 2011
411
489
I've had an 11 Pro and my wife has an 11, and the photos I'm capturing on the SE, short of the lack of Night Mode, are on par with both of those devices. Portrait Mode has already given me some incredible shots of my kids in good light... and not incredible because of the fake bokeh; incredible because of how well it captured their faces.
 

PortoMavericks

macrumors regular
Jun 23, 2016
248
291
Gotham City
Theoretically, you should be able to extract the frames from a video, apply the Portrait Mode neural network to each frame, and stitch them back into a video.

It's highly likely iOS 14 will have Portrait Mode for video, and it would only be enhanced with LiDAR. This will take cinematography on a smartphone to a whole new level.
You mean iPhone 11S/12... no way they would make this available in iOS 14 for the iPhone 11.

I think it could be achieved on the iPhone 11 by combining the feed from one sensor and applying it to the final composition.

On previous models, the chip couldn’t handle multiple video feeds, and the write speed probably wasn’t adequate. Also, there isn’t an official API for that, so it’d be really hard to build a depth-of-field model from scratch with ML.

Assuming there was enough CPU power left to apply that, of course.
 

farewelwilliams

macrumors 601
Jun 18, 2014
4,472
16,990
You mean iPhone 11S/12... no way they would make this available in iOS 14 for the iPhone 11.
Depends if the A13 can handle the task in a reasonable amount of time. It doesn't make sense if you record a 5-minute video clip and then have to wait 10 minutes for the A13 to process it.
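The back-of-the-envelope version of that concern, with a hypothetical per-frame cost (the 67 ms figure is invented, not a benchmark):

```python
def processing_time_s(clip_s, fps, per_frame_ms):
    """Total offline processing time for a clip: one network pass per
    frame, so cost scales linearly with frame count."""
    return clip_s * fps * per_frame_ms / 1000.0

# 5-minute clip at 30 fps, assuming (hypothetically) 67 ms per frame:
t = processing_time_s(300, 30, 67)
print(t)  # 603.0 seconds - roughly 10 minutes for a 5-minute clip
```

To process in real time at 30 fps the network would need to stay under about 33 ms per frame; anything slower means the wait grows with the clip.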
Portrait Mode on a video sounds like a bad idea... unless you're shooting very simple objects, not humans, pets or anything like that. I'd say Apple would rather strive for a bigger physical sensor for a shallower depth of field, plus more optical magnification.

Even with LiDAR - remember, all these lenses have some space between them, so even if you align their images they can't perfectly overlap, which sometimes causes small glitches.

I'm not sure what you're saying here. LiDAR alone creates a detailed depth mask without camera vision, and Apple already merges LiDAR with camera vision on the iPad Pro (via ARKit).
 

Cosmosent

macrumors 68000
Apr 20, 2016
1,756
1,900
La Jolla, CA
NOT Rocket Science, a quick review of the Tech Specs makes it clear:

The 2nd-Gen SE's Back Camera has the exact same 12 MP image sensor as the XR!
 

Mike82

macrumors regular
Aug 16, 2013
131
166
NOT Rocket Science, a quick review of the Tech Specs makes it clear:

The 2nd-Gen SE's Back Camera has the exact same 12 MP image sensor as the XR!

Except it doesn't. 'The iPhone SE is equipped with the same camera sensor as the iPhone 8, based on a recent teardown by iFixit, but its camera can do more because it uses "Single Image Monocular Depth Estimation," i.e., generating Portrait Mode effects from a single 2D image.'
 