
MacRumors

macrumors bot
Original poster
Apr 12, 2001
52,552
14,247



Apple is developing 3D depth sensing technology for the rear-facing cameras in its 2019 iPhones, according to a new report by Bloomberg on Tuesday. The 3D sensor system will be different from the one found in the iPhone X's front-facing camera, and is said to be the next big step in turning the smartphone into a leading augmented reality device.

[Image: iPhone X camera design]
Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, the people said. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user's face and measures the distortion to generate an accurate 3D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a three-dimensional picture of the environment.
The existing TrueDepth camera would continue to be used in the front-facing camera of future iPhones to power Face ID, while the new system would bring the more advanced "time-of-flight" 3D sensing capability to the rear camera, according to the sources cited. Discussions with manufacturers, including Infineon, Sony, STMicroelectronics, and Panasonic, are reportedly already underway. Testing is said to still be in the early stages, and the technology could end up not being used in the phones at all.
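The time-of-flight approach described above boils down to simple arithmetic: distance is the speed of light times half the round-trip travel time of the laser pulse. A minimal Python sketch of that relationship (purely illustrative; the function name is made up and this is not Apple's implementation):

```python
# Speed of light in a vacuum, in metres per second.
C = 299_792_458.0

def distance_from_round_trip(seconds: float) -> float:
    """Distance to an object, given the round-trip time of a light pulse.

    The pulse travels out and back, so the one-way distance is half
    the total path covered during `seconds`.
    """
    return C * seconds / 2.0

# A pulse returning from an object 1 m away takes about 6.67 nanoseconds.
round_trip = 2 * 1.0 / C
print(distance_from_round_trip(round_trip))  # roughly 1.0 m (up to rounding)
```

A real sensor does this per pixel across the whole scene, which is what turns the measurement into a depth map.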

With the release of iOS 11, Apple introduced the ARKit software framework that allows iPhone developers to build augmented reality experiences into their apps. The addition of a rear-facing 3D sensor could theoretically increase the ability for virtual objects to interact with environments and enhance the illusion of solidity.

Apple was reportedly beset with production problems when making the sensor in the iPhone X's front-facing camera, because the components used in the sensor array have to be assembled with a very high degree of accuracy. According to Bloomberg, while the time-of-flight technology uses a more advanced image sensor than the existing one in the iPhone X, it does not require the same level of precision during assembly. That fact alone could make a rear-facing 3D sensor easier to produce at high volume.

Late last month, oft-reliable KGI Securities analyst Ming-Chi Kuo claimed that Apple is unlikely to expand its front-facing 3D sensing system to the rear-facing camera module on iPhones released in 2018. Kuo said the iPhone X's 3D sensing capabilities are already at least one year ahead of Android smartphones, therefore he believes Apple's focus with next year's iPhone models will be ensuring an on-time launch with adequate supply.

Article Link: Apple Reportedly Working on 3D Sensor System for Rear Camera in 2019 iPhones
 
  • Like
Reactions: Avieshek

meaning-matters

macrumors 6502a
Dec 13, 2013
508
2,087
This is very hard because you need to measure time-of-flight differences. If they do manage to pull this off, it would be truly amazing.

Example: for a depth resolution of 1 cm you'd need to resolve round-trip time differences of about 66.7 picoseconds (i.e. 0.0000000000667 s, or 66.7 trillionths of a second), since the light travels out to the object and back. And 1 cm would be far too coarse for facial recognition; they'd need a fraction of that.

It's been done, but in high-tech systems taking up more space than a whole iPhone.
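The timing requirement can be checked directly: a depth step of d lengthens the light's out-and-back path by 2d, so the sensor must resolve time differences of 2d/c. A quick back-of-the-envelope check in Python (nothing here is specific to Apple's hardware):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def required_timing_resolution(depth_resolution_m: float) -> float:
    """Round-trip time difference corresponding to a given depth step.

    A depth change of d adds 2*d to the out-and-back path, so the
    detector must distinguish time differences of 2*d / c.
    """
    return 2.0 * depth_resolution_m / C

dt = required_timing_resolution(0.01)  # 1 cm depth resolution
print(f"{dt * 1e12:.1f} ps")           # -> 66.7 ps
```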
 

bmot

macrumors regular
Feb 25, 2016
140
104
I'm surprised how long this whole AR topic is taking to take shape.
I remember I had a map app on my 3GS back then that would project map info onto my surroundings on screen.
 

iPhysicist

macrumors 65816
Nov 9, 2009
1,335
978
Dresden
If they do this right, it will be bigger than the iPhone and everything else they did before. But it will undoubtedly be without any projections; it should be something with 3D camera depth sensing.

Pure awesomeness for the geek and everyone else
 

GrumpyMom

macrumors G3
Sep 11, 2014
9,513
13,650
Are these laser dots like the lasers used in some autofocus camera systems (only multiplied many times over)?

It would be cool if the new tech meant better focusing, too. No more blurry cat photos. They’d be fuzzy only because the cat is fuzzy. :)
 

JohnApples

macrumors 68000
Mar 7, 2014
1,569
2,499
Yikes, next year’s iPhone is sounding like a beta test. There’s bound to be problems in a gen 1 rear 3D camera system.

Good luck guinea pigs! I’ll be buying the iPhone after next as all the kinks in the rear camera will have been ironed out.

Edit: I thought I was obviously tongue-in-cheek but I guess not. So here’s the tag: /s

These types of comments are so common here that people just assume you’re being serious. Lol.
 

robinp

macrumors 6502a
Feb 1, 2008
622
1,186
This is very hard because you need to measure time-of-flight differences. If they do manage to pull this off, it would be truly amazing.

Example: for a depth resolution of 1 cm you'd need to resolve round-trip time differences of about 66.7 picoseconds (i.e. 0.0000000000667 s, or 66.7 trillionths of a second), since the light travels out to the object and back. And 1 cm would be far too coarse for facial recognition; they'd need a fraction of that.

It's been done, but in high-tech systems taking up more space than a whole iPhone.

But it doesn't need to work with Face ID if it's on the rear camera. It would be about understanding 3D space in a general sense: an upgrade over the dual-camera setup used to produce portrait mode 'bokeh'.
 
  • Like
Reactions: PeLaNo and doelcm82

KONVICTED

Suspended
Oct 16, 2017
236
222
Cue the comments from people about how they’re glad they didn’t buy the current X and waited
Everyone has different preferences. I own an iPhone X and like it. I upgrade annually and will definitely check out the Plus-sized iPhone X coming next year.
 
  • Like
Reactions: Radeon85

futbalguy

macrumors 6502
May 16, 2007
277
58
This is very hard because you need to measure time-of-flight differences. If they do manage to pull this off, it would be truly amazing.

Example: for a depth resolution of 1 cm you'd need to resolve round-trip time differences of about 66.7 picoseconds (i.e. 0.0000000000667 s, or 66.7 trillionths of a second), since the light travels out to the object and back. And 1 cm would be far too coarse for facial recognition; they'd need a fraction of that.

It's been done, but in high-tech systems taking up more space than a whole iPhone.

It sounds like this is for augmented reality rather than face scanning, so they wouldn't need that level of accuracy. That also explains why they would use different technologies on the front and back cameras. Different use cases: short-range, high accuracy vs. long-range, lower accuracy (relatively speaking).
 

dan9700

Suspended
May 28, 2015
3,347
4,823
In 2030 there'll be another new camera. Leave it out, enjoy the X, and stop talking about years to come. Stupid rumours.
 

WatchFromAfar

Suspended
Jan 26, 2017
1,588
1,585
But it doesn't need to work with Face ID if it's on the rear camera. It would be about understanding 3D space in a general sense: an upgrade over the dual-camera setup used to produce portrait mode 'bokeh'.
How about if that's the only reason for it? Like being able to recognize your friends and family?
 

GrumpyMom

macrumors G3
Sep 11, 2014
9,513
13,650
Yikes, next year’s iPhone is sounding like a beta test. There’s bound to be problems in a gen 1 rear 3D camera system.

Good luck guinea pigs! I’ll be buying the iPhone after next as all the kinks in the rear camera will have been ironed out.
But what if you're eaten by the Kraken the year before? As you slide down its gullet or whatever Krakens have, your last words will be "I should have gotten the 2019 iPhone with the awesome state-of-the-art rear sensor...ack...gurghhhhhh." So tragic. :(
 

unibility

macrumors 6502a
Apr 6, 2012
602
575
This is very hard because you need to measure time-of-flight differences. If they do manage to pull this off, it would be truly amazing.

Example: for a depth resolution of 1 cm you'd need to resolve round-trip time differences of about 66.7 picoseconds (i.e. 0.0000000000667 s, or 66.7 trillionths of a second), since the light travels out to the object and back. And 1 cm would be far too coarse for facial recognition; they'd need a fraction of that.

It's been done, but in high-tech systems taking up more space than a whole iPhone.

I’m not sure what I just read but I believe. :)
 
  • Like
Reactions: DeepIn2U

Enygmatic

macrumors 6502a
Jan 27, 2015
860
855
Various
*chuckles* Yeah, let's hurry and get these "2019 iPhone" rumors out there now, so that Samsung has something to rush to market in their 2018 flagships. Then, by the time Apple presents at the Keynote, everyone's "bored".

Rinse and repeat every year. Ming and Bloomberg, like clockwork.
Yikes, next year’s iPhone is sounding like a beta test. There’s bound to be problems in a gen 1 rear 3D camera system.

Good luck guinea pigs! I’ll be buying the iPhone after next as all the kinks in the rear camera will have been ironed out.
Next year is 2018... the article is about 2019.
 

WannaGoMac

macrumors 68030
Feb 11, 2007
2,541
3,799
Cue the comments from people about how they’re glad they didn’t buy the current X and waited

Or people who don't care about having the latest thing, don't have the time or energy to swap phones every year, have learned never to buy first-gen Apple stuff with its inevitable bugs (iPhone 6 Plus touch disease, etc.), and prefer something that's stable and polished.

But to each their own...
 

slicecom

macrumors 68020
Aug 29, 2003
2,061
92
Toronto, Canada
So glad I decided to skip the 2018 model.

/s

Every time a 2018 model rumour is posted, people say they're so glad they skipped the X, so this is the next logical step.
 
Apple was reportedly beset with production problems when making the sensor in the iPhone X's front-facing camera, because the components used in the sensor array have to be assembled with a very high degree of accuracy. According to Bloomberg, while the time-of-flight technology uses a more advanced image sensor than the existing one in the iPhone X, it does not require the same level of precision during assembly. That fact alone could make a rear-facing 3D sensor easier to produce at high volume.

Somehow, I suspect this or something else will be in "short supply" or have "production challenges" leading into the launch. The show is always the same... and it always seems to work (so why not?). If not this, something else will cast a lot of doubt on being able to get the next iPhone at launch, and once again create the frenzy of waking up in the wee hours to try to get one of the precious few that will be available before they are all snatched away.

Furthermore, I predict the next, next iPhone will have the exact same short-supply rumours flying hot & heavy going into its launch.

And then the next, next, next iPhone too.

If it ain't broke?
 

smacrumon

macrumors 68030
Jan 15, 2016
2,683
4,010
How difficult could a 3D sensor be for a multi-billion-dollar company to work out?
 