The most recent beta of iOS 13 was released yesterday, and it brought an interesting new "FaceTime Attention Correction" feature that changes the way that FaceTime works.

FaceTime Attention Correction, when enabled, adjusts the appearance of your eyes so that it looks like you're making eye contact with the person you're FaceTiming even when you're looking at the iPhone's screen rather than at the camera itself. It's a little difficult to explain, so we've made a hands-on video to demo how it works.


When you're using FaceTime, you naturally want to look at the display to see the person you're talking to rather than at the camera, which makes it look like you're not maintaining eye contact.

As can be seen in the video, iOS 13 corrects this so that when you're looking at the iPhone's screen, your gaze appears to be directed at the camera, allowing eye contact to be maintained while still letting you watch the friend or family member you're FaceTiming with.

In iOS 12 and with FaceTime Attention Correction disabled, FaceTime looks like it always does - with no direct eye contact.

FaceTime Attention Correction appears to use an ARKit depth map captured through the front-facing TrueDepth camera to adjust where your eyes are looking for a more personal and natural connection with the person that you're talking to.
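Apple hasn't published any details on how the feature is implemented, but ARKit already exposes the kind of per-eye gaze data a correction like this could build on. As a rough illustration (our own sketch, not Apple's code), here's how an app can read the estimated gaze point and eye transforms from the TrueDepth camera in Swift:

import ARKit

// Sketch only: reads the gaze data ARKit provides on TrueDepth devices.
// A correction pass could then warp the eye regions so the gaze vector
// appears to point at the camera rather than the screen.
class GazeTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth-equipped device.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // lookAtPoint estimates where the eyes are focused,
            // expressed in the face anchor's coordinate space.
            let gaze = faceAnchor.lookAtPoint
            // Per-eye transforms give each eyeball's position and orientation.
            let leftEye = faceAnchor.leftEyeTransform.columns.3
            let rightEye = faceAnchor.rightEyeTransform.columns.3
            print("gaze: \(gaze), left eye: \(leftEye), right eye: \(rightEye)")
        }
    }
}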

Twitter users have discovered the slight eye warping that Apple is using to enable the feature, which can be seen when an object like the arm of a pair of glasses is placed over the eyes.

You can access FaceTime Attention Correction on iPhone XS, iPhone XS Max, iPhone XR, and 2018 iPad Pro models running the third developer beta of iOS 13. It's a setting that's available in the FaceTime section of the Settings app.

Article Link: Testing the New FaceTime Attention Correction Feature in iOS 13
 
where will this end? Removal of blemishes? Removal of wrinkles? No more receding hairline? How about looking 20lbs lighter by getting rid of chubby cheeks and double chins?
Why not just let a freaking animoji do the talking while you're taking a shower?
All it's doing is making it more like talking in real life. With it off, it always looks like you're staring at the person's stomach.
Edit: Blemish removal would be less like real life. So it probably stops before anything you mentioned, though that's all available in third-party apps with filters.
 
I agree this is quite creepy. My subconscious mind goes to great lengths to avoid direct eye contact in real life. Now, with the computer altering my appearance in ways I don't even realize, it's going to make people think either that I like them more than I do, or that I am nowhere near as afraid of them as I am. What if it starts creeping them out as to how often I'm looking straight at them? I just don't know if I can deal with this kind of intimacy getting forced on me.
 
This is a solution in search of a problem. Now fake eyes will be staring at you all the time. Is that really better than someone just looking below the camera? You'd rather have a computer simulate someone looking at you? Sad, sad times.
Calm down. This feature is helpful because I'm tired of looking at the camera in my interviews and not focusing on the person. It looks stupid when you're looking down the entire time.
 
Calm down. This feature is helpful because I'm tired of looking at the camera in my interviews and not focusing on the person. It looks stupid when you're looking down the entire time.
Pretty certain I was calm. It doesn't "look stupid". It looks natural. We've been using FaceTime and video calls for so long that it's normal. Although it seems like a small issue, it's just very sad that we have technology that wants to alter things to make them "perfect" (like the cameras in the Xs et al). I dread to think where this will lead.
 
What’s the official excuse for making it exclusive to iPhone XS? iPhone XR had the same front camera and array of sensors and the same CPU. Yet another marketing ploy to artificially limit some products to make others look better. I remember they once even restricted background pictures to newer iPhones…
 
where will this end? Removal of blemishes? Removal of wrinkles?

That's already a feature in the corporate videoconferencing solution we use, "Touch Up My Appearance".

Besides, people have been doing this for literally decades. News anchors and guests on TV? They have makeup on. That's how people expect others to look on screen.

And this effect is exactly why they invented the TelePrompTer. They project the words on a mirror in front of the camera lens.

Mr. Rogers looking at you? Nope, he's reading off the screen.
 
What’s the official excuse for making it exclusive to iPhone XS? iPhone XR had the same front camera and array of sensors and the same CPU. Yet another marketing ploy to artificially limit some products to make others look better. I remember they once even restricted background pictures to newer iPhones…

I've updated the article -- while I'd read several reports that it wasn't on the XR, a kind MR reader said that it is. Same with iPad Pro. Sorry for the confusion!
 