Exactly - I get what the feature is supposed to do, but those images look totally random. I think Mr Hardwick needs to go back to his source...
I spend most FaceTime calls looking at myself in the corner lol
Sorry but this is a horrible comparison picture. I can't even tell what it's supposedly doing. My guess based on the description is that it "fixes your eyes" so to speak but the entirely different image between the two shots isn't very useful.
Look at the eyes, not around the eyes.... (Sorry for the UK-centric comedy reference)
The left-hand picture shows what the person was actually doing, namely looking at the screen, so there is no 'eye contact'.
In the right-hand picture, the user is still looking at the screen, but the phone has altered the eyes so that it appears they are looking at the camera. This makes the viewer feel like they have eye contact with the person they are speaking to.
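For anyone wondering why the uncorrected gaze is so noticeable in the first place, a quick back-of-envelope calculation helps; the camera offset and viewing distance below are illustrative guesses, not measured values.

```swift
import Foundation

// Back-of-envelope: why uncorrected gaze reads as "looking down".
// Assumption: the front camera sits roughly 6 cm above the point on the
// screen you're looking at, and your face is roughly 35 cm from the phone.
// Both numbers are illustrative guesses, not measurements.
let cameraOffset = 0.06    // metres, camera to gaze target on screen
let viewingDistance = 0.35 // metres, face to phone

// Angle between "looking at the screen" and "looking at the camera".
let gazeErrorDegrees = atan(cameraOffset / viewingDistance) * 180 / Double.pi
print(String(format: "Apparent downward gaze: ~%.0f degrees", gazeErrorDegrees))
// Prints roughly 10 degrees - easily visible to the person on the other end.
```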
Yes, but to illustrate the feature convincingly, the subject needs two screen captures without moving - one with the feature turned off, one with it turned on.
Assuming this is not a joke...
This is pretty disturbing.
Hopefully, this is at least off by default in all versions of the OS going forward.
Removal of double chin. Brave new world.
A new feature in the latest iOS 13 beta makes users appear to be looking directly at the camera during FaceTime calls, creating the impression of eye contact even when they're actually looking away from the camera at the image of the other person on their screen.
[Image: The FaceTime Correction feature as demoed by Will Simon (@Wsig)]
The new "FaceTime Attention Correction" feature, first spotted by Mike Rundle on Twitter, can be turned on and off in the FaceTime section of the Settings app, although it only appears to work on iPhone XS and XS Max devices in the third iOS 13 beta sent out to developers on Tuesday.
Why the feature is limited to these devices right now remains unknown. It clearly relies on some form of image manipulation to achieve its results, so maybe the software algorithms require the more advanced processing power of Apple's latest devices.
Rundle predicted in 2017 that FaceTime attention correction would be introduced by Apple in "years to come," but its apparent inclusion in iOS 13, due to be released this fall, has surprised and impressed him.
For more details on the many features coming to iPhones with iOS 13, be sure to check out our comprehensive MacRumors roundup.
Article Link: Attention Correction Feature in iOS 13 Beta Enables Appearance of Eye Contact During FaceTime Calls
Are you really so disturbed by this? Why? First, it’s not doing it secretly. Second, there’s image and video manipulation happening all the time already. This isn’t a new scary development.

It’s scary that you view this as normal.
Awesome and creepy at the same time.
This is just the beginning.
In the near future AI combined with Quantum computers will be able to alter our physical reality in similar ways, bringing everything we see into question.
This is my biggest gripe when trying to get a selfie with other people, constantly having to say "LOOK AT THE CAMERA, NOT THE SCREEN!"
There would be a lot of complex AI work involved in doing this, I imagine. Basically the phone needs to track where you’re looking relative to the screen, zone out your eyes, and then generate a new set of eyes looking elsewhere, all in real time, as a processed FaceTime video feed. I reckon only the neural engine in the A12 can do this well enough, but I’m surprised the XR isn’t supported, since that has all the same Face ID and SoC hardware.

It's using ARKit 2. That's my understanding.
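For what it's worth, ARKit 2 does expose a per-frame gaze estimate through public API (ARFaceAnchor.lookAtPoint), which is the kind of signal a correction step would need. Here is a minimal sketch of that tracking half only; the eye re-synthesis step is not public API, so it isn't shown.

```swift
import ARKit

// Sketch only: the gaze-tracking half of a hypothetical correction pipeline,
// using public ARKit 2 API. ARFaceAnchor.lookAtPoint gives an estimated gaze
// target in face-anchor space. Requires the TrueDepth camera, which is why
// face tracking is limited to iPhone X-class hardware.
final class GazeMonitor: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking is only supported on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Non-zero x/y offsets mean the gaze is off-axis from straight
            // ahead - the error a correction step would have to compensate.
            let gaze = face.lookAtPoint
            print(String(format: "gaze offset x: %+.3f  y: %+.3f", gaze.x, gaze.y))
        }
    }
}
```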
The big question is, what comes next?
Plato just called… he'd like to have a word with you, but reception is spotty in his cave.