Hi Quagmire!

Welcome to this new world, where what you see is not always the truth.
 
Sorry, but this is a horrible comparison picture. I can't even tell what it's supposedly doing. My guess based on the description is that it "fixes your eyes," so to speak, but the fact that the two shots show entirely different images isn't very useful.
Exactly - I get what the feature is supposed to do, but those images look totally random. I think Mr Hardwick needs to go back to his source...
 
Sorry, but this is a horrible comparison picture. I can't even tell what it's supposedly doing. My guess based on the description is that it "fixes your eyes," so to speak, but the fact that the two shots show entirely different images isn't very useful.

Look at the eyes, not around the eyes.... (Sorry for the UK-centric comedy reference)

The left-hand picture shows what the person was actually doing, namely looking at the screen, so there is no 'eye contact'.

In the right-hand picture, the user is still looking at the screen, but the phone has altered the eyes so that it appears they are looking at the camera. This makes the viewer feel like they have eye contact with the person they are speaking to.
 
Look at the eyes, not around the eyes.... (Sorry for the UK-centric comedy reference)

The left-hand picture shows what the person was actually doing, namely looking at the screen, so there is no 'eye contact'.

In the right-hand picture, the user is still looking at the screen, but the phone has altered the eyes so that it appears they are looking at the camera. This makes the viewer feel like they have eye contact with the person they are speaking to.
Yes, but to illustrate the feature convincingly, the subject needs two screen captures without moving - one with the feature turned off, one with it turned on.
 
Men around the world are rejoicing - this will singlehandedly save their relationships with girlfriends/wives.
 
If they skip the iPhone XR, something smells... I don't believe it needs that much RAM (which is the difference between the XS and the XR).
 
Yes, but to illustrate the feature convincingly, the subject needs two screen captures without moving - one with the feature turned off, one with it turned on.

Well, considering this isn't Apple promoting the feature, I don't see the big deal. It's pretty obvious just looking at the picture provided: on the left it looks like you're talking to a person and they're looking at your chin; on the right it looks like they're making eye contact.
 
Assuming this is not a joke...

This is pretty disturbing.

Hopefully, this is at least off by default in all versions of the OS going forward.
Are you really so disturbed by this? Why? First, it’s not doing it secretly. Second, there’s image and video manipulation happening all the time already. This isn’t a new scary development.
 



A new feature in the latest iOS 13 beta makes users appear as if they're looking directly at the camera to make eye contact during FaceTime calls, when actually they're looking away from the camera at the image of the other person on their screen.

The FaceTime Correction Feature as demoed by Will Simon (@Wsig)

The new "FaceTime Attention Correction" feature, first spotted by Mike Rundle on Twitter, can be turned on and off in the FaceTime section of the Settings app, although it only appears to work on iPhone XS and XS Max devices in the third iOS 13 beta sent out to developers on Tuesday.

Why the feature is limited to these devices right now remains unknown. It clearly relies on some form of image manipulation to achieve its results, so maybe the software algorithms require the more advanced processing power of Apple's latest devices.

Rundle predicted in 2017 that FaceTime attention correction would be introduced by Apple in "years to come," but its apparent inclusion in iOS 13, due to be released this fall, has surprised and impressed him.

For more details on the many features coming to iPhones with iOS 13, be sure to check out our comprehensive MacRumors roundup.

Article Link: Attention Correction Feature in iOS 13 Beta Enables Appearance of Eye Contact During FaceTime Calls

It'd be great if it also corrected receding hairlines.
 
This is just the beginning.
In the near future, AI combined with quantum computers will be able to alter our physical reality in similar ways, bringing everything we see into question.

Don’t worry we’re already in the simulation.
 
Does this happen at the expense of image quality? The right-hand side looks compressed in comparison.
 
When I first saw this setting listed online, I figured it was for when you’re using a Memoji head or something. Crazy.

This is my biggest gripe when trying to get a selfie with other people, constantly having to say "LOOK AT THE CAMERA, NOT THE SCREEN!"

They are probably adding this first because FaceTime streams are a lot lower quality than selfie photos and videos. It might be too uncanny for that, or it might require a faster processor in a newer iPhone to make it look real enough.
 
There would be a lot of complex AI work involved with doing this, I imagine. Basically, the phone needs to track where you're looking relative to the screen, isolate your eyes, and then generate a new set of eyes looking elsewhere, all in real time as part of the processed FaceTime video feed.

I reckon only the neural engine in the A12 can do this well enough, but I'm surprised the XR isn't supported, since it has all the same Face ID and SoC hardware.
It's using ARKit 2. That's my understanding.
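For anyone wondering what "using ARKit 2" would actually give the phone to work with: the sketch below (the class name GazeReader is purely illustrative, and this is not Apple's actual pipeline) reads the per-eye transforms and lookAtPoint that ARKit 2 face tracking exposes on TrueDepth devices. That is roughly the kind of gaze data an attention-correction step could use before warping the eye regions of the video frame.

```swift
import ARKit

// Minimal sketch, not Apple's implementation: read the gaze data that ARKit 2
// face tracking exposes on devices with a TrueDepth camera.
final class GazeReader: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // Point the user is actually looking at, in face-anchor coordinates.
        let gaze = face.lookAtPoint
        // Position of each eyeball; a correction pass would warp the eye regions
        // of the frame so they appear aimed at the camera instead.
        let leftEye = face.leftEyeTransform.columns.3
        let rightEye = face.rightEyeTransform.columns.3
        print("gaze: \(gaze)  left eye: \(leftEye)  right eye: \(rightEye)")
    }
}
```

How the eye regions are then re-rendered (warping, regeneration, or something else entirely) is the part Apple hasn't described; the above only shows where the raw gaze signal could come from.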
 