Uhhh... not at all.

The first photo shows him looking at the screen, with his eyes pointing down. The other photo shows him looking at the screen in iOS 13 beta 3, but his eyes appear to look straight ahead.

Like it’s kinda obvious what they are trying to demonstrate. smh

FaceTime Attention Correction adjusts your gaze, so it doesn't look like you're staring down at the screen.

An off-center camera will always do that; it doesn't matter if the subject changes angle or perspective, the user will always appear to be looking away unless they look directly at the camera.

If you got the memo about what FaceTime Attention Correction does, then you should easily make out what they're trying to convey in the comparison. It's not that hard.

It's 1000% less obvious and less well demonstrated than the side-by-side photo here, though... https://9to5mac.com/2019/07/03/facetime-eye-contact-correction-in-ios-13-uses-arkit/
 
So in effect they're altering the video in realtime to make you more polite?
I'd say it's not about being polite. It's about representing reality more <gasp> faithfully.

In reality, you really are looking directly into the person's eyes (or rather the computerized rendering of their eyes on your screen). What this manipulation does is give the other party a better impression of what you're really doing.

Some would still call it lying. I'd say it's more of a white lie: A helpful lie.

To use a related example, compressed JPEGs are, strictly speaking, lies. They're manipulated images. That said, they're good and accepted lies because they alleviate bandwidth/storage restrictions, which, in turn, makes other good things possible.
 
Uhhh... not at all.

The first photo shows him looking at the screen, with his eyes pointing down. The other photo shows him looking at the screen in iOS 13 beta 3, but his eyes appear to look straight ahead.

Like it’s kinda obvious what they are trying to demonstrate. smh

FaceTime Attention Correction adjusts your gaze, so it doesn't look like you're staring down at the screen.

An off-center camera will always do that; it doesn't matter if the subject changes angle or perspective, the user will always appear to be looking away unless they look directly at the camera.

If you got the memo about what FaceTime Attention Correction does, then you should easily make out what they're trying to convey in the comparison. It's not that hard.

No, it's not hard at all. It's quite easy to understand what they intended to demonstrate. It's also trivial to conclude that they utterly failed to do so. Previous posts have outlined all the technical shortcomings of the demo photos, so I don't understand what you are missing. It doesn't have to do with what they are _trying_ to convey... the issue is that they didn't convey it at all, due to poor planning, positioning, expression matching, etc.
 
The XR supports this, and it has 3GB of RAM.

I'm sure the other A12 iPads do too.

IPP has the option. I tried it and it doesn't seem to be working, though, unless you need to use it in portrait.

Edit: Nope, same in portrait. (Yeah, it's the 2018 Face ID one.)
 
For anyone who hasn’t seen them, these are the comparative “before” and “after” images 9to5Mac used in their story.

[Attached image: 9to5Mac's before/after comparison]
 
For all the privacy concerns, I'm more concerned about honest communication. I'd much prefer that faces be represented exactly as they appear in real life. If someone is not looking at the screen, or is looking at the corner, that's fine; I want to see that. Those visual cues are important in honest, empathetic communication.
 
IPP has the option. I tried it and it doesn't seem to be working, though, unless you need to use it in portrait.

Edit: Nope, same in portrait. (Yeah, it's the 2018 Face ID one.)

Yes, only the 2018 iPP has this feature. The Air 3 and Mini 5 don't, because they lack the TrueDepth camera.
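
Out of curiosity, here's a minimal Swift sketch of how an app could test for the same hardware capability. It assumes attention correction is gated on ARKit face tracking (which needs the TrueDepth camera on iOS 13); whether FaceTime uses exactly this check is an assumption, not something Apple documents.

[CODE]
import ARKit

// Hypothetical availability check. On iOS 13, ARKit face tracking requires the
// TrueDepth camera, which is why the Air 3 and Mini 5 (A12, but no TrueDepth)
// miss out. Whether FaceTime gates attention correction on exactly this flag
// is an assumption.
func hasTrueDepthFaceTracking() -> Bool {
    return ARFaceTrackingConfiguration.isSupported
}

print(hasTrueDepthFaceTracking()
    ? "TrueDepth face tracking available"
    : "No TrueDepth face tracking on this device")
[/CODE]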
 
Fer real!! Everybody knows they could make iOS 13 run on the 2007 iPhone if they wanted to.
 
For all the privacy concerns, I'm more concerned about honest communication. I'd much prefer that faces be represented exactly as they appear in real life. If someone is not looking at the screen, or is looking at the corner, that's fine; I want to see that. Those visual cues are important in honest, empathetic communication.

What’s your point? This feature does that. If you are looking directly at the person on the screen, it now appears that way to the viewer on the other end of the FaceTime call.

If you are looking off to the side, or whatever, it still looks that way.

This feature just corrects for the fact that your camera is not in the middle of your screen, so when you look at the picture on the screen you are not looking at the camera.
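
To put rough numbers on that camera-offset point, here's a quick back-of-the-envelope Swift calculation. The camera offset and viewing distance below are illustrative assumptions, not measurements.

[CODE]
import Foundation

// Assumed, illustrative numbers: the front camera sits a few centimetres above
// the spot on the screen you're actually looking at.
let cameraOffsetCm = 6.0      // vertical offset from gaze target to camera
let viewingDistanceCm = 35.0  // typical phone-to-face distance

// Apparent downward gaze angle as seen by the camera: atan(offset / distance).
let gazeErrorDegrees = atan(cameraOffsetCm / viewingDistanceCm) * 180.0 / Double.pi
print(String(format: "Apparent downward gaze: about %.0f degrees", gazeErrorDegrees))
// Roughly 10 degrees, which is the kind of offset attention correction undoes.
[/CODE]

So even when you're looking straight at the other person's face on screen, the camera sees your eyes pointed noticeably below it; the correction just compensates for that geometry.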
 
Or you could have worked on getting the camera to be in the center of the phone, hidden behind the screen...

And what does a blink look like? What happens when you actually look up into the camera? What about people with lazy eyes?
 
What’s your point? This feature does that. If you are looking directly at the person on the screen, it now appears that way to the viewer on the other end of the FaceTime call.

If you are looking off to the side, or whatever, it still looks that way.

This feature just corrects for the fact that your camera is not in the middle of your screen, so when you look at the picture on the screen you are not looking at the camera.
No, what this feature does is give a false image of what's going on. If someone's eyes aren't looking at the camera dead on and they're looking away, that's fine. This fakery in communications is nonsense.
 