> The big question is, what comes next? Brave new world.

Automatic attitude correction for MacRumors forum posts.
> Probably "needs" the faster neural processor, forced obsolescence, blah blah blah.

If you consider a phone obsolete because it can't adjust the location of your pupils in a video call, then yes.
> The big question is, what comes next? Brave new world.

When avatars bridge the uncanny valley, then you can make your own face that is indistinguishable from a real one. They are already halfway there with the Animojis.
> Pretty slick! I remember for years the idea to fix this “problem” on Macs, and any device really, was going to be putting a camera embedded behind the display panel, where people’s eyes naturally look, but that has all kinds of technical issues (although I do believe it’s been done?)
>
> This is a much simpler fix that I’m all for. It’s a little creepy, I’ll admit, but in Western culture it’s very important to look someone in the eyes when talking, and I think this goes a long way toward making video calls more natural.
>
> I wonder if there’s a way to disable it, plus I wonder if it’ll be on by default in other countries where eye contact is considered more of an aggressive action?

I’m sure there’ll be an option to toggle on/off.
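The "much simpler fix" under discussion is, at its core, an image-warping trick: shift the eye pixels so the gaze lands on the camera instead of the screen. Below is a deliberately toy sketch of that idea in Python/NumPy. The function name, patch size, and fill strategy are all my own inventions for illustration; Apple's actual implementation (reportedly ARKit-based face-mesh warping) is far more sophisticated.

```python
import numpy as np

def correct_gaze(eye_patch: np.ndarray, dy: int) -> np.ndarray:
    """Toy 'attention correction': translate the pupil upward by dy pixels
    inside a fixed eye-region patch, filling the vacated bottom rows with
    the top row's value (a crude stand-in for inpainting)."""
    shifted = np.roll(eye_patch, -dy, axis=0)   # move pupil up by dy rows
    shifted[-dy:, :] = eye_patch[0, :]          # fill the rows left behind
    return shifted

# A fake 8x8 grayscale eye patch: dark pupil (value 10) sitting low,
# i.e. the subject is looking down at the screen rather than the camera.
eye = np.full((8, 8), 200, dtype=np.uint8)
eye[5:7, 3:5] = 10

corrected = correct_gaze(eye, dy=2)
assert (corrected[3:5, 3:5] == 10).all()  # pupil now two rows higher
```

A real system would blend the warped region back into the frame and track the eyes per frame, but the basic operation is this kind of local displacement rather than any full re-rendering of the face.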
> I spend most FaceTime calls looking at myself in the corner lol

Imagine a small kid FaceTiming with dad.
I want to see people's real eyes.
> I've never turned FaceTime on... what is it actually for?? Honest question! It's not like you can see the person while having the phone to your ear anyway... (Maybe I'm old - I know kids use it to goof around.)

The point is seeing the person. You aren't supposed to put it to your ear when it's on. It puts it on speakerphone so you can hear, or you can use headphones.
> When avatars bridge the uncanny valley, then you can make your own face that is indistinguishable from a real one. They are already halfway there with the Animojis.

Except an AI can still tell the difference.
> Does anyone complain that auto spell correct "distorts reality" by making the sender appear to have a better command of the English language and spelling than they really have?

Certainly. There are some situations in which it can be quite helpful to be able to see quickly when the writer is careless or uneducated.
> This is just the beginning.
>
> In the near future, AI combined with quantum computers will be able to alter our physical reality in similar ways, bringing everything we see into question.

We already have to do that with most of the information we process... some sources we know are easy to fake, so we don't trust them readily, like photos.
> When avatars bridge the uncanny valley, then you can make your own face that is indistinguishable from a real one. They are already halfway there with the Animojis.

While this is based on machine learning, it should mostly be a matter of computing power to expand the project into 3D: https://thispersondoesnotexist.com/
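The linked site serves faces sampled from a generative adversarial network (StyleGAN). The core idea can be sketched with a toy stand-in: a fixed "generator" maps a random latent vector through learned weights to a structured image, so every output is novel yet (in a real model) face-shaped. Everything below, including the dimensions and the single linear layer, is illustrative and bears no resemblance to StyleGAN's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for trained generator weights; a real GAN has millions of
# parameters learned from a face dataset, not random numbers.
W = rng.normal(size=(16, 64 * 64))

def generate_face(latent: np.ndarray) -> np.ndarray:
    """Map a 16-dim latent vector to a 64x64 'image' via a fixed linear
    layer plus tanh. A toy stand-in for a GAN generator: the output is
    deterministic given the latent, but no such person exists."""
    img = np.tanh(latent @ W).reshape(64, 64)
    return ((img + 1) / 2 * 255).astype(np.uint8)  # scale to 0..255

z = rng.normal(size=16)        # a fresh latent = a fresh "face"
face = generate_face(z)
assert face.shape == (64, 64) and face.dtype == np.uint8
```

Sampling a different latent vector yields a different output every time, which is why the site shows a new "person" on each refresh.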
> Certainly. There are some situations in which it can be quite helpful to be able to see quickly when the writer is careless or uneducated.
>
> You would not believe the number of résumés littered with spelling errors that headhunters used to send me for quality assurance positions, as if attention to detail weren’t a requirement. Being able to see at a glance that two-thirds of the applicants were clearly unsuited was a great time-saver.

OK, I have to admit it, you got me on that one.
> I spend most FaceTime calls looking at myself in the corner lol

Apple engineers assume that all Mac users position their monitors so they are always looking directly at the bottom-left corner of the screen. This is what happens when I work with application windows that filled a 15-inch MacBook Pro screen and later connect a 30-inch, 2560x1600 monitor. Furthermore, many disk image files keep opening in the bottom-left area of the screen by default. Many applications keep opening in the bottom-left area of the screen no matter what I do. I first noticed these issues in 10.10 Yosemite and they continue to this day. I have never seen an operating system handle window positioning as stupidly as macOS.
What if looking at yourself is important? I mean... I know what they look like. But maybe I have a hair out of place... you really need to know these things so you can look your best. Hmm... maybe the image of us should be bigger so I can tell if I have something stuck in my teeth while I’m talking. This stuff is important.
DeepFakes are nearing fulfillment, and they will be marketed as fun, similar to Memoji/Animoji.
The minimum requirement, I suspect, is an A12 with 4GB of RAM or more.
Not blind at all; his comparison pictures are terrible and give no initial frame of reference. We don't know if he's looking down and to the left in the second image because it's a completely different angle, his facial expression is completely different, and his distance to the camera is different too. This gives the appearance of him having taken these pictures at wildly different times.
A good set of pictures would have him in the exact same spot: one picture of him looking down and to the left, and the next picture, in the same position, with the feature turned on to show the difference.
> "so maybe the software algorithms require the more advanced processing power of Apple's latest devices"

The XR is also compatible with this feature.
What a joke. The iPhone X or XR could easily do this.