Apple's new iOS feature allows you to make proper eye contact during FaceTime calls
In the latest iOS 13 beta, Apple has introduced a cool new feature for iOS devices called FaceTime Attention Correction. When we make video calls on mobile devices, we look at the phone's display the whole time to see the other person. Since we are not looking directly at the camera, we can't make proper eye contact. FaceTime's Attention Correction feature solves this by using ARKit: it grabs a depth map and the position of your face in real time during the video call and adjusts your eyes accordingly. Apple claims your eye contact will be more accurate when you turn on this feature.
How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.
Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN
— Dave Schukin (@schukin) July 3, 2019
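As a rough illustration of the kind of per-frame face data the tweet refers to, here is a minimal Swift sketch that reads ARKit's face mesh, eye transforms, and gaze estimate. Apple has not published how Attention Correction itself is implemented, so this only shows what ARKit's face-tracking API exposes; the class name FaceTrackingDelegate and the start() method are hypothetical.

```swift
import ARKit

// Illustrative sketch only: shows the face-tracking data ARKit exposes each
// frame. This is not Apple's Attention Correction implementation.
final class FaceTrackingDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking runs only on supported devices (requires camera access on a real device).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Per-frame face mesh plus eye poses and an estimated gaze point.
            let mesh = faceAnchor.geometry             // ARFaceGeometry (vertex positions)
            let leftEye = faceAnchor.leftEyeTransform  // simd_float4x4
            let rightEye = faceAnchor.rightEyeTransform
            let gaze = faceAnchor.lookAtPoint          // estimated gaze target in face space
            print(mesh.vertices.count, leftEye, rightEye, gaze)
        }
    }
}
```

In principle, data like this (a depth map of the face and the positions of the eyes) is what would let the system warp the eye region so you appear to be looking at the camera rather than at the screen, which matches the warping visible across the eyes and nose in the tweet's demo.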