Apple’s new iOS feature allows you to make proper eye contact during FaceTime calls

In the latest iOS 13 beta, Apple has introduced a neat new feature for iOS devices called FaceTime Attention Correction. When we make video calls on a phone, we tend to look at the display to see the other person rather than directly at the camera, so we never make proper eye contact. FaceTime's Attention Correction feature addresses this by using ARKit: it captures a depth map and the position of your face in real time during the call and adjusts the appearance of your eyes accordingly. Apple claims that with the feature turned on, your eye contact will look more natural.
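To give a rough sense of the building blocks involved, here is a minimal Swift sketch of how ARKit exposes real-time face depth, eye transforms, and gaze direction on TrueDepth-equipped devices. This is only an illustration of the publicly available ARKit face-tracking API, not Apple's actual FaceTime implementation; the image-warping step that redirects the eyes is Apple's own and is not part of any public API.

```swift
import ARKit

// Illustrative sketch: reading the face/eye data that a feature like
// Attention Correction could build on. NOT Apple's FaceTime code.
final class EyeTrackingSession: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth front camera.
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking is not supported on this device")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever ARKit updates its tracked anchors.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Depth-based face pose and per-eye transforms, updated in real time.
            let facePose = faceAnchor.transform
            let leftEye  = faceAnchor.leftEyeTransform
            let rightEye = faceAnchor.rightEyeTransform
            // The point the user is looking at, in face coordinate space.
            let gaze = faceAnchor.lookAtPoint

            // A hypothetical attention-correction pipeline would compare `gaze`
            // with the camera position and warp the eye region of the video
            // frame so the eyes appear to look into the lens.
            _ = (facePose, leftEye, rightEye, gaze)
        }
    }
}
```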
