Apple announced several of the headlining features of its upcoming iOS 13 during WWDC, but people playing with the beta version have revealed some extra tools. One newly found addition is FaceTime Attention Correction, which adjusts the image during a FaceTime video call to make it look like a person is looking into the camera rather than at their device’s screen.
In practice, that means that while you and your contact are each looking at the other’s face on screen, you’ll both appear to be making direct eye contact. Mike Rundle and Will Sigmon were the first to tweet about the find, and they describe it as magical, “next-century shit.” Another beta tester, Dave Schukin, said that the feature depends on ARKit to make a map of a person’s face and uses that map to inform the image adjustments.

The feature appears to be rolling out only to the iPhone XS and iPhone XS Max in the current beta. It should reach the general public when iOS 13 officially goes live, likely sometime this fall.

Apple has been introducing more and more features focused on automatically adjusting images. It has given its cameras tools like Smart HDR, which analyzes and composites multiple frames for the “best” shot, and automatic reduction of the effects of shaky hands. Usually, these tools are optional, although you may need to dig around in your device’s settings to turn them off, since some are on by default.
It’s a smooth application of Apple’s augmented reality tools, which are admittedly impressive and powerful.