One of iOS 13’s new features corrects your eye contact during FaceTime calls. It’s certainly clever, but is it a bit too freaky as well?
As a journalist I’m increasingly asked to conduct interviews via Skype, WhatsApp or FaceTime, but two things really get my goat. One is the quality of the service, which even over a decent Wi-Fi connection still causes almost unworkable buffering and crashes (Skype being the worst culprit). The second is that I’m generally typing what my interviewee is saying. Even with my very best touch-typing skills (clue: not that great) I’m basically head down over the keyboard, which isn’t a good look when you are trying to elicit information from someone.
In short, I prefer a good old voice call.
Apple has come up with an answer: artificially reinstating the line of sight between callers using FaceTime.
The new feature, FaceTime Attention Correction, makes it look like you’re staring directly at your front-facing camera during calls, rather than at the device’s screen. The person calling simply appears to be looking right at you, instead of at your nose or your chin.
Apparently, the effect is achieved by using ARKit to grab a depth map and position of your face, then adjusting the eyes accordingly.
If you didn’t know about it then you’d probably never guess the effect was being applied. Plus, you can turn the function off.
For now, the feature appears to be rolling out only to the iPhone XS and iPhone XS Max, but it will get a wider release when iOS 13 officially goes live later this year.
It isn’t something the internet has been clamouring for, but now it’s here perhaps this will become the new socially accepted norm.
Except that of course you wonâ€™t be looking into someoneâ€™s eyes. The effect will be faked.
Will that interfere with social discourse? If eyes are the windows to the soul then meaningful conversation will grind to a standstill.
Extrapolating that, future AI/AR enhancements could make it appear as if we’re really, truly listening to someone (a loved one?) when in fact we’re picking our nose or yawning.
Why not change our location, too: we could actually be at someone else’s home or on the beach, yet appear to be at home or on the train? It could change our clothing or remove other people from the background.
It need not even be you on the call but someone pretending to be you (I haven’t worked out why that would be needed, but Mission Impossible has been trading on such deep fakes for years).
The truth is already up for debate, so when we can manipulate and change any part of an image in real time with pixel-perfect precision, where do we draw the line, and how do we separate the real from the simulacrum?
Or does the simulacrum become the new truth?
We’re straying waaay too far from what is, after all, a tiny tweak to make video calls a little less weird.
Now, if someone can sort out how to make video calls work consistently without crashes, bugs or delays, my life will be complete.