Monday 22 July 2019

This new FaceTime effect is both clever and freaky at the same time!

RedShark News
One of iOS 13's new features corrects your eye contact during FaceTime calls. It's certainly clever, but is it a bit too freaky as well?
As a journalist I’m increasingly asked to conduct interviews via Skype, WhatsApp or FaceTime, but two things really get my goat. One is the quality of the service, which even over a decent WiFi connection still suffers from almost unworkable buffering and crashes (Skype being the worst culprit). The second is that I’m generally typing what my interviewee is saying. Even with my very best touch-typing skills (clue: not that great) I’m basically head down over the keyboard, which isn’t a good look when you’re trying to elicit information from someone.
In short, I prefer a good old voice call.
Apple has come up with an answer: artificially reinstating the line of sight between callers using FaceTime.
The new feature, FaceTime Attention Correction, makes it look like you’re staring directly at your front-facing camera during calls, rather than at the device’s screen. It simply looks like the person calling is looking right at you, instead of at your nose or your chin.
Apparently, the effect is achieved by using ARKit to grab a depth map and position of your face, and then adjusting the eyes accordingly.
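Apple hasn’t said exactly how Attention Correction works under the hood, but ARKit’s face tracking does expose the kind of per-frame data such a correction could build on: a depth-mapped face mesh, per-eye transforms and a look-at point. Below is a minimal, purely illustrative Swift sketch, not Apple’s implementation, that reads those values and works out how far your real gaze is from the camera lens; the GazeReader class and the logging are hypothetical scaffolding of my own.

```swift
import ARKit
import Foundation
import simd

// Hypothetical sketch: reads ARKit's face-tracking gaze data and measures
// how far the user's real gaze is from the front camera. This is NOT how
// Apple implements Attention Correction; it only shows the raw inputs
// (eye transforms, look-at point) such a correction could work from.
final class GazeReader: NSObject, ARSessionDelegate {

    let session = ARSession()

    func start() {
        // Face tracking needs a TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called every time ARKit updates its anchors, including the face anchor.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first,
              let cameraTransform = session.currentFrame?.camera.transform else { return }

        // Eye positions and look-at point, all in the face anchor's local space.
        let leftEye = simd_make_float3(face.leftEyeTransform.columns.3)
        let rightEye = simd_make_float3(face.rightEyeTransform.columns.3)
        let eyeCentre = (leftEye + rightEye) * 0.5
        let lookAt = face.lookAtPoint

        // Express the camera's position in that same face-local space.
        let cameraInFaceSpace = face.transform.inverse * cameraTransform
        let cameraPosition = simd_make_float3(cameraInFaceSpace.columns.3)

        // Angle between where the eyes actually point (towards the screen)
        // and where they would need to point to meet the camera lens.
        let actualGaze = simd_normalize(lookAt - eyeCentre)
        let towardCamera = simd_normalize(cameraPosition - eyeCentre)
        let offsetRadians = acos(min(1, max(-1, simd_dot(actualGaze, towardCamera))))

        // A real correction pass would re-render or warp the eye region by
        // this offset; here we simply report it.
        let degrees = Double(offsetRadians) * 180 / .pi
        print(String(format: "Gaze is %.1f degrees off the camera axis", degrees))
    }
}
```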
If you didn’t know about it then you’d probably never guess the effect was being applied. Plus, you can turn the function off.
For now the feature appears to be limited to the iPhone XS and iPhone XS Max, but it should get a wider release when iOS 13 officially goes live later this year.
It isn’t something the internet has been clamouring for, but now that it’s here, perhaps this will become the new socially accepted norm.
Except that of course you won’t be looking into someone’s eyes. The effect will be faked.
Will that interfere with social discourse? If eyes are the windows to the soul, then meaningful conversation will grind to a standstill.
Extrapolate from that and future AI/AR enhancements could make it appear as if we’re really, truly listening to someone (a loved one?) when in fact we’re picking our nose or yawning.
Why not change the location too, so that instead of appearing where we actually are, at someone else’s home or on the beach, we appear to be at home or on the train? It could change our clothing or remove other people from the background.
It need not even be you on the call, but someone pretending to be you (I haven’t worked out why that would be needed, but Mission: Impossible has been trading on such deep fakes for years).
The truth is already up for debate, so when we can manipulate and change any part of an image in real time, with pixel-perfect precision, where do we draw the line and how do we separate the real from the simulacrum?
Or does the simulacrum become the new truth?
We’re straying waaay too far from what is, after all, a tiny tweak to make video calls a little less weird.
Now, if someone can sort out how to make video calls work consistently without crashing, bugs or delays, my life will be complete.
