What seems to be a paradox in the high tech community these days? All of the major tech giants have embraced privacy in their public relations, yet even Apple struggles to deliver it without a gaffe here and there. And still, facial recognition is a handy way to help us sort through 25,000 photos of friends and family members.
Videos are not photos, and the former could use a savior, too. Why? Deepfakes are a new use of technology that only adds another layer of high tech scourge to humanity. Wouldn’t it be nice if someone could save us? Guess who is trying?
Facebook. Yeah, that Facebook. Not Apple. Not Google. Not Microsoft. Facebook. Ewdison Then:
The company’s own researchers are developing AI to combat other AI, perhaps including Facebook’s, that try to identify and tag faces in videos.
So, Facebook is building facial recognition to further determine who we are and what we look like, and at the same time is working on artificial intelligence (AI) to thwart it?
What’s going on?
While the Internet seems to be so fascinated by the entertainment value of some “deepfake” apps and services, others are already raising alarms over their privacy and accuracy implications.
Imagine if someone created a deepfake video of you threatening the president or admitting to an affair or confessing to stealing money from a bank.
It is now all too easy to spread misinformation, and almost trivial to identify people just from photos. Especially the ones they willingly submit just to swap faces with someone else.
Yes, deepfake videos are that good, and it takes specially trained personnel to tell the difference between real and fake.
Facebook’s AI can help. All we need to do is have it built into our cameras, so when we make a video and post it to Facebook, Google’s YouTube, or wherever, it gets scrubbed so artificial intelligence can’t figure out who we are even if humans can.
This AI uses an adversarial autoencoder to apply slight distortions to a person’s moving face. Although the faces are still recognizable to our human brains, face recognition software can be thrown off by those changes. Best of all, since the AI doesn’t have to be retrained for each video, it can, in theory, be applied to any live video feed.
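To get a feel for the idea, here is a minimal sketch of adversarial de-identification. Everything in it is an assumption for illustration: the "face" is a random pixel vector, and a fixed linear projection `W` stands in for a face recognition network's embedding. The point it demonstrates is the one in the quote above: a tiny, bounded change to the pixels (invisible to a human) can move the image's embedding far enough that a recognizer would no longer match it. This is not Facebook's actual system, just the underlying trick.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (assumptions, not Facebook's model): a flattened
# 64x64 "face image" and a linear embedding W @ x in place of a
# deep face-recognition network.
D, E = 64 * 64, 128
W = rng.normal(size=(E, D)) / np.sqrt(D)
face = rng.uniform(0.0, 1.0, size=D)

def embed(x):
    return W @ x

def deidentify(x, eps=0.03, steps=50):
    """Nudge each pixel by at most eps so the embedding drifts toward
    a decoy identity (FGSM-style signed-gradient steps), leaving the
    image visually almost unchanged."""
    decoy = rng.normal(size=E)  # an arbitrary "wrong" identity
    x_adv = x.copy()
    for _ in range(steps):
        # gradient of 0.5 * ||W x_adv - decoy||^2 with respect to x_adv
        grad = W.T @ (embed(x_adv) - decoy)
        x_adv = x_adv - np.sign(grad) * (eps / steps)
        x_adv = np.clip(x_adv, x - eps, x + eps)  # keep change imperceptible
        x_adv = np.clip(x_adv, 0.0, 1.0)          # stay a valid image
    return x_adv

protected = deidentify(face)
pixel_change = np.max(np.abs(protected - face))
embed_shift = np.linalg.norm(embed(protected) - embed(face))
print(f"max pixel change: {pixel_change:.3f}")
print(f"embedding shift:  {embed_shift:.2f}")
```

With a deep network in place of `W`, the gradient comes from backpropagation instead of a matrix transpose, but the asymmetry is the same: the per-pixel change stays tiny while the embedding, which is all the recognizer sees, moves substantially. That asymmetry is what lets the distorted faces stay recognizable to us while throwing off the software.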
This is something Apple could do to make iPhones even more private and protect customers from people who might steal identities for fun, profit, or prison terms.