As if fake photos and videos on Facebook and Twitter weren’t already hard enough to spot, it’s only going to get more difficult, if not impossible, with advances in a technology called “deep fakes.”
Perhaps you saw a video that went viral two years ago of former president Barack Obama using profanity to describe president Donald Trump. In truth, the voice of Mr. Obama belonged to comedian, movie star and producer Jordan Peele. The video was intended to demonstrate deep fake technology and encourage viewers to be skeptical of shocking videos of celebrities and other notable people saying or doing things out of the ordinary.
“It’s not just that it sounds like [them], it’s not just that you see the person’s face, it’s the mannerisms too. Everything is the same,” says Adam Chiara, a communications professor at The University of Hartford. Chiara has been studying deep fakes in recent years and says the potential for people to use them to incite others or spread misinformation on social media is enormous.
“It is scary,” he said from his home office Wednesday. “And that’s why I think we need to be prepared, because it’s coming one way or the other. The only question is: are we, as a society, going to be ready for it or not?”
I saw the technology first-hand at CES in January, where dozens of companies were demonstrating their deep fake technology. One of them was Deep Nen, an Israeli company that showed me how deep fake videos and GIFs can be easily created with a smartphone and its camera. They took my photo and put it over video of President Trump.
“We can take a target face and swap it with a source face,” explained Anne Marie, holding the smartphone running the app. She also explained that Deep Nen’s technology can be used in Hollywood.
“We can also scale and virtualize, say, George Clooney’s face when he can’t be in a commercial in Germany. We can set up the scene so that we can just implant his face on somebody else’s.”
Deep Nen is also working with Facebook to develop technology that will spot and stop fake news before it spreads.
Chiara says that’s great, but social media platforms may not be able to flag and take down deep fake videos before they are shared hundreds or even millions of times. He offers this advice to everyone just learning about deep fakes and the dangers they can create:
“When we see something that might be a little suspicious,” he said, “take a minute and maybe not share it or reshare it or comment on it, to see if we can find it somewhere else. Has a media outlet talked about this and determined that it is false before we decide to share it or talk about it?”
He says it’s critical that people know about this technology, because it is only going to get better and easier for anyone to use.