Deepfakes may ruin the world, and you with it


Can you envision a world where you see a video of a world leader engaged in a terrible act, but you can’t tell whether it’s real or a completely computer-generated fake? What about a world where pretty much anyone can make one of these fake videos on a regular computer using free tools? This sounds awful, right? Well, it gets worse.

Imagine seeing yourself in a video saying things you’ve never said, controversial things that could cost you your job or humiliate you in front of your friends and family. How would you respond if you saw your face on a character in an adult movie? Scary, right? The worst part is that, within a few months, anyone with a regular computer will be able to create such fake videos very easily. Already, some mobile apps create fake videos that are impossible to distinguish from real ones. And this is just the beginning, considering how fast machine learning technology has developed over the last few years.

What is a deepfake?

A deepfake is a video created using machine learning to capture one person’s facial movements and place them onto another person’s face. This allows the creators of these videos to manipulate viewers. Let’s look at some scary examples to understand how it works.

Watch this video of former U.S. President Barack Obama, created using artificial intelligence.

Do you think Kim Kardashian and Mark Zuckerberg would say these things in public?

“I genuinely love the process of manipulating people online for money.”

@bill_posters_uk

“...whoever controls the data controls the future.”

These edited videos are called “deepfakes”, and they are made using artificial intelligence (AI). The technology uses existing footage of a person as training data, capturing facial structure, expressions, and speech to produce a video that looks hyper-realistic. Similar technology has been used in Hollywood movies before, at great cost. Recently, however, a Chinese app named Zao was released that does the same thing for free and in very little time. Keep reading. It gets worse.

See this guy who swapped his own face onto DiCaprio’s.
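For the technically curious: most face-swap deepfakes rest on a surprisingly small idea, a single shared encoder paired with one decoder per person. The sketch below is written in PyTorch purely as an illustration; every layer size, variable name, and tensor here is a placeholder of our own, not code from Zao or any other real tool. It shows the core trick: encode person A’s face, then decode it with person B’s decoder, and you get B’s face wearing A’s expression.

```python
# Minimal sketch of the shared-encoder / two-decoder deepfake recipe.
# All shapes and names are illustrative placeholders.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),  # identity-free "expression" code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid()  # 32 -> 64
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

# Training: each person is reconstructed with their own decoder, so the
# shared encoder learns pose and expression rather than identity.
face_a = torch.rand(1, 3, 64, 64)  # stand-in for a cropped, aligned face of person A
reconstruction_loss = nn.functional.l1_loss(decoder_a(encoder(face_a)), face_a)

# The "swap": encode person A, decode with person B's decoder.
fake_b_with_a_expression = decoder_b(encoder(face_a))
print(fake_b_with_a_expression.shape)  # torch.Size([1, 3, 64, 64])
```

Real tools add face detection, alignment, and blending on top, but the heart of it is roughly this small, which is exactly why the technology spread so quickly.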

Deepfakes mean anyone can easily create fake news and videos

Apps like Zao that allow people to create deepfake videos can cause social chaos. This technology has the potential to harm us in many different ways. Beyond its threat to political systems and election decision-making, it can ruin relationships between people, and it opens a very wide door for stalkers, harassers, vindictive ex-partners, and other perpetrators.

When anyone can create such videos, it is very hard to hold their creators accountable. Fake videos do not target only politicians; many journalists and activists are already concerned about deepfakes amplifying local harassment and targeting innocent people. Deepfakes can hurt anyone, regardless of age, gender, country, color, or social status.

Is there any way to detect deepfake videos?

Although institutions and companies like DARPA (the Defense Advanced Research Projects Agency), Facebook, and Microsoft are trying to raise awareness of the problem, there is still no finished algorithm that can reliably detect such videos once they are uploaded to platforms like Facebook, Instagram, or YouTube. Recently, Facebook and Microsoft launched a contest for detecting deepfake videos and stated that they are willing to pay $10 million to the winners.
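To give a feel for what a detector even looks like, here is a deliberately simplified baseline. It is not Zemana’s method, nor Facebook’s or Microsoft’s; it just treats detection as binary real-vs-fake classification on cropped face frames using an off-the-shelf image model, with placeholder tensors standing in for a real labeled dataset.

```python
# Illustrative baseline only: fine-tune a standard image classifier to
# output real-vs-fake scores for individual face crops from a video.
import torch
import torch.nn as nn
from torchvision import models

# Start from ImageNet weights (downloaded on first use) and replace the head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: real, fake

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# In practice face_batch would come from a face detector run over video frames,
# with labels 0 = real, 1 = deepfake. Random tensors stand in for real data here.
face_batch = torch.rand(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

logits = model(face_batch)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()

# At inference time, average per-frame fake probabilities across the whole video.
probs = torch.softmax(logits, dim=1)[:, 1]
print("mean fake probability:", probs.mean().item())
```

Production detectors combine far more signals than a single frame classifier, which is part of why a truly reliable, platform-scale solution still does not exist.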

Lately, we at Zemana have been focusing entirely on how to detect deepfakes, because we see this as the biggest social problem of the near future. We have been working on the issue for a year now, and our research makes it clear that deepfakes can do real damage to both political and personal life. For now, it is innocent internet users who are suffering most from deepfakes. We are building solutions for companies, institutions, and individuals. We are confident that we are ahead of the cybercriminals, and we will have a surprise to announce at the NeurIPS (Conference on Neural Information Processing Systems) conference this year. We at Zemana value privacy rights and strongly oppose violations of human rights.

Stay tuned!