Deepfakes

In discussions of disinformation and fake news on the Internet, deepfakes come up more and more often. Deepfakes are recordings of people produced with the help of computer programs. The term usually refers to videos, but images and sound recordings can also be deepfakes.

The manipulated videos show people doing or saying things that never really happened. People publish deepfakes for different reasons: some want to spread misinformation, others make deepfakes that are meant to be funny, and a large share of deepfakes are pornographic.
Some deepfakes are quite obviously not real. In others, however, it is not easy to see that the recording has been altered.

It is important to learn the skills needed to distinguish true from false information. Deepfakes are ideally suited to media education work because they are a new technical form of manipulation that holds a great deal of fascination. It is important to emphasize not only the seemingly limitless possibilities of manipulation, but also the limits of what is possible, and above all to show how people can use their own skills, or fact-checking sites, to track down deepfakes.

What are deepfakes?

Deepfakes can be thought of as a digital mask. A computer program superimposes a new face onto a video recording of a person. To make this face look realistic, the computer program needs a lot of images of the "new" face. The more images the program has available, the better the result. That is why the most convincing deepfakes so far have almost always involved the faces of well-known people, because there are many pictures of them on the Internet.
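For readers curious about the technology, here is a minimal sketch, in Python with PyTorch, of the shared-encoder/two-decoder idea behind classic face-swap programs. The class name, image sizes and random stand-in images are illustrative assumptions, not a real tool; the point is only that the program has to learn from many pictures of both faces, which is why well-known people are the easiest targets.

    # Minimal sketch of the shared-encoder / two-decoder idea behind
    # classic face-swap deepfakes. All names, sizes and the random
    # stand-in data are assumptions for illustration only.
    import torch
    import torch.nn as nn

    class FaceSwapAutoencoder(nn.Module):
        def __init__(self):
            super().__init__()
            # One encoder learns a shared "face code" from both people.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
                nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
                nn.Flatten(),
                nn.Linear(64 * 16 * 16, 256),
            )
            # One decoder per person reconstructs that person's face.
            self.decoder_a = self._make_decoder()
            self.decoder_b = self._make_decoder()

        def _make_decoder(self):
            return nn.Sequential(
                nn.Linear(256, 64 * 16 * 16), nn.ReLU(),
                nn.Unflatten(1, (64, 16, 16)),
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
            )

        def forward(self, x, person):
            code = self.encoder(x)
            return self.decoder_a(code) if person == "a" else self.decoder_b(code)

    model = FaceSwapAutoencoder()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Stand-ins for many aligned 64x64 face crops of person A and person B.
    faces_a = torch.rand(8, 3, 64, 64)
    faces_b = torch.rand(8, 3, 64, 64)

    for step in range(3):  # real training runs for many thousands of steps
        loss = loss_fn(model(faces_a, "a"), faces_a) + loss_fn(model(faces_b, "b"), faces_b)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # The "swap": encode person A's face, then decode it with person B's
    # decoder, so B's face appears with A's pose and expression.
    swapped = model.decoder_b(model.encoder(faces_a[:1]))

After training, the swap itself is just a matter of encoding one person's face and decoding it with the other person's decoder, which is exactly the digital mask described above.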

The aim of deepfakes is always to use computer programs to insert a person's identity into footage in which they do not actually appear, for example when the faces of famous actors and actresses are inserted into pornographic videos, or when a real recording of a person is altered so that they appear to say something completely different from the original.

The following video from Monkeypaw Productions and BuzzFeed uses deepfake technology to warn about its dangers.

How can I recognize deepfakes?

So far, it has usually been possible to detect deepfakes with the naked eye. As impressive as the technology behind deepfakes is, it is still not perfect. Typical giveaways are unnatural facial expressions, an empty gaze, or shadows that fall incorrectly on the face. Watching the video in question in full-screen mode helps to spot these small errors.

However, we have to assume that the technology will continue to improve and that deepfakes will eventually no longer be recognizable as such. There is disagreement about whether computer programs will then be able to expose doctored videos, but companies are already working on such detection methods. Independent of these technical solutions, there are also quite conventional ways of checking whether a video is authentic.
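As a rough illustration of what such automated detection could look like, the sketch below trains a tiny classifier that labels individual video frames as real or manipulated. The network size, the random stand-in frames and the labels are all assumptions made for this example, not any company's actual method; real detection systems are far larger, and none of them is fully reliable.

    # Sketch of automated deepfake detection as a frame-level binary classifier.
    # The tiny network and the random stand-in frames are illustrative assumptions.
    import torch
    import torch.nn as nn

    classifier = nn.Sequential(
        nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
        nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, 1),  # one logit: how likely the frame is fake
    )

    optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    # Stand-ins for labelled training frames: 0 = real, 1 = manipulated.
    frames = torch.rand(16, 3, 64, 64)
    labels = torch.randint(0, 2, (16, 1)).float()

    for step in range(3):  # real training: many epochs over a large labelled dataset
        logits = classifier(frames)
        loss = loss_fn(logits, labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # Score a new video by averaging the per-frame fake probabilities.
    new_frames = torch.rand(10, 3, 64, 64)
    fake_probability = torch.sigmoid(classifier(new_frames)).mean().item()
    print(f"estimated probability of manipulation: {fake_probability:.2f}")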

How can I verify the authenticity of videos?

All the methods that help us check misinformation can also be applied to deepfakes. As always with fake news, you should first look at the context of the video:

  • Can the video be found on reputable news sites or only on rather dubious websites or social media platforms?
  • Have fact-checking portals already checked the video? For example Mimikama, CORRECTIV, dpa-Faktencheck or BAIT, a fact-checking channel for young people on TikTok.
  • Can I use a search engine to find out when and where the video first appeared? (One way to do this is to upload individual frames of the video to a reverse image search; see the sketch after this list.)
  • Do the statements and behavior of the person shown contradict what he or she usually says and does?
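For the reverse-search step mentioned above, it helps to save a few individual frames of the video as image files and upload them to an image search engine. The following sketch uses the OpenCV library; the file name suspicious_video.mp4 is a placeholder.

    # Sketch: save a handful of frames from a video so they can be uploaded
    # to a reverse image search. "suspicious_video.mp4" is a placeholder name.
    import cv2

    video = cv2.VideoCapture("suspicious_video.mp4")
    total_frames = int(video.get(cv2.CAP_PROP_FRAME_COUNT))

    # Grab a handful of frames spread across the video.
    for i, position in enumerate(range(0, total_frames, max(total_frames // 5, 1))):
        video.set(cv2.CAP_PROP_POS_FRAMES, position)
        success, frame = video.read()
        if success:
            cv2.imwrite(f"frame_{i:02d}.jpg", frame)

    video.release()

The saved frame files can then be uploaded to a reverse image search such as Google Images or TinEye to find out whether the footage, or an earlier version of it, has already appeared elsewhere.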

The German Federal Office for Information Security provides a lot of background information on the technical aspects of deepfakes and shows possible detection features.

Are deepfakes dangerous?

Probably the biggest danger of deepfakes is not that people might mistake fake footage for real, but the exact opposite: that they might mistake real footage for fake. Deepfakes often give the impression that we can no longer be sure whether we are seeing real or fake news. This can quickly lead to the feeling that we can no longer trust any source. This is a dangerous development for democracy, because it is important for political decision-making that citizens trust their sources of information.

This dynamic creates further opportunities for political propaganda. If many people believe that videos can be faked in ways that cannot be verified, it becomes easy to dismiss any inconvenient recording as a deepfake. In the U.S., for example, supporters of conspiracy theories claimed that the video of Donald Trump acknowledging the election victory of his opponent Joe Biden was merely a deepfake.