Deepfakes
When it comes to disinformation and fake news on the Internet, the term deepfakes comes up frequently. These are manipulated videos in which people appear to do or say things they have never done or said. Deepfakes are not only used for political influence, however. Parody videos, for example, are popular, and it is usually obvious that these are manipulated recordings, so not every deepfake risks being mistaken for the real thing. Far less amusing is the use of deepfake technology in pornography: here, people's faces are inserted into pornographic footage - a serious violation of personality rights and, depending on the case, even a criminal offense.
It is important to acquire the skills needed to distinguish true from false information. Deepfakes are well suited to media education work because they are a new technical form of manipulation that holds great fascination. The point is to highlight not only the seemingly limitless possibilities of manipulation but also the limits of what is technically feasible, and above all to show how deepfakes can be tracked down with one's own skills or with online resources such as fact-checking sites.
What are deepfakes?
Deepfakes can be thought of as a digital mask. An algorithm superimposes a new face onto a video recording of a person. For this face to look reasonably realistic, the software needs a large number of images of the "new" face; the more images it has available, the better the result. That is why the most convincing deepfakes so far have always involved the faces of well-known people, of whom plenty of images are freely available.
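To make the "digital mask" idea more concrete, the following sketch outlines one widely used architecture behind classic face-swap deepfakes: a single encoder is trained on face crops of both persons, while each person gets their own decoder; swapping the decoders at playback time places person B's face onto person A's frames. This is only a minimal illustration in PyTorch, not a working tool; the layer sizes, the 64x64 resolution, and the training details are assumptions.

```python
# Minimal sketch (PyTorch) of the shared-encoder / per-person-decoder
# autoencoder behind classic face-swap deepfakes. Layer sizes, the 64x64
# input resolution and the training outline are illustrative assumptions.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a latent vector (shared by both persons)."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the latent vector; one decoder is trained per person."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 128, 8, 8)
        return self.net(x)

encoder = Encoder()
decoder_a = Decoder()   # trained only on face crops of person A
decoder_b = Decoder()   # trained only on face crops of person B

# Training (sketch): reconstruct each person's faces through the shared encoder
# and that person's own decoder, e.g. loss = MSE(decoder_a(encoder(faces_a)), faces_a).
# The "swap": encode a frame of person A, but decode it with person B's decoder,
# so person B's face appears with person A's pose and expression.
frame_of_a = torch.rand(1, 3, 64, 64)          # stand-in for a real video frame crop
swapped_face = decoder_b(encoder(frame_of_a))  # shape: (1, 3, 64, 64)
```

The per-person decoder also explains why so many images of the target face are needed: the decoder has to learn that face from many angles, lighting conditions, and expressions before the result looks convincing.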
Besides deepfakes that imitate a person's appearance, there are also deepfakes that imitate a person's voice.
How can I recognize deepfakes?
In most cases so far, deepfakes can be spotted with the naked eye. As impressive as the underlying technology is, it is not yet perfect. Typical telltale signs are unnatural facial expressions, a vacant gaze, or incorrectly cast shadows on the face. It helps to watch the video in question in full-screen mode to catch these small errors.
Of course, it cannot be ruled out that the technology will improve to the point where deepfakes are no longer recognizable as such to the human eye. There is disagreement about whether computer programs can help debunk manipulated videos in that case; in any event, companies are working on such methods. Independently of these technical solutions, however, there are also quite conventional ways of checking whether a video is authentic.
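As a rough illustration of what such automated detection tools do internally, the sketch below trains a small binary image classifier that scores individual video frames as "real" or "fake". It is a minimal, hypothetical example in PyTorch; real detection systems are far more sophisticated, and the tiny network, the 64x64 input size, and the stand-in training data are assumptions.

```python
# Minimal sketch (PyTorch) of frame-level deepfake detection as binary
# classification. The tiny network, 64x64 inputs and random stand-in data
# are illustrative assumptions, not a production detector.
import torch
import torch.nn as nn

class FrameClassifier(nn.Module):
    """Tiny CNN that scores a 64x64 RGB frame crop: higher logit = more likely fake."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)  # single logit: real (low) vs. fake (high)

    def forward(self, x):
        return self.head(self.features(x).flatten(1)).squeeze(1)

model = FrameClassifier()
criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in training batch: in practice these would be labelled face crops
# taken from known-real and known-fake videos.
frames = torch.rand(8, 3, 64, 64)
labels = torch.tensor([0., 0., 0., 0., 1., 1., 1., 1.])  # 0 = real, 1 = fake

optimizer.zero_grad()
loss = criterion(model(frames), labels)
loss.backward()
optimizer.step()

# Scoring a new frame: averaging scores over many frames of a video gives a
# rough verdict; a single high score is not proof of manipulation.
with torch.no_grad():
    fake_probability = torch.sigmoid(model(torch.rand(1, 3, 64, 64)))
```

Whether classifiers of this kind can keep pace with improvements in deepfake generation is exactly the open question mentioned above, which is why the conventional checks that follow remain important.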
How can I verify the authenticity of videos?
The strategies that have protected us from misinformation so far also work against deepfakes. Here, too, it is advisable to look first at the context of the video:
- Is the video distributed by a reputable information source or by a rather dubious website?
- When and where did the video first appear?
- Do the statements and behavior of the person shown contradict what he or she usually says and does?
The German Federal Office for Information Security (BSI) provides extensive background information on the technical aspects of deepfakes and describes possible telltale signs.
Are deepfakes dangerous?
Probably the biggest danger of deepfakes is not that people might mistake fake footage for real, but the exact opposite - that they might mistake real footage for fake. Deepfakes often give the impression that we can no longer be sure whether we are seeing real or fake news. This can quickly lead to the feeling that we can no longer trust any source. This is a dangerous development for democracy, because it is important for political decision-making that citizens trust their sources of information.
This dynamic creates further opportunities for political propaganda. If many people believe that videos can be faked undetectably, it becomes easy to dismiss any inconvenient recording as a deepfake. In the U.S., for example, supporters of conspiracy theories have claimed that the video of Donald Trump acknowledging the election victory of his opponent Joe Biden was merely a deepfake.