Creation of sexualized images

The right to sexual self-determination and the right to one's own image can be violated if intimate media content is created without the consent of the person concerned.  

Upskirting, downblousing & webcam poses - real recordings

If intimate or sex-related images are created, the content may stem from a real situation. One example is digital voyeurism, in which nude or sex-related images are captured using small hidden internet-enabled cameras, so-called "spy cams". These are mainly installed in toilets, showers, changing rooms or hotel rooms. The terms "upskirting" and "downblousing" describe the secret photographing or filming of intimate body regions such as the buttocks, genitals, underwear or the female breast. These images, too, often end up on the internet or on porn sites. The following applies: secretly filming or photographing intimate body parts is a criminal offense and can result in a fine or a prison sentence.

The secret creation of sexual images does not only take place offline. In recent years, a growing number of video collections have been distributed on the internet that show children posing in front of a webcam and performing sexual acts on themselves. Such images are created, for example, in the course of cybergrooming: the perpetrators persuade their victims to perform sexual acts in front of the webcam in order to secretly record them and then distribute the videos online.

Deepfakes and deepnudes - fake recordings

Sex-related content is not always real footage depicting a real situation. The internet is increasingly flooded with so-called "deepfakes" that show people in sexualized or pornographic situations with the help of generative AI tools. Girls and women are particularly affected. The phenomenon of faking intimate images is not new in itself. What has changed with deepfake technologies is the scale and accessibility: a wide range of content can now be created for free or for little money, in a very short time and without any prior technical knowledge. In addition, the generated recordings look increasingly authentic.

Deepfakes are manipulated media content, such as images, videos or audio (e.g. voices), which are generated with the help of deep learning, a method of machine learning, and can appear deceptively real. Depending on the "quality" of the deepfake, the impression can arise that the person depicted is actually naked or engaged in sexual activity. With deepfakes, the image of a person is inserted into content in which they do not actually appear.

AI-generated nude images are referred to as "deepnudes": "deep" because they are created using deep learning, a method of machine learning, and "nude" because the person depicted is naked. There are numerous websites and apps online that have been specially programmed for this purpose; they can be found and used with just a few clicks via search engines. AI-generated "deepfake pornography" refers to pornographic video material into which the face of a real person has been seamlessly edited with the help of AI.

Both phenomena are forms of image-based sexualized violence.

Deepnudes and deepfake porn

When creating deepnudes, programs are used that convert non-sexualized photos of people into realistic-looking nude images. The program captures the body of the person in question and creates a new, undressed image of them. In 2019, an app called "DeepNude" made these image-manipulation options widely available for the first time. The app was heavily criticized for objectifying women, and the provider took it off the market again after a short time.

There are now numerous apps that work on exactly the same principle and have attracted great interest, including among young people. In the fall of 2023, for example, a case became known in Spain in which young people at several schools created AI-generated nude images of underage girls. A case in Mexico made headlines at the same time: a student is accused of using AI to create thousands of intimate images of several female students in order to sell them online as pornographic material. One thing is certain: apps and websites that use AI to create fake nude images of women are growing strongly.

But it is not just virtual undressing that is becoming a worrying trend. Deepfake porn is also on the rise on the internet. With the help of so-called face swap apps, for example, the faces of people can be cut into real porn scenes with little effort. Those affected are generally unaware of the non-consensual pornographic fakes.

The decisive factor is that the use of AI generators has fundamentally changed the group of people affected: it now also includes those who have never consensually shared intimate images of themselves and have never been secretly recorded by others.

AI-based cases of image-based sexualized violence also violate the personal rights of those affected. However, criminal liability here has not yet been clearly and adequately regulated. According to Josephine Ballon, lawyer and managing director of HateAid, the creation of such AI-based content is not yet a criminal offense, but its distribution is. Depending on the age of the persons depicted, there is also criminal liability for the creation, possession or distribution of child or youth pornography.

The targets of AI-generated sexualized content are primarily women and girls. All that is needed are images of the person, such as those found on social networks. Often, just a few images are enough to create such fake content; the more image material or training data is available, however, the "higher quality" and more authentic the end result. Women who are in the public eye are therefore particularly frequently affected by image-based sexual abuse, as there are many images of them online.

Because the creation of deepnudes and fake porn is inexpensive and easy, it can be assumed that cases of image-based sexualized violence involving AI technologies will increase in the future. The quality of deepfakes is also likely to keep improving in the near future, meaning that the errors that are still sometimes visible will become increasingly subtle and difficult to detect.