What does the new Digital Services Act bring for children?

The Digital Services Act (DSA) came into force in November 2022. For users, however, the effects will only be felt gradually. Recently, the very large platforms with more than 45 million monthly active users in the EU have had to comply with the new rules of the DSA. These include services such as Instagram, Google and Amazon. We explain what impact the new law will have on how children and young people in Europe use the Internet.

The DSA aims to ensure that digital services protect the rights of users. Digital services must provide a safe space for everyone and prevent the dissemination of prohibited or inappropriate content. This applies especially to very large online platforms such as Instagram, Snapchat, TikTok and YouTube.

Many of the new rules in the DSA affect all users, not just children and young people. For example, dark patterns are now prohibited. Dark patterns are design elements on websites that steer users into behaving contrary to their own interests, for example by using artificial scarcity to pressure them into buying something. The fight against hate and illegal content on the Internet is also part of the DSA: uniform complaints procedures throughout Europe are intended to get harmful content removed more quickly and effectively.

In addition to rules that apply to all users regardless of age, there are some parts of the DSA that specifically affect children and young people.

How are minors protected online by the DSA?

Suitable content

Children and young people should always feel safe online and be protected from inappropriate content and contacts. Inappropriate content includes, for example, material that could trigger anger, rage, sadness, worry or fear.

The DSA is designed to ensure that online content is appropriate for the age and interests of children and young people. It is important that online platforms quickly identify what content poses a risk and act on it. This includes, for example, content that violates people's rights or dignity, or that restricts privacy and freedom of expression.

Through the DSA, online platforms should ...

  • ensure data protection and the safety of young users.
  • pay attention to the impact of their services on society, for example on fair elections and public safety, on the mental and physical well-being of users, and on all forms of gender-based violence.


Data protection

We all have the right to protect our personal information, and this also applies online. Companies must therefore not push anyone to share more personal information than necessary with them or with other users. They also need to protect users' data: it must not be falsified or passed on, and no one may be spied on.

The DSA specifies:

  • If an online platform does not know whether someone is a child or not, it should not ask for more personal information to find out.
  • Online platforms used by children should have special basic settings for privacy and security.
  • Online platforms should share best practices for protecting young users.


Targeted advertising

Targeted advertising uses information about users. We leave this information behind online, for example by visiting websites. In this way, users are shown advertising for products that match their interests. Platforms use algorithms and artificial intelligence to do this.

The DSA says:

  • In the case of underage users, platforms must not use personal data to show targeted advertising.
  • Online platforms must make information about their advertising public. This allows experts to check whether there are risks such as false information or prohibited advertising.
  • Online platforms must explain who their advertising is intended to appeal to and how the advertising is presented. Especially if the advertising is shown to children and young people.

Child-friendly terms of use

The terms of use of websites and platforms must be written so that children and young people can easily understand them. Complicated points need additional explanation.

The DSA specifies:

  • The platforms should try to explain everything as simply as possible. That way, young users understand what terms they are agreeing to.
  • If a website or platform changes its rules, it must communicate this clearly and in a way that everyone can understand.

Dangers online

Users, especially children and young people, should always feel protected online from dangers and risks. For example, from harassment, viruses, misinformation, and people pretending to be someone else.

The Digital Services Act says that platforms must assess what risks their services pose to children and young people, and take measures to prevent those risks.

Examples of measures to prevent these risks:

  • Parental control: Settings on online platforms let guardians supervise their children. For example, they can set how long children may use a service, or block inappropriate content and risky features within the services.
  • Age verification: The age of users is verified by technical systems. This is usually done by means of proof of identity, such as an ID card. This is important because some online content, websites and services are not suitable for young people.
  • Age estimation: Online platforms can also use different ways to estimate the age of users. This can be used to customize the user's online experience.

These are the next steps of the DSA

The Digital Services Act entered into force on November 16, 2022. In April 2023, the European Commission announced which platforms fall into the category of "very large online platforms" or "very large online search engines": all platforms with more than 45 million monthly active users in the EU. These include, for example, Amazon, Apple, Facebook, Twitter, Wikipedia and Google Maps, among many others.

  • Since September 2023, these very large online platforms and search engines have had to comply with the rules of the Digital Services Act.
  • From February 2024, all other online services must comply with the rules as well, even those with fewer than 45 million users.
  • The European Union and its member states will monitor that online services comply with these rules. Those who do not comply with the rules can be fined.

The text is partly taken from the factsheet "The Digital Services Act (DSA) explained". The full text of the DSA is available in German here.