Minimum age in social media: Instagram from 13, WhatsApp from 16 and YouTube from 18?

klicksafe repeatedly receives inquiries about the minimum age. Many parents and educators now know that the popular apps are only permitted from the age of 13, 16 or 18. The different age limits are confusing, as is the fact that many children and young people still use these apps. We explain why providers like TikTok, Snapchat and Spotify set a minimum age. And why the minimum age does not indicate when children should start using these services.

We are all familiar with age ratings from our everyday lives. An FSK sticker is clearly visible on every DVD or Blu-ray. Alcohol and tobacco are not sold in stores to people under 18. And anyone who wants to access pornography on the Internet must confirm that they are over 18. The purpose of all these regulations is to protect children and adolescents from influences that could negatively affect their development. However, other reasons are decisive when digital platforms set a minimum age.

Why do Internet platforms specify a minimum age?

The minimum age is usually specified in the general terms and conditions of the services. The companies record it there in order to comply with their obligations in the area of protection of minors. This is because various laws prohibit the processing of children's personal data. These include the European General Data Protection Regulation (GDPR), for example. Or they stipulate that platforms must obtain parental consent when children use them. This is the case, for example, with the Children's Online Privacy Protection Act (COPPA) from the USA, where no personal data of children under 13 years of age may be processed without parental consent.

These laws protect children's right to online privacy. However, they have a major disadvantage for companies: implementing reliable age verification involves effort and expense. In addition, most services pursue a business model in which the processing of personal data plays an important role. The services are therefore in a dilemma: on the one hand, they have to comply with applicable law, otherwise they face legal consequences and fines. On the other hand, they want neither to operate age verification systems nor to forgo revenue. The pragmatic solution for companies is therefore to prescribe a minimum age.

The general terms and conditions specify, among other things, that a service may only be used from the age of 13, for example. From then on, the company pretends that there are no people under 13 on the platform. Anyone under 13 who uses the service anyway is in breach of the general terms and conditions.

Why is the minimum age different in some cases?

As a rule, the popular Internet platforms have a minimum age of 13 years, 16 years or 18 years. The different platforms use different laws as a basis.

From 13 years of age: Facebook, Instagram and TikTok, for example. These services are guided by the Children's Online Privacy Protection Act (COPPA) from the USA. This law stipulates, for example, that platforms must obtain the consent of a parent or guardian before they are allowed to process children's data. According to COPPA, a child is any person under the age of 13.

From 16 years of age: WhatsApp, for example. This age limit is based on the European General Data Protection Regulation (GDPR). The GDPR sets similar requirements to COPPA, for example that services must obtain the consent of a parent or guardian. However, the age limit set for this in the GDPR is 16.

From 18 years of age: For example, YouTube (without parental consent), Netflix and Spotify. The services listed here set 18 as the minimum age. However, younger people are allowed to use them with the permission of a parent or guardian. For example, you can create a YouTube account at 16, but you may not use it until your parents have given their consent. From the age of 18, parental consent is no longer required.
Exactly why these services have opted for a minimum age of 18 is not entirely clear. It may be because these platforms offer content that must not be made available to people under 18, for example a movie with an FSK 18 rating on Netflix.

Does the minimum age protect children and young people?

The minimum ages defined by law (13 and 16) were originally intended to protect children and young people. The laws were supposed to lead to platforms taking better care of their underage users, for example by protecting their privacy and not exploiting their trust. Unfortunately, the laws have not had the desired effect. Providers responded to the laws by excluding minors from using their services in the terms and conditions or by making parents responsible by referring to consent.

Does the minimum age provide good guidance for parents?

The minimum age in digital services is not an educational assessment. It results from legal requirements, especially in the area of data protection. Therefore, the minimum age specified in the general terms and conditions is usually not a meaningful orientation for guardians. If a service specifies 13 as the minimum age, this does not mean that the service has been tested and recommended for children over 13. Conversely, services with a minimum age of 16 are not per se unsuitable for younger people in terms of content. Parents should therefore inform themselves about the content and the possible risks of using the service before deciding for or against it.

How can I decide when a service is appropriate for my child?

Unfortunately, there is no simple and universal answer to this question. Rather, it depends on the child's stage of development and sense of responsibility. It makes a significant difference whether parents set up the services safely together with their children, discuss the risks and establish rules of conduct. Another important factor is whether children use their devices unsupervised or under the supervision of their legal guardians.

Our checklist "Is my child ready for their own smartphone?" helps parents assess whether their child has already developed the necessary skills for safe smartphone use.

If you allow children to use online platforms, keep the following points in mind:

  • Set up the account together with your child. Pay special attention to privacy and security settings, and show your child the options for reporting and blocking. Instructions for secure settings for many online services and devices are available from klicksafe.
  • Establish clear rules for use. These should cover content (e.g. do not reveal private details, do not insult others, do not share inappropriate content or pictures and videos of other people) as well as the duration of use (e.g. set a daily maximum and define media-free times in everyday life). A media usage agreement can help you negotiate these rules together.
  • Stay in contact with your child. Ask regularly about what your child is doing and experiencing on the Internet. This way you will know if there are problems and can help. You may also notice over time that your child no longer needs some protective settings, and you can adjust your arrangements accordingly.
  • Familiarize your child with support services. No matter how good the parent-child relationship is, it can happen that a child does not confide in their parents out of fear or shame. In such cases, children should know about anonymous, free and non-binding support services such as Nummer gegen Kummer and JUUUPORT.

More information from klicksafe