Minimum age in social media: Instagram from 13, WhatsApp from 16 and YouTube from 18?

klicksafe repeatedly receives inquiries about the minimum age. Many parents and educators now know that the popular apps are only permitted from the age of 13, 16 or 18. The different age limits are confusing, as is the fact that many children and young people still use these apps. We explain why providers like TikTok, Snapchat and Spotify set a minimum age. And why the minimum age does not indicate when children should start using these services.

Please note: In February 2024, WhatsApp announced that it was lowering the minimum age in the EU from 16 to 13. This article was published on March 29, 2023, and its title refers to WhatsApp's minimum age at the time of publication.

We are familiar with age ratings from everyday life. An FSK sticker is clearly visible on every DVD or Blu-ray. Alcohol and tobacco are not sold in stores to people under the age of 18. And anyone who wants to access pornography on the Internet has to confirm that they are over 18. The purpose of all these regulations is to protect young people from influences that could have a negative impact on their development. When digital platforms set a minimum age, however, other reasons are decisive.

Why do Internet platforms specify a minimum age?

The minimum age is usually specified in a service's general terms and conditions. Companies record it there to comply with their obligations regarding the protection of minors. Various laws restrict the processing of children's personal data; these include the European General Data Protection Regulation (GDPR). Others stipulate that platforms must obtain parental consent when children use them. The US Children's Online Privacy Protection Act (COPPA), for example, prohibits processing the personal data of children under 13 without parental consent.

These laws protect children's right to online privacy. For companies, however, they have a major drawback: implementing reliable age verification takes effort and money. In addition, most services pursue a business model in which processing personal data plays an important role. The services are thus in a dilemma. On the one hand, they must comply with applicable law or face legal consequences and fines. On the other hand, they do not want to operate age verification systems or forgo revenue. The pragmatic solution for companies is therefore to prescribe a minimum age.

The terms and conditions then state, for example, that the service may only be used from the age of 13. From that point on, the company can act as if there were no under-13s on the platform: anyone under 13 who uses the service anyway is violating the terms and conditions.

Why is the minimum age different in some cases?

As a rule, popular Internet platforms set a minimum age of 13, 16 or 18. The different limits are based on different laws.

From the age of 13: Facebook, Instagram and TikTok, for example. These services follow the US Children's Online Privacy Protection Act (COPPA). This law stipulates that platforms must obtain the consent of parents or legal guardians before they may process children's data. COPPA defines a child as any person under the age of 13.

Please note: The following section on WhatsApp reflects the status as of March 29, 2023 (the article's publication date). In February 2024, WhatsApp adjusted the minimum age, and it is now 13 years.
From the age of 16:
WhatsApp, for example. This age limit is based on the European General Data Protection Regulation (GDPR). The GDPR has requirements similar to COPPA's, for example that services must obtain the consent of parents or legal guardians. However, the age limit the GDPR sets for this is 16 years.

From the age of 18: YouTube (without parental consent), Netflix and Spotify, for example. These services set 18 as the minimum age, but younger people may use them with the permission of a parent or guardian. You can create a YouTube account at the age of 16, for example, but you may only use it if your parents have given their consent. From the age of 18, parental consent is no longer required.
It is not entirely clear why these services in particular chose a minimum age of 18. It may be because they offer content that may not be made available to people under 18, such as a movie with an FSK 18 rating on Netflix.

Does the minimum age protect children and young people?

The minimum ages defined by law (13 and 16) were originally intended to protect children and young people. The laws were supposed to make platforms take better care of their underage users, for example by protecting their privacy and not exploiting their trust. Unfortunately, the laws have not had the desired effect. Providers responded by excluding minors from their services in the terms and conditions, or by shifting responsibility to parents through consent requirements.

Does the minimum age provide good guidance for parents?

The minimum age set by digital services is not an educational assessment. It results from legal requirements, especially in the area of data protection. The minimum age specified in the terms and conditions is therefore usually not useful guidance for parents and guardians. If a service specifies 13 as the minimum age, this does not mean that the service has been tested and recommended for children over 13. Conversely, services with a minimum age of 16 are not per se unsuitable in terms of content for younger people. Parents should therefore inform themselves about a service's content and the possible risks of using it before deciding for or against it.

How can I decide when a service is appropriate for my child?

Unfortunately, there is no simple, universal answer to this question. It depends on the child's stage of development and sense of responsibility. It also matters greatly whether parents set up the services safely together with their children, discuss the risks and establish rules of conduct. Another important factor is whether children use their devices unsupervised or under the supervision of their legal guardians.

Our checklist "Is my child fit to have their own smartphone?" helps parents assess whether their child has already developed the skills needed for safe smartphone use.

If you allow children to use online platforms, keep the following points in mind:

  • Set up the account together with your child. Pay particular attention to privacy and security settings, and show your child the options for reporting and blocking. Instructions for secure settings for many web offerings and devices can be found at and
  • Establish clear rules for use. These should cover content (e.g. do not reveal private details, do not insult others, do not share inappropriate content or pictures and videos of other people) as well as the duration of use (e.g. set a daily maximum and define media-free times in everyday life). can help you negotiate these rules together.
  • Stay in contact with your child. Ask regularly about what your child is doing and experiencing on the Internet. That way you will notice problems and can help. You may also notice over time that your child no longer needs some protective settings, and adjust your arrangements accordingly.
  • Familiarize your child with sources of help. However good the parent-child relationship is, a child may not confide in their parents out of fear or shame. Children should therefore know about anonymous, free and non-binding support services. These include Nummer gegen Kummer, JUUUPORT and

More information from klicksafe