Do you operate an online platform, e-shop, or application and are unsure how to properly handle the personal data of children and teenagers? This article will provide you with clear answers. You will learn what the age limit is for consent to the processing of personal data in the Czech Republic, what methods of age verification are accepted by the ÚOOÚ (Office for Personal Data Protection), and what risks and multi-million euro fines you face if you make a mistake. Find out how to protect your business and gain peace of mind.
Author of the article: ARROWS (Mgr. Jáchym Petřík, office@arws.cz, +420 245 007 740)
In the digital economy of the 21st century, children and teenagers represent not only a significant but also a growing group of users. Their presence on online platforms, social networks, and e-shops is a reality that no modern company can ignore. Protecting the personal data of this vulnerable group is therefore not only a legal obligation but also a cornerstone of corporate ethics and reputation.
Many companies mistakenly believe that if their services are not primarily aimed at children, strict data protection rules do not apply to them. However, this is a dangerous misconception. The key criterion introduced by modern legislation such as the UK's Age Appropriate Design Code and the EU's Digital Services Act (DSA) is whether a service is "likely to be accessed" by minors.
This standard dramatically expands the range of companies that must actively address the issue. It applies not only to games and social networks, but also to news portals with discussion forums, e-shops with general merchandise, and even B2B tools that students can use for school projects. The scope of obligations is therefore much broader than it appears at first glance.
The General Data Protection Regulation (GDPR) explicitly states that children deserve special protection because they may be less aware of the risks, consequences, and their rights in relation to the processing of personal data. Their psychological vulnerability makes them more susceptible to manipulative design elements, known as "dark patterns," which can lead them to provide more data than necessary or choose less private settings. The long-term impacts of excessive data collection and online interactions on their development, including the risk of addiction or reduced empathy, are a matter of serious concern for psychologists and regulators alike.
The fundamental legal principle that arises from this situation is the lawfulness of processing. For many online services, particularly those involving marketing, personalization, or community features, the only valid legal basis for processing personal data is the consent of the data subject. However, this is where a fundamental complication arises for companies: a child's ability to give legally valid consent is limited by law. Illegally obtained consent means that all subsequent processing of the data is unlawful, with all the consequences that entails.
While astronomical fines attract the most media attention, reputational damage can be even more devastating for a company. Being publicly labeled as a company that neglects child protection can lead to a loss of customer trust, an outflow of investors, and problems in recruiting talented employees.
In today's world, where there is an increasing emphasis on corporate social responsibility, protecting the data of the most vulnerable users is becoming a key indicator of corporate culture and integrity. ARROWS lawyers help companies not only navigate complex legislation, but also understand these broader business and reputational risks and set up processes that protect both users and the company itself.
It is absolutely essential for every online service provider to know the exact age limit at which a minor can independently and legally validly consent to the processing of their personal data. It is precisely at this point that general European regulations clash with national legislation, creating one of the most common pitfalls for international and domestic companies alike. A thorough knowledge of local rules is the foundation of any compliance program.
At the heart of European regulations is Article 8 of the GDPR. It states that where the processing of personal data is based on consent in relation to the offer of information society services directly to a child, such processing is lawful if the child is at least 16 years old. This is the default rule applicable throughout the European Union. However, the Regulation also gives Member States the option to lower this age limit, but not below 13 years.
The Czech Republic has made use of this option and set its own specific age limit. The key provision here is Section 7 of Act No. 110/2019 Coll. on the processing of personal data, which states: "A child acquires the capacity to consent to the processing of personal data in connection with the offer of information society services directly to him or her upon reaching the age of fifteen."
The age limit of 15 is therefore binding for all companies operating on the Czech market. The application of the general 16-year age limit from the GDPR is incorrect in the Czech Republic and leads to the unlawful processing of data of 15-year-old users.
It is important to define in practical terms what is meant by "information society services." This term covers a wide range of online activities that are usually provided for a fee or contain commercial communications. These typically include social networks, content sharing platforms, online games, e-shops, streaming services, mobile applications with personalized content, or the sending of marketing newsletters. The age limit rule applies whenever consent is the legal basis for processing.
Conversely, if processing is necessary for the performance of a contract (e.g., a one-time purchase in an e-shop without subsequent marketing), consent is not required and these specific rules do not apply.
The legal consequence is clear: if a child is under 15, their separate consent is invalid. For the processing of their personal data to be lawful, consent must be given or authorized by a person who has parental responsibility (typically a parent or legal guardian).
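The rule above lends itself to a simple decision step at registration. The following is a minimal sketch in Python, not production code: the function names are illustrative, the 15-year threshold applies only to Czech users, and a real flow would still need a defensible way to verify the declared date of birth. Note that the age must be computed as full years completed, not a bare year subtraction, or users are misclassified around their birthday.

```python
from datetime import date
from typing import Optional

CZ_CONSENT_AGE = 15  # Section 7 of Act No. 110/2019 Coll.

def age_on(birth_date: date, today: date) -> int:
    """Full years completed, accounting for month and day,
    not just the difference between calendar years."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1  # birthday has not yet occurred this year
    return years

def consent_route(birth_date: date, today: Optional[date] = None) -> str:
    """Decide who must give consent for a Czech user.

    Returns 'self' when the user may consent alone, otherwise
    'parental', meaning consent must be given or authorized by
    a holder of parental responsibility.
    """
    today = today or date.today()
    if age_on(birth_date, today) >= CZ_CONSENT_AGE:
        return "self"
    return "parental"
```

For example, a user born on 1 July 2010 still requires parental consent on 30 June 2025 and may consent alone only from 1 July 2025.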
This age limit of 15 was not set arbitrarily. It reflects the Czech legislature's effort to align "digital maturity" with other significant legal milestones in an adolescent's life, such as the acquisition of partial legal capacity and the onset of criminal responsibility. It is therefore not an arbitrary number, but a considered choice reflecting the Czech legal and social context.
However, in addition to the age limit itself, it is also necessary to take into account the broader principle of Czech civil law, which is the "maturity" of the child. Even though a 15-year-old user can already give consent on their own, this consent must still be informed and free. This means that information about data processing must be provided in a language that is understandable to adolescents, and the consent process must not be manipulative.
This principle resonates with the European Data Protection Board's (EDPB) emphasis on transparency and the prohibition of deceptive practices.
ARROWS therefore advises clients not only on how to meet the formal age limit, but also on how to set up the entire consent process so that it is legally defensible in terms of its quality.
One of the biggest practical challenges that the GDPR poses for online platform operators is the obligation to make "reasonable efforts" to verify the age of users and, where applicable, the validity of parental consent. The law deliberately does not specify a single method, as it recognizes that what is "reasonable" varies depending on the nature of the service and the risks associated with data processing. The key to successful and defensible implementation is therefore to choose a risk-based approach.
There is no one-size-fits-all solution. The right age verification method depends on what data you process and what you do with it. This risk-based approach can be broken down into several levels:
For services that do not pose a significant risk to children's rights and freedoms—such as content websites without registration, blogs without interactive elements, or forums with minimal data collection—a simple, neutral age gate may be a sufficient measure. The user is asked to enter their date of birth upon entry.
However, it is important to note that this mechanism is very easy to circumvent, and the European Data Protection Board (EDPB) has expressed considerable skepticism about its use in any scenario other than low-risk ones.
For medium-risk services—regular e-shops, social networks with limited functions, or online communities—a simple declaration of age is not enough. More robust methods are needed to increase the likelihood that consent is genuinely given by an adult.
Such methods include sending a verification email or text message to a parent, charging a refundable micro-payment to a payment card (on the assumption that a child does not hold one), or using third-party identity services such as mojeID or bank identity.
If your platform processes sensitive data, performs extensive profiling for advertising purposes, allows free interaction between users, or provides content that is inappropriate for children (e.g., gambling, alcohol sales, dating sites), the age verification requirements are the highest. In this case, it is necessary to implement reliable and difficult-to-falsify methods. These may include verification using a valid official document (ID card, passport), where the user uploads a photo of the document, or the use of facial biometric analysis, which estimates the user's age based on a selfie.
Although these methods are the most accurate, they also entail additional legal complications, particularly in relation to the processing of special categories of personal data.
When choosing a method, companies face the so-called data minimization paradox. The most reliable verification methods (document scanning, biometrics) require the collection of the most sensitive personal data, which may be in direct conflict with the GDPR principle of collecting only data that is necessary for the purpose.
Choosing the right mechanism is therefore always about finding a balance between the robustness of verification and the protection of users' privacy.
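The tiered approach described above can be captured as a simple configuration. The sketch below is illustrative only: the tier boundaries and method names are assumptions drawn from this article, not an exhaustive or authoritative catalogue, and the actual choice must always be justified in your DPIA.

```python
from enum import Enum

class Risk(Enum):
    LOW = "low"        # content sites, blogs, minimal data collection
    MEDIUM = "medium"  # e-shops, communities, limited social features
    HIGH = "high"      # profiling, sensitive data, adult-only content

# Verification methods discussed above, grouped by the risk tier
# for which they are proportionate (ordered roughly by robustness).
VERIFICATION_METHODS = {
    Risk.LOW: ["neutral_age_gate"],            # date-of-birth prompt only
    Risk.MEDIUM: ["parent_email_confirmation",
                  "card_micropayment",
                  "bank_identity"],            # e.g. mojeID / bank iD
    Risk.HIGH: ["id_document_scan",
                "facial_age_estimation"],      # strongest, most intrusive
}

def acceptable_methods(risk: Risk) -> list:
    """Return verification options proportionate to the assessed risk.

    Data-minimization paradox: higher tiers demand more personal data,
    so pick the least intrusive method that remains defensible and
    record the reasoning in the DPIA.
    """
    return VERIFICATION_METHODS[risk]
```

The design point is that the mapping, not the individual method, is what a supervisory authority will examine: a low-robustness method paired with a high-risk service is indefensible regardless of how well it is implemented.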
In this context, it is also important to monitor technological developments. As a modern office, ARROWS also follows emerging trends such as zero-knowledge proofs (ZKPs) and the European digital identity project (eID). In the future, these systems will make it possible to confirm a fact (e.g., "the person is over 15 years old") without revealing the data itself (exact date of birth) or the identity of the person, thus elegantly solving the aforementioned paradox.
In the event of an inspection by the Office for Personal Data Protection (ÚOOÚ), the key question will not be whether you have a verification system in place, but whether you are able to justify why you chose that particular system. The documentation of your decision-making process, in particular the data protection impact assessment (DPIA), thus becomes essential evidence.
A company that has used only a simple age gate for a high-risk service will not be able to justify its choice. The legal work involved in risk analysis and documenting decisions is therefore just as important as technical implementation.
ARROWS not only provides legal opinions, but also actively works with your IT and product teams to design and implement solutions that are compliant with the principle of "compliance-by-design." We will help you conduct a data protection impact assessment (DPIA) and select a verification method that is appropriate for your business model, technically feasible, and, above all, legally defensible.
For companies whose ambitions extend beyond the borders of the Czech Republic, the issue of consent from minors adds another, significantly more complex dimension. The flexibility that the GDPR has given member states in Article 8 has led to a patchwork of legal rules across the European Union. What is legal in one country may be illegal in another. Operating a single pan-European platform without taking these national differences into account is a recipe for serious legal problems.
While the Czech Republic has set the age limit at 15, the situation is different in our neighboring countries and key trading partners. This fragmentation requires companies to have not only legal knowledge but also advanced technical skills.
The following table illustrates how significantly age limits can vary, demonstrating the complexity faced by every company with an international reach:
| Country | Age limit for consent | Note |
| --- | --- | --- |
| Czech Republic | 15 | National regulation under Act No. 110/2019 Coll. |
| Germany | 16 | Default limit under the GDPR |
| Austria | 14 | Lower limit |
| Poland | 16 | Default limit under the GDPR |
| Slovakia | 16 | Default limit under the GDPR |
| France | 15 | Same as the Czech Republic |
| Belgium | 13 | Lowest possible limit |
| Sweden | 13 | Lowest possible limit |
| Ireland | 16 | Default limit under the GDPR |
| Spain | 14 | Lower limit |
The business impact of this legal mosaic is significant. For any company with a pan-European user base, a uniform approach to obtaining consent is simply contrary to the law. Your platform must be technically capable of identifying the user's location and dynamically applying different rules based on that.
In practice, this requires the implementation of reliable geolocation technology and the development of flexible mechanisms for displaying different versions of registration forms and consent forms. This legal fragmentation is directly reflected in increased development and maintenance costs and the overall technical complexity of the system.
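In code, the per-country rules discussed above typically reduce to a lookup keyed by the geolocated country, falling back to the GDPR default of 16 for member states not explicitly listed. This is a minimal sketch using ISO 3166-1 alpha-2 country codes; the values mirror the table above and must be verified against current national law before being relied upon, as member states can amend their limits.

```python
# Digital-consent ages under Article 8 GDPR for the countries
# discussed above; any EU state not listed gets the GDPR default.
GDPR_DEFAULT_CONSENT_AGE = 16

CONSENT_AGE_BY_COUNTRY = {
    "CZ": 15, "DE": 16, "AT": 14, "PL": 16, "SK": 16,
    "FR": 15, "BE": 13, "SE": 13, "IE": 16, "ES": 14,
}

def consent_age(country_code: str) -> int:
    """Age at which a child may consent alone in the given EU state."""
    return CONSENT_AGE_BY_COUNTRY.get(country_code.upper(),
                                      GDPR_DEFAULT_CONSENT_AGE)

def needs_parental_consent(age: int, country_code: str) -> bool:
    """Drive the registration flow from the geolocated country
    and the user's (verified) age."""
    return age < consent_age(country_code)
```

For example, a 15-year-old may register alone in the Czech Republic or France but needs parental consent in Germany or Slovakia, which is exactly why a single pan-European consent flow cannot be lawful.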
This complexity is precisely why we have spent ten years building the ARROWS International network. Our clients do not have to search for and coordinate local experts in each country. Thanks to our network, we provide consistent and effective legal advice across jurisdictions, whether you are launching a new service in Germany, France, or Sweden. We deal with issues with an international element on a daily basis.
The situation is further complicated if you use the services of providers based outside the EU, for example in the United States (which is common for cloud storage, analytics tools, or marketing platforms). In such cases, the issue of different age limits is compounded by the high-risk issue of transferring personal data to third countries.
European supervisory authorities are extremely vigilant and strict when it comes to data relating to minors. Examples include cases in Germany and Denmark, where data protection authorities criticized the use of Google and Microsoft tools in schools precisely because of the transfer of student data to the US, where the same level of protection as in the EU is not guaranteed.
Abstract legal provisions and theoretical risks often only take on real shape when viewed in the context of specific cases. Nothing illustrates the potential consequences of mistakes in the area of children's data protection better than the TikTok case, which has become a cautionary tale for the entire digital industry.
The €345 million (approximately CZK 8.5 billion) fine imposed on TikTok in 2023 by the Irish Data Protection Commission (DPC) was not just a punishment for a single mistake, but for a whole series of systemic failures.
An analysis of this decision provides insight into what supervisory authorities focus on and which practices they consider the most serious violations of the GDPR. The DPC objected in particular to child accounts being set to public by default, to the "Family Pairing" feature allowing unverified adults to link to children's accounts and enable direct messaging, and to dark patterns nudging users toward privacy-intrusive choices during registration and when posting videos. For your company, this is essentially a manual of what to avoid.
These violations led to a huge fine. According to Article 83 of the GDPR, violations of the rules on children's consent, processing principles, and data subjects' rights fall into a category for which a fine of up to EUR 20 million or 4% of the company's global annual turnover can be imposed, whichever is higher.
Although the Czech adaptation law sets lower national limits for certain offenses, the Office for Personal Data Protection (ÚOOÚ) has full authority to impose significant penalties on the Czech market.
To help you better understand the specific threats and our solutions, we have prepared the following overview table of risks:
| Risk to be addressed (potential problems and penalties) | How ARROWS helps |
| --- | --- |
| Processing of data of children under 15 without valid parental consent: fines from the ÚOOÚ of up to EUR 10 million (or 2% of turnover), reputational damage, litigation for damages from the persons concerned, orders to delete data | Audit and setup of processes for obtaining and managing consent; preparation of contractual terms and privacy policies that are understandable even to minors |
| Insufficient verification of user age ("reasonable efforts"): the legality of the entire processing called into question, remedial orders, high fines for systemic errors, especially for high-risk services | Consultation and recommendations on appropriate technical and organizational measures (age-gating); legal assessment of the riskiness of your business model |
| Inappropriate platform design (public profiles, "dark patterns"): violation of the "privacy by design & default" principles, fines for non-transparent and manipulative practices (see the TikTok case), negative media attention | Legal review of the user interface (UI/UX); preparation of internal guidelines for developers and product managers; team training |
| Data security breaches and leaks of information about minors: obligation to report incidents to the ÚOOÚ within 72 hours, high fines, loss of trust of clients and partners, class action lawsuits | Representation in proceedings before the ÚOOÚ; preparation of crisis communication; review and implementation of security policies and contracts with processors |
As the previous chapters have shown, ensuring compliance with the law when processing the personal data of minors is a complex and dynamic discipline. It is not a one-off task that can be ticked off and forgotten. It is a continuous process that requires in-depth knowledge of the law, understanding of technology, and strategic risk management. There is too much at stake to underestimate this area.
Personal data protection is not just one of the many areas we focus on. It is a key specialization of our team and the backbone of our practice. Our extensive experience speaks for itself: we have successfully set up and managed GDPR compliance for more than 150 joint-stock companies, 250 limited liability companies, and 51 municipalities and regions. These numbers represent hundreds of hours of analysis, consultation, and implementation of solutions in real business environments, from small startups to large corporations.
We understand that every company is unique. That's why we don't offer one-size-fits-all templates, but tailor-made solutions that reflect your specific business model, technical capabilities, and risk profile. Our experts can help you at every stage of your journey to full compliance.
Underestimating the rules for protecting the personal data of minors is not a strategy, but a risk that a successful company cannot afford to take. Investing in a robust compliance program is an investment in the trust of your customers, the stability of your business, and peace of mind for your future.
Contact us today to arrange a no-obligation consultation. We will assess your platform and propose specific steps to ensure full compliance and security for your business.