On September 17, 2025, Brazil enacted the Digital Statute of the Child and Adolescent (“Digital ECA”), establishing a pioneering regulatory framework for protecting children (under 12 years of age) and adolescents (between the ages of 12 and 18) online. Brazil’s Congress approved the new law in a matter of days in response to pressure from parents, after a well-known Brazilian digital influencer published a series of online videos on the “adultization” of children on the internet.
The Digital ECA imposes obligations on technology providers to prioritize child safety and data protection proactively. This comprehensive legislation applies to any information technology product or service directed to, or likely to be accessed by, minors in Brazil, regardless of where the product is developed, manufactured, offered, or marketed. It complements existing Brazilian laws, including the Consumer Defense Code, the Statute of the Child and Adolescent, the General Data Protection Law (“LGPD”), and the Legal Framework for the Electronic Games Industry.
It is also the second major measure adopted in Brazil to regulate social media. Earlier, Brazil’s Supreme Court struck down the country’s internet governance safe harbor provision and established new content-moderation obligations and rules on platforms’ liability for third-party content.
Scope and Definitions
The Digital ECA applies to providers of any product or service provided remotely, electronically, and upon request, and that is directed to, or “likely to be accessed” by, children or adolescents. This includes apps, software, terminal equipment systems, app stores, online and electronic games, and social media. “Likely access” by children or adolescents is determined by criteria such as the probability of usage, ease of access, and potential risks to the privacy, safety, or biopsychosocial development of the user.
Key Principles and Obligations
The Digital ECA sets foundational principles for minors’ use of digital products and services, including: (i) full protection and prioritization of their best interests; (ii) recognition of their biopsychosocial development; (iii) protection from exploitation and violence; (iv) respect for their progressive autonomy; (v) prevention of commercial exploitation and promotion of digital education; and (vi) transparency and accountability in the processing of their personal data. The “best interest” of the child or adolescent entails the protection of their privacy, safety, mental and physical health, access to information, freedom to participate in society, meaningful access to digital technologies, and overall well-being.
Providers of digital products or services addressed to, or likely to be accessed by, minors must, among other things:
- implement safeguards at the product’s design stage and throughout its lifecycle, ensuring the highest level of privacy and data protection by default, considering the autonomy and progressive development of minors, in line with their best interests;
- clearly inform all users about the product’s or service’s age classification policy at the time of access;
- provide clear, accessible information to children, adolescents, and their guardians to enable informed choices regarding any less protective settings;
- adopt appropriate technical security measures, including industry-recognized ones, to enable families and legal guardians to prevent inappropriate access and use by minors;
- refrain from processing personal data of minors in ways that cause, facilitate, or contribute to violations of their privacy or any other legally protected rights, taking into account the principles of the Brazilian LGPD and the “best interest of the child and adolescent”;
- conduct risk management of their features, functionalities, and systems, and assess their impact on the safety and health of minors;
- evaluate content to ensure compatibility with age classifications;
- offer systems and processes designed to prevent minors from encountering, through the product or service, illegal or pornographic content, as well as other content clearly inappropriate for their age group, according to age classifications and applicable law;
- develop and enable default settings that prevent compulsive use of products or services by minors;
- adopt proportionate, auditable, and secure age verification mechanisms, with self-declaration expressly prohibited as a means of age verification;
- provide parents, legal guardians, and minors with clear information about any risks and the security measures associated with a product or service, with the information provided upfront and regardless of whether users decide to purchase the product or service;
- provide users with reporting mechanisms regarding violations of the rights of children and adolescents;
- promptly remove content violating children’s rights, including sexual abuse, exploitation, kidnapping, and grooming, detected directly or indirectly in their products or services, and report such content to the competent authority; and
- retain backend data relating to content violating children’s rights, including the data of the user responsible for the content and related metadata, for at least six months.
The Digital ECA specifically prohibits:
- offering loot boxes (randomized reward mechanisms) in games addressed to or likely to be accessed by minors;
- monetizing or promoting content that eroticizes or displays images sexualizing children or adolescents; and
- the use of profiling techniques for targeting commercial advertising to minors, as well as the use of emotional analysis, augmented reality, extended reality, and virtual reality for this purpose.
The Digital ECA imposes specific obligations on internet application stores and operating systems, such as obtaining parental consent for app downloads by minors. Social networks also have additional duties, such as enabling the accounts of users under 16 to be linked to those of their legal guardians, and providing prompt appeal mechanisms in cases of account suspension for suspected age violations.
In addition, providers of platforms with over one million minor users in Brazil are required to publish semiannual reports on complaints, content moderation actions, data protection compliance, and risk management, as well as to facilitate access for academic and scientific research, subject to strict confidentiality and non-commercial use conditions.
Regulatory Oversight and Guidance
The Brazilian National Data Protection Agency (“ANPD”) is the designated regulatory body responsible for overseeing compliance with the Digital ECA. The Brazilian President signed a provisional measure converting the ANPD into a full-fledged regulatory agency with binding rulemaking and oversight authority. The ANPD is empowered to issue recommendations and guidance on best practices for Digital ECA compliance, and its regulations have binding legal force, allowing the agency to supervise, audit, and penalize companies for non-compliance with the Digital ECA. In addition, the ANPD is authorized to issue and enforce blocking orders against content or services that violate the Digital ECA, working jointly with ANATEL (the National Telecommunications Agency) for telecom services and CGI.br (the Brazilian Internet Steering Committee) for internet domain names.
The Digital ECA sets penalties for non-compliance, subject to due process, including: (i) fines of up to 10% of the offender’s economic group revenue in Brazil from the prior fiscal year, or, if revenue data is unavailable, a fine ranging from R$10 to R$1,000 per registered user, capped at R$50 million (approximately USD 9.4 million) per violation; (ii) temporary suspension of activities; and (iii) prohibition from operating.
Alignment with European Digital Child Protection Efforts
The Digital ECA aligns with laws adopted in other parts of the world, such as the European Union’s Digital Services Act (“DSA”) and the UK’s Online Safety Act, all aimed at protecting minors online. The Digital ECA specifically safeguards minors by regulating all digital products and services accessible to them. It introduces detailed, child-focused obligations, including default privacy settings, robust age verification, parental supervision tools, and restrictions on harmful content and profiling.
In contrast, the EU’s DSA regulates providers of “intermediary services,” including online platforms, and imposes a broad obligation on platforms accessible to minors to ensure a high level of privacy, safety, and security (see the recent European Commission guidelines on the protection of minors under the DSA). Meanwhile, the upcoming EU Digital Fairness Act, expected by late 2026, aims to strengthen protections by addressing addictive design features and limiting exploitative targeting practices, thereby promoting fairness and transparency in digital consumer interactions.
* * *
Covington regularly advises clients on compliance with emerging laws and developments worldwide related to the protection of minors online. We are happy to assist with any questions or challenges you may have in navigating these complex and evolving regulatory landscapes.