Since our mid-year recap on minors’ privacy legislation, several significant developments have emerged in the latter half of 2025. We summarize the notable developments below.

Age Signaling Requirements

In October, California enacted the Digital Age Assurance Act (AB 1043), which, effective January 1, 2027, requires operating system providers—defined as entities that develop, license, or control operating system software—to offer an accessible interface at account setup that prompts the account holder to indicate the user’s birth date, age, or both. Application developers, in turn, are required to request an age signal about a particular user when the application is downloaded and launched.

This California law differs from other app store legislation passed earlier this year, such as in Utah, Texas, and Louisiana, because AB 1043 applies to operating system providers rather than solely to mobile application store providers. The Texas App Store Accountability Act is scheduled to take effect on January 1, 2026; in October, however, the Computer & Communications Industry Association (“CCIA”) filed a lawsuit challenging the Texas law as unconstitutional.

Mental Health Warning Labels

State legislatures are also targeting minors’ social media usage. Both California and Minnesota enacted laws that would require social media platforms to display warning labels to users. Minnesota’s law takes effect on July 1, 2026, and California’s law is effective January 1, 2027. The New York legislature also passed a bill that would require warning labels on certain social media platforms, which awaits the Governor’s signature. These laws generally prescribe requirements for the duration, design, and text of the warning labels. In November 2025, the U.S. District Court for the District of Colorado temporarily halted Colorado’s warning label law, which would have gone into effect on January 1, 2026, and required social media platforms to display warning messages to users under 18 about the negative impacts of social media.

Regulation of AI Chatbots

The second half of 2025 saw increased attention to minors’ use of AI chatbots. The Federal Trade Commission (“FTC”) launched an inquiry under its Section 6(b) authority into AI chatbots acting as companions, including what actions companies are taking to mitigate alleged harms and how they comply with the Children’s Online Privacy Protection Act Rule. State legislatures have also been active in enacting laws on the use of AI chatbots:

  • California recently enacted SB 243, which requires operators of companion chat platforms to provide a clear and conspicuous notice indicating that the chatbot is AI-generated. If operators know a user is a minor, they must (1) disclose to the minor user that they are interacting with AI, (2) provide a notification every three hours that reminds the minor user to take a break and that the chatbot is AI, and (3) institute “reasonable measures” to prevent the chatbot from sharing or encouraging the minor user to engage in sexually explicit content.
  • There are several bills pending in Congress that would require AI chatbots to implement age-verification measures. Under the CHAT Act, if an entity determines that the user is a minor, the minor account would need to be affiliated with a parental account, and the entity would need to obtain verifiable parental consent for the minor to use the service. Under the GUARD Act, a minor would be prohibited from accessing or using the AI companion. Under the SAFE BOTs Act, chatbots would need to disclose their non-human and non-professional status and provide crisis intervention resources to minor users.

State Rulemaking Activity

Beyond legislation, several states are advancing rulemaking to strengthen protections for minors’ data.

  • California recently amended its regulations under the California Consumer Privacy Act to expand the statutory definition of “sensitive personal information” to include the personal information of consumers under 16 years of age, provided that the business has actual knowledge of the consumer’s age.
  • In an effort to advance rulemaking under SB 976, known as the “Protecting Our Kids from Social Media Addiction Act,” on November 5 the California Attorney General held a hearing to solicit public comment on: (1) methods and standards for age assurance on social media and other online platforms used by minors; (2) ongoing obligations for operators of social media platforms performing age assurance; and (3) parental consent for use of social media and other online platforms by minors. As a next step, the California Attorney General will release draft language for the regulations under SB 976.
  • The New York Attorney General released proposed rules for the Stop Addictive Feeds Exploitation (“SAFE”) for Kids Act to restrict certain social media features. The proposed rules describe which companies would need to comply with the law, as well as the standards for determining users’ ages and obtaining parental consent for certain features. After the public comment period closes in December 2025, the Office of the Attorney General has one year to finalize the rules. The Act will go into effect 180 days after the final rules are released.
  • Colorado recently finalized amendments to the Colorado Privacy Act rules to address privacy protections for minors’ data by clarifying the knowledge standard for duties regarding minors and the system design features to consider with regard to minors’ use. The amendments also describe the factors for determining when a system design feature significantly increases a minor’s use of a service and is therefore subject to a consent requirement.

Federal Developments

There are several bills currently pending in Congress aimed at addressing minors’ online safety and privacy. On December 2, the House Subcommittee on Commerce, Manufacturing, and Trade held a hearing on a broad package of bills related to minors, such as House versions of the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”) and Kids Online Safety Act (“KOSA”), as well as the App Store Accountability Act.

Lindsey Tonsager

Lindsey Tonsager is a recognized leader in representing companies before federal and state regulators, and is renowned for advising on minor protection, AI, and state comprehensive privacy laws.

Lindsey chairs the firm’s global Data Privacy and Cybersecurity practice. She advises clients in their strategic and proactive engagement with the Federal Trade Commission, the U.S. Congress, the California Privacy Protection Agency, and State Attorneys General on proposed changes to data protection laws, and regularly represents clients in responding to investigations and enforcement actions involving their privacy and information security practices.

Lindsey’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of artificial intelligence; data processing for robotics, autonomous vehicles, and other connected devices; biometrics; online advertising; the collection of personal information from children, teens, and students online; e-mail marketing; disclosures of video viewing information; and new technologies.

Lindsey also assesses privacy and data security risks in complex corporate transactions where personal data is a critical asset or data processing risks are otherwise material. In light of a dynamic regulatory environment where new state, federal, and international data protection laws are always on the horizon and enforcement priorities are shifting, she focuses on designing risk-based global privacy programs for clients that can keep pace with evolving legal requirements and efficiently leverage the clients’ existing privacy policies and practices. She conducts data protection assessments to benchmark against legal requirements and industry trends and proposes practical risk mitigation measures.

Jenna Zhang

Jenna Zhang advises clients across industries on data privacy, cybersecurity, and emerging technologies. 

Jenna partners with clients to ensure their compliance with the rapidly evolving federal and state privacy and cybersecurity laws. She supports clients in designing new products and services, drafting privacy notices and terms of use, responding to cyber and data security incidents, and evaluating privacy and cybersecurity risks in corporate transactions. In particular, she advises clients on substantive requirements relating to children’s and student privacy, including COPPA, FERPA, age-appropriate design code laws, and social media laws.

As part of her practice, Jenna regularly represents clients in data privacy investigations and enforcement actions brought by the Federal Trade Commission and state attorneys general. She also supports clients in proactive engagement with regulators and policymakers to ensure their perspectives are heard.

Jenna also maintains an active pro bono practice with a focus on supporting families in adoptions, guardianships, and immigration matters.

Natalie Maas

Natalie is an associate in the firm’s San Francisco office, where she is a member of the Food, Drug, and Device, and Data Privacy and Cybersecurity Practice Groups. She advises pharmaceutical, biotechnology, medical device, and food companies on a broad range of regulatory and compliance issues.

Natalie also maintains an active pro bono practice, with a particular focus on health care and reproductive rights.

Bryan Ramirez

Bryan Ramirez is an associate in the firm’s San Francisco office and is a member of the Data Privacy and Cybersecurity Practice Group. He advises clients on a range of regulatory and compliance issues, including compliance with state privacy laws. Bryan also maintains an active pro bono practice.

Irene Kim

Irene Kim is an associate in the firm’s Washington, DC office, where she is a member of the Privacy and Cybersecurity and Advertising and Consumer Protection Investigations practice groups. She advises clients on a broad range of issues, including U.S. state and federal AI legislation, comprehensive state privacy laws, and regulatory compliance matters.