On September 20, 2024, California Governor Newsom signed into law SB 976, the Protecting Our Kids from Social Media Addiction Act (the “Act”). The Act defines an “addictive internet-based service or platform” and prohibits such a platform from providing an “addictive feed” to a minor unless the platform has previously obtained verifiable parental consent. The Act will take effect on January 1, 2025, and the California Attorney General will promulgate regulations on age assurance and parental consent by January 1, 2027. This post summarizes the law’s key provisions. The law includes several technical definitions and exceptions, which are explained at the end of this post.
State and Federal Developments in Minors’ Privacy in 2024
This year has brought significant movement and trends in minors’ privacy legislation on both the state and federal levels. We recap the notable developments below.
Comprehensive Consumer Privacy Legislation
Individual states have continued to enact their own comprehensive consumer privacy legislation this year. All of the state comprehensive consumer privacy laws passed this year incorporate the Children’s Online Privacy Protection Act (“COPPA”) through parental consent and sensitive data processing requirements. Notably, New Hampshire, New Jersey, and Maryland impose additional restrictions on the processing of minors’ personal data for targeted advertising, sales, and profiling. New Hampshire’s legislation prohibits processing of personal data for sales or targeted advertising “where the controller has actual knowledge or willfully disregards that the consumer is at least 13 and under 16.” Similarly, New Jersey’s comprehensive privacy legislation prohibits processing of personal data for sales, targeted ads, or profiling “where the controller has actual knowledge or willfully disregards that the consumer is at least 13 and under 17.” Maryland’s law contains an outright prohibition on the sale of minors’ personal data, or its processing for targeted advertising, “if the controller knew or should have known that the consumer is under 18.”
AADC and COPPA-Style Laws
States have continued to introduce Age Appropriate Design Codes (“AADC”), adding to the sweeping trend that emerged last year. Maryland’s new AADC law is similar to California’s AADC law but departs notably in that it does not require covered entities to implement age-gating and it modifies the scope of covered entities to services that are “reasonably likely to be accessed by children.” The data protection impact assessment (“DPIA”) requirement in Maryland’s law focuses on the “data management or processing practices” of the online product and specifies the harms that should be evaluated.
KOSA, COPPA 2.0 Likely to Pass U.S. Senate
U.S. Senate Majority Leader Chuck Schumer (D-NY) yesterday, July 23, initiated procedural steps that will likely lead to swift Senate passage of the Kids Online Safety Act (“KOSA”) and the Children and Teens’ Online Privacy Protection Act (“COPPA 2.0”). Both bills, which we have previously covered, have been under consideration in the Senate and the House of Representatives for some time. Schumer’s action will likely bring the two bills to the Senate floor in a single package as soon as Thursday, July 25. The future of the legislation in the House, however, is less certain.
KOSA, led by Sens. Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.), would, in its current form (S.1409), require specified “covered platforms” to implement new safeguards, tools, and transparency for minors under 17 online. These covered platforms:
- Would have a duty of care to prevent and mitigate enumerated harms.
- Must have default safeguards for known minors, including tools that: limit the ability of others to communicate with minors; limit features that increase, sustain, or extend use of the platform by the minor; and control personalization systems.
- Must provide “readily-accessible and easy-to-use settings for parents” to help manage a minor’s use of a platform.
- Must provide specified notices and obtain verifiable parental consent for children under 13 to register for the service.
KOSA also requires government agencies to conduct research on minors’ use of online services, directs the Federal Trade Commission (“FTC”) to issue guidance for covered platforms on specific topics, and provides for the establishment of a Kids Online Safety Council. The FTC and state attorneys general would have authority to enforce the law, which would take effect 18 months after it is enacted.
In a press conference yesterday, Blumenthal and Blackburn touted 70 bipartisan Senate cosponsors and called for quick Senate passage of the bill without further amendment.
FTC Reaches Settlement with NGL Labs Over Children’s Privacy & AI
On July 9, 2024, the FTC and California Attorney General settled a case against NGL Labs (“NGL”) and two of its co-founders. NGL Labs’ app, “NGL: ask me anything,” allows users to receive anonymous messages from their friends and social media followers. The complaint alleged violations of the FTC Act…
Louisiana Bans Targeted Advertising to Minors on Social Media Platforms
On June 18, 2024, Louisiana enacted HB 577, prohibiting “social media platforms” with more than 1 million users globally from displaying targeted advertising to Louisiana users that the platform has actual knowledge are under 18 years of age and from selling the sensitive personal data of such users. The…
California Legislature Advances Several AI-Related Bills
With three months left until the end of this year’s legislative session, the California Legislature has been considering a flurry of bills regarding artificial intelligence (AI). Notable bills, described further below, impose requirements on developers and deployers of generative AI systems. The bills contain varying definitions of AI and generative AI systems. Each of these bills has been passed by one legislative chamber, but remains under consideration in the other chamber.
Legislation Regulating AI Developers
Two bills would require generative AI systems to make AI-generated content easily identifiable.
- SB 942 would require generative AI systems that average 1 million monthly visitors or users to provide an “AI detection tool” that would verify whether content was generated by the system. It would also require AI-generated content to carry a visible and difficult-to-remove disclosure that the content was generated by AI. A noncompliant system would incur a daily $5,000 fine, although only the Attorney General could bring an enforcement action.
- AB 3211 would require, starting February 1, 2025, that every generative AI system, as defined under the law, place watermarks in AI-generated content. Providers of generative AI systems would need to develop associated decoders that verify whether content was generated by the system. A system available before February 1, 2025 could remain available only if the provider of the system created a decoder with 99% accuracy or published research showing that the system is incapable of producing inauthentic content. A system used in a conversational setting (e.g., chatbots) would need to clearly disclose that it generates synthetic content. Additionally, vulnerabilities in the system would need to be reported to the Department of Technology. The Department of Technology would have administrative enforcement authority to impose penalties of up to the greater of $1 million or 5% of the violator’s annual global revenue.
Two additional bills would limit or require disclosure of information about data sources used to train AI models.
- AB 2013 would require, beginning January 1, 2026, that the developer of any AI model post on its website information regarding the data used to train the model. This would include: the source or owner of the data; the number of samples in the data; whether the data is protected by copyright, trademark, or patent; and whether the data contains personal information or aggregate consumer information, as defined in the California Consumer Privacy Act (CCPA). AI models developed solely to ensure security and integrity would be exempt from this requirement.
- AB 2877 would prohibit using personal information, as defined in the CCPA, of individuals under 16 years old to train an AI model without affirmative consent. The individual’s parent would need to give affirmative consent for individuals under 13 years old. Even with consent, the developer would need to deidentify and aggregate the data before using it to train an AI model.
Legislators also are considering preemptively regulating AI that is more advanced than systems currently in existence. SB 1047 would create a new Frontier Model Division to regulate AI models trained using more than 10²⁶ integer or floating-point operations. The legislature emphasized this would not regulate any technology currently in existence. The bill would also require operators of a cluster of computers capable of performing 10²⁰ integer or floating-point operations per second to establish certain policies around customer use of the cluster.
Colorado Privacy Act Amended To Include Biometric Data Provisions
On May 31, 2024, Colorado Governor Jared Polis signed HB 1130 into law. This legislation amends the Colorado Privacy Act to add specific requirements for the processing of an individual’s biometric data. This law does not have a private right of action.
Similar to the Illinois Biometric Information Privacy Act…
Nebraska Enacts Nebraska Data Privacy Act
On April 17, the Nebraska governor signed the Nebraska Data Privacy Act (the “NDPA”) into law. Nebraska is the latest state to enact comprehensive privacy legislation, joining California, Virginia, Colorado, Connecticut, Utah, Iowa, Indiana, Tennessee, Montana, Oregon, Texas, Florida, Delaware…
Florida Enacts Social Media Bill Restricting Access for Teens Under the Age of Sixteen
On Monday, March 25, Florida Governor Ron DeSantis signed SB 3 into law. At a high level, the bill requires social media platforms to terminate the accounts of individuals under the age of 14 and to obtain parental consent for the accounts of those 14 or 15 years of age. The law…
California Appeals Court Vacates Enforcement Delay of CPPA Regulations
On February 9, the Third Appellate District of California vacated a trial court’s decision that held that enforcement of the California Privacy Protection Agency’s (“CPPA”) regulations could not commence until one year after the finalized date of the regulations. As we previously explained, the Superior Court’s order prevented the…