
Priya Leeds

Priya Sundaresan Leeds is an associate in the firm’s San Francisco office. She is a member of the Privacy and Cybersecurity Practice Group. She also maintains an active pro bono practice with a focus on gun control and criminal justice.

On July 9, 2024, the FTC and California Attorney General settled a case against NGL Labs (“NGL”) and two of its co-founders. NGL Labs’ app, “NGL: ask me anything,” allows users to receive anonymous messages from their friends and social media followers. The complaint alleged violations of the FTC Act

Continue Reading FTC Reaches Settlement with NGL Labs Over Children’s Privacy & AI

With three months left until the end of this year’s legislative session, the California Legislature has been considering a flurry of bills regarding artificial intelligence (AI). Notable bills, described further below, impose requirements on developers and deployers of generative AI systems. The bills contain varying definitions of AI and generative AI systems. Each of these bills has been passed by one legislative chamber, but remains under consideration in the other chamber.

Legislation Regulating AI Developers

Two bills would require generative AI systems to make AI-generated content easily identifiable.

  • SB 942 would require generative AI systems that average 1 million monthly visitors or users to provide an “AI detection tool” that would verify whether content was generated by the system. It would also require AI-generated content to carry a visible and difficult-to-remove disclosure that the content was generated by AI. A noncompliant system would incur a $5,000 fine per day, although only the Attorney General could file an enforcement action.
  • AB 3211 would require, starting February 1, 2025, that every generative AI system, as defined under the law, place watermarks in AI-generated content. Generative AI systems would need to develop associated decoders that would verify whether content was generated by the system. A system available before February 1, 2025 could remain available only if the provider of the system created a decoder with 99% accuracy or published research showing that the system is incapable of producing inauthentic content. A system used in a conversational setting (e.g., chatbots) would need to clearly disclose that it generates synthetic content. Additionally, vulnerabilities in the system would need to be reported to the Department of Technology. The Department of Technology would have administrative enforcement authority to impose penalties up to the greater of $1 million or 5% of the violator’s annual global revenue.

Two additional bills would limit or require disclosure of information about data sources used to train AI models.

  • AB 2013 would require, beginning January 1, 2026, that the developer of any AI model post on its website information about the data used to train the model, including: the source or owner of the data; the number of samples in the data; whether the data is protected by copyright, trademark, or patent; and whether the data contains personal information or aggregate consumer information, as defined in the California Consumer Privacy Act (CCPA). AI models developed solely to ensure security and integrity would be exempt from this requirement.
  • AB 2877 would prohibit using the personal information, as defined in the CCPA, of individuals under 16 years old to train an AI model without affirmative consent. For individuals under 13 years old, a parent would need to provide the affirmative consent. Even with consent, the developer would need to deidentify and aggregate the data before using it to train an AI model.

Legislators are also considering preemptive regulation of AI that is more advanced than systems currently in existence. SB 1047 would create a new Frontier Model Division to regulate AI models trained on a system that can perform 10²⁶ integer operations per second (IOPS) or floating-point operations per second (FLOPS). The legislature emphasized that this would not regulate any technology currently in existence. The bill would also require operators of a cluster of computers that can perform 10²⁰ IOPS or FLOPS to establish certain policies around customer use of the cluster.

Continue Reading California Legislature Advances Several AI-Related Bills

On May 31, 2024, Colorado Governor Jared Polis signed HB 1130 into law. This legislation amends the Colorado Privacy Act to add specific requirements for the processing of an individual’s biometric data. This law does not have a private right of action.

Similarly to the Illinois Biometric Information Privacy Act

Continue Reading Colorado Privacy Act Amended To Include Biometric Data Provisions

On Monday, March 25, Florida Governor Ron DeSantis signed HB 3 into law. At a high level, the bill requires social media platforms to terminate the accounts of individuals under the age of 14, while seeking parental consent for accounts of those 14 or 15 years of age. The law

Continue Reading Florida Enacts Social Media Bill Restricting Access for Teens Under the Age of Sixteen