Lindsey Tonsager

Lindsey Tonsager co-chairs the firm’s global Data Privacy and Cybersecurity practice. She advises clients in their strategic and proactive engagement with the Federal Trade Commission, the U.S. Congress, the California Privacy Protection Agency, and state attorneys general on proposed changes to data protection laws, and regularly represents clients in responding to investigations and enforcement actions involving their privacy and information security practices.

Lindsey’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of artificial intelligence, data processing for connected devices, biometrics, online advertising, endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, e-mail marketing, disclosures of video viewing information, and new technologies.

Lindsey also assesses privacy and data security risks in complex corporate transactions where personal data is a critical asset or data processing risks are otherwise material. In light of a dynamic regulatory environment where new state, federal, and international data protection laws are always on the horizon and enforcement priorities are shifting, she focuses on designing risk-based, global privacy programs for clients that can keep pace with evolving legal requirements and efficiently leverage the clients’ existing privacy policies and practices. She conducts data protection assessments to benchmark against legal requirements and industry trends and proposes practical risk mitigation measures.

On July 9, 2024, the FTC and California Attorney General settled a case against NGL Labs (“NGL”) and two of its co-founders. NGL’s app, “NGL: ask me anything,” allows users to receive anonymous messages from their friends and social media followers. The complaint alleged violations of the FTC Act, the Restore Online Shoppers’ Confidence Act, and other laws…

On June 18, 2024, Louisiana enacted HB 577, prohibiting “social media platforms” with more than 1 million users globally from displaying targeted advertising to Louisiana users that the platform has actual knowledge are under 18 years of age and from selling the sensitive personal data of such users. The law amends the effective date…

With three months left until the end of this year’s legislative session, the California Legislature has been considering a flurry of bills regarding artificial intelligence (AI). Notable bills, described further below, impose requirements on developers and deployers of generative AI systems. The bills contain varying definitions of AI and generative AI systems. Each of these bills has been passed by one legislative chamber, but remains under consideration in the other chamber.

Legislation Regulating AI Developers

Two bills would require generative AI systems to make AI-generated content easily identifiable.

  • SB 942 would require generative AI systems that average 1 million monthly visitors or users to provide an “AI detection tool” that would verify whether content was generated by the system. It would also require AI-generated content to carry a visible, difficult-to-remove disclosure that the content was generated by AI. A noncompliant system would incur a fine of $5,000 per day, although only the Attorney General could file an enforcement action.
  • AB 3211 would require, starting February 1, 2025, that every generative AI system, as defined under the law, place watermarks in AI-generated content. Providers would also need to develop associated decoders that verify whether content was generated by their systems (a conceptual sketch of such an embed/verify pair follows this list). A system available before February 1, 2025 could remain available only if the provider created a decoder with 99% accuracy or published research showing that the system is incapable of producing inauthentic content. A system used in a conversational setting (e.g., a chatbot) would need to clearly disclose that it generates synthetic content. Additionally, vulnerabilities in the system would need to be reported to the Department of Technology, which would have administrative enforcement authority to impose penalties up to the greater of $1 million or 5% of the violator’s annual global revenue.
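
Neither bill prescribes a technical mechanism, so the following is only a minimal sketch of the embed/verify pattern the bills contemplate: a provider marks its output, and a decoder checks for the mark. Everything here (the HMAC-based tag, the key, the function names) is an assumption for illustration; a production system would use robust, imperceptible watermarking rather than an appended text footer.

    # Hypothetical sketch only; not a mechanism specified by SB 942 or AB 3211.
    import hmac
    import hashlib

    PROVIDER_KEY = b"provider-secret-key"  # assumption: a provider-held signing key

    def embed_disclosure(content: str) -> str:
        """Append a provider-verifiable provenance footer to generated text."""
        tag = hmac.new(PROVIDER_KEY, content.encode(), hashlib.sha256).hexdigest()[:16]
        return f"{content}\n[AI-generated | provenance:{tag}]"

    def verify_disclosure(marked: str) -> bool:
        """The 'decoder': check whether text carries this provider's footer."""
        body, sep, footer = marked.rpartition("\n[AI-generated | provenance:")
        if not sep or not footer.endswith("]"):
            return False
        tag = footer[:-1]
        expected = hmac.new(PROVIDER_KEY, body.encode(), hashlib.sha256).hexdigest()[:16]
        return hmac.compare_digest(tag, expected)

    print(verify_disclosure(embed_disclosure("Hello from a model.")))  # True
    print(verify_disclosure("Hello with no footer."))                  # False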

Two additional bills would limit or require disclosure of information about data sources used to train AI models.

  • AB 2013 would require, beginning January 1, 2026, that the developer of any AI model post on its website information regarding the data used to train the model, including: the source or owner of the data; the number of samples in the data; whether the data is protected by copyright, trademark, or patent; and whether the data contains personal information or aggregate consumer information, each as defined in the California Consumer Privacy Act (CCPA). AI models developed solely to ensure security and integrity would be exempt from this requirement. (A sketch of what such a disclosure record might look like follows this list.)
  • AB 2877 would prohibit using personal information, as defined in the CCPA, of individuals under 16 years old to train an AI model without affirmative consent; for individuals under 13, a parent would need to provide that consent. Even with consent, the developer would need to deidentify and aggregate the data before using it to train an AI model.
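
AB 2013 enumerates disclosure topics but not a format. As a purely illustrative sketch, the required fields could be captured in a record like the following, where the field names and example values are assumptions mapping to the bill’s list:

    # Illustrative only: AB 2013 lists disclosure topics, not a schema.
    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class TrainingDataDisclosure:
        source_or_owner: str                    # source or owner of the data
        sample_count: int                       # number of samples in the data
        ip_protected: bool                      # copyright, trademark, or patent
        contains_personal_info: bool            # per the CCPA definition
        contains_aggregate_consumer_info: bool  # per the CCPA definition

    # Hypothetical example of what a developer might post:
    disclosure = TrainingDataDisclosure(
        source_or_owner="Example Web Corpus (hypothetical)",
        sample_count=1_200_000,
        ip_protected=True,
        contains_personal_info=False,
        contains_aggregate_consumer_info=True,
    )
    print(json.dumps(asdict(disclosure), indent=2))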

Legislators are also considering preemptively regulating AI that is more advanced than systems currently in existence. SB 1047 would create a new Frontier Model Division to regulate AI models trained on a system that can perform 10²⁶ integer operations per second (IOPS) or floating-point operations per second (FLOPS). The legislature emphasized this would not regulate any technology currently in existence. The bill would also require operators of a cluster of computers that can perform 10²⁰ IOPS or FLOPS to establish certain policies around customer use of the cluster.
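
For a sense of scale, here is a minimal sketch of the two capability checks. The 10²⁶ and 10²⁰ figures come from the bill as summarized above; the function names and example values are assumptions:

    # Sketch only: thresholds from the bill as summarized; names hypothetical.
    MODEL_SYSTEM_THRESHOLD_OPS_PER_SEC = 10**26   # Frontier Model Division trigger
    CLUSTER_THRESHOLD_OPS_PER_SEC = 10**20        # customer-use policy trigger

    def triggers_frontier_model_rules(system_ops_per_sec: float) -> bool:
        """Would a model trained on a system this fast be covered?"""
        return system_ops_per_sec >= MODEL_SYSTEM_THRESHOLD_OPS_PER_SEC

    def triggers_cluster_rules(cluster_ops_per_sec: float) -> bool:
        """Would a cluster operator need customer-use policies?"""
        return cluster_ops_per_sec >= CLUSTER_THRESHOLD_OPS_PER_SEC

    # Today's largest supercomputers sustain on the order of 10**18 FLOPS
    # (about one exaFLOPS), below both thresholds -- consistent with the
    # legislature's statement that no current technology would be covered.
    print(triggers_frontier_model_rules(1e18))  # False
    print(triggers_cluster_rules(1e18))         # False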

On May 31, 2024, Colorado Governor Jared Polis signed HB 1130 into law. This legislation amends the Colorado Privacy Act to add specific requirements for the processing of an individual’s biometric data. This law does not have a private right of action.

Like the Illinois Biometric Information Privacy Act (BIPA), this law requires controllers…

On April 17, the Nebraska governor signed the Nebraska Data Privacy Act (the “NDPA”) into law.  Nebraska is the latest state to enact comprehensive privacy legislation, joining California, Virginia, Colorado, Connecticut, Utah, Iowa, Indiana, Tennessee, Montana, Oregon, Texas, Florida, Delaware, New Jersey, New Hampshire…

On February 9, the Third Appellate District of California vacated a trial court’s decision that held that enforcement of the California Privacy Protection Agency’s (“CPPA”) regulations could not commence until one year after the date the regulations were finalized.  As we previously explained, the Superior Court’s order prevented the CPPA from enforcing the regulations…

New Jersey and New Hampshire are the latest states to pass comprehensive privacy legislation, joining California, Virginia, Colorado, Connecticut, Utah, Iowa, Indiana, Tennessee, Montana, Oregon, Texas, Florida, and Delaware.  Below is a summary of key takeaways.

New Jersey

On January 8, 2024, the New Jersey state senate passed S.B. 332 (“the Act”), which was signed into law on January 16, 2024.  The Act, which takes effect 365 days after enactment, resembles the comprehensive privacy statutes in Connecticut, Colorado, Montana, and Oregon, though there are some notable distinctions. 

  • Scope and Applicability:  The Act will apply to controllers that conduct business or produce products or services in New Jersey and, during a calendar year, control or process either (1) the personal data of at least 100,000 consumers, excluding personal data processed for the sole purpose of completing a transaction; or (2) the personal data of at least 25,000 consumers where the business derives revenue, or receives a discount on the price of any goods or services, from the sale of personal data (a schematic check of these thresholds appears after this list). The Act omits several exemptions present in other state comprehensive privacy laws, including exemptions for nonprofit organizations and information covered by the Family Educational Rights and Privacy Act.
  • Consumer Rights:  Consumers will have the rights of access, deletion, portability, and correction under the Act.  Moreover, the Act will provide consumers with the right to opt out of targeted advertising, the sale of personal data, and profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.  Within six months of the Act’s effective date, controllers must offer a universal opt-out mechanism by which consumers can exercise these opt-out rights.
  • Sensitive Data:  The Act will require consent prior to the collection of sensitive data. “Sensitive data” is defined to include, among other things, racial or ethnic origin, religious beliefs, mental or physical health condition, sex life or sexual orientation, citizenship or immigration status, status as transgender or non-binary, and genetic or biometric data.  Notably, the Act is the first comprehensive privacy statute other than the California Consumer Privacy Act to include financial information in its definition of sensitive data.  The Act defines financial information as an “account number, account log-in, financial account, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to a consumer’s financial account.”
  • Opt-In Consent for Certain Processing of Personal Data Concerning Teens:  Unless a controller obtains a consumer’s consent, the Act will prohibit the controller from processing personal data for targeted advertising, sale, or profiling where the controller has actual knowledge, or willfully disregards, that the consumer is between the ages of 13 and 16.
  • Enforcement and Rulemaking:  The Act grants the New Jersey Attorney General enforcement authority.  The Act also provides controllers with a 30-day right to cure for certain violations, which will sunset eighteen months after the Act’s effective date.  Like the comprehensive privacy laws in California and Colorado, the Act authorizes rulemaking under the state Administrative Procedure Act.  Specifically, the Act requires the Director of the Division of Consumer Affairs in the Department of Law and Public Safety to promulgate rules and regulations pursuant to the Administrative Procedure Act that are necessary to effectuate the Act’s provisions.  
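
As flagged in the first bullet above, the Act’s applicability test reduces to a simple two-prong check. The sketch below is illustrative only: the thresholds come from the Act, while the function and parameter names are assumptions, and the caller is assumed to have already excluded personal data processed solely to complete a transaction from the consumer count.

    # Schematic applicability check; thresholds from the Act, names hypothetical.
    def njdpa_applies(consumers: int, derives_revenue_or_discount_from_sale: bool) -> bool:
        # Prong 1: personal data of at least 100,000 consumers (the caller must
        # already exclude data processed solely to complete a transaction).
        if consumers >= 100_000:
            return True
        # Prong 2: at least 25,000 consumers, plus revenue (or a discount) from
        # the sale of personal data.
        return consumers >= 25_000 and derives_revenue_or_discount_from_sale

    print(njdpa_applies(30_000, derives_revenue_or_discount_from_sale=True))   # True
    print(njdpa_applies(30_000, derives_revenue_or_discount_from_sale=False))  # False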

Ahead of its December 8 board meeting, the California Privacy Protection Agency (CPPA) has issued draft risk assessment regulations.  The CPPA has yet to initiate the formal rulemaking process and has stated that it expects to begin formal rulemaking next year, at which time it will also consider draft regulations covering “automated decisionmaking technology” (ADMT), cybersecurity audits, and revisions to existing regulations.  Accordingly, the draft risk assessment regulations are subject to change.  Below are the key takeaways:

When a Risk Assessment is Required: The draft regulations would require businesses to conduct a risk assessment before processing consumers’ personal information in a manner that “presents significant risk to consumers’ privacy.”  The draft regulations identify several activities that would present such risk (a schematic version of this trigger check follows the list):

  • Selling or sharing personal information;
  • Processing sensitive personal information (except in certain situations involving employees and independent contractors);
  • Using ADMT (1) for a decision that produces legal or similarly significant effects concerning a consumer, (2) to profile a consumer who is acting in their capacity as an employee, independent contractor, job applicant, or student, (3) to profile a consumer while they are in a public place, or (4) for profiling for behavioral advertising; or
  • Processing a consumer’s personal information if the business has actual knowledge the consumer is under 16.
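
As a purely schematic sketch of the trigger logic above, where the data model, function names, and category strings are assumptions rather than anything appearing in the draft regulations:

    # Schematic only: trigger categories track the draft regulations as
    # summarized above; the data model and names are assumptions.
    from dataclasses import dataclass, field

    ADMT_TRIGGERS = {
        "significant-decision",            # legal or similarly significant effects
        "workplace-or-student-profiling",  # employees, contractors, applicants, students
        "public-place-profiling",
        "behavioral-advertising-profiling",
    }

    @dataclass
    class ProcessingActivity:
        sells_or_shares: bool = False
        processes_sensitive_info: bool = False  # outside the employee/contractor carve-outs
        admt_uses: set = field(default_factory=set)
        knows_consumer_under_16: bool = False

    def risk_assessment_required(a: ProcessingActivity) -> bool:
        """Return True if any enumerated 'significant risk' trigger applies."""
        return (
            a.sells_or_shares
            or a.processes_sensitive_info
            or bool(a.admt_uses & ADMT_TRIGGERS)
            or a.knows_consumer_under_16
        )

    print(risk_assessment_required(
        ProcessingActivity(admt_uses={"behavioral-advertising-profiling"})
    ))  # True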

In the past year, plaintiffs have filed a wave of lawsuits asserting claims under the Video Privacy Protection Act (“VPPA”) in connection with the alleged use of third-party pixels on websites that offer video content.  A recent decision establishes the limits of the VPPA’s reach and provides a well-reasoned ground for future motions to dismiss.