
Libbie Canter

Libbie Canter represents a wide variety of multinational companies on privacy, cybersecurity, and technology transaction issues, including helping clients with their most complex privacy challenges and the development of governance frameworks and processes to comply with global privacy laws. She routinely supports clients on their efforts to launch new products and services involving emerging technologies, and she has assisted dozens of clients with their efforts to prepare for and comply with federal and state privacy laws, including the California Consumer Privacy Act and California Privacy Rights Act.

Libbie represents clients across industries, but she also has deep expertise in advising clients in highly regulated sectors, including financial services and digital health companies. She counsels these companies — and their technology and advertising partners — on how to address legacy regulatory issues and the cutting-edge issues that have emerged with industry innovations and data collaborations.

On April 2, the Enforcement Division of the California Privacy Protection Agency issued its first Enforcement Advisory, titled “Applying Data Minimization to Consumer Requests.”  The Advisory highlights certain provisions of and regulations promulgated under the California Consumer Privacy Act (“CCPA”) that “reflect the concept of data minimization” and provides two examples that illustrate how

A new post on the Covington Inside Privacy blog discusses remarks by California Privacy Protection Agency (CPPA) Executive Director Ashkan Soltani at the International Association of Privacy Professionals’ global privacy conference last week.  The remarks covered the CPPA’s priorities for rulemaking and administrative enforcement of the California Consumer Privacy Act, including with respect to connected

On February 9, the Third Appellate District of California vacated a trial court’s decision that held that enforcement of the California Privacy Protection Agency’s (“CPPA”) regulations could not commence until one year after the date the regulations were finalized.  As we previously explained, the Superior Court’s order prevented the CPPA from enforcing the regulations

New Jersey and New Hampshire are the latest states to pass comprehensive privacy legislation, joining California, Virginia, Colorado, Connecticut, Utah, Iowa, Indiana, Tennessee, Montana, Oregon, Texas, Florida, and Delaware.  Below is a summary of key takeaways. 

New Jersey

On January 8, 2024, the New Jersey state senate passed S.B. 332 (“the Act”), which was signed into law on January 16, 2024.  The Act, which takes effect 365 days after enactment, resembles the comprehensive privacy statutes in Connecticut, Colorado, Montana, and Oregon, though there are some notable distinctions. 

  • Scope and Applicability:  The Act will apply to controllers that conduct business or produce products or services in New Jersey, and, during a calendar year, control or process either (1) the personal data of at least 100,000 consumers, excluding personal data processed for the sole purpose of completing a transaction; or (2) the personal data of at least 25,000 consumers where the business derives revenue, or receives a discount on the price of any goods or services, from the sale of personal data. The Act omits several exemptions present in other state comprehensive privacy laws, including exemptions for nonprofit organizations and information covered by the Family Educational Rights and Privacy Act.
  • Consumer Rights:  Consumers will have the rights of access, deletion, portability, and correction under the Act.  Moreover, the Act will provide consumers with the right to opt out of targeted advertising, the sale of personal data, and profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.  Within six months of the Act’s effective date, the Act will require controllers to recognize a universal opt-out mechanism by which consumers can exercise these opt-out rights.
  • Sensitive Data:  The Act will require consent prior to the collection of sensitive data. “Sensitive data” is defined to include, among other things, racial or ethnic origin, religious beliefs, mental or physical health condition, sex life or sexual orientation, citizenship or immigration status, status as transgender or non-binary, and genetic or biometric data.  Notably, the Act is the first comprehensive privacy statute other than the California Consumer Privacy Act to include financial information in its definition of sensitive data.  The Act defines financial information as an “account number, account log-in, financial account, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to a consumer’s financial account.”
  • Opt-In Consent for Certain Processing of Personal Data Concerning Teens:  Unless a controller obtains a consumer’s consent, the Act will prohibit the controller from processing personal data for targeted advertising, sale, or profiling where the controller has actual knowledge, or willfully disregards, that the consumer is between the ages of 13 and 16 years old.
  • Enforcement and Rulemaking:  The Act grants the New Jersey Attorney General enforcement authority.  The Act also provides controllers with a 30-day right to cure for certain violations, which will sunset eighteen months after the Act’s effective date.  Like the comprehensive privacy laws in California and Colorado, the Act authorizes rulemaking under the state Administrative Procedure Act.  Specifically, the Act requires the Director of the Division of Consumer Affairs in the Department of Law and Public Safety to promulgate rules and regulations pursuant to the Administrative Procedure Act that are necessary to effectuate the Act’s provisions.  

Continue Reading New Jersey and New Hampshire Pass Comprehensive Privacy Legislation

Ahead of its December 8 board meeting, the California Privacy Protection Agency (CPPA) has issued draft risk assessment regulations.  The CPPA has yet to initiate the formal rulemaking process and has stated that it expects to begin formal rulemaking next year, at which time it will also consider draft regulations covering “automated decisionmaking technology” (ADMT), cybersecurity audits, and revisions to existing regulations.  Accordingly, the draft risk assessment regulations are subject to change.  Below are the key takeaways:

When a Risk Assessment is Required: The draft regulations would require businesses to conduct a risk assessment before processing consumers’ personal information in a manner that “presents significant risk to consumers’ privacy.”  The draft regulations identify several activities that would present such risk:

  • Selling or sharing personal information;
  • Processing sensitive personal information (except in certain situations involving employees and independent contractors);
  • Using ADMT (1) for a decision that produces legal or similarly significant effects concerning a consumer, (2) to profile a consumer who is acting in their capacity as an employee, independent contractor, job applicant, or student, (3) to profile a consumer while they are in a public place, or (4) for profiling for behavioral advertising; or
  • Processing a consumer’s personal information if the business has actual knowledge the consumer is under 16.

Continue Reading CPPA Releases Draft Risk Assessment Regulations

On October 10, 2023, California Governor Gavin Newsom signed S.B. 362, the Delete Act (the “Act”), into law.  The new law represents a substantive overhaul of California’s existing data broker statute, which requires data brokers to register with the California Attorney General annually.  The passage of the Act follows a renewed interest in data

On June 22, 2023, the Oregon state legislature passed the Oregon Consumer Privacy Act, S.B. 619 (the “Act”).  This bill resembles the comprehensive privacy statutes in Colorado, Montana, and Connecticut, though there are some notable distinctions.  If signed into law, the Act will make Oregon the twelfth state to implement a comprehensive privacy statute, joining California, Virginia, Colorado, Connecticut

On April 25, 2023, four federal agencies — the Department of Justice (“DOJ”), Federal Trade Commission (“FTC”), Consumer Financial Protection Bureau (“CFPB”), and Equal Employment Opportunity Commission (“EEOC”) — released a joint statement on the agencies’ efforts to address discrimination and bias in automated systems. 

The statement applies to “automated systems,” which are broadly defined “to mean software and algorithmic processes” beyond AI.  Although the statement notes the significant benefits that can flow from the use of automated systems, it also cautions against unlawful discrimination that may result from that use. 

The statement starts by summarizing the existing legal authorities that apply to automated systems and each agency’s guidance and statements related to AI.  Helpfully, the statement serves to aggregate links to key AI-related guidance documents from each agency, providing a one-stop-shop for important AI-related publications for all four entities.  For example, the statement summarizes the EEOC’s remit in enforcing federal laws that make it unlawful to discriminate against an applicant or employee and the EEOC’s enforcement activities related to AI, and includes a link to a technical assistance document.  Similarly, the report outlines the FTC’s reports and guidance on AI, and includes multiple links to FTC AI-related documents.

After providing an overview of each agency’s position and links to key documents, the statement then summarizes the following sources of potential discrimination and bias, which could indicate the regulatory and enforcement priorities of these agencies.

  • Data and Datasets:  The statement notes that outcomes generated by automated systems can be skewed by unrepresentative or imbalanced data sets.  The statement says that flawed data sets, along with correlation between data and protected classes, can lead to discriminatory outcomes.
  • Model Opacity and Access:  The statement observes that some automated systems are “black boxes,” meaning that the internal workings of automated systems are not always transparent to people, and thus difficult to oversee.
  • Design and Use:  The statement also notes that flawed assumptions about users may play a role in unfair or biased outcomes.

We will continue to monitor these and related developments across our blogs.

Continue Reading DOJ, FTC, CFPB, and EEOC Statement on Discrimination and AI

By Libbie Canter, Anna D. Kraus, Olivia Vega, Elizabeth Brim & Jorge Ortiz on April 14, 2023

On April 11, the U.S. Department of Health and Human Services (“HHS”) Office for Civil Rights (“OCR”) announced that four Notifications of Enforcement Discretion (“Notifications”) that were issued under the Health Insurance Portability and Accountability Act