Nicholas Shepherd

Nicholas Shepherd is an associate in Covington's Washington, DC office, where he is a member of the Data Privacy and Cybersecurity Practice Group. He advises clients on compliance with all aspects of the European General Data Protection Regulation (GDPR), the ePrivacy Directive, European direct marketing laws, and other privacy and cybersecurity laws worldwide. Nick counsels on topics including adtech, anonymization, children's privacy, and cross-border transfer restrictions, providing advice tailored to product- and service-specific contexts to help clients apply a risk-based approach to requirements concerning transparency, consent, lawful processing, and data sharing, among others.

A U.S.-trained and qualified lawyer with seven years of work experience in Europe, Nick leverages his multi-faceted legal background and international experience to provide clear and pragmatic advice that helps organizations address their privacy compliance obligations across jurisdictions.

Nicholas is a member of the Bar of Texas and Brussels Bar (Dutch Section, B-List). District of Columbia bar application pending; supervised by principals of the firm.

On September 28, 2023, the Cyberspace Administration of China (“CAC”) issued draft Provisions on Standardizing and Promoting Cross-Border Data Flows (Draft for Comment) (规范和促进数据跨境流动规定(征求意见稿)) (draft “Provisions”) (Chinese version available here) for a public consultation, which will conclude on October 15, 2023. 

The draft Provisions propose significant changes to the existing

On April 11, 2023, the Cyberspace Administration of China (“CAC”) released draft Administrative Measures for Generative Artificial Intelligence Services (《生成式人工智能服务管理办法(征求意见稿)》) (“draft Measures”) (official Chinese version available here) for public consultation.  The deadline for submitting comments is May 10, 2023.

The draft Measures would regulate generative Artificial Intelligence (“AI”) services that are “provided to the public in mainland China.”  These requirements cover a wide range of issues that are frequently debated in relation to the governance of generative AI globally, such as data protection, non-discrimination, bias and the quality of training data.  The draft Measures also highlight issues arising from the use of generative AI that are of particular concern to the Chinese government, such as content moderation, the completion of a security assessment for new technologies, and algorithmic transparency.  The draft Measures thus reflect the Chinese government’s objective to craft its own governance model for new technologies such as generative AI.

Further, and notwithstanding the requirements introduced by the draft Measures (as described in greater detail below), the text states that the government encourages the (indigenous) development of (and international cooperation in relation to) generative AI technology, and encourages companies to adopt “secure and trustworthy software, tools, computing and data resources” to that end. 

Notably, the draft Measures do not distinguish between generative AI services offered to individual consumers and those offered to enterprise customers, although certain requirements appear to be directed more at consumer-facing services than at enterprise services.

Continue Reading China Proposes Draft Measures to Regulate Generative AI

On March 21, 2022, the European Data Protection Board (“EDPB”) published its draft Guidelines 3/2022 on Dark patterns in social media platform interfaces (hereafter “Guidelines”, available here), following the EDPB’s plenary session held on March 14, 2022.  The stated objective of the Guidelines is to provide practical guidance to both designers and users of social media platforms on how to identify and avoid so-called “dark patterns” in social media interfaces that would violate requirements set out in the EU’s General Data Protection Regulation (“GDPR”).  In this sense, the Guidelines serve both to instruct organizations on how to design their platforms and user interfaces in a GDPR-compliant manner, and to educate users on how certain practices they are subject to could run contrary to the GDPR (which could, as a result, lead to an increase in GDPR complaints arising from such practices).  The Guidelines are currently subject to a 6-week period of public consultation, and interested parties are invited to submit feedback directly to the EDPB here (see “provide your feedback” button).

In this blog post, we summarize the Guidelines and identify key takeaways.  Notably, while the Guidelines are targeted to designers and users of social media platforms, they may offer helpful insights to organizations across other sectors seeking to comply with the GDPR, and in particular, its requirements with respect to fairness, transparency, data minimization, purpose limitation, facilitating personal data rights, and so forth.

Setting the Stage

At the outset of the Guidelines, the EDPB defines “dark patterns” as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”.  The EDPB then provides a taxonomy of 6 defined categories of dark patterns, namely:

  1. Overloading – overwhelming users with a large quantity of requests, information, options, or possibilities to prompt them to share more data;
  2. Skipping – designing the interface or user experience in a way that causes users to forget (or fail to consider) all or certain data protection aspects of a decision;
  3. Stirring – appealing to the emotions of users or using visual nudges;
  4. Hindering – obstructing or blocking users from becoming informed about the use of their data, or from exercising control over it, by making certain actions hard or impossible to achieve;
  5. Fickle – designing the interface in an inconsistent and unclear manner that makes it difficult to navigate user controls or understand processing purposes; and finally,
  6. Left in the dark – designing the interface in a way that hides information or privacy controls, or leaves users uncertain about how their data is processed and the control they can exercise over it.

The EDPB notes that these six categories can also be thematically framed as “content-based patterns” (i.e., referring to the content of information presented to users, including the context, wording used, and informational components) or “interface-based patterns” (i.e., referring to the manner that content is displayed, navigated through, or interacted with by users, which can have a direct influence on the perception of dark patterns).

Beneath the six over-arching categories of dark patterns outlined above, the EDPB then identifies 15 specific dark pattern behaviors and considers how each of them can manifest during the lifecycle of a social media user account, a continuum which the EDPB breaks down into the following 5 stages: (1) opening a social media account; (2) staying informed on social media; (3) staying protected on social media; (4) exercising personal data rights on social media; and (5) leaving a social media account.
Continue Reading EDPB Publishes Draft Guidelines on the Use of “Dark Patterns” in Social Media Interfaces