On April 28, 2022, Covington convened experts across our practice groups for the Covington Robotics Forum, which explored recent developments and forecasts relevant to industries affected by robotics.  Sam Jungyun Choi, Associate in Covington’s Technology Regulatory Group, and Anna Oberschelp, Associate in Covington’s Data Privacy & Cybersecurity Practice Group, discussed global regulatory trends that affect robotics, highlights of which are captured here.  A recording of the forum is available here until May 31, 2022.

Trends on Regulating Artificial Intelligence

According to the Organization for Economic Cooperation and Development (“OECD”) Artificial Intelligence Policy Observatory, since 2017, at least 60 countries have adopted some form of AI policy.  Countries around the world are establishing governmental and intergovernmental strategies and initiatives to guide the development of AI.  These AI initiatives include: (1) AI regulation or policy; (2) AI enablers (e.g., research and public awareness); and (3) financial support (e.g., procurement programs for AI R&D).  The anticipated introduction of AI regulations raises concerns about looming challenges for international cooperation.

Continue Reading Robotics Spotlight: Global Regulatory Trends Affecting Robotics

On May 10, 2022, Prince Charles announced in the Queen’s Speech that the UK Government’s proposed Online Safety Bill (the “OSB”) will proceed through Parliament. The OSB is currently at committee stage in the House of Commons. Since it was first announced in December 2020, the OSB has been the subject of intense debate and scrutiny on the balance it seeks to strike between online safety and protecting children on the one hand, and freedom of expression and privacy on the other.

To what services does the OSB apply?

The OSB applies to “user-to-user” (“U2U”) services—essentially, services through which users can share content online, such as social media and online messaging services—and “search” services. The OSB specifically excludes email services, SMS, “internal business services,” and services where the communications functionality is limited (e.g., to posting comments relating to content produced by the provider of the service). The OSB also excludes “one-to-one live aural communications”—suggesting that one-to-one over-the-top (“OTT”) calls are excluded, but that one-to-many OTT calls, or video calls, may fall within scope.

Continue Reading Online Safety Bill to Proceed Through Parliament

The Connecticut legislature passed Connecticut SB 6 on April 28, 2022.  If signed by the governor, the bill would take effect on July 1, 2023, though the task force created by the bill will be required to begin work sooner.

The bill closely resembles the Colorado Privacy Act, with a few notable additions.  Like the…

On March 23, 2022, the German Conference of Independent Supervisory Authorities (“DSK”) published a statement on scientific research and data protection (see here, in German).  The DSK published the statement in response to the German Government’s initiative on a general law on research data as part of its Open Data Strategy, announced…

As many readers will be aware, a key enforcement trend in the privacy sphere is the increasing scrutiny by regulators and activists of cookie banners and the use of cookies. This is a topic that we have been tracking on the Inside Privacy blog for some time. Italian and German data protection authorities have…

On March 21, 2022, the European Data Protection Board (“EDPB”) published its draft Guidelines 3/2022 on Dark patterns in social media platform interfaces (hereafter “Guidelines”, available here), following the EDPB’s plenary session held on March 14, 2022.  The stated objective of the Guidelines is to provide practical guidance to both designers and users of social media platforms about how to identify and avoid so-called “dark patterns” in social media interfaces that would violate requirements set out in the EU’s General Data Protection Regulation (“GDPR”).  In this sense, the Guidelines serve both to instruct organizations on how to design their platforms and user interfaces in a GDPR-compliant manner, and to educate users on how certain practices they are subject to could run contrary to the GDPR (which could, as a result, lead to an increase in GDPR complaints arising from such practices).  The Guidelines are currently subject to a 6-week period of public consultation, and interested parties are invited to submit feedback directly to the EDPB here (see “provide your feedback” button).

In this blog post, we summarize the Guidelines and identify key takeaways.  Notably, while the Guidelines are targeted to designers and users of social media platforms, they may offer helpful insights to organizations across other sectors seeking to comply with the GDPR, and in particular, its requirements with respect to fairness, transparency, data minimization, purpose limitation, facilitating personal data rights, and so forth.

Setting the Stage

At the outset of the Guidelines, the EDPB defines “dark patterns” as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”.  The EDPB then provides a taxonomy of 6 defined categories of dark patterns, namely:

  1. Overloading – overwhelming users with a large quantity of requests, information, options, or possibilities to prompt them to share more data;
  2. Skipping – designing the interface or user experience in a way that causes users to forget (or fail to consider) all or certain data protection aspects of a decision;
  3. Stirring – appealing to the emotions of users or using visual nudges;
  4. Hindering – obstructing or blocking users from becoming informed about the use of their data, or from exercising control over it, by making certain actions hard or impossible to achieve;
  5. Fickle – designing the interface in an inconsistent and unclear manner that makes it difficult to navigate user controls or understand processing purposes; and finally,
  6. Left in the dark – designing the interface in a way that hides information or privacy controls, or leaves users uncertain about how their data is processed and what control they can exercise over it.

The EDPB notes that these six categories can also be thematically framed as “content-based patterns” (i.e., referring to the content of information presented to users, including the context, wording used, and informational components) or “interface-based patterns” (i.e., referring to the manner that content is displayed, navigated through, or interacted with by users, which can have a direct influence on the perception of dark patterns).

Beneath the six overarching categories of dark patterns outlined above, the EDPB identifies 15 specific dark pattern behaviors and considers how each of them can manifest during the lifecycle of a social media user account, a continuum that the EDPB breaks down into the following 5 stages: (1) opening a social media account; (2) staying informed on social media; (3) staying protected on social media; (4) exercising personal data rights on social media; and (5) leaving a social media account.

Continue Reading EDPB Publishes Draft Guidelines on the Use of “Dark Patterns” in Social Media Interfaces

In his State of the Union address last week, President Biden declared that he wants to “strengthen privacy protections, ban targeted advertising to children, and demand tech companies stop collecting personal data on our children.”  This statement comes just a couple of weeks after Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) introduced the Kids Online Safety Act…