On April 28, 2022, Covington convened experts across our practice groups for the Covington Robotics Forum, which explored recent developments and forecasts relevant to industries affected by robotics.  Sam Jungyun Choi, Associate in Covington’s Technology Regulatory Group, and Anna Oberschelp, Associate in Covington’s Data Privacy & Cybersecurity Practice Group, discussed global regulatory trends that affect robotics, highlights of which are captured here.  A recording of the forum is available here until May 31, 2022.

Trends on Regulating Artificial Intelligence

According to the Organization for Economic Cooperation and Development (“OECD”) Artificial Intelligence Policy Observatory, at least 60 countries have adopted some form of AI policy since 2017, a surge of government activity that has nearly kept pace with the adoption of AI itself.  Countries around the world are establishing governmental and intergovernmental strategies and initiatives to guide the development of AI.  These initiatives include: (1) AI regulation or policy; (2) AI enablers (e.g., research and public awareness); and (3) financial support (e.g., procurement programs for AI R&D).  The anticipated introduction of AI regulations raises concerns about looming challenges for international cooperation.

Continue Reading Robotics Spotlight: Global Regulatory Trends Affecting Robotics

Technology equity markets took a sharp turn in the last two months of Q1 2022, with the S&P Technology Index down more than 18% in mid-March before closing the quarter down 7%.  Over the last month, Russia’s attack on Ukraine has rattled markets across all sectors and dented investor appetite amid increased volatility and uncertainty.  The decline in valuations reflects the combined headwinds of rising inflation, rising interest rates, and geopolitical uncertainty.

Russia’s invasion of Ukraine triggered an unprecedented phenomenon: global technology firms responded to the invasion by suspending or terminating business operations, effectively self-sanctioning beyond regulatory requirements, often at great expense to their bottom lines.  This trend will likely continue: in 2022, decisions about where to invest and from whom to accept investment will be driven by ethical concerns as well as shifting geopolitical risks.  However, as we will see in this article, many tech businesses struggle to fully abandon their presence in Russia.

This article highlights some of the ways in which the Ukraine crisis is changing tech M&A.

Expanded Scope of Due Diligence

As tech companies embark on M&A deals, proactive and effective risk management will be more essential than ever.  Enhanced focus on these issues is likely to translate into longer transaction timelines.

Continue Reading Ukraine Crisis: Changing M&A Transactions for Technology Companies

On May 19, the Federal Trade Commission (“FTC”) adopted, on a unanimous basis, a policy statement reminding educational technology vendors (“ed tech vendors”) of their duty to comply with the substantive privacy protections of the Children’s Online Privacy Protection Act (“COPPA”) and the Commission-issued COPPA Rule.  The policy statement reiterates the requirements of the Rule and previous informal guidance from Commission staff, and makes clear that ed tech vendors may not subject children to commercial surveillance and data monetization practices when using technology in the classroom.

The FTC’s COPPA Rule, which became effective in 2000 and was most recently amended in 2013, is intended to place parents in control over the information collected from their children online.  A major component of the Rule is that commercial online operators must (1) provide parents with notice of data collection and (2) obtain parental consent before the collection of personal information of children under age 13.

Recognizing the unique benefits of ed tech, the new policy statement reminds ed tech vendors that their compliance with the Rule extends beyond the notice and consent requirement.  Specifically, the FTC intends to scrutinize the activities of ed tech vendors in the following areas:

Continue Reading FTC Unanimously Adopts Policy Statement on Education Technology and COPPA

On April 28, 2022, Covington convened experts across our practice groups for the Covington Robotics Forum, which explored recent developments and forecasts relevant to industries affected by robotics.  One segment of the Robotics Forum covered risks of automation and AI, highlights of which are captured here.  A full recording of the Robotics Forum is available here until May 31, 2022.

As AI and robotics technologies mature, their use cases are expected to expand into increasingly complex areas and to pose new risks.  Because lawsuits have settled before courts could decide liability questions, no case law yet exists to identify where liability rests among robotics engineers, AI designers, and manufacturers.  Scholars and researchers have proposed addressing these issues through products liability and discrimination doctrines, including the creation of new legal remedies specific to AI technology and particular use cases, such as self-driving cars.  Proposed approaches for liability through existing doctrines have included:

Continue Reading Robotics Spotlight: Risks of Automation and AI

On April 28, 2022, Covington convened experts across our practice groups for the Covington Robotics Forum, which explored recent developments and forecasts relevant to industries affected by robotics.  Winslow Taub, Partner in Covington’s Technology Transactions Practice Group, and Jennifer Plitsch, Chair of Covington’s Government Contracts Practice Group, discussed the robotics issues presented in private transactions and government contracts, highlights of which are captured here.  A recording of the forum is available here until May 31, 2022.

A business in the robotics space may acquire, develop, and use a variety of technology assets, including specialized hardware, control software, and AI models that depend on large training data sets.  Understanding these assets, and the forms of IP protection available for them, is critical when engaging in a transaction in the robotics space—whether an M&A transaction, a commercial transaction, or a transaction with the government.

By way of example, in an M&A transaction, the seller may rely on disparate sets of data (including data acquired from its customers) in developing robotics products.  Verifying that the seller has sufficient rights to that data is a critical part of diligence.  The data is often heavily processed to make it useful for a robotics application.  Because the data itself is generally not protectable under patent and copyright law, it is important to verify that adequate contractual protections are in place to secure the seller’s exclusive use.

As another example, in commercial agreements for the deployment of robotics technology, it is important to take special care in negotiating the rights and obligations during the “support” phase of the project, after the solution is put into operation.  The technology provider will often require access to data from the production environment in order to fix bugs and improve performance—including sets of production data that can be used to further train the relevant AI models.

The U.S. Government frequently collaborates in the development of specialized technology, or tests or procures finished products from the private sector.  The U.S. Government has specialized rules that must be carefully considered before entering into any transaction with it, as these rules are often complicated and can impose significant compliance obligations.  Collaboration and development agreements warrant particular attention because many standard government procurement requirements do not translate easily to new and emerging technologies.

U.S. Government rights in intellectual property and data related to government agreements should be carefully considered before entering into any U.S. Government agreement or acquiring any technology that has been funded by the U.S. Government.  If a company accepts U.S. Government funding to develop robotics technology, there may be significant intellectual property implications if patentable technology is conceived or first actually reduced to practice in the performance of the government agreement.  The inventor will hold the patent, but the U.S. Government will receive a license and potentially greater rights if certain reporting and other requirements are not met.  This issue can also arise for companies acquiring technologies that may have been developed with U.S. Government funding, as the Government’s rights remain in place after the transaction.

The U.S. Government also generally obtains unlimited rights in data first produced or delivered in the performance of a U.S. Government contract.  Companies should consider what data might be produced or delivered in the course of performing a government agreement, and what consequences could result from the U.S. Government receiving the right to use, share, and even publish that data.  This issue is particularly important given the potential impact on trade secret protections for data in which the U.S. Government receives unlimited rights.

We will provide additional updates about the topics covered in the 2022 Covington Robotics Forum and other developments related to robotics on our blog.  To learn more about our work related to this post, please visit the Technology Industry, Technology Transactions, and Government Contracts pages of our website.  For more information on developments related to AI, IoT, and connected and automated vehicles, please visit our AI Toolkit and our Internet of Things, Connected and Autonomous Vehicles, and Data Privacy and Cybersecurity websites.

In the early hours of Friday, 13 May, the European Parliament and the Council of the EU reached provisional political agreement on a new framework EU cybersecurity law, known as “NIS2”. This new law, which will replace the existing NIS Directive (which was agreed around the same time as the GDPR, see here), aims to strengthen EU-wide cybersecurity protection across a broader range of sectors, including the pharmaceutical sector, medical device manufacturing, and the food sector.

We set out background on NIS2 in prior blog posts (e.g., in relation to the original proposal in late 2020, see here, and more recently when the Council of the EU adopted an updated version in December 2021). Whilst we are still waiting for the provisionally agreed text to be released, a few points are worth mentioning from this latest agreement:

  • Clearer delineation of scope. NIS2 will apply only to entities that meet certain size thresholds in the prescribed sectors, namely:
    • “essential entities” meaning those operating in the following sectors: energy; transport; banking; financial market infrastructures; health (including the manufacture of pharmaceutical products); drinking water; waste water; digital infrastructure (internet exchange points; DNS providers; TLD name registries; cloud computing service providers; data centre service providers; content delivery networks; trust service providers; and public electronic communications networks and electronic communications services); public administration; and space; and
    • “important entities”, meaning those operating in the following sectors: postal and courier services; waste management; chemicals; food; manufacturing of medical devices, computers and electronics, machinery equipment, motor vehicles; and digital providers (online market places, online search engines, and social networking service platforms).
Continue Reading Political Agreement Reached on New EU Horizontal Cybersecurity Directive

On May 10, 2022, Prince Charles announced in the Queen’s Speech that the UK Government’s proposed Online Safety Bill (the “OSB”) will proceed through Parliament. The OSB is currently at committee stage in the House of Commons. Since it was first announced in December 2020, the OSB has been the subject of intense debate and scrutiny on the balance it seeks to strike between online safety and protecting children on the one hand, and freedom of expression and privacy on the other.

To what services does the OSB apply?

The OSB applies to “user-to-user” (“U2U”) services—essentially, services through which users can share content online, such as social media and online messaging services—and “search” services. The OSB specifically excludes email services, SMS, “internal business services,” and services where the communications functionality is limited (e.g., to posting comments relating to content produced by the provider of the service). The OSB also excludes “one-to-one live aural communications”—suggesting that one-to-one over-the-top (“OTT”) calls are excluded, but that one-to-many OTT calls, or video calls, may fall within scope.

Continue Reading Online Safety Bill to Proceed Through Parliament

Congress launched the Conference Committee on Bipartisan Innovation and Competition Legislation last week with a four-hour meeting featuring remarks by nearly one hundred committee chairs and members from both chambers of Congress. The Conference Committee, chaired by Senator Maria Cantwell (D-WA), aims to reconcile differences between the United States Innovation and Competition Act (“USICA”), which passed the Senate by a bipartisan vote of 68–32 in June 2021, and the America Creating Opportunities to Meaningfully Promote Excellence in Technology, Education, and Science Act (“America COMPETES Act”), which passed the House by a partisan vote of 222–210 in February 2022.

The kick-off meeting suggested that this objective is attainable, but by no means guaranteed.

On display was broad consensus that the United States is not doing enough to spur innovation and remain competitive around the world, and that legislation is needed in support of those goals.  Chair Cantwell opened the conference by recognizing that this is a “historic day” amid a supply chain crisis and calling this a “Sputnik moment.”  A bicameral and bipartisan chorus, including Senate Commerce Committee Ranking Member Roger Wicker (R-MS), House Science Committee Chair Eddie Bernice Johnson (D-TX), and House Science Committee Ranking Member Frank Lucas (R-OK), echoed her optimism and urgency.

Members also generally agreed on several key components in the bills.  Members of both chambers and both sides of the aisle recognized the importance of anchoring supply chains of critical products, including semiconductors and pharmaceutical drugs, in the United States.  A bipartisan group expressed support for the $52 billion in funding for semiconductor incentives that is included in both the USICA and the America COMPETES Act.  Several Democrats and Republicans also noted that they are working together on an additional tax provision, currently not in either bill, to encourage semiconductor design and manufacturing in the United States.  Members also agreed on the need to push back against anti-competitive conduct by China, such as cyberattacks and intellectual property theft, and to invest in science, technology, engineering, and mathematics (STEM) education to expand and improve the U.S. workforce.

Continue Reading Congress Kicks Off Conference Committee on Bipartisan Innovation and Competition Legislation

Northern Ireland’s 30 years of ‘Troubles’ were brought to an end by the 1998 Good Friday Agreement (the GFA). The GFA was based on the principle of cross-community support from both nationalists and unionists: a delicate compromise which sought a middle path between the Unionists – who see N Ireland as an integral part of the UK – and the Nationalists – who view the future of N Ireland as lying in reunification with the Republic.

The success of the GFA was underpinned by the fact that both the UK and the Republic of Ireland were in the EU.  Whilst both countries were members of the EU, there was no need for a border between N Ireland and the Republic – goods and services could flow unimpeded across the border.  Leaving the EU required a bespoke solution for N Ireland – one that respected the GFA and did not reimpose a physical border between N Ireland and the Republic: a visible manifestation of a divided island.

Squaring the circle of respecting the GFA, whilst taking the UK as a whole out of the EU, was always the most complicated part of Brexit. With the UK outside the EU, a customs border would be required somewhere: it could not be between N Ireland and the Republic, because of the need to respect the GFA and avoid antagonizing the Nationalist community. The only place that border could be, therefore, was in the Irish Sea between N Ireland and the rest of GB – which risked irritating the Unionist community.

The Northern Ireland Protocol

The solution to this delicate balancing act was the Northern Ireland Protocol (the NIP), which left N Ireland in the EU Single Market, but brought it out of the Customs Union, enabling N Ireland to have the best of both worlds, with one foot in the UK and the other in the EU.  However, the NIP imposed checks on goods (especially food and medicine) from GB arriving into N Ireland, to ensure they complied with EU standards and avoid the risk of them leaking into the EU Single Market through the back door: these checks have so far been unilaterally postponed by the UK.

Elections add to the complexity…

Continue Reading The UK and the Northern Ireland Protocol (again!)

Most observers expect the Republicans to take control of the House of Representatives, and possibly the Senate, in the upcoming midterm elections.  While both Democrats and Republicans are likely to keep their attention on the actions of so-called “Big Tech,” this political shift should bring a renewed focus on amending Section 230 of the Communications Decency Act.  Section 230, which provides platforms with immunity from liability for third-party content and content-moderation decisions, has been a target for lawmakers seeking to limit the power of large technology companies.  Republicans have generally focused more on modifying Section 230, while Democrats have spent more energy on using antitrust legislation to regulate those platforms.

Looking ahead, now is the time to consider policies and plans in light of a Republican-controlled Congress taking on potentially divisive issues through the lens of Section 230.

Republicans, Conservatives, and Section 230

Two trends will guide Republicans’ approach to Section 230 in the next Congress.  First, as in many areas, Republicans will seek to address what they see as “woke capitalism.”  New York Times columnist Ross Douthat coined the term in 2018 and defined it as a “certain kind of virtue-signaling on progressive social causes, a certain degree of performative wokeness, [that] is offered to liberalism and the activist left pre-emptively, in hopes that having corporate America take their side in the culture wars will blunt efforts to tax or regulate our new monopolies too heavily.”

Republicans are already planning a variety of legislative and oversight maneuvers meant to address corporations taking certain positions on cultural issues.  Technology companies may very well be at the top of Republicans’ list.

Second, conservatives increasingly view liberals as having abandoned their commitment to free speech.  For example, Republicans view the Hunter Biden laptop controversy, campus speech codes, and social media content moderation as part of a broader effort to silence and marginalize conservatives.  Simply put, conservatives believe that they are now the defenders of free speech.

Continue Reading Section 230 in a Republican Congress