On May 10, 2022, Prince Charles announced in the Queen’s Speech that the UK Government’s proposed Online Safety Bill (the “OSB”) would proceed through Parliament. The OSB is currently at committee stage in the House of Commons. Since it was first announced in December 2020, the OSB has been the subject of intense debate and scrutiny over the balance it seeks to strike between online safety and protecting children on the one hand, and freedom of expression and privacy on the other.

To what services does the OSB apply?

The OSB applies to “user-to-user” (“U2U”) services—essentially, services through which users can share content online, such as social media and online messaging services—and “search” services. The OSB specifically excludes email services, SMS, “internal business services,” and services where the communications functionality is limited (e.g., to posting comments relating to content produced by the provider of the service). The OSB also excludes “one-to-one live aural communications”—suggesting that one-to-one over-the-top (“OTT”) calls are excluded, but that one-to-many OTT calls, or video calls, may fall within scope.

The OSB will apply to any U2U or search service that has “links” to the UK—meaning that: (1) the service has a significant number of UK users; (2) UK users form one of the target markets for the service; or (3) the service is capable of being used in the UK and there are reasonable grounds to believe that there is a material risk of significant harm to individuals arising from its use.

What types of content does the OSB regulate?

One of the most notable, and debated, aspects of the OSB is that it seeks to regulate not only illegal content, but also content that is legal but “harmful.”

Illegal Content

Content (broadly meaning words, images, speech, or sounds) will be considered illegal where it amounts to a “relevant offence.” A “relevant offence” includes a terrorism offence, an offence relating to child sexual exploitation or abuse (“CSEA”), and any offence where “the victim or intended victim is an individual.” As many offences have individuals as victims, this could sweep in a wide range of content.

The OSB also sets out a list of “priority offences,” which currently includes: assisting suicide; threats to kill; public order offences; harassment; stalking and fear or provocation of violence; drugs and psychoactive substances offences; firearms offences; assisting illegal immigration; sexual exploitation; sexual images offences; proceeds of crime offences; fraud; certain offences relating to financial services; and attempting or conspiring to commit any offence specified in this list. This list can be amended by the Secretary of State through secondary legislation.

Harmful Content

The OSB also regulates content that is legal but “harmful.” Content is “harmful to children” where it is defined as such by the Secretary of State through secondary legislation or where it “presents a material risk of significant harm to an appreciable number of children” in the UK. Content “that is harmful to adults” is defined in the same way, but in relation to adults.

Harm is defined as “physical or psychological harm.” The OSB specifies that harm can arise in the following circumstances:

  • where, as a result of the content, “individuals act in a way that results in harm to themselves or that increases the likelihood of harm to themselves”;
  • where, as a result of the content, “individuals do or say something to another individual that results in harm to that other individual or that increases the likelihood of such harm.”

Given the potential breadth of these definitions, it seems likely that they will be subject to Parliamentary scrutiny as the OSB moves through the legislative process.

What obligations does the OSB impose?

The OSB imposes various obligations—called “duties of care”—on service providers. A provider’s specific duties of care will depend on the nature of the service in question. For example, only the largest U2U services—known as “Category 1” services—will be subject to duties of care in respect of content that is harmful to adults. Similarly, only services that are “likely to be accessed by children” will be subject to duties of care in respect of content that is harmful to children. 

Broadly, however, all covered services will be required to conduct risk assessments and to take steps to mitigate the risks identified in those assessments; to take steps to prevent, mitigate and/or minimize the presence of certain content on their services; and to provide information to users on their content moderation practices. Providers will also be subject to duties to “have regard” to the importance of users’ rights to freedom of expression and privacy.

What are the penalties for non-compliance?

The OSB tasks Ofcom, the UK communications regulator, with enforcing these obligations. Ofcom can impose fines on covered providers of up to 10% of their “qualifying worldwide revenue,” and in extreme cases may seek to impose “business disruption measures,” such as service restriction orders and access restriction orders.

Notably, starting three months after the OSB receives Royal Assent, senior managers can be held personally liable where a covered provider commits an “information offence” and the senior manager has failed to take all reasonable steps to prevent that offence being committed.

Please reach out to a member of the Technology Regulatory team if you have any questions on the OSB. We have also been advising clients on the EU Digital Services Act and on how the two regimes are likely to intersect.

Lisa Peets

Lisa Peets leads the Technology Regulatory and Policy practice in the London office and is a member of the firm’s Management Committee. Lisa divides her time between London and Brussels, and her practice embraces regulatory counsel and legislative advocacy. In this context, she has worked closely with leading multinationals in a number of sectors, including many of the world’s best-known technology companies.

Lisa counsels clients on a range of EU law issues, including data protection and related regimes, copyright, e-commerce and consumer protection, and the rapidly expanding universe of EU rules applicable to existing and emerging technologies. Lisa also routinely advises clients in and outside of the technology sector on trade related matters, including EU trade controls rules.

According to the latest edition of Chambers UK (2022), “Lisa is able to make an incredibly quick legal assessment whereby she perfectly distils the essential matters from the less relevant elements.” “Lisa has subject matter expertise but is also able to think like a generalist and prioritise. She brings a strategic lens to matters.”

Marty Hansen

Martin Hansen has represented some of the world’s leading information technology, telecommunications, and pharmaceutical companies on a broad range of cutting edge international trade, intellectual property, and competition issues. Martin has extensive experience in advising clients on matters arising under the World Trade Organization agreements, treaties administered by the World Intellectual Property Organization, bilateral and regional free trade agreements, and other trade agreements.

Drawing on ten years of experience in Covington’s London and DC offices, Martin focuses his practice on helping innovative companies solve challenges on intellectual property and trade matters before U.S. courts, the U.S. government, and foreign governments and tribunals. Martin also represents software companies and a leading IT trade association on electronic commerce, Internet security, and online liability issues.

Shóna O’Donovan

Advising clients on a broad range of data protection, e-privacy and online content issues under EU, Irish, and UK law, Shóna O’Donovan works with her clients on technology regulatory and policy issues.
With multi-jurisdictional and in-house experience, Shóna advises global companies on complying with data protection laws in the EU. In particular, she represents organizations in regulatory investigations and inquiries, advises on children’s privacy issues and provides strategic advice on incident response. Shóna also advises clients on policy developments in online content and online safety.

In her current role, Shóna has gained experience on secondment to the data protection team of a global technology company. In a previous role, she spent seven months on secondment to the European data protection team of a global social media company.

Shóna’s recent pro bono work includes providing data protection advice to the International Aids Vaccine Initiative and a UK charity helping people with dementia, and working with an organization specializing in providing advice to states involved in conflict on documenting human rights abuses.

Tomos Griffiths

Tomos Griffiths is a trainee. He attended Durham University.