
Mark Young, an experienced tech regulatory lawyer, advises major global companies on their most challenging data privacy compliance matters and investigations.

Mark also leads on EMEA cybersecurity matters at the firm. He advises on evolving cyber-related regulations, and helps clients respond to incidents, including personal data breaches, IP and trade secret theft, ransomware, insider threats, and state-sponsored attacks.

Mark has been recognized in Chambers UK for several years as "a trusted adviser - practical, results-oriented and an expert in the field;" "fast, thorough and responsive;" "extremely pragmatic in advice on risk;" and having "great insight into the regulators."

Drawing on over 15 years of experience advising global companies on a variety of tech regulatory matters, Mark specializes in:

  • Advising on potential exposure under the GDPR and international data privacy laws in relation to innovative products and services that involve cutting-edge technology (e.g., AI, biometric data, and Internet-enabled devices).
  • Providing practical guidance on novel uses of personal data, responding to individuals exercising rights, and data transfers, including advising on Binding Corporate Rules (BCRs) and compliance challenges following Brexit and Schrems II.
  • Helping clients respond to investigations by data protection regulators in the UK, EU and globally, and advising on potential follow-on litigation risks.
  • GDPR and international data privacy compliance for life sciences companies in relation to:
    • clinical trials and pharmacovigilance;
    • digital health products and services; and
    • marketing programs.
  • International conflict of law issues relating to white collar investigations and data privacy compliance.
  • Cybersecurity issues, including:
    • best practices to protect business-critical information and comply with national and sector-specific regulation;
    • preparing for and responding to cyber-based attacks and internal threats to networks and information, including training for board members;
    • supervising technical investigations; advising on PR, engagement with law enforcement and government agencies, notification obligations and other legal risks; and representing clients before regulators around the world; and
    • advising on emerging regulations, including during the legislative process.
  • Advising clients on risks and potential liabilities in relation to corporate transactions, especially involving companies that process significant volumes of personal data (e.g., in the adtech, digital identity/anti-fraud, and social network sectors).
  • Providing strategic advice and advocacy on a range of EU technology law reform issues including data privacy, cybersecurity, ecommerce, eID and trust services, and software-related proposals.
  • Representing clients in connection with references to the Court of Justice of the EU.

Facial recognition technology (“FRT”) has attracted a fair amount of attention over the years, including in the EU (e.g., see our posts on the European Parliament vote and CNIL guidance), the UK (e.g., ICO opinion and High Court decision) and the U.S. (e.g., Washington state and NTIA guidelines). This post summarizes two recent developments in this space: (i) the UK Information Commissioner’s Office (“ICO”)’s announcement of a £7.5-million fine and enforcement notice against Clearview AI (“Clearview”), and (ii) the EDPB’s release of draft guidelines on the use of FRT in law enforcement.

I. ICO Fines Clearview AI £7.5m

In the past year, Clearview has been subject to investigations into its data processing activities by the French and Italian authorities, and a joint investigation by the ICO and the Australian Information Commissioner. All four regulators held that Clearview’s processing of biometric data scraped from over 20 billion facial images from across the internet, including from social media sites, breached data protection laws.

On 26 May 2022, the ICO released its monetary penalty notice and enforcement notice against Clearview. The ICO concluded that Clearview’s activities infringed a number of provisions of the GDPR and UK GDPR, including:

  • Failing to process data in a way that is fair and transparent under Article 5(1)(a) GDPR. The ICO concluded that people were not made aware or would not reasonably expect their images to be scraped, added to a worldwide database, and made available to a wide range of customers for the purpose of matching images on the company’s database.
  • Failing to process data in a way that is lawful under the GDPR. The ICO ruled that Clearview’s processing did not meet any of the conditions for lawful processing set out in Article 6, nor, for biometric data, in Article 9(2) GDPR.
  • Failing to have a data retention policy and thus being unable to ensure that personal data are not retained for longer than necessary under Article 5(1)(e) GDPR. There was no indication as to when (or whether) any images are ever removed from Clearview’s database.
  • Failing to provide data subjects with the necessary information under Article 14 GDPR. According to the ICO’s investigation, the only way in which data subjects could obtain that information was by contacting Clearview and directly requesting it.
  • Impeding the exercise of data subject rights under Articles 15, 16, 17, 21 and 22 GDPR. In order to exercise these rights, data subjects needed to provide Clearview with additional personal data, in the form of a photograph of themselves that could be matched against Clearview’s database.
  • Failing to conduct a Data Protection Impact Assessment (“DPIA”) under Article 35 GDPR. The ICO found that Clearview failed at any time to conduct a DPIA in respect of its processing of the personal data of UK residents.


Continue Reading Facial Recognition Update: UK ICO Fines Clearview AI £7.5m & EDPB Adopts Draft Guidelines on Use of FRT by Law Enforcement

In the early hours of Friday, 13 May, the European Parliament and the Council of the EU reached provisional political agreement on a new framework EU cybersecurity law, known as “NIS2”. This new law, which will replace the existing NIS Directive (which was agreed around the same time as the GDPR, see here), aims to strengthen EU-wide cybersecurity protection across a broader range of sectors, including the pharmaceutical sector, medical device manufacturing, and the food sector.

We set out background on NIS2 in prior blog posts (e.g., in relation to the original proposal in late 2020, see here, and more recently when the Council of the EU adopted an updated version in December 2021). Whilst we are still waiting for the provisionally agreed text to be released, a few points are worth mentioning from this latest agreement:

  • Clearer delineation of scope. NIS2 will only apply to entities that meet certain size thresholds in the prescribed sectors, namely:
    • “essential entities” meaning those operating in the following sectors: energy; transport; banking; financial market infrastructures; health (including the manufacture of pharmaceutical products); drinking water; waste water; digital infrastructure (internet exchange points; DNS providers; TLD name registries; cloud computing service providers; data centre service providers; content delivery networks; trust service providers; and public electronic communications networks and electronic communications services); public administration; and space; and
    • “important entities”, meaning those operating in the following sectors: postal and courier services; waste management; chemicals; food; manufacturing of medical devices, computers and electronics, machinery equipment, motor vehicles; and digital providers (online market places, online search engines, and social networking service platforms).


Continue Reading Political Agreement Reached on New EU Horizontal Cybersecurity Directive

As many readers will be aware, a key enforcement trend in the privacy sphere is the increasing scrutiny by regulators and activists of cookie banners and the use of cookies. This is a topic that we have been tracking on the Inside Privacy blog for some time. Italian and German data protection authorities have

On 22 September 2021, the UK Government published its 10-year strategy on artificial intelligence (“AI”; the “UK AI Strategy”).

The UK AI Strategy has three main pillars: (1) investing and planning for the long-term requirements of the UK’s AI ecosystem; (2) supporting the transition to an AI-enabled economy across all sectors and regions

On 25 November 2020, the European Commission published a proposal for a Regulation on European Data Governance (“Data Governance Act”).  The proposed Act aims to facilitate data sharing across the EU and between sectors, and is one of the deliverables included in the European Strategy for Data, adopted in February 2020.  (See our previous blog here for a summary of the Commission’s European Strategy for Data.)  The press release accompanying the proposed Act states that more specific proposals on European data spaces are expected to follow in 2021, and will be complemented by a Data Act to foster business-to-business and business-to-government data sharing.

The proposed Data Governance Act sets out rules relating to the following:
  • Conditions for reuse of public sector data that is subject to existing protections, such as commercial confidentiality, intellectual property, or data protection;
  • Obligations on “providers of data sharing services,” defined as entities that provide various types of data intermediary services;
  • Introduction of the concept of “data altruism” and the possibility for organisations to register as a “Data Altruism Organisation recognised in the Union”; and
  • Establishment of a “European Data Innovation Board,” a new formal expert group chaired by the Commission.

Conditions for reuse of public sector data (Chapter II, Articles 3-8)

Chapter II of the Data Governance Act would impose conditions on public-sector bodies when they make certain protected data that they hold available for re-use.  These provisions apply to data held by public-sector bodies that are protected on grounds of commercial or statistical confidentiality, intellectual property rights, or personal data protection.  The Act does not impose new obligations on public-sector bodies to allow re-use of data and does not release them from their existing legal obligations with respect to data.  But if public-sector bodies do make protected data available for re-use, they must comply with the conditions set out in Chapter II.

Specifically, the Act prohibits public-sector bodies from granting exclusive rights in data or restricting the availability of data for re-use by entities other than the parties to such exclusive agreements, with limited derogations.  In addition, if a public-sector body grants or refuses access for the re-use of data, it must ensure that the conditions for such access (or refusal) are non-discriminatory, proportionate, and objectively justified, and must make those conditions publicly available. The Act also provides that public bodies “shall” impose conditions “that preserve the functioning of the technical systems” used to process such data, and authorizes the Commission to adopt implementing acts declaring that third countries to which such data may be transferred provide IP and trade secret protections that are “essentially equivalent” to those in the EU.

In addition, where specific EU acts establish that certain non-personal data categories held by public-sector bodies are “highly sensitive,” such data may be subject to restrictions on cross-border transfers, as specified by the Commission through delegated acts.

Obligations on “providers of data sharing services” (Chapter III, Articles 9-14)

Chapter III of the Act introduces new rules for the operation of data intermediaries, termed “providers of data sharing services”.  Specifically, it would establish a notification and compliance framework for providers of the following data sharing services:

  • Intermediation services between data holders and data users, which include platforms or databases enabling the exchange or joint exploitation of data, such as industry data spaces;
  • Intermediation services between data subjects that seek to make their personal data available and potential data users; and
  • “Data cooperative” services that support individuals or SMEs to negotiate terms and conditions for data processing.

The Act sets out several requirements that providers of these data sharing services would need to comply with, including:

  • Notifying the relevant EU Member State authority of its intent to provide such services;
  • Appointing a legal representative in one of the Member States, if the company is not established within the EU;
  • Not using the data collected for other purposes, and using any metadata only for the development of that service;
  • Placing its data sharing service in a “separate legal entity” from its other services;
  • Having in place adequate security safeguards; and
  • Imposing a fiduciary duty towards data subjects to act in their best interests.


Continue Reading The European Commission publishes a proposal for a Regulation on European Data Governance (the Data Governance Act)

Yesterday, the Court of Justice of the European Union (the “CJEU”) invalidated the European Commission’s Decision on the EU-U.S. Safe Harbor arrangement (Commission Decision 2000/520 – see here). The Court responded to pre-judicial questions put forward by the Irish High Court in the so-called Schrems case. More specifically, the High Court had enquired, in
