On 31 May 2023, at the close of the fourth meeting of the US-EU Trade & Tech Council (“TTC”), Margrethe Vestager – the European Union’s Executive Vice President, responsible for competition and digital strategy – announced that the EU and US are working together to develop a voluntary AI Code of Conduct in advance of
On 11 May 2023, members of the European Parliament’s internal market (IMCO) and civil liberties (LIBE) committees agreed their final text on the EU’s proposed AI Act. After MEPs formalize their position through a plenary vote (expected this summer), the AI Act will enter the last stage of the legislative process: “trilogue” negotiations with the…
Facial recognition technology (“FRT”) has attracted a fair amount of attention over the years, including in the EU (e.g., see our posts on the European Parliament vote and CNIL guidance), the UK (e.g., ICO opinion and High Court decision) and the U.S. (e.g., Washington state and NTIA guidelines). This post summarizes two recent developments in this space: (i) the UK Information Commissioner’s Office (“ICO”)’s announcement of a £7.5-million fine and enforcement notice against Clearview AI (“Clearview”), and (ii) the EDPB’s release of draft guidelines on the use of FRT in law enforcement.
I. ICO Fines Clearview AI £7.5m
In the past year, Clearview has been subject to investigations into its data processing activities by the French and Italian authorities, and to a joint investigation by the ICO and the Australian Information Commissioner. All four regulators held that Clearview's processing of biometric data, derived from a database of over 20 billion facial images scraped from across the internet, including from social media sites, breached data protection laws.
On 26 May 2022, the ICO released its monetary penalty notice and enforcement notice against Clearview. The ICO concluded that Clearview's activities infringed a number of provisions of the GDPR and UK GDPR, including:
- Failing to process data in a way that is fair and transparent under Article 5(1)(a) GDPR. The ICO concluded that people were not made aware of, and would not reasonably expect, their images to be scraped, added to a worldwide database, and made available to a wide range of customers for the purpose of matching images on the company's database.
- Failing to process data in a way that is lawful under the GDPR. The ICO ruled that Clearview’s processing did not meet any of the conditions for lawful processing set out in Article 6, nor, for biometric data, in Article 9(2) GDPR.
- Failing to have a data retention policy and thus being unable to ensure that personal data are not retained for longer than necessary under Article 5(1)(e) GDPR. There was no indication as to when (or whether) any images are ever removed from Clearview’s database.
- Failing to provide data subjects with the necessary information under Article 14 GDPR. According to the ICO’s investigation, the only way in which data subjects could obtain that information was by contacting Clearview and directly requesting it.
- Impeding the exercise of data subject rights under Articles 15, 16, 17, 21 and 22 GDPR. In order to exercise these rights, data subjects had to provide Clearview with additional personal data, in the form of a photograph of themselves that could be matched against Clearview's database.
- Failing to conduct a Data Protection Impact Assessment (“DPIA”) under Article 35 GDPR. The ICO found that Clearview failed at any time to conduct a DPIA in respect of its processing of the personal data of UK residents.
On 22 September 2021, the UK Government published its 10-year strategy on artificial intelligence (“AI”; the “UK AI Strategy”).
The UK AI Strategy has three main pillars: (1) investing and planning for the long-term requirements of the UK’s AI ecosystem; (2) supporting the transition to an AI-enabled economy across all sectors and regions…
- Conditions for reuse of public sector data that is subject to existing protections, such as commercial confidentiality, intellectual property, or data protection;
- Obligations on “providers of data sharing services,” defined as entities that provide various types of data intermediary services;
- Introduction of the concept of “data altruism” and the possibility for organisations to register as a “Data Altruism Organisation recognised in the Union”; and
- Establishment of a “European Data Innovation Board,” a new formal expert group chaired by the Commission.
Conditions for reuse of public sector data (Chapter II, Articles 3-8)
Chapter II of the Data Governance Act would impose conditions on public-sector bodies when they make certain protected data that they hold available for re-use. These provisions apply to data held by public-sector bodies that are protected on grounds of commercial or statistical confidentiality, intellectual property rights, or personal data protection. The Act does not impose new obligations on public-sector bodies to allow re-use of data and does not release them from their existing legal obligations with respect to data. But if public-sector bodies do make protected data available for re-use, they must comply with the conditions set out in Chapter II.
Specifically, the Act prohibits public-sector bodies from granting exclusive rights in data or restricting the availability of data for re-use by entities other than the parties to such exclusive agreements, with limited derogations. In addition, if a public-sector body grants or refuses access for the re-use of data, it must ensure that the conditions for such access (or refusal) are non-discriminatory, proportionate, and objectively justified, and must make those conditions publicly available. The Act also provides that public bodies “shall” impose conditions “that preserve the functioning of the technical systems” used to process such data, and authorizes the Commission to adopt implementing acts declaring that third countries to which such data may be transferred provide IP and trade secret protections that are “essentially equivalent” to those in the EU.
In addition, where specific EU acts establish that certain non-personal data categories held by public-sector bodies are “highly sensitive,” such data may be subject to restrictions on cross-border transfers, as specified by the Commission through delegated acts.
Obligations on “providers of data sharing services” (Chapter III, Articles 9-14)
Chapter III of the Act introduces new rules for the operation of data intermediaries, termed “providers of data sharing services”. Specifically, it would establish a notification and compliance framework for providers of the following data sharing services:
- Intermediation services between data holders and data users, which include platforms or databases enabling the exchange or joint exploitation of data, such as industry data spaces;
- Intermediation services between data subjects that seek to make their personal data available and potential data users; and
- “Data cooperative” services that support individuals or SMEs in negotiating terms and conditions for data processing.
The Act sets out several requirements with which providers of these data sharing services would need to comply, including:
- Notifying the relevant EU Member State authority of its intent to provide such services;
- Appointing a legal representative in one of the Member States, if the company is not established within the EU;
- Not using the data collected for purposes other than making them available to data users, and using any metadata only for the development of the data sharing service;
- Placing its data sharing service in a “separate legal entity” from its other services;
- Having in place adequate security safeguards; and
- Owing a fiduciary duty to data subjects, requiring the provider to act in their best interests.