Kristof Van Quathem

Kristof Van Quathem advises clients on information technology matters and policy, with a focus on data protection, cybercrime and various EU data-related initiatives, such as the Data Act, the AI Act and the EHDS.

Kristof has been specializing in this area for over twenty years and has developed particular experience in the life sciences and information technology sectors. He counsels clients on government affairs strategies concerning EU lawmaking and on compliance with applicable regulatory frameworks, and has represented clients in non-contentious and contentious matters before data protection authorities, national courts and the Court of Justice of the EU.

Kristof is admitted to practice in Belgium.

In early March 2024, the EU lawmakers reached agreement on the European Health Data Space (EHDS).  For now, we only have a work-in-progress draft version of the text, but a number of interesting points can already be highlighted.  This article focuses on the obligations of data holders; for an overview of the EHDS generally, see our first post in this series.

We expect the final text of the EHDS to be adopted by the European Parliament in April 2024 and by the EU Member States shortly thereafter.

1: Health data holder

The term “health data holder” includes, among others, any natural or legal person developing products or services intended for health, developing or manufacturing wellness applications, or performing research in relation to healthcare, who:

  • in relation to personal electronic health data: in its capacity of a data controller has the right or obligation to process the health data, including for research and innovation purposes; or
  • in relation to non-personal electronic health data: has the ability to make the data available through control of the technical design of a product and related services.  These terms appear to be taken from the Data Act, but they are not defined under the EHDS.

In practice, this means that, for example, hospitals, as data controllers, are data holders of their electronic health records.  Similarly, pharmaceutical companies are data holders of clinical trial data and biobanks.  Medical device companies may be data holders of non-personal data generated by their devices, if they have access to that data and the ability to make it available.  However, medical device companies would not qualify as data holders where they merely process personal electronic health data on behalf of a hospital.

Individual researchers and micro enterprises are not data holders, unless EU Member States decide differently for their territory.

2: Data sets covered by EHDS

The EHDS sets out a long list of covered electronic health data that should be made available for secondary use under the EHDS.  It includes, among others:

  • electronic health records;
  • human genetic data;
  • biobanks;
  • data from wellness applications;
  • clinical trial data – though according to the recitals, this only applies when the trial has ended;
  • medical device data;
  • data from registries; and
  • data from research cohorts and surveys, after the first publication of the results – a qualifier that does not seem to apply for clinical trial data.

In December 2023, the Dutch SA fined a credit card company €150,000 for failure to perform a proper data protection impact assessment (“DPIA”) in accordance with Art. 35 GDPR for its “identification and verification process”.

First, the Dutch SA decided that the company was required to perform a DPIA because the processing met two of

EU Advocate General Collins has reiterated that individuals’ right to claim compensation for harm caused by GDPR breaches requires proof of “actual damage suffered” as a result of the breach, and “clear and precise evidence” of such damage – mere hypothetical harms or discomfort are insufficient. The Advocate General also found that unauthorised access to

On October 12, 2023, the Italian Data Protection Authority (“Garante”) published guidance on the use of AI in healthcare services (“Guidance”).  The document builds on principles enshrined in the GDPR and in national and EU case law.  Although the Guidance focuses on Italian national healthcare services, it offers considerations relevant to the use of AI in the healthcare

On August 22, 2023, the Spanish Council of Ministers approved the Statute of the Spanish Agency for the Supervision of Artificial Intelligence (“AESIA”), thus creating the first AI regulatory body in the EU.  The AESIA will start operating from December 2023, in anticipation of the upcoming EU AI Act (for a summary of the AI

On April 17, 2023, the Italian Supervisory Authority (“Garante”) published its decision against a company operating digital marketing services, finding several GDPR violations, including the use of so-called “dark patterns” to obtain users’ consent.  The Garante imposed a fine of €300,000.

We provide below a brief overview of the Garante’s key findings.

Background

The sanctioned company operated marketing campaigns on behalf of its clients, via text messages, emails and automated calls.  The company’s database of contacts was formed by data collected directly through its online portals (offering news, sweepstakes and trivia), as well as data purchased from data brokers.

Key Findings

Dark patterns.  The Garante found that, during the subscription process, the user was asked for specific consent relating to marketing purposes and sharing of data with third parties for marketing.  If the user did not select either of the checkboxes, a banner would pop up, indicating the lack of consent and displaying a prominent consent button.  The site also displayed a “continue without accepting” option, but this was placed at the bottom of the webpage – outside of the pop-up banner – in simple text form and smaller font size, which made it less visible than the “consent” button.  The Garante, referring to the EDPB’s guidelines (see our blog post here), held that the use of such interfaces and graphic elements constituted “dark patterns” aimed at pushing individuals towards providing consent.

Double opt-in.  The Garante noted that consent was not adequately documented.  While the company argued that it required a “double opt-in”, the evidence showed that a confirmation request was not consistently sent to users.  The Garante recalled that double opt-in is not a mandatory requirement in Italy, but nonetheless constitutes an appropriate method of documenting consent.

On 24 January 2023, the Italian Supervisory Authority (“Garante”) announced that it fined three hospitals €55,000 each for their unlawful use of an artificial intelligence (“AI”) system for risk stratification purposes, i.e., to systematically categorize patients based on their health status.  The Garante also ordered the hospitals to erase all the data

On December 28, 2022, the Spanish Data Protection Authority (“AEPD”) published a statement on the interplay between its recently approved Spanish code of conduct for the pharmaceutical industry and the European Federation of Pharmaceutical Industries and Associations’ (“EFPIA”) proposal for an EU code of conduct on clinical trials and pharmacovigilance.  The statement relates specifically to

The German Conference of Independent Supervisory Authorities (“DSK”) published on March 23, 2022 a statement on scientific research and data protection (see here, in German).  The DSK published the statement in response to the German Government’s initiative on a general law on research data as part of its Open Data Strategy, announced