
Laura Somaini

Laura Somaini is an associate in the Data Privacy and Cybersecurity Practice Group.

Laura advises clients on EU data protection, e-privacy and technology law, including on Italian requirements. She regularly assists clients in relation to GDPR compliance, international data transfers and direct marketing rules, as well as data protection contracts and policies.

On May 30, 2024, the Court of Justice of the EU (“CJEU”) handed down its rulings in several cases (C-665/22, Joined Cases C‑664/22 and C‑666/22, C‑663/22, and Joined Cases C‑662/22 and C‑667/22) concerning the compatibility with EU law of certain Italian measures imposing obligations on providers of online platforms and search engines.  In doing so, the CJEU upheld the so-called “country-of-origin” principle, established in the EU’s e-Commerce Directive and based on the EU Treaties’ principle of free movement of services.  The country-of-origin principle gives the Member State where an online service provider is established exclusive authority (“competence”) to regulate access to, and exercise of, the provider’s services, and prevents other Member States from imposing additional requirements.

We provide below an overview of the Court’s key findings.

Background

The cases originate from proceedings brought by several online intermediation and search engine service providers (collectively, “providers”) against the Italian regulator for communications (“AGCOM”).  The providers, which are not established in Italy, challenged measures adopted by AGCOM designed to ensure the “adequate and effective enforcement” of the EU Platform-to-Business Regulation (“P2B Regulation”).  Among other things, those measures required the providers, depending on the case, to: (1) enter their business into a national register; (2) provide detailed information, including information about the company’s economic situation, ownership structure, and organization; and (3) pay a financial contribution to the regulator for the purposes of supporting its supervision activities. 

The Country-of-Origin Principle

In its rulings, the Court notes that the e-Commerce Directive’s country-of-origin principle relieves online service providers of having to comply with multiple Member State requirements falling within the so-called “coordinated field” (as defined in Article 2(h)-(i) of the e-Commerce Directive), that is, requirements concerning access to the service (such as qualifications, authorizations or notifications) and the provision of the service (such as the provider’s behavior, or the quality or content of the services).

Member States other than the one where the service provider is established cannot restrict the freedom to provide such online services for reasons falling within the coordinated field, unless certain conditions are met.  In particular, measures may be taken where necessary for reasons of public policy, protection of public health, public security, or the protection of consumers, among other conditions (Article 3(4) of the e-Commerce Directive).

Continue Reading CJEU Upholds Country-of-Origin Principle for Online Service Providers in the EU

On May 20, 2024, a proposal for a law on artificial intelligence (“AI”) was laid before the Italian Senate.

The proposed law sets out (1) general principles for the development and use of AI systems and models; (2) sectorial provisions, particularly in the healthcare sector and for scientific research for healthcare; (3) rules on the national strategy on AI and governance, including designating the national competent authorities in accordance with the EU AI Act; and (4) amendments to copyright law. 

We provide below an overview of the proposal’s key provisions.

Objectives and General Principles

The proposed law aims to promote a “fair, transparent and responsible” use of AI, following a human-centered approach, and to monitor potential economic and social risks, as well as risks to fundamental rights.  The law will sit alongside and complement the EU AI Act (for more information on the EU AI Act, see our blogpost here).  (Article 1)

The proposed law sets out general principles, based on the principles developed by the Commission’s High-Level Expert Group on Artificial Intelligence, pursuing three broad objectives:

  1. Fair algorithmic processing. Research, testing, development, implementation and application of AI systems must respect individuals’ fundamental rights and freedoms, and the principles of transparency, proportionality, security, protection of personal data and confidentiality, accuracy, non-discrimination, gender equality and inclusion.
  2. Protection of data. The development of AI systems and models must be based on data and processes that are proportionate to the sectors in which they are intended to be used, and must ensure that data is accurate, reliable, secure, of high quality, appropriate and transparent.  Cybersecurity must be ensured throughout the systems’ lifecycle and specific security measures adopted.
  3. Digital sustainability. The development and implementation of AI systems and models must ensure human autonomy and decision-making, prevention of harm, transparency and explainability.  (Article 3)

Continue Reading Italy Proposes New Artificial Intelligence Law

On October 12, 2023, the Italian Data Protection Authority (“Garante”) published guidance on the use of AI in healthcare services (“Guidance”).  The document builds on principles enshrined in the GDPR and in national and EU case law.  Although the Guidance focuses on Italian national healthcare services, it offers considerations relevant to the use

Continue Reading Italian Garante Issues Guidance on the Use of AI in the Context of National Healthcare Services

On July 10, 2023, the European Commission adopted its adequacy decision on the EU-U.S. Data Privacy Framework (“DPF”). The decision, which took effect on the day of its adoption, concludes that the United States ensures an adequate level of protection for personal data transferred from the EEA to companies certified to the DPF. This blog post summarizes the key findings of the decision, what organizations wishing to certify to the DPF need to do and the process for certifying, as well as the impact on other transfer mechanisms such as the standard contractual clauses (“SCCs”), and on transfers from the UK and Switzerland.

Background

The Commission’s adoption of the adequacy decision follows three key recent developments:

  1. the endorsement of the draft decision by a committee of EU Member State representatives;
  2. the designation by the U.S. Department of Justice of the European Union and Iceland, Liechtenstein, and Norway (which together with the EU form the EEA) as “qualifying states,” for the purposes of President Biden’s Executive Order 14086 on Enhancing Safeguards for U.S. Signals Intelligence Activities (“EO 14086”). This designation enables EU data subjects to submit complaints concerning alleged violations of U.S. law governing signals intelligence activities to the redress mechanism set forth in the Executive Order and implementing regulations (see our previous blog post here); and
  3. updates to the U.S. Intelligence Community’s policies and procedures to implement the safeguards established under EO 14086, announced by the U.S. Office of the Director of National Intelligence on July 3, 2023.

The final adequacy decision, which largely corresponds to the Commission’s draft decision (see our prior blog post here), concludes “the United States … ensures a level of protection for personal data transferred from the Union to certified organisations in the United States under the EU-U.S. Data Privacy Framework that is essentially equivalent to the one guaranteed by [the GDPR]” (para. 201).

Key Findings of the Decision

In reaching the final decision, the Commission confirms a few key points:

Continue Reading European Commission Adopts Adequacy Decision on the EU-U.S. Data Privacy Framework

On April 17, 2023, the Italian Supervisory Authority (“Garante”) published its decision against a company operating digital marketing services, finding several GDPR violations, including the use of so-called “dark patterns” to obtain users’ consent.  The Garante imposed a fine of EUR 300,000.

We provide below a brief overview of the Garante’s key findings.

Background

The sanctioned company operated marketing campaigns on behalf of its clients, via text messages, emails and automated calls.  The company’s database of contacts was formed by data collected directly through its online portals (offering news, sweepstakes and trivia), as well as data purchased from data brokers.

Key Findings

Dark patterns.  The Garante found that, during the subscription process, the user was asked for specific consent relating to marketing purposes and the sharing of data with third parties for marketing.  If the user did not select either of the checkboxes, a banner would pop up, indicating the lack of consent and displaying a prominent consent button.  The site also displayed a “continue without accepting” option, but this was placed at the bottom of the webpage – outside of the pop-up banner – in simple text form and smaller font size, which made it less visible than the “consent” button.  The Garante, referring to the EDPB’s guidelines (see our blogpost here), held that the use of such interfaces and graphic elements constituted “dark patterns” aimed at pushing individuals towards providing consent.

Double opt-in.  The Garante noted that consent was not adequately documented.  While the company argued that it required a “double opt-in”, the evidence showed that a confirmation request was not consistently sent to users.  The Garante recalled that double opt-in is not a mandatory requirement in Italy, but nonetheless constitutes an appropriate method to document consent.

Continue Reading Italian Garante Fines Digital Marketing Company Over Use of Dark Patterns

On June 23, 2022, the Italian data protection authority (“Garante”) released a general statement (here) flagging the unlawfulness of data transfers to the U.S. resulting from the use of Google Analytics.  The Garante invites all Italian website operators, both public and private, to verify that the use of cookies and other tracking tools on their websites complies with data protection law, in particular with regard to the use of Google Analytics and similar services.

The Garante’s statement follows an order (here) issued against an Italian website operator to stop data transfers to Google LLC in the U.S., and joins other European data protection authorities in their actions relating to the use of Google Analytics (see our previous blogs here and here).

Below we summarize the Garante’s key considerations.

  • Google Analytics’ “IP Anonymization” feature

The Garante analyzes Google Analytics’ so-called “IP Anonymization” feature, which allows the transfer of user IP addresses to Google Analytics after masking the last octet of the IP address.  The Garante finds that this feature constitutes pseudonymization of the IP address, not anonymization.  According to the Garante, the feature does not prevent Google LLC from re-identifying the user, given Google’s ability to enrich such data with additional information it holds, especially where those users maintain and use a Google account.

Continue Reading Italian Garante bans use of Google Analytics
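For readers unfamiliar with the masking technique at issue, the minimal sketch below illustrates what zeroing the last octet of an IPv4 address looks like.  This is an illustrative assumption only, not Google’s actual implementation; the function name and logic are hypothetical, and the example simply shows why a truncated address can still single out a network and, combined with other data, a user.

```python
def mask_last_octet(ip_address: str) -> str:
    """Zero out the last octet of an IPv4 address, e.g. 203.0.113.42 -> 203.0.113.0.

    Illustrative sketch only (not Google's implementation). The remaining /24
    network prefix can still be combined with other information (such as a
    logged-in Google account) to re-identify a user, which is why the Garante
    treats this as pseudonymization rather than anonymization.
    """
    octets = ip_address.split(".")
    if len(octets) != 4:
        raise ValueError("expected an IPv4 address in dotted-quad form")
    octets[-1] = "0"
    return ".".join(octets)


print(mask_last_octet("203.0.113.42"))  # prints "203.0.113.0"
```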