Data Protection & Privacy 2026

Last Updated March 10, 2026

Italy

Law and Practice

Authors



ICT Legal Consulting (ICTLC) is an international law firm that provides strategic legal and regulatory support across privacy, data protection, intellectual property, and technology, media and telecommunications (TMT) law, with a strong focus on the normative and operational aspects of cybersecurity. The firm assists organisations in designing and implementing governance, compliance and security frameworks that meet the highest international standards. With over 80 professionals and a network active in more than 65 jurisdictions, ICTLC combines global co-ordination with local insight. Through its sister company ICT Cyber Consulting, the firm offers integrated cybersecurity services, including legal-technical risk assessments, resilience planning, and alignment with frameworks such as NIS2, DORA and the Cyber Resilience Act. ICTLC’s multidisciplinary expertise enables clients to navigate complex digital regulations and strengthen trust, compliance and resilience across their global operations.

The Italian regulatory framework on the protection of personal data and privacy is dictated by Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, repealing Directive 95/46/EC (GDPR). To the extent that such matters are not addressed by the GDPR, they are regulated by Legislative Decree No 196/2003 (the “Privacy Code”).

Further detailed rules are contained in Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector, as transposed into Italian law by the Privacy Code.

With reference to the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection and prosecution of criminal offences or the execution of criminal penalties, the regulatory framework is instead governed by EU Directive 2016/680, transposed into the Italian legal system through Legislative Decree No 51/2018. Finally, other specific indications and/or interpretations are contained in the decisions, recommendations and guidelines issued by the national supervisory authorities and the European Data Protection Board (eg, in Italy, the requirements for system administrators).

Concerning extraterritorial application, Article 3 of the GDPR addresses the application of the Regulation to the processing of personal data of data subjects who are in the EU by a controller or processor not established in the EU, where the processing activities are related to:

  • the offering of goods or services, irrespective of whether a payment by the data subject is required, to such data subjects in the EU; or
  • the monitoring of their behaviour as far as their behaviour takes place within the EU.

Finally, as regards the interaction with artificial intelligence (AI) rules (Regulation (EU) 2024/1689 and Italian Law No 132/2025) and with cybersecurity rules (in particular, Italian Legislative Decree No 138/2024 transposing Directive (EU) 2022/2555, together with the determinations of the National Cybersecurity Agency in the NIS2 framework, Italian Law No 90/2024, and all implementing measures of Regulation (EU) 2022/2554), these instruments expressly defer to EU personal data protection law, leaving its content and scope unchanged and fully applicable.

General Principles Relating to Processing of Personal Data

According to Article 5(1) of the GDPR, personal data shall be:

  • processed lawfully, fairly and in a transparent manner in relation to the data subject (“lawfulness, fairness and transparency”);
  • collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes (“purpose limitation”);
  • adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (“data minimisation”);
  • accurate and, where necessary, kept up to date – every reasonable step must be taken to ensure that personal data that is inaccurate, having regard to the purposes for which it is processed, is erased or rectified without delay (“accuracy”);
  • kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data is processed (“storage limitation”); and
  • processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (“integrity and confidentiality”).

In addition, the data controller shall be responsible for, and be able to demonstrate compliance with, such principles (“accountability”).

Lawfulness of Processing

Any processing of personal data must be based on at least one of the following legal bases provided for in Article 6(1) of the GDPR:

  • the data subject has freely given specific, informed and unambiguous consent to the processing of their personal data;
  • processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
  • processing is necessary for compliance with a legal obligation to which the controller is subject;
  • processing is necessary in order to protect the vital interests of the data subject or of another natural person;
  • processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller; or
  • processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require the protection of personal data (particularly where the data subject is a child) – in this case, the data controller is required to carry out an assessment of the legitimate interest pursued in relation to the rights and freedoms of the data subject by conducting a balancing activity that may possibly be challenged by the supervisory authority or the court.

Data Subjects’ Rights

Data subjects have certain rights under the GDPR in order to allow them to have continuous and effective control over their personal data. In particular, data subjects have the right to:

  • request access to their data (by receiving a copy of it) or to all information relating to the processing of their personal data (the purpose of processing, the recipients to whom the data is disclosed, any transfers outside the European Economic Area (EEA), etc);
  • obtain the rectification of inaccurate or incomplete personal data;
  • obtain the deletion of their personal data in the cases provided for in Article 17 of the GDPR;
  • obtain the restriction of processing in the cases provided for in Article 18 of the GDPR;
  • obtain their personal data in a structured, commonly used and machine-readable format or to request the transmission of such personal data to another data controller, where the legal basis of the processing is the consent of the data subject or the performance of a contract;
  • object to processing based on legitimate interest or the performance of a task carried out in the public interest or in the exercise of official authority;
  • not be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them;
  • withdraw the consent given; and
  • lodge a complaint with a supervisory authority.

Special Categories of Personal Data

According to Article 9 of the GDPR, processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, as well as the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health, or data concerning a natural person's sex life or sexual orientation, is prohibited except in the following cases:

  • where the data subject has given explicit consent to the processing of that personal data for one or more specified purposes;
  • where processing is necessary for the purposes of carrying out the obligations and exercising specific rights of the controller or of the data subject in the field of employment, social security and social protection law in so far as it is authorised by the law providing for appropriate safeguards for the fundamental rights and the interests of the data subject;
  • where processing is necessary to protect the vital interests of the data subject or of another natural person where the data subject is physically or legally incapable of giving consent;
  • where processing is carried out in the course of its legitimate activities with appropriate safeguards by a foundation, association or any other not-for-profit body with a political, philosophical, religious or trade union aim, and on condition that the processing relates solely to the members or to former members of the body or to persons who have regular contact with it in connection with its purposes, and that the personal data is not disclosed outside that body without the consent of the data subjects;
  • where processing relates to personal data that is manifestly made public by the data subject;
  • where processing is necessary for the establishment, exercise or defence of legal claims or whenever courts are acting in their judicial capacity;
  • where processing is necessary for reasons of substantial public interest, on the basis of the law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject;
  • where processing is necessary for the purposes of preventative or occupational medicine, for the assessment of the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment, or the management of health or social care systems and services on the basis of the law or pursuant to contract with a health professional;
  • where processing is necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health or ensuring high standards of quality and safety of healthcare and of medicinal products or medical devices, on the basis of the law which provides for suitable and specific measures to safeguard the rights and freedoms of the data subject – in particular, professional secrecy (in this specific regard, reference is made to Article 2-sexies of the Privacy Code); and
  • where processing is necessary for archiving purposes in the public interest, scientific or historical research purposes, or statistical purposes.

According to Article 2-septies of the Privacy Code, the Italian supervisory authority (Garante per la Protezione dei Dati Personali; GPDP) may also specify particular safeguards for processing genetic data, biometric data and data concerning health.

Personal Data Relating to Criminal Convictions and Offences

According to Article 10 of the GDPR and Article 2-octies of the Privacy Code – as amended by Article 71(1)(g) of Law No 182/2025 – the processing of personal data relating to criminal convictions and offences or related security measures shall be carried out only under the control of official authority or when the processing is authorised by law providing for appropriate safeguards for the rights and freedoms of data subjects.

Data Relating to Minors

In accordance with Article 8 of the GDPR and Article 2-quinquies of the Privacy Code, restrictions on the processing of minors’ data concern their consent. In particular, minors who have reached the age of 14 may consent to the processing of their personal data in relation to the direct offer of information society services (below this age, consent must be given by those exercising parental responsibility). However, the data controller must draft the information and communications relating to the processing in particularly clear, simple, concise and comprehensive language that is easily accessible and understandable to the minor.

Life sciences or medtech companies can anonymise patient data for product development or scientific research only if the outcome is true anonymisation in the GDPR sense. That means that the data must be rendered irreversibly non-identifiable, taking into account all means reasonably likely to be used for re-identification. In healthcare, this is a high bar, as clinical datasets are rich and easily linkable; the GPDP’s approach is consistently strict, so mere removal of direct identifiers or standard pseudonymisation is not enough. If the company achieves genuine anonymisation, the dataset falls outside the GDPR and can be used for research and development without a GDPR legal basis.

Where full anonymisation is not realistically achievable, data remains personal (even if pseudonymised). Then, the company must rely on a GDPR Article 6 lawful basis and an Article 9(2) condition for health data, most often the scientific research ground combined with Article 89 safeguards. The Italian Privacy Code rules on research (including the updated Article 110) support this pathway by allowing certain research uses with enhanced safeguards when informing individuals is impossible or disproportionate, but they do not remove the need for a lawful basis and accountability.
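
By way of illustration, the following minimal Python sketch (the field names, salt and generalisation choices are purely illustrative assumptions, not a GPDP-endorsed recipe) shows why replacing a direct identifier with a keyed hash remains pseudonymisation, whereas anonymisation also requires treating quasi-identifiers:

```python
# Illustrative sketch only: why a keyed hash is pseudonymisation, not
# anonymisation. Field names, salt and generalisation choices are invented.
import hashlib

record = {"patient_id": "IT-000123", "dob": "1984-03-07",
          "zip": "20121", "diagnosis": "E11"}

# Pseudonymisation: the direct identifier is replaced by a salted hash,
# but whoever holds the salt (or a lookup table) can re-identify, so the
# output remains personal data in the controller's hands.
salt = b"secret-kept-by-the-controller"
pseudonym = hashlib.sha256(salt + record["patient_id"].encode()).hexdigest()
pseudonymised = {**record, "patient_id": pseudonym}

# Even with the identifier gone, dob + zip + diagnosis may single a person
# out; anonymisation therefore also requires generalising or suppressing
# such quasi-identifiers (eg, year of birth only, truncated postcode).
generalised = {"birth_year": record["dob"][:4],
               "zip_prefix": record["zip"][:2] + "xxx",
               "diagnosis": record["diagnosis"]}
print(pseudonymised)
print(generalised)
```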

The Code of Conduct for management software developers approved by the GPDP in October 2024 reinforces this compliance logic for software/tech vendors working with hospitals: it operationalises privacy-by-design/default, clarifies that vendors typically act as processors for installation/maintenance/support, and expects robust security and life cycle controls. In practice, it pushes medtech vendors to ensure that any re-use/anonymisation for R&D is clearly covered contractually with the healthcare provider and supported by appropriate technical measures.

Looking forward, the European Health Data Space Regulation (EU) 2025/327 (EHDS) is a major change for these companies. It entered into force on 26 March 2025 and will apply in phases from 26 March 2027, with the secondary use regime (research/innovation/AI training, etc) becoming operational from 26 March 2029 and some data categories being phased in later. The EHDS will create a harmonised, permit-based route to access large volumes of electronic health data across the EU through national Health Data Access Bodies, typically in secure environments and often in anonymised/pseudonymised form. For life sciences and medtech firms, this should increase lawful access and legal certainty for research and development, but only under tighter governance, interoperability, auditability and anti-reidentification constraints.

With regard to the use of AI systems, it is essential to note that both EU (AI Act) and national legislation refer to the GDPR and the Privacy Code for the regulation of the personal data protection aspects of the use of AI tools. In this context, the first point to note concerns transparency obligations: Articles 13 and 14 of the GDPR require data subjects to be provided with information on the existence of automated decision-making, including profiling, and meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

In addition, Article 22 of the GDPR states that the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them, except where the decision is:

  • necessary for entering into, or the performance of, a contract between the data subject and a data controller;
  • authorised by the law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or
  • based on the data subject’s explicit consent.

In these cases, the data controller shall implement suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests, including at least the right to obtain human intervention on the part of the controller, to express their point of view and to contest the decision.

Lastly, processing carried out through an AI tool is also likely to be subject to the Data Protection Impact Assessment requirement under Article 35 of the GDPR.

In the case of a personal data breach, the data controller shall, without undue delay and, where feasible, not later than 72 hours after having become aware of it, notify the personal data breach to the GPDP, unless the personal data breach is unlikely to result in a risk to the rights and freedoms of natural persons. Such notification shall at least:

  • describe the nature of the personal data breach including, where possible, the categories and approximate number of data subjects concerned and the categories and approximate number of personal data records concerned;
  • communicate the name and contact details of the Data Protection Officer or other contact point where more information can be obtained;
  • describe the likely consequences of the personal data breach; and
  • describe the measures taken or proposed to be taken by the controller to address the personal data breach, including, where appropriate, measures to mitigate its possible adverse effects.

Where the notification to the authority is not made within 72 hours, it shall be accompanied by reasons for the delay. In addition, where and in so far as it is not possible to provide all the information at the same time, the information may be provided in phases without undue further delay.
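
As a purely practical illustration of the timing rule, the following minimal Python sketch (an illustrative helper of our own, not part of any official guidance) computes the 72-hour notification deadline from the moment of awareness:

```python
# Illustrative helper only: computes the 72-hour window under Article 33
# of the GDPR from the moment the controller becomes aware of the breach.
from datetime import datetime, timedelta, timezone

def notification_deadline(became_aware: datetime) -> datetime:
    """Latest time by which the GPDP should normally be notified."""
    return became_aware + timedelta(hours=72)

aware = datetime(2026, 3, 9, 14, 30, tzinfo=timezone.utc)
print(notification_deadline(aware))  # 2026-03-12 14:30:00+00:00
```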

When the personal data breach is likely to result in a high risk to the rights and freedoms of natural persons, in addition to the notification described above, the data controller shall also communicate the personal data breach to the data subject without undue delay, describing in clear and plain language the nature of the personal data breach and the same above-mentioned information. Such communication to the data subject shall not be required if:

  • the controller implemented appropriate technical and organisational protection measures, and those measures were applied to the personal data affected by the personal data breach – in particular, those that render the personal data unintelligible to any person who is not authorised to access it, such as encryption;
  • the controller has taken subsequent measures which ensure that the high risk to the rights and freedoms of data subjects is no longer likely to materialise; or
  • it would involve disproportionate effort – in such a case, there shall instead be a public communication or similar measure whereby the data subjects are informed in an equally effective manner.

Lastly, the controller shall document any personal data breaches, comprising the facts relating to the personal data breach, its effects and the remedial action taken.

Supervisory authorities have limited regulatory power, mainly through the adoption of guidelines and opinions interpreting legal provisions. However, supervisory authorities (in Italy, the GPDP) also have supervisory powers to monitor compliance with data protection legislation, and benefit from investigative powers that include, ex multis, the possibility of requesting information from data controllers and data processors or conducting on-site checks and inspections.

In this context, the supervisory authority may request access to the documentation adopted (privacy policy, consents, internal policies and procedures, records of processing activities, etc) and to systems and databases. The GPDP’s inspections may be triggered by the authority itself (based on an inspection plan adopted and published every six months, or following notification of a personal data breach) or by data subjects or other third parties (in the case of complaints or reports). Any decisions ultimately adopted are published.

Data protection legislation may also be applied by the courts in the case of appeals lodged by individuals (particularly in the case of claims for damages or appeals against decisions of the supervisory authority).

As mentioned in 1.7 Regulators, GPDP inspections may be triggered by the authority itself (on the basis of an inspection plan adopted and published every six months, or following the notification of a personal data breach) or by data subjects or other third parties (in the case of complaints or reports).

Preliminary Investigation

In the event of a complaint by a data subject, the GPDP shall verify the correctness and completeness of the complaint and, if necessary, grant the complainant a period of time to amend it, normally not exceeding 15 days. In the event of a correct and complete complaint (or in the event of an investigation on its own accord, such as following the notification of a personal data breach), the GPDP shall start a preliminary investigation, during which the documentation received is examined and/or further information is requested from the data controller or data processor.

In that scenario, inspections may also be carried out, during which the entity subject to inspection may be assisted by its trusted advisers and reserve the right to produce the documentation that is not immediately available within a reasonable period (as a rule, not exceeding 30 days). A record of the activity carried out shall also be drawn up, with particular reference to the statements made and the documents acquired, and a copy shall be given to the subject under inspection.

Closing of the Preliminary Investigation and Archiving

At the end of the preliminary investigation, the competent department within the GPDP may conclude its examination of the complaint by archiving it, when:

  • the issue examined does not appear to be related to the protection of personal data or the tasks entrusted to the GPDP;
  • there is no evidence of a breach of the relevant data protection regulations;
  • the claim set out in the complaint is excessive, due in particular to its specious or repetitive character; or
  • the issue raised by the complaint has already been examined by the GPDP.

In the case of a complaint, feedback is provided to the applicant, briefly stating the reasons why no action is being taken.

Initiation of Proceedings

If the matter is not dismissed following the preliminary investigation, the competent department shall initiate proceedings for the adoption of measures by the board of the GPDP, by means of its own communication to the data controller and/or data processor. The communication shall contain:

  • a concise description of the facts and alleged breaches of the relevant data protection rules, as well as the relevant sanctioning provisions;
  • an indication of the competent organisational unit where a copy of the investigative documents may be inspected and extracted; and
  • the indication that, within 30 days of receipt of the notice, it is possible to send the GPDP defence papers or documents, and to ask to be heard by the GPDP.

Right of Defence

The addressee of the notice may exercise the right of defence by submitting written statements and documents within 30 days from the date of notification of the communication and, where requested, by giving testimony regarding the facts referred to in the notice.

The addressee of the notice may request a short extension by specifically and duly motivating the request. The extension shall normally not exceed 15 days, and may be granted according to criteria of proportionality and criteria relating to the size and operational characteristics of the addressees themselves and to the complexity of the matter under examination. The addressee of the notice may also request a hearing before the GPDP.

Failure to submit written counterarguments or a request for a hearing shall not prejudice the continuation of the proceedings.

Decision

Where necessary, the board of the GPDP, by its own resolution, shall adopt the corrective and sanctioning measures referred to in Article 58(2) of the GDPR (in the case of an administrative pecuniary sanction, the quantum is calculated on the basis of the criteria indicated by Article 83 of the GDPR). The decision is notified to the parties by the department, service or other organisational unit that has supervised the preliminary investigation.

Appeal Against Measures of the GPDP

Under penalty of inadmissibility, an appeal against the measures adopted by the GPDP must be lodged within 30 days from the date of communication of the decision or within 60 days if the appellant resides abroad, with the ordinary court of the place where the data controller resides, or with the court of the place of residence of the data subject. At the time of the appeal, it is also possible to request the court to suspend the enforceability of the contested decision.

The procedural rules governing labour disputes (the so-called rito del lavoro) apply to the judicial proceedings, and the judgment concluding them is not subject to appeal; it may prescribe the necessary measures and award compensation for damages.

Over the past 24 months, Italian enforcement has concentrated on a few high-risk clusters:

  • telemarketing and unlawful lead chains, with record-level fines (notably the Enel Energia case) for weak control over data sources and processors;
  • generative AI and web-scraping, where the GPDP has moved from warnings to major sanctions (eg, OpenAI/ChatGPT, Replika) focusing on lawful basis for training, transparency and child protection;
  • biometrics/facial recognition, treated as exceptional and stopped when safeguards or voluntariness are insufficient;
  • workplace monitoring via metadata and logs, now enforced as strictly as traditional surveillance; and
  • cookies/dark-pattern consent, seen as basic compliance hygiene.

A parallel trend is the growing privacy–competition convergence, with Italian Competition Authority (AGCM) scrutiny on consent and data-combination practices.

In recent years, data protection litigation in Italy has experienced significant growth, driven by an increasing awareness of rights among data subjects.

The most frequent disputes involve unlawful data processing, data breaches, and the improper use of personal data by companies and public administrations. Claims for compensation for privacy violations, particularly for non-material damages, are also on the rise.

In this context, the GPDP is playing a crucial role, imposing substantial fines that influence corporate strategies, seeking to stay ahead of other European authorities, and positioning itself as a leader on issues related to AI (for instance, the proceedings initiated against OpenAI and ChatGPT, which concluded with a sanction in December 2024) and employee monitoring, especially concerning the retention of metadata generated through employees’ use of email tools (ie, the decision against Regione Lombardia on metadata and logs relating to internet browsing).

The most significant recent decision for EU/Italian privacy litigation is the CJEU’s Deloitte judgment C-413/23 P (EDPS v SRB), 4 September 2025. The Court held that pseudonymised data is not automatically “personal data” for every actor: for a third-party recipient, the qualification depends on whether that recipient has reasonably likely and practically available means to re-identify individuals. If, in the recipient’s hands, identification is not realistically possible, the data may be treated as anonymous for that recipient, even though it remains personal data for the original controller who retains the re-identification key. The Court set aside the General Court’s approach because it had relied on an abstract presumption (pseudonymisation = personal data) without assessing Deloitte’s concrete re-identification capability.

The national legislation on personal data protection does not currently provide explicit regulation for collective redress mechanisms. However, data subjects may rely on the tools generally available under civil procedure law or those designed to protect their rights as consumers.

In Italy, the protection and governance of non-personal data is largely shaped by directly applicable EU regulations, rather than by autonomous national legislation. There is no standalone Italian “horizontal” law equivalent to the EU Data Act; instead, Italy relies on the EU data strategy instruments, complemented only by sectoral or administrative measures.

The starting point is the Free Flow of Non-Personal Data Regulation (Regulation (EU) 2018/1807), which establishes the principle of the free movement of non-personal data within the EU. It prohibits unjustified data-localisation requirements imposed by member states, and promotes portability and switching for professional users of data processing and cloud services. Importantly, for mixed datasets (where personal and non-personal data are intertwined), the Regulation operates alongside the GDPR: the GDPR applies to the personal component, while the free-flow rules apply to the non-personal component.

This baseline is expanded by the Data Governance Act (Regulation (EU) 2022/868), which provides a cross-sector EU framework to increase data availability and sharing, covering both personal and non-personal data. It regulates the re-use of certain protected data held by public sector bodies, introduces trusted roles such as data intermediation service providers, and supports voluntary mechanisms such as data altruism and the development of Common European Data Spaces. Italy has adopted implementing measures mainly to designate competent authorities and procedural arrangements, but the substantive regime remains EU-based.

Finally, the EU Data Act (Regulation (EU) 2023/2854) – applicable across the EU, including Italy, from 12 September 2025 – constitutes the central horizontal regime for access to and use of data generated by connected products and related services (ie, IoT ecosystems). It sets out rights and obligations for key actors such as data holders, users (B2C/B2B) and data recipients, and applies to both personal and non-personal data depending on the nature of the dataset, subject to GDPR compliance for any personal elements. A core pillar is the mandatory B2B/B2C data access and sharing framework, often under FRAND (fair, reasonable and non-discriminatory) terms, and another pillar addresses the cloud and edge layer by imposing rules on switching, interoperability and the removal of contractual/technical lock-ins in data processing services.

Beyond these EU Regulations, Italy has not enacted a separate Data Act-like statute. National activity is mainly confined to sector-specific governance (especially for public-administration data and cloud security classifications) and to procedural implementation where EU Acts require member state designation of authorities or enforcement structures.

These frameworks are designed to complement, not override, privacy and IP rules, so the interaction works in layers. When the data to be accessed or shared includes personal data (or is part of a mixed dataset), the GDPR remains fully controlling. The Data Act and the Data Governance Act do not create a new lawful basis on their own: any disclosure must still rely on an Article 6 GDPR ground already applicable to the data holder, and must respect all GDPR principles, including confidentiality, minimisation, purpose limitation, transparency and security. In practice, the user’s Data Act access right can be exercised only within what is lawful under the GDPR, and any recipient of personal data must comply with the GDPR as an autonomous controller or processor.

For non-personal data, access and sharing obligations under the Data Act/Data Governance Act are balanced against IP rights and trade secret protection. The general rule is that sharing goes ahead, but the data holder may and should impose proportionate safeguards – for example, confidentiality clauses, technical access controls, or limiting the dataset to what is strictly necessary – so that disclosure does not destroy trade secret value or infringe IP. A refusal to share is allowed only in exceptional cases where no reasonable protective measures could prevent a serious risk of economic harm.

Thus, conceptually, the system is a “dual compliance gate”: the GDPR determines whether and how personal data can move; trade secret/IP law determines under what protections sensitive non-personal data can be shared, with access remaining the default outcome.

As a baseline, the Free Flow Regulation guarantees that non-personal data can be stored and processed anywhere in the EU, banning unjustified localisation rules. It also promotes portability and switching for professional cloud users, mainly through industry codes of conduct, and confirms that mixed datasets are governed jointly by the GDPR (personal layer) and free-flow rules (non-personal layer).

The Data Governance Act does not impose broad private-sector access duties but sets trusted conditions for sharing. It regulates the re-use of certain protected public-sector data, creates a compliance regime for neutral data intermediation services, and enables voluntary “data altruism” channels.

The Data Act is the core horizontal regime. It gives users of connected products and related services a right to access the data they generate, and to require the data holder to share those data with a third party (data recipient). Data holders must design products/services so that data is accessible by default, and must share B2B data under FRAND terms, with additional safeguards for SMEs and controls on unfair contractual clauses. In parallel, the Act imposes strong switching, portability, interoperability and termination duties on providers of data processing services (cloud/edge), including removal of technical/contractual lock-ins and limits on switching/egress fees. Any flow involving personal data remains fully subject to GDPR requirements; the Data Act does not override them.

For organisations, the main action items are practical and contractual:

  • map and classify product-generated data (non-personal/personal/mixed);
  • build technical interfaces and APIs for user access and third-party sharing (a minimal sketch follows this list);
  • update notices and B2B templates to reflect FRAND pricing and Data Act unfair terms rules;
  • implement procedures to protect trade secrets proportionately when access is requested; and
  • for cloud providers, revise pricing, contracts and architectures to enable effective switching and interoperable portability.
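
As a concrete illustration of the second action item above, the following minimal Python sketch (assuming a FastAPI stack; the endpoint paths, grant store and all names are illustrative assumptions, not prescribed by the Data Act) shows the general shape of a user-access and third-party-sharing interface:

```python
# Minimal sketch of Data Act-style access endpoints; authentication,
# FRAND terms and trade secret safeguards are deliberately out of scope.
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Toy in-memory stores standing in for product telemetry and user grants.
DEVICE_DATA = {"dev-42": [{"ts": "2026-03-01T10:00:00Z", "temp_c": 21.5}]}
SHARING_GRANTS = {("dev-42", "recipient-7")}  # user-authorised recipients

@app.get("/devices/{device_id}/data")
def user_access(device_id: str):
    """User access: the user obtains the data generated by their product."""
    if device_id not in DEVICE_DATA:
        raise HTTPException(status_code=404, detail="unknown device")
    return {"device": device_id, "records": DEVICE_DATA[device_id]}

@app.get("/devices/{device_id}/share/{recipient}")
def third_party_share(device_id: str, recipient: str):
    """Third-party sharing: data released only against a recorded user grant."""
    if (device_id, recipient) not in SHARING_GRANTS:
        raise HTTPException(status_code=403, detail="no user authorisation on file")
    return {"device": device_id, "recipient": recipient,
            "records": DEVICE_DATA[device_id]}
```

The design point is that the third-party route is gated on a recorded user authorisation, mirroring the Data Act’s user-centric sharing model.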

In Italy, enforcement of the EU non-personal data regime is shared among multiple authorities. The Agency for Digital Italy (AgID) is the lead authority for the Data Governance Act, overseeing data intermediation services, data altruism registrations and public sector data re-use; it also functions as the co-ordination hub. When data sharing involves personal or mixed datasets, AgID works with the GPDP to ensure GDPR compliance. Where access or FRAND terms create competition or unfair-contract concerns, the AGCM intervenes, and the National Cybersecurity Agency (ACN) provides support on cybersecurity/cloud-security aspects.

For the Data Act, Italy is moving towards a multi-authority model: digital/data-governance functions close to AgID, technical/market-surveillance authorities for connected products, the GPDP for the personal data layer, and the AGCM for FRAND, unfair terms and anti-lock-in issues. The main trend is co-ordinated enforcement: Data Act cases are increasingly treated as “hybrid” matters requiring simultaneous privacy, competition and technical oversight, with regulators prioritising practical outcomes (IoT user access and cloud switching) over purely formal classifications.

In Italy, the general rule is set out by Article 122 of the Privacy Code (which transposes Directive 2002/58/EC), under which all cookies – and other similar tracking tools – other than those strictly necessary for the functioning of the website may be installed on the users’ devices only with their consent.

In this regard, as clarified by the Guidelines adopted by the Italian Supervisory Authority in 2021, it is essential that consent for the use of cookies is collected in compliance with the principles established by the GDPR. Accordingly, it must be preceded by a multi-layered notice (cookie banner) that provides information about the cookies and the related personal data processing, and that allows the user to freely accept or refuse the use of cookies as well as to change their decision at any time.

A specific consideration applies to analytical cookies, which may be treated as necessary cookies (and therefore not require consent) only when:

  • IP anonymisation features are enabled (see the sketch after this list);
  • the use of analytical cookies is strictly limited to the production of aggregated statistics; and
  • they are used solely in connection with a single website or mobile application, such that they do not allow the tracking of users’ navigation across different websites or applications.
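
By way of illustration of the first condition, IP anonymisation is commonly implemented by truncating the final octet of the address before storage; the following minimal Python sketch (the function name and octet choice are our own assumptions, mirroring the approach popularised by mainstream analytics tools) shows the idea:

```python
# Illustrative sketch of IPv4 truncation (zeroing the final octet), the
# technique analytics tools commonly offer as an "IP anonymisation" feature.
import ipaddress

def truncate_ipv4(ip: str) -> str:
    """Zero the host octet so the stored address no longer singles out a device."""
    addr = ipaddress.ip_address(ip)
    if addr.version != 4:
        raise ValueError("this sketch handles IPv4 only")
    octets = addr.exploded.split(".")
    octets[-1] = "0"
    return ".".join(octets)

assert truncate_ipv4("93.184.216.34") == "93.184.216.0"
```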

In Italy, the general rule is set out by Article 130(1) and (2) of the Privacy Code (which transposes Directive 2002/58/EC), under which commercial and promotional communications by email, fax, telephone and similar means of communication require the prior consent of the user (natural or legal person). However, Article 130(4) provides for an exception to the requirement of consent, allowing for the processing of the email address provided by the data subject in the context of the sale of a product or a service for the purpose of sending commercial communications aimed at the direct sale of products or services similar to those already purchased, provided that the data subject has been adequately informed and does not refuse such use, either initially or on the occasion of subsequent communications.

With specific regard to telephone marketing activities, Article 130(3-bis) provides that data controllers may lawfully contact all users who have not objected to receiving commercial communications by telephone by registering in the Register of Opposition. In this sense, pursuant to Law No 5/2018, users may enrol in the register in order to prevent subsequent communications and, at the same time, withdraw any consent previously given to the processing of their personal data for telephone marketing purposes. A data controller intending to carry out telemarketing activities is therefore required to consult the register at least every 15 days and, in any event, before the start of a new campaign.

On the other hand, online marketing may consist primarily of an activity carried out through the use of profiling and advertising cookies (see 4.1 Use of Cookies), or of behavioural advertising and targeting activities carried out through the use of external databases (especially those of social networks). In this second case, the jurisprudence of the CJEU and the interpretation provided by the European Data Protection Board (EDPB) in Guidelines 8/2020 clarify the need to carry out the activity on the basis of the prior consent of the data subject and, as a general rule, to reconstruct the privacy roles between the company and the social network as joint controllers of the processing to be regulated under Article 26 of the GDPR.

Processing carried out in the employment context is one of the areas in which the GDPR defers to regulation under national law, without prejudice to certain common guidelines and orientations shared first by WP29 and then by the EDPB, specifically regarding the vulnerable position of the employee data subject vis-à-vis the employer data controller (a situation that gives rise to a presumption that any consent requested from the employee is invalid for lack of freedom).

Managing the Selection Process and the Employment Relationship

In these phases, the employer’s activities must respect – more than ever – the principle of minimisation, ensuring that only personal data essential for the performance of work duties – much of which is governed by labour law provisions (eg, Article 8 of Law No 300/1970 or Legislative Decree No 81/2008) – is requested from the candidate or employee.

Remote Monitoring of Workers

Without prejudice to a general prohibition on the use of instruments (including AI-based ones) to monitor employee activities, this matter is governed by Article 4 of Law No 300/1970, which legitimises the use of such tools solely for organisational and production requirements, workplace safety and the protection of company assets (eg, cybersecurity purposes). In this case, without prejudice to instruments that are essential and prearranged for the performance of work duties, the use of instruments for remote monitoring is permitted only if doing so is:

  • agreed with the trade union representatives present in the company; or
  • authorised by the competent Labour Inspectorate in the absence of trade union representatives in the company or in the event of there being no agreement.

In these cases, the employee data subject must be provided with additional and detailed information beyond what is normally provided under Articles 13 and 14 of the GDPR; this can be done, for example, by adopting an internal regulation on the use of IT tools which also informs employees of the possible controls and their purposes.

However, although the agreement with trade union representatives or administrative authorisation is sufficient to legitimise the activity from the point of view of labour law, this does not exempt the employer from complying with the principles on the protection of personal data (eg, the principle of minimisation). In this sense, the monitoring in clear text of the URLs visited by employees is unlawful because, for security purposes, the same results can be achieved by implementing filters that block access to potentially risky websites. On this point, see also the Guidelines adopted by the GPDP on 1 March 2007 and the Guidelines on the metadata collected through the use of email accounts.

Whistle-Blowing and the Transparency Decree

The national legislation on whistle-blowing was updated to transpose Directive (EU) 2019/1937 through Legislative Decree No 24/2023, which harmonised the rules between the private and public sectors. With regard to the protection of personal data, the general principles dictated by the GDPR remain valid, concerning the obligations to set up reporting and management processes in compliance with the principles of privacy by design and by default, and with the need to ensure the confidentiality of the reporting person (rendering, for instance, ordinary email channels inadequate), to carry out a Data Protection Impact Assessment (DPIA) on the processing, and to train and instruct the people who access the data and manage the reports.

Further obligations (mainly informative) are also imposed by Legislative Decree No 104/2022 (the so-called “Transparency Decree”), which prescribes the need to carry out a DPIA and to provide additional information to data subjects in the event of “the use of automated decision-making or monitoring systems designed to provide indications relevant to the recruitment or assignment, management or termination of the employment relationship, the assignment of tasks or duties, as well as indications affecting the monitoring, assessment, performance and fulfilment of contractual obligations of workers”.

The value of personal data and consent databases as a corporate asset is often underestimated in corporate transactions. In this context, with regard to the sector in question, the main activity may consist of verifying the lawfulness and correctness of the processing of personal data that makes up a company’s databases; this can be done by verifying the correctness and completeness of the information that the data controller had to provide to the data subjects pursuant to Articles 13 and 14 of the GDPR, and by examining the evidence of compliance with this information notice obligation.

Furthermore, where the processing of personal data is based on consent (eg, in the case of processing for promotional purposes or in the context of scientific research), it is essential to verify the correctness and ability to prove the consents collected from the data subjects and the effective capacity of the systems to receive any requests for withdrawal and/or opposition.

Following the M&A transaction, it is also essential that data subjects receive a communication informing them of the change in the identity of the data controller, as well as any other changes relating to the processing of personal data (new contact channels for the data controller and the Data Protection Officer, new methods for exercising rights, any transfer of data to third countries, etc). This communication should be provided upon first contact with the data subject or within 30 days.

European data protection legislation requires that any transfer of personal data that is undergoing processing or is intended for processing after transfer to a third country or to an international organisation (including for onward transfers of personal data from the third country or an international organisation to another third country or to another international organisation) shall take place only if the level of protection of natural persons guaranteed by the GDPR is not undermined.

This therefore requires an examination of the legal provisions applicable to the third country or international organisation in order to understand the actual level of protection of personal data, taking into account the elements specified in Article 45(2) of the GDPR. This analysis is carried out by the European Commission when it adopts the adequacy decisions referred to in Article 45 of the GDPR (decisions legitimising the transfer of personal data to the country or organisation benefiting from it).

In the absence of an adequacy decision, as clarified by the CJEU in its judgment of 16 July 2020 (the Schrems II judgment), this assessment is instead the responsibility of the data controller or data processor who is intending to export the personal data. In such a case, where the law in force in the third country or applicable to the international organisation does not guarantee an adequate level of protection of personal data, the transfer may only be carried out subject to the adoption of additional security measures suitable to mitigate the risks to the rights and freedoms of the data subjects (eg, encryption of the data prior to the transfer in order to exclusively share encrypted data).
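
To make the encryption example concrete, the following minimal Python sketch (using the cryptography package’s Fernet recipe; key handling is deliberately simplified and all values are illustrative) shows the “encrypt before export” pattern, under which the key never leaves the exporter:

```python
# Illustrative sketch of the "encrypt before export" supplementary measure,
# using the cryptography package's Fernet recipe. Key management (HSM use,
# keeping the key exclusively within the EEA, etc) is out of scope here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # must remain with the EU-based exporter only
f = Fernet(key)

ciphertext = f.encrypt(b'{"name": "Mario Rossi", "email": "m.rossi@example.it"}')
# Only `ciphertext` leaves the EEA; without the key, the importer (or a
# third-country authority) cannot read the personal data.
assert f.decrypt(ciphertext).startswith(b'{"name"')
```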

With reference to non-personal data, see 5.2 Government Notifications and Approvals and 5.3 Data Localisation Requirements.

With reference to personal data, notification to the supervisory authority is only required in the case of transfers pursuant to the second subparagraph of Article 49(1) of the GDPR. This is the case when no other means can be used to legitimise the transfer, and requires that the transfer:

  • is not repetitive;
  • concerns a limited number of data subjects;
  • is necessary for the purposes of compelling legitimate interests pursued by the controller which are not overridden by the interests or rights and freedoms of the data subject;
  • is carried out subject to appropriate data protection safeguards; and
  • is notified to the supervisory authority by the data controller.

With reference to non-personal data, Article 32 of the Data Act works as a safeguard against third-country authorities trying to obtain non-personal data stored in the EU from cloud/data-processing providers. A foreign court order or administrative decision is enforceable in the EU only if it is based on an international agreement with the EU or a member state (for example, a mutual legal assistance treaty). If no such agreement exists and compliance would conflict with EU or national law, the provider may grant access only where the third-country legal system offers solid rule-of-law guarantees:

  • the order must be reasoned, specific and proportionate;
  • the provider must be able to object and obtain judicial review; and
  • the reviewing court must be empowered to take account of the provider’s EU-protected interests.

To assess this, the provider can ask for an opinion from the national authority competent for international legal co-operation, especially where the request concerns trade secrets, commercially sensitive data, IP-protected material, or risks of re-identification; that authority may consult the Commission. A similar check applies if national security or defence interests may be involved. If the authority does not respond within one month, or concludes that the safeguards are not met, the provider is entitled to refuse the foreign request.

With reference to personal data, European legislation on the protection of personal data does not provide for any obligation to store data within a specific member state or the EEA, aiming, on the contrary, to regulate and facilitate the free movement of such data. In the case of transfers of data to third countries, however, the provisions of Chapter V of the GDPR apply in order to guarantee an adequate level of protection of personal data (see 5.1 Restrictions on International Data Transfers).

With reference to non-personal data, Article 32 of the Data Act states that the providers of data processing services shall take all adequate technical, organisational and legal measures, including contracts, in order to prevent international and third-country governmental access and transfer of non-personal data held in the EU where such transfer or access would create a conflict with Union law or with the national law of the relevant member state.

There are no “blocking” statutes in European legislation in addition to those described in the previous sections concerning the transfer of data outside the EEA.

The most recent update was in 2024, when the European Commission concluded the review process of 11 adequacy decisions regarding the transfer of personal data, confirming their validity. In this regard, the decisions remain effective, and it is permissible to continue the free transfer of personal data to Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, the Isle of Man, Israel, Jersey, New Zealand, Switzerland and Uruguay.

ICT Legal Consulting

ICT Legal Consulting
Via Borgonuovo 12
20121 Milan
Italy

+39 0284 2471 94

+39 0270 0512 101

info.legal@ictlc.com
www.ictlegalconsulting.com

Trends and Developments


Authors



ICT Legal Consulting (ICTLC) is an international law firm that provides strategic legal and regulatory support across privacy, data protection, intellectual property, and technology, media and telecommunications (TMT) law, with a strong focus on the normative and operational aspects of cybersecurity. The firm assists organisations in designing and implementing governance, compliance and security frameworks that meet the highest international standards. With over 80 professionals and a network active in more than 65 jurisdictions, ICTLC combines global co-ordination with local insight. Through its sister company ICT Cyber Consulting, the firm offers integrated cybersecurity services, including legal-technical risk assessments, resilience planning, and alignment with frameworks such as NIS2, DORA and the Cyber Resilience Act. ICTLC’s multidisciplinary expertise enables clients to navigate complex digital regulations and strengthen trust, compliance and resilience across their global operations.

Data Protection Enforcement Trends in Italy

Overview

Article 51 of the General Data Protection Regulation (GDPR) provides that “each member state shall provide for one or more independent public authorities to be responsible for monitoring the application of this Regulation, in order to protect the fundamental rights and freedoms of natural persons in relation to processing and to facilitate the free flow of personal data within the Union”. Under Article 58 GDPR, the supervisory authority is granted a wide range of powers, including investigative, corrective, authorising and advisory powers, among which is – notably – the possibility to levy pecuniary fines and to impose a temporary or definitive ban on the processing of personal data.

In Italy, the competent supervisory authority is the Garante per la Protezione dei Dati Personali (the so-called “Garante”, or GPDP), whose decisions can be challenged before the ordinary courts and, at last instance, before the Supreme Court of Cassation.

The GPDP is widely considered one of the most active and influential supervisory authorities, having issued – as of 31 December 2025 – more than 575 publicly available enforcement actions, amounting to over EUR315,696,105 in sanctions. This leaves Italy behind only Spain in terms of the number of sanctions issued, given that the Spanish supervisory authority has issued at least 1,027 sanctions, though levying only a total of approximately EUR137,721,810. The GPDP has issued fines and leveraged its corrective powers – including the imposition of bans on processing – across most GDPR areas and industry sectors. There are, however, aspects of the GDPR on which the authority focuses its attention more often than others.

Trends in Enforcement

Traditionally, the GPDP has been especially concerned with combating unlawful telemarketing practices, in particular as regards transparency and consent requirements as well as the engagement of third parties (such as call centres) as data processors without the necessary data protection safeguards, including the performance of audits by the data controller. In this field, the GPDP has issued some of its highest sanctions ever published, such as those against Eni Gas e Luce (issued on 11 December 2019, for a total amount of EUR11.5 million), Tim (issued on 15 January 2020, for a total amount of EUR27.8 million) and Sky Italia (issued on 16 September 2021, for a total amount of EUR3.2 million). More recently, in July 2024, the GPDP ordered Hera (an energy provider) to pay a fine of EUR5 million for violations in the processing of the personal data of over 2,300 customers, due to the insufficient implementation of safeguards by Hera’s data processors, which in some cases had led to the activation of energy supply contracts without the users’ knowledge or consent.

During the last few years, the GPDP has also focused its attention on the protection of the data privacy rights of children, as apparent from the enforcement actions undertaken against the popular social network TikTok concerning age-verification requirements. On 22 January 2021, following the highly publicised death of a ten-year-old girl from Sicily participating in a “blackout” challenge, the GPDP imposed an immediate limitation on the data processing concerning users “whose age could not be established with full certainty so as to ensure compliance with the age-related requirements”. On 3 February 2021, the GPDP noted that, following the enforcement action, TikTok committed to fulfilling GDPR age-verification requirements by taking a number of actions, including:

  • removing accounts belonging to users under 13 years of age;
  • employing an AI-based system to verify age;
  • launching an information campaign to raise awareness;
  • including an in-app button for reporting those under 13 years of age;
  • improving the language of the privacy notice for users under 18 years of age; and
  • doubling the number of Italian platform content moderators.

Another field where the GPDP has recently stepped up its enforcement actions is that of video-surveillance. In October 2023, the GPDP reprimanded a private individual for installing a home surveillance system with cameras capturing not only their own apartment but also a public area (including a public park). According to the GPDP, the placement of the cameras violated the principle of lawfulness – as the data controller failed to show a legitimate interest capable of justifying the recording of public areas and of conversations through the audio system – as well as the principle of data minimisation. The decision is also interesting because it clarifies that, while domestic surveillance systems are generally exempt from having to comply with the GDPR, such exemption does not apply when they capture public areas or third parties’ property. The GPDP issued a mere reprimand against the data controller, considering that the individual promptly rectified the situation by replacing the camera and redirecting it solely towards the entrance of their home.

In June 2023, the GPDP issued a sanction of EUR20,000 against an Italian employer for, inter alia, having installed a video-surveillance system on its premises without having obtained the prior approval of the workers’ council or of the public labour authority, as required by Article 4 of Law No 300/1970 (the so-called “Workers’ Statute”). Moreover, no privacy notice had been drafted and made available to the workers. More recently, in July 2024, the GPDP ordered the Turin municipality to share further information relating to the use of AI-powered “smart CCTVs”, which would allegedly help local police forces to understand whether their intervention is needed in emergency situations, so as to enable the GPDP to assess the system’s compliance with the GDPR.

Lastly, a field where the GPDP has focused much of its attention since early 2023 is that of artificial intelligence (AI), as discussed in the following section.

The GPDP at the Forefront of AI Enforcement

During the last couple of years, the GPDP has taken noteworthy initiatives in the context of AI, including by means of enforcement actions against major providers of generative AI systems and models. As a result, the GPDP has established itself as one of the most active EU supervisory authorities on the regulation of AI vis-à-vis the GDPR and the Privacy Code. The following paragraphs provide a brief overview of the most important initiatives undertaken by the GPDP in the AI field.

Enforcement action against OpenAI

In late March 2023, only a few months after its launch, the GPDP identified several violations of the GDPR and the Privacy Code regarding the widely used generative AI system ChatGPT, and imposed a temporary limitation on the processing of Italian users’ personal data, effectively blocking access to the service in Italy. According to the GPDP, OpenAI failed to demonstrate the presence of a valid legal basis for collecting and processing personal data for the purposes of training ChatGPT, and the information provided to users and individuals whose data was used for training the generative AI system was incomplete. Moreover, individuals whose data was used for training the AI system had no easy way to exercise their data protection rights, including the rights of access, rectification and objection. Interestingly, the GPDP also noted that ChatGPT’s responses to users’ prompts often deviated from reality (so-called “hallucinations”), thereby violating the accuracy principle established by the GDPR when such responses concerned another individual. The ChatGPT case further underscores the GPDP’s scrutiny of data processing concerning children: in this respect, the authority questioned whether the platform’s outputs might result in inappropriate responses for children, even though the service is purportedly aimed at users above the age of 13, as stated in OpenAI’s terms of service. As a result, the GPDP required OpenAI to implement a suitable age-verification system.

On 28 April 2023, the GPDP lifted the ban, considering that the measures adopted by OpenAI adequately addressed the data protection issues raised by the authority, which had underpinned the ban. Such measures included updating ChatGPT’s privacy policy, implementing adequate age-verification systems, and enabling individuals to exercise their data protection rights. However, the GPDP reserved its power to fully investigate the underlying shortcomings that had led it to issue the ban in the first place and, if necessary, to impose any relevant sanction in a separate decision after having fully investigated the facts.

The final decision was issued on 2 November 2024. In its ruling, the GPDP imposed a EUR15 million fine on OpenAI and ordered the company to launch a six-month public awareness campaign, aimed at educating the public about the collection of personal data for training its generative AI model ChatGPT and at informing individuals about their data protection rights (such as the right to object).

The amount of the fine reflected the following data protection infringements by OpenAI.

  • Processing personal data without having previously identified a suitable legal basis.
  • Lack of age-verification mechanisms. When the service launched, there were no effective measures to verify users’ ages.
  • Failure to implement the campaign to raise awareness in the way prescribed by the GPDP within the 2023 provisional decision.
  • Provision of insufficient information to users within ChatGPT’s privacy notice, especially as regards the failure to inform data subjects about how their personal data was processed, including its use for training the AI model. 
  • Violation of the accuracy principle, as ChatGPT’s responses often result in incorrect information about individuals.

The amount of the fine also took into account OpenAI’s co-operative approach, acknowledging the implementation of several measures requested by the GPDP. OpenAI was given 30 days to pay the fine and 60 days to submit to the GPDP a detailed plan for the required awareness campaign.

The GPDP–OpenAI saga has garnered widespread attention, not only within the Italian data protection community but also among the general public in Italy and across Europe, given that it is one of the first enforcement initiatives to have targeted a generative AI system. The GPDP’s activity demonstrates the GDPR’s potential to regulate specific aspects of generative AI, as well as the GPDP’s willingness to lead the way in this respect.

Finally, given that during the investigation OpenAI established its EU headquarters in Ireland, the Irish Data Protection Commission (DPC) became the lead supervisory authority for OpenAI. In line with the GDPR’s one-stop-shop mechanism, the GPDP has forwarded the case documents to the DPC, which will now be responsible for investigating any further data protection infringements involving OpenAI.

Enforcement action against DeepSeek

DeepSeek entered the global AI market in January 2025, attracting widespread attention as the first large-scale Chinese generative AI model capable of competing with leading US providers. Its launch was widely perceived as a turning point in the AI market, both for its technological ambitions and for the broader geopolitical implications of a non-US model achieving comparable performance and, in some cases, adoption at international level.

The GPDP once again positioned itself at the forefront of regulatory enforcement in the AI sector. The authority promptly initiated supervisory action against the companies operating the service, confirming its determination to play a leading role in asserting the application of EU data protection rules to global AI providers, irrespective of their place of establishment.

In January 2025, a few days after DeepSeek’s launch, the GPDP adopted urgent measures against Hangzhou DeepSeek Artificial Intelligence Co, Ltd and Beijing DeepSeek Artificial Intelligence Co, Ltd, the Chinese companies operating the generative AI service known as DeepSeek. The service is offered through a website, applications and related software, and allows users to access AI-driven functionalities after registration.

The case originated from a formal request for information sent by the GPDP to the companies on 28 January 2025. In their reply, the companies stated that DeepSeek had not entered, nor intended to enter, the Italian market and that the application had been removed from Italian app stores. They also argued that the GDPR did not apply to their processing activities.

Following a rapid investigation, the GPDP established that, although DeepSeek’s mobile app was no longer available in the Apple and Google app stores in Italy, the underlying service remained accessible to Italian users through its website. In addition, users who had previously registered were still able to log in and use the service. On this basis, the authority concluded that DeepSeek was unquestionably offering its services to individuals located in the EU, including Italy. As a consequence, the GPDP found that the GDPR applied extraterritorially on the basis of Article 3(2), as the service targeted individuals in the Union. The authority also noted that DeepSeek had failed to co-operate adequately with the investigation, having omitted to clarify key aspects of its data processing activities, thereby breaching the duty of co-operation laid down in Article 31 GDPR.

Several substantive shortcomings were also identified. DeepSeek’s privacy policy was available only in English, not in Italian, and did not meet the transparency and information requirements set out in Articles 12, 13 and 14 of the GDPR, as it omitted several mandatory elements. In particular, it failed to clearly specify the legal bases relied upon for each processing activity, contrary to Article 6 GDPR, and did not provide sufficient information to enable data subjects to exercise their data protection rights (such as access, rectification, erasure, objection and portability).

Further concerns arose in relation to international data transfers and security. According to the privacy policy, personal data collected through the DeepSeek service were stored in the People’s Republic of China without adequate safeguards being demonstrated, raising issues under the data transfer regime laid down in Chapter V of the GDPR, as strictly interpreted by the Court of Justice (most famously in the Schrems I and Schrems II judgments).

Finally, despite being subject to the GDPR while established in a third country, the companies had failed to appoint a representative in the EU, in violation of Article 27 GDPR.

Given the seriousness of the findings and the ongoing nature of the investigation, the GPDP ordered, on an urgent basis, a definitive limitation on the processing of personal data of individuals located in Italy. The measure took immediate effect and was adopted without prejudice to any further enforcement action that may follow upon completion of the investigation.

Once again, the decision illustrates the GPDP’s readiness to intervene swiftly in the AI domain and its ambition to remain a key EU data protection enforcer vis-à-vis global AI providers.

Enforcement in the Field of Data Brokering Services

Inquiry regarding Lusha

Aside from the AI sector, the GPDP has also recently concentrated its efforts on enterprises providing data brokering services to the Italian market.

One salient example is the enforcement action against Lusha Systems Inc, a US-based company that operates an online platform offering so-called “enriched” contact data of potential leads. Through subscriptions purchased by its clients – typically sales, marketing and recruitment professionals – the platform allows users to identify or verify business contact details (such as email addresses, landline numbers and mobile phone numbers) of potential leads, including professionals and other individuals working for companies that may be interested in the clients’ services, with a view to contacting them using the data provided by Lusha.

Lusha’s services are built around the gathering and aggregation of personal data from multiple sources, including work-related social networks, company websites and many other third-party sources. The database is therefore not limited to registered users of the platform: it may also include personal data relating to individuals who have never interacted with Lusha directly.

Importantly from a jurisdictional perspective, the GPDP found that Lusha’s services are accessible from the EU and are actively offered to users located in EU member states, including Italy. Lusha’s database reportedly contains personal data relating to EU residents, including Italian residents and, in some cases, individuals holding prominent institutional or senior professional roles. This circumstance was decisive in allowing the GPDP to assert its jurisdiction vis-à-vis Lusha.

On 8 April 2025, the GPDP formally announced the opening of an investigation into Lusha. The decision followed multiple complaints from individuals who reported receiving unsolicited promotional or commercial telephone calls allegedly made using contact details obtained through Lusha’s services. According to the GPDP, preliminary information suggested that Lusha was selling contact details, including telephone numbers, some of which appeared to be of uncertain or problematic origin in light of the GDPR. The authority also highlighted that Lusha’s database included data relating to Italian residents, and that the platform was being used by companies to carry out marketing activities targeting individuals in Italy.

In light of the above, the GPDP issued a formal request for information to Lusha, requiring the company to respond within 20 days of receipt. The request was broad in scope and aimed at clarifying the fundamental aspects of Lusha’s data processing activities in so far as they affect individuals in Italy.

In particular, the authority asked Lusha to provide:

  • details on the quantity of personal data relating to individuals residing in Italy that is collected or processed;
  • a clear description of the means used to collect such data;
  • comprehensive information on each source contributing to the database;
  • clarification as to whether personal data of individuals who are not platform users is processed; and
  • specific information, with regard to email addresses and telephone numbers, on the relevant sources, consent mechanisms and purposes of data-sharing with platform users.

Moreover, the GPDP recalled that similar business models had already been under scrutiny in the past. In earlier cases, the authority had sanctioned real estate agencies for carrying out promotional activities using telephone numbers purchased from third-party providers that were offering services comparable to those provided by Lusha.

On 5 December 2025, the GPDP issued a further communication, clarifying that the proceedings opened in April 2025 were still fully ongoing. The GPDP confirmed that its investigation had been triggered by formal complaints from data subjects who reported receiving unwanted promotional or commercial calls based on data included in Lusha’s database.

From the authority’s preliminary findings, it emerged that Lusha collects and processes the personal data of EU residents, building its services around the collection and further sharing of this data with potential clients, while maintaining that compliance with data protection rules rests primarily with those clients, as data controllers.

The GPDP also highlighted that Lusha includes in its services the contact details of public figures, including individuals holding public and institutional positions. This element raises additional concerns in light of the principles of lawfulness, purpose limitation and data minimisation, as well as of the heightened expectations of privacy and transparency that apply when personal data is processed at scale.

It is worth noting that, although Lusha operates across all EU member states, the GPDP is, at present, the only supervisory authority to have formally launched an in-depth investigation into the platform. This confirms the proactive stance traditionally adopted by the Italian authority in sectors perceived as high-risk, including telemarketing, the commercial trading of personal data and – as seen above – AI.

At the same time, the GPDP has made it clear that its enforcement focus is not limited to Lusha alone. The authority is monitoring other major platforms active in the personal contact data market, and is also assessing the position of companies that have relied on such databases to carry out marketing activities without an adequate legal basis.

The outcome of the Lusha investigation may therefore have significant implications, not only for the company and its clients but also for the wider ecosystem of data brokers and for businesses that source contact data from third-party providers and/or publicly available sources (such as social networks and public databases).

This case serves as a reminder that leveraging large-scale contact databases managed by third parties, and relying on data-brokering services more generally, always requires a careful assessment of compliance with EU data protection rules, and that offering or receiving such services may entail some level of compliance risk.

ICT Legal Consulting

ICT Legal Consulting
Via Borgonuovo 12
20121 Milan
Italy

+39 0284 2471 94

+39 0270 05121 01

info.legal@ictlc.com
www.ictlegalconsulting.com