Data Protection & Privacy 2025

Last Updated March 11, 2025

Italy

Law and Practice

Authors



ICT Legal Consulting (ICTLC) is an international law firm that offers strategic support in legal compliance (privacy, IP and TMT) and assists in drafting and developing governance, organisation, management, security and control models for data-driven organisations. The firm has successfully assembled a close-knit team of more than 80 qualified professionals specialising in the fields of ICT, privacy, data protection, cybersecurity, and IP law. ICTLC has offices in Italy (Milan, Bologna, and Rome), the Netherlands (Amsterdam), Greece (Athens), France (Paris), Spain (Madrid), Finland (Helsinki), Sweden (Gothenburg), Nigeria (Lagos), Kenya (Nairobi), Saudi Arabia (Riyadh) and Australia (Melbourne). It has also established partnerships with law firms and professionals in 56 other countries, giving clients access to the most qualified professionals who are most suited to their specific needs.

The Italian regulatory framework on the protection of personal data and privacy is dictated by Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, repealing Directive 95/46/EC (GDPR). To the extent not covered by the GDPR, the matter is regulated by Legislative Decree No 196/2003 (the “Privacy Code”).

Further detailed rules are contained in Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector, as transposed into Italian law by the Privacy Code.

With reference to the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection and prosecution of criminal offences or the execution of criminal penalties, the regulatory framework is instead governed by EU Directive 2016/680, transposed into the Italian legal system through Legislative Decree No 51/2018.

Finally, other specific indications and/or interpretations are contained in the decisions, recommendations and guidelines issued by the national supervisory authorities and the European Data Protection Board (eg, in Italy, the requirements for system administrators).

As mentioned in 1.1 Overview of Data and Privacy-Related Laws, supervisory authorities have limited regulatory power, mainly through the adoption of guidelines and opinions interpreting legal provisions. However, supervisory authorities (in Italy, the Garante per la Protezione dei Dati Personali, or GPDP) also have supervisory powers to monitor compliance with data protection legislation, and benefit from investigative powers that include, ex multis, the possibility of requesting information from data controllers and data processors or conducting on-site checks and inspections.

In this context, the supervisory authority may request access to the documentation adopted (privacy policy, consents, internal policies and procedures, records of processing activities, etc) and to systems and databases. The inspections of the GPDP may be triggered by the authority itself (based on an inspection plan adopted and published every six months, or following notification of a personal data breach), or by data subjects or other third parties (in the case of complaints or reports). Any decisions that are eventually adopted are published.

Data protection legislation may also be applied by the courts in the case of appeals lodged by individuals (particularly in the case of claims for damages or appeals against decisions of the supervisory authority).

As mentioned in 1.2 Regulators, GPDP inspections may be triggered by the authority itself (on the basis of an inspection plan adopted and published every six months, or following the notification of a personal data breach) or by data subjects or other third parties (in the case of complaints or reports).

Preliminary Investigation

In the event of a complaint by a data subject, the GPDP shall verify the correctness and completeness of the complaint and, if necessary, grant the complainant a period of time to amend it, normally not exceeding 15 days. In the event of a correct and complete complaint (or in the event of an investigation of its own accord, such as following the notification of a personal data breach), the GPDP shall start a preliminary investigation during which the documentation received is examined and/or further information is requested from the data controller or data processor.

In that scenario, inspections may also be carried out, during which the entity subject to inspection may be assisted by its trusted advisers and reserve the right to produce the documentation that is not immediately available within a reasonable period (as a rule, not exceeding 30 days). A record of the activity carried out shall also be drawn up, with particular reference to the statements made and the documents acquired, and a copy shall be given to the subject under inspection.

Closing of the Preliminary Investigation and Archiving

At the end of the preliminary investigation, the competent department within the GPDP may conclude its examination of the complaint by archiving it, when:

  • the issue examined does not appear to be related to the protection of personal data or the tasks entrusted to the GPDP;
  • there is no evidence of a breach of the relevant data protection regulations;
  • the claim set out in the complaint is excessive, due in particular to its specious or repetitive character; or
  • the issue raised by the complaint has already been examined by the GPDP.

In the case of a complaint, feedback is provided to the complainant, briefly stating the reasons why no action is taken.

Initiation of Proceedings

If the matter is not dismissed following the preliminary investigation, the competent department shall initiate proceedings for the adoption of measures by the board of the GPDP, by means of its own communication to the data controller and/or data processor. The communication shall contain:

  • a concise description of the facts and alleged breaches of the relevant data protection rules, as well as the relevant sanctioning provisions;
  • an indication of the competent organisational unit where a copy of the investigative documents may be inspected and extracted; and
  • the indication that, within 30 days of receipt of the notice, it is possible to send the GPDP defence papers or documents, and to ask to be heard by the same GPDP.

Right of Defence

The addressee of the notice may exercise the right of defence by submitting written statements and documents within 30 days of notification of the communication and, where requested, by being heard in person regarding the facts set out in the notice.

The addressee of the notice may request a short extension by providing specific and adequate reasons for the request. The extension shall normally not exceed 15 days and may be granted according to criteria of proportionality, the operational and dimensional characteristics of the addressee, and the complexity of the matter under examination. The addressee of the notice may also request a hearing before the GPDP.

Failure to submit written counterarguments or a request for a hearing shall not prejudice the continuation of the proceedings.

Decision

Where necessary, the board of the GPDP, by its own resolution, shall adopt the corrective and sanctioning measures referred to in Article 58(2) of the GDPR (in the case of an administrative pecuniary sanction, the quantum is calculated on the basis of the criteria indicated by Article 83 of the GDPR). The decision is notified to the parties by the department, service or other organisational unit that has supervised the preliminary investigation.

Appeal Against Measures of the GPDP

Under penalty of inadmissibility, an appeal against the measures adopted by the GPDP must be lodged within 30 days from the date of communication of the decision or within 60 days if the appellant resides abroad, with the ordinary court of the place where the data controller resides, or with the court of the place of residence of the data subject. At the time of the appeal, it is also possible to request the court to suspend the enforceability of the contested decision.

The judicial proceedings are governed by the labour procedure (the so-called rito del lavoro); the judgment concluding them is not subject to ordinary appeal (save for recourse to the Supreme Court of Cassation) and may prescribe the necessary measures and award compensation for damages.

One of the Most Significant Administrative Proceedings of 2024 Involved OpenAI

In December 2024, the Italian Supervisory Authority concluded an investigation into OpenAI, identifying several GDPR violations related to the ChatGPT service. These violations included the processing of personal data without a proper legal basis, a lack of transparency towards users, and the absence of effective mechanisms for age verification, which exposed minors to inappropriate content.

As a result, the Italian Supervisory Authority imposed a fine of EUR15 million on OpenAI and mandated a six-month public information campaign across various media to raise awareness about how ChatGPT operates and the rights of data subjects. Furthermore, given that the company established its European headquarters in Ireland during the investigation, the Authority, in compliance with the “one-stop shop” rule, referred the case to the Irish Data Protection Commission (DPC), which became the lead supervisory authority under the GDPR, to continue the investigation regarding any ongoing violations that persisted prior to the establishment of the European headquarters.

Additional Proceedings by the Italian Supervisory Authority

Another series of proceedings conducted by the Italian Supervisory Authority focused on telemarketing and teleselling activities, culminating in the imposition of substantial fines on companies in the telecommunications and energy supply sectors. Within this context, a notable sanction was imposed on Enel Energia but was subsequently annulled by the court. Following legal proceedings, the Rome Tribunal highlighted procedural shortcomings in the Italian Supervisory Authority’s actions, particularly regarding compliance with administrative deadlines.

Regulation (EU) 2024/1689, also known as the AI Act, was adopted on 13 June 2024 and represents the European Union’s first comprehensive legal framework for artificial intelligence. It establishes harmonised rules for the development, deployment, and use of AI systems in the EU, aiming to promote safety, transparency, and compliance with fundamental rights while fostering innovation and market development. The regulation applies to providers, deployers, importers, and distributors of AI systems, classifying them by risk level – minimal, limited, or high-risk – while prohibiting certain practices, such as subliminal manipulation or biometric categorisation in public spaces. Specific obligations are set for high-risk systems, including strict data governance and transparency requirements. The AI Act entered into force on 1 August 2024, with staggered deadlines for compliance: prohibitions take effect after six months, governance and general-purpose AI model rules after one year, and obligations for AI systems integrated into regulated products after three years.

At the same time, the Italian legislature is working on an additional national legislative act which, at the time of writing, has not yet been adopted.

With regard to data protection, European AI regulations expressly emphasise the need to comply with data protection laws, which are therefore applicable in this context as well. The GDPR already establishes a series of provisions – particularly the obligations of transparency, the right not to be subject to fully automated decisions, and the obligation to conduct a Data Protection Impact Assessment (DPIA) – that are suitable for regulating and ensuring an adequate level of protection for data subjects, including in the context of the use of AI tools.

Please see 1.5 AI Regulation.

In recent years, data protection litigation in Italy has experienced significant growth, driven by an increasing awareness of rights among data subjects.

The most frequent disputes involve unlawful data processing, data breaches, and the improper use of personal data by companies and public administrations. Claims for compensation for privacy violations, particularly for non-material damages, are also on the rise.

In this context, the Italian Supervisory Authority is playing a crucial role, imposing substantial fines that influence corporate strategies, seeking to stay ahead of other European authorities, and positioning itself as a leader on issues related to AI (for instance, the proceedings initiated against OpenAI and ChatGPT, which concluded with a sanction in December 2024) and employee monitoring, especially concerning the retention of metadata generated through employees’ use of email tools.

Please see 1.4 Data Protection Fines in Practice.

The national legislation on personal data protection does not currently provide explicit regulation for collective redress mechanisms. However, data subjects may rely on the tools generally available under civil procedure law or those designed to protect their rights as consumers.

The use of IoT services is governed, from a data protection perspective, by the legislation outlined in 1.1 Overview of Data and Privacy-Related Laws, the rights and obligations of which are detailed in 3.3 Rights and Obligations Under Applicable Data Regulation.

In addition, with regard to IoT services, the regulations adopted as part of the EU Data Strategy (in particular, the Data Act and the Data Governance Act), which are outlined in 3.2 Interaction of Data Regulation and Data Protection, also apply and entail certain obligations to share and circulate information consisting of both personal and non-personal data.

The interaction between data protection legislation and that adopted as part of the EU Data Strategy (in particular, the Data Act and the Data Governance Act) forms a regulatory framework aimed at balancing the protection of personal data with the promotion of a data economy based on sharing and innovation. In particular:

  • The GDPR, as outlined in 1.1 Overview of Data and Privacy-Related Laws, governs data protection and provides fundamental principles and rights to protect data subjects.
  • The Data Governance Act (effective from September 2023) promotes a secure and trusted ecosystem for data sharing, creating data spaces and mechanisms for regulated access to data, including public data.
  • The Data Act (phase-in from 2024) regulates the mandatory sharing of data generated by IoT devices and imposes interoperability and access obligations on public and private entities.

In this context, the sharing of personal data – distinct from non-personal data – takes on particular significance. Under the Data Act, such sharing may sometimes constitute a legal obligation and, in other cases, serve a public interest, thereby providing a legitimate basis for processing under the GDPR. However, compliance with key GDPR principles, such as data minimisation, security of processing, and transparency towards data subjects, must always be ensured.

In summary, while the Data Act and the Data Governance Act complement the GDPR by introducing rules to encourage data sharing, they also necessitate a thorough analysis of the legal basis and privacy implications. This includes the implementation of technical measures to separate personal from non-personal data and the use of accountability tools to document assessments – particularly in cases where data sharing is mandatory.
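
Purely by way of illustration, the kind of technical measure referred to above can be sketched as follows in Python; the record structure and the classification of fields as personal or non-personal are assumptions made for the example.

```python
# Illustrative sketch only: partitioning a mixed IoT record into personal and
# non-personal parts, so that each part can be handled under its own regime.
# The field classification below is an assumption for demonstration purposes.

PERSONAL_FIELDS = {"user_id", "gps_position", "driver_name"}
NON_PERSONAL_FIELDS = {"engine_temp", "battery_level", "firmware_version"}

def partition_record(record: dict) -> tuple[dict, dict]:
    """Split a telemetry record into personal and non-personal parts."""
    personal = {k: v for k, v in record.items() if k in PERSONAL_FIELDS}
    non_personal = {k: v for k, v in record.items() if k in NON_PERSONAL_FIELDS}
    return personal, non_personal

record = {"user_id": "u-42", "engine_temp": 87.5, "battery_level": 0.93}
personal, non_personal = partition_record(record)
# non_personal may circulate under Data Act sharing duties; personal data
# requires a GDPR legal basis and appropriate safeguards before sharing.
```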

Data Protection Officer (DPO)

Pursuant to Article 37 of the GDPR, as interpreted by the supervisory authorities’ guidelines, the appointment of a DPO is mandatory for public administrations or where the main activities carried out by the data controller or data processor consist of processing operations which, by virtue of their nature, scope and/or purposes, require regular and systematic monitoring of data subjects on a large scale or the processing on a large scale of special categories of data and personal data relating to criminal convictions and offences. In addition, the European guidelines make it clear that data controllers and data processors must document their assessments as to whether or not to designate a DPO and periodically review this assessment, unless it is evident that an organisation is not required to designate a DPO.

The tasks of the DPO are set out in Article 39 of the GDPR and consist of:

  • informing and advising the controller or the processor and the employees who carry out processing of their obligations pursuant to European data protection legislation;
  • monitoring compliance with European data protection legislation and with the policies of the controller or processor in relation to the protection of personal data, including the assignment of responsibilities, raising awareness and training staff involved in processing operations, and the related audits;
  • providing advice where requested as regards the Data Protection Impact Assessment (DPIA) and monitoring its performance;
  • co-operating with the supervisory authority; and
  • acting as the contact point for the supervisory authority on issues relating to processing and consulting it, where appropriate, on any other matter.

In the performance of their tasks, the DPO shall have due regard to the risk associated with processing operations, taking into account the nature, scope, context and purposes of processing.

Lawfulness of Processing

Any processing of personal data must be based on at least one of the following legal bases provided for in Article 6(1) of the GDPR:

  • the data subject has freely given specific, informed and unambiguous consent to the processing of their personal data;
  • processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
  • processing is necessary for compliance with a legal obligation to which the controller is subject;
  • processing is necessary in order to protect the vital interests of the data subject or of another natural person;
  • processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller; or
  • processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require the protection of personal data, particularly where the data subject is a child. In this case, the data controller is required to carry out an assessment of the legitimate interest pursued in relation to the rights and freedoms of the data subject by conducting a balancing activity that may possibly be challenged by the supervisory authority or the court.

Data Protection by Design and by Default

Both at the time of the determination of the means for new processing and at the time of the processing itself, the data controller shall implement appropriate technical and organisational measures that are designed to implement data protection principles and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects. At the same time, the data controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data that is necessary for each specific purpose of the processing is processed.

Data Protection Impact Assessment

Pursuant to Article 35 of the GDPR, where a type of processing is likely to result in a high risk to the rights and freedoms of natural persons, particularly processing using new technologies, and taking into account the nature, scope, context and purposes of the processing, the controller shall carry out an assessment of the impact of the envisaged processing operations on the protection of personal data, prior to the processing. This assessment is required in particular in the following cases:

  • a systematic and extensive evaluation of personal aspects relating to natural persons based on automated processing, including profiling, and upon which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;
  • processing on a large scale of special categories of data or of personal data relating to criminal convictions and offences; or
  • a systematic monitoring of a publicly accessible area on a large scale.

The supervisory authorities have also identified further criteria for assessing the need for a DPIA: nine risk factors have been identified, and a DPIA is required whenever a processing operation presents two or more of them. This approach was also used to draw up the blacklist of processing operations that the supervisory authorities locally require to be subject to a DPIA.
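
As a purely illustrative aid (not an official tool), the "two or more of nine" screening rule can be expressed as a simple check; the criteria labels below paraphrase the risk factors identified in the European DPIA guidelines, and the example processing operation is hypothetical.

```python
# A minimal, unofficial screening sketch of the "two or more of nine risk
# factors" rule described above. The labels paraphrase the European DPIA
# guidelines; the matched set below describes a hypothetical processing.

NINE_RISK_FACTORS = [
    "evaluation or scoring (including profiling)",
    "automated decision-making with legal or similar significant effect",
    "systematic monitoring of data subjects",
    "sensitive or highly personal data",
    "processing on a large scale",
    "matching or combining datasets",
    "data concerning vulnerable data subjects",
    "innovative use or application of new technologies",
    "processing preventing exercise of a right or use of a service or contract",
]

def dpia_required(matched: set[str]) -> bool:
    """Flag a DPIA when a processing operation presents two or more factors."""
    unknown = matched - set(NINE_RISK_FACTORS)
    if unknown:
        raise ValueError(f"Unrecognised risk factors: {unknown}")
    return len(matched) >= 2

# Hypothetical example: a large-scale employee-monitoring platform.
matched_factors = {
    "systematic monitoring of data subjects",
    "processing on a large scale",
    "data concerning vulnerable data subjects",  # employees
}
print(dpia_required(matched_factors))  # True -> carry out a DPIA
```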

This assessment shall contain at least the following:

  • a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;
  • an assessment of the necessity and proportionality of the processing operations in relation to the purposes;
  • an assessment of the risks to the rights and freedoms of data subjects; and
  • the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with data protection principles.

Where a DPIA indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate such risk, the controller shall consult the supervisory authority prior to processing.

Internal and External Privacy Policies

A corollary of the transparency principle is the obligation for data controllers to inform data subjects about the processing of their personal data, providing them with the information required by Articles 13 and 14 of the GDPR. Where the data is collected directly from the data subject, this information must be provided at the time the data is obtained; where the data is not obtained directly from the data subject, it must be provided within a reasonable period (at the latest within one month) or, if the data is used to communicate with the data subject, at the latest at the time of the first communication. In the latter case, the information does not need to be provided to the data subject when:

  • the data subject already has the information;
  • the provision of such information proves impossible or would involve a disproportionate effort;
  • obtaining or disclosure is expressly laid down by the law to which the controller is subject and which provides appropriate measures to protect the data subject’s legitimate interests; or
  • the personal data must remain confidential subject to an obligation of professional secrecy.

Data Subjects’ Rights

Data subjects have certain rights under the GDPR in order to allow them to have continuous and effective control over their personal data. In particular, data subjects have the right to:

  • request access to their data (by receiving a copy of it) or to all information relating to the processing of their personal data (the purpose of processing, the recipients to whom the data is disclosed, any transfers outside the EEA, etc);
  • obtain the rectification of inaccurate or incomplete personal data;
  • obtain the deletion of their personal data in the cases provided for in Article 17 of the GDPR;
  • obtain the restriction of processing in the cases provided for in Article 18 of the GDPR;
  • obtain their personal data in a structured, commonly used and machine-readable format or to request the transmission of such personal data to another data controller, where the legal basis of the processing is the consent of the data subject or the performance of a contract;
  • object to processing based on legitimate interest or the performance of a task carried out in the public interest or in the exercise of official authority;
  • not be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning the subject or similarly significantly affects them;
  • withdraw the consent given; and
  • lodge a complaint with a supervisory authority.

Anonymisation, De-identification and Pseudonymisation

Anonymisation and pseudonymisation are two processing operations aimed, respectively, at excluding or reducing the possibility of attributing information to a specific data subject. The former makes subsequent re-identification impossible and therefore aims to exclude the applicability of data protection provisions to the resulting output (so-called anonymised data).

The latter is instead a security measure expressly referred to in Article 32 of the GDPR and defined by Article 4(5) of the GDPR as “the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.”
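
A minimal sketch of one common pseudonymisation technique, keyed hashing, is shown below in Python; it assumes that the secret key (the “additional information” referred to in the definition) is stored separately from the dataset, and it is an illustration rather than a prescribed method.

```python
# Illustrative sketch of pseudonymisation in the sense of Article 4(5) GDPR:
# identifiers are replaced with keyed pseudonyms, while the "additional
# information" (the secret key) is kept separately from the dataset.

import hashlib
import hmac

# Assumption for this sketch: in practice the key must be stored apart from
# the pseudonymised data, eg, in a dedicated key-management system.
SECRET_KEY = b"store-me-separately-from-the-data"

def pseudonymise(identifier: str) -> str:
    """Derive a stable pseudonym; re-identification requires the key."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "mario.rossi@example.com", "purchase": "contract-123"}
record["email"] = pseudonymise(record["email"])
# The dataset alone no longer identifies the data subject, but it remains
# personal data under the GDPR, because re-identification is still possible
# for whoever holds the key (unlike anonymisation, which is irreversible).
```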

Automated Individual Decision-Making

As mentioned above, the data subject has the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them. This right does not apply if the decision is:

  • necessary for entering into, or the performance of, a contract between the data subject and a data controller;
  • authorised by the law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or
  • based on the data subject’s explicit consent.

In the first and last cases, the data controller shall implement suitable measures to safeguard the data subject’s rights, freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express their point of view and to contest the decision. In addition, the data controller shall provide the data subject with information about the existence of automated decision-making, including profiling, and with meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

“Injury” or “Harm” in Data Protection Law

From a data protection perspective, it is necessary to pay attention to the risk to the rights and freedoms of natural persons in terms of physical, material or non-material damage, particularly where the processing may give rise to discrimination, identity theft or fraud, financial loss, reputational damage, the loss of confidentiality of personal data protected by professional secrecy, the unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage.

In addition to the obligations dictated by the GDPR, it is also necessary to consider additional regulations adopted as part of the EU Data Strategy.

Data Governance Act (EU Regulation No 2022/868) – DGA

The Data Governance Act (DGA), effective from September 2023, introduces a European regulatory framework aimed at fostering data management based on trust, transparency, and interoperability. Businesses must implement specific measures to comply with these new rules, particularly concerning data sharing and use across various economic sectors.

Firstly, companies must ensure that data management and sharing are transparent and secure. It is essential to clearly inform data subjects about how their data will be used, by whom, and for what purposes, while simultaneously implementing technical and organisational measures to protect the data from breaches.

Key roles of data intermediaries

A significant role is played by data intermediaries, entities that facilitate the sharing of information between businesses or between public and private entities. Those intending to operate as intermediaries must register with an official registry, demonstrate independence and neutrality, and use standard contracts to govern data-sharing operations.

Creation of European data spaces

The DGA also promotes the establishment of European data spaces dedicated to specific sectors such as healthcare, energy, or mobility. Companies participating in these spaces must adopt standardised formats to ensure interoperability and collaborate to facilitate access to data in compliance with sector-specific regulations.

Access to public sector data

Another important innovation concerns access to data held by the public sector. Businesses may request such data for specific purposes but must adhere to clear procedures and use the information solely in accordance with agreed terms.

Voluntary data sharing

In the context of voluntary data sharing, the DGA requires companies to observe principles of fairness and non-discrimination, avoiding unfair practices, particularly toward SMEs. Regarding personal data, the regulation complements the GDPR, making it necessary to adopt measures such as data minimisation, anonymisation, or pseudonymisation.

Documentation and compliance

Finally, companies must maintain accurate documentation of data-sharing procedures and the contracts entered into, notify their activities to the competent authorities, and undergo inspections to ensure compliance.

Data Act

The Data Act, progressively applicable from 2024, introduces a series of significant obligations for businesses, aimed at facilitating data sharing and access while fostering a more competitive and equitable ecosystem. For enterprises, this regulation represents a major shift in data management, presenting new responsibilities as well as opportunities for innovation.

Management of IoT-generated data

A central element of the Data Act is the management of data generated by IoT devices. Manufacturers of connected devices, such as smart appliances or connected vehicles, must ensure that users have access to the data produced by their devices. This means that users – whether individuals or other companies – will have the right to obtain this data in a readable format and to share it with third parties. Manufacturers will be prohibited from imposing restrictions or creating technical barriers that limit the use of this data by other entities.
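
By way of example only, such a data-access obligation might translate into an export function of the kind sketched below in Python; the function name, data fields and in-memory store are assumptions made for illustration.

```python
# Hypothetical sketch of a Data Act-style export function: a manufacturer
# returns the data generated by a user's device in a machine-readable format.
# The function name, fields and in-memory store are illustrative assumptions.

import json

DEVICE_DATA = {  # stand-in for the manufacturer's telemetry store
    "device-001": [
        {"ts": "2025-01-15T10:00:00Z", "metric": "energy_kwh", "value": 1.2},
        {"ts": "2025-01-15T11:00:00Z", "metric": "energy_kwh", "value": 0.9},
    ]
}

def export_device_data(device_id: str) -> str:
    """Return the data generated by a device as JSON, ready to be handed to
    the user or to a third party designated by the user."""
    readings = DEVICE_DATA.get(device_id, [])
    return json.dumps({"device_id": device_id, "readings": readings}, indent=2)

print(export_device_data("device-001"))
```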

Mandatory data sharing with public authorities

In exceptional circumstances, such as public health emergencies, natural disasters, or energy crises, companies may be required to provide data to public authorities to address the situation. This data sharing must be transparent, limited to specified purposes, and prevent unauthorised use of the data.

Technical requirements for compatibility and interoperability

Businesses will be required to ensure the compatibility and interoperability of data. This entails adopting standardised formats that allow different systems to communicate and share information seamlessly. Companies must invest in technological infrastructure that enables efficient data management across various platforms.

Privacy and security safeguards

Respect for privacy and data security remains a top priority. Regarding personal data, the regulation complements the GDPR, obliging businesses to process data in compliance with legal bases and to apply measures such as pseudonymisation or anonymisation to protect users’ information. Ensuring data security during management and sharing is equally critical, with obligations to implement systems that prevent unauthorised access or breaches.

Contractual transparency and fairness

Companies must ensure that contracts governing data access are clear and non-discriminatory, avoiding abusive or excessively burdensome clauses. Special attention is given to promoting data access for SMEs, which often face challenges in negotiating fair terms with larger corporations.

Equitable sharing of benefits

The Data Act also emphasises the fair distribution of benefits derived from data usage. This principle seeks to prevent monopolistic scenarios and promote equitable opportunities linked to data utilisation. Businesses are required to maintain clear and detailed documentation of their data management and sharing procedures, ensuring they can demonstrate compliance during inspections or audits by competent authorities.

Please see 1.2 Regulators.

In Italy, the general rule is set out by Article 122 of the Privacy Code (which transposes Directive 2002/58/EC), under which all cookies – and other similar tracking tools – other than those strictly necessary for the functioning of the website may be installed on the users’ devices only with their consent.

In this regard, as clarified by the Guidelines adopted by the Italian Supervisory Authority in 2021, it is essential that consent for the use of cookies is collected in compliance with the principles established by the GDPR. Accordingly, it must be preceded by a multi-layered notice (cookie banner) that provides information about the cookies and the related personal data processing and allows the user to freely accept or refuse the use of cookies, as well as to change their decision at any time.

A specific consideration applies to analytical cookies, which may be treated as necessary cookies (and therefore not require consent) only when (i) IP anonymisation features are enabled; (ii) the use of analytical cookies is strictly limited to the production of aggregated statistics; and (iii) they are used solely in connection with a single website or mobile application, such that they do not allow the tracking of users’ navigation across different websites or applications.
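
Condition (i) can be illustrated with a short Python sketch: the function below truncates the final portion of an IP address before storage, a common way of implementing IP anonymisation (the truncation lengths shown are typical choices, not a regulatory requirement).

```python
# Sketch of the IP-anonymisation condition for consent-exempt analytics:
# the final portion of the address is removed before storage, so that visits
# can be counted without singling out an individual device.

import ipaddress

def anonymise_ip(ip: str) -> str:
    """Zero the last octet of IPv4 addresses (or the last 80 bits of IPv6)."""
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymise_ip("93.184.216.34"))            # -> 93.184.216.0
print(anonymise_ip("2001:db8::8a2e:370:7334"))  # -> 2001:db8::
```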

In Italy, the general rule is set out by Article 130(1) and (2) of the Privacy Code (which transposes Directive 2002/58/EC), under which commercial and promotional communications by email, fax, telephone and similar means of communication require the prior consent of the user (natural or legal person). However, Article 130(4) provides for an exception to the requirement of consent, allowing for the processing of the email address provided by the data subject in the context of the sale of a product or a service for the purpose of sending commercial communications aimed at the direct sale of products or services similar to those already purchased, provided that the data subject has been adequately informed and does not refuse such use, either initially or on the occasion of subsequent communications.

With specific regard to telephone marketing activities, Article 130(3-bis) provides that data controllers may lawfully contact all users who have not objected to receiving commercial communications by telephone by registering in the Register of Opposition. In this sense, pursuant to Law No 5/2018, users may enrol in the register in order to prevent subsequent communications and, at the same time, withdraw any consent previously given to the processing of their personal data for telephone marketing purposes. A data controller intending to carry out telemarketing activities is therefore required to consult the register at least every 15 days and, in any event, before the start of a new campaign.

On the other hand, online marketing may consist primarily of an activity carried out through the use of profiling and advertising cookies (see 4.1 Use of Cookies), or of behavioural advertising and targeting activities carried out through the use of external databases (especially those of social networks). In this second case, the case law of the Court of Justice of the European Union and the interpretation provided by the EDPB in Guidelines 8/2020 clarify the need to carry out the activity on the basis of the prior consent of the data subject and, as a general rule, to qualify the company and the social network as joint controllers of the processing, whose relationship must be regulated under Article 26 of the GDPR.

Processing carried out in the employment context is one of the areas in which the GDPR defers to national law, without prejudice to certain common guidelines and orientations first shared by the WP29 and then by the EDPB, specifically regarding the vulnerable position of the employee (data subject) vis-à-vis the employer (data controller), a situation that results in a presumption that any consent requested from the employee is invalid due to a lack of freedom.

Managing the Selection Process and the Employment Relationship

In these phases, the employer’s activities must respect – more than ever – the principle of minimisation, ensuring that only personal data that is essential for the performance of work duties and that, to a large extent, is governed by labour law provisions (eg, Article 8 of Law No 300/1970 or Legislative Decree No 81/2008) is requested from the candidate or employee.

Remote Monitoring of Workers

Without prejudice to a general prohibition on the use of instruments (including AI-based ones) to monitor employee activities, this matter is governed by Article 4 of Law No 300/1970, which legitimises the use of such tools solely for organisational and production needs, workplace safety and the protection of company assets (eg, cybersecurity purposes). In this case, without prejudice to instruments that are essential and prearranged for the performance of work duties, the use of instruments for remote monitoring is permitted only if doing so is:

  • agreed with the trade union representatives present in the company; or
  • authorised by the competent Labour Inspectorate in the absence of trade union representatives in the company or in the event of there being no agreement.

In these cases, the employee (as data subject) must be provided with additional, detailed information beyond what is normally required under Articles 13 and 14 of the GDPR; this can be done, for example, by adopting an internal regulation on the use of IT tools, which also informs employees of the possible controls and their purposes.

However, although the agreement with trade union representatives or the administrative authorisation is sufficient to legitimise the activity from a labour law perspective, this does not exempt the employer from complying with the principles on the protection of personal data (eg, the principle of minimisation). In this sense, the indiscriminate monitoring of the URLs visited by employees is unlawful because, for security purposes, the same results can be achieved by implementing filters that block access to potentially risky websites. On this point, see also the Guidelines adopted by the GPDP on 1 March 2007.
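
To illustrate the minimisation-friendly alternative described above, the following Python sketch (with an assumed blocklist) blocks access to risky destinations without creating any per-employee log of visited URLs.

```python
# Minimal sketch of the minimisation-friendly alternative described above:
# rather than logging every URL an employee visits, access to a blocklist of
# risky destinations is refused, and permitted browsing is not recorded.
# The blocklist contents are an assumption made for illustration.

from urllib.parse import urlparse

RISKY_DOMAINS = {"malware.example", "phishing.example"}  # assumed blocklist

def allow_request(url: str) -> bool:
    """Return False for risky destinations; permitted traffic is not logged,
    so no per-employee browsing history is created."""
    host = urlparse(url).hostname or ""
    return not any(host == d or host.endswith("." + d) for d in RISKY_DOMAINS)

print(allow_request("https://malware.example/payload"))  # False -> blocked
print(allow_request("https://intranet.local/home"))      # True  -> allowed
```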

Whistle-Blowing and the Transparency Decree

The national legislation on whistle-blowing was updated to transpose Directive (EU) 2019/1937 through Legislative Decree No 24/2023, which harmonised the rules applicable to the private and public sectors. With regard to the protection of personal data, the general principles dictated by the GDPR remain applicable, including the obligations to set up reporting and management processes in compliance with the principles of privacy by design and by default and with the need to ensure the confidentiality of the reporting person (which renders, for instance, ordinary email channels inadequate), to carry out a DPIA on the processing, and to train and instruct the people who access the data and manage the reports.

Further obligations (mainly informative) are also imposed by Legislative Decree No 104/2022 (the so-called “Transparency Decree”), which prescribes the need to carry out a DPIA and to provide additional information to data subjects in the event of “the use of automated decision-making or monitoring systems designed to provide indications relevant to the recruitment or assignment, management or termination of the employment relationship, the assignment of tasks or duties, as well as indications affecting the monitoring, assessment, performance and fulfilment of contractual obligations of workers.”

The value of personal data and consent databases as a corporate asset is often underestimated in corporate transactions. In this context, the main activity may consist of verifying the lawfulness and correctness of the processing of personal data that makes up a company's databases; this can be done by verifying the correctness and completeness of the information that the data controller had to provide to the data subjects pursuant to Articles 13 and 14 of the GDPR, and by examining the evidence of compliance with this information notice obligation.

Furthermore, where the processing of personal data is based on consent (eg, in the case of processing for promotional purposes or in the context of scientific research), it is essential to verify the correctness and ability to prove the consents collected from the data subjects and the effective capacity of the systems to receive any requests for withdrawal and/or opposition.

European data protection legislation requires that any transfer of personal data that is undergoing processing or is intended for processing after transfer to a third country or to an international organisation (including for onward transfers of personal data from the third country or an international organisation to another third country or to another international organisation) shall take place only if the level of protection of natural persons guaranteed by the GDPR is not undermined.

This therefore requires an examination of the legal provisions applicable to the third country or international organisation in order to understand the actual level of protection of personal data, taking into account the elements specified in Article 45(2) of the GDPR. This analysis is carried out by the European Commission when it adopts the adequacy decisions referred to in Article 45 of the GDPR (decisions legitimising the transfer of personal data to the country or organisation benefiting from it).

In the absence of an adequacy decision, as clarified by the Court of Justice of the European Union in its judgment of 16 July 2020 (the “Schrems II” judgment), this assessment is instead the responsibility of the data controller or data processor who is intending to export the personal data. In such a case, where the law in force in the third country or applicable to the international organisation does not guarantee an adequate level of protection of personal data, the transfer may only be carried out subject to the adoption of additional security measures suitable to mitigate the risks to the rights and freedoms of the data subjects (eg, encryption of the data prior to the transfer in order to exclusively share encrypted data).
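
As a simplified illustration of the encryption measure mentioned above, the following Python sketch (using the widely available cryptography package; key management is deliberately simplified) shows data being encrypted before export so that only ciphertext leaves the exporter's systems.

```python
# Hedged sketch of the supplementary measure mentioned above: data is
# encrypted before export so that only ciphertext is shared with the
# importer, while the key remains with the EU-based exporter. Requires the
# third-party "cryptography" package (pip install cryptography); real key
# management would be far more elaborate than shown here.

from cryptography.fernet import Fernet

key = Fernet.generate_key()  # must remain with the EU-based exporter
cipher = Fernet(key)

payload = b'{"customer_id": "u-42", "country": "IT"}'
ciphertext = cipher.encrypt(payload)  # only this is sent to the importer

# The importer (or a foreign authority) cannot read the data without the
# key; the exporter can still decrypt on its own systems:
assert cipher.decrypt(ciphertext) == payload
```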

Notification to the supervisory authority is only required in the case of transfers pursuant to the second subparagraph of Article 49(1) of the GDPR. This derogation applies when no other means can be used to legitimise the transfer and requires that the transfer:

  • is not repetitive;
  • concerns a limited number of data subjects;
  • is necessary for the purposes of compelling legitimate interests pursued by the controller which are not overridden by the interests or rights and freedoms of the data subject;
  • is carried out subject to appropriate data protection safeguards; and
  • is notified to the supervisory authority by the data controller.

European legislation on the protection of personal data does not provide for any obligation to store data within a specific member state or the EEA, aiming, on the contrary, to regulate and facilitate the free movement of such data. In the case of transfers of data to third countries, however, the provisions of Chapter V of the GDPR apply in order to guarantee an adequate level of protection of personal data (see 5.1 Restrictions on International Data Transfers).

There are no “blocking” statutes in the European data protection legislation in addition to those described in the previous sections concerning the transfer of data outside the EEA.

In 2024, the European Commission concluded the review process of 11 adequacy decisions regarding the transfer of personal data, confirming their validity. In this regard, the decisions remain effective, and it is permissible to continue the free transfer of personal data to Andorra, Argentina, Canada, the Faroe Islands, Guernsey, the Isle of Man, Israel, Jersey, New Zealand, Switzerland, and Uruguay.

ICTLC – ICT Legal Consulting

Via Borgonuovo 12
20121 Milan
Italy

+39 028 424 7194

+39 027 005 121 01

info.legal@ictlc.com
www.ictlegalconsulting.com

Trends and Developments


Authors



ICT Legal Consulting (ICTLC) is an international law firm that offers strategic support in legal compliance (privacy, IP and TMT) and assists in drafting and developing governance, organisation, management, security and control models for data-driven organisations. The firm has successfully assembled a close-knit team of more than 80 qualified professionals specialising in the fields of ICT, privacy, data protection, cybersecurity, and IP law. ICTLC has offices in Italy (Milan, Bologna, and Rome), the Netherlands (Amsterdam), Greece (Athens), France (Paris), Spain (Madrid), Finland (Helsinki), Sweden (Gothenburg), Nigeria (Lagos), Kenya (Nairobi), Saudi Arabia (Riyadh) and Australia (Melbourne). It has also established partnerships with law firms and professionals in 56 other countries, giving clients access to the most qualified professionals who are most suited to their specific needs.

Data Protection Enforcement Trends in Italy

In general

Article 51 GDPR provides that “[e]ach Member State shall provide for one or more independent public authorities to be responsible for monitoring the application of this Regulation, in order to protect the fundamental rights and freedoms of natural persons in relation to processing and to facilitate the free flow of personal data within the Union”. Under Article 58 GDPR, the supervisory authority is granted a wide range of investigative, corrective, authorisation and advisory powers, including, notably, the power to impose administrative fines and a temporary or definitive ban on the processing of personal data.

In Italy, the competent supervisory authority is the Garante per la Protezione dei Dati Personali (the so-called Garante or GPDP), whose decisions can be challenged before the ordinary tribunal and, at last instance, before the Supreme Court of Cassation.

The GPDP is widely considered one of the most active and influential supervisory authorities, having issued, as at 31 December 2024, at least 484 publicly available enforcement actions, amounting to approximately EUR301,670,797 in sanctions. This places Italy second only to Spain in terms of the number of sanctions issued: while the Spanish Supervisory Authority has issued at least 826 sanctions, their total value amounts to approximately EUR97,931,280. The GPDP has applied fines and exercised its corrective powers – such as the imposition of processing bans – across a broad range of GDPR-related matters and industry sectors. Nevertheless, there are certain aspects of the GDPR on which the Authority appears to focus its attention more frequently than others.

Trends in enforcement

Traditionally, the GPDP has been especially concerned with combating unlawful telemarketing practices, in particular as regards transparency and consent requirements, as well as the engagement of third parties (such as call centres) as data processors without the necessary data protection safeguards, including the performance of audits by the data controller. In this field, the Garante has issued some of its highest sanctions ever published, such as those against Eni Gas e Luce (issued on 11 December 2019 for a total amount of EUR11,500,000), Tim (issued on 15 January 2020 for a total amount of EUR27,800,000) and Sky Italia (issued on 16 September 2021 for a total amount of EUR3,200,000). More recently, in July 2024, the GPDP ordered Hera (an energy provider) to pay a fine of EUR5,000,000 for violations in the processing of personal data of over 2,300 customers due to the insufficient implementation of safeguards by Hera’s data processors, which in some cases had led to the activation of energy supply contracts without the knowledge of or consent from the users.

During the last few years, the Garante has also focused its attention on the protection of the data privacy rights of children, as is apparent from the enforcement actions undertaken against the popular social network TikTok concerning age verification requirements. On 22 January 2021, following the highly publicised death of a ten-year-old girl from Sicily participating in a “blackout” challenge, the GPDP imposed an immediate limitation on the data processing concerning users “whose age could not be established with full certainty so as to ensure compliance with the age-related requirements”. On 3 February 2021, the Italian DPA noted that, following the enforcement action, TikTok committed to fulfilling GDPR age-verification requirements by taking a number of actions, including:

  • removing accounts belonging to users under thirteen years of age;
  • deploying an AI-based system to verify users’ ages;
  • launching an information campaign to raise awareness;
  • adding an in-app button for reporting users under thirteen;
  • improving the language of the privacy notice for users under eighteen; and
  • doubling the number of Italian platform content moderators.

Another field where the GPDP has recently stepped up its enforcement actions is that of video surveillance. In October 2023, the GPDP reprimanded an individual for installing a home surveillance system with cameras capturing not only their own apartment, but also a public area (including a public park). According to the GPDP, the placement of the cameras violated the principles of lawfulness and data minimisation, as the data controller failed to show a legitimate interest capable of justifying the recording of public areas and of conversations through the audio system. The decision is also interesting because it clarifies that, while domestic surveillance systems are generally exempt from the GDPR under the household exemption, such exemption does not apply when public areas or third parties’ properties are involved. The Garante issued a mere reprimand against the data controller, considering that the individual promptly rectified the situation by replacing the camera and redirecting it solely towards the entrance of their home.

In June 2023, the GPDP issued a sanction of EUR20,000 against an Italian employer for having, inter alia, installed a video surveillance system on its premises without the prior agreement of the trade union representatives or the authorisation of the competent labour authority, as required by Article 4 of Law No 300/1970 (the so-called Workers’ Statute). Moreover, no privacy notice had been drafted and made available to the workers. More recently, in July 2024, the GPDP ordered the Turin municipality to share further information relating to the use of AI-powered “smart CCTVs”, which would allegedly help local police forces to understand whether their intervention is needed in emergency situations, in order for the GPDP to be able to investigate the system’s compliance with the GDPR.

Lastly, a field on which the Garante has recently focused its attention is that of Artificial Intelligence, as discussed in the following section.

The Italian Data Protection Authority at the Forefront of Artificial Intelligence Enforcement

During the last couple of years, the GPDP has taken noteworthy initiatives in the context of Artificial Intelligence (AI), including by means of enforcement actions against providers of AI systems. As a result, the Garante has established itself as one of the most active European Union supervisory authorities on the regulation of AI vis-à-vis the GDPR and Italian Data Protection Law. Below, we provide a brief overview of the most important initiatives undertaken by the GPDP in the AI field.

Enforcement action against ClearviewAI

On 9 March 2022, the GPDP fined the US-based company Clearview AI EUR20 million after finding it had carried out processing activities concerning biometric data of persons residing within Italian territory.

The GPDP inquiry into Clearview AI revealed that the company processed personal data, including biometric and geolocation information, without a proper legal basis. In particular, the legitimate interest invoked by the US-based company as the relevant legal basis was not suitable for the processing of biometric data, which qualifies as a special category of personal data under Article 9 GDPR, so that its processing is generally prohibited save where the specific exceptions provided by Article 9(2) GDPR apply. Moreover, Clearview AI violated several fundamental principles of the GDPR, such as lacking transparency in adequately informing users, exceeding the intended purposes for processing users’ data made available online, and neglecting to establish a data storage period. Consequently, Clearview AI was found to have infringed the freedoms of data subjects, including their rights to privacy, personal data protection and non-discrimination.

Through web scraping, Clearview AI has amassed a database containing billions of facial images sourced globally from public web outlets like media platforms, social media, and online videos. By processing such personal data by means of advanced algorithms, Clearview AI has been able to provide a refined search service allowing the creation of profiles based on biometric data extracted from these images. These profiles can then be augmented with additional information, such as image tags, geolocation, and so on.

As a result of these violations, the GPDP imposed a EUR20 million fine on Clearview AI and mandated the deletion of data pertaining to individuals residing in Italy. The authority also prohibited any further collection and processing of data through Clearview AI’s facial recognition system. Additionally, Clearview AI was instructed by the Italian SA to appoint a representative in the EU pursuant to Article 27 GDPR, facilitating the exercise of data subject rights, alongside (or in lieu of) the US-based controller.

Enforcement action against OpenAI

In late March 2023, only a few months after the service’s launch, the Garante identified several violations of the GDPR and Italian Data Protection Law regarding the famous and widespread generative AI system “ChatGPT”, and imposed a temporary limitation on the processing of Italian users’ personal data (effectively a ban on the service in Italy). According to the GPDP, OpenAI failed to demonstrate the presence of a valid legal basis for collecting and processing personal data for the purposes of training ChatGPT, and the information provided to users and individuals whose data was used for training the generative AI system was incomplete. Moreover, individuals whose data was used for training the AI system had no easy way to exercise their data protection rights, including the rights of access, rectification and objection. Interestingly, the GPDP also noted that ChatGPT’s responses to users’ prompts often deviated from reality (so-called hallucinations), thereby violating the accuracy principle established by the GDPR where such responses concerned another individual. The ChatGPT case underscores, once again, the GPDP’s scrutiny of data processing concerning children: in this respect, the authority questioned whether the platform’s outputs might result in inappropriate responses for children, even though the service is purportedly aimed at users above the age of thirteen, as stated in OpenAI’s terms of service. As a result, the GPDP required OpenAI to implement a suitable age verification system.

On 28 April 2023, the Garante lifted the ban, finding that the measures adopted by OpenAI adequately addressed the data protection issues raised by the authority and which underpinned the ban. Such measures included updating ChatGPT’s privacy policy, implementing adequate age-verification systems, and implementing measures to enable individuals to exercise their data protection rights. The Garante, however, reserved its powers to fully investigate the underlying shortcomings that have led it to issue the ban in the first place and, if necessary, issue any relevant sanction in a separate decision after having fully investigated the facts.

The final decision was issued on 2 November 2024. In its ruling, the GPDP imposed a EUR15 million fine on OpenAI and ordered the company to carry out a six-month public awareness campaign, aimed at educating the public about the collection of personal data for training its generative artificial intelligence model GPT and at informing individuals about their data protection rights (such as the right to opt out).

The fine was based on the following data protection infringements by OpenAI:

  • processing personal data without having previously identified a suitable legal basis;
  • lack of an age-verification mechanism (when the service launched, there were no effective measures to verify users’ age);
  • failure to carry out the awareness campaign in the manner prescribed by the GPDP in its 2023 provisional decision;
  • provision of insufficient information in ChatGPT’s privacy notice, in particular the failure to inform data subjects about how their personal data was processed, including its use for training the AI model; and
  • violation of the accuracy principle, as ChatGPT’s responses often contained inaccurate information about individuals.

In setting the amount of the fine, the GPDP took into account OpenAI’s co-operative approach, acknowledging the implementation of several measures it had requested. OpenAI was given 30 days to pay the fine and 60 days to submit a detailed plan for the required awareness campaign to the GPDP.

The Garante-OpenAI saga has attracted widespread attention not only within the Italian data protection community, but also among the general public in Italy and across Europe, given that it is one of the first enforcement initiatives to target a generative AI system. The GPDP’s intervention highlights the GDPR’s potential to regulate specific aspects of generative AI and underscores the Garante’s willingness to lead the way in this respect.

Finally, given that OpenAI established its EU headquarters in Ireland during the investigation, the Irish Data Protection Commission (DPC) has become the lead supervisory authority for OpenAI. In line with the GDPR’s one-stop-shop mechanism, the GPDP has forwarded the case documents to the DPC, which will now be responsible for investigating any further data protection infringements involving OpenAI.

Inquiry and guidance on data mining for the training of AI systems

On 22 November 2023, the GPDP launched a fact-finding inquiry, aimed at both public and private websites, to assess the implementation of effective security measures against the large-scale collection of personal data (so-called web scraping) for training third-party AI algorithms.

The inquiry covers all entities, acting as data controllers, that are established in Italy or provide services in Italy and that make personal data freely accessible online, including to the “spiders” used by AI algorithm providers. It was prompted by the widespread practice of AI platforms using web scraping to gather large amounts of information, including personal data, from websites operated by public and private entities for specific purposes, such as news reporting and administrative transparency. The Garante invited trade associations, consumer groups, experts, and academic representatives to share their comments and contributions on security measures against the extensive collection of personal data for algorithm training.

Following the investigation, on 7 June 2024 the GPDP issued a guidance document on how to protect personal data published online from web scraping carried out by third parties for the purpose of training generative artificial intelligence models. The document provides data controllers who publish personal data online (eg, website publishers) with some recommended security measures to prevent or, at least, hinder web scraping. These measures are not mandatory: data controllers have a duty to autonomously assess, based on the principle of accountability, whether to implement them, taking into account elements such as the latest technology developments and the costs of implementation.

In the guidance document, the Garante suggests a number of concrete measures to be adopted, including:

  • the creation of reserved areas, accessible only upon registration, so as to remove data from public availability;
  • the inclusion of anti-scraping clauses in the terms of service of websites;
  • the monitoring of traffic to web pages, so as to identify any abnormal flows of incoming and outgoing data; and
  • the implementation of specific measures against bots, including the technological solutions made available by the web-scraping companies themselves (eg, intervening on the robots.txt file, as illustrated below).
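
By way of illustration only, a website operator wishing to follow the Garante’s suggestion could add directives along the following lines to the robots.txt file served at the root of its domain. This is a minimal sketch: the user-agent tokens shown (GPTBot for OpenAI’s crawler, CCBot for Common Crawl, Google-Extended for Google’s AI-training crawler) are those published by the respective providers at the time of writing and may change over time, and compliance with robots.txt is voluntary on the crawler’s part, so this measure hinders rather than prevents scraping.

    # Disallow known AI-training crawlers from the entire site
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    # All other crawlers (eg, ordinary search engines) remain allowed
    User-agent: *
    Allow: /

Operators should verify the current crawler tokens against each provider’s documentation, and may wish to combine this measure with the others listed above, since robots.txt alone offers no technical enforcement.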

It is interesting to note that most of these measures are in line with provisions stemming from other laws that regulate web scraping from different angles, such as the EU AI Act and the EU Copyright Directive. For example, intervening on the robots.txt file in order to prevent or curtail web scraping is contemplated by the current draft of the Code of Practice for general-purpose AI (Article 56 AI Act), while the inclusion of anti-scraping clauses is aligned with the “text and data mining” exception under Article 4 of the Copyright Directive. This shows that the regulation of web scraping – a fundamental prerequisite for AI model training – brings an increasing degree of convergence between privacy and data protection, intellectual property, and AI regulation.

In line with the above, in November 2024 the GPDP issued a warning to the Gedi Group, one of Italy’s biggest newspaper publishers, over its recent agreement to share Gedi’s newspaper archives with OpenAI for the training of generative AI systems. In particular, the GPDP reminded Gedi Group that, where the archives to be disclosed to OpenAI contain personal data, such disclosure must take place in compliance with the GDPR, including as regards the right of each data subject to object to the disclosure.

These initiatives further showcase the GPDP’s willingness and ability to establish itself as a leading European regulator of AI systems from a data protection perspective.

Standardised Set of Icons Approved by the Garante for Clearer Privacy Notices

Under Article 12 GDPR, the data controller is required to provide data subjects with the necessary information on the processing of their personal data “in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child”. To this end, Article 12(7) adds that such information “may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner a meaningful overview of the intended processing” and that, “[w]here the icons are presented electronically they shall be machine-readable”.

Although not mandatory, the use of icons under the GDPR is therefore a good practice, which can help data controllers to achieve the high transparency requirements set by the GDPR, proactively demonstrating compliance in light of the principle of accountability. This is especially true in the case of complex data processing operations and/or where the information is specifically addressed to a child, as the use of standardised icons can help enhance the clarity and overall transparency of privacy notices.

Against this background, in March 2021 the GPDP launched a contest titled “Easy privacy information via icons? Yes, you can!” aimed at stimulating the development of a standardised set of icons by software developers, tech professionals, experts, lawyers, designers, university students, and anyone interested in the topic. On 15 December 2021, the Garante published on its website the three sets of icons deemed to be most effective, based on the following criteria: concept (which includes the aspects of effectiveness and conciseness); visual (graphics, readability, clarity); originality; and inclusiveness (gender equality, non-discrimination). The three winning projects are currently available on the GPDP’s website and can be freely used by any data controller who wishes to render its privacy notices more transparent.

ICTLC – ICT Legal Consulting

Via Borgonuovo 12
20121 Milan
Italy

+39 028 424 7194
+39 027 005 121 01

info.legal@ictlc.com
www.ictlegalconsulting.com
