Data Protection & Privacy 2025 Comparisons

Last Updated March 11, 2025

Contributed By ICT Legal Consulting

Law and Practice

Authors



ICT Legal Consulting (ICTLC) is an international law firm that offers strategic support in legal compliance (privacy, IP and TMT) and assists in drafting and developing governance, organisation, management, security and control models for data-driven organisations. The firm has successfully assembled a close-knit team of more than 80 qualified professionals specialising in the fields of ICT, privacy, data protection, cybersecurity, and IP law. ICTLC has offices in Italy (Milan, Bologna, and Rome), the Netherlands (Amsterdam), Greece (Athens), France (Paris), Spain (Madrid), Finland (Helsinki), Sweden (Gothenburg), Nigeria (Lagos), Kenya (Nairobi), Saudi Arabia (Riyadh) and Australia (Melbourne). It has also established partnerships with law firms and professionals in 56 other countries, giving clients access to the most qualified professionals who are most suited to their specific needs.

The Italian regulatory framework on the protection of personal data and privacy is dictated by Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, repealing Directive 95/46/EC (GDPR). Matters not addressed by the GDPR are regulated by Legislative Decree No 196/2003 (the “Privacy Code”).

Further detailed rules are contained in Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector, as transposed into Italian law by the Privacy Code.

With reference to the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection and prosecution of criminal offences or the execution of criminal penalties, the regulatory framework is instead governed by EU Directive 2016/680, transposed into the Italian legal system through Legislative Decree No 51/2018.

Finally, other specific indications and/or interpretations are contained in the decisions, recommendations and guidelines issued by the national supervisory authorities and the European Data Protection Board (eg, in Italy, the requirements for system administrators).

As mentioned in 1.1 Overview of Data and Privacy-Related Laws, supervisory authorities have limited regulatory power, exercised mainly through the adoption of guidelines and opinions interpreting legal provisions. However, supervisory authorities (in Italy, the Garante per la Protezione dei Dati Personali, or GPDP) also have powers to monitor compliance with data protection legislation, and benefit from investigative powers that include, inter alia, the possibility of requesting information from data controllers and data processors and conducting on-site checks and inspections.

In this context, the supervisory authority may request access to the documentation adopted (privacy policy, consents, internal policies and procedures, records of processing activities, etc) and to systems and databases. The inspections of the GPDP may be triggered by the authority itself (based on an inspection plan adopted and published every six months, or following notification of a personal data breach), or by data subjects or other third parties (in the case of complaints or reports). Any decisions that are eventually adopted are published.

Data protection legislation may also be applied by the courts in the case of appeals lodged by individuals (particularly in the case of claims for damages or appeals against decisions of the supervisory authority).

As mentioned in 1.2 Regulators, GPDP inspections may be triggered by the authority itself (on the basis of an inspection plan adopted and published every six months, or following the notification of a personal data breach) or by data subjects or other third parties (in the case of complaints or reports).

Preliminary Investigation

In the event of a complaint by a data subject, the GPDP shall verify the correctness and completeness of the complaint and, if necessary, grant the complainant a period of time to amend it, normally not exceeding 15 days. In the event of a correct and complete complaint (or of an investigation initiated on the GPDP’s own initiative, such as following the notification of a personal data breach), the GPDP shall start a preliminary investigation during which the documentation received is examined and/or further information is requested from the data controller or data processor.

In that scenario, inspections may also be carried out, during which the entity under inspection may be assisted by its trusted advisers and may reserve the right to produce documentation not immediately available within a reasonable period (as a rule, not exceeding 30 days). A record of the activity carried out shall also be drawn up, with particular reference to the statements made and the documents acquired, and a copy shall be given to the entity under inspection.

Closing of the Preliminary Investigation and Archiving

At the end of the preliminary investigation, the competent department within the GPDP may conclude its examination of the complaint by archiving it, when:

  • the issue examined does not appear to be related to the protection of personal data or the tasks entrusted to the GPDP;
  • there is no evidence of a breach of the relevant data protection regulations;
  • the claim set out in the complaint is excessive, due in particular to its specious or repetitive character; or
  • the issue raised by the complaint has already been examined by the GPDP.

In the case of a complaint, feedback is provided to the applicant, briefly stating the reasons why no action is taken.

Initiation of Proceedings

If the matter is not dismissed following the preliminary investigation, the competent department shall initiate proceedings for the adoption of measures by the board of the GPDP, by means of its own communication to the data controller and/or data processor. The communication shall contain:

  • a concise description of the facts and alleged breaches of the relevant data protection rules, as well as the relevant sanctioning provisions;
  • an indication of the competent organisational unit where a copy of the investigative documents may be inspected and extracted; and
  • the indication that, within 30 days of receipt of the notice, it is possible to send defence papers or documents to the GPDP, and to request a hearing before the GPDP.

Right of Defence

The addressee of the notice may exercise the right of defence by submitting written statements and documents within 30 days from the date of notification of the communication, and by giving personal testimony regarding the facts set out in the notice, where requested.

The addressee of the notice may request a short extension, providing specific and duly substantiated reasons. The extension shall normally not exceed 15 days and may be granted according to criteria of proportionality, the operational and dimensional characteristics of the addressee, and the complexity of the matter under examination. The addressee of the notice may also request a hearing before the GPDP.

Failure to submit written counterarguments or a request for a hearing shall not prejudice the continuation of the proceedings.

Decision

Where necessary, the board of the GPDP, by its own resolution, shall adopt the corrective and sanctioning measures referred to in Article 58(2) of the GDPR (in the case of an administrative pecuniary sanction, the quantum is calculated on the basis of the criteria indicated by Article 83 of the GDPR). The decision is notified to the parties by the department, service or other organisational unit that has supervised the preliminary investigation.

Appeal Against Measures of the GPDP

Under penalty of inadmissibility, an appeal against the measures adopted by the GPDP must be lodged within 30 days from the date of communication of the decision or within 60 days if the appellant resides abroad, with the ordinary court of the place where the data controller resides, or with the court of the place of residence of the data subject. At the time of the appeal, it is also possible to request the court to suspend the enforceability of the contested decision.

The judicial procedure follows the labour-law procedure (the so-called rito del lavoro), and the judgment concluding the proceedings is not subject to appeal; the court may prescribe the necessary measures and award compensation for damages.

One of the Most Significant Administrative Proceedings of 2024 Involved OpenAI

In December 2024, the Italian Supervisory Authority concluded an investigation into OpenAI, identifying several GDPR violations related to the ChatGPT service. These violations included the processing of personal data without a proper legal basis, a lack of transparency towards users, and the absence of effective mechanisms for age verification, which exposed minors to inappropriate content.

As a result, the Italian Supervisory Authority imposed a fine of EUR15 million on OpenAI and mandated a six-month public information campaign across various media to raise awareness about how ChatGPT operates and the rights of data subjects. Furthermore, given that the company established its European headquarters in Ireland during the investigation, the Authority, in compliance with the “one-stop shop” rule, referred the case to the Irish Data Protection Commission (DPC), which became the lead supervisory authority under the GDPR, to continue the investigation into any violations still ongoing after the establishment of the European headquarters.

Additional Proceedings by the Italian Supervisory Authority

Another series of proceedings conducted by the Italian Supervisory Authority focused on telemarketing and teleselling activities, culminating in the imposition of substantial fines on companies in the telecommunications and energy supply sectors. Within this context, a notable sanction was imposed on Enel Energia but was subsequently annulled by the court. Following legal proceedings, the Rome Tribunal highlighted procedural shortcomings in the Italian Supervisory Authority’s actions, particularly regarding compliance with administrative deadlines.

Regulation (EU) 2024/1689, also known as the AI Act, was adopted on 13 June 2024 and represents the European Union’s first comprehensive legal framework for artificial intelligence. It establishes harmonised rules for the development, deployment, and use of AI systems in the EU, aiming to promote safety, transparency, and compliance with fundamental rights while fostering innovation and market development. The regulation applies to providers, deployers, importers, and distributors of AI systems, classifying them by risk level – minimal, limited, or high-risk – while prohibiting certain practices, such as subliminal manipulation or biometric categorisation based on sensitive characteristics. Specific obligations are set for high-risk systems, including strict data governance and transparency requirements. The AI Act entered into force on 1 August 2024, with staggered deadlines for compliance: the prohibitions take effect after six months, the governance and general-purpose AI model rules after one year, and the obligations for high-risk systems embedded in regulated products after three years.

At the same time, the Italian legislature is working on an additional national legislative act which, at the time of writing, has not yet been adopted.

With regard to data protection, European AI regulations expressly emphasise the need to comply with data protection laws, which are therefore applicable in this context as well. The GDPR already establishes a series of provisions – particularly the obligations of transparency, the right not to be subject to fully automated decisions, and the obligation to conduct a Data Protection Impact Assessment (DPIA) – that are suitable for regulating and ensuring an adequate level of protection for data subjects, including in the context of the use of AI tools.

Please see 1.5 AI Regulation.

In recent years, data protection litigation in Italy has experienced significant growth, driven by an increasing awareness of rights among data subjects.

The most frequent disputes involve unlawful data processing, data breaches, and the improper use of personal data by companies and public administrations. Claims for compensation for privacy violations, particularly for non-material damages, are also on the rise.

In this context, the Italian Supervisory Authority is playing a crucial role, imposing substantial fines that influence corporate strategies, seeking to stay ahead of other European authorities, and positioning itself as a leader on issues related to AI (for instance, the proceedings initiated against OpenAI and ChatGPT, which concluded with a sanction in December 2024) and employee monitoring, especially concerning the retention of metadata generated through employees’ use of email tools.

Please see 1.4 Data Protection Fines in Practice.

The national legislation on personal data protection does not currently provide explicit regulation for collective redress mechanisms. However, data subjects may rely on the tools generally available under civil procedure law or those designed to protect their rights as consumers.

The use of IoT services is governed, from a data protection perspective, by the legislation outlined in 1.1 Overview of Data and Privacy-Related Laws, the obligations and rights under which are outlined in detail in 3.3 Rights and Obligations Under Applicable Data Regulation.

In addition, with regard to IoT services, the regulations adopted as part of the EU Data Strategy (in particular, the Data Act and the Data Governance Act), which are outlined in 3.2 Interaction of Data Regulation and Data Protection, also apply and entail certain obligations to share and circulate information consisting of both personal and non-personal data.

The interaction between data protection legislation and that adopted as part of the EU Data Strategy (in particular, the Data Act and the Data Governance Act) forms a regulatory framework aimed at balancing the protection of personal data with the promotion of a data economy based on sharing and innovation. In particular:

  • The GDPR, as outlined in 1.1 Overview of Data and Privacy-Related Laws, governs data protection and provides fundamental principles and rights to protect data subjects.
  • The Data Governance Act (effective from September 2023) promotes a secure and trusted ecosystem for data sharing, creating data spaces and mechanisms for regulated access to data, including public data.
  • The Data Act (in force since January 2024 and applicable from September 2025) regulates the mandatory sharing of data generated by IoT devices and imposes interoperability and access obligations on public and private entities.

In this context, the sharing of personal data – distinct from non-personal data – takes on particular significance. Under the Data Act, such sharing may sometimes constitute a legal obligation and, in other cases, serve a public interest, thereby providing a legitimate basis for processing under the GDPR. However, compliance with key GDPR principles, such as data minimisation, security of processing, and transparency towards data subjects, must always be ensured.

In summary, while the Data Act and the Data Governance Act complement the GDPR by introducing rules to encourage data sharing, they also necessitate a thorough analysis of the legal basis and privacy implications. This includes the implementation of technical measures to separate personal from non-personal data and the use of accountability tools to document assessments – particularly in cases where data sharing is mandatory.

Data Protection Officer (DPO)

Pursuant to Article 37 of the GDPR, as interpreted by the supervisory authorities’ guidelines, the appointment of a DPO is mandatory for public administrations or where the main activities carried out by the data controller or data processor consist of processing operations which, by virtue of their nature, scope and/or purposes, require regular and systematic monitoring of data subjects on a large scale or the processing on a large scale of special categories of data and personal data relating to criminal convictions and offences. In addition, the European guidelines make it clear that data controllers and data processors must document their assessments as to whether or not to designate a DPO and periodically review this assessment, unless it is evident that an organisation is not required to designate a DPO.

The tasks of the DPO are set out in Article 39 of the GDPR and consist of:

  • informing and advising the controller or the processor and the employees who carry out processing of their obligations pursuant to European data protection legislation;
  • monitoring compliance with European data protection legislation and with the policies of the controller or processor in relation to the protection of personal data, including the assignment of responsibilities, raising awareness and training staff involved in processing operations, and the related audits;
  • providing advice where requested as regards the Data Protection Impact Assessment (DPIA) and monitoring its performance;
  • co-operating with the supervisory authority; and
  • acting as the contact point for the supervisory authority on issues relating to processing, and to consult, where appropriate, with regard to any other matter.

In the performance of their tasks, the DPO shall have due regard to the risk associated with processing operations, taking into account the nature, scope, context and purposes of processing.

Lawfulness of Processing

Any processing of personal data must be based on at least one of the following legal bases provided for in Article 6(1) of the GDPR:

  • the data subject has freely given specific, informed and unambiguous consent to the processing of their personal data;
  • processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
  • processing is necessary for compliance with a legal obligation to which the controller is subject;
  • processing is necessary in order to protect the vital interests of the data subject or of another natural person;
  • processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller; or
  • processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require the protection of personal data, particularly where the data subject is a child. In this case, the data controller is required to carry out an assessment of the legitimate interest pursued in relation to the rights and freedoms of the data subject by conducting a balancing activity that may possibly be challenged by the supervisory authority or the court.

Data Protection by Design and by Default

Both at the time of the determination of the means for new processing and at the time of the processing itself, the data controller shall implement appropriate technical and organisational measures that are designed to implement data protection principles and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects. At the same time, the data controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data that is necessary for each specific purpose of the processing is processed.

Data Protection Impact Assessment

Pursuant to Article 35 of the GDPR, where a type of processing is likely to result in a high risk to the rights and freedoms of natural persons, particularly processing using new technologies, and taking into account the nature, scope, context and purposes of the processing, the controller shall carry out an assessment of the impact of the envisaged processing operations on the protection of personal data, prior to the processing. A DPIA is required, in particular, in the following cases:

  • a systematic and extensive evaluation of personal aspects relating to natural persons based on automated processing, including profiling, and upon which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;
  • processing on a large scale of special categories of data or of personal data relating to criminal convictions and offences; or
  • a systematic monitoring of a publicly accessible area on a large scale.

The supervisory authorities have also identified further criteria for assessing the need for a DPIA: the European guidelines set out nine risk factors and require a DPIA where a processing operation presents two or more of them. This approach was also used to draw up the blacklist of processing operations that the supervisory authorities locally require to be subject to a DPIA.

This assessment shall contain at least the following:

  • a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;
  • an assessment of the necessity and proportionality of the processing operations in relation to the purposes;
  • an assessment of the risks to the rights and freedoms of data subjects; and
  • the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with data protection principles.

Where a DPIA indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate such risk, the controller shall consult the supervisory authority prior to processing.

Internal and External Privacy Policies

A corollary of the transparency principle is the obligation for data controllers to inform data subjects about the processing of their personal data, providing them with the information required by Articles 13 and 14 of the GDPR. Where data is collected directly from the data subject, this information must be provided at the time the data is obtained; where it is not, it must be provided within a reasonable period (at the latest within one month), or at the time of the first communication with the data subject. In the latter case, the information does not need to be provided to the data subject when:

  • the data subject already has the information;
  • the provision of such information proves impossible or would involve a disproportionate effort;
  • obtaining or disclosure is expressly laid down by the law to which the controller is subject and which provides appropriate measures to protect the data subject’s legitimate interests; or
  • the personal data must remain confidential subject to an obligation of professional secrecy.

Data Subjects’ Rights

Data subjects have certain rights under the GDPR in order to allow them to have continuous and effective control over their personal data. In particular, data subjects have the right to:

  • request access to their data (by receiving a copy of it) or to all information relating to the processing of their personal data (the purpose of processing, the recipients to whom the data is disclosed, any transfers outside the EEA, etc);
  • obtain the rectification of inaccurate or incomplete personal data;
  • obtain the deletion of their personal data in the cases provided for in Article 17 of the GDPR;
  • obtain the restriction of processing in the cases provided for in Article 18 of the GDPR;
  • obtain their personal data in a structured and commonly used format or to request the transmission of such personal data to another data controller, where the legal basis of the processing is the consent of the data subject or the performance of a contract;
  • object to processing based on legitimate interest or the performance of a task carried out in the public interest or in the exercise of official authority;
  • not be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning the subject or similarly significantly affects them;
  • withdraw the consent given; and
  • lodge a complaint with a supervisory authority.

Anonymisation, De-identification and Pseudonymisation

Anonymisation and pseudonymisation are two processing operations aimed, respectively, at excluding or reducing the possibility of attributing information to a specific data subject. The former makes subsequent re-identification impossible and therefore aims to exclude the applicability of data protection provisions to the resulting output (so-called anonymised data).

The latter is instead a security measure expressly referred to in Article 32 of the GDPR and defined in Article 4(5) as “the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.”
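By way of illustration only (this sketch is not part of the legal framework described above), pseudonymisation is often implemented in practice as keyed hashing, where the secret key plays the role of the “additional information” that must be kept separately from the pseudonymised data:

```python
import hashlib
import hmac

# Illustrative only: the key name and values below are hypothetical.
# The key is the "additional information": it must be stored separately
# from the pseudonymised dataset and protected by technical and
# organisational measures, so the data cannot be re-attributed without it.
SECRET_KEY = b"stored-separately-under-access-control"


def pseudonymise(identifier: str) -> str:
    """Map a direct identifier to a pseudonym via HMAC-SHA256."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


# The dataset retains linkability (the same input always yields the same
# pseudonym) but no longer contains the direct identifier itself.
record = {"customer": pseudonymise("mario.rossi@example.com"), "order_total": 42.0}
```

Without the key, reversing the pseudonym is computationally infeasible; with it, the controller can re-link records where a legal basis exists, which is precisely why pseudonymised data remains personal data under the GDPR.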

Automated Individual Decision-Making

As mentioned above, the data subject has the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them. This right does not apply if the decision is:

  • necessary for entering into, or the performance of, a contract between the data subject and a data controller;
  • authorised by the law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or
  • based on the data subject’s explicit consent.

In the first and last cases, the data controller shall implement suitable measures to safeguard the data subject’s rights, freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express their point of view and to contest the decision. In addition, the data controller shall provide the data subject with information about the existence of automated decision-making, including profiling, and with meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

“Injury” or “Harm” in Data Protection Law

From a data protection perspective, it is necessary to pay attention to the risk to the rights and freedoms of natural persons in terms of physical, material or non-material damage, particularly where the processing may give rise to discrimination, identity theft or fraud, financial loss, reputational damage, the loss of confidentiality of personal data protected by professional secrecy, the unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage.

In addition to the obligations dictated by the GDPR, it is also necessary to consider additional regulations adopted as part of the EU Data Strategy.

Data Governance Act (EU Regulation No 2022/868) – DGA

The Data Governance Act (DGA), effective from September 2023, introduces a European regulatory framework aimed at fostering data management based on trust, transparency, and interoperability. Businesses must implement specific measures to comply with these new rules, particularly concerning data sharing and use across various economic sectors.

Firstly, companies must ensure that data management and sharing are transparent and secure. It is essential to clearly inform data subjects about how their data will be used, by whom, and for what purposes, while simultaneously implementing technical and organisational measures to protect the data from breaches.

Key roles of data intermediaries

A significant role is played by data intermediaries, entities that facilitate the sharing of information between businesses or between public and private entities. Those intending to operate as intermediaries must register with an official registry, demonstrate independence and neutrality, and use standard contracts to govern data-sharing operations.

Creation of European data spaces

The DGA also promotes the establishment of European data spaces dedicated to specific sectors such as healthcare, energy, or mobility. Companies participating in these spaces must adopt standardised formats to ensure interoperability and collaborate to facilitate access to data in compliance with sector-specific regulations.

Access to public sector data

Another important innovation concerns access to data held by the public sector. Businesses may request such data for specific purposes but must adhere to clear procedures and use the information solely in accordance with agreed terms.

Voluntary data sharing

In the context of voluntary data sharing, the DGA requires companies to observe principles of fairness and non-discrimination, avoiding unfair practices, particularly toward SMEs. Regarding personal data, the regulation complements the GDPR, making it necessary to adopt measures such as data minimisation, anonymisation, or pseudonymisation.

Documentation and compliance

Finally, companies must maintain accurate documentation of data-sharing procedures and the contracts entered into, notify their activities to the competent authorities, and undergo inspections to ensure compliance.

Data Act

The Data Act, which entered into force in January 2024 and applies progressively from September 2025, introduces a series of significant obligations for businesses, aimed at facilitating data sharing and access while fostering a more competitive and equitable ecosystem. For enterprises, this regulation represents a major shift in data management, presenting new responsibilities as well as opportunities for innovation.

Management of IoT-generated data

A central element of the Data Act is the management of data generated by IoT devices. Manufacturers of connected devices, such as smart appliances or connected vehicles, must ensure that users have access to the data produced by their devices. This means that users – whether individuals or other companies – will have the right to obtain these data in a readable format and share them with third parties. Manufacturers will be prohibited from imposing restrictions or creating technical barriers that limit the use of these data by other entities.

Mandatory data sharing with public authorities

In exceptional circumstances, such as public health emergencies, natural disasters, or energy crises, companies may be required to provide data to public authorities to address the situation. This data sharing must be transparent, limited to specified purposes, and prevent unauthorised use of the data.

Technical requirements for compatibility and interoperability

Businesses will be required to ensure the compatibility and interoperability of data. This entails adopting standardised formats that allow different systems to communicate and share information seamlessly. Companies must invest in technological infrastructure that enables efficient data management across various platforms.

Privacy and security safeguards

Respect for privacy and data security remains a top priority. Regarding personal data, the regulation complements the GDPR, obliging businesses to process data in compliance with legal bases and to apply measures such as pseudonymisation or anonymisation to protect users’ information. Ensuring data security during management and sharing is equally critical, with obligations to implement systems that prevent unauthorised access or breaches.

Contractual transparency and fairness

Companies must ensure that contracts governing data access are clear and non-discriminatory, avoiding abusive or excessively burdensome clauses. Special attention is given to promoting data access for SMEs, which often face challenges in negotiating fair terms with larger corporations.

Equitable sharing of benefits

The Data Act also emphasises the fair distribution of benefits derived from data usage. This principle seeks to prevent monopolistic scenarios and promote equitable opportunities linked to data utilisation. Businesses are required to maintain clear and detailed documentation of their data management and sharing procedures, ensuring they can demonstrate compliance during inspections or audits by competent authorities.

Please see 1.2 Regulators.

In Italy, the general rule is set out by Article 122 of the Privacy Code (which transposes Directive 2002/58/EC), under which all cookies – and other similar tracking tools – other than those strictly necessary for the functioning of the website may be installed on the users’ devices only with their consent.

In this regard, as clarified by the Guidelines adopted by the Italian Supervisory Authority in 2021, it is essential that consent for the use of cookies is collected in compliance with the principles established by the GDPR. Accordingly, it must be preceded by a multi-layered notice (cookie banner) that provides information about the cookies and the related personal data processing and allows the user to freely accept or refuse the use of cookies, as well as to change their decision at any time.

A specific consideration applies to analytical cookies, which may be treated as necessary cookies (and therefore not require consent) only when (i) IP anonymisation features are enabled; (ii) the use of analytical cookies is strictly limited to the production of aggregated statistics; and (iii) they are used solely in connection with a single website or mobile application, such that they do not allow the tracking of users’ navigation across different websites or applications.
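
By way of illustration, the IP-anonymisation feature referred to in point (i) is typically implemented by truncating the address before it is stored, eg zeroing the last octet of an IPv4 address so that only the network portion is retained. A hedged sketch of such masking (the truncation lengths mirror those commonly applied by analytics products; they are not mandated by the Guidelines):

```python
import ipaddress

def anonymise_ip(address: str) -> str:
    """Truncate an IP address before storage by an analytics tool.

    Zeroes the last octet of an IPv4 address (keeping the /24 network)
    or the host portion of an IPv6 address (keeping the /48 network).
    """
    ip = ipaddress.ip_address(address)
    prefix = 24 if ip.version == 4 else 48
    network = ipaddress.ip_network(f"{address}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymise_ip("203.0.113.42"))         # → 203.0.113.0
print(anonymise_ip("2001:db8:abcd:12::1"))  # → 2001:db8:abcd::
```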

In Italy, the general rule is set out by Article 130(1) and (2) of the Privacy Code (which transposes Directive 2002/58/EC), under which commercial and promotional communications by email, fax, telephone and similar means of communication require the prior consent of the user (whether a natural or a legal person). However, Article 130(4) provides an exception to this consent requirement: the email address provided by a data subject in the context of the sale of a product or service may be used to send commercial communications aimed at the direct sale of products or services similar to those already purchased, provided that the data subject has been adequately informed and does not refuse such use, either initially or on the occasion of subsequent communications.

With specific regard to telephone marketing activities, Article 130(3-bis) provides that data controllers may lawfully contact all users who have not objected to receiving commercial communications by telephone through registration in the Public Register of Objections (Registro Pubblico delle Opposizioni). Pursuant to Law No 5/2018, users may enrol in the register in order to prevent subsequent communications and, at the same time, withdraw any consent previously given to the processing of their personal data for telephone marketing purposes. Accordingly, a data controller intending to carry out telemarketing activities is required to consult the register at least every 15 days or, in any event, before the start of a new campaign.

On the other hand, online marketing may consist primarily of an activity carried out through the use of profiling and advertising cookies (see 4.1 Use of Cookies), or of behavioural advertising and targeting activities carried out through the use of external databases (especially those of social networks). In this second case, the jurisprudence of the Court of Justice of the European Union and the interpretation provided by the EDPB in Guidelines 8/2020 clarify the need to carry out the activity on the basis of the prior consent of the data subject and, as a general rule, to reconstruct the privacy roles between the company and the social network as joint controllers of the processing to be regulated under Article 26 of the GDPR.

Processing carried out in the employment context is one of the areas in which the GDPR defers to national law, without prejudice to certain common guidelines and orientations shared first by the WP29 and then by the EDPB, specifically regarding the vulnerable position of the employee (data subject) vis-à-vis the employer (data controller); this imbalance gives rise to a presumption that any consent requested from the employee is invalid for lack of freedom.

Managing the Selection Process and the Employment Relationship

In these phases, the employer's activities must, more than ever, respect the principle of minimisation: only personal data that is essential for the performance of work duties, and that is to a large extent governed by labour law provisions (eg, Article 8 of Law No 300/1970 or Legislative Decree No 81/2008), may be requested from the candidate or employee.

Remote Monitoring of Workers

Without prejudice to the general prohibition on the use of instruments (including AI-based ones) to monitor employee activities, this matter is governed by Article 4 of Law No 300/1970, which legitimises the use of such tools solely for organisational purposes and for the protection of company assets (eg, cybersecurity purposes). In this case, without prejudice to instruments that are essential and prearranged for the performance of work duties, the use of instruments for remote monitoring is permitted only if doing so is:

  • agreed with the trade union representatives present in the company; or
  • authorised by the competent Labour Inspectorate in the absence of trade union representatives in the company or in the event of there being no agreement.

In these cases, the employee data subject must be provided with additional and more detailed information than is normally required under Articles 13 and 14 of the GDPR; this can be done, for example, by adopting an internal regulation on the use of IT tools, which also informs employees of the possible controls and their purposes.

However, although the agreement with trade union representatives or the administrative authorisation is sufficient to legitimise the activity from the point of view of labour law, this does not exempt the employer from complying with the principles on the protection of personal data (eg, the principle of minimisation). In this sense, the monitoring in clear text of the URLs visited by employees is unlawful because, for security purposes, the same results can be achieved by implementing filters that block access to potentially risky websites. On this point, see also the Guidelines adopted by the GPDP on 1 March 2007.

Whistle-Blowing and the Transparency Decree

The national legislation on whistle-blowing was updated to transpose Directive (EU) 2019/1937 through Legislative Decree No 24/2023, which harmonised the rules applicable to the private and public sectors. With regard to the protection of personal data, the general principles dictated by the GDPR remain valid: reporting and case-management processes must be set up in compliance with the principles of privacy by design and by default and must ensure the confidentiality of the reporting person (making, for instance, an ordinary email channel inadequate); a DPIA must be carried out on the processing; and the persons who access the data and manage the reports must be trained and instructed, etc.

Further obligations (mainly informative) are also imposed by Legislative Decree No 104/2022 (the so-called “Transparency Decree”), which prescribes the need to carry out a DPIA and to provide additional information to data subjects in the event of “the use of automated decision-making or monitoring systems designed to provide indications relevant to the recruitment or assignment, management or termination of the employment relationship, the assignment of tasks or duties, as well as indications affecting the monitoring, assessment, performance and fulfilment of contractual obligations of workers.”

The value of personal data and consent databases as a corporate asset is often underestimated in corporate transactions. In this context, the main due diligence activity may consist of verifying the lawfulness and correctness of the processing of the personal data that makes up a company's databases: this can be done by verifying the correctness and completeness of the information that the data controller had to provide to the data subjects pursuant to Articles 13 and 14 of the GDPR, and by examining the evidence of compliance with this information obligation.

Furthermore, where the processing of personal data is based on consent (eg, in the case of processing for promotional purposes or in the context of scientific research), it is essential to verify the correctness and ability to prove the consents collected from the data subjects and the effective capacity of the systems to receive any requests for withdrawal and/or opposition.

European data protection legislation provides that any transfer of personal data that is undergoing processing or is intended for processing after transfer to a third country or to an international organisation (including onward transfers of personal data from the third country or international organisation to another third country or another international organisation) may take place only if the level of protection of natural persons guaranteed by the GDPR is not undermined.

This therefore requires an examination of the legal provisions applicable to the third country or international organisation in order to understand the actual level of protection of personal data, taking into account the elements specified in Article 45(2) of the GDPR. This analysis is carried out by the European Commission when it adopts the adequacy decisions referred to in Article 45 of the GDPR (decisions legitimising the transfer of personal data to the country or organisation benefiting from it).

In the absence of an adequacy decision, as clarified by the Court of Justice of the European Union in its judgment of 16 July 2020 (the “Schrems II” judgment), this assessment is instead the responsibility of the data controller or data processor who is intending to export the personal data. In such a case, where the law in force in the third country or applicable to the international organisation does not guarantee an adequate level of protection of personal data, the transfer may only be carried out subject to the adoption of additional security measures suitable to mitigate the risks to the rights and freedoms of the data subjects (eg, encryption of the data prior to the transfer in order to exclusively share encrypted data).
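
As an illustration of the encryption measure mentioned above, the exporter can encrypt the data before the transfer and retain the key within the EEA, so that the importer (and any third-country authority) only ever handles ciphertext. A minimal sketch using the third-party Python `cryptography` package (the record content is hypothetical; real deployments would need proper key management):

```python
from cryptography.fernet import Fernet

# The key is generated and retained by the exporter inside the EEA;
# only the ciphertext is shared with the third-country recipient.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b'{"name": "Mario Rossi", "email": "mario.rossi@example.com"}'
ciphertext = cipher.encrypt(record)  # what actually leaves the EEA

# The importer cannot read the data without the key...
assert ciphertext != record
# ...while the exporter can still recover it when needed.
assert cipher.decrypt(ciphertext) == record
```

The effectiveness of this measure depends on the key genuinely remaining outside the importer's reach; if the key travels with the data, the supplementary measure adds nothing.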

Notification to the supervisory authority is only required in the case of transfers pursuant to the second subparagraph of Article 49(1) of the GDPR. This derogation applies only where no other means can be used to legitimise the transfer, and requires that the transfer:

  • is not repetitive;
  • concerns a limited number of data subjects;
  • is necessary for the purposes of compelling legitimate interests pursued by the controller which are not overridden by the interests or rights and freedoms of the data subject;
  • is carried out subject to appropriate data protection safeguards; and
  • is notified to the supervisory authority by the data controller.

European legislation on the protection of personal data does not provide for any obligation to store data within a specific member state or the EEA, aiming, on the contrary, to regulate and facilitate the free movement of such data. In the case of transfers of data to third countries, however, the provisions of Chapter V of the GDPR apply in order to guarantee an adequate level of protection of personal data (see 5.1 Restrictions on International Data Transfers).

There are no “blocking” statutes in the European data protection legislation in addition to those described in the previous sections concerning the transfer of data outside the EEA.

In 2024, the European Commission concluded the review process of 11 adequacy decisions regarding the transfer of personal data, confirming their validity. In this regard, the decisions remain effective, and it is permissible to continue the free transfer of personal data to Andorra, Argentina, Canada, the Faroe Islands, Guernsey, the Isle of Man, Israel, Jersey, New Zealand, Switzerland, and Uruguay.

ICTLC – ICT Legal Consulting

Via Borgonuovo 12
20121 Milan
Italy

+39 028 424 7194

+39 027 005 121 01

info.legal@ictlc.com
www.ictlegalconsulting.com