Data Protection & Privacy 2024

Last Updated February 13, 2024

Italy

Law and Practice

ICT Legal Consulting (ICTLC) is an international law firm that offers strategic support in legal compliance (privacy, IP and TMT) and assists in drafting and developing governance, organisation, management, security and control models for data-driven organisations. The firm has successfully assembled a close-knit team of more than 80 qualified professionals specialising in the fields of ICT, privacy, data protection, cybersecurity, and IP law. ICTLC has offices in Italy (Milan, Bologna, and Rome), the Netherlands (Amsterdam), Greece (Athens), France (Paris), Spain (Madrid), Finland (Helsinki), Nigeria (Lagos), Kenya (Nairobi), Saudi Arabia (Riyadh) and Australia (Melbourne). It has also established partnerships with law firms and professionals in 54 other countries, giving clients access to the qualified professionals best suited to their specific needs.

The Italian regulatory framework on the protection of personal data and privacy is dictated by Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, repealing Directive 95/46/EC (GDPR). To the extent that such matters are not regulated by the GDPR, they are governed by Legislative Decree No 196/2003 (the “Privacy Code”).

Further detailed rules are contained in Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector, as transposed into Italian law by the Privacy Code.

With particular reference to the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection and prosecution of criminal offences or the execution of criminal penalties, the regulatory framework is instead governed by EU Directive 2016/680, transposed into the Italian legal system through Legislative Decree No 51/2018.

Finally, other specific indications and/or interpretations are contained in the decisions, recommendations and guidelines issued by the national supervisory authorities and the European Data Protection Board (eg, in Italy, the requirements for system administrators).

As mentioned in 1.1 Laws, supervisory authorities have limited regulatory power, mainly through the adoption of guidelines and opinions interpreting legal provisions. However, supervisory authorities (in Italy, the Autorità Garante per la Protezione dei Dati Personali – GPDP) also have supervisory powers to monitor compliance with data protection legislation, and benefit from investigative powers that include, inter alia, the possibility of requesting information from data controllers and data processors, or conducting on-site checks and inspections.

In this context, the supervisory authority may request access to the documentation adopted (privacy policy, consents, internal policies and procedures, records of processing activities, etc) and to systems and databases. The inspections of the GPDP may be triggered by the authority itself (on the basis of an inspection plan adopted and published every six months, or following notification of a personal data breach), or by data subjects or other third parties (in the case of complaints or reports). Any decisions that are eventually adopted are published.

Data protection legislation may also be applied by the courts in the case of appeals lodged by individuals (particularly in the case of claims for damages or appeals against decisions of the supervisory authority).

The above also applies to AI matters, although under the AI Act to be adopted by the European Union, member states may be required to confer competence in AI matters on specific national authorities.

As mentioned in 1.2 Regulators, GPDP inspections may be triggered by the authority itself (on the basis of an inspection plan adopted and published every six months, or following the notification of a personal data breach) or by data subjects or other third parties (in the case of complaints or reports).

Preliminary Investigation

In the event of a complaint by a data subject, the GPDP shall verify the correctness and completeness of the complaint and, if necessary, grant the complainant a period of time to amend it, normally not exceeding 15 days. In the event of a correct and complete complaint (or in the event of an investigation of its own accord, such as following the notification of a personal data breach), the GPDP shall start a preliminary investigation during which the documentation received is examined and/or further information is requested from the data controller or data processor.

In that scenario, inspections may also be carried out, during which the entity subject to inspection may be assisted by its trusted advisers and reserve the right to produce the documentation that is not immediately available within a reasonable period (as a rule, not exceeding 30 days). A record of the activity carried out shall also be drawn up, with particular reference to the statements made and the documents acquired, and a copy shall be given to the subject under inspection.

Closing of the Preliminary Investigation and Archiving

At the end of the preliminary investigation, the competent department within the GPDP may conclude its examination of the complaint by archiving it, when:

  • the issue examined does not appear to be related to the protection of personal data or the tasks entrusted to the GPDP;
  • there is no evidence of a breach of the relevant data protection regulations;
  • the claim set out in the complaint is excessive, due in particular to its specious or repetitive character; or
  • the issue raised by the complaint has already been examined by the GPDP.

In the case of a complaint, feedback is provided to the applicant, briefly stating the reasons why no action is taken.

Initiation of Proceedings

If the matter is not dismissed following the preliminary investigation, the competent department shall initiate proceedings for the adoption of measures by the board of the GPDP, by means of its own communication to the data controller and/or data processor. The communication shall contain:

  • a concise description of the facts and alleged breaches of the relevant data protection rules, as well as the relevant sanctioning provisions;
  • an indication of the competent organisational unit where a copy of the investigative documents may be inspected and extracted; and
  • the indication that, within 30 days of receipt of the notice, it is possible to send the GPDP defence papers or documents, and to request to be heard by the GPDP.

Right of Defence

The addressee of the notice may exercise the right of defence by submitting written statements and documents within 30 days from the date of notification of the communication and, where requested, by giving personal testimony regarding the facts set out in the notice.

The addressee of the notice may request a short extension by providing specific and due reasons for the request. The extension shall normally not exceed 15 days, and may be granted according to criteria of proportionality, the size and operational characteristics of the addressee, and the complexity of the matter under examination. The addressee of the notice may also request a hearing before the GPDP.

Failure to submit written counter-arguments or a request for a hearing shall not prejudice the continuation of the proceedings.

Decision

Where necessary, the board of the GPDP, by its own resolution, shall adopt the corrective and sanctioning measures referred to in Article 58(2) of the GDPR (in the case of an administrative pecuniary sanction, the quantum is calculated on the basis of the criteria indicated by Article 83 of the GDPR). The decision is notified to the parties by the department, service or other organisational unit that has supervised the preliminary investigation.

Appeal Against Measures of the GPDP

Under penalty of inadmissibility, an appeal against the measures adopted by the GPDP must be lodged within 30 days from the date of communication of the decision or within 60 days if the appellant resides abroad, with the ordinary court of the place where the data controller resides, or with the court of the place of residence of the data subject. At the time of the appeal, it is also possible to request the court to suspend the enforceability of the contested decision.

The so-called labour procedure (rito del lavoro) applies to the judicial proceedings; the judgment concluding the case is not subject to appeal and may prescribe the necessary measures and award compensation for damages.

With regard to multilateral agreements with countries outside the European Economic Area, the adequacy decisions adopted by the European Commission pursuant to Article 45 of the GDPR must be taken into account (see 4.1 Restrictions on International Data Issues and 4.2 Mechanisms or Derogations That Apply to International Data Transfers for more detail) to legitimise a transfer of personal data.

On the other hand, any local provisions in force in the individual member states are legitimate only where the GDPR allows for national legislation (eg, in the case of limitations on the processing of particular categories of personal data or the processing of personal data in the context of the employment relationship).

These entities do not have an expressly defined role in data protection legislation but may operate as bodies that protect the rights of data subjects and whose activity is aimed at promoting compliance with data protection principles through the promotion of complaints and judicial remedies (eg, NOYB – European Center for Digital Rights), or as professional associations supporting them (eg, IAPP – International Association of Privacy Professionals). There are also bodies aimed at promoting study, research and innovation in the field of personal data protection, such as the Italian Privacy Institute or the European Centre for Certification and Privacy (scheme owner of the EuroPrivacy certification pursuant to Article 42 of the GDPR).

The main objective of the European legislature – achieved with the GDPR and in progress as far as the electronic communications sector is concerned – was to harmonise personal data protection regulations among the various member states, with a view to simplifying and facilitating the circulation of data and the internal market. Moreover, the GDPR has given rise to a shift away from a merely formal compliance model (based, for instance, on checklists and lists of universally valid security measures) to a substantial and flexible model. In other words, by requiring data controllers to assess, identify and adopt the security measures appropriate to their own organisations, also in view of the available technologies and the relevant costs, the GDPR can adapt to any sector and remain up-to-date, even in light of technological developments.

It is therefore not surprising that the GDPR is taken as a model for all new data protection legislation, and that a debate has arisen in the US as to whether a federal regulation of the subject is appropriate.

One of the main issues addressed in the last 12 months concerns the adoption by the European Commission of its adequacy decision for the EU-US Data Privacy Framework, which restores an important mechanism for the cross-border transfer of personal data, limits US surveillance agencies’ access to EU data to what is “necessary and proportionate”, and introduces an independent dispute resolution mechanism. In addition, as was the case with the previous Privacy Shield, and unlike the other adequacy decisions that operate “automatically” in favour of all organisations established in the country to which the decision is addressed, the EU-US DPF only applies to US companies that adhere to the certification mechanism, the list of which is made available on the official website.

Of particular interest are the claims that have already been filed against this adequacy decision and that could lead the Court of Justice of the European Union to examine the validity of the EU-US DPF, and – in the worst case scenario – to declare the third agreement between the EU and the US invalid.

A major topic that is fuelling the debate is the monetisation of personal data and the possibility for data subjects to use personal data as counter-performance for goods and services. This issue will also be the subject of particular attention in light of:

  • the adoption of EU Directive 2019/770 and its transposition (in Italy, the introduction of Articles 135-octies et seq. of the Consumer Code) aimed at regulating those cases in which “the trader supplies or undertakes to supply digital content or a digital service to the consumer, and the consumer provides or undertakes to provide personal data to the trader”;
  • the initiative of some data controllers operating in the publishing sector to apply cookie walls on their websites, aimed at collecting consent to the installation of the profiling cookies that are essential for the enjoyment of web content and as an alternative to the payment of the subscription cost; and
  • Meta’s decision to require users to pay a fee – as an alternative to consenting to the processing of data for profiling purposes – for the use of its social networking services.

A second theme is likely to be the need for a regulatory framework governing the development and use of AI tools. In this regard, the European Union came to an agreement in principle on the content of the AI Act, which will be developed in the coming months and which, if adopted, could provide a set of rules applicable within a few years.

Data Protection Officer (DPO)

Pursuant to Article 37 of the GDPR, as interpreted by the supervisory authorities’ guidelines, the appointment of a DPO is mandatory for public administrations or where the main activities carried out by the data controller or data processor consist of processing operations which, by virtue of their nature, scope and/or purposes, require regular and systematic monitoring of data subjects on a large scale or the processing on a large scale of special categories of data and personal data relating to criminal convictions and offences. In addition, the European guidelines make it clear that data controllers and data processors must document their assessments as to whether or not to designate a DPO and periodically review this assessment, unless it is evident that an organisation is not required to designate a DPO.

The tasks of the DPO are set out in Article 39 of the GDPR and consist of:

  • informing and advising the controller or the processor and the employees who carry out processing of their obligations pursuant to European data protection legislation;
  • monitoring compliance with European data protection legislation and with the policies of the controller or processor in relation to the protection of personal data, including the assignment of responsibilities, raising awareness and training staff involved in processing operations, and the related audits;
  • providing advice where requested as regards the Data Protection Impact Assessment (DPIA) and monitoring its performance;
  • co-operating with the supervisory authority; and
  • acting as the contact point for the supervisory authority on issues relating to processing, and to consult, where appropriate, with regard to any other matter.

In the performance of their tasks, the DPO shall have due regard to the risk associated with processing operations, taking into account the nature, scope, context and purposes of processing.

Lawfulness of Processing

Any processing of personal data must be based on at least one of the following legal bases provided for in Article 6(1) of the GDPR:

  • the data subject has freely given specific, informed and unambiguous consent to the processing of their personal data;
  • processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
  • processing is necessary for compliance with a legal obligation to which the controller is subject;
  • processing is necessary in order to protect the vital interests of the data subject or of another natural person;
  • processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller; or
  • processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require the protection of personal data, particularly where the data subject is a child; in this case, the data controller is required to carry out an assessment of the legitimate interest pursued in relation to the rights and freedoms of the data subject by conducting a balancing activity that may possibly be challenged by the supervisory authority or the court.

Data Protection by Design and by Default

Both at the time of the determination of the means for new processing and at the time of the processing itself, the data controller shall implement appropriate technical and organisational measures that are designed to implement data protection principles and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects. At the same time, the data controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data that is necessary for each specific purpose of the processing is processed.

Data Protection Impact Assessment

Pursuant to Article 35 of the GDPR, where a type of processing is likely to result in a high risk to the rights and freedoms of natural persons, particularly processing using new technologies, and taking into account the nature, scope, context and purposes of the processing, the controller shall carry out an assessment of the impact of the envisaged processing operations on the protection of personal data, prior to the processing. This activity is required in particular in the following cases:

  • a systematic and extensive evaluation of personal aspects relating to natural persons based on automated processing, including profiling, and upon which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person;
  • processing on a large scale of special categories of data or of personal data relating to criminal convictions and offences; or
  • a systematic monitoring of a publicly accessible area on a large scale.

The supervisory authorities have also identified further criteria for assessing the need for a DPIA: they have identified nine risk factors and require a DPIA when a processing operation presents two or more of them. This approach was also used to draw up the blacklist of processing operations for which the supervisory authorities locally require a DPIA.

This assessment shall contain at least the following:

  • a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller;
  • an assessment of the necessity and proportionality of the processing operations in relation to the purposes;
  • an assessment of the risks to the rights and freedoms of data subjects; and
  • the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with data protection principles.

Where a DPIA indicates that the processing would result in a high risk in the absence of measures taken by the controller to mitigate such risk, the controller shall consult the supervisory authority prior to processing.

Internal and External Privacy Policies

A corollary of the transparency principle is the obligation for data controllers to inform data subjects about the processing of personal data, providing them with the information required by Articles 13 and 14 of the GDPR. For data collected directly from the data subject, this must be done at the time the data is obtained; for data not obtained from the data subject, it must be done within a reasonable period (at the latest within one month) or, where the data is used for communication with the data subject, at the time of the first communication. In the second case, the information does not need to be provided to the data subject when:

  • the data subject already has the information;
  • the provision of such information proves impossible or would involve a disproportionate effort;
  • obtaining or disclosure is expressly laid down by the law to which the controller is subject and which provides appropriate measures to protect the data subject's legitimate interests; or
  • the personal data must remain confidential subject to an obligation of professional secrecy.

Data Subjects’ Rights

Data subjects have certain rights under the GDPR in order to allow them to have continuous and effective control over their personal data. In particular, data subjects have the right to:

  • request access to their data (by receiving a copy of it) or to all information relating to the processing of their personal data (the purpose of processing, the recipients to whom the data is disclosed, any transfers outside the EEA, etc);
  • obtain the rectification of inaccurate or incomplete personal data;
  • obtain the deletion of their personal data in the cases provided for in Article 17 of the GDPR;
  • obtain the restriction of processing in the cases provided for in Article 18 of the GDPR;
  • obtain their personal data in a structured and commonly used format or to request the transmission of such personal data to another data controller, where the legal basis of the processing is the consent of the data subject or the performance of a contract;
  • object to processing based on legitimate interest or the performance of a task carried out in the public interest or in the exercise of official authority;
  • not be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning the subject or similarly significantly affects them;
  • withdraw the consent given; and
  • lodge a complaint with a supervisory authority.

Anonymisation, De-identification and Pseudonymisation

Anonymisation and pseudonymisation are two processing operations aimed respectively at excluding or reducing the ability of information to be attributed to a specific data subject. The former makes subsequent re-identification impossible and therefore aims to exclude the applicability of data protection provisions to the resulting output (so-called anonymised data).

The latter is instead a security measure expressly referred to in Article 32 of the GDPR and defined in Article 4(5) as “the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.”

Automated Individual Decision-Making

As mentioned above, the data subject has the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning the subject or similarly significantly affects them. This right does not apply if the decision is:

  • necessary for entering into, or the performance of, a contract between the data subject and a data controller;
  • authorised by the law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests; or
  • based on the data subject’s explicit consent.

In the first and last cases, the data controller shall implement suitable measures to safeguard the data subject’s rights, freedoms and legitimate interests, at least the right to obtain human intervention on the part of the controller, to express their point of view and to contest the decision. In addition, the data controller shall provide the data subject with information about the existence of automated decision-making, including profiling, and with meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.

“Injury” or “Harm” in Data Protection Law

From a data protection perspective, it is necessary to pay attention to the risk to the rights and freedoms of natural persons in terms of physical, material or non-material damage, particularly where the processing may give rise to discrimination, identity theft or fraud, financial loss, reputational damage, the loss of confidentiality of personal data protected by professional secrecy, the unauthorised reversal of pseudonymisation, or any other significant economic or social disadvantage.

In addition to the legal bases listed under Article 6(1) of the GDPR indicated in 2.1 Omnibus Laws and General Requirements, a distinction must be made between the following:

  • special categories of personal data referred to in Article 9 of the GDPR (personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation), the processing of which is generally prohibited; indeed, the processing of such data must rely on one of the narrow exceptions set out in Article 9(2) of the GDPR in conjunction with Article 6(1);
  • personal data relating to criminal convictions and offences indicated in Article 10 of the GDPR, the processing of which shall be carried out only under the control of an official authority or when the processing is authorised by law; and
  • traffic data, location data and browsing data, the regulation of which is dictated by Articles 121 et seq of the Privacy Code (implementing Directive 2002/58/EC); with particular reference to the use of cookies, processing is permitted only with the user’s consent, except in the case of cookies that are essential for providing the service.

In Italy, the general rule is set out by Article 130(1) and (2) of the Privacy Code (which transposes Directive 2002/58/EC), under which commercial and promotional communications by email, fax, telephone and similar means of communication require the prior consent of the user (natural or legal person). However, Article 130(4) provides for an exception to the requirement of consent, allowing for the processing of the email address provided by the data subject in the context of the sale of a product or a service for the purpose of sending commercial communications aimed at the direct sale of products or services similar to those already purchased, provided that the data subject has been adequately informed and does not refuse such use, either initially or on the occasion of subsequent communications.

With specific regard to telephone marketing activities, Article 130(3-bis) provides that data controllers may lawfully contact all users who have not objected to receiving commercial communications by telephone by enrolling in the Register of Opposition. In this sense, pursuant to Law No 5/2018, users may enrol in the register in order to prevent subsequent communications and, at the same time, withdraw any consent previously given to the processing of their personal data for telephone marketing purposes. A data controller intending to carry out telemarketing activities is therefore required to consult the register at least every 15 days and, in any event, before the start of a new campaign.

On the other hand, online marketing may consist primarily of an activity carried out through the use of profiling and advertising cookies (see 2.2 Sectoral and Special Issues), or of behavioural advertising and targeting activities carried out through the use of external databases (especially those of social networks). In this second case, the jurisprudence of the Court of Justice of the European Union and the interpretation provided by the EDPB in Guidelines 8/2020 clarify the need to carry out the activity on the basis of the prior consent of the data subject and, as a general rule, to reconstruct the privacy roles between the company and the social network as joint controllers of the processing to be regulated under Article 26 of the GDPR.

Processing carried out in the employment context is one of the areas whose regulation the GDPR defers to national law, without prejudice to certain common guidelines and orientations first shared by WP29 and then by the EDPB. These concern, specifically, the vulnerable position of the data subject employee vis-à-vis the data controller employer, a situation that results in the presumption of invalidity of any consent requested from the employee due to a lack of freedom.

Managing the Selection Process and the Employment Relationship

In these phases, the employer’s activities must respect, more than ever, the principle of minimisation, ensuring that only personal data essential for the performance of work duties is requested from the candidate or employee; this area is, to a large extent, governed by labour law provisions (eg, Article 8 of Law No 300/1970 or Legislative Decree No 81/2008).

Remote Monitoring of Workers

Without prejudice to a general prohibition on the use of instruments (including AI-based instruments) to monitor employee activities, this matter is governed by Article 4 of Law No 300/1970, which legitimises the use of such tools solely for organisational purposes and the protection of company assets (eg, cybersecurity purposes). In this case, without prejudice to instruments that are essential and prearranged for the performance of work duties, the use of instruments for remote monitoring is permitted only if it is:

  • agreed with the trade union representatives present in the company; or
  • authorised by the competent Labour Inspectorate in the absence of trade union representatives in the company or in the event of there being no agreement.

In these cases, the employee data subject must be provided with additional and more detailed information than is normally required under Articles 13 and 14 of the GDPR; this can be done, for example, by adopting an internal regulation on the use of IT tools, which also informs employees of the possible controls and their purposes.

However, although the agreement with trade union representatives or administrative authorisation is sufficient to legitimise the activity from the point of view of labour law, this does not exempt the employer from complying with the principles on the protection of personal data (eg, the principle of minimisation). In this sense, the monitoring, in clear text, of the URLs visited by employees is unlawful because, for security purposes, the same results can be achieved by implementing filters that block access to potentially risky websites. On this point, see also the Guidelines adopted by the GPDP on 1 March 2007.

Whistle-Blowing and the Transparency Decree

The national legislation on whistle-blowing was updated to transpose Directive (EU) 2019/1937 through Legislative Decree No 24/2023, which harmonised the rules applicable to the private and public sectors. With regard to the protection of personal data, the general principles dictated by the GDPR remain valid, namely the obligations to set up reporting and management processes in compliance with the principles of privacy by default and by design, to ensure the confidentiality of the reporter (highlighting the unsuitability of channels such as email for this purpose), to carry out a DPIA on the processing, and to train and instruct the people who access the data and manage the reports.

Further obligations (mainly informative) are also imposed by Legislative Decree No 104/2022 (the so-called “Transparency Decree”), which prescribes the need to carry out a DPIA and to provide additional information to data subjects in the event of “the use of automated decision-making or monitoring systems designed to provide indications relevant to the recruitment or assignment, management or termination of the employment relationship, the assignment of tasks or duties, as well as indications affecting the monitoring, assessment, performance and fulfilment of contractual obligations of workers.”

Please see 1.3 Administration and Enforcement Process regarding the internal procedure of the GPDP for adopting sanctioning measures. The administrative pecuniary fines provided for by Article 83 of the GDPR may reach, depending on the type of infringement, up to EUR10 million or, in the case of an undertaking, up to 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher; or up to EUR20 million or, in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher.

In addition, when considering the quantum of the administrative fine, the supervisory authority shall also take into account all the circumstances set out in Article 83(2) of the GDPR.

Under Articles 78 and 79 of the GDPR, no prior authorisation is required for judicial protection, which can be sought:

  • against the supervisory authority by challenging its decisions that affect the natural or legal person bringing the action or in the event of inaction on the part of the supervisory authority with regard to complaints brought by the data subjects; or
  • against a data controller or data processor if a data subject considers that their rights under the GDPR have been violated as a result of processing.

In this regard, it is worth noting (with particular regard to the first point) the court’s annulment of the decision adopted by the GPDP against Enel Energia SpA on 16 December 2021. On the other hand, doubts remain regarding the possibility of resorting to the instruments currently included in the Code of Civil Procedure, such as so-called class actions, for the protection of personal data rights.

Legislation about the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, is provided by EU Directive 2016/680 and Italian Legislative Decree No 51/2018. The different ways in which data and information are collected by the authorities are regulated by the Civil and Criminal Procedure Codes and, as a rule, require the approval of a judge.

Please see 3.1 Laws and Standards for Access to Data for Serious Crimes.

Please see 4.6 Limitations and Considerations regarding foreign government access requests. Italy has not entered into a CLOUD Act agreement with the USA.

Italian data protection legislation does not provide for the possibility of indiscriminate access to personal data by government authorities; in fact, their activities must be based on compliance with the principles dictated by Article 5 of the GDPR, as well as compliance with those obligations expressly provided for by law pursuant to Articles 6(1)(c) of the GDPR and 2-ter of the Privacy Code.

However, there has been no lack of concern or debate in the public domain about the possibility of access to personal data where large databases are created (eg, the systems used for electronic invoicing or the Immuni app for preventing the spread of COVID-19); in all these cases, however, the applicable legal provisions precluded such indiscriminate access.

European data protection legislation requires that any transfer of personal data that is undergoing processing or is intended for processing after transfer to a third country or to an international organisation (including for onward transfers of personal data from the third country or an international organisation to another third country or to another international organisation) shall take place only if the level of protection of natural persons guaranteed by the GDPR is not undermined.

This therefore requires an examination of the legal provisions applicable to the third country or international organisation in order to understand the actual level of protection of personal data, taking into account the elements specified in Article 45(2) of the GDPR. This analysis is carried out by the European Commission when it adopts the adequacy decisions referred to in Article 45 of the GDPR (decisions legitimising the transfer of personal data to the country or organisation benefiting from it).

In the absence of an adequacy decision, as clarified by the Court of Justice of the European Union in its judgment of 16 July 2020 (the “Schrems II” judgment), this assessment is instead the responsibility of the data controller or data processor who is intending to export the personal data. In such a case, where the law in force in the third country or applicable to the international organisation does not guarantee an adequate level of protection of personal data, the transfer may only be carried out subject to the adoption of additional security measures suitable to mitigate the risks to the rights and freedoms of the data subjects (eg, encryption of the data prior to the transfer in order to exclusively share encrypted data).

According to Chapter V of the GDPR, the transfer of personal data to third countries and international organisations may take place on the basis of an adequacy decision as provided for in Article 45, an appropriate safeguard as provided for in Article 46, or one of the derogations set out in Article 49.

Adequacy Decision (Article 45 of the GDPR)

Taking into account the elements indicated in 4.1 Restrictions on International Data Issues, the European Commission may adopt adequacy decisions recognising that the level of protection of personal data guaranteed within the third country or international organisation is not inferior to that provided for in the GDPR. Adequacy decisions are also subject to periodic review (at least every four years), taking into account any internal regulatory developments or agreements of the third country or international organisation.

To date, 15 adequacy decisions are in force, covering Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, Jersey, New Zealand, the Republic of Korea, Switzerland, the United Kingdom, Uruguay and the USA.

With particular reference to the USA, on 10 July 2023 the European Commission adopted its adequacy decision for the EU-US Data Privacy Framework (EU-US DPF), which restores an important mechanism for the cross-border transfer of personal data and introduces limitations on US surveillance agencies’ access to EU data beyond what is “necessary and proportionate”, as well as an independent dispute resolution mechanism. In addition, as was the case with the previous Privacy Shield, and unlike the other adequacy decisions that operate “automatically” in favour of all organisations established in the country to which the decision is addressed, the EU-US DPF only applies to US companies that adhere to the certification mechanism, the list of which is made available on the official website www.dataprivacyframework.gov/s/. On a practical level, this entails the possibility of freely transferring personal data to US entities that adhere to the certification.

Appropriate Safeguards (Article 46 of the GDPR)

In the absence of a decision pursuant to Article 45, a controller or processor may transfer personal data to a third country or an international organisation only if the controller or processor has provided appropriate safeguards, and on the condition that enforceable data subject rights and effective legal remedies for data subjects are available. These appropriate safeguards are:

  • a legally binding and enforceable instrument between public authorities or bodies;
  • binding corporate rules in accordance with Article 47 (an instrument approved by the supervisory authority at the request of a group of companies to govern the most frequent cases of transfers of personal data within the group’s companies);
  • standard data protection clauses adopted by the European Commission;
  • standard data protection clauses adopted by a supervisory authority and approved by the European Commission (to date, the Italian supervisory authority has not adopted these clauses);
  • an approved code of conduct pursuant to Article 40, together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects’ rights (this safeguard has not yet been adopted); and
  • an approved certification mechanism pursuant to Article 42, together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects’ rights (this safeguard has not yet been adopted).

Derogations for Specific Situations (Article 49 of the GDPR)

In the absence of a decision pursuant to Article 45 or appropriate safeguards pursuant to Article 46, a transfer or a set of transfers of personal data to a third country or an international organisation shall take place only under one of the following conditions:

  • the data subject has explicitly consented to the proposed transfer, after having been informed of the possible risks of such transfers for the data subject due to the absence of an adequacy decision and appropriate safeguards;
  • the transfer is necessary for the performance of a contract between the data subject and the controller or the implementation of pre-contractual measures taken at the data subject’s request;
  • the transfer is necessary for the conclusion or performance of a contract concluded in the interest of the data subject between the controller and another natural or legal person;
  • the transfer is necessary for important reasons of public interest;
  • the transfer is necessary for the establishment, exercise or defence of legal claims;
  • the transfer is necessary in order to protect the vital interests of the data subject or of other persons, where the data subject is physically or legally incapable of giving consent; or
  • the transfer is made from a register that according to EU or member state law is intended to provide information to the public and is open to consultation either by the public in general or by any person who can demonstrate a legitimate interest, but only to the extent that the conditions laid down by EU or member state law for consultation are fulfilled in the particular case.

As a residual measure with respect to all the above-mentioned hypotheses, a transfer to a third country or an international organisation may take place only if:

  • the transfer is not repetitive;
  • the transfer concerns only a limited number of data subjects;
  • the transfer is necessary for the purposes of compelling legitimate interests pursued by the controller that are not overridden by the interests or rights and freedoms of the data subject; and
  • the controller has assessed all the circumstances surrounding the data transfer and has on the basis of that assessment provided suitable safeguards with regard to the protection of personal data.

In this case, the controller shall inform the supervisory authority of the transfer and shall inform the data subject of the transfer and the compelling legitimate interests pursued.

Notification to the supervisory authority is only required in the case of transfers pursuant to the second subparagraph of Article 49(1) of the GDPR. This is the case when no other means can be used to legitimise the transfer and requires that the transfer:

  • is not repetitive;
  • concerns a limited number of data subjects;
  • is necessary for the purposes of compelling legitimate interests pursued by the controller which are not overridden by the interests or rights and freedoms of the data subject;
  • is carried out subject to appropriate data protection safeguards; and
  • is notified to the supervisory authority by the data controller.

European legislation on the protection of personal data does not provide for any obligation to store data within a specific member state or the EEA, aiming, on the contrary, to regulate and facilitate the free movement of such data. In the case of transfers of data to third countries, however, the provisions of Chapter V of the GDPR apply in order to guarantee an adequate level of protection of personal data (see 4.1 Restrictions on International Data Issues and 4.2 Mechanisms or Derogations That Apply to International Data Transfers).

There is no obligation stipulated under European data protection law to share data, software code, algorithms or other technologies with government authorities.

Without prejudice to other grounds for transfer pursuant to Chapter V of the GDPR, Article 48 of the GDPR provides that “any judgment of a court or tribunal and any decision of an administrative authority of a third country requiring a controller or processor to transfer or disclose personal data may only be recognised or enforceable in any manner if based on an international agreement, such as a mutual legal assistance treaty, in force between the requesting third country and the Union or a Member State.”

Moreover, Article 50(b) of the GDPR states that “in relation to third countries and international organisations, the Commission and supervisory authorities shall take appropriate steps to provide international mutual assistance in the enforcement of legislation for the protection of personal data, including through notification, complaint referral, investigative assistance and information exchange, subject to appropriate safeguards for the protection of personal data and other fundamental rights and freedoms.”

There are no “blocking” statutes in the European data protection legislation in addition to those described in the previous sections concerning the transfer of data outside the EEA.

Most of the topics indicated are not yet subject to specific legal provisions but are regulated on the basis of the general principles dictated by the GDPR and European legislation, as well as national or European guidelines, opinions, white papers, etc. Without prejudice to what is stated in 2.2 Sectoral and Special Issues with regard to the processing of biometric data and localisation, it seems useful here to refer to Decree Law No 139/2021 (as converted into Law No 205/2021), which prohibited the installation and use by public authorities or private entities of video surveillance with facial recognition systems operating through the use of biometric data in public places or places open to the public until 31 December 2023.

On the other hand, with regard to the regulation of big data, reference is made to the fact-finding investigation carried out by the Italian Data Protection Authority in co-operation with the antitrust and telecommunications authorities, as reported in a document jointly published on 2 July 2019. In addition to setting out recommendations and guidelines for the legislature, the document also envisages the creation of a permanent co-ordination mechanism between the three competent authorities.

Italian data protection legislation does not expressly require the creation of internal committees or protocols to manage digital governance. However, in practice, it is often the case that such committees will act in support of or in collaboration with the DPO.

Please see 1.3 Administration and Enforcement Process and 2.5 Enforcement and Litigation.

The value of personal data and consent databases as a corporate asset is often underestimated in corporate transactions. In this context, with regard to the sector in question, the main activity may consist of verifying the lawfulness and correctness of the processing of personal data that makes up a company’s databases; this can be done by verifying the correctness and completeness of the information that the data controller had to provide to the data subjects pursuant to Articles 13 and 14 of the GDPR, and by examining the evidence of compliance with this information notice obligation.

Furthermore, where the processing of personal data is based on consent (eg, in the case of processing for promotional purposes or in the context of scientific research), it is essential to verify the correctness and ability to prove the consents collected from the data subjects and the effective capacity of the systems to receive any requests for withdrawal and/or opposition.

From a data protection point of view, data controllers and data processors who adhere to a code of conduct pursuant to Article 40 or who benefit from certification pursuant to Article 42 may make public such adherence or certification; however, there is no obligation to disclose risk profiles.

In terms of cybersecurity legislation, the national provisions on network and information security in Legislative Decree No 65/2018 (adopted in transposition of EU Directive 2016/1148) require entities falling within their scope of application to co-operate with government authorities (ministries and the national cybersecurity agency) and to share information on risks and on the measures taken.

As provided in 1.8 Significant Pending Changes, Hot Topics and Issues, the coming period will be characterised by a debate regarding:

  • the monetisation of personal data, a phenomenon that will necessarily entail a dialogue with competition and consumer protection law (already initiated by EU Directive 2019/770); this will be even more apparent once the new European regulations that are to strengthen the Digital Single Market come into force, introducing a regulatory framework that cannot be evaluated separately, but rather requires a comprehensive overview; and
  • the development and use of AI tools, a phenomenon that will necessarily need to be governed from a data protection, copyright, competition and consumer protection law perspective; in this regard, the European Union has reached an agreement in principle on the content of the AI Act, which will be finalised in the coming months and which, once adopted, will provide a set of rules applicable within a few years.

There are no other significant issues.

ICT Legal Consulting

Via Borgonuovo 12
20121 Milan
Italy

+39 028 424 7194

info.legal@ictlc.com
www.ictlegalconsulting.com

Trends and Developments



Article 51 GDPR provides that “[e]ach Member State shall provide for one or more independent public authorities to be responsible for monitoring the application of this Regulation, in order to protect the fundamental rights and freedoms of natural persons in relation to processing and to facilitate the free flow of personal data within the Union”. Under Article 58 of the GDPR, the supervisory authority is granted a wide range of powers, including investigative, corrective, authorising and advisory powers. Among the most significant of these powers is the ability to levy pecuniary fines and to impose a temporary or definitive ban on the processing of personal data.

In Italy, the competent supervisory authority is the Garante per la Protezione dei Dati Personali (“Garante” or GPDP), whose decisions can be appealed by applying to the ordinary tribunal (second instance), and to the Supreme Court of Cassation (third instance).

The GPDP is widely considered one of the most active and influential supervisory authorities, having issued – as of 15 January 2023 – more than 340 enforcement actions, amounting to over EUR145,282,300 in sanctions. Italy is second only to Spain in terms of the number of sanctions issued; the Spanish Supervisory Authority has issued at least 790 sanctions, although the total amount in fines is only approximately EUR63,026,500. The GPDP has issued fines and utilised its corrective powers, including imposing bans on processing where necessary, in most areas and industry sectors subject to the GDPR. However, there are certain aspects of the GDPR where the GPDP seems to have focused its enforcement efforts more intensively.

Trends in Enforcement

Traditionally, the GPDP has been especially concerned with combating unlawful telemarketing practices, in particular as regards transparency and consent requirements, as well as the engagement of call centres as data processors without the necessary data protection safeguards, including the performance of audits by the data controller. In this field, the Garante has issued some of its highest sanctions, such as those against Eni Gas e Luce (issued on 11 December 2019, totalling EUR11,500,000), Tim (issued on 15 January 2020, totalling EUR27,800,000) and Sky Italia (issued on 16 September 2021, totalling EUR3,200,000).

More recently, the Garante has focused its attention on the protection of the data privacy rights of children, as is apparent from the enforcement actions undertaken against the popular social network TikTok concerning age-verification requirements. On 22 January 2021, following the highly publicised death of a ten-year-old girl from Sicily participating in a “blackout” challenge, the GPDP imposed an immediate limitation on the processing of data concerning users “whose age could not be established with full certainty so as to ensure compliance with the age-related requirements”. On 3 February 2021, the Italian DPA noted that, following the enforcement action, TikTok had committed to fulfilling GDPR age-verification requirements by taking a number of actions, including:

  • removing accounts belonging to users under thirteen;
  • employing an AI-based age-verification system;
  • launching an information campaign to raise awareness;
  • including an in-app button for reporting users under thirteen;
  • improving the language of the privacy notice for users under eighteen; and
  • doubling the number of Italian platform content moderators.

Another field where the GPDP has recently stepped up its enforcement actions is that of video surveillance. In October 2023, the GPDP reprimanded an individual for installing a home surveillance system with cameras capturing not only their own flat, but also a public area (including a public park). According to the GPDP, the placement of the cameras violated the principle of lawfulness, as the data controller failed to show a legitimate interest capable of justifying the recording of public areas and of conversations through the audio system, as well as the principle of data minimisation. The decision is also interesting because it clarifies that, while domestic surveillance systems are generally exempt from having to comply with the GDPR, this exemption does not apply when the cameras capture public areas or third parties’ property. The Garante issued a mere reprimand against the data controller, considering that the individual promptly rectified the situation by replacing the camera and redirecting it solely towards the entrance of their home.

In June 2023, the GPDP issued a sanction of EUR20,000 against an Italian employer for having, inter alia, installed a video surveillance system in its premises without having obtained the prior approval of the workers’ council or of the public labour authority, as required by Article 4 of Law No 300/1970 (the so-called Workers’ Statute). Moreover, no privacy notice had been drafted and made available to the workers.

Lastly, a field where the Garante has recently focused its attention is that of Artificial Intelligence (AI), as shall be examined in the following section.

The Italian Data Protection Authority at the Forefront of Artificial Intelligence Enforcement

During the last couple of years, the GPDP has taken noteworthy initiatives in the context of AI, including by means of enforcement actions against providers of AI systems. As a result, the Garante is positioning itself as one of the most active European Union supervisory authorities on the regulation of AI vis-à-vis the GDPR and Italian Data Protection Law. Below, we provide a brief overview of the most important initiatives undertaken by the GPDP in the AI field.

Enforcement action against Clearview AI

On 9 March 2022, the GPDP fined the US-based company Clearview AI EUR20 million after finding that it had carried out processing activities concerning the biometric data of persons residing within Italian territory.

The GPDP inquiry into Clearview AI revealed that the company had processed personal data, including biometric and geolocation information, unlawfully and without a proper legal basis. In particular, the legitimate interest relied upon by the US-based company as the legal basis for the processing was not suitable for the processing of biometric data, which qualifies as a special category of personal data under Article 9 of the GDPR, meaning that its processing is generally prohibited save where one of the specific exceptions provided for by Article 9(2) of the GDPR applies. Moreover, Clearview AI had violated several fundamental principles of the GDPR: it lacked transparency in adequately informing users, exceeded the intended purposes for processing users’ data made available online, and neglected to establish a data storage period. Consequently, Clearview AI infringed the freedoms of data subjects, including their rights to privacy, personal data protection and non-discrimination.

Through web scraping, Clearview AI has amassed a database containing billions of facial images sourced globally from public web sources such as media platforms, social media and online videos. By processing this personal data by means of advanced algorithms, Clearview AI has been able to provide a refined search service allowing the creation of profiles based on the biometric data extracted from these images. These profiles can then be augmented with additional information, such as image tags and geolocation.

As a result of these violations, the GPDP levied a EUR20 million fine on Clearview AI and ordered the deletion of data pertaining to individuals residing in Italy. The authority also prohibited any further collection and processing of data through Clearview AI’s facial recognition system. Additionally, Clearview AI was instructed by the Italian SA to appoint a representative in the EU pursuant to Article 27 of the GDPR, facilitating the exercise of data subject rights, alongside (or in lieu of) the US-based controller.

Enforcement action against OpenAI

In late March 2023, the Garante identified several violations of the GDPR and Italian Data Protection Law by the well-known and widely used generative AI system ChatGPT, just a few months after its launch; as a result, the Garante banned the service in Italy at the start of April 2023. According to the GPDP, OpenAI (the company behind ChatGPT) failed to demonstrate a valid legal basis for collecting and processing personal data for the purposes of training ChatGPT, and the information provided to users and to the individuals whose data was used to train the generative AI system was incomplete. Moreover, individuals whose data was used for training the AI system had no easy way to exercise their data protection rights, including the rights of access, rectification and objection. Interestingly, the GPDP also noted that ChatGPT’s responses to users’ prompts often deviated from reality (so-called hallucinations), thereby violating the accuracy principle established by the GDPR when such responses concerned another individual.

In addition to accuracy concerns, the GPDP has also focused on the implications of ChatGPT for children: the authority questioned whether the platform’s outputs might result in inappropriate responses for children, even if the service is purportedly aimed at users above the age of thirteen, as stated in OpenAI’s terms of service. As a result, the GPDP required OpenAI to implement a suitable age verification system.

On 28 April 2023, the Garante lifted the ban, considering that the measures adopted by OpenAI adequately addressed the data protection issues raised by the authority and underpinning the ban. In particular, according to the GPDP, OpenAI:

  • drafted and published, on its website, an information notice addressed to users and non-users, in Europe and elsewhere, describing which personal data are processed under which arrangements for training algorithms, and recalling that everyone has the right to opt out of such processing;
  • expanded its privacy policy for users and also made it accessible from the sign-up page prior to registration with the service;
  • granted all individuals in the EU, including non-users, the right to opt out of the processing of their data for training of algorithms also by way of an online, easily accessible ad-hoc form;
  • introduced a welcome back page in case of reinstatement of the service in Italy containing links to the new privacy policy and the information notice on the processing of personal data for training algorithms;
  • introduced mechanisms to enable data subjects to obtain erasure of information that is considered inaccurate, whilst stating that it is technically impossible, as of now, to rectify inaccuracies;
  • clarified in the information notice for users that it would keep on processing certain personal data to enable performance of its services on a contractual basis; however, it would process users’ personal data for training algorithms on the legal basis of its legitimate interest, without prejudice to users’ right to opt out of such processing;
  • implemented a form to enable all European users to opt out of the processing of their personal data and thus to filter out their chats and chat history from the data used for training algorithms;
  • added, in the welcome back page reserved for Italian registered users, a button for them to confirm that they are aged above 18 prior to gaining access to the service, or else that they are aged above 13 and have obtained consent from their parents or guardians for that purpose; and
  • included the request to specify one’s date of birth in the service sign-up page to block access by users aged below 13 and to request confirmation of the consent given by parents or guardians for users aged between 13 and 18.

The Garante-OpenAI saga has garnered significant attention not only within the Italian data protection community but also among the general public across Italy and Europe, as it represents one of the first enforcement initiatives to have targeted a generative AI system. The GPDP’s actions in this case highlight its potential to regulate specific aspects of generative AI. This is particularly significant considering that the AI Act will not become directly applicable in the EU for several more years.

Inquiry on data mining for the training of AI systems

On 22 November 2023, the GPDP launched a survey aimed at both public and private websites to assess the implementation of effective security measures against the widespread collection of personal data (so-called web scraping) for training third-party AI algorithms.

The investigation covers all entities, acting as data controllers, based in Italy or providing services in Italy, that allow personal data to be freely accessible online, including by means of the “spiders” used by AI algorithm providers. The inquiry is prompted by the widespread practice of various AI platforms using web scraping to gather large amounts of information, including personal data, from websites managed by public and private entities for specific purposes, such as news and administrative transparency. The Garante has invited trade associations, consumer groups, experts, and academic representatives to share their comments and contributions on security measures against the extensive collection of personal data for algorithm training. Following the investigation, the GPDP will assess the need to take any corrective measures, also on an urgent basis if appropriate.

This initiative further showcases the GPDP’s willingness and ability to position itself as a leading European regulator of AI systems from a data protection perspective.

Standardised Set of Icons Approved by the Garante for Clearer Privacy Notices

Under Article 12 of the GDPR, the data controller is required to provide data subjects with the necessary information on the processing of their personal data “in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child”. In order to achieve this result, Article 12 further recommends that such information “may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner, a meaningful overview of the intended processing. Where the icons are presented electronically, they should be machine-readable”.

Although not mandatory, the use of icons under the GDPR is therefore a good practice, which can help data controllers meet the high transparency requirements set by the GDPR and proactively demonstrate compliance in light of the principle of accountability. This is especially true in the case of complex data processing operations and/or where the information is specifically addressed to a child, as the use of standardised icons can help boost the understandability and overall transparency of privacy notices.

Against this background, in March 2021 the GPDP launched a contest called “Easy privacy information via icons? Yes, you can!” aimed at stimulating the development of a standardised set of icons by software developers, tech professionals, experts, lawyers, designers, university students, and anyone interested in the topic. On 15 December 2021, the Garante published on its website the three sets of icons deemed to be most effective, based on the following criteria: concept (which includes the aspects of effectiveness and conciseness); visual (graphics, readability, clarity); originality; and inclusiveness (gender equality, non-discrimination). The three winning projects are currently available on the GPDP’s website and can be freely used by any data controller who wishes to render its privacy notices more transparent.

ICT Legal Consulting

Via Borgonuovo 12
20121 Milan
Italy

+39 028 424 7194

info.legal@ictlc.com
www.ictlegalconsulting.com
