Data Protection & Privacy 2024 Comparisons

Last Updated March 12, 2024

Law and Practice


Greenberg Traurig, LLP is an international law firm with approximately 2,300 attorneys serving clients from 40 offices in the USA, Latin America, Europe, Asia and the Middle East. The firm’s dedicated TMT team consists of more than 100 lawyers, of which seven are in Amsterdam. The Amsterdam team is well-versed in representing clients around the world in domestic, national and international policy and legislative initiatives, as well as guiding them through the business growth cycle for a variety of technologies. As a result, it provides forward-thinking and innovative legal services to companies producing or using leading-edge technologies to transform and grow their businesses.

The fundamental data protection legislation applicable in the Netherlands comprises:

  • Regulation (EU) 2016/679 (the General Data Protection Regulation (GDPR)); and
  • the Dutch GDPR Implementation Act (the Implementation Act) (Uitvoeringswet Algemene verordening gegevensbescherming).

The Dutch data protection regulator is the Dutch Data Protection Authority (DPA or the “Dutch DPA”) (Autoriteit Persoonsgegevens). Investigations by the DPA are generally initiated following complaints by data subjects or at the DPA’s own initiative.

The Dutch DPA can impose sanctions and fines based on the Implementation Act, the General Administrative Law Act, and the GDPR.

The Implementation Act and the General Administrative Law Act grant the Dutch DPA the right to enforce obligations under the GDPR and the Implementation Act through an order subject to a penalty payment.

The GDPR also empowers the Dutch DPA to impose administrative fines of up to EUR20 million or, in the case of an undertaking, up to 4% of the total worldwide turnover in the preceding financial year, whichever is higher.

In order to create more uniformity in the fines issued by EU member states, the European Data Protection Board (EDPB) issued guidelines on the calculation of administrative fines. These guidelines provide a five-step methodology for determining the amount of a fine:

  • step 1: identify processing operations;
  • step 2: identify the starting point for further calculation of the amount of the fine by classifying the seriousness of the infringement;
  • step 3: identify aggravating and mitigating circumstances related to past or present behaviour of the controller/processor;
  • step 4: identify the relevant legal maximums; and
  • step 5: analyse the requirements of effectiveness, dissuasiveness, and proportionality.

The GDPR, as an EU regulation, applies directly in all EU member states. Member states may deviate only where the GDPR expressly allows for such deviations; otherwise, deviation is in principle not possible. This also applies to issues such as international data transfers.

In the Netherlands, the role of self-regulating entities in the realm of data protection is negligible.

NGOs play a more important role. For example, Privacy First advocates for the protection of privacy rights in the Netherlands, while Bits of Freedom researches the impact of new privacy-related legislation on people, rights and freedoms.

The Netherlands falls under the EU regime, and as such it follows the EU omnibus model.

The broad scope of this catch-all approach often unsettles the public. In reality, however, supervisory authorities, constrained by limited time and budget, find it impractical to investigate every minor violation.

As a result, the enforcement rate is perceived as non-aggressive. However, an upward trend is noticeable now that more aspects of the GDPR have crystallised.

All relevant key developments are discussed in the Netherlands Trends and Developments section of the Chambers Data Protection Guide.

Arguably the most impactful development in data protection is the adequacy decision for the EU-US Data Privacy Framework, which, for companies self-certified under the framework, simplifies the process of transferring EU personal data to the United States.


Data Protection Officer

Organisations are required to appoint a data protection officer if:

  • they are a public authority or body, except for courts acting in their judicial capacity;
  • their core activities consist of processing operations that require regular and systematic monitoring of data subjects on a large scale; or
  • their core activities consist of processing special categories of personal data on a large scale.

Lawful Basis

The GDPR requires a lawful basis for the processing (including the collection) of personal data. These bases are limited to the following:

  • consent;
  • contract performance;
  • compliance with a legal obligation;
  • protection of the vital interests of the data subject;
  • performance of a task carried out in the public interest; or
  • legitimate interest.

When processing is based on legitimate interest, the organisation must perform a legitimate interest assessment in order to document how it concluded that its interest is legitimate.

Privacy by Design or by Default

Organisations subject to the GDPR must take into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing. Additionally, they must evaluate the potential risks to the rights and freedoms of individuals, considering the likelihood and severity of these risks brought about by the data processing activities. They must also, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data protection principles.

Data Protection Impact Assessment

The data protection impact assessment (DPIA) is a process designed to describe the processing, assess its necessity and proportionality, and help manage the risks to the rights and freedoms of natural persons resulting from the processing of personal data by assessing those risks and determining the measures to address them. DPIAs are an important accountability tool to demonstrate compliance with the requirements of the GDPR. Under the GDPR, a DPIA is mandatory when the envisaged processing is “likely to result in a high risk” to individuals’ rights and freedoms. Certain EU member states (eg, the Netherlands) have also published a binding list of processing activities for which a DPIA is mandatory. For example, according to the list published by the Dutch Data Protection Authority, a DPIA is mandatory if, inter alia, CCTV is installed in the workplace.

Privacy Policies

Organisations subject to the GDPR must have privacy policies in place in order to fulfil their transparency obligations under Article 12 of the GDPR. Most often this will include an internal privacy policy for employees and an external privacy policy that is customer-facing. However, the two can also be combined into one.

Data Subjects’ Rights

Under the GDPR, data subjects have the right to request confirmation as to whether their personal data is processed by a company, as well as the rights of access, rectification, erasure, restriction of processing, objection to processing, data portability, and the right not to be subject to automated decision-making.

Pseudonymisation, Anonymisation, De-identification

There are no specific restrictions under the GDPR on pseudonymising, anonymising or de-identifying personal data. On the contrary, these can be seen as technical measures to protect personal data.

Restrictions on Automated Decision-making (Including Profiling)

In principle, data subjects may not be subjected to decisions based solely on automated processing, including profiling, that produce legal effects concerning them or similarly significantly affect them. However, exceptions apply when the decision is:

  • necessary for entering into or performing a contract with the data subject;
  • authorised by EU or member state law; or
  • based on the data subject’s explicit consent.

The Concept of Harm or Injury

Affected individuals may claim material (eg, monetary) or non-material (eg, reputational) damages resulting from an infringement of the GDPR. To do so, they will need to prove the damage suffered.

The definition of sensitive data is provided in the GDPR and includes personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation. It is generally prohibited to process these special categories of personal data, unless one of the specific exceptions described in Article 9(2) of the GDPR applies. The most commonly used exception is that of explicit consent.

The other requirements, including data subject rights, are described in 2.1 Omnibus Laws and General Requirements.

Unsolicited commercial or marketing communications are generally prohibited unless they rely on one of the lawful bases described in 2.1 Omnibus Laws and General Requirements. “Unsolicited” implies the absence of consent, and it is unlikely that a company could successfully rely on any of the other lawful bases under Article 6 of the GDPR.

Behavioural and targeted advertising is generally only allowed when the “target” has provided consent. The consent must be opt-in, not opt-out; for example, a checkbox used to obtain consent cannot be pre-ticked.

The two most significant considerations for privacy in the workplace are:

  • Monitoring: Monitoring of employees, whether through email surveillance or CCTV, generally leads to the obligation to perform a data protection impact assessment.
  • Internal privacy policy: The employer is required to be transparent about which employee personal data is processed, typically in an internal privacy policy or an employee handbook.

The legal standards and potential enforcement penalties are discussed in 1.3 Administration and Enforcement Process.

In terms of private litigation, two important cases are discussed in the Netherlands Trends and Developments section of the Chambers Data Protection Guide.

Finally, class actions are permitted under the GDPR and Dutch law. The most prominent example is the TikTok case of 2021, which resulted in the Dutch Data Protection Authority imposing an administrative fine of EUR750,000. In this case, three non-profit organisations brought overlapping claims for declaratory relief regarding, inter alia, the legality of TikTok’s general conditions and its processing of personal data, as well as substantial claims for damages.

In order to target organised crime, the Dutch Code of Criminal Procedure was recently amended. These changes have broadened the scope of investigative powers available to law enforcement authorities.

With respect to organised crimes, law enforcement authorities may now engage in the following activities:

  • systematic observation;
  • infiltration;
  • pseudo-buying or pseudo-services;
  • undercover systematic collection of information;
  • recording confidential communications;
  • examination of communications by automated means; and
  • demand for data.

Generally, the investigating officer will be required to obtain approval or an order from the public prosecutor, who operates under the responsibility of the Ministry of Justice and Security.

Such laws are described in 1.3 Administration and Enforcement Process.

The Netherlands is a signatory to the OECD Declaration (December 2022) regarding government access. The practical implications of this are not significant, as the GDPR generally provides for more stringent limitations.

In principle, invoking a foreign government access request as a legitimate basis to collect and transfer personal data is permitted. However, the organisation in question must make a careful analysis of why its legitimate interest should override the rights and freedoms of the data subjects involved. The Netherlands is currently not participating in a Cloud Act agreement with the USA.

The most prominent debate on government access relates to the US CLOUD Act – an instrument through which the US government can request electronic information held by service providers. Some parties took the view that the US CLOUD Act meant they could no longer use US-based service providers, as the legislation interfered with the rights and freedoms granted to EU data subjects under the GDPR. The Dutch government published an extensive analysis of why the CLOUD Act does not preclude organisations from using US IT service providers. The analysis follows a risk-based model and assesses the actual risk posed by the US CLOUD Act as low.

Restrictions apply to data transfers to third countries (ie, non-EEA countries). In principle, no personal data may be transferred to a third country unless a transfer mechanism is in place. To determine whether personal data can be transferred to a third country in a safe manner, organisations must conduct a data transfer impact assessment, which assesses the applicable laws of the recipient’s country and the technical and organisational measures in place for the transfer.

The most commonly used transfer mechanisms are an adequacy decision, binding corporate rules (BCR), and the standard contractual clauses (SCC).

The adequacy decision means that the European Commission has designated the country as offering adequate protections and safeguards for the rights and freedoms of data subjects essentially equivalent to those under the GDPR.

BCR are often used by international organisations with offices all around the world, as they set a convenient framework for internal data flows. BCR must be approved by the supervisory authority in the country where approval is requested (ie, the Dutch DPA for the Netherlands). A downside of BCR is that the approval process can take several years.

Finally, the SCC are the most widely used mechanism. They contain a set of standardised provisions to ensure that the rights and obligations between the data exporter and data importer are sufficiently addressed. However, organisations often assume that using the SCC alone suffices to transfer personal data to a third country; it is important to keep in mind that a data transfer impact assessment must still be conducted even after the SCC are signed by the relevant parties.

There is no government notification or approval required in order to transfer personal data under the abovementioned transfer mechanisms.

There are restrictions for onward transfers. Onward transfers are subsequent transfers from the data importer, who first received the personal data in the third country, to another organisation in the same or another third country. In such cases, there must again be a transfer mechanism in place, per the explanation in 4.1 Restrictions on International Data Issues and 4.2 Mechanisms or Derogations that Apply to International Data Transfers.

There are no requirements to share software code, algorithms, encryption, or similar technical detail with the government.

An organisation collecting or transferring personal data in connection with foreign government data requests must in particular take into account the following considerations and limitations.

Lawful Basis

The organisation must have a lawful basis to collect or transfer data in connection with a foreign government data request, foreign litigation, proceedings or internal investigations. Organisations will often opt for legitimate interest or a need to comply with a legal obligation. However, for the legal obligation basis it is important to note that the processing must have a basis under EU or member state law (ie, third-country government requests do not meet that requirement).

Transfer Mechanism

If foreign government data requests, foreign litigation proceedings (eg, civil discovery) or internal investigations require a transfer of personal data to a third country, the transfer must be based on a valid transfer mechanism.

Data Subjects’ Rights

When collecting or transferring personal data for the purposes of a foreign government data request, foreign litigation proceedings (eg, civil discovery) or internal investigations, the organisation must respect the data subjects’ rights as laid down in the GDPR. Importantly, the data subject must be informed of such processing, and have appropriate redress rights to request restriction of these processing activities.

There is no relevant “blocking” statute in place for the Netherlands.

The following areas are currently addressed in law.

Big Data Analytics

Besides the GDPR, big data analytics is also addressed in the EU Data Act, which entered into force on 11 January 2024. The EU Data Act enhances the possibilities for data sharing, for instance, by mitigating abuse of contractual imbalances that impede equitable data sharing.

Automated Decision-making (including Profiling)

Automated decision-making (including profiling) is addressed by the GDPR, as explained in 2.1 Omnibus Laws and General Requirements.

Artificial Intelligence (Including Machine Learning)

Artificial intelligence will be addressed in the upcoming EU Artificial Intelligence Act (EU AI Act), which applies a risk-based model: AI posing an unacceptable risk will be prohibited, high-risk AI will be subject to stricter obligations, and limited-risk AI will be subject to minimal transparency obligations. The European Parliament has reached a provisional agreement with the Council on the AI Act. More information on the EU AI Act is provided in the Netherlands Trends and Developments section of the TMT Guide.

Internet of Things (IoT) or Ubiquitous Sensors

IoT services are affected by various laws, including the GDPR, but no law currently addresses them specifically. However, the European Commission has announced a new Cyber Resilience Act to improve the minimum security requirements for connected devices, both during product development and throughout the product life cycle. On 1 December 2023, the European Parliament and the Council reached an agreement on the Cyber Resilience Act; the next step is formal approval by both institutions.

Facial Recognition/Biometric Data

Facial recognition and biometric data are subject to various restrictions in the GDPR. Firstly, there must be an exception under Article 9 of the GDPR to process such information, since biometric information qualifies as a special category of personal data, of which processing is otherwise prohibited. Furthermore, the use of facial recognition and biometric data will likely require the performance of a data protection impact assessment, and necessary safeguards following from the conclusions of such assessment.


Geolocation

Geolocation is generally considered personal data under the GDPR. As such, it is subject to the rights and limitations contained therein. However, it does not qualify as a special category of personal data subject to Article 9 of the GDPR.


Drones

The use of drones is subject to the Easy Access Rules for Unmanned Aircraft Systems (Regulations (EU) 2019/947 and 2019/945) and additional Dutch regulations, including the Regulation on Unmanned Aircrafts, Regulation on Model Flights, and Regulation on Remote Controlled Aircrafts. These regulations set out rules on the type of drone that can be flown, whether or not a camera is allowed, the altitude at which drones can fly, and whether a specific certificate is required in order to operate a drone.

Disinformation, Deepfakes, or Other Online Harms

The spreading of disinformation, deepfakes, and other online harms is most prominently addressed under the EU Digital Services Act, which places obligations on online intermediaries and platforms in relation to the moderation of online content.

“Dark Patterns” or Online Manipulation

Dark patterns and online manipulation are currently addressed in the Unfair Commercial Practices Directive, as implemented in the Dutch Civil Code, as well as in the EU Digital Services Act and the EU Data Act. They will also be addressed in the upcoming EU AI Act.

Fiduciary Duty for Privacy or Data Protection

There is a specific fiduciary duty that applies to publicly listed companies under the Dutch Corporate Governance Code and to financial institutions under the EU Digital Operational Resilience Act. This duty relates to IT risk management, which is broader than privacy or data protection alone.

The Dutch DPA has issued guidance on the use of AI and provides updates on its research on the latest trends every six months. In addition, the Dutch DPA has been designated as co-ordinating supervisory authority with respect to algorithms and AI and has created a separate body within the organisation dedicated to these technologies. Furthermore, all eyes are currently on the upcoming EU AI Act.

Please refer to 2.5 Enforcement and Litigation.

In corporate transactions, the record of processing activities serves as a starting point for identifying the personal data processed by the organisation subject to due diligence. If an organisation does not have such a record available, this in itself indicates the (low) maturity of its data protection programme. In addition, it is important to identify all relevant data flows, especially where the target comprises a group of different entities.

The process continues by assessing all relevant policies and documentation in place at the target company. It is also important to examine whether the target has the required data processing agreements in place with its vendors and customers.

Furthermore, a typical issue that arises relates to companies’ cookie practices on their websites. Many companies do not have a properly functioning cookie consent mechanism in place. For example, the opt-in functionality does not work correctly, resulting in advertising cookies being deployed before the website user has had the opportunity to consent.

Finally, it is important to be aware of any sensitive processing; eg, involving children or biometric information, as further restrictions will apply to such processing. For instance, this may lead to the need to conduct a data protection impact assessment.

There is no strict requirement for companies to disclose their cybersecurity programme. However, there is a discernible trend in which companies disclose certain cybersecurity-related information in their annual reports.   

The Netherlands, as an EU member state, is subject to the Digital Markets Act, Digital Services Act, and the Data Act. Developments and trends relating to the convergence of privacy, competition and consumer protection law or policy, including AI, are discussed at length in the Netherlands Trends and Developments section of the TMT Guide.

Also noteworthy is the Dutch government’s positive outlook on tech, and its important role in negotiating GDPR-compliant agreements with major cloud providers, such as Google and AWS, which are typically not eager to accept changes to their standard contractual documentation.

A prime example of this positive outlook follows from the Dutch Cloud Policy 2022 under which most government data may be stored in the cloud.

Greenberg Traurig, LLP

Beethovenstraat 545
1083 HK Amsterdam
The Netherlands

+31 651 289 224

+31 20 301 7350
