Data Protection & Privacy 2024

Last Updated February 13, 2024

Netherlands

Law and Practice

Authors



Greenberg Traurig, LLP is an international law firm with approximately 2,300 attorneys serving clients from 40 offices in the USA, Latin America, Europe, Asia and the Middle East. The firm’s dedicated TMT team consists of more than 100 lawyers, of which seven are in Amsterdam. The Amsterdam team is well-versed in representing clients around the world in domestic, national and international policy and legislative initiatives, as well as guiding them through the business growth cycle for a variety of technologies. As a result, it provides forward-thinking and innovative legal services to companies producing or using leading-edge technologies to transform and grow their businesses.

The fundamental data protection legislation applicable in the Netherlands comprises:

  • Regulation (EU) 2016/679 (the General Data Protection Regulation (GDPR)); and
  • the Dutch GDPR Implementation Act (the Implementation Act) (Uitvoeringswet Algemene verordening gegevensbescherming).

The Dutch regulator in terms of data protection is the Dutch Data Protection Authority (DPA or the “Dutch DPA”) (Autoriteit Persoonsgegevens). Investigations by the DPA are generally initiated following complaints by data subjects or at the DPA’s own initiative.

The Dutch DPA can impose sanctions and fines based on the Implementation Act, the Dutch General Administrative Law Act, and the GDPR.

The Implementation Act and the General Administrative Law Act grant the Dutch DPA the power to enforce obligations under the GDPR and the Implementation Act through an order subject to a penalty payment.

The GDPR also vests in the Dutch DPA the power to impose administrative fines of up to EUR20 million or, in the case of an undertaking, up to 4% of the total worldwide annual turnover in the preceding financial year, whichever is higher.
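By way of illustration only, the "whichever is higher" fine cap described above can be expressed as a small calculation. This is a sketch, not legal advice; the function name and inputs are hypothetical:

```python
def gdpr_fine_cap(worldwide_turnover_eur: float, is_undertaking: bool) -> float:
    """Illustrative maximum administrative fine under Article 83(5) GDPR:
    EUR 20 million, or, for an undertaking, 4% of total worldwide annual
    turnover in the preceding financial year, whichever is higher."""
    base_cap = 20_000_000.0
    if is_undertaking:
        # For undertakings, the cap is the higher of the two figures.
        return max(base_cap, 0.04 * worldwide_turnover_eur)
    return base_cap

# An undertaking with EUR 1 billion turnover: 4% = EUR 40 million, above the floor.
print(gdpr_fine_cap(1_000_000_000, True))   # 40000000.0
# Below EUR 500 million turnover, the EUR 20 million floor applies.
print(gdpr_fine_cap(100_000_000, True))     # 20000000.0
```

Note that this is only the statutory ceiling; the actual fine is determined via the EDPB's five-step methodology discussed below.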

In order to create more uniformity in the fines issued by EU member states, the European Data Protection Board (EDPB) issued guidelines on the calculation of administrative fines. These guidelines set out a five-step methodology for assessing the amount of a fine:

  • step 1: identify processing operations;
  • step 2: identify the starting point for further calculation of the amount of the fine by classifying the seriousness of the infringement;
  • step 3: identify aggravating and mitigating circumstances related to past or present behaviour of the controller/processor;
  • step 4: identify the relevant legal maximums; and
  • step 5: analyse the requirements of effectiveness, dissuasiveness, and proportionality.

The GDPR, as an EU regulation, applies directly in and to EU member states. Member states may deviate only where the GDPR expressly allows for such deviations; otherwise, no deviation is permitted. This also applies to issues such as international data transfers.

In the Netherlands, the role of self-regulating entities in the realm of data protection is negligible.

NGOs play a more important role. For example, Privacy First advocates for the protection of privacy rights in the Netherlands, while Bits of Freedom researches the impact of new privacy-related legislation on people, rights and freedoms.

The Netherlands falls under the EU regime, and as such it follows the EU omnibus model.

The broad, catch-all scope often instils a sense of unease in the public's perception. In practice, however, supervisory authorities, constrained by time and budget, cannot investigate every minor violation.

The enforcement rate is therefore perceived as non-aggressive. However, an upward trend is noticeable now that more aspects of the GDPR have crystallised.

All relevant key developments are discussed in the Netherlands Trends and Developments section of the Chambers Data Protection Guide.

Arguably the most impactful development in data protection is the adequacy decision granted for the EU-US Data Privacy Framework, which for those companies self-certified to this framework simplifies the process of transferring EU personal data to the United States.


Data Protection Officer

Organisations are required to appoint a data protection officer if:

  • they are a public authority or body, except for courts acting in their judicial capacity;
  • their core activities require regular and systematic monitoring of data subjects on a large scale; or
  • their core activities include the processing of sensitive categories of personal data on a large scale.

Lawful Basis

The GDPR requires a lawful basis for processing (including collecting) personal data. The available bases are limited to the following:

  • consent;
  • contract performance;
  • necessary for compliance with a legal obligation;
  • vital interest of the data subject;
  • necessary for performance of a task carried out in the public interest; or
  • legitimate interest.

When processing is based on legitimate interest, the organisation must perform and document a legitimate interest assessment explaining why it concluded that its interest is legitimate and is not overridden by the interests and rights of the data subjects.

Privacy by Design or by Default

Organisations subject to the GDPR must take into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing. Additionally, they must evaluate the potential risks to the rights and freedoms of individuals, considering the likelihood and severity of these risks brought about by the data processing activities. They must also, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data protection principles.

Data Protection Impact Assessment

The data protection impact assessment (DPIA) is a process designed to describe the processing, assess its necessity and proportionality, and help manage the risks to the rights and freedoms of natural persons resulting from the processing of personal data by assessing those risks and determining the measures to address them. DPIAs are an important accountability tool to demonstrate compliance with the requirements of the GDPR. Under the GDPR, a DPIA is mandatory when the envisaged processing is "likely to result in a high risk" to individuals’ rights and freedoms. Certain EU member states (eg, the Netherlands) have also published a binding list of processing activities for which a DPIA is mandatory. For example, according to the list published by the Dutch Data Protection Authority, a DPIA is mandatory if, inter alia, CCTV is installed in the workplace.

Privacy Policies

Organisations subject to the GDPR must have privacy policies in place in order to fulfil their transparency obligations under Article 12 of the GDPR. Most often this will include an internal privacy policy for employees and an external privacy policy that is customer-facing. However, the two can also be combined into one.

Data Subjects’ Rights

Under the GDPR, data subjects have the right to request confirmation as to whether their personal data is processed by a company, as well as the rights of access, rectification, erasure, restriction of processing, objection to processing, data portability, and the right not to be subject to automated decision-making.

Pseudonymisation, Anonymisation, De-identification

The GDPR imposes no specific restrictions on pseudonymising, anonymising or de-identifying personal data. Rather, these can serve as technical measures to protect personal data.

Restrictions on Automated Decision-making (Including Profiling)

In principle, decisions based solely on automated processing, including profiling, that produce legal or similarly significant effects are not allowed. However, exceptions apply when:

  • the automated decision-making is necessary for entering into or performing a contract with the data subject;
  • it is authorised by EU or member state law; or
  • it is based on the data subject’s explicit consent.

The Concept of Harm or Injury

Affected individuals may claim material (eg, monetary) or non-material (eg, reputational) damages resulting from an infringement of the GDPR. In order to do so, they will need to prove the damage suffered.

The definition of sensitive data is provided in the GDPR and includes personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation. It is generally prohibited to process these special categories of personal data, unless one of the specific exceptions described in Article 9(2) of the GDPR applies. The most commonly used exception is that of explicit consent.

The other requirements, including data subject rights, are described in 2.1 Omnibus Laws and General Requirements.

Unsolicited commercial or marketing communications are generally prohibited unless they rest on one of the lawful bases described in 2.1 Omnibus Laws and General Requirements. “Unsolicited” implies the absence of consent, and it is unlikely that a company could successfully rely on any of the other lawful bases under Article 6 of the GDPR.

Behavioural and targeted advertising is generally only allowed when the “target” has provided consent. The consent must be opt-in, not opt-out. For example, this means that when there is a checkbox to provide the consent, this checkbox cannot be pre-filled.

The two most significant considerations for privacy in the workplace are:

  • Monitoring: Monitoring of employees, whether through email surveillance or CCTV, generally leads to the obligation to perform a data protection impact assessment.
  • Internal privacy policy: The employer is required to be transparent about which employee personal data is processed, typically in an internal privacy policy or an employee handbook.

The legal standards and potential enforcement penalties are discussed in 1.3 Administration and Enforcement Process.

In terms of private litigation, two important cases are discussed in the Netherlands Trends and Developments section of the Chambers Data Protection Guide.

Finally, class actions are allowed under the GDPR and Dutch law. Most famous is the TikTok case in 2021, which resulted in the Dutch Data Protection Authority issuing an administrative fine of EUR750,000. In this case, three non-profit organisations brought overlapping claims for declaratory relief regarding, inter alia, the legality of TikTok’s general conditions and the processing of personal data, as well as substantial claims for damages.

In order to target organised crime, the Dutch Code of Criminal Procedure was recently amended. These changes have broadened the scope of investigative powers available to law enforcement authorities.

With respect to organised crime, law enforcement authorities may now engage in the following activities:

  • systematic observation;
  • infiltration;
  • pseudo-buying or pseudo-services;
  • undercover systematic collection of information;
  • recording confidential communications;
  • examination of communications by automated means; and
  • demand for data.

Generally, the investigating officer will be required to obtain approval or an order from the public prosecutor, who operates under the Ministry of Justice and Security.

Such laws are described in 1.3 Administration and Enforcement Process.

The Netherlands is a signatory to the OECD Declaration (December 2022) regarding government access. The practical implications of this are not significant, as the GDPR generally provides for more stringent limitations.

In principle, invoking a foreign government access request as a legitimate basis to collect and transfer personal data is permitted. However, the organisation in question must make a careful analysis of why its legitimate interest should override the rights and freedoms of the data subjects involved. The Netherlands is currently not participating in a Cloud Act agreement with the USA.

The most prominent debate on government access relates to the US CLOUD Act – an instrument through which the US government can request electronic information held by service providers. Some argued that the US CLOUD Act meant they could no longer use US-based service providers, as this legislation interfered with the rights and freedoms granted to EU data subjects under the GDPR. The Dutch government, however, published an extensive analysis of why the CLOUD Act does not preclude organisations from using US IT service providers. The analysis follows a risk-based model and assesses the actual risk posed by the US CLOUD Act as low.

Restrictions apply to data transfers to third countries. Third countries are non-EEA countries. In principle, no personal data may be transferred to third countries, unless there is a transfer mechanism in place. In order to determine whether personal data can be transferred to a third country in a safe manner, organisations must conduct a data transfer impact assessment, which assesses the applicable laws of the recipient’s country and the technical and organisational measures in place for the transfer.

The most commonly used transfer mechanisms are an adequacy decision, binding corporate rules (BCR), and the standard contractual clauses (SCC).

The adequacy decision means that the European Commission has designated the country as offering adequate protections and safeguards for the rights and freedoms of data subjects essentially equivalent to those under the GDPR.

BCR are often used by international organisations with offices all around the world, as they set a convenient framework for internal data flows. BCR must be approved by the supervisory authority in the country where they are requested (ie, the Dutch DPA for the Netherlands). A downside to the BCR is that the process can take up to several years before approval is granted.

Finally, the SCC are used in abundance. They contain a set of standardised provisions ensuring that the rights and obligations between the data exporter and data importer are sufficiently addressed. However, organisations often assume that signing the SCC alone suffices to transfer personal data to a third country; it is important to keep in mind that a data transfer impact assessment must still be conducted even after the SCC are signed by the relevant parties.

There is no government notification or approval required in order to transfer personal data under the abovementioned transfer mechanisms.

There are restrictions for onward transfers. Onward transfers are subsequent transfers from the data importer, who first received the personal data in the third country, to another organisation in the same or another third country. In such cases, there must again be a transfer mechanism in place, per the explanation in 4.1 Restrictions on International Data Issues and 4.2 Mechanisms or Derogations that Apply to International Data Transfers.

There are no requirements to share software code, algorithms, encryption, or similar technical detail with the government.

An organisation collecting or transferring personal data in connection with foreign government data requests must in particular take into account the following considerations and limitations.

Lawful Basis

The organisation must have a lawful basis to collect or transfer data in connection with a foreign government data request, foreign litigation, proceedings or internal investigations. Organisations will often opt for legitimate interest or a need to comply with a legal obligation. However, for the legal obligation basis it is important to note that the processing must have a basis under EU or member state law (ie, third-country government requests do not meet that requirement).

Transfer Mechanism

If foreign government data requests, foreign litigation proceedings (eg, civil discovery) or internal investigations require a transfer of personal data to a third country, the transfer must be based on a valid transfer mechanism.

Data Subjects’ Rights

When collecting or transferring personal data for the purposes of a foreign government data request, foreign litigation proceedings (eg, civil discovery) or internal investigations, the organisation must respect the data subjects’ rights as laid down in the GDPR. Importantly, the data subject must be informed of such processing, and have appropriate redress rights to request restriction of these processing activities.

There is no relevant “blocking” statute in place for the Netherlands.

The following areas are currently addressed in law.

Big Data Analytics

Besides the GDPR, big data analytics is also addressed in the EU Data Act, which entered into force on 11 January 2024. The EU Data Act enhances the possibilities for data sharing, for instance, by mitigating abuse of contractual imbalances that impede equitable data sharing.

Automated Decision-making (including Profiling)

Automated decision-making (including profiling) is addressed by the GDPR, as explained in 2.1 Omnibus Laws and General Requirements.

Artificial Intelligence (Including Machine Learning)

Artificial intelligence will be addressed in the upcoming EU Artificial Intelligence Act (EU AI Act), which applies a risk-based model: AI posing an unacceptable risk will be prohibited, high-risk AI will be subject to more extensive obligations, and limited-risk AI will be subject to minimal transparency obligations. The European Parliament reached a provisional agreement with the Council on the AI Act. More information on the EU AI Act is provided in the Netherlands Trends and Developments section of the TMT Guide.

Internet of Things (IoT) or Ubiquitous Sensors

IoT services are affected by various laws, including the GDPR, but there is currently no law specifically addressing them. However, the European Commission has announced the launch of a new Cyber Resilience Act in order to improve the minimum security requirements for connected devices, both during product development and throughout the product life cycle. On 1 December 2023, the European Parliament and the Council reached an agreement on the Cyber Resilience Act, which means that the next step is formal approval by both the European Parliament and the Council.

Facial Recognition/Biometric Data

Facial recognition and biometric data are subject to various restrictions in the GDPR. Firstly, there must be an exception under Article 9 of the GDPR to process such information, since biometric information qualifies as a special category of personal data, of which processing is otherwise prohibited. Furthermore, the use of facial recognition and biometric data will likely require the performance of a data protection impact assessment, and necessary safeguards following from the conclusions of such assessment.

Geolocation

Geolocation is generally considered personal data under the GDPR. As such, it is subject to the rights and limitations contained therein. However, it does not qualify as a special category of personal data subject to Article 9 of the GDPR.

Drones

The use of drones is subject to the Easy Access Rules for Unmanned Aircraft Systems (Regulations (EU) 2019/947 and 2019/945) and additional Dutch regulations, including the Regulation on Unmanned Aircrafts, Regulation on Model Flights, and Regulation on Remote Controlled Aircrafts. These regulations set out rules on the type of drone that can be flown, whether or not a camera is allowed, the altitude at which they can fly, and whether a specific certificate is required in order to operate the drone.

Disinformation, Deepfakes, or Other Online Harms

The spreading of disinformation, deepfakes, or other online harms is most prominently addressed under the EU Digital Services Act, which places obligations on online intermediaries and platforms in relation to the moderation of online content.

“Dark Patterns” or Online Manipulation

Dark patterns and online manipulation are currently addressed in the Unfair Commercial Practices Directive, which is implemented in the Dutch Civil Code, as well as the EU Digital Services Act, and EU Data Act. They will also be addressed in the upcoming EU AI Act. 

Fiduciary Duty for Privacy or Data Protection

A specific fiduciary duty applies to publicly listed companies under the Dutch Corporate Governance Code and to financial institutions under the EU Digital Operational Resilience Act. This duty relates to IT risk management, which is broader than privacy or data protection alone.

The Dutch DPA has issued guidance on the use of AI and provides updates on its research on the latest trends every six months. In addition, the Dutch DPA has been designated as co-ordinating supervisory authority with respect to algorithms and AI and has created a separate body within the organisation dedicated to these technologies. Furthermore, all eyes are currently on the upcoming EU AI Act.

Please refer to 2.5 Enforcement and Litigation.

In corporate transactions, the record of processing activities functions as a starting point for the relevant personal data processed by the organisation subject to due diligence. If organisations do not have such a document available, this is already an indication of the level of maturity of their data protection programme. In addition, it is important to identify all the relevant data flows, especially where the target comprises a group of different entities.

The process continues by assessing all relevant policies and documentation in place at the target company. It is also important to examine whether the target has the required data processing agreement in place with its vendors and customers.

Furthermore, a typical issue that arises relates to companies’ cookie practices on their websites. Many companies do not have a properly functioning cookie consent mechanism in place. For example, the opt-in functionality may not work correctly, resulting in advertising cookies being deployed before the website user has had the opportunity to consent.

Finally, it is important to be aware of any sensitive processing; eg, involving children or biometric information, as further restrictions will apply to such processing. For instance, this may lead to the need to conduct a data protection impact assessment.

There is no strict requirement for companies to disclose their cybersecurity programme. However, there is a discernible trend in which companies disclose certain cybersecurity-related information in their annual reports.   

The Netherlands, as an EU member state, is subject to the Digital Markets Act, Digital Services Act, and the Data Act. Developments and trends relating to the convergence of privacy, competition and consumer protection law or policy, including AI, are discussed at length in the Netherlands Trends and Developments section of the TMT Guide.

Also noteworthy is the Dutch government’s positive outlook on tech, and its important role in negotiating GDPR-compliant agreements with major cloud providers, such as Google and AWS, which are not typically eager to accept changes to their standard contractual documentation.

A prime example of this positive outlook follows from the Dutch Cloud Policy 2022 under which most government data may be stored in the cloud.

Greenberg Traurig, LLP

Beethovenstraat 545
1083 HK Amsterdam
The Netherlands

+31 651 289 224

+31 20 301 7350

Herald.Jongen@gtlaw.com
www.gtlaw.com
Author Business Card

Trends and Developments


Authors



Greenberg Traurig, LLP is an international law firm with approximately 2,300 attorneys serving clients from 40 offices in the USA, Latin America, Europe, Asia and the Middle East. The firm’s dedicated TMT team consists of more than 100 lawyers, of which seven are in Amsterdam. The Amsterdam team is well-versed in representing clients around the world in domestic, national and international policy and legislative initiatives, as well as guiding them through the business growth cycle for a variety of technologies. As a result, it provides forward-thinking and innovative legal services to companies producing or using leading-edge technologies to transform and grow their businesses.

Introduction

The Netherlands is witnessing a transformative era in data protection and algorithmic oversight, with the Dutch Data Protection Authority (DPA) taking pioneering steps to navigate the challenges posed by evolving technologies. This contribution sheds light on the significant progress made by the Dutch DPA in supervising algorithms and artificial intelligence (AI) while addressing broader privacy concerns.

Court rulings against tech giants Meta and Criteo reflect the increasing role of the judicial system in safeguarding privacy rights. The article also covers the DPA’s collaboration with international entities, cybersecurity initiatives, and updates on privacy risks in major tech companies.

The Dutch government’s continued influence in overseeing Big Tech, evident in negotiations with companies like Google and AWS, emphasises the importance of navigating the intersection of technology and privacy. As privacy considerations gain prominence in corporate landscapes, the Dutch DPA advises large companies to enhance transparency, aligning privacy practices with socially responsible business practices.

In the realm of data transfers, the adoption of the EU-US Data Privacy Framework stands out as a significant trend, posing challenges and opportunities for businesses. The contribution concludes by highlighting the Dutch DPA’s role in advocating for transparency and privacy considerations, providing guidelines for companies and emphasising the importance of aligning with evolving regulatory frameworks.

The contribution also highlights the Dutch DPA’s vigilance in scrutinising Tesla’s security cameras, ensuring privacy-friendly adjustments.

Major Strides in the Dutch DPA’s Approach Towards Supervising the Use of Algorithms/AI

The growing integration of generative AI within organisations and its implications for personal data have raised concerns. In response, the Dutch DPA has initiated various measures and initiatives to examine this matter.

Dutch DPA Asks for Clarification on ChatGPT

On 7 June 2023, the Dutch DPA announced that it had approached the developer of ChatGPT to gain clarity on the training methods of the generative chatbot. The inquiry included questions about how the platform utilises information from user interactions, the policies and procedures regarding the treatment of personal data sourced from the internet for training purposes, and the approach to handling generated responses that may be inaccurate, outdated, defamatory, or offensive.

2020-2023 Enforcement Agenda

As part of its enforcement agenda for 2020-2023, the Dutch DPA established a new unit in January 2023 dedicated to overseeing AI and algorithms, known as the algorithms co-ordination directorate.

This directorate plays a co-ordinating role in cross-sectoral investigations related to AI and algorithms. In addition to this co-ordination unit, the Dutch DPA bolstered its supervisory capacity for tasks specifically related to AI and algorithms, with the government committing to providing a yearly budget for these purposes.

First Algorithmic Risks Report Netherlands

On 17 July 2023, the Dutch DPA released its inaugural report on algorithm-related risks in the Netherlands. In the report, the DPA acknowledges the positive impact and the value creation potential of algorithms and AI, but also highlights the need for significant steps by the Dutch government and industry to better manage and understand these technologies, including risks posed by them.

The DPA underscores two current challenges: the rapid adoption of AI innovations in society, presenting both opportunities and risks, and the imperative for the Netherlands to gain insight into the use of high-risk algorithms that profoundly impact people’s lives.

The Dutch DPA also warns against the deployment of the latest AI innovations, cautioning organisations to thoroughly assess risks before experimenting with applications such as candidate evaluations, fraud detection, customer assessments, or patient evaluations.

Furthermore, this was the first report to provide a comprehensive overview of developments, risks, and challenges in the Netherlands, presented from an overarching risk perspective. The Dutch DPA plans to publish this report every six months to provide insights into recent developments, current risks, and associated challenges. Since the beginning of 2023, the Dutch DPA has assumed the role of the co-ordinating authority for risk signalling, advice, and collaboration in algorithm oversight, positioning the Netherlands at the forefront of international efforts.

The report concludes by emphasising the importance of periodic reports on algorithm risks as a crucial instrument for this oversight approach, with global efforts underway for standardisation and co-ordination of regulations and oversight on algorithms and AI.

International Collaboration on AI Supervision between the European Commission and the Netherlands

UNESCO, the European Commission, and the Dutch Authority for Digital Infrastructure have initiated a collaborative project called “Supervising AI by Competent Authorities”. The project aims to design institutional frameworks for the ethical governance of AI in the Netherlands. Financed through the European Commission’s Technical Support Instrument, the initiative addresses key questions surrounding AI governance, adaptability to rapid technological advancements, and the balance between regulation and innovation.

The collaboration focuses on analysing the optimal institutional design for AI supervision in compliance with the EU AI Act and UNESCO’s Recommendation on the Ethics of AI. Recognising the challenges and potential risks of AI, the project seeks to capitalise on its opportunities while safeguarding human rights, security, and ethical considerations.

Inspector General Angeline van Dijk of the Dutch Authority for Digital Infrastructure emphasises the importance of balancing AI’s opportunities and risks. UNESCO will support Dutch authorities by producing a comprehensive report on AI supervision, developing case studies, outlining best practices, and organising training sessions to enhance institutional capacity.

UNESCO’s core function involves promoting ethical development and application of emerging technologies. The collaborative project aligns with UNESCO’s global standard, the Recommendation on the Ethics of AI, adopted in November 2021. The Recommendation provides policy guidance across eleven areas for effective AI regulation and oversight based on universal values and principles.

Gabriela Ramos, UNESCO Assistant Director-General for Social and Human Sciences, emphasises that AI governance is a societal discussion and underscores the need for effective frameworks rooted in ethical and moral values. The project aims to help Competent Authorities across the European Union enhance their capacity to ensure AI system compliance with the EU AI Act and ethical standards. The experience gained will contribute to UNESCO’s work with member states, assisting in the development of benchmarks and frameworks for stronger, safer, and more ethical AI deployment globally.

Second Algorithmic Risks Report Netherlands

In its second report, the Dutch DPA emphasised the increasing risks associated with AI and algorithms, particularly with the rise of generative AI. Issues such as disinformation, privacy violations, and discrimination are becoming more prevalent, outpacing society’s ability to regulate and supervise these technologies effectively. The Dutch DPA, acting as the national co-ordinating authority for AI and algorithm supervision since early 2023, highlights the urgent need for improved risk management and incident monitoring.

In its second AI and Algorithmic Risks Report Netherlands, the Dutch DPA recommends the implementation of a comprehensive national master plan by 2030. This plan aims to ensure effective management and control of AI-associated risks, promoting responsible and safe AI use across various sectors. The proposed strategy emphasises human control and oversight, secure applications and systems, and strict rules for organisational control.

To achieve this, the Dutch DPA suggests clear yearly goals and agreements, incorporating the implementation of regulations like the Artificial Intelligence Act (AI Act). The plan emphasises the importance of human awareness and understanding of algorithms and AI in daily life, fostering trust and responsible use.

The report acknowledges the necessity of investing in education to ensure people of all ages understand the impact of algorithms and AI on their lives. This knowledge is crucial for individuals in various roles, from teachers and doctors to workers and organisational leaders. The report underscores the need for structural investments in knowledge to enable society to navigate the use of algorithms and AI effectively.

The rise of generative AI poses high risks, including disinformation, manipulation, and discrimination. The AI Act, scheduled to take effect in 2025, will provide oversight for foundation models and the organisations developing them. The Dutch DPA calls for proactive efforts from companies and organisations in risk management, internal supervision, and control mechanisms to ensure reliable and safe AI use, emphasising that effective regulation goes beyond setting up supervision alone.

Tesla Cameras under Scrutiny by the Dutch DPA

In February 2023, the Dutch DPA announced that Tesla had made privacy-friendly adjustments to the settings of its built-in security cameras in response to an investigation by the regulator. The Dutch DPA examined Tesla’s “Sentry Mode”, designed to protect cars from theft or vandalism by capturing footage using four external cameras. Previously, when Sentry Mode was activated, the cameras would continuously record the surroundings of a parked Tesla, storing one hour of footage. Following software updates, the cameras now default to the off position. If users choose to activate them, the cameras will store a maximum of only ten minutes of footage.

The Dutch DPA found that parked Teslas in Sentry Mode often recorded individuals near the car without their knowledge, storing the footage for an extended period. Tesla responded to the investigation by implementing changes. Sentry Mode now activates only when the car is physically touched, not in response to “suspicious” movements around the vehicle. In such instances, the car does not automatically start recording; instead, the owner receives a text message on their phone.

While the car can still capture camera footage, this feature must be manually enabled by the user. When recording, the car’s display screen indicates it, and the headlights emit a special light signal to inform people that they are being filmed. By default, the car retains one minute of footage, with the option for the owner to increase this to ten minutes. Notably, the captured footage remains within the car, and there is no capability to share the recordings with Tesla.

The Dutch DPA’s investigation did not result in any fines or sanctions against Tesla, as it determined that the legal responsibility for the captured footage lies with the car owner, not Tesla. The Dutch DPA welcomed Tesla’s modifications, emphasising their role in protecting passersby and reducing the risk of Tesla drivers violating the law by illicitly filming people.

The rules governing in-car cameras align with those applicable to cameras installed by individuals around their homes. Filming public roads is generally prohibited, with exceptions permitted only in cases of serious security concerns, such as frequent car break-ins in the vicinity. Camera owners bear the responsibility of correctly configuring their devices and respecting the privacy of others.

Cybersecurity NCSC – One-Stop-Shop for Reporting Cyber Threats

The Dutch government is intensifying collaboration in cyber threat warnings, with three cybersecurity entities establishing a unified reporting centre for threat and vulnerability notifications. This move is part of the preparation for merging these three organisations into a single national cyber entity by the end of 2025. The involved entities include the National Cyber Security Centre (NCSC) under the Ministry of Justice and Security, the Computer Security Incident Response Team for Digital Service Providers (CSIRT-DSP), and the Digital Trust Center (DTC) under the Ministry of Economic Affairs and Climate.

The shared objective is to enhance digital security throughout the Netherlands. The three entities are now working closely to ensure timely warnings for all organisations that are victims or targets of cyber threats, regardless of their size or sector. The collaboration aligns with a key focus of the Dutch Cybersecurity Strategy: better threat visibility.

The government receives daily information about vulnerable or compromised systems from security researchers, ethical hackers, and domestic and international partners. This information is crucial for reaching the affected companies swiftly, allowing them to take preventive action. Government-issued warning messages are sent when a cyber threat is identified for an organisation in the Netherlands.

To streamline the process, the three entities have established a single reporting centre where cybersecurity experts and partners can share information about cyber threats and incidents. This central reporting centre, managed by the NCSC as the national Computer Emergency Response Team (CERT), can be reached at cert@ncsc.nl. The NCSC assesses the quality of the information and initiates the notification process involving all three organisations, using a transparent and consistent evaluation framework based on the principle of sharing as much information as possible.

A unified assessment framework has recently been developed by the three organisations to determine when a warning message is necessary. Approximately 30 types of cyber threats have been designated as always warranting a warning, such as an open server port due to a configuration error or the use of an outdated cloud application with security flaws.

The collaboration extends beyond threat warnings, as the entities already share IT systems. In the coming months, they plan to further develop an integrated, scalable, and robust warning service to support businesses in enhancing the digital security of their IT systems across the Netherlands.

Continuing Influence by the Dutch State

The Dutch government continues to have a positive influence on new tech, and plays an important role in negotiating GDPR-compliant agreements with major cloud providers, such as Google and Amazon Web Services (AWS), which are not typically eager to accept changes to their standard contractual documentation. On 1 June 2023, the Dutch government announced that it had entered into a framework agreement with AWS for government-wide use of (cloud) services.

On 20 April 2023, the Dutch Ministry of Education, Culture, and Science provided an update to the Dutch government regarding the progress of agreed-upon measures aimed at addressing privacy risks identified in Google Workspace. Subsequent negotiations with Google resulted in agreements specifying (i) timelines for addressing outstanding risks; (ii) ongoing monitoring of GDPR compliance post-remediation; and (iii) the implementation of a data transfer impact assessment.

On 1 June 2023, an agreement was reached with Google to mitigate the high risks identified in the data protection impact assessment (DPIA). This regulatory approach by the Dutch Ministry of Justice toward overseeing Big Tech has garnered, and continues to garner, global admiration.

On 5 July 2023, the Ministry of Education, Culture, and Science informed Parliament that Google’s remedial actions were deemed sufficient. Earlier, in 2020, the Dutch government had initiated a Data Protection Impact Assessment (DPIA) for Amazon Web Services (AWS), leading the Ministry of Justice and Security to negotiate GDPR-compliant usage of AWS products and services.

However, when the Dutch government and big US tech providers fail to reach an agreement, the Dutch government is not afraid to show its teeth. On 31 October 2023, the Financieele Dagblad (a Dutch financial newspaper) reported that the former Dutch State Secretary for Digital Affairs had asked the Dutch DPA to investigate Meta’s personalised advertising practices in relation to government communications through Facebook. According to the Dutch government, Facebook is not sufficiently transparent about this process, making it impossible for Facebook users to give meaningful consent.

Data Protection in the Dutch Courts

Tracking cookies

On 18 October 2023, the District Court of Amsterdam issued a ruling in summary proceedings against Criteo, an AdTech company engaged in the placement of third-party tracking cookies across users’ computers via various websites, with the purpose of monitoring user behaviour for targeted advertising.

The legal proceedings were initiated by a Dutch individual (“the Plaintiff”), who alleged that Criteo had infringed upon his privacy and data protection rights. The Plaintiff argued that Criteo violated these rights by deploying third-party tracking cookies on his computer without obtaining consent, processing his personal data without authorisation, and inadequately responding to his requests to cease processing, provide access to, and delete his personal data collected not only by Criteo but also by other entities within the real-time bidding (RTB) chain that received his data.

The court determined that Criteo, functioning as a joint controller, bears the responsibility to ensure proper consent is obtained. It emphasised that Criteo cannot evade accountability by relying on its partners, even if the task of obtaining consent is outsourced. Additionally, the court held that the onus is not on the data subject to disable cookies through browser settings or opt out of cookie usage.

The court issued a series of directives compelling Criteo to take specific actions within designated timeframes. These directives include refraining from placing cookies on the Plaintiff’s computer until valid consent is obtained, informing him about the group companies and third parties that have accessed his personal data (including the cookie ID), granting him access to his personal data, deleting all unlawfully processed personal data, and notifying relevant entities and third parties about the deletion request so that they may also erase unlawfully acquired personal data, as applicable.

Collective action against Meta

On 15 March 2023, the Amsterdam District Court delivered a groundbreaking verdict against Meta, establishing a precedent that upholds the privacy rights of Dutch residents. The District Court concluded that Meta had been unlawfully processing consumers’ data for advertising purposes, violating their privacy over an extended period. Consequently, the court determined that Meta’s practices were unjust towards consumers, resulting in the company’s unjust enrichment.

This declaratory judgment creates the potential for damage claims, although the feasibility and mechanics of such claims remain uncertain. The anticipation is that the trend of GDPR-related class actions navigating through the Dutch legal system will continue to gain momentum in the coming years.

Transparency of Large Companies Regarding Privacy Practices

On 5 December 2023, the Dutch DPA announced that it is advising large companies to enhance transparency regarding their privacy practices, suggesting the inclusion of privacy policy details in annual reports. The Dutch DPA has developed two guidelines to assist companies in achieving this. Emphasising the alignment of privacy with socially responsible business practices, the Dutch DPA’s Vice-Chair Monique Verdier notes the increasing focus on sustainability and work conditions, making privacy considerations a natural extension.

With the growing use of data in the digital age, the Dutch DPA encourages companies to actively showcase their approach to privacy risks to build trust with stakeholders, such as customers, employees, and shareholders. While legal compliance with the GDPR is mandatory, the Dutch DPA recommends additional openness about privacy practices.

The authority urges larger companies to proactively communicate their privacy efforts, with annual reports serving as a potential starting point. Recognising existing corporate inquiries on improving privacy accountability, the Dutch DPA provides guidelines for integrating privacy information into annual reports.

Additionally, the Dutch DPA suggests that boards of directors or supervisory boards play a catalytic role in ensuring privacy compliance within companies, offering guidelines titled “The Board of Directors or Supervisory Board and Privacy: Your Role as Supervisor” to facilitate discussions with executive leadership on privacy management.

Data Transfers

The adoption of the adequacy decision for the EU-US Data Privacy Framework (EU-US DPF), succeeding the Privacy Shield, stands out as a highly impactful trend in data protection. This decision restores the option for data transfers between EU organisations and those in the United States that have certified their compliance with the EU-US DPF principles. In alignment with the EU, both the UK and Switzerland have introduced their respective transfer mechanisms, namely the UK Extension to the EU-US DPF and the Swiss-US DPF.

Businesses that previously depended on the EU-US Privacy Shield and Safe Harbor frameworks now confront a new challenge. They must decide whether to attempt, for the third time, to certify their compliance with a framework that may prove temporary or be invalidated. Given that Maximilian Schrems, the individual responsible for the invalidation of its predecessors, has expressed his intention to initiate legal action against the EU-US DPF, a thorough review by the Court of Justice of the EU can be expected in the coming years.

On the other hand, pursuing certification could ease the contractual and compliance challenges accumulated over the years. However, there is a possibility that the DPF may heighten the burdens on US companies, exposing them to intricate litigation and regulatory risks.

Certain companies may welcome the DPF as a means to align with regulatory requirements. In various European countries such as France, Italy, Norway, the Netherlands, and Sweden, challenges have arisen regarding the use of online tracking and analytic tools. Data protection authorities have questioned the lawfulness of transferring data to US providers.

From the data protection authorities’ perspective, major tech providers have struggled to implement adequate “supplementary measures” to safeguard European analytics data against government surveillance in a manner deemed “essentially equivalent” to EU law. US big tech companies have adopted, and will likely continue to adopt, the DPF to facilitate lawful cross-border data transfers.

Conclusion

The DPA stands at the forefront of global efforts in data protection and algorithmic oversight, showcasing remarkable achievements. From influencing negotiations with Big Tech to scrutinising AI and algorithm risks, the DPA has been proactive in safeguarding privacy.

Landmark court rulings against Meta and Criteo, coupled with the DPA’s guidance to enhance transparency in privacy practices, reflect a growing emphasis on responsible business conduct. The international collaboration with UNESCO and the European Commission underscores the Netherlands’ leadership in ethical AI governance.

The adoption of the EU-US DPF for data transfers presents both challenges and opportunities, highlighting the complex landscape of data protection. The DPA’s call for a national master plan by 2030 reinforces its commitment to responsible AI use and privacy considerations.

In essence, the Dutch DPA’s multifaceted approach positions the Netherlands as a pioneer in addressing the evolving dynamics of technology and privacy. As digital landscapes continue to evolve, the DPA’s proactive measures set a noteworthy precedent for global data protection efforts.

Greenberg Traurig, LLP

Beethovenstraat 545
1083 HK Amsterdam
The Netherlands

+31 651 289 224

+31 20 301 7350

Herald.Jongen@gtlaw.com
www.gtlaw.com