Data Protection & Privacy 2020

Last Updated March 09, 2020

Netherlands

Law and Practice

Authors



Vondst Advocaten N.V. is an Amsterdam-based boutique law firm that focuses on data protection law, IT, IP and life sciences. The firm combines a deep knowledge of and experience with the application of data protection concepts and is therefore pre-eminently able to provide workable solutions. A strict application of law when required, a flexible approach whenever possible. The data protection team has repeatedly and successfully assisted clients in enforcement matters against the Data Protection Authority and the Authority for Consumers & Markets. Vondst advises various clients on a strategic level about compliance with the GDPR. Recent matters have included advising a service provider in relation to a data breach, advising a financial institution on data retention issues and assisting a healthcare company in data access requests.

The right to privacy is embedded in Article 10 of the Dutch Constitution. This Article provides for a general right of protection of private life as well as an obligation to lay down rules on data protection. This Article must be interpreted in the light of Article 8 of the European Convention on Human Rights and Articles 7 and 8 of the European Charter of Fundamental Rights of the European Union.

In the Netherlands, data protection is regulated by the General Data Protection Regulation (GDPR). The GDPR came into force on 25 May 2018 and regulates the processing of the personal data of individuals by imposing obligations on data controllers and data processors.

As a directly applicable regulation, the legal obligations contained in the GDPR have direct effect in the Netherlands without any national implementing measures. However, the GDPR contains a number of opening clauses that give EU member states discretion to determine how certain provisions of the GDPR will apply in member state law.

The Netherlands has introduced such specific derogations in Dutch law through the Dutch General Data Protection Regulation Implementation Act (the Implementation Act). The Implementation Act repealed the implementation act of the EU Data Protection Directive – the Dutch Data Protection Act. Aside from the enforcement regime set out in the GDPR, the Implementation Act allows the Dutch Data Protection Authority (AP) to impose an administrative enforcement order to enforce obligations laid down by the Implementation Act.

The e-Privacy Directive 2002 (as amended in 2009) regulates direct marketing. The e-Privacy Directive has been implemented in the Dutch Telecommunications Act, which also includes provisions on the use of cookies and similar techniques. The e-Privacy Directive will be replaced by the e-Privacy Regulation within the next few years.

The collection and processing of personal data is also regulated by various specific laws and regulations and certain sector-specific laws.

The national data supervisory authority is the Dutch Data Protection Authority (AP). The AP is charged with the supervision of the processing of personal data in accordance with the GDPR and the Implementation Act. The AP is competent to perform the tasks and exercise the powers set forth in Articles 57 and 58 of the GDPR. In addition, the Implementation Act allows the AP to impose an administrative enforcement order to enforce obligations laid down by the GDPR and the Implementation Act.

In general, the AP focuses on material personal data breaches. Priority is given to violations that have a major impact on privacy or to minor violations that affect many data subjects. If the AP finds minor violations, it will often first give a warning, provided the violator can demonstrate good faith and is prepared to improve (for example, by implementing new privacy procedures). In 2018, however, the AP imposed a penalty of EUR600,000 on Uber for violating the Dutch Data Breach Regulation (based on the former Dutch Data Protection Act). The AP has performed various targeted enforcement actions since the GDPR came into force. On 16 July 2019, the AP imposed a penalty of EUR460,000 on Haga Hospital for violating the GDPR. In addition, the AP imposed a charge subject to periodic penalty payments on both the National Police and the Employee Insurance Agency (UWV); they had to take appropriate security measures within six and three months respectively, subject to penalties of EUR12,500 per week (up to a maximum of EUR200,000) and EUR150,000 per month (up to a maximum of EUR900,000). The AP also imposed a charge subject to periodic penalty payments on insurance company Menzis due to unauthorised access by employees to medical data. From time to time, the AP announces specific areas of focus. Recent focus has been on the security of personal data, big data and profiling, medical data, personal data held by the (digital) government and personal data in labour relationships. The AP has announced that personal data breaches that are not notified in accordance with the GDPR are a focus point for 2020.

The Dutch Authority for Consumers and Markets (Autoriteit Consument & Markt – ACM) is charged with the supervision of the Telecommunications Act (direct marketing and cookies). For violations of the Telecommunications Act, the ACM may impose an administrative penalty of up to EUR900,000 per breach or 10% of the annual turnover of the company in breach (whichever is higher).

In general, the enforcement process starts with a suspicion or a complaint. The regulator can then decide to launch an investigation. The findings of this investigation are recorded in a report (called a statement of objections). The offender is given the opportunity to express their opinion in writing or orally. If the regulator decides to impose a penalty, it will lay down this penalty in a penalty decision. This decision will, in principle, be published on the website of the regulator.

Within six weeks after the penalty decision, the offender can file an objection with the regulator. During the objection process, the interested parties are given the opportunity to be heard at an oral hearing. The regulator then renders a written decision. The offender can appeal this decision with the District Court and ultimately appeal the judgment of the District Court with either the Administrative High Court for Trade and Industry or the Administrative Jurisdiction Division of the Council of State.

The Netherlands belongs to the continental law tradition, in which statutory law is the primary legal source. Dutch privacy and data protection law is based on the same sources as Dutch law in general – law and other statutes, court practice, parliamentary history and established legal doctrine. As the Netherlands is a member of the EU, the legal framework for privacy and data protection law in the Netherlands is, to a significant and continuously growing extent, based on European and EU law.

There are several organisations that are committed to civil rights, privacy and consumer interests in the Netherlands. Very recently, a coalition of NGOs initiated proceedings against the Dutch government, which led to a court banning the use of the algorithm-based system SyRI (System Risk Indication) by the Dutch government.

The AP regularly investigates potential violations and often looks for an amicable solution. Generally, the AP tends to impose an administrative order subject to a penalty in the event of non-compliance, allowing companies to end their violation in order to avoid a substantial penalty. In 2018, after more than 15 years, the AP imposed its first penalty, on Uber (for a violation of the Data Breach Regulations). Subsequently, the AP imposed a penalty on the Haga Hospital in 2019 for violating the GDPR.

The ACM, however, is much more aggressive when it comes to enforcement actions.

In the past 12 months, focus on the GDPR has increased. The AP has performed several targeted enforcement actions, such as investigations into data processing agreements, data protection officers (DPOs), privacy policies at healthcare institutions and political parties and records of processing activities. There have been a limited number of court cases, most of which pertained to the right to be left alone, the right to be forgotten, freedom of speech or the right of access.

The AP approved the Datapro processor framework issued by NL Digital, the trade association of the Dutch IT industry. This framework includes a template processor agreement.

The next 12 months will be dominated by further enforcement by the AP, case law on the GDPR and debate on the upcoming e-Privacy Regulation.

The general requirements that apply in the Netherlands derive from the GDPR and, to a certain degree, from the Implementation Act.

According to Article 37 of the GDPR, the appointment of a DPO in the private sector is required where an organisation’s core activities involve:

  • the regular and systematic monitoring of individuals on a large scale; or
  • the large-scale processing of special categories of personal data (eg, health data) or personal data relating to criminal convictions and offences.

The European Data Protection Board (EDPB) – which consists of representatives of the European DPAs and the European Data Protection Supervisor, and which succeeded the Article 29 Working Party (ArtWP29) – has endorsed the ArtWP29 guidelines on DPOs (WP 243). These guidelines elaborate on the criteria for mandatory designation and on the position and tasks of the DPO.

The GDPR requires an organisation to publish the contact details of the DPO and to communicate these to the AP. To notify the AP, organisations should send an email to fg@autoriteitpersoonsgegevens.nl. The AP answers specific (administrative) questions of registered DPOs and sends a quarterly newsletter to DPOs.

The criteria for the lawfulness of processing are included in Article 6 of the GDPR. Apart from consent, personal (non-sensitive) data can be processed on a number of grounds, such as the performance of a contract or the pursuit of legitimate interests.

The principles of "privacy by design" and "privacy by default" (a requirement to put appropriate technical and organisational measures, such as pseudonymisation, in place to implement the data protection principles and safeguard individual rights) have been included in Article 25 of the GDPR. Under the former Data Protection Act, the AP released guidelines on security of personal data, including privacy by design, that are still relevant in daily practice. In its guidelines, the AP encourages controllers to pay close attention to security and to implement a "plan-do-check-act" cycle.

Under Article 35 of the GDPR, controllers are obliged to carry out a data protection impact assessment (DPIA) where their processing is likely to result in a high risk to individuals. The AP has created a DPIA checklist.

First of all, controllers have to check whether their intended type of processing is on the list of processing operations that require a DPIA. This list includes processing activities such as covert research, blacklisting, credit scoring, employee monitoring, the processing of communication and location data, and profiling.

If the intended processing is not on the list, the second step is to assess the risk. The AP refers to the nine criteria set out by the European data protection authorities. As a rule of thumb, a controller has to perform a DPIA if the processing meets two or more of the nine criteria such as evaluation or scoring, systematic monitoring, matching or combining datasets or innovative use of data.

If the intended data processing strongly resembles a type of data processing for which a DPIA has already been performed, there is no need for a DPIA with regard to the intended data processing.

The AP has also created a checklist to answer the question of whether a data controller would still need to perform a DPIA for a form of data processing that already existed prior to the GDPR.
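
For illustration, the two-step check described above can be expressed as a short decision aid. This is a minimal, hedged sketch: the lists below are abbreviated placeholders for the AP's list of mandatory-DPIA operations and the nine EDPB criteria, not the official texts, and the function and variable names are our own.

```python
# Minimal sketch of the two-step DPIA check described above.
# The lists are illustrative placeholders, not the official AP list or the
# nine EDPB criteria in full; always consult the published texts.

AP_MANDATORY_DPIA_LIST = {
    "covert research", "blacklisting", "credit scoring",
    "employee monitoring", "communication and location data", "profiling",
}

EDPB_RISK_CRITERIA = {
    "evaluation or scoring", "automated decision-making with legal effect",
    "systematic monitoring", "sensitive data", "large scale",
    "matching or combining datasets", "vulnerable data subjects",
    "innovative use of technology", "prevents exercising a right or service",
}

def dpia_required(processing_type: str, applicable_criteria: set[str]) -> bool:
    """Return True if a DPIA is (likely) required for the intended processing."""
    # Step 1: the processing appears on the AP's mandatory DPIA list.
    if processing_type in AP_MANDATORY_DPIA_LIST:
        return True
    # Step 2: rule of thumb - two or more of the nine EDPB criteria apply.
    return len(applicable_criteria & EDPB_RISK_CRITERIA) >= 2

# Example: employee monitoring (step 1) and a scoring scheme at scale (step 2).
print(dpia_required("employee monitoring", set()))                      # True
print(dpia_required("loyalty programme",
                    {"evaluation or scoring", "large scale"}))          # True
```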

Articles 13 and 14 of the GDPR require the data controller to provide information to the data subject, both where personal data is collected directly from the data subject and where it has been obtained from another source.

The implementation of privacy policies also assists organisations in meeting the principle of accountability (Article 5(2) of the GDPR). In addition, it is the controller’s responsibility to implement appropriate data protection policies, proportionate in relation to processing activities (Article 24(2) of the GDPR).

The GDPR grants data subjects a number of rights under Articles 13-22.

Based on the Implementation Act, the controller may refrain from applying the rights and obligations referred to above (excluding a number of rights in relation to automated decision-making, but including the obligation to communicate a personal data breach to the data subject in accordance with Article 34 of the GDPR), insofar as this is necessary and proportionate to safeguard, amongst other things, national security, public security and the enforcement of civil law claims.

The GDPR does not apply to anonymous data, as this data does not relate to an identified or identifiable individual. Pseudonymised data can be used to identify an individual and therefore the GDPR applies to the processing of pseudonymised data. Pseudonymisation, however, is an appropriate measure to ensure an appropriate level of security (Article 32(1)(a)).
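
As an illustration of the distinction drawn above, the sketch below pseudonymises an identifier with a keyed hash (HMAC-SHA256). The key handling and choice of algorithm are assumptions made for the example only; the point is that the output can still be linked back to an individual by whoever holds the key, so it remains personal data (consistent with the AP's finding on hashed MAC addresses discussed later in the context of Wifi tracking), while pseudonymisation nonetheless counts as a security measure under Article 32.

```python
# Minimal sketch of pseudonymisation by keyed hashing (HMAC-SHA256).
# Illustrative only: key management and algorithm choice are assumptions,
# not requirements taken from the GDPR or from AP guidance.
import hmac
import hashlib

SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (eg, a MAC address) with a keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

token = pseudonymise("00:1A:2B:3C:4D:5E")
# The token can still be linked back to the individual by anyone holding the
# key (or a lookup table), so it remains personal data under the GDPR -
# pseudonymisation is a security measure, not anonymisation.
print(token)
```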

Article 40 of the Implementation Act stipulates that Article 22(1) of the GDPR – regarding automated individual decision-making, including profiling – does not apply if the automated individual decision-making, other than based on profiling, is necessary for compliance with a legal obligation to which the controller is subject or for the performance of a task carried out for reasons of public interest. If this exception applies, the controller must take appropriate measures to safeguard the data subject's rights, freedoms and legitimate interests. If the controller is not an administrative body, the appropriate measures are in any case deemed to have been taken if the right to obtain human intervention, the data subject's right to express his or her point of view and the right to contest the decision have been safeguarded.

Any person who has suffered material or non-material damage as a result of an infringement of the GDPR has the right to receive compensation from the controller or processor for the damage suffered. Under Dutch law, financial loss and other disadvantages can be compensated. Other disadvantages may include non-material or emotional damage. Normally, compensation will be awarded in monetary form.

The concept of injury and harm may also play a role in the determination of the amount of a penalty by the AP.

The GDPR identifies a special category of personal data that, by its nature, merits higher protection, as the context of its processing could create significant risks to fundamental rights and freedoms.

This special category of personal data includes data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership, genetic data, biometric data processed for the purpose of uniquely identifying a natural person, data concerning health and data concerning a natural person's sex life or sexual orientation. Personal data relating to criminal convictions and offences, or related security measures, is not considered a special category of personal data, but there are specific rules for processing this type of data. Data relating to criminal convictions and offences is treated in the same way as sensitive data by the AP.

In addition to the exceptions for processing this type of special category of data mentioned in the GDPR, the Implementation Act provides for a number of exceptions.

In addition to the various types of sensitive data mentioned in the GDPR, the AP also treats other data as sensitive, such as financial data, location data, behavioural data and communications data.

Financial Data

Although financial information as such is not qualified as sensitive data in the GDPR, information about someone's financial details will nonetheless probably be treated as sensitive data by the AP. The code of conduct for financial institutions, which is binding on almost all Dutch financial institutions, gives important guidance on the use of personal data, even though the AP's formal approval of this code has lapsed.

Health Data

The GDPR defines data concerning health as personal data related to the physical or mental health of a natural person, including the provision of healthcare services, which reveal information about his or her health status. Health data may be processed, inter alia, if necessary to protect the vital interests of the data subject, for the purpose of medical diagnosis, for reasons of public interest in the area of public health, or for scientific purposes. The Implementation Act provides for additional exceptions, for example for administrative bodies, pension funds and employers, for schools, institutions of rehabilitation, healthcare providers and insurers.

In 2013, the AP investigated the Nike running app and found that it measured how many calories the user burns and how much, how often and how intensively the user runs. The AP concluded that this type of data can be considered health data and therefore constitutes sensitive data.

Communications Data

In a 2013 investigation into smart TVs, the AP considered that personal data regarding online viewing behaviour should be considered sensitive data. This type of data provides a lot of information about the viewer (viewed broadcasts, rented movies, apps and websites visited and used by data subjects, times of switching the device on and off, etc). The AP refers to WP29 Opinion 13/2011 on geolocation services on smart mobile devices (WP 185).

Based on Recital (47) of the GDPR, the processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest. Direct marketing generally refers to any form of advertising by which a natural or legal person sends direct marketing communications directly to one or more identified or identifiable end users using electronic communications services. In addition to the offering of products and services for commercial purposes, this also includes messages sent by political parties and other non-profit organisations to support the purpose of the organisation.

Where personal data is processed for the purposes of direct marketing, the data subject should have the right to object to such processing, including profiling to the extent that it is related to such direct marketing, whether with regard to initial or further processing, at any time and free of charge. That right should be explicitly brought to the attention of the data subject and presented clearly and separately from any other information.

General rules for direct marketing may be found in the GDPR. The data subject has a right to object to the processing of his or her personal data for direct marketing purposes, without any justification being necessary. Furthermore, the data subject must be informed of his or her right to object to any direct marketing communication.

With regard to direct marketing by means of telecommunications and the use of cookies and similar techniques, the Dutch Telecommunications Act provides for detailed regulation through the implementation of the e-Privacy Directive. The Directive will be replaced by the e-Privacy Regulation within the next few years. Moreover, in January 2015, the AP published a manual explaining how to set up Google Analytics in a privacy-friendly way. This set-up enables data controllers to use the tool without having to obtain consent on their websites to place the Google Analytics cookie.

The Dutch Telecommunications Act provides for an opt-in regime (one that basically requires consent) for marketing via email, SMS and similar techniques. A data controller may nonetheless send unsolicited communications to customers when the contact details were obtained in the context of the sale of a product or service, the message relates to the controller's own similar products or services and the customer has been given an opportunity to object, free of charge and in an easy manner. If the customer does not object to the initial collection of his or her electronic contact details, the customer should still be given the possibility to object in each message sent.
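
The opt-in rule and the "existing customer" exception described above can be summarised in a short decision sketch. The field names below are illustrative assumptions, not statutory terms, and the sketch ignores nuances such as the separate regime for telephone marketing.

```python
# Minimal sketch of the opt-in rule and the existing-customer exception for
# email/SMS marketing described above. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class Recipient:
    has_consented: bool                  # prior opt-in consent for email/SMS marketing
    is_existing_customer: bool           # contact details obtained in the sale of a product/service
    message_about_similar_products: bool # message relates to the sender's own similar products/services
    offered_objection_at_collection: bool
    has_objected: bool                   # opted out at collection or in a later message

def may_send_marketing_email(r: Recipient) -> bool:
    if r.has_objected:
        return False
    if r.has_consented:
        return True
    # Existing-customer exception: own similar products or services, and an
    # easy, free opportunity to object was offered (and must be repeated in
    # every message sent).
    return (r.is_existing_customer
            and r.message_about_similar_products
            and r.offered_objection_at_collection)

print(may_send_marketing_email(Recipient(False, True, True, True, False)))  # True
```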

Specific rules apply to promotional telephone calls. These rules provide for an opt-out regime, but require a mandatory check of the do-not-call register.

The Dutch Telecommunications Act also provides rules regarding the use of cookies and similar techniques. In general, the use of cookies that are strictly necessary to provide the requested services, to carry out the transmission of electronic communication over an electronic communications network, or to gather information on the quality or effectiveness of the services provided – with no, or only minor, consequences to the end user's privacy – is allowed. However, the use of other cookies, such as tracking cookies, cookies for behavioural targeting and device fingerprinting, requires consent, and end users need to be informed properly and in advance in order to give valid consent.
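
By way of illustration, the cookie rules described above boil down to a simple classification, sketched below under the assumption of a simplified, non-exhaustive set of cookie categories.

```python
# Minimal sketch of the cookie rules described above. The categories and
# their treatment are a simplification for illustration, not an exhaustive
# or authoritative classification.
CONSENT_EXEMPT = {"strictly_necessary", "communication_transmission",
                  "low_impact_analytics"}
CONSENT_REQUIRED = {"tracking", "behavioural_targeting", "device_fingerprinting"}

def may_place_cookie(category: str, informed_in_advance: bool, consent: bool) -> bool:
    if category in CONSENT_EXEMPT:
        return True
    if category in CONSENT_REQUIRED:
        # Consent is only valid if the user was properly informed beforehand.
        return informed_in_advance and consent
    raise ValueError(f"unknown cookie category: {category}")

print(may_place_cookie("low_impact_analytics", False, False))  # True
print(may_place_cookie("tracking", True, False))               # False
```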

On its website, the AP has published various opinions, inter alia on direct marketing (most recently in October 2018), on the concept of legitimate interest (November 2019) and on the use of cookies (most recently in December 2019). In 2015, the AP investigated Wifi tracking technology in shops and on public roads provided by Bluetrace. In short, the AP decided that, by way of Wifi tracking, unique MAC addresses of mobile devices were being collected that, combined with information concerning location, date and time of registration, could be considered personal data. It even involved the processing of personal data of a sensitive nature – ie, the location data of individuals. Hashing the MAC addresses does not mean that they are no longer personal data.

In its position on direct marketing, the AP stresses that – despite Recital 47 of the GDPR – in most cases data controllers cannot rely on a legitimate interest when processing personal data for direct marketing purposes, but processing should be based on consent.

In its position on the use of tracking cookies, the AP states that there is no valid consent in a situation where the user of a website must deselect a pre-ticked checkbox to refuse his or her consent. In addition, the AP concludes that no valid consent is given if a website only informs the website visitor that "by continuing to use this website you agree to the use of tracking cookies". This differs from the latest guidance document of the ACM of November 2016, which allowed "implicit consent".

Workplace privacy is protected by several laws and regulations. The GDPR applies to the workplace. The right to privacy of employees in the Netherlands is furthermore recognised under the European Convention for the Protection of Human Rights. The general principles of fair employment practices of the Dutch Civil Code also protect the privacy rights of employees to some extent. The Works Council Act contains the legal framework for works council involvement in certain privacy issues.

According to the guidelines of the AP, monitoring of internet and email usage of employees is allowed, provided some safeguards are taken, including:

  • an employee privacy protocol is approved by the works council (if any);
  • the monitoring is necessary; and
  • the employee’s privacy is respected where reasonably possible.

This is in accordance with best practice and courts take this framework as their starting point to assess whether monitoring is lawful.

Furthermore, in its decision on DPIAs, dated 19 November 2019, the AP stated that a DPIA should be carried out in connection with the large-scale or systematic monitoring of employees – eg, in relation to email and internet usage.

The EDPB’s predecessor, the ArtWP29, has published various guidance documents on the use of monitoring measures in the employment context, including Opinion 2/2017 on data processing at work (WP 249), the 2002 working document on the surveillance of electronic communications in the workplace (WP 55), and Opinion 8/2001 on the processing of personal data in the employment context (WP 48). Although these documents are not explicitly endorsed by the EDPB in Endorsement 1/2018, they may still provide some useful insights into the reasoning of DPAs on this topic.

Works councils have an active role with respect to workplace privacy, regulated in the Works Council Act. The Act requires companies that have a works council to:

  • consult the works council when introducing new technical applications; and
  • obtain works council approval when introducing new employment-related privacy policies or a system that can monitor employees (such as a camera system or building access system).

Persons from other jurisdictions sometimes underestimate the impact of Dutch works council practices. There are various examples of cases where works councils have gone to court to enforce their rights, also with respect to workplace privacy (such as the works councils of Omron and Apple Retail). The basic rule is that the (legal) costs of a works council should be borne by the employer.

The Act on the House for Whistle-blowers requires every company with more than 50 employees to have a whistle-blowing policy. Employees may report suspected wrongdoing to the House for Whistle-blowers, a governmental agency that can investigate reported wrongdoing. An investigation may lead to a report. According to the Civil Code, an employer may not retaliate against an employee who has reported wrongdoing in accordance with the Act on the House for Whistle-blowers. The Whistle-blowers Authority has published useful information, including annual reports, brochures (eg, on the reporting procedure) and research reports, on its website.

Current Dutch law is not compliant with Directive 2019/1937/EU, which has to be implemented by 17 December 2021. Whistle-blower protection, for example, is currently only offered to employees, the possibility of seeking publicity is currently not addressed and the shifted burden of proof requires implementation. As yet, no proposal to amend the current legislation has been put forward.

The ArtWP29 has given guidance on the processing of personal data in the context of whistle-blower hotlines in its Opinion 1/2006 on the application of EU data protection rules to internal whistle-blowing schemes in the fields of accounting, internal accounting controls, auditing matters, and the fight against bribery, banking and financial crime (WP 117). In this opinion, the ArtWP29 acknowledges that, in general, companies can have a legitimate interest in having a whistle-blower hotline. Although this opinion is not explicitly endorsed by the EDPB in Endorsement 1/2018, it may still serve as useful guidance on this matter.

As well as the foregoing, there are a couple of other issues relating to workplace privacy.

On its website, the AP offers FAQs relating to workplace privacy. Amongst other things, the AP gives guidance on internet research regarding job applicants (generally accepted) and retention periods for CVs (four weeks).

E-discovery is not a known concept under Dutch law.

Regulators must act in accordance with the principles of proper public administration, which, inter alia, means they must act fairly and proportionately, may not discriminate and should treat citizens equally. If regulators fail to comply, the Dutch courts will hold that against them.

For violations of the GDPR, the AP may impose a penalty up to EUR20 million, or in the case of an undertaking, up to 4% of the organisation’s total worldwide annual turnover of the preceding financial year, whichever is higher.

In 2019, the AP issued guidelines in a Penalty Policy. The guidelines provide insight into how the AP will use its fining powers, in addition to the guidance given in the EDPB guidelines on the application and setting of administrative fines for the purposes of the GDPR (WP 253). The policy distinguishes between four categories of GDPR violation, depending on the nature and impact of the violation. For each category, the AP has established a bandwidth setting out the minimum and maximum penalty, as well as a default basic penalty amount. When calculating a penalty, the AP starts with the basic penalty amount for the relevant category and then increases or decreases it, depending on the circumstances of the case. The basic penalties range from EUR100,000 for low-impact violations to EUR725,000 for violations in the highest category. It is noteworthy that the absolute maximum amount set for the highest category of infringements is limited to EUR1 million, which is considerably lower than the maximum amounts provided in the GDPR. However, the guidelines stress that the AP can increase the amount beyond this maximum if it would be "inappropriate" in a particular case.
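
A minimal sketch of this fining methodology is set out below. Only the figures mentioned above (the EUR100,000 and EUR725,000 basic amounts, the EUR1 million cap for the highest category and the statutory maximum of EUR20 million or 4% of worldwide turnover) come from the policy as described here; the bandwidth boundaries used in the examples are placeholders for illustration.

```python
# Minimal sketch of the AP's fining methodology as described above: start from
# the basic penalty of the applicable category, adjust it up or down for the
# circumstances of the case, keep it within the category bandwidth, and never
# exceed the statutory GDPR maximum. Bandwidth boundaries below are
# illustrative placeholders, not figures taken from the Penalty Policy.

def statutory_maximum(worldwide_annual_turnover_eur: float) -> float:
    """GDPR cap: EUR20 million or 4% of worldwide annual turnover, whichever is higher."""
    return max(20_000_000, 0.04 * worldwide_annual_turnover_eur)

def calculate_fine(basic_amount: float, band_min: float, band_max: float,
                   adjustment: float, turnover: float) -> float:
    """adjustment is a fraction: +0.5 raises the basic amount by 50%, -0.25 lowers it by 25%."""
    fine = basic_amount * (1 + adjustment)
    fine = min(max(fine, band_min), band_max)      # stay within the category bandwidth
    return min(fine, statutory_maximum(turnover))  # respect the statutory maximum

# Lowest category (basic amount EUR100,000), aggravating circumstances (+50%):
print(calculate_fine(100_000, 0, 200_000, 0.5, 5_000_000))    # 150000.0
# Highest category (basic amount EUR725,000, capped at EUR1 million), +20%:
print(calculate_fine(725_000, 450_000, 1_000_000, 0.2, 2e9))  # 870000.0
```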

For violations of the Telecommunications Act, the ACM may impose an administrative penalty of up to EUR900,000 per breach or 10% of the annual turnover of the company in breach (whichever is higher).

Both the AP and the ACM may impose an administrative enforcement order to enforce obligations laid down by the GDPR, Implementation Act or Telecommunications Act.

The leading enforcement case brought in the last 12 months is the Haga Hospital case, discussed above. In another significant case, the AP imposed a ban on the Dutch Tax Authority's processing of social security numbers in the VAT numbers of freelancers and other self-employed persons.

Data protection issues are raised in private litigation on a regular basis. Most of these cases pertain to the right to be left alone, the right to be forgotten, freedom of speech or the right of access.

A distinction can be made between different uses of data protection law in litigation, such as:

  • Enforcement of data protection rights (such as the right of access to personal information or the right to be forgotten) – as an example, the Supreme Court in its judgment of 21 December 2018 sanctioned a lower court decision that the right of access to personal data under the regime of Directive 95/46/EC does not require the controller to provide access to the full documents containing the personal data; another example is the judgment of the Court of Appeal of The Hague, dated 24 September 2019, in which a claim to be forgotten was denied.
  • Balancing the human right to privacy against other rights, such as the freedom of speech – there is much case law regarding media coverage in which persons (often celebrities) are featured against their will; an example is the case in which the name of a university professor was disclosed in a newspaper article about sexual harassment (Court of Appeal Arnhem-Leeuwarden, 17 December 2019).
  • Damage claims for privacy infringements – an example is a case involving a Dutch celebrity who sued an online forum for publishing a sex tape and was awarded compensation of EUR30,000; in cases involving the illegal disclosure of less sensitive data, case law shows that lower amounts of damages are awarded (in a case involving the municipality of Deventer, an amount of EUR500 was awarded in relation to the sharing of name and address data between various administrative bodies (Court of Overijssel, 28 May 2019)), while even the unlawful processing of special categories of personal data may result in relatively small awards of damages (in a case where the Employee Insurance Agency disclosed the health data of a data subject to her new employer, an amount of EUR250 was awarded (Court of Amsterdam, 2 September 2019)).
  • Litigation against the AP – an example is the judgment of the Court of The Hague, dated 8 March 2018, in a case between the mayor of The Hague and the AP.

The normal standards of Dutch procedural law apply to private litigation for alleged privacy or data protection violations. For instance, courts should respect the right to a fair trial and the principle of an adversarial process.

Class actions are allowed, including to claim damages (since 1 January 2020). Both the Dutch Civil Code and the Dutch General Administrative Law Act allow for class actions, which for instance could lead to a judgment declaring a certain processing unlawful.

The primary source with respect to law enforcement access to data for serious crimes is the Dutch Code of Criminal Procedure. Other relevant laws are the Police Data Act and the Judicial Data and Criminal Records Act. Furthermore, sector-specific regulators may have access under sector-specific legislation, such as the Dutch Competition Act.

For accessing means of communication and private homes, the general rule is that independent judicial approval is required.

It is generally believed that Dutch law enforcement authorities and regulators obey the legal restrictions on access to data. If personal data has been accessed without a proper legal basis, the basic rule is that the courts will ignore that data and may declare a certain investigation or prosecution unlawful.

The main laws applying to government access to data for intelligence, anti-terrorism or other national security purposes are the Dutch Code of Criminal Procedure for regular public prosecutions and the Intelligence and Security Services Act 2017 for the two secret services, these being the General Intelligence and Security Service and the Military Intelligence and Security Service. For accessing means of communication and private homes, the general rule is that independent judicial approval is required.

The secret services (both the General Intelligence and Security Service and the Military Intelligence and Security Service) are supervised by the Review Committee for the Intelligence and Security Services (Commissie van Toezicht op de Inlichtingen- en Veiligheidsdiensten – CTIVD). As well as the CTIVD, the Dutch Intelligence Review Committee (Toetsingscommissie Inzet Bevoegdheden – TIB) has been established to review the use of the specific or general powers of the secret services. As a basic rule, access to personal data requires the prior approval of the responsible minister or the TIB.

It is generally believed that Dutch law enforcement authorities and regulators obey the legal restrictions on access to data. If personal data has been accessed without a proper legal basis, the basic rule is that courts will ignore such data and may declare a certain investigation or prosecution unlawful. The CTIVD actively supervises the secret services. For instance, it has published various progress reports regarding the introduction of the Intelligence and Security Services Act 2017. In its latest report of November 2019, the CTIVD is critical of the safeguards in place at the secret services to uphold effective judicial protection of citizens' rights.

On 14 November 2018, the CTIVD and the TIB issued a joint statement announcing co-ordinated co-operation in supervising the secret services.

There is no Dutch law that expressly allows an organisation to invoke a foreign government access request as a legitimate basis for the collection and transfer of personal data, except that the secret services or other governmental agencies may have certain rights to share data with foreign governmental agencies.

With respect to foreign government access requests, the basic rule is therefore, as follows from Article 6(3) of the GDPR, that an organisation may not invoke such a request as a legitimate basis to collect or transfer personal data. However, a foreign government access request may offer evidence that the processing can be based on the legitimate interests ground of Article 6(1) of the GDPR. In this respect, reference is made to the guidance on the processing of personal data in the context of whistle-blower hotlines that was issued by ArtWP29 (WP 117). Although this opinion is not explicitly endorsed by the EDPB in Endorsement 1/2018, it may still serve as useful guidance on this matter.

The Netherlands does not participate in a CLOUD Act agreement with the USA.

The main privacy issues that have arisen in the last few years have been in connection with government access to personal data, particularly access to bulk internet data by the secret services. The government wanted to introduce such access rights in the Intelligence and Security Services Act 2017. A referendum was held with respect to the draft of this Act, and the majority of voters expressed their criticism. Nonetheless the Act was adopted, although with some minor changes.

Under the applicable GDPR framework, international data transfers of personal information to countries outside the EU (or the EEA) are subject to restrictions. Aside from these restrictions, Dutch law does not impose any particular further restrictions on international data transfer. An exception is that data transfer to sanctioned countries or persons may be forbidden or restricted.

International data transfers are subject to the mechanisms set out in the GDPR. For instance, it is permitted to transfer data to a country outside the EU (or EEA) if the transfer is based on Binding Corporate Rules (BCRs) for intra group transfers, standard data protection clauses, approved codes of conduct or approved certification mechanisms.

BCRs are well established in the Netherlands, with some Dutch multinationals being pioneers in this respect. The AP has played an active role and the EDPB has adopted several documents in which guidance is given to companies wishing to implement BCRs.

On its website, the AP gives further guidance with respect to data transfers to the UK in the light of Brexit.

With the exception of the cases referred to in Article 46 of the GDPR, Dutch law does not require any government notifications or approvals for the international transfer of data.

Where a company wishes to implement BCRs for the international transfer of data, such BCRs need to be approved by the competent DPAs within the EU and the EDPB. Once BCRs are in place, no further authorisation of the AP is required.

There are no data localisation requirements as such in the Netherlands. However, companies should ensure that the international transfer of data does not restrict supervision by competent regulators, including the AP and financial regulators such as the Dutch Central Bank (DNB) and the Dutch Authority for the Financial Markets (AFM).

The DNB has provided guidance on cloud computing. It refers to the recommendations of the European Banking Authority on outsourcing to cloud service providers (EBA/REC/2017/03).

In the Netherlands, there is no requirement to share software code or algorithms or similar technical details with the government.

An organisation that collects and transfers data in violation of the GDPR or Dutch law (eg, the law of contracts if a contract prohibits such collection or transfer) faces the risk of legal action against it (such as penalties from the AP or claims for damages). In practice, this often means an organisation has to choose which law it decides to violate. There is no "silver bullet" to solve this dilemma. As discussed above, an organisation may argue that a foreign government data request adds weight to its argument that it has a legitimate interest in the data processing as defined by Article 6(1) of the GDPR.

The Netherlands does not have a tradition of blocking statutes in which the application of law of other jurisdictions is hindered, and no such blocking statutes are active. On a European level, blocking statutes may be adopted, such as recently with respect to US sanctions in relation to Iran.

Big Data Analytics

The GDPR does not specifically address the use of big data analytics, and neither has the AP provided specific guidance on this topic. The general principles that apply to the processing of personal data pursuant to the GDPR, such as purpose limitation and data minimisation, should be complied with when processing personal data in the context of big data analytics, as should the other requirements laid down in the GDPR, such as the obligations to provide adequate information on the use of data analytics to data subjects and to keep data up to date. To the extent that big data analytics results in profiling or automated decision-making, the rules on profiling or automated decision-making should be adhered to. Moreover, where AI and algorithms are used for big data analytics, the AP stresses that information should be provided on the processes used, and that an adequate supervisory system should be in place.

Automated Decision-Making

In the Netherlands, automated decision-making is governed by the GDPR and the Implementation Act. The EDPB has also provided useful guidance on this topic.

The GDPR specifically addresses automated individual decision-making, including profiling, in Article 22. The starting point is that data subjects have the right not to be subject to automated decision-making, including profiling, where such automated decision-making produces legal or similarly significant effects concerning him or her, unless one of the exceptions laid down in the GDPR or national data protection law applies. A controller who wishes to rely on an exception for automated individual decision-making based on special categories of data should take additional safeguards.

For the Netherlands, Article 40 of the Implementation Act contains an additional exemption for situations where the automated individual decision-making, other than based on profiling, is necessary for compliance with a legal obligation to which the controller is subject, or for the performance of a task carried out for reasons of public interest. In order to be able to rely on this exemption, the controller should take suitable measures to safeguard the data subject's rights, freedoms and legitimate interests. For private entities, these measures should in any case include the right to obtain human intervention, the data subject's right to express his or her point of view and the right to contest the decision.

The controller should provide information to the data subject on the automated decision-making, including profiling, as part of its information obligations and the data subject's right of access (Articles 13-15 of the GDPR). Moreover, the controller should carry out a DPIA in the case of automated individual decision-making, including profiling (Article 35 of the GDPR).

The EDPB has issued guidance on automated individual decision-making and profiling for the purposes of the GDPR (WP 251 rev 01). The ArtWP29 has issued guidance on automated individual decision-making and profiling in the context of law enforcement data processing.

Profiling

Profiling is subject to the rules of the GDPR, including the requirement of a legal ground for processing and the data protection principles. Profiling in the context of automated individual decision-making is specifically addressed in Article 22. To the extent that cookies are used for the purpose of profiling, the requirements relating to the provision of information and consent laid down in the e-Privacy Directive and the Dutch Telecommunications Act should also be complied with.

The EDPB has issued guidance on profiling in the 2018 Guidelines on automated individual decision-making and profiling for the purposes of the GDPR (WP 251 rev 01).

The AP requires controllers to conduct a DPIA in case of profiling.

Artificial Intelligence

Artificial intelligence is not specifically addressed in the GDPR or national law, and neither has the EDPB issued any guidance on this topic.

The AP has published a guidance document: “Supervision of AI and algorithms”, in which the AP explains the legal and supervisory framework concerning AI and algorithms in the Netherlands.

AI & algorithms are one of the focus areas of the AP for the period 2020-2023. Consequently, the AP will pay extra attention to the use thereof by companies and organisations in the coming years. In this context, the AP, inter alia, clarified that an adequate control system should be in place when using AI and algorithms, and that information should be provided on the processes used in connection with AI and algorithms and how results or findings are generated.

Internet of Things (IoT)

The Internet of Things is not specifically addressed in the GDPR or national Dutch law, but general data protection principles apply. The AP requires controllers to conduct a DPIA in the case of large-scale processing, or systematic monitoring of personal data generated by IoT devices (eg, mobile phones and car sat navs).

The ArtWP29 issued guidance on this topic in 2014 (WP 223), which can be useful as a starting point for IoT-related matters, although it is not endorsed by the EDPB.

The AP has announced that data brokering in the context of IoT is one of its focus areas for the period 2020 – 2023.

The upcoming e-Privacy Regulation is likely to affect the IoT. Under past proposals for the e-Privacy Regulation, machine-to-machine communication could qualify as an electronic communications service. Consequently, IoT manufacturers would be providing electronic communications services, and hence would need to comply with the rules laid down in the upcoming Regulation (eg, they must obtain the user's consent for the transmission of data from one IoT device to another).

Autonomous Decision-Making

Autonomous decision-making (including by autonomous vehicles) is not specifically addressed in the GDPR or national law, and neither has the AP issued any guidance on this topic. However, related topics, such as the use of AI and algorithms, are a focus area of the AP for the period 2020-2023.

In 2017 the ArtWP29 issued an opinion on processing personal data in the context of Co-operative Intelligent Transport Systems (C-ITS) (WP 252), which is still relevant in daily practice. In this opinion, the ArtWP29 considers that the principles of privacy by design and default should be implemented in any C-ITS applications in line with the GDPR, that adequate security measures and retention periods should be adopted, and that special categories of data and data relating to criminal convictions and offences should not be broadcast. Also, the EDPB has recently issued draft guidance on the processing of personal data in the context of connected vehicles and mobility related applications (Guidelines 1/2020).

Facial Recognition

Facial recognition is not addressed in the GDPR or the Implementation Act, other than in the context of biometric data. However, both the AP and the ArtWP29 have issued guidance on facial recognition.

Pursuant to the definition of biometric data, facial images are considered biometric data when processed through a specific technical means that allows the unique identification or authentication of a natural person. Therefore, it is likely that the rules applying to the processing of biometric data should be complied with when using facial recognition techniques (eg, the general prohibition on the processing of such data set forth in Article 9 of the GDPR and the exceptions to this prohibition laid down in Articles 22 and 29 of the Implementation Act). This has recently been confirmed by the EDPB in its guidelines on the processing of personal data through video devices (Guidelines 3/2019).

The AP addressed facial recognition in its policy rules and dos and don’ts on camera surveillance in 2016. It considers that the digital images recorded by smart cameras qualify as personal data, more specifically as special data revealing racial or ethnic origin. Where facial recognition will be used for automated individual decision-making, including profiling, the rules set forth in Article 22 of the GDPR should be adhered to. Where smart cameras are used for facial recognition, data subjects should be informed about the use prior to recording (eg, by means of signs). Moreover, a DPIA must be conducted in the case of large-scale processing, or systematic monitoring of personal data by means of cameras. In addition, the AP published advice on the usage of facial recognition in 2004.

The EDPB’s predecessor published an opinion on facial recognition in online and mobile services in 2012. In this opinion, the ArtWP29 considers, inter alia, that facial recognition may involve processing of sensitive data, that a legal basis (eg, consent) is required to process images, that appropriate measures should be taken to secure the data transit, and that the principle of data minimisation should be adhered to. Although this opinion is not endorsed by the EDPB, it can still be useful as guidance on this matter.

Biometric Data

Processing of biometric data is governed by the GDPR and the Implementation Act. Biometric data is defined in Article 4(14) of the GDPR.

Pursuant to Article 9 of the GDPR, the use of biometric data for the purpose of uniquely identifying a natural person is prohibited, unless one of the exceptions listed in the Article or national law applies, such as the explicit consent of the data subject (unless this exception is prohibited by national law).

Article 22 of the Implementation Act contains additional general exceptions that apply to all special categories of personal data (including biometric data). Article 29 contains an additional exception that applies specifically to the processing of biometric data for the purpose of uniquely identifying a natural person, if such processing is necessary for authentication or security purposes.

The AP requires controllers to conduct a DPIA in the case of large-scale processing or systematic monitoring of biometric data (eg, in the context of performing DNA analyses), or in the case of bio-databanks.

Geolocation

Geolocation data is primarily governed by the GDPR, as well as by the Dutch Telecommunications Act where it concerns location data relating to subscribers or users of public electronic communication networks or public electronic communication services.

Pursuant to the definition of personal data in the GDPR, location data should be considered as personal data. The AP even qualifies location data as data of a sensitive nature.

The AP requires controllers to conduct a DPIA in the case of processing of geolocation data on a large scale, or systematic monitoring of geolocation data.

The AP has conducted various investigations in which the processing of geolocation data played an important role, including:

  • its 2015 investigation into the processing of location data by means of Wifi tracking in and outside shops by Bluetrace;
  • the 2011 investigation into the processing of geolocation data stored in devices by TomTom; and
  • the 2010 investigation into the processing of Wifi data by using street view cars by Google.

Drones

The use of drones is not specifically addressed in the GDPR or the Implementation Act. However, the AP addressed drones in its policy rules and dos and don’ts on camera surveillance in 2016. It considers drones more privacy-infringing than static cameras, as drones can follow people and make recordings from places where people do not expect to be recorded. Also, the AP requires controllers to conduct a DPIA in the case of large-scale processing, or systematic monitoring of personal data by means of drones.

The subject of drones is also addressed by the ArtWP29 in its 2015 opinion on privacy and data protection issues relating to the utilisation of drones (WP 231). This opinion may be useful as a starting point when dealing with drone-related matters, although it is not endorsed by the EDPB.

There are no specific protocols for digital governance, nor specific fair data practice review boards or committees, in the Netherlands. However, over the years, various initiatives and studies relating to the use of new technologies have been conducted at the request of the Dutch government, including studies on the procedural safeguards required for the use of big data.

Please also see 2.5 Enforcement and Litigation for cases in the last 12 months. Some significant earlier investigations, penalties and other enforcement measures of the AP for alleged privacy or data protection violations include the following:

At the end of 2018, the AP imposed a substantial penalty of EUR600,000 on Uber for breach of its obligation to notify a data breach to the AP and the data subjects in a timely manner. The penalty was imposed under the former Dutch Data Protection Act, taking into account the (lower) maximum penalty provided in that former act.

At the beginning of 2018, the AP imposed an order, subject to a penalty in the event of non-compliance, on Dutch healthcare insurer Menzis, as unauthorised employees of Menzis were able to access medical information in breach of its obligation to adequately secure personal data, and no logging files were kept. Almost one year later, at the beginning of 2019, the AP issued a recovery decision of EUR50,000, as Menzis had failed to cure the breach in full within the specified period. Menzis did not agree and lodged an appeal, but without success.

Due diligence must be conducted in accordance with the applicable data protection laws, including the GDPR. Two important requirements the target must comply with are: (i) the need for a legal basis for disclosing personal data in the data room; and (ii) the principle of data minimisation. In particular, issues may arise in relation to the disclosure of HR-related data.

Moreover, potential buyers will need a legal basis for accessing and using personal data made available in the data room. As a general rule, personal data processed in the context of due diligence should be erased upon completion of the investigation.

Other important GDPR requirements relating to confidentiality, security and limitation of access to data are generally already well covered in the context of a due diligence exercise.

Prior to conducting due diligence, it is advisable to make arrangements on the processing of personal data in the context of the due diligence, in addition to the arrangements on secrecy (ie, NDAs).

As part of the due diligence itself, generally it is advisable for a potential acquirer to assess whether the target complies with applicable data protection laws, including by requesting disclosure of the data processing register, the data breach register, template agreements, template DPIA, (external) privacy statement, (internal) privacy policies, information on DPO (if any), information on contact with the supervisor, information on privacy related claims, investigations and litigation, and information on the use of cookies.

Depending on the circumstances, an organisation's cybersecurity risk profile or experience may be considered price-sensitive information, and may consequently have to be disclosed in accordance with the Dutch Act on Financial Supervision.

Last year, Dutch healthcare insurer Menzis published information in its annual report on a periodic penalty payment that had been imposed and collected by the AP, even though the underlying investigation report and decision of the AP had not yet been published at that time due to the appeal proceedings lodged by Menzis. At the end of 2019, the court rejected Menzis' appeal, and the documentation was published on the website of the AP.

The AP has announced that it will pay extra attention to the following three focus areas in the period 2020-2023: data brokering, digital government and AI & algorithms. With respect to AI and algorithms, the AP will focus on the supervisory systems implemented by companies and organisations. The focus on data brokering includes subjects such as the Internet of Things, supervising data results, profiling and behavioural advertising. The focus on digital government concerns data processing by both central and local governments, in particular in the context of partnerships, smart cities, security, and elections and microtargeting.

Vondst Advocaten N.V.

Jacob Obrechtstraat 56
1071 KN
Amsterdam

+31 20 504 20 00

+31 20 504 20 10

info@vondst.com www.vondst.com