Data Protection & Privacy 2024 Comparisons

Last Updated March 12, 2024

Contributed by Nyborg & Rørdam law firm

Law and Practice

Authors



Nyborg & Rørdam law firm is a well-reputed law firm known among peers and clients for its experienced and renowned attorneys. The firm consists of 16 attorneys and covers multiple practice areas, including criminal defence, family law and corporate and commercial law. The firm’s specialists within data protection and privacy consist of the three authors, who also work in technology and intellectual property law in general. The firm’s clients within data protection and privacy range from large international companies to smaller domestic companies and start-ups, and cover various sectors such as IT, life sciences, entertainment, hospitality and manufacturing.

Introduction to the Legislation

The Danish Constitution was last amended in 1953, and it does not take IT and other modern-day technologies (eg, artificial intelligence) into consideration. It includes a generic provision about privacy of correspondence and the inviolability of property, but it does not contain general provisions on data protection or privacy.

Denmark is a party to the European Convention on Human Rights, which includes a general provision on privacy in Article 8 (the right to respect for private and family life). In practice, however, matters regarding personal data are governed by the special laws on data protection; most prominently, the EU General Data Protection Regulation 2016/679 (GDPR).

The GDPR has been a major law within the field since it came into force on 25 May 2018. It applies directly in Denmark and constitutes the fundamental law in the field, supplemented by the Danish Data Protection Act of 23 May 2018 (DDPA). The DDPA contains local procedural rules, and it details and modifies requirements in a number of areas, for example, the processing of personal data about children and data about criminal offences, as well as the use of data by public authorities.

The Danish Act on Law Enforcement of 27 April 2017 (DALE) – based on the so-called law enforcement directive (EU Directive 2016/680) – supplements the GDPR and DDPA and governs the processing of personal data in the context of public law enforcement by the police, the prosecution service and similar public bodies.

The central definitions used in the GDPR, such as “personal data”, “processing”, “controller” and “processor”, apply directly and have exactly the same meaning in Denmark.

Overview of the Enforcement and Penalty Environment

Enforcement in Denmark is predominantly carried out by the Danish Data Protection Authority (the “Regulator”). In addition, claims regarding violation of privacy and data protection regulations are from time to time made in civil lawsuits.

The entry into force of the GDPR has led to an increase in enforcement as well as in the severity of penalties and claims, but the environment in Denmark is still fairly mild compared to that in larger EU countries, such as Germany and France, and even the other Nordic countries (which have historically had a similar approach to Denmark, with relatively low fines).

When enforcing the legislation, the Regulator formally hands over enforcement actions to the Danish Prosecution Service (DPS), which subsequently prosecutes the alleged violations before the courts. The enforcement of the GDPR and DDPA by the Regulator and DPS is still at an early stage, where the Regulator and DPS are considering and selecting which cases to bring before the courts in order to establish a clear methodology for calculating fines and determining the fine level for various types of violations.

In 2021, the Regulator and DPS published official guidelines with proposals for a methodology and fine levels for the different types of violations of the GDPR and DDPA. In June 2023, the Regulator stated that an updated version of the guidelines was being prepared to reflect the European Data Protection Board’s new guidelines on the calculation of administrative fines (adopted on 24 May 2023).

The Danish Data Protection Authority (the “Regulator”) is the key regulator in the field.

The Regulator has jurisdiction to oversee compliance with the GDPR and DDPA in relation to all persons and private entities, as well as public bodies, in Denmark. In addition, the Regulator has jurisdiction to oversee compliance with DALE in relation to the processing of personal data in the context of law enforcement.

The Regulator publishes guidance in relation to the GDPR, DDPA and DALE on its website, and conducts audits when overseeing compliance. The audits may take place on a written basis (where the Regulator sends a written request for information to be provided in writing) or as physical audits (where the Regulator inspects relevant sites and premises). The Regulator will usually give advance notice of physical audits, but prior notification is not required.

In 2022, the Regulator reported a total of 513 cases relating to audits and investigations (up from 402 in 2020), including planned audits of 35 private entities (up from 18 in 2020) and 94 public authorities (up from 64 in 2020), 79 cases initiated on the basis of media coverage and similar (up from 73 in 2020), as well as 176 cases based on notifications from private persons and public authorities (up from 137 in 2020). The numbers for 2022 show a general increase, reflecting the overall increase in funding and the number of employees in the Regulator’s office, from 34 to 61 full-time equivalents in 2021.

With respect to the use of cookies on websites, both the Regulator and the Danish Business Authority have jurisdiction. The Danish Business Authority formally has jurisdiction in relation to the executive order on cookies (No 1148 of 9 December 2011) which implements the cookie rules from the ePrivacy directive (EU Directive 2002/58). In practice, however, the Regulator also enforces how personal data collected using cookies is used (because the GDPR and DDPA also apply to such processing).

Overall Approach to Audits

The Regulator may at its own discretion decide which private and public bodies to audit. In practice, the decision will depend on the type of enforcement the Regulator is carrying out.

The Regulator carries out so-called planned enforcement and ad hoc enforcement. To carry out planned enforcement, the Regulator defines a number of focus areas each year, and for each focus area the Regulator identifies relevant private and public bodies, and decides which ones to audit. The ad hoc enforcement is carried out on the basis of complaints, notification of data breaches, news in the media and similar information received by the Regulator that indicates a need for the Regulator to look into compliance in a specific private or public body. Based on this information, the Regulator decides which private and public bodies to audit ad hoc.

The Regulator may carry out the audits on a written basis by sending a questionnaire or letter with questions to the person or entity that is subject to the audit. The Regulator may render a decision on the basis of the answers, ask additional questions or, if relevant, initiate a physical inspection of the subject’s premises. Written audits are the most common, as they allow the Regulator to audit larger groups of persons and entities efficiently. Physical inspections are less common, but do take place when deemed relevant by the Regulator. The Regulator will typically notify the persons or entities that are to be the subject of a physical inspection at least 14 days in advance, but prior notification is not strictly required where certain exceptions apply (eg, where advance notice would create a risk that evidence is destroyed).

Conducting Inspections

When carrying out inspections, the employees of the Regulator are entitled to access premises where data is processed at any time without a court order, provided that appropriate ID is shown. The Regulator may require the local police to assist in order to gain access to such premises. During audits, the Regulator can demand any information that is of relevance to assess compliance with the GDPR and DDPA, but the nemo tenetur principle applies to the persons or entities subject to an audit (except for public bodies), whereby they are allowed to refrain from answering questions and providing information to the extent that this may incriminate them.

When carrying out physical inspections of premises that are not publicly available (eg, internal office spaces or server rooms) the Regulator must also ensure that the access it requests is reasonable and proportionate to the purpose for which the access is sought.

Concluding on the Audit

When the Regulator has completed an audit and analysed the collected information, it will by default send a letter to the persons or entities that have been subject to the audit outlining the facts, its findings and the reasoning behind its proposed decision. The persons or entities that have been subject to the audit will then have the opportunity to comment on this and provide additional information. Said persons may also, at any time, request access to the information the Regulator holds about them, and the Regulator shall by default provide such information.

When the audit is complete, the Regulator will render a formal decision with its findings in respect of compliance with the GDPR and DDPA. The decision cannot be appealed to any other administrative authority, but it can be brought before the ordinary courts.

Depending on the circumstances, the Regulator may decide to publish the decision and, if deemed relevant by the Regulator, also the identity of the person or entity subject to the audit. The extent of disclosure will depend on balancing the interests of the public (in receiving the information) against the private interests of the persons and entities whose information will be disclosed (but who would like to keep it confidential). If the Regulator contemplates disclosing the information, and it is not apparent that the public interest clearly outweighs the private interest, the Regulator will usually consult the persons or entities whose information is to be disclosed.

Territorial Scope of the GDPR and DDPA

As the GDPR is an EU regulation, the national system in Denmark is from an overall perspective largely identical to the national systems of the other EU member states. In relation to the Regulator, the GDPR and DDPA contain rules that determine their jurisdiction over matters governed by the GDPR and DDPA.

The GDPR applies to the processing of personal data by persons and entities established in the EU, regardless of whether the processing takes place in the EU. In addition, the GDPR applies to processing data about subjects who are in the EU by persons or entities not established in the EU, where the processing relates to offering goods and services to such subjects, or monitoring their behaviour (as far as the behaviour takes place in the EU).

Correspondingly, as a local addition to the GDPR, the DDPA applies to the processing of personal data by persons and entities established in Denmark, regardless of whether the processing takes place in the EU, and the DDPA applies to the processing of data about subjects who are in Denmark by persons or entities not established in the EU, where the processing relates to offering goods and services to such subjects, or monitoring their behaviour (as far as the behaviour takes place in the EU).

Jurisdiction of the Regulator

The Regulator has jurisdiction in relation to violations of the DDPA as well as violations of the GDPR that take place in Denmark, and violations of the GDPR by persons and entities established in Denmark.

Where a violation relates to cross-border processing (defined in the GDPR) the rules regarding a one-stop-shop in the GDPR determine the competence and mode of collaboration of the relevant local regulators. Cross-border processing means processing carried out by persons or entities in more than one EU member state, as well as processing by a person or entity in one EU member state which substantially affects, or is likely to substantially affect, data subjects in more than one EU member state.

The main principle in the one-stop-shop rules is that one of the regulators (the data protection authorities or supervisory authorities) in the countries concerned is appointed as the lead supervisory authority. This supervisory authority is the regulator in the country where the person or entity in question is established, or in the case of a group of entities, the country where the main establishment of the group is located.

The lead supervisory authority facilitates co-operation and the exchange of information between the regulators in relation to the case, requests assistance from the other regulators as necessary, and drives the case-handling forward. The lead supervisory authority is in charge of drafting the decision in the case, which is submitted to the other regulators for comment. Depending on the comments, the lead supervisory authority may be obliged to modify the draft decision, but it will otherwise finalise and adopt it, and notify the main establishment of this.

Multilateral Effects Outside the EU

In an international context, the GDPR imposes restrictions on transfers of personal data from EU member states to countries outside the EU. The rationale behind the restrictions is to ensure that the safeguards surrounding personal data subject to the GDPR cannot be undermined by simply transferring the data to countries where the local data protection legislation is more relaxed or non-existent.

A transfer of personal data to a third country outside the EU is only allowed if the conditions defined in the GDPR are met. In most cases, the relevant conditions are that (i) the third country in question has been deemed a “safe third country” providing adequate protection; or (ii) appropriate safeguards are ensured on the basis of binding corporate rules or standard data protection clauses implemented between the data exporter and the data importer in the third country. The binding corporate rules and the standard data protection clauses impose the GDPR on the data importers (located in the third countries) on a contractual basis.

Many countries outside the EU have implemented data protection legislation that contains safeguards similar to those in the GDPR (and even, in some cases, largely copied from the GDPR, eg, as seen in the LGPD in Brazil) – and the list of “safe third countries” made by the EU Commission is growing. On 10 July 2023, the adequacy decision regarding the EU-US Data Privacy Framework established a new basis for data transfers from the EU to the USA. Based on the decision, data transfers to US organisations on the Data Privacy Framework List can take place without the need to meet additional requirements for international data transfers under the GDPR, and without the need to conduct special transfer impact assessments.

The need to conduct transfer impact assessments in relation to contemplated data transfers from EU countries to third countries was introduced with the Schrems II decision (see 4.1 Restrictions on International Data Issues).

Subnational Regulations

There are no subnational regulations in Denmark (ie, no additional specific regulations at regional or municipal level).

There are no major NGOs or self-regulatory organisations (SROs) that work exclusively with privacy or data protection in Denmark, but several organisations provide guidance and participate in the public debate regarding privacy and data protection matters.

The Danish Consumer Ombudsman

The Danish Consumer Ombudsman (DCO) is an independent public authority which supervises compliance with Danish marketing law. Data protection law (the GDPR, DDPA, etc) and marketing law intertwine in certain areas, for example, in relation to consent (as a legal basis for processing personal data or for sending marketing materials).

The Cybersecurity Council

The Cybersecurity Council is a council established to advise the government on how to improve digital security; how to facilitate the exchange of knowledge between authorities, industries and universities; and how to develop cybersecurity competencies. It consists of members from various industries, authorities, consumer organisations and universities.

The Data Ethics Council

The Data Ethics Council is an independent organisation that was established by the Danish government in 2019 to create debate and raise public awareness in relation to data ethics, and to support responsible and sustainable data use within the business sector and the public sector. It consists of members appointed by the Ministries of Justice, Finance, Innovation, Industry, Business and Financial Affairs, who represent public bodies and institutions, private companies and associations, as well as NGOs.

As Denmark is an EU member state, the national system follows the EU omnibus model, with the GDPR applying directly in Denmark (supplemented by the local DDPA, which, from a high-level perspective, only makes minor modifications and additions to the GDPR).

In an international context, the Danish system is – like the systems of other EU member states – highly developed. In relation to enforcement, the Danish Regulator is relatively pragmatic and is not considered to be as aggressive as regulators in other EU countries.

The EU-US Data Privacy Framework

The adequacy decision by the EU Commission of 10 July 2023 regarding the EU-US Data Privacy Framework is a key development in the context of cross-border data transfers. The adequacy decision was made to address the issues raised in the Schrems II decision in relation to data transfers from the EU to the USA. Prior to the adequacy decision, international data transfers had been getting a lot of public attention in Denmark, and the use of global cloud providers was challenged by interpretations of the Schrems II decision and guidance from the authorities in the EU. By addressing the formal concerns of the legality of data transfers from the EU to the USA, the adequacy decision has paved the way for continued use of US cloud providers.

Enforcement of the GDPR

In the context of enforcement, other key developments are two cases where the Regulator has proposed fines that are quite high in a Danish context.

The first case concerns the Danish IT company Netcompany, which had developed a solution for secure sending and receiving of official digital mail between businesses, citizens and authorities in Denmark. After launching the solution, an error occurred giving certain users unauthorised access to the digital mail of other users, including confidential and sensitive personal data. The Regulator deemed Netcompany to have violated the GDPR by not having implemented adequate security measures in the design of the solution, and specifically noted the lack of a data protection impact assessment (DPIA). Based on this, the Regulator proposed a fine of DKK15 million (the highest fine proposed to date by the Regulator). The case illustrates that it is critical to carry out and document relevant risk assessments, including DPIAs, when developing and implementing IT solutions – and it shows that the Regulator will impose high fines if risks and security measures have not been adequately considered and documented.

The second case concerns the Danish private hospital Capio, which was being audited by the Regulator as part of the Regulator’s planned enforcement focusing on audits and inspections of data processors. During the audit, the Regulator made a random selection of three of Capio’s data processors, and found that Capio had not carried out audits in relation to any of the data processors. On this basis, the Regulator proposed a fine of DKK1.5 million for violation of the accountability obligations pursuant to the GDPR (which include an obligation for data controllers to carry out audits of data processors). The case illustrates that the Regulator expects all data controllers to carry out audits and inspections of their data processors, and that failure to do so will be regarded as a severe violation of the GDPR that the Regulator will sanction with high fines.

The Regulator published its enforcement focus areas for 2024 in January 2024. Several of the focus areas are of relevance to businesses, as the Regulator will audit both public bodies and private entities in those areas. These focus areas include compliant use of AI, surveillance of employees, data subject access rights, adequate data security and access controls.

AI has been a very hot topic since late 2022 and is still very high on the agenda. In a legal context, the upcoming EU regulation on AI expected to come into force in 2025 and 2026 is receiving a lot of attention (see 2.1 Omnibus Laws and General Requirements). In this context, the convergence between AI (risk) regulations, copyright and privacy laws is an area that is likely to see many developments over the next months and years. In the context of data protection, the Regulator issued a guideline in October 2023 on the use of AI in the public sector. In the guideline, the Regulator elaborates on, among other things, the legal basis for processing personal data with AI systems, monitoring of AI systems, informing data subjects, and the need to conduct DPIAs (which is required by default for AI systems, according to the Regulator). These principles will, to a large extent, also apply to businesses that make use of AI.

The need to conduct transfer impact assessments (as required per the Schrems II decision) is not as hot a topic as in recent years due to the Data Privacy Framework (see 1.7 Key Developments).

Appointment of Data Protection Officers

The role of a data protection officer (DPO) is formally defined in the GDPR, and the GDPR determines when the appointment of a DPO is mandatory. All public authorities and bodies (except the courts) are required to appoint a DPO. Private entities are only obliged to appoint a DPO if the core activities of the entity include (i) regular and systematic monitoring of data subjects on a large scale; or (ii) processing on a large scale of sensitive personal data or data relating to criminal convictions.

A DPO must inform and advise on the obligations pursuant to the GDPR and DDPA; monitor compliance with the GDPR, DDPA and data protection policies; advise in relation to DPIAs; co-operate with the supervisory authority; and act as a contact point for the supervisory authority.

Legal Basis for the Collection, Use or Other Processing of Personal Data

All the basic principles in the GDPR apply in Denmark, including the principles in Article 5 regarding lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; and integrity and confidentiality.

The basic requirement of a legal basis in the GDPR also applies, whereby any processing of personal data for a given purpose must be based on one of the legal bases in Article 6 of the GDPR and, in the case of sensitive personal data, must also be covered by one of the exceptions in Article 9. The available legal bases include consent, performance of a contract, compliance with a legal obligation, and legitimate interest.

Privacy by Design and by Default

The requirements for privacy by design and by default in the GDPR apply in Denmark.

Privacy by design

Privacy by design entails that the controller must consider and implement relevant technical and organisational measures (eg, pseudonymisation) which are designed to implement data-protection principles (eg, data minimisation). In practice, this means that the controller must design and configure its processes and IT systems in a manner that ensures privacy requirements are duly reflected in the design and configuration.

Privacy by default

Privacy by default means that the controller must design and configure its processes and IT systems in a manner that ensures that only necessary personal data is processed (ie, only necessary data should be collected, it should only be processed and stored as long as it is relevant, and it should only be accessible to relevant persons).

Impact and Risk Analyses

Assessing impact and risk to data subjects is a fundamental concept in the GDPR.

Controllers must carry out formal DPIAs where processing is likely to result in high risk to the data subjects (in particular, when using new technologies, or in the case of automated systematic and extensive evaluation of persons, large-scale processing of sensitive data, or large-scale systematic monitoring of areas accessible to the public).

In addition, controllers and processors must assess risks when determining appropriate technical and organisational measures to ensure the ongoing confidentiality, integrity, availability and resilience of their IT systems and processes.

In the context of international data transfers, data controllers are required to perform transfer impact assessments prior to transferring data to third countries. Following the implementation of the EU-US Data Privacy Framework, the requirement for transfer impact assessments does not apply to data transfers from Denmark to organisations participating in the EU-US Data Privacy Framework.

Internal and External Privacy Notices and Policies

Transparency is a fundamental principle in the GDPR, and controllers are required to provide a wide range of specific information to the data subjects. A privacy policy is not explicitly required in the GDPR, but privacy policies or notices are often used in practice to convey mandatory information to data subjects. This includes information on what data the controller is collecting and using, what the data is being used for, the legal basis for use of the data, how long the data will be stored, and what rights the data subjects have in relation to the data.

External privacy notices are directed at external data subjects (ie, persons outside the organisation, such as customers and website visitors), while internal privacy notices are directed at internal data subjects (eg, employees). In addition to conveying mandatory information, organisations often create formal internal policies and processes to manage their privacy compliance programmes (eg, an internal policy outlining internal roles and responsibilities, and processes on how to handle data breaches or data subject requests). Implementing such policies and processes is in line with the accountability principle in the GDPR.

Data Subject Rights

The GDPR has defined a number of formal rights that data subjects have in relation to their data: the right of access, the right to rectification, the right to erasure, the right to restriction, the right to data portability, and the right to object. Controllers must inform data subjects about these rights and how to exercise them.

Anonymised and Pseudonymised Data

The definition of “personal data” in the GDPR is “any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person”.

Anonymised data

Anonymised data or information is “information which does not relate to an identified or identifiable natural person”, or “personal data rendered anonymous in such a manner that the data subject is not or [is] no longer identifiable”. As a consequence, anonymised data or information does not constitute personal data, and the GDPR does not apply to such information.

Pseudonymised data

Pseudonymised data is information which has undergone pseudonymisation, but which could still be attributed to a natural person by the use of additional information. For example, the name of the person has been replaced with a pseudonym or deleted, but other information about the person is unchanged. Such information is still deemed personal data.
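As a purely illustrative sketch (the record fields and helper function are hypothetical, not drawn from the legislation), the distinction can be shown in code: the direct identifier is replaced with a random token, but a separately held key table still allows re-identification, which is why the pseudonymised data remains personal data under the GDPR.

```python
import secrets

def pseudonymise(records):
    """Replace the direct identifier ("name") with a random token.

    The key table maps tokens back to names. It constitutes the
    "additional information" that must be kept separately and securely;
    as long as it exists, the data can be re-attributed to natural
    persons and therefore remains personal data.
    """
    key_table = {}
    pseudonymised = []
    for record in records:
        token = secrets.token_hex(8)
        key_table[token] = record["name"]
        pseudonymised.append({**record, "name": token})
    return pseudonymised, key_table

records = [{"name": "Anna Hansen", "diagnosis": "asthma"}]
pseudo, keys = pseudonymise(records)
```

Deleting the key table (and any other information enabling re-identification) is what would move the data towards anonymisation, at which point the GDPR would cease to apply.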

Profiling, Automated Decision-Making, AI and Big Data

Profiling and automated decision-making

Automated decision-making and profiling are governed explicitly by the GDPR, and data subjects are by default entitled not to be subject to such processing if it produces legal effects concerning them or similarly significantly affects them.

As exceptions to this main rule, such automated decision-making and profiling is permitted if it is necessary in relation to a contract with the data subject, if it is specifically authorised by local member state law, or if the data subject has given explicit consent.

Big data

Big data is not explicitly governed by the GDPR or DDPA, but processing of big data will in many cases entail processing of personal data, in so far as part of the data sets consists of data that makes it possible to identify natural persons. Where the data is aggregated or otherwise anonymised in a manner that makes identification of natural persons impossible, the GDPR does not apply to the data and processing thereof.
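As a purely illustrative sketch (the data and field names are hypothetical), aggregation replaces record-level data with counts; once the underlying records are discarded, and provided the counts are large enough that no individual can be singled out, the aggregated figures fall outside the GDPR.

```python
from collections import Counter

# Hypothetical record-level data about identifiable persons.
records = [
    {"city": "Aarhus", "diagnosis": "asthma"},
    {"city": "Aarhus", "diagnosis": "asthma"},
    {"city": "Odense", "diagnosis": "diabetes"},
]

# Aggregate into counts per (city, diagnosis) pair. Note that very
# small counts may in practice still allow identification of
# individuals; this sketch ignores such re-identification risks.
aggregate = Counter((r["city"], r["diagnosis"]) for r in records)
```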

Artificial intelligence

AI is often defined as the ability of a computer to perform tasks commonly associated with intelligent beings. In a scientific AI context, even cutting-edge machine learning technologies and generative AI (eg, large language models such as ChatGPT) are generally considered to be simple forms of AI, in contrast to hypothetical forms of AI where the computer or software has, for example, a consciousness similar to that of humans. In such simple forms of AI, a computer “learns” to perform certain tasks based on large sets of data used for training. Based on the training, the computer develops and refines the software algorithms used to generate output, whereby the output is gradually improved and validated against the data set.

AI is not governed explicitly in the GDPR or DDPA, but automated decision-making and profiling may make use of AI/machine learning to build and improve algorithms (eg, to predict consumer behaviour).

At the time of writing, the EU Council presidency and the European Parliament’s negotiators have reached a provisional agreement on the draft proposal of a harmonised regulation on AI. According to the official timeline, this AI regulation will come into force in 2025 and 2026. The draft regulation divides AI systems into different categories based on impact and risk, introduces a governance system, and includes obligations to conduct fundamental rights impact assessments.

Harm and Injury Related to the Use of Personal Data

Any person who has suffered material or non-material damage as a result of an infringement of the GDPR or DDPA is entitled to compensation pursuant to the GDPR or DDPA. Under the GDPR and DDPA, the entity or person responsible for the infringement is by default liable for any such damage (exemption of liability requires the entity or person to prove that it is not in any way responsible for the event giving rise to the damage).

The damage suffered can include financial or pecuniary damage, but non-financial (eg, emotional) damage may also be compensated. In relation to the latter, it is currently being debated among legal scholars in Denmark whether compensation for non-financial damage applies directly on the basis of the GDPR, or whether it is further required that the general local requirements for non-financial damage apply (in Section 26 of the Danish Liability for Damages Act).

Special Rules Regarding Sensitive and Sectoral Data

In relation to the financial sector, the DDPA includes a number of special restrictions regarding credit bureaus, and the sectoral Danish Financial Business Act also includes, for example, a detailed regulation on the sharing of confidential customer information by banks and insurance companies.

Within the life science sector, companies often process health data in the context of their research and when performing clinical trials. In practice, such private entities will only be able to process such data for research on the basis of consent from the data subjects, or based on the so-called research exception (in Article 9(2)(j) of the GDPR) due to the sensitive nature of health data. The DDPA contains additional rules on the research exception in Denmark. Consent and the research exception have limitations as legal bases for the use of health data, and the possibilities in this area are subject to debate and still undergoing changes.

The processing of data on children is in some aspects subject to additional requirements under the GDPR and DDPA. This is seen, for example, in relation to consent from children below the age of 16, where consent must be given or authorised by the holder of parental responsibility. In Denmark, the DDPA has lowered this threshold to children below the age of 15 (prior to 2024, the threshold was below the age of 13).

In an employment context, the Regulator has published specific guidance elaborating on, for example, how employees should be furnished with mandatory information, how to process job applicant data and conduct recruitment processes in a compliant manner, and how trade unions and their employee representatives may process data.

Internet and Streaming

The GDPR includes special rules on online behavioural or targeted advertising, whereby the right to object must be brought explicitly to the attention of the data subjects – and separately from other information – at the latest, at the time of the first communication with the data subject. In addition, the processing must cease if the data subject objects.

In the context of social media, the European Court of Justice has established that social media platforms and companies making use of social media pages can be considered joint controllers in some regards. This decision has sparked the need for special privacy notices on corporate pages on social media.

Data Subject Rights

See 2.1 Omnibus Laws and General Requirements for details on data subject rights.

The spam rules in the EU directive on privacy and electronic communications have been implemented in the Danish Marketing Practices Act. As a consequence, prior consent is required by default for sending unsolicited commercial or marketing communications via email, automated texts or other electronic formats. This applies for communication to any type of recipient (whether consumer, business or public bodies).

In addition, the Danish Consumer Contracts Act further prohibits the use of telemarketing calls to consumers (except for the marketing of books, newspapers, journals, insurance, and rescue services).

Reference is made to 2.2 Sectoral and Special Issues for details on online targeted advertising.

There are no special laws on workplace privacy. Instead, this area is governed by the general privacy rules of the GDPR and DDPA, the Danish Whistleblower Protection Act, as well as other general areas of law (such as employment and criminal law).

Companies are in general allowed to implement cybersecurity tools and insider threat detection and prevention programmes, and monitor workplace communications in this regard (depending on the level of detail of this monitoring).

Monitoring Tools

In practice, monitoring tools may, for example, monitor network traffic and look for unusual patterns, without processing the specific files and messages being transmitted or the identity of the senders and recipients. By default, this type of monitoring will not give the person or system carrying out the monitoring access to private communications. Such monitoring will not, therefore, generally be problematic in relation to the GDPR, as personal data is only processed indirectly, if at all, and only to pursue a legitimate purpose. On the other hand, monitoring by mass screening of messages sent to and from employees will, by default, constitute processing of personal data and will, as a general rule, be unlawful.

Sharing Employee Data With Labour Organisations and Works Councils

Labour organisations and works councils are deemed independent data controllers under the GDPR. In practice, this means that organisations must ensure a legal basis for sharing employee data with labour organisations and works councils, and such labour organisations and works councils must ensure that employees are notified of their processing of the employees’ personal data. When determining to what extent an organisation may share data with labour organisations and works councils, it is relevant to consider obligations under collective bargaining agreements and similar agreements.

The Danish Whistle-Blower Protection Act

A new Danish Whistle-Blower Protection Act came into force in 2021. Among other things, the act obliges organisations with more than 49 employees to establish formal whistle-blower schemes, where employees can report serious misconduct and violations of applicable law.

Discovery Rules

The concept of discovery as seen, for example, in the USA, does not exist in Denmark. The Danish Administration of Justice Act does, however, contain rules under which a court can order a party to a legal proceeding to disclose information, but the rules in this respect are much more relaxed than the US discovery rules. International Danish companies are from time to time subject to US discovery proceedings, and may be obliged to store copies of the data in scope for long periods of time (beyond regular retention rules). The prolonged storage of such data is in general deemed to be in compliance with the GDPR and DDPA, given the foreign legal obligation to do so and the limited scope of access to the data in question.

Legal Standards

When determining whether an act or omission is punishable under Danish law, the general principle of legality applies (nullum crimen sine lege, nulla poena sine lege poenali), whereby the act or omission in question must be defined as a criminal offence in the law in order to be punished.

In Danish criminal cases the defendant is by default deemed innocent, and the Danish Prosecution Service (DPS) must discharge the burden of proof and show beyond a reasonable doubt that the defendant has committed the criminal offence in question. The court is not bound by any laws or rules when assessing the evidence (the principle of free assessment of evidence) and determining whether the defendant has committed the criminal offence in question.

Article 83 of the GDPR outlines a number of basic principles that must be applied when imposing and deciding on the level of fines for violation of the GDPR. Among other things, this includes “the nature, gravity and duration of the infringement taking into account the nature, scope or purpose of the processing concerned as well as the number of data subjects affected and the level of damage suffered by them”, “the intentional or negligent character of the infringement”, “any relevant previous infringements by the controller or processor” and “the degree of cooperation with the supervisory authority, in order to remedy the infringement and mitigate the possible adverse effects of the infringement”.

In addition, the general principles for determining the sanction in chapter 10 of the Danish Penal Code – which in some aspects overlap with Article 83 of the GDPR – also apply.

Enforcement Penalties

Fines for violation of the GDPR and DDPA are subject to two specific limits defined in Article 83 of the GDPR (depending on which provisions the infringement relates to). In the case of private entities, the maximum fines are either EUR10 million or 2% of the total worldwide annual turnover of the preceding financial year (whichever is higher), or EUR20 million or 4% of the total worldwide annual turnover of the preceding financial year (whichever is higher). The DDPA further stipulates that the punishment may also be up to six months in prison (where the defendant is a natural person). The Regulator and the prosecution service published a model for calculating fines in 2021, and they are currently revising this in light of the new guidelines from the European Data Protection Board. For more information, see “Overview of the Enforcement and Penalty Environment” in 1.1 Laws.
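The “whichever is higher” mechanics of the two fine ceilings can be sketched as follows. This is purely an illustration of the Article 83 caps described above (the function name and the boolean split between the two tiers are the authors’ own simplification; the actual fine is set by the authorities and courts within these ceilings, applying the criteria in Article 83):

```python
def gdpr_fine_cap_eur(annual_turnover_eur: float, higher_tier: bool) -> float:
    """Illustrative maximum fine under Article 83 GDPR.

    higher_tier=True  -> Article 83(5) infringements: EUR 20m or 4% of
                         worldwide annual turnover, whichever is higher.
    higher_tier=False -> Article 83(4) infringements: EUR 10m or 2% of
                         worldwide annual turnover, whichever is higher.
    """
    if higher_tier:
        return max(20_000_000, 0.04 * annual_turnover_eur)
    return max(10_000_000, 0.02 * annual_turnover_eur)

# For a company with EUR 2bn turnover, the higher-tier ceiling is 4% of
# turnover (EUR 80m), since that exceeds the fixed EUR 20m amount.
print(gdpr_fine_cap_eur(2_000_000_000, higher_tier=True))  # 80000000.0
```

For small companies the fixed amounts dominate (eg, a EUR 100m-turnover company still faces a EUR 10m lower-tier ceiling, since 2% of turnover is only EUR 2m), which is why turnover-based caps mainly matter for large groups.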

In addition to the formal penalties, the Regulator may publish its decision. For more information about this, see 1.3 Administration and Enforcement Process.

Enforcement Cases

The Regulator has proposed fines since the GDPR came into force, but only a few cases have been decided before the Danish courts. A recent example is a decision from September 2023, in which the Eastern High Court awarded a fine of DKK1 million in a case regarding inadequate deletion of data on 500,000 customers. The case had been appealed to the Eastern High Court, where the DPS argued for a fine of DKK1.1 million (in accordance with a revised recommendation from the Regulator based on the turnover of the defendant).

The Regulator has made proposals for higher fines in a number of other cases, including since 2022, a total of seven cases with amounts of DKK1 million or above (with DKK15 million as the highest amount to date, see 1.7 Key Developments). Although an increase has been seen over the last couple of years, the current level of proposed and awarded fines is still relatively low in Denmark in comparison to the other Nordic countries and other countries in the EU.

Private Litigation

Private persons and entities may bring a suit for compensation and damages against controllers and processors under the GDPR and DDPA. In the event of damage, the controller or processor involved in the processing will be liable for the damage, unless the controller or processor can prove “that it is not in any way responsible for the event giving rise to the damage”. For further information on what constitutes damage, see 2.1 Omnibus Laws and General Requirements.

Under the Danish Administration of Justice Act, a number of persons and entities may bring a suit jointly as a group, provided that their claims are uniform in nature. Group actions of this kind are prima facie similar to class actions, but they differ in that all the participants have to actively opt in – and the code of conduct that all Danish lawyers are subject to prohibits “no cure, no pay” arrangements where the lawyer is entitled to a percentage of the compensation as their legal fee.

As a consequence, group actions are therefore only rarely seen in Denmark, but the formal possibility could lead to group actions being raised, for example, as a consequence of major data breaches severely affecting large groups of persons.

The Danish Act on Law Enforcement (DALE) governs the processing of personal data by law enforcement in order to prevent, investigate, reveal or prosecute crimes or enforce criminal sanctions. DALE does not apply to access to data by the Danish Security and Intelligence Service and the Danish Defence Intelligence Service.

The Danish Administration of Justice Act outlines the basic requirements for giving law enforcement access to, for example, data, IT systems and messages. Such access will, as a general rule, require a prior court order allowing specific access. If waiting for a court order will prevent law enforcement from securing evidence, access can be given without a prior court order, but law enforcement will in such case be required to obtain a court order as soon as possible, and no later than 24 hours after initiation of the operation. When law enforcement requests a court order, a defence counsel must be appointed by the court to represent the defendant, and the defence counsel must be given the opportunity to argue against the granting of access.

The safeguards in DALE in relation to law enforcement’s processing of personal data are similar to the safeguards in the GDPR and DDPA, but more limited (eg, transparency only applies to the extent that it does not harm investigations, public security, etc).

The same overall principles referenced in 3.1 Laws and Standards for Access to Data for Serious Crimes apply, but access to messages, etc, is formally governed by different acts (the Act on the Danish Defence Intelligence Service and the Act on the Danish Security and Intelligence Service), and details regarding deadlines, for example, are different.

The GDPR includes a legal basis for collecting and transferring personal data to governments and authorities, for example, if there is a legal obligation to provide such information when receiving an access request. The legal obligations referenced in the GDPR are, however, limited to legal obligations under the laws of Denmark or other EU member states, and not the laws of countries outside the EU. In addition, data transfer restrictions also limit the ability to transfer personal data to foreign governments outside the EU.

Denmark has not entered into a CLOUD Act agreement with the USA.

The key public debates arising in connection with government access to personal data relate to access by foreign governments, most notably, access by authorities in the USA on the basis of, for example, Section 702 of the Foreign Intelligence Surveillance Act (FISA) (also addressed in the Schrems II case) and access by hacker groups (eg, government-supported industrial espionage, but also ransomware attacks). In spite of the implementation of the EU-US Data Privacy Framework, it is expected that this debate will continue and that international data transfers and access will be challenged.

As described in 1.4 Multilateral and Subnational Issues, the GDPR includes a general prohibition against transfers of personal data from EU countries to third countries outside the EU. A transfer to such third countries is only allowed if one of the exceptions in the GDPR applies, including for example, a transfer mechanism as further described in 4.2 Mechanisms or Derogations That Apply to International Data Transfers. The threshold for what constitutes a transfer of data to a third country is very low. By way of example, any forwarding or copying of data to a recipient in a third country, storage on servers in a third country, as well as remote access to the data from a third country, all constitute a transfer.

Following the Schrems II decision, EU data exporters are further required to conduct a transfer impact assessment prior to transferring personal data to a third country. The purpose of the transfer impact assessment is to verify that the safeguards under the GDPR remain effective for the data in question in the third country to which the data is being exported. When making the assessment, the data exporter must consider, among other things, whether there is anything in the law and/or practices of the third country that may impinge on the effectiveness of the safeguards. The requirement for transfer impact assessments does not apply to transfers to countries or recipients subject to an adequacy decision, that is, transfers to “safe third countries” or to organisations on the Data Privacy Framework list.

A transfer of personal data from a country in the EU to a third country may take place if the EU Commission has rendered a formal adequacy decision in relation to the third country in question, and decided that the third country ensures an adequate level of protection. A list of such “safe” third countries is published by the EU Commission.

In the absence of an adequacy decision a transfer of personal data to a third country may take place if appropriate safeguards (as defined in Article 46 of the GDPR) or derogations (as defined in Article 49 of the GDPR) apply.

Examples of the appropriate safeguards (transfer mechanisms) often seen in practice are (i) a contract between the data exporter and data importer that includes the data protection clauses adopted by the EU Commission; or (ii) binding corporate rules pursuant to Article 47 of the GDPR implemented by the data importer.

In practice, derogations are not used as often as appropriate safeguards. The most common derogation is consent for the transfer from the data subjects in question.

Government notifications or approvals are not required to transfer data internationally under Danish law.

No data localisation requirements apply in relation to personal data in general. Section 3(9) of the DDPA does, however, contain a special localisation rule whereby the Minister of Justice is entitled to decide that personal data processed in certain IT systems on behalf of public bodies must be processed and stored solely in Denmark. This special rule – also known as the “war rule” – also existed prior to the GDPR and DDPA. The Minister of Justice has issued an executive order (last updated on 15 March 2023) that includes a list of the systems subject to the localisation requirement. The list currently only covers 11 systems, including the system used by the Danish police and the system used for public digital mail.

The Danish Bookkeeping Act previously included a requirement that companies were obliged to obtain permission to store bookkeeping materials outside Denmark. This requirement has since been relaxed, and it is now possible to store such materials outside Denmark (eg, on cloud platforms), provided that they are readily available, that passwords, etc, are kept in Denmark, that the material can be extracted or printed, and that it is otherwise stored in accordance with the act.

Software codes, algorithms or similar technical details are not required to be shared with the Danish government.

An organisation that collects or transfers data in connection with government data requests will in practice often do so to comply with a legal obligation or to carry out a task in the public interest. In this regard, recital 45 of the GDPR stipulates that processing to comply with a legal obligation or in the public interest “should have a basis in Union or Member State law”. As a consequence, legal obligations and public interest stemming from a third country outside the EU cannot be used as a formal legal basis for the collection and transfer of data. For such scenarios, the organisation would need to use another legal basis, eg, legitimate interest, which in practice limits the possibility to collect and transfer data for such purposes. International transfer restrictions (see 1.4 Multilateral and Subnational Issues and 4.1 Restrictions on International Data Issues) should also be considered.

In relation to internal investigations, the DDPA stipulates that the processing of data on criminal offences by private entities can take place, provided that this is necessary in order to pursue a legitimate interest, and that this interest clearly outweighs the interests of the data subject.

The EU considers the extraterritorial application of laws adopted by third countries to be contrary to international law, and has enacted a blocking statute (regulation 2271/96) to protect EU operators against such laws. The blocking statute does not specifically relate to privacy or data protection, and is in practice directed at export control, trade sanctions, etc.

See 2.1 Omnibus Laws and General Requirements.

It is not a formal requirement to establish protocols for digital governance or fair data practice review boards or committees, but developments in recent years within the fields of AI and data ethics have led many organisations to establish policies and protocols for the use of AI, digital governance and fair (or ethical) practices on a voluntary basis. In this respect, some companies establish internal boards or committees that oversee compliance with their policies and protocols, or anchor this oversight in existing boards or committees. Addressing risks associated with new technologies is a fundamental part of AI and data ethics programmes (eg, creating internal governance on the use of AI and other technologies that may entail a risk to individuals).

See 2.5 Enforcement and Litigation.

The Regulator has only published limited guidance in relation to the processing of personal data in the context of due diligence in corporate transactions. The guidance was published prior to the GDPR and DDPA, but is based on the same principles as the GDPR.

In short, it is generally allowed to share ordinary non-sensitive personal data as part of a due diligence process, but only to the extent necessary for the due diligence and always subject to customary confidentiality obligations. The legal basis for such processing is legitimate interest (now Article 6(1)(f) of the GDPR). For this reason, sensitive personal data must by default be redacted from documentation that is to be uploaded into a data room, and consent is required by default if sensitive personal data is to be included.

In the event that access to the data room is given to persons located outside the EU, data transfer restrictions should also be considered (see 1.4 Multilateral and Subnational Issues and 4.1 Restrictions on International Data Issues).

The reporting obligations in the NIS directive (EU Directive 2016/1148) were implemented in Danish law in the Danish Act on the Centre for Cyber Security. Operators of Essential Services are obliged to report cybersecurity incidents to the Danish Centre for Cyber Security pursuant to this act. The NIS 2 directive (EU Directive 2022/2555), which broadens the scope of applicability to many large organisations in the private sector, is in the process of being implemented in Denmark.

According to the official plan, a draft bill was to be introduced in February 2024, but on 8 February 2024 the Danish Centre for Cyber Security (part of the Danish Ministry of Defence responsible for the implementation) announced that the implementation would be delayed. It is now expected that a draft bill will be introduced in October 2024, and that implementation will take place after the formal deadline on 17 October 2024. In addition to a much broader scope (covering organisations in critical sectors such as energy, transport, banking, finance, IT, waste management and food processing) NIS 2 will introduce additional requirements in relation to risk assessments and incident reporting to the authorities, as well as personal liability for management.

Separately, all organisations are subject to the general obligations under the GDPR to report data breaches to supervisory authorities.

Both the Digital Markets Act and the Digital Services Act have been adopted as EU regulations, which means that they apply directly as law in Denmark.

In a Danish context, the Digital Markets Act and the Digital Services Act are supplemented by the local act on supplementary provisions for the Digital Services Act (which came into force on 1 January 2024) and the local act on enforcement of the Digital Services Act (which came into force on 17 February 2024). The local acts formally appoint the Danish Competition and Consumer Authority as the local competent authority pursuant to the Digital Markets Act and the Digital Services Act. Apart from this appointment, the local acts simply reflect the provisions of the Digital Markets Act and the Digital Services Act necessary for local enforcement.

Cybersecurity attacks (eg, ransomware) have severely affected several large international Danish companies over the last few years, eg, Demant, Vestas and A.P. Moller – Maersk, as well as a large number of smaller Danish companies. The Danish Centre for Cyber Security is assessing the cyber threat against Denmark on an ongoing basis, and has for several years – including in its latest assessment in 2023 – assessed the threat as “very high”.

The increase in the number, impact and sophistication of the attacks – and the overall risk – has created a heightened awareness of cyber-risk in Denmark, and the topic has been moving up the management agenda in many companies. In the last year, this development has been further substantiated by the forthcoming implementation of NIS 2 (see 5.5 Public Disclosure).

Nyborg & Rørdam law firm

Store Kongensgade 77
1264 Copenhagen
Denmark

+45 33 12 45 40

info@nrlaw.dk www.nrlaw.dk