As a member state of the European Union, Germany is subject to EU law. Therefore, the most important data protection regulation is EU Regulation 2016/679 (the General Data Protection Regulation, or GDPR). In addition to the GDPR, the German Federal Data Protection Act (Bundesdatenschutzgesetz, or BDSG) is of importance. In Sections 1–44, the BDSG contains supplementary regulations on the GDPR putting data controllers’ rights and responsibilities in more concrete terms. Sections 45–85 of the BDSG implement the provisions of EU Directive 2016/680. This Directive concerns the processing of personal data by authorities for the purposes of crime prevention and prosecution.
In addition to the BDSG, each federal state has its own data protection law. These state laws only affect public bodies.
Apart from the general data protection laws (GDPR, BDSG) there are various sector-specific data protection regulations; for example, in the areas of telecommunications, media or public health. However, so far there is no specific regulation for current hot topics such as AI or the metaverse. The EU Commission published its first draft of an AI regulation in April 2021. However, it was not until June 2023 that the EU Parliament cleared the way and agreed internally on an amended version. Since then, negotiations have taken place between the EU Parliament and the EU Council, which have now been successfully concluded, meaning that the draft regulation is currently being finalised in Brussels.
Article 58 of the GDPR vests investigative and corrective powers in the supervisory authorities (see 1.2 Regulators for details). As State authorities, they are bound by the general administrative law restrictions. With regard to fines, in addition to Article 83 of the GDPR, certain provisions of the German Law on Administrative Offences (Ordnungswidrigkeitengesetz, or OWiG) apply. Fines cannot be imposed on authorities (Section 43 (3), BDSG). Some state laws further stipulate exceptions to fines on public bodies.
The federal structure of Germany is evident in the organisation of data protection supervision. The national supervisory authority, the Federal Commissioner for Data Protection and Freedom of Information, is responsible for federal authorities and telecommunications companies. The data protection authority of a particular state is responsible for the respective state authorities and for all companies that have their main establishment in that federal state. Consequently, there are several supervisory bodies in Germany (18 in total) that sometimes take different views on the interpretation of the GDPR. For this reason, the supervisory authorities have formed a joint informal body, the Datenschutzkonferenz (DSK), to develop common positions on individual issues and thereby ensure a uniform interpretation. The DSK has no formal powers of its own. However, if the DSK publishes opinions supported by all supervisory bodies, the individual authorities cannot deviate from these opinions to the disadvantage of the controller, due to administrative law restrictions.
All data protection authorities act independently. They are, in particular, not subject to any ministry’s right of instruction. This independence of the supervisory authority is required by the GDPR (Article 52 (2)). It used to be one of the most discussed topics in German data protection law before the GDPR came into force.
The authorities may investigate on their own initiative, upon a complaint by a citizen, upon a notification or other request from the controller, or upon a request by another authority. According to Article 58 (1) litera b of the GDPR, every supervisory authority is entitled to conduct data protection audits. No specific occasion is required (however, see the restrictions described in 1.3 Administration and Enforcement Process).
With regard to the forthcoming AI regulations, a so-called “AI Office” is to be set up to co-ordinate at European level and monitor providers of large-scale AI-based models.
The authorities, like all state authorities, are bound by law. They may not proceed arbitrarily, must hear the person concerned before they act and may only act proportionately. For example, a fine is usually only proportionate if it is imposed in conjunction with another specific corrective measure that remedies the data protection breach.
In addition to these general restrictions, there are various specific restrictions that the supervisory authorities must observe: Section 29 (3) of the BDSG stipulates that no investigations may be carried out on persons subject to obligations of professional secrecy (doctors, lawyers, etc) if this would violate their confidentiality obligations. Apart from this, in the event of a search, the controller is only required to grant the authority access to its business premises during normal business hours (Section 16 (4), BDSG).
Fines
The DSK developed a model for the amount of fines imposed on companies that created a uniform procedure for all German supervisory authorities. This model was highly controversial, as it could lead to very high fines even for minor infringements. An initial decision by the Regional Court of Bonn (LG Bonn, judgment of 11 November 2020, No 29 OWi 1/20) agreed with the critical voices. It expressly stated that it is problematic to use a company’s turnover as a basis; rather, the severity of the infringement must be taken into account.
In this particular case, the Federal Commissioner for Data Protection and Freedom of Information had originally imposed a fine of EUR9.55 million on the telecommunications company 1&1. The court reduced this fine to EUR900,000 as there were no serious violations.
This German model has now been replaced by guidelines from the European Data Protection Board, which harmonise the calculation of fines imposed by data protection authorities across Europe.
The guidelines provide for an assessment procedure consisting of five steps for the calculation of fines: first, the relevant processing operations and infringements are identified (Article 83 (3), GDPR); secondly, a starting amount is determined based on the classification and seriousness of the infringement and the undertaking’s turnover; thirdly, the amount is adjusted for aggravating or mitigating circumstances; fourthly, the statutory maximums of Article 83 (4)–(6) of the GDPR are applied; and finally, it is assessed whether the resulting amount is effective, dissuasive and proportionate.
One criticism is that the total turnover of the company to be fined is taken into account excessively. This particularly affects companies with high turnover. It remains to be seen whether either the German or the European calculation models will meet requirements for penalties and fines under German constitutional law.
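The role turnover plays becomes tangible in the statutory maximums of Article 83 (4) and (5) of the GDPR (EUR10 million/2% and EUR20 million/4% of annual worldwide turnover, whichever is higher). The following is a purely illustrative sketch of those ceilings only – the function name and parameters are not statutory terms, and the actual fine calculation under the EDPB guidelines involves many more factors:

```python
# Illustrative sketch: statutory fine ceilings under Article 83 (4)/(5) GDPR.
# The "whichever is higher" rule means the ceiling scales with turnover once
# turnover exceeds EUR 500m (severe) or EUR 500m (non-severe, at 2%/EUR 10m).

def gdpr_max_fine(annual_worldwide_turnover_eur: float, severe: bool = True) -> float:
    """Return the statutory upper limit for a GDPR fine.

    severe=True  -> Article 83 (5): up to EUR 20m or 4% of turnover, whichever is higher
    severe=False -> Article 83 (4): up to EUR 10m or 2% of turnover, whichever is higher
    """
    fixed_cap, pct = (20_000_000, 0.04) if severe else (10_000_000, 0.02)
    return max(fixed_cap, pct * annual_worldwide_turnover_eur)

# A company with EUR 1bn turnover faces a ceiling of EUR 40m for severe infringements:
print(gdpr_max_fine(1_000_000_000))              # → 40000000.0
print(gdpr_max_fine(100_000_000, severe=False))  # → 10000000 (fixed cap dominates)
```

This makes the criticism concrete: for high-turnover companies the ceiling, and with it any turnover-anchored starting amount, grows linearly with revenue regardless of the severity of the individual infringement.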
Another topic of discussion is whether a violation of the GDPR needs to be attributable to a responsible person within the data controller. The Regional Court of Berlin held that this was the case and annulled a EUR14.5 million fine because the supervisory authority had failed to establish such attribution (LG Berlin, decision of 18 February 2021, case No (526 OWi LG) 212 Js-OWi 1/20). This decision has been appealed. The ECJ has now ruled that a controller may only be fined in accordance with Article 83 of the GDPR if it is proven that the company committed the infringement intentionally or negligently (C-807/21). However, it is not necessary to identify a responsible person or to establish that a management body had knowledge of the unlawful conduct.
To provide effective legal protection, all actions of the supervisory authorities can be challenged before the administrative courts. This does not apply to fines, which must be challenged before the criminal courts, as different procedural rules apply.
As mentioned in 1.1 Laws, the GDPR applies in Germany. This also means that the German data protection authorities are part of the European Data Protection Board (EDPB), which aims to ensure the consistent application of data protection rules throughout the European Union and to promote co-operation between the national supervisory authorities. According to Article 70 (1) litera k of the GDPR, the EDPB can, for example, issue guidelines for the imposition of fines by the supervisory authorities, which it did on 24 May 2023 by adopting the final version of the guidelines on the imposition of fines, as mentioned in 1.3 Fines.
Data protection laws at state level apply only to public bodies. However, the state laws on administrative procedures can become relevant for private companies when challenging rulings by the data protection authorities (except those of the federal authority).
The European Commission, on 10 July 2023, approved the adequacy decision concerning the EU-US Data Privacy Framework. This decision affirms that the United States maintains an acceptable level of protection for personal data transferred from the EU to US companies within the framework. It follows the United States’ implementation of an Executive Order aimed at enhancing safeguards for US signals intelligence activities. These safeguards were introduced in response to concerns highlighted by the ECJ in its Schrems-II decision in July 2020. They specifically focus on limiting access to data by US intelligence agencies to what is deemed necessary and proportionate, and on establishing an impartial redress mechanism for complaints by Europeans about the collection of their data for national security purposes.
The German legislature has decided that some data protection rules are consumer protective (Section 2 (2) No 11, Law on Injunctions for Consumer Rights and other Infringements (Gesetz über Unterlassungsklagen bei Verbraucherrechts- und anderen Verstößen, or UKlaG)). As a result, consumer protection associations can take legal action against violations of data protection in certain cases.
Article 40 of the GDPR provides for the possibility of inter-branch organisations drawing up codes of conduct for their members. These codes can then be approved by the competent supervisory authority. This has the advantage that it is subsequently easier for a company to prove that it operates in conformity with data protection requirements. However, this possibility has only been used by a few associations so far.
A comparable regulation for AI is not provided for in the draft AI Act.
Germany is part of the EU data protection system, which is one of the most developed systems in the world. Even before the GDPR came into force, Germany already had a differentiated data protection regime. However, the GDPR has further increased acceptance and sensitivity around this topic. In addition, data protection authorities are increasingly enforcing the respective rules, particularly by imposing fines. For these reasons, more and more companies are realising that data protection is an integral part of their compliance management. Accordingly, there is a high demand for consulting services. Apart from this, it is a political goal in Germany and the EU to establish data protection-compliant solutions as a hallmark of European products and services.
Since Schrems-II, there has been no simple solution for data transfers from the EU to the US. US President Joe Biden issued an Executive Order on 7 October 2022, which is intended to make data transfers possible by addressing the issues criticised by the European Court of Justice, such as the proportionality of the processing of personal data in the context of intelligence surveillance measures and the legal remedies available to European citizens. Even though many privacy activists do not consider the Executive Order to be far-reaching enough, the European Commission published a so-called adequacy decision on 10 July 2023. This allows European data controllers to make data transfers to the relevant destination country without having to rely on standard contractual clauses or other transfer mechanisms.
The question of the scope of the right of access (see also 2.1 Omnibus Laws and General Requirements) under Article 15 of the GDPR has now been answered by the ECJ in its ruling of 4 May 2023. The ECJ ruled that the data subject must be provided with a faithful and intelligible reproduction of all personal data that is the subject of the processing. This means that excerpts from documents or complete documents or, where applicable, excerpts from databases must be transmitted if and to the extent that this is necessary for the effective exercise of the data subject’s rights.
The ePrivacy Regulation
The EU has still not passed the so-called ePrivacy Regulation. However, the German legislature replaced the data protection provisions of the Telecommunication Act (Telekommunikationsgesetz, or TKG) and the Telemedia Act (Telemediengesetz, or TMG) with the TTDSG.
The TTDSG
The new Telecommunications Telemedia Data Protection Act (Telekommunikations-Telemedien-Datenschutzgesetz, or TTDSG) combines provisions on data protection for telemedia and in the telecommunications sector previously found in other laws. Among other things, it contains new rules for the use of cookies and other tracking tools. In principle, the user’s consent is required before cookies can be set. This corresponds to pre-existing case law, but the legal situation had not previously been formulated so clearly. In addition, the responsibilities of the supervisory authorities are to be changed. The Federal Commissioner for Data Protection and Freedom of Information is to be uniformly responsible for data protection in telemedia (eg, for cookies). As an accompanying document, the DSK released guidelines for cookie management, consent, etc.
Artificial Intelligence
From spring 2024, the so-called AI Act will regulate the previously unregulated area of artificial intelligence. Companies will face individual obligations depending on the risk classification of their AI systems.
As mentioned in 1.1 Laws, the relevant data protection laws in Germany are the GDPR, the BDSG and some sector-specific regulations. These data protection laws all follow the same basic principle: the processing of personal data is prohibited if no permission has been granted (Article 6, GDPR). This can be a legal permission or the consent of the person concerned.
The scope of data protection laws is a crucial point. In order for the data protection rules to be applicable, the following conditions must be met. First, there must be personal data (Article 4 No 1, GDPR). Secondly, this personal data must be processed (Article 4 No 2, GDPR). Thirdly, the processing must be carried out by means of an automatic system or, at least, the personal data must be stored in a filing system (Article 2, GDPR). Finally, the territorial applicability requirements must be met (Article 3, GDPR).
Personal Data
Generally speaking, personal data comprises all information that can be linked to a natural person (eg, name, address and gender). According to the case law of the ECJ, even legal means that can be used against third parties to identify a person must be taken into account. One of the most important examples is an IP address: a person can be identified through it if information is obtained from the respective internet access provider, for example by legal means. Thus, an IP address is considered personal data.
Processing
The GDPR lists the following examples for processing: collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, and erasure or destruction. In general, any action concerning personal data constitutes processing in the meaning of the law.
Automatic System or Stored in a Filing System
An automatic or partially automatic system means all electronic means of processing personal data. Storage in a filing system refers, for example, to file folders that are sorted alphabetically.
Territorial Applicability
The GDPR is applicable if a controller or a processor is established in the EU. Apart from that, the GDPR is also applicable if the controller or processor is not established in the EU but their goods or services are offered to persons in the EU. Finally, the GDPR is also applicable if a company monitors the behaviour of persons in the EU.
Data Protection Officers (DPOs)
Whether a DPO must be appointed is determined by Article 37 of the GDPR. Authorities must always appoint a DPO. For private companies, the standard follows a risk-based approach: if the processing of personal data is one of the main activities of the data controller and if it is critical from the perspective of the data subjects, a DPO must be appointed. Whether the processing is critical depends on the nature, scope and/or purpose of the data processed. An example of critical processing in this sense is big data applications. Apart from this, there is an obligation to appoint a DPO if the controller or processor mainly processes special categories of personal data (Articles 9 and 10, GDPR). The German legislature has tried to specify these general requirements and stipulated in Section 38 of the BDSG that a DPO must be appointed in addition to the provisions of the GDPR if, as a rule, at least 20 persons in a company are permanently involved in the automated processing of personal data. If this threshold is met, the controller must appoint a DPO regardless of the other requirements.
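The appointment logic described above can be sketched as a simple decision rule. This is an illustrative simplification, not legal advice: the function and parameter names are invented for this sketch, and real borderline cases (eg, whether monitoring is a "core activity") require legal assessment:

```python
# Illustrative decision rule for DPO appointment (Article 37 GDPR plus the
# German 20-person threshold of Section 38 (1) BDSG). Parameter names are
# this sketch's own shorthand, not statutory terms.

def dpo_required(is_public_authority: bool,
                 core_activity_regular_monitoring: bool,
                 core_activity_special_categories: bool,
                 persons_in_automated_processing: int) -> bool:
    if is_public_authority:
        return True  # authorities must always appoint a DPO
    if core_activity_regular_monitoring or core_activity_special_categories:
        return True  # risk-based triggers of Article 37 (1) litera b and c GDPR
    # German addition: at least 20 persons permanently involved in
    # automated processing (Section 38 (1) BDSG)
    return persons_in_automated_processing >= 20

# A company with 25 staff handling customer data in its ERP system needs a DPO
# under the BDSG threshold even if no GDPR trigger applies:
print(dpo_required(False, False, False, 25))  # → True
```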
The data protection officer is not subject to directives and enjoys special protection against dismissal. Their duties include working towards compliance with all provisions of data protection law. To this end, they shall advise the data controller and all employees and monitor compliance with the regulations. In addition, they act as a contact point for the supervisory authority.
Authorisation to Process (Legal Basis)
As mentioned, the legal permissions for processing personal data are listed in Article 6 of the GDPR. In detail, the possible permissions are: the consent of the data subject (litera a); the performance of a contract (litera b); compliance with a legal obligation (litera c); the protection of vital interests (litera d); the performance of a task carried out in the public interest or in the exercise of official authority (litera e); and the legitimate interests of the controller or a third party, provided these are not overridden by the interests of the data subject (litera f).
Another important legal permission is stated in Section 26 of the BDSG, which specifically applies in the employment context. Accordingly, personal data of employees may be processed for purposes of employment. This concerns, among other things, recruitment, employment and termination of employment and also includes the processing of special categories of data. According to this provision, data processing in a company may also be carried out on the legal basis of a works agreement (see 2.2 Sectoral and Special Issues (Employment Data) for details).
Data Protection by Design and by Default
It is the controller’s obligation to implement appropriate technical and organisational measures to ensure that the data protection principles are complied with (Article 25 (1), GDPR). This must be taken into account when designing the procedure and during the actual processing. The relevant data protection principles are listed in Article 5 of the GDPR (eg, data minimisation). As a specific obligation, the controller has to ensure that, by default, only personal data that is necessary for each specific purpose of the processing is processed (Article 25 (2), GDPR). All these regulations implement data protection by design and by default as a legal obligation. A violation can be punished with a fine.
Impact Assessment
Under certain conditions, the controller is obliged to conduct a data protection impact assessment in advance of the processing; this and the transfer impact assessment for international data transfers are the only mandatory impact assessments – no mandatory fairness or legitimacy assessments are required so far. At this point, the risk-based approach of the GDPR becomes evident: a prior impact assessment is not necessary for every data processing operation, but only where the processing results in a high risk to the rights and freedoms of natural persons. The purpose of the assessment is to identify high-risk processing operations and then take measures to minimise the associated risks. The DSK has drawn up a binding list of processing operations for which an impact assessment is or is not mandatory. This list has been approved by the EDPB.
The assessment should be carried out in three steps: a systematic description of the envisaged processing operations; an assessment of the risks to the rights and freedoms of the data subjects; and the determination of the measures envisaged to address these risks.
The impact assessment must be carried out for specific processing operations. Operations with similar risks may be grouped together.
Privacy Policies
There is no direct obligation to implement a data protection policy. However, there are numerous information obligations for certain processing situations (eg, Articles 13, 14 and 49 (1) litera a, GDPR) and the general obligation to be able to prove compliance with the GDPR (Article 5 (2), GDPR). It is therefore recommended that a comprehensive data protection concept is developed and communicated to customers and/or employees. The information obligations are mandatory.
Furthermore, codes of conduct may exist in industry associations (see 1.5 Major NGOs and Self-Regulatory Organisations). They can also become part of a company’s privacy policy.
Data Subject Rights
It is one of the central goals of the GDPR to strengthen the rights of data subjects. For this reason, every data subject has the right of: access (Article 15); rectification (Article 16); erasure (Article 17); restriction of processing (Article 18); data portability (Article 20); and objection (Article 21).
Apart from these rights, a data subject has the right to withdraw its consent at any time and without any reasons (Article 7, GDPR) and the right to lodge a complaint with a supervisory authority.
Some of these rights have certain preconditions that must be met before the affected person can exercise them. Nevertheless, it is the duty of the controller to be able to comply with these rights in a timely and transparent manner (Article 12, GDPR). The data protection authorities can enforce these obligations (Article 58 (2) litera c, GDPR).
Anonymisation, De-identification, Pseudonymisation
Anonymisation, according to the GDPR, means that data can no longer be associated with a person. It is therefore no longer personal and is no longer subject to the provisions of the GDPR. However, it is important to note that data is only anonymous if it can no longer be traced back to a person under any circumstances. This is not the case with pseudonymisation. Here, the name of a person is replaced by a combination of different letters and figures. However, as long as it is possible for the controller to re-establish the connection to the name or to identify the person on the basis of the remaining personal information, the data continues to be personal in the meaning of the law. Nevertheless, pseudonymisation can be a useful technical protection measure (Article 25, GDPR), because it can, among other things, prevent unauthorised third parties from using the data.
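As a purely illustrative sketch of how pseudonymisation can work in practice, a keyed hash (HMAC) can replace names with stable tokens. The key stands in for the "additional information" kept separately: whoever holds it can re-establish the link, so the data remains personal within the meaning of the GDPR; the key handling shown here is an assumption for the example, not a prescribed method:

```python
# Illustrative pseudonymisation as a technical measure (Article 25 GDPR).
# The HMAC key is the "additional information": the controller holding it can
# re-derive tokens and link records; a third party without the key cannot.

import hashlib
import hmac

SECRET_KEY = b"keep-this-separate-from-the-dataset"  # assumption: stored securely elsewhere

def pseudonymise(name: str) -> str:
    """Replace a name with a stable token derived from it via HMAC-SHA256."""
    return hmac.new(SECRET_KEY, name.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

record = {"name": "Erika Mustermann", "order": "A-4711"}
record["name"] = pseudonymise(record["name"])
print(record)  # the name is replaced by a token; the key holder can re-derive it
```

Because the same name always yields the same token, records stay linkable for analysis, which is precisely why pseudonymised data is not anonymous: re-identification remains possible for anyone holding the key.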
Profiling, Automated Decision-Making, Big Data Analysis and AI
These ways of processing pose a higher risk to the rights and freedoms of data subjects than normal data processing. This does not mean that they are prohibited. However, following the GDPR’s risk-based approach, these ways of processing personal data increase the obligations of the controller. For instance, as already mentioned, in the case of profiling, the controller has to conduct an impact assessment. Besides, Article 22 of the GDPR permits automated individual decision-making that has legal effects only in specific cases. In addition, the German legislature has created two further permissions for automated decision-making in Section 37 of the BDSG. These requirements would also apply to microtargeting, which has not been addressed specifically by regulatory requirements so far.
Injury or Harm
The GDPR provides, in Article 82, for a claim for damages for all immaterial and material damage resulting from a violation of data protection. Recital 85 of the GDPR gives the following examples of such damages: loss of control over personal data, restriction of rights, discrimination, identity theft or fraud, financial loss, unauthorised reversal of pseudonymisation, damage to reputation, loss of confidentiality of data subject to professional secrecy, or other significant economic or social harm to the natural person concerned. The claim can be directed against the controller or the processor. Their fault is presumed, but they may show that they are not responsible for the event giving rise to the damage and thereby avoid liability. This provision is quite favourable to the data subject.
In addition, a so-called model declaratory action may be brought before a court. This is based on Sections 606 et seq of the Code of Civil Procedure (Zivilprozessordnung, or ZPO). This legal instrument allows certain consumer protection associations to bring such an action and thereby have a court determine whether consumers are entitled to damages. This can also refer to compensation for damages according to Article 82 of the GDPR. If the action is successful, the affected consumers must themselves, in a second lawsuit, take legal action to obtain actual compensation based on the findings made in the model proceedings.
The risk of representative actions in the case of data protection violations will increase in the future. The EU has adopted a new directive providing for the Europe-wide introduction of representative actions to protect consumer interests (Directive 2020/1828). Breaches of the GDPR are explicitly mentioned as a basis for such lawsuits. What is special about these lawsuits is that qualified associations can not only bring an action for a declaration of a breach, but also directly for redress measures. Such measures include compensation, repair, replacement, price reduction, contract termination or reimbursement of the price paid (Article 9 (1) Directive 2020/1828). This possibility did not exist in this form in German law until now. This directive was transposed into German law in October 2023.
“Sensitive” Data
Different rules apply to the processing of special categories of data. According to Article 9 of the GDPR, these special categories of data are information relating to:
In all these cases, data cannot be processed on the legal basis of Article 6 of the GDPR. Instead, the permissions are listed in Article 9 (2) of the GDPR. However, the regulation contains some so-called opening clauses. These allow each EU member state to pass its own rules in a specific matter. The German legislature has used the opening clauses regarding special categories of data. For example, under Section 22 (1) No 1 of the BDSG, data may be processed in connection with social security and social protection law. This concerns, for example, pension and health insurance funds. However, it must be taken into account that sector-specific regulations – such as Sections 67 et seq of the Social Code (Sozialgesetzbuch, or SGB X) – may also apply. The processing of special categories of data of employees is regulated in Section 26 (3) of the BDSG. Accordingly, such special employee data may be processed, for example, if this is necessary for the fulfilment of legal obligations arising from labour law and if there is no reason to assume that the data subject’s legitimate interest in the exclusion of the processing outweighs the employer’s interests. Another special category of data concerns data on criminal convictions and offences (Article 10, GDPR). This data may in principle only be processed by public authorities. Private individuals or companies may do so only under certain conditions and to a very limited extent.
Children’s Data
It is not generally prohibited to process the data of minors. However, there are some special rules, and the obligations of controllers are tightened. If the processing is based on the consent of the minor, Article 8 of the GDPR applies. If the minor concerned has reached the age of 16, they can give their own consent. If they are under 16 years of age, the persons having parental responsibility must give their consent. This only applies if the service offered is aimed directly at children. Due to another opening clause, EU member states can lower this age limit; Germany has not made use of this option. There are also special obligations for controllers processing children’s data. For example, the controller has to ensure that the language used is suitable for children. It must be noted that, in all balancing decisions, the protection of children’s data has a particularly high priority. Apart from data protection law, national advertising law also imposes special requirements on advertising aimed at minors.
Communication Data
The GDPR does not contain specific provisions on communication data. These were to be provided by the ePrivacy Regulation, which was originally intended to come into force at the same time as the GDPR. However, the legislative process has still not been completed. As a result, the processing of communications data is governed by the interplay rule of Article 95 of the GDPR: the rules of the GDPR apply to communication data unless a more specific rule exists in the ePrivacy Directive. In Germany, this Directive was mainly implemented in the TTDSG.
Section 3 of the TTDSG prohibits the providers of telecommunications services from obtaining knowledge of the contents and circumstances of any communication. This concerns telephone calls as well as internet communication. The only exceptions concern acts necessary for the provision of the service or the protection of technical systems. In addition, only specific legal standards can justify the disclosure of data, such as the investigative powers of public authorities. Sections 9 and 10 of the TTDSG permit and at the same time limit the processing of traffic data and data necessary for determining charges.
The provisions of the TTDSG only apply if telecommunications services are provided. This classification may be difficult in individual cases. Internet service providers and telephone providers, for example, fall under the aforementioned provisions of the TTDSG. The decisive factor is whether a service provider transmits signals in a technical way. This is the case with internet and telephone connections. In regard to the email service Gmail, on the other hand, the ECJ has ruled that this is not a telecommunications service, as there is a lack of sufficient signal transmission. SkypeOut, on the other hand, was classified as a service within the meaning of the TTDSG, since regular telephone numbers could also be called using the service.
Employment Data
The legal framework for handling employee data has long been a much-discussed topic in Germany. There have even been attempts to create a separate law for employee data. With the GDPR, the most important legal requirements are now Article 88 of the GDPR in conjunction with Section 26 of the BDSG. Accordingly, employee data may be processed by the employer if it is necessary for the establishment, performance or termination of the employment relationship. This also applies if the data processing is necessary to fulfil obligations arising from a law, a collective agreement or a works agreement. The latter represents a useful instrument for regulating internal company data protection policies.
In addition, employees can also give their consent to data processing. However, special care must be taken to ensure that consent is given voluntarily and that no direct or indirect pressure is exerted. For example, no pressure is exerted if consent is obtained to process data required for bonus payments.
Cookies
Regarding website cookies, the ECJ has ruled that pre-checked boxes do not qualify as valid consent. This does not mean that all cookies require consent, but where it is required, active action is necessary to meet the legal requirements for consent. In addition, the person concerned must be informed about the functioning and duration of the cookies. It should also be noted that consent under Article 7 (4) of the GDPR may be invalid if consent is given to data processing that is not at all necessary for the desired action. For a best practice design of website cookies, it is therefore recommended that a distinction is made between necessary and unnecessary cookies and that the visitor has the opportunity to consent to the various categories.
Hate Speech, Disinformation, Terrorist Propaganda, Abusive Material, Political Manipulation, Bullying, Etc
With regard to hate speech, terrorist propaganda and abusive material on social networks, the German legislature has taken a special path and created a law specifically for this purpose: the Network Enforcement Act (Netzwerkdurchsetzungsgesetz, or NetzDG). According to this law, operators of social networks are obliged to set up an effective complaint management system for certain illegal content, especially hate speech, terrorist propaganda and abusive material. Furthermore, they have to delete affected content if necessary and have to report on the complaints and deletions to the authorities. Currently, a change in the law is in progress that may lead to further obligations for the operators. For example, the possibility that the operators must report illegal content and information about the account holder directly to the police is being discussed. The dissemination of fake news has not yet been covered by this law or made into a punishable offence.
The admissibility of advertising is governed by the Law against Unfair Competition (Gesetz gegen den unlauteren Wettbewerb, or UWG) and the data protection laws. In principle, the prior consent of the recipient must be obtained. This applies equally to advertising calls and advertising emails. The advertiser must be able to evidence the consent. While there is no mandatory procedure for this, the double opt-in with a registration and confirmation message has established itself as the de facto market standard. In addition, the sender must always be identifiable in advertising by email and there must be an opportunity to object to the advertising.
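The double opt-in procedure described above can be sketched as follows. This is a minimal illustration under stated assumptions, not a real mailing-list API: the class and method names are hypothetical. The key point is that registration alone creates only a pending entry, and the address joins the list only after the emailed token is confirmed, which gives the advertiser documented evidence of consent.

```python
import secrets

# Hypothetical sketch of the double opt-in market standard: registration
# and confirmation are separate steps, and only confirmation subscribes
# the address. All names are illustrative.
class DoubleOptIn:
    def __init__(self):
        self.pending = {}      # confirmation token -> email address
        self.subscribers = []  # confirmed addresses only

    def register(self, email):
        """Step 1: store the request and return the token to be emailed."""
        token = secrets.token_urlsafe(16)
        self.pending[token] = email
        return token

    def confirm(self, token):
        """Step 2: activate the subscription only on a valid confirmation."""
        email = self.pending.pop(token, None)
        if email is not None:
            self.subscribers.append(email)
        return email is not None

mailing_list = DoubleOptIn()
token = mailing_list.register("alice@example.com")
print(mailing_list.subscribers)        # [] - registration alone is not consent
mailing_list.confirm(token)            # confirmation link clicked
print(mailing_list.subscribers)        # ['alice@example.com']
```

In practice the stored token, timestamps and the confirmation event would be retained as the evidence of consent that the advertiser must be able to produce.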
Section 7(3) of the UWG provides for an exception for advertising by email, in which no prior consent is required. The following conditions must be met:
- the advertiser obtained the customer's email address in connection with the sale of goods or services;
- the email address is used for direct advertising of the advertiser's own similar goods or services;
- the customer has not objected to this use; and
- the customer is clearly informed, when the address is collected and each time it is used, that they may object at any time without incurring costs other than transmission costs at the basic rates.
When using web tracking tools and other analysis tools, care must be taken to ensure that the provisions of the GDPR are observed in the processing of personal data. In the opinion of the ECJ, it is even possible for the website operator and the provider of the analysis tool to be regarded as joint controllers.
Infection Prevention Measures
During the COVID-19 pandemic, the federal government developed and published a “Corona Warning App”. The app stores encounters between mobile devices in anonymised form so that, in the event of a positive test, the person concerned can notify their contacts via the app. Employers should not require their employees to use the app, because its use rests on the legal basis of voluntary consent, and consent cannot be given voluntarily if the employer instructs its use. The employer may, however, recommend its use.
Similarly, employers should refrain from mandatory temperature checks before entry to the workplace. Temperature readings constitute sensitive health data (Article 9, GDPR) that does not provide reliable information about a coronavirus infection and is therefore not necessary.
In autumn 2021, in the context of COVID-19 measures, the German legislature required employers to only admit employees to the premises who were fully vaccinated, had fully recovered from a previous infection or could provide a negative test result. This applied until February 2023.
A common response to the pandemic was the widespread introduction of working from home. The German government imposed an obligation on companies to offer employees the opportunity to work from home, provided there were no operational reasons preventing it. Companies must ensure that adequate security measures (Article 32, GDPR) are in place to protect data on the company network, such as a secure VPN connection, two-factor authentication and employee training. The exact measures always depend on the specific individual case.
Monitoring
When monitoring the email traffic and internet use of employees, it makes an important difference whether employees are allowed to use the company IT systems for private purposes or not. If private use is permitted, the employer may be considered a provider of telecommunications services within the meaning of the TKG (see 2.2 Sectoral and Special Issues). If the employer were a telecommunications provider, surveillance would only be possible in very limited exceptional cases, for example, in the case of a justified suspicion of misuse. If private use is prohibited, the employer may monitor the use at least by random sampling. This does not apply to telephone conversations, which in both cases may not be recorded. Recording of telephone conversations requires a special legal basis or the consent of all participants. It is widely recognised that IT threat detection is permissible even if the employer is considered a telecommunications service provider.
Protection Measures
The employer may protect its technical systems by, for example, blocking access to websites or using a firewall. In some cases, there might also be a legal obligation to impose protection measures. As far as possible, these measures should be carried out without the processing of employees’ personal data. Otherwise, a legal basis is required. In most of these cases, Article 6 litera c (legal obligation) or litera f (legitimate interest) of the GDPR will apply.
Works Councils
Apart from telecommunications law, the provisions of labour law have to be followed. The works council must approve all technical equipment that is suitable for monitoring the behaviour of employees (Section 87 (1) No 6, Works Constitution Act (Betriebsverfassungsgesetz, or BetrVG)). This applies to explicit monitoring equipment such as video cameras and internet log files, as well as to any equipment that allows conclusions to be drawn about work behaviour – for example, a photocopier that logs print jobs on a per-user basis or an access system that records the identity of employees entering and leaving the building. Works councils further have a right to information concerning data protection according to Section 80 (1) No 1, (2) of the BetrVG, but specific questions of compliance are not subject to co-determination.
Whistle-Blowing
The EU has issued a directive on whistleblowing, which the German legislature has implemented with effect from 2 July 2023 through the Whistleblower Protection Act (Hinweisgeberschutzgesetz, or HinSchG). The law does not require internal or external reporting channels to accept anonymous reports; however, it recommends that this be made possible. Under Section 4d of the Financial Services Supervision Act (Finanzdienstleistungsaufsichtsgesetz, or FinDAG), whistle-blowers can contact the Financial Supervisory Authority to report legal violations in the financial sector. This can also be done anonymously. The German Act on the Protection of Business Secrets (Geschäftsgeheimnisgesetz, or GeschGehG) also provides that information on illegal conduct may be disclosed if this serves the public interest.
Artificial Intelligence
The GDPR applies as soon as AI is used to produce summaries or evaluations and personal data is processed.
Supervisory Authorities
The supervisory authorities must first investigate the facts of the case before initiating any sanctions. To this end, the competent authority may request information from the controller. The authority will then issue a formal decision stating that a breach of data protection has occurred and, if necessary, impose sanctions. In any case, the party concerned must be heard before any sanctions are imposed. In general, these sanctions must be dissuasive but also proportionate. It is important to note that fines are only one part of the set of sanctions available to the authorities. The others are:
As regards fines, the model already mentioned in 1.3 Administration and Enforcement Process is expected to be revised. Irrespective of this, the supervisory authorities have also imposed numerous fines in recent years. These include the following.
Once a data protection breach has been identified, the authority is also entitled to inform the data subjects and other authorities of the breach. Germany has also adopted further provisions in Sections 42 and 43 of the BDSG concerning criminal offences and administrative offences relating to the unlawful processing of personal data.
Private Litigation
In order to obtain damages, Article 82 of the GDPR requires the following: an infringement of the GDPR, material or non-material damage suffered by the data subject, and a causal link between the infringement and the damage.
Under these conditions, the data subject may claim damages from the controller or the processor.
Law enforcement authorities may request information on personal data for the investigation of criminal offences. If there is a legitimate request for information, it must be complied with. It does not matter whether the crime is a serious offence. Judicial approval is not required. However, it is necessary for the requesting authority to specify the requested data, to give reasons as to why the data is relevant and to state the legal basis for the request for information. The transmission of the data to the authority can then be based on Article 6 litera c of the GDPR and Section 24 of the BDSG. In order to avoid violations of data protection, enquiries and information should be carefully documented.
It should be noted that special rules apply to requests for information on telecommunications data. In principle, this may only be requested in the case of serious crimes and only by court order (Section 100a ff of the Code of Criminal Procedure (Strafprozessordnung, or StPO)). In case of imminent danger – ie, in urgent cases – the public prosecutor’s office can order the information to be provided. However, this must be approved by a court afterwards.
In June 2023, the European Parliament adopted the so-called e-Evidence package on electronic evidence in criminal matters, which is intended to enable investigating authorities to request digital evidence directly from telecommunications providers in other EU countries in the future.
Very similar rules to those discussed in 3.1 Laws and Standards for Access to Data for Serious Crimes apply in the field of national security. In these cases, the authorities can demand information if it is necessary for the prevention of imminent threats. These demands have to be followed if the data request is specific, states reasons for the relevance of the requested data and names a legal basis. Telecommunications data is again subject to increased requirements.
Although Germany has signed the Declaration on Government Access to Personal Data Held by Private Sector Entities, there are not yet any practical implications.
A request for information from a court or authority of a non-EU country is only legitimate if the request is based on an international agreement, such as a mutual legal assistance treaty, in force between the requesting third country and the EU or a member state (Article 48, GDPR). The US Cloud Act, for example, does not meet the requirements of a mutual legal assistance treaty in the meaning of the law. It is contrary to the requirements of the GDPR and cannot be used as a legal basis for any transfer of data. Irrespective of Article 48 of the GDPR, other legal bases from Articles 44-50 of the GDPR may justify the transfer in individual cases (see 4.2 Mechanisms or Derogations that Apply to International Data Transfers).
One of the most controversial issues from a legal and political point of view is so-called data retention. Under such rules, telecommunications companies are obliged to store connection data for a longer period so that law enforcement and security authorities can access it under certain conditions. A first statutory regulation was declared invalid by the German Constitutional Court in 2010. In 2014, the ECJ likewise declared the corresponding EU directive invalid. In 2015, the German legislature nevertheless enacted a new law regulating data retention. This was also declared invalid by the ECJ. Since then, the so-called “quick freeze procedure” has been under discussion. Under this procedure, investigating authorities can have relevant telecommunications data “frozen” immediately at the providers if there is a suspicion of a criminal offence of considerable importance.
If personal data is transferred within the EU, the general rules apply (see 2.1 Omnibus Laws and General Requirements). It is a key objective of the GDPR to create a single market for personal data in the EU. Since the same rules apply in all EU member states, no special rules need to be followed when transferring data to other EU countries.
If personal data is transferred in countries outside the EU, there are additional rules that need to be followed (Articles 44–50, GDPR). First of all, there has to be an additional legal basis. This can be an adequacy decision by the European Commission (Article 45, GDPR), appropriate safeguards (Article 46, GDPR) or specific exemptions – eg, consent (Article 49, GDPR). Apart from that, all other rules (general legal basis, rights of the data subject, etc) apply as well.
EU companies doing business in the UK should consider whether they are required to appoint a UK representative. This is required by UK law under certain circumstances, especially if the company does not have a branch in the UK.
The European Commission can decide that a specific country has an adequate level of data protection (adequacy decision, Article 45, GDPR). If there is such a decision, it serves as a legal basis for the transfer of personal data to this country. The Commission has so far approved Andorra, Argentina, Canada, Faroe Islands, Guernsey, Israel, Isle of Man, Japan, Jersey, New Zealand, South Korea, Switzerland, United Kingdom and Uruguay.
In the absence of an adequacy decision, Article 46 of the GDPR lists various possibilities for adequate safeguards. In addition to these safeguards, enforceable data subject rights and effective legal remedies for data subjects have to be ensured. Appropriate safeguards are, for example, binding corporate rules, standard data protection clauses approved by the Commission or a supervisory authority, approved codes of conduct, certifications or approved individual contract clauses.
With regard to SCCs, the SCCs newly issued in 2021 must now be used and the requirements of the ECJ (see 1.7 Key Developments) must still be observed. The court's requirement that any controller seeking to transfer data to a third country must check the adequacy of the level of data protection in that third country in advance has been made part of the new SCCs. In particular, the extent to which government agencies can access the data in the destination country must be checked. If the controller concludes that the level of protection is not adequate, it must either terminate the transfer or take supplementary measures to ensure adequate protection of the data. The recommendations of the data protection authorities, which suggest various contractual, organisational and technical measures that can be taken in such a situation, should also be taken into account.
If there is neither an adequacy decision nor appropriate safeguards, Article 49 of the GDPR provides specific exemptions for the transfer of data. These are, inter alia, explicit consent by the data subject, the performance of a contract concluded with or in the interest of the data subject, the protection of vital interests or reasons of public interest.
Apart from the decision itself, transfers based on Article 45 of the GDPR do not require further government approval. The appropriate safeguards listed in Article 46 of the GDPR are all authorised by a public body at some point in time. Transfers based on Article 49 of the GDPR generally do not require government approval. However, if a transfer is based on Article 49 (1) subparagraph 2 of the GDPR (compelling legitimate interest), the controller has to inform the supervisory authority.
There are no specific data localisation requirements under German law.
There are no obligations to share software code, algorithms or similar technical details with the government. Information regarding technical details might come to the attention of a data protection authority because of an investigation by that authority. However, it acts independently from other state bodies and is not allowed to share this information.
In the case of foreign government data requests, foreign litigation proceedings (eg, civil discovery) or internal investigations, the general rules on international data transfers (see 4.2 Mechanisms or Derogations that Apply to International Data Transfers) and on foreign government requests (see 3.3 Invoking Foreign Government Obligations) apply.
The only blocking statute currently affecting German companies was passed by the EU and is intended to protect companies from US sanctions against Iran. The Blocking Statute nullifies any court rulings based on the sanctions and provides for the possibility of compensation.
The topic of big data analysis is still being intensively discussed at the academic level. Specific laws on this topic have not yet been enacted. In principle, the GDPR does not prohibit comprehensive analyses of personal data. However, numerous questions arise, for example, with regard to identifiability, change of the purpose of the processing, data minimisation and effective rights management. In addition to data protection law, other legal areas are also affected. For example, there is the question of liability in the event of inadequate data quality.
The discussion is similar with regard to algorithms that regularly require large data collections. If these algorithms make decisions with legal effect, the question of liability for erroneous decisions or for an erroneous data basis arises again. In this context, there is a discussion about creating legal personality for algorithms and/or autonomous systems. In addition, there is the specific problem of non-discrimination, for example, when an algorithm decides on the hiring of persons and ends up perpetuating the discrimination that already existed in the data collection based on previous (human) hiring decisions.
The German legislature has made the first legal regulation in the area of liability for automated systems in the case of autonomous motor vehicles. In doing so, it has defined criteria that an automated car system must meet. Furthermore, the legislature has stipulated that the vehicle driver is still liable even if permitted to transfer vehicle control to the system and to turn their attention away from the traffic. At the same time, however, the driver must remain on standby during the drive to take control of the vehicle again. This is, so far, the only example of specific liability rules regarding autonomous systems.
With regard to profiling and automated decisions, see 2.1 Omnibus Laws and General Requirements.
The Federal Ministry of the Interior has appointed an independent commission of experts to consider the ethical processing of data in the future (Data Ethics Commission, Datenethikkommission). The Commission has proposed a comprehensive catalogue of measures. Among other things, it recommends more specific regulation regarding big data analyses, effective supervision and an adaptation of liability law. With very few exceptions, the Commission considers the use of algorithms in government decisions as unjustifiable.
The federal government has announced that it intends to take the recommendations into account. So far, no specific regulatory proposals have been presented.
Please see 2.5 Enforcement and Litigation.
Data protection plays an important role in due diligence in corporate transactions in two ways. From the buyer's point of view, there is interest in whether the company to be purchased has acted in conformity with data protection regulations or whether financial risks exist in the form of fines or claims for damages due to data protection violations. If the target runs a digital business, it is also important to establish whether the business model itself is data protection compliant, as otherwise the target's value would be greatly reduced.
In addition, however, the due diligence itself is a data processing exercise, as personal data on employees and/or customers is often disclosed to potential buyers. There must be a review as to which legal basis applies. As a rule, the legitimate interest pursuant to Article 6 litera f of the GDPR will serve as the legal basis. Care must be taken to ensure that only data that is really necessary is disclosed and that this data is anonymised as much as possible.
There are no laws that require a public disclosure of an organisation’s cybersecurity risk profile. The operators of facilities that constitute critical infrastructure (eg, energy suppliers) must submit their IT security plan to the competent authority. However, this does not constitute a public disclosure.
The need to regulate large technology companies arose almost simultaneously at the German and European levels. As a pioneer, the German legislature introduced Section 19a GWB (Gesetz gegen Wettbewerbsbeschränkungen), which enables the German Federal Cartel Office to designate large companies as “gatekeepers” and to prohibit them from engaging in certain types of conduct in order to protect competition. In 2017, the German legislature had already created the NetzDG (Netzwerkdurchsetzungsgesetz), which regulates social networks to protect public safety by combating hate crime. The European legislature has given an indication of future regulation with several proposed regulations, for example the AI Act, the Digital Services Act, the Data Act and the Digital Markets Act. Once these EU regulations enter into force, Germany will also be subject to them.
The obligations of the GDPR depend on whether a company is to be regarded as a controller. The term is currently being further extended by the ECJ’s case law (see 1.7 Key Developments). Particularly in relation to co-operation models, it remains to be precisely determined where the boundary lies between controlling, processing and other auxiliary activities.
Georg-Glock-Straße 4
40474 Düsseldorf
Germany
+49 211 600 55 166
+49 211 600 55 050
p.kempermann@heuking.de
www.heuking.de