Data Protection & Privacy 2026 Comparisons

Last Updated March 10, 2026

Contributed By Gerrish Legal

Law and Practice

Authors



Gerrish Legal is a Paris and Stockholm-based boutique law firm with additional presence in London, specialising in privacy, data protection, AI and technology law. With lawyers qualified in France, England and Wales and Ireland, the firm’s multilingual team advises international clients – from scale-ups to listed multinationals – across sectors such as SaaS, life sciences, fashion, recruitment, security and catering. Its core practice focuses on GDPR compliance, international data transfers, AI regulation, digital platform regulation, privacy-by-design frameworks, data breach management and privacy litigation. The firm also has strong expertise in commercial law, particularly technology contracts (SaaS), cross-border commercial arrangements and intellectual property matters. Gerrish Legal advises both EU-based organisations on privacy compliance and non-EU companies expanding into Europe (particularly France) by adapting their data governance frameworks and commercial practices to EU regulatory requirements, including the GDPR, AI Act, Data Act and sector-specific digital regulations.

EU data protection law is grounded in EU constitutional sources: Article 16 of the Treaty on the Functioning of the European Union (TFEU) provides the legal basis for data protection legislation, while the Charter of Fundamental Rights of the EU enshrines respect for private and family life (Article 7) and expressly protects personal data (Article 8). Together, these provisions establish both EU legislative competence and the constitutional status of data protection as a fundamental right.

The centrepiece of the data privacy framework is the General Data Protection Regulation (Regulation (EU) 2016/679 (GDPR) on the protection of natural persons with regard to the processing of personal data and on the free movement of such data), which was adopted by the EU and became directly applicable across all member states on 25 May 2018 without the need for national transposition.

The GDPR prevails over incompatible national provisions. Its objective is to ensure an equal and harmonised level of protection for personal data across the Union while safeguarding the free movement of such data. It operates as the EU’s general (horizontal) regime for personal data processing.

These sources interact through the primacy and direct applicability of EU law: the GDPR serves as the default framework, while sectoral instruments either displace it for specific scopes or add rules consistent with GDPR principles (lex specialis).

EU-level interpretation is also shaped materially by Court of Justice of the European Union (CJEU) case law. The CJEU’s jurisprudence has been central in defining core concepts of data protection law.

Additionally, EU regulatory convergence is promoted through EDPB soft-law instruments (for example, Guidelines 05/2021). While formally non-binding, these guidelines play a significant harmonising role in enforcement practice.

The GDPR has explicit extraterritorial reach where processing relates to the offering of goods or services to individuals in the EU or monitoring their behaviour in the EU (Article 3(2), GDPR), and its international transfer regime can significantly extend compliance requirements beyond the EU (Chapter V, GDPR), as reflected in leading transfer case law (CJEU, Schrems II, Case C-311/18).

The EU privacy/data protection framework also overlaps with EU regimes on non-personal data, cybersecurity and AI: non-personal and mixed datasets are addressed through the free flow framework and data economy rules (Regulation (EU) 2018/1807 (Free Flow of Non-Personal Data); Regulation (EU) 2022/868 (Data Governance Act); Regulation (EU) 2023/2854 (Data Act)), while security obligations intersect with GDPR security requirements (Directive (EU) 2022/2555 (NIS2 Directive); Article 32, GDPR). AI governance adds further product- and risk-based obligations that must be implemented alongside GDPR where personal data is used (Regulation (EU) 2024/1689 (AI Act)).

Under EU law, the general principles governing the processing of personal data are set out in the GDPR.

Article 5, GDPR establishes core principles which structure all processing activities: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality; and accountability (Article 5(1)-(2), GDPR).

Chapter II (Articles 5–11), GDPR sets out the substantive conditions for lawful processing. Processing must be based on one of the lawful grounds listed in Article 6, GDPR (consent, contract, legal obligation, vital interests, public task or legitimate interests) (Article 6(1), GDPR), with stricter conditions for special categories of data (Article 9, GDPR).

Chapter IV (Articles 24–43), GDPR establishes obligations for controllers and processors.

Controllers bear primary responsibility for ensuring and demonstrating compliance (Articles 5(2) and 24), must implement data protection by design and by default (Article 25), conduct data protection impact assessments where processing is likely to result in high risk (Article 35), appoint a data protection officer where required (Articles 37–39) and ensure appropriate technical and organisational security measures proportionate to risk (Article 32).

Processors may process data only on documented instructions from the controller (Article 28), must implement appropriate security measures (Article 32) and assist controllers in fulfilling data subject rights and compliance duties.

Organisations acting as controllers or processors must operationalise the accountability principle (Articles 5(2) and 24, GDPR) through documentation, internal governance structures and demonstrable risk management practices.

Chapter III (Articles 12–23) also grants data subjects a broad catalogue of enforceable rights. These include the right to transparent information at the time of collection (Articles 12–13); the right of access (Article 15); rectification (Article 16); erasure (“right to be forgotten”) (Article 17); restriction of processing (Article 18); data portability (Article 20); and objection, including an absolute right to object to direct marketing (Article 21). Individuals also have rights relating to automated decision-making and profiling (Article 22). Requests must generally be answered within one month (Article 12(3)), and data subjects may lodge complaints with a supervisory authority (Article 77) and seek judicial remedies (Articles 78–79).

The CJEU has played a central role in interpreting these rights (for example, Google Spain, C-131/12; Schrems II, C-311/18).

Key “to dos” include:

  • maintaining a record of processing activities (Article 30);
  • conducting data protection impact assessments where processing is likely to result in high risk (Article 35), and consulting supervisory authorities where residual high risk remains (Article 36);
  • implementing appropriate technical and organisational security measures (Article 32);
  • establishing procedures for detecting and managing personal data breaches, including notification within 72 hours where required (Articles 33–34);
  • appointing a data protection officer where mandatory (Article 37); and
  • ensuring lawful transfer mechanisms for international data flows (Chapter V GDPR).

These obligations operate cumulatively with relevant sectoral instruments such as the e-Privacy Directive (Directive 2002/58/EC) and, where applicable, Directive (EU) 2016/680.

As a principle, Article 9(1), GDPR prohibits the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs or trade union membership, as well as genetic data, biometric data used for unique identification, health data, and data concerning sex life or sexual orientation.

By exception, such processing is permitted only where one of the exhaustively listed grounds in Article 9(2) applies, namely:

  • explicit consent (Article 9(2)(a));
  • employment, social security and social protection obligations under EU or Member State law (Article 9(2)(b));
  • vital interests (Article 9(2)(c));
  • legitimate activities of certain non-profit bodies (Article 9(2)(d));
  • data manifestly made public by the data subject (Article 9(2)(e));
  • legal claims (Article 9(2)(f));
  • substantial public interest (Article 9(2)(g));
  • preventive or occupational medicine and health care management (Article 9(2)(h)–(3));
  • public health (Article 9(2)(i)); and
  • archiving, scientific or historical research and statistical purposes subject to safeguards (Article 9(2)(j) and Article 89(1)).

Automated decision-making based on special categories of data is further restricted by Article 22(4) of the GDPR. Member states may introduce additional conditions for genetic, biometric and health data (Article 9(4), GDPR).

The GDPR also provides a specific regime for children’s data in the context of information society services. Where processing is based on consent (Article 6(1)(a), GDPR) and relates to the direct offer of such services to a child, consent is valid only if the child is at least 16 years old, unless member states lower the age threshold (not below 13) (Article 8(1), GDPR). For children below the applicable age, consent must be given or authorised by the holder of parental responsibility, and controllers must make reasonable efforts to verify this (Article 8(2), GDPR).
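For controllers building age-gating into information society services, the Article 8 threshold logic can be sketched as follows. This is a Python illustration only: the function name and the national thresholds shown are hypothetical placeholders and must be verified against current member state law.

```python
# Sketch of the Article 8 GDPR age-of-consent check for information
# society services offered directly to a child. The default threshold
# is 16; member states may lower it, but never below 13. The national
# values below are illustrative placeholders only.

DEFAULT_THRESHOLD = 16  # Article 8(1), GDPR default

MEMBER_STATE_THRESHOLDS = {
    "FR": 15,  # illustrative national derogation - verify locally
    "SE": 13,  # illustrative national derogation - verify locally
}


def consent_requires_parental_authorisation(age: int, member_state: str) -> bool:
    """Return True if consent must be given or authorised by the holder
    of parental responsibility (Article 8(1)-(2), GDPR)."""
    threshold = MEMBER_STATE_THRESHOLDS.get(member_state, DEFAULT_THRESHOLD)
    if not 13 <= threshold <= 16:
        raise ValueError("Article 8(1) permits thresholds between 13 and 16 only")
    return age < threshold
```

A service relying on consent would run this check before collecting consent directly from the child, and otherwise trigger its parental-authorisation and verification workflow under Article 8(2).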

Data relating to criminal convictions, offences and related security measures are governed separately under Article 10, GDPR. Unlike Article 9 data, such data may be processed only under the control of a public authority or where authorised by EU or member state law providing appropriate safeguards.

Comprehensive registers of criminal convictions may be kept only under the control of a public authority. In the law enforcement context, processing by competent authorities falls under Directive (EU) 2016/680 (Law Enforcement Directive), which establishes a parallel regime.

Patient data is a special category of personal data under Article 9, GDPR. As a principle, the processing of health data is prohibited unless both an Article 6 legal basis and an Article 9(2) exception apply.

Health data means personal data relating to a person’s physical or mental health, including data generated through the provision of healthcare services, which reveal information about health status; genetic and biometric data are closely related special categories.

Only data that is truly anonymised falls outside the GDPR. Pseudonymised data remains personal data. The processing carried out to achieve anonymisation must itself be lawful.

In practice, life sciences companies most commonly rely on:

  • explicit consent (Article 9(2)(a)), given specifically for research or product development purposes, and provided it is explicit, freely given, specific and informed;
  • scientific research purposes (Article 9(2)(j)), provided the processing is based on EU or member state law and subject to appropriate safeguards under Article 89(1), GDPR (eg, pseudonymisation, access controls, data minimisation);
  • public interest in the area of public health (Article 9(2)(i)), such as ensuring high standards of quality and safety of healthcare or medical devices, where supported by EU or member state law; and
  • preventive or occupational medicine, diagnosis or management of healthcare systems (Article 9(2)(h) GDPR), particularly where the company acts on behalf of or in co-operation with healthcare providers subject to professional secrecy.

The European Health Data Space Regulation (EHDS), once applicable, will significantly impact life sciences companies operating in the EU by establishing a harmonised framework for secondary use of electronic health data.

Secondary use for research, innovation, regulatory purposes and public health policy will require a data permit issued by a national health data access body. Permits will specify the authorised purpose, datasets and conditions of use, and data will generally be accessed through secure processing environments, subject to strict purpose limitation, security, governance controls and a prohibition on re-identification. The EHDS expressly prohibits certain uses (eg, advertising, discriminatory decision-making, decisions detrimental to individuals in areas such as employment or insurance).

It further strengthens individuals’ rights (including enhanced access and portability, and in some cases the ability to restrict secondary use) and introduces interoperability obligations for electronic health record systems, reinforcing privacy by design under the GDPR.

Overall, the EHDS aims to facilitate innovation by expanding lawful access to large-scale health datasets while tightening governance and limiting purely commercial uses. Together, the GDPR and EHDS create a high-compliance but innovation-oriented regime: companies may anonymise and use patient data for development or research where valid grounds exist and robust safeguards are in place, while future access will increasingly depend on EHDS governance compliance.

Under EU law, any AI system that processes personal data is subject to the GDPR. Controllers must identify a valid Article 6 legal basis (and Article 9 exception where special categories are involved), comply with the core principles in Article 5 (including fairness, transparency, purpose limitation, data minimisation and accuracy), and implement data protection by design and by default (Article 25). Given the scale and complexity of AI training and inference, Article 32 security obligations are particularly significant. A Data Protection Impact Assessment is required under Article 35 where AI processing is likely to result in a high risk, including large-scale profiling, behavioural inference or use of sensitive data.

Article 22, GDPR also specifically restricts decisions based solely on automated processing that produce legal or similarly significant effects. Such decisions are prohibited unless one of the narrow exceptions applies: necessity for entering into or performing a contract, authorisation by EU or member state law with appropriate safeguards, or explicit consent. In those cases, controllers must ensure meaningful human intervention, enable individuals to express their views and contest the decision, and provide meaningful information about the logic involved and the envisaged consequences.

The Artificial Intelligence Act (AI Act) also establishes a harmonised, risk-based framework for the development, placing on the market and deployment of AI systems in the EU. It classifies systems into prohibited, high-risk, limited-risk and minimal-risk categories:

  • prohibited practices (eg, certain manipulative techniques, social scoring and specific biometric identification uses) may not be deployed;
  • high-risk systems, including those used in employment, creditworthiness, law enforcement, migration, education and critical infrastructure, are subject to ex ante conformity assessments and ongoing compliance obligations;
  • limited-risk systems are mainly subject to transparency obligations; and
  • minimal-risk systems are largely unregulated under the AI Act.

For high-risk systems, the Act imposes extensive requirements centred on risk management, technical documentation, record-keeping, robustness and cybersecurity. A core focus is data governance: providers must ensure appropriate data management practices, assess the quality of training, validation and testing datasets, implement bias detection and mitigation measures, and document data sources and preprocessing methods.

Transparency is required across several risk levels. Users must be informed when interacting with an AI system (eg, chatbots) or when content is AI-generated (eg, deepfakes). High-risk systems must also be accompanied by clear instructions for use and sufficient information to enable effective oversight.

Human oversight is a structural requirement for high-risk systems: they must be designed so that natural persons can understand system capabilities and limitations, monitor outputs and intervene or override where necessary.

The regimes are complementary: the GDPR governs the lawfulness of personal data processing and individual rights, while the AI Act regulates AI systems from a product safety and governance perspective. In practice, GDPR compliance is always required where personal data is involved, and additional AI Act obligations apply depending on risk classification, with transparency, data governance and human oversight forming the core pillars for high-risk AI.

Under EU law, a “personal data breach” is defined as a security breach leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data (Article 4(12), GDPR). This covers confidentiality breaches (unauthorised access/disclosure), availability breaches (loss/inaccessibility) and integrity breaches (unauthorised alteration).

Controllers and processors must implement appropriate technical and organisational measures to ensure a level of security appropriate to risk (Articles 5(1)(f) and 32, GDPR), adopting a risk-management approach rather than guaranteeing absolute security (eg, CJEU, Natsionalna agentsia za prihodite, Case C-340/21).

When a breach occurs, controllers must promptly assess whether it is likely to result in a risk to individuals’ rights and freedoms, and comply with the GDPR’s notification and documentation obligations.

Where the breach is likely to result in a risk, the controller must notify the competent supervisory authority without undue delay and, where feasible, within 72 hours after becoming aware of it (Article 33(1), GDPR); any delay must be justified.

Processors must notify the controller without undue delay after becoming aware of a breach (Article 33(2), GDPR). The notification must include, at minimum, a description of the breach, the categories and approximate number of data subjects and records concerned, contact details for the DPO or other point of contact, likely consequences and measures taken or proposed to address and mitigate the breach (Article 33(3), GDPR); information may be provided in phases (Article 33(4), GDPR).

Controllers must document all breaches (facts, effects, remedial action) to enable regulatory verification (Article 33(5), GDPR). Where the breach is likely to result in a high risk, the controller must communicate it to affected individuals without undue delay, using clear and plain language and providing equivalent core information and mitigation steps (Article 34(1)–(2), GDPR).

Communication to individuals may be avoided in limited cases, including where robust technical measures (eg, effective encryption) render the data unintelligible, or where subsequent measures eliminate the high risk, or where individual notification would involve disproportionate effort (in which case a public communication may be used) (GDPR, Article 34(3)).

Operationally, organisations should have a breach response plan that enables rapid detection, containment and documentation; an initial legal and technical qualification of the event; a structured risk assessment; timely regulator and (if required) individual notifications; and remediation steps (containment, patching, credential resets, access reviews, restoration and enhanced monitoring), followed by a post-incident review (Articles 24, 25 and 32, GDPR).
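The 72-hour timeline in Article 33(1) can be operationalised in incident-response tooling. The following Python sketch (function names are illustrative, not drawn from any official methodology) computes the notification deadline from the moment the controller becomes aware of the breach:

```python
# Sketch of the Article 33(1) GDPR notification deadline: notify the
# supervisory authority without undue delay and, where feasible, within
# 72 hours of becoming aware of the breach; any later notification must
# be accompanied by reasons for the delay.
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)


def notification_deadline(awareness_time: datetime) -> datetime:
    """Latest time for notifying the supervisory authority before
    reasons for delay must be given (Article 33(1), GDPR)."""
    return awareness_time + NOTIFICATION_WINDOW


def is_notification_late(awareness_time: datetime, notified_at: datetime) -> bool:
    """True if notification occurred after the 72-hour window."""
    return notified_at > notification_deadline(awareness_time)


aware = datetime(2026, 3, 1, 9, 0, tzinfo=timezone.utc)
print(notification_deadline(aware))  # 2026-03-04 09:00:00+00:00
print(is_notification_late(aware, aware + timedelta(hours=80)))  # True
```

In practice, the moment of "awareness" is itself a legal assessment; the sketch assumes it has already been fixed by the response team.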

Supervisory authorities have extensive investigative and corrective powers, including requiring information, conducting audits, ordering notifications, restricting processing and imposing administrative fines (Articles 57, 58 and 83, GDPR). However, corrective measures are not automatic and must be necessary and proportionate (see judgment of 26 September 2024, Case C-768/21).

Data breaches also create civil liability exposure and potential mass claims. Data subjects may lodge complaints with supervisory authorities (Article 77, GDPR) and seek judicial remedies and compensation for material or non-material damage (Articles 79 and 82, GDPR). The Court of Justice has confirmed that fear of potential misuse of data following a breach may itself constitute compensable non-material damage (Natsionalna agentsia za prihodite).

At EU level, privacy and data protection oversight is organised around national supervisory authorities, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS), with additional roles for sectoral regulators under specific instruments (eg, the ePrivacy Directive).

National Supervisory Authorities (NSAs)

Under Article 51, GDPR, each member state must designate at least one independent authority responsible for monitoring and enforcing the GDPR. NSAs are vested with investigative, corrective and sanctioning powers under Articles 57 and 58, GDPR, including the power to conduct audits and inspections, require information, order compliance, impose processing bans and administrative fines.

Proceedings are typically triggered by (i) complaints lodged by data subjects (Article 77, GDPR), (ii) referrals or co-operation requests from other supervisory authorities in cross-border cases, or (iii) ex officio investigations. In cross-border cases, the “one-stop-shop” mechanism (Articles 56 and 60, GDPR) designates a lead supervisory authority, which co-operates with other “concerned” authorities. Disputes may be escalated to the EDPB for a binding decision under Article 65, GDPR.

NSAs also enforce Directive 2016/680 (law enforcement data processing) and, depending on national law, the ePrivacy Directive (2002/58/EC), sometimes alongside or instead of sector-specific regulators (eg, telecommunications authorities). Their decisions are binding domestically, subject to judicial review.

European Data Protection Board (EDPB)

Established under Articles 68–76, GDPR, the EDPB ensures consistent application of the GDPR across the Union. It issues guidelines, recommendations and best practices (Article 70 GDPR), advises the European Commission (including on adequacy decisions), and adopts binding decisions in dispute resolution under Article 65. Its guidelines are formally non-binding but highly persuasive in practice; Article 65 decisions are legally binding on the national authorities concerned.

European Data Protection Supervisor (EDPS)

The EDPS supervises compliance with EU data protection rules by EU institutions, bodies and agencies (currently under Regulation (EU) 2018/1725). It exercises comparable investigative and corrective powers within the EU institutional framework and may issue reprimands, orders or administrative fines. Its decisions are binding on EU institutions and subject to review by the CJEU.

Sectoral and Related Authorities

Under the ePrivacy Directive, member states may designate specific authorities to enforce rules on confidentiality of communications, cookies and direct marketing. The GDPR’s one-stop-shop mechanism does not automatically apply to national rules implementing the ePrivacy Directive, which may lead to parallel national proceedings.

In addition, for international data transfers, the European Commission plays a key role through adequacy decisions (Article 45, GDPR), while national authorities retain investigative powers and may refer validity questions to the CJEU (as illustrated by Schrems case law).

Under EU law, investigations and enforcement actions in data protection matters are primarily governed by the GDPR, which establishes a decentralised enforcement system based on independent national supervisory authorities (Articles 51–59, GDPR), combined with a co-operation and consistency mechanism for cross-border processing (Articles 56, 60–66, GDPR).

Initiation of Investigations

Proceedings may be triggered by (i) a complaint lodged by a data subject (Article 77, GDPR), (ii) ex officio action by an NSA, or (iii) a co-operation request from other supervisory authorities in cross-border cases (Articles 60–61, GDPR). In cross-border scenarios, the “lead supervisory authority” (usually that of the controller’s main establishment) conducts the investigation under the one-stop-shop mechanism (Article 56, GDPR), in co-operation with “concerned” authorities. If disagreement persists, the matter may be referred to the EDPB, which can adopt a binding decision under the consistency mechanism (Article 65, GDPR).

Conduct of Investigations and Procedural Guarantees

NSAs have extensive investigative powers, including powers to order the provision of information, carry out audits and inspections, and obtain access to premises and data (Article 58(1), GDPR). They may adopt corrective measures (Article 58(2), GDPR), including warnings, reprimands, orders to comply, temporary or definitive processing bans, suspension of data flows and administrative fines.

The GDPR does not fully harmonise procedural timelines for investigations; these are governed by national administrative law, subject to EU law principles of effectiveness, equivalence and the right to good administration (Article 41, Charter of Fundamental Rights of the EU) and the right to an effective remedy (Article 47, Charter). Data subjects must be informed of the progress or outcome of their complaint and have the right to challenge legally binding decisions or inaction before national courts (Articles 78–79, GDPR).

Controllers must respond to data subject requests within one month, extendable by two further months where necessary (Article 12(3), GDPR). Failure to respond may prompt a complaint and subsequent enforcement. In practice, national laws may provide for hearings, written submissions and settlement-like discussions during investigations, but there is no harmonised EU-level “transaction” mechanism; informal resolution or commitments may nevertheless influence the authority’s choice of corrective measures.
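The Article 12(3) timeline can likewise be tracked programmatically. This Python sketch (helper names are illustrative; calendar-month arithmetic is simplified by clamping to the last day of shorter months) computes the response deadline for a data subject request:

```python
# Sketch of the Article 12(3) GDPR response timeline: one month from
# receipt of a data subject request, extendable by two further months
# where necessary given the complexity and number of requests.
# Month arithmetic here clamps to month-end (a simplification).
from datetime import date
import calendar


def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping the day to the target month's end."""
    month_index = d.month - 1 + months
    year = d.year + month_index // 12
    month = month_index % 12 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)


def response_deadline(received: date, extended: bool = False) -> date:
    """Initial one-month deadline, or three months in total where the
    controller has invoked the Article 12(3) extension."""
    return add_months(received, 3 if extended else 1)


print(response_deadline(date(2026, 1, 31)))                 # 2026-02-28
print(response_deadline(date(2026, 1, 15), extended=True))  # 2026-04-15
```

Note that the extension itself must be communicated to the data subject within the first month, together with the reasons for the delay.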

Sanctions and Remedies

The GDPR provides for a harmonised regime of administrative fines (Article 83, GDPR), structured in two tiers: up to EUR10 million or 2% of the total worldwide annual turnover (Article 83(4)), and up to EUR20 million or 4% of worldwide annual turnover (Article 83(5)), whichever is higher. Fines must be effective, proportionate and dissuasive (Article 83(1)). In addition, supervisory authorities may impose non-pecuniary corrective measures (Article 58(2)). Member states may lay down additional penalties, including criminal sanctions, for infringements not subject to administrative fines or to supplement them (Article 84).
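Purely to illustrate the fine-cap arithmetic of Articles 83(4)–(5), the following Python sketch (a hypothetical helper, not an official calculation method) computes the applicable maximum for an undertaking as the higher of the fixed amount and the turnover percentage:

```python
# Sketch of the two-tier administrative fine caps in Article 83 GDPR.
# For undertakings, the cap is the higher of the fixed amount and the
# stated percentage of total worldwide annual turnover of the
# preceding financial year.

TIERS = {
    # tier label: (fixed cap in EUR, turnover percentage)
    "83(4)": (10_000_000, 0.02),  # lower tier: EUR10m / 2%
    "83(5)": (20_000_000, 0.04),  # upper tier: EUR20m / 4%
}


def maximum_fine(tier: str, worldwide_annual_turnover_eur: float) -> float:
    """Maximum administrative fine for an undertaking under the given
    tier: the higher of the fixed cap and the turnover-based cap."""
    fixed_cap, pct = TIERS[tier]
    return max(fixed_cap, pct * worldwide_annual_turnover_eur)


# A turnover of EUR2bn: 4% (EUR80m) exceeds the EUR20m fixed cap.
print(maximum_fine("83(5)", 2_000_000_000))  # 80000000.0
# A turnover of EUR100m: 2% (EUR2m) is below the EUR10m fixed cap.
print(maximum_fine("83(4)", 100_000_000))    # 10000000
```

This sketch only determines the ceiling; the actual fine within that ceiling is set by reference to the Article 83(2) criteria discussed below.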

Data subjects are entitled to compensation for material or non-material damage resulting from a GDPR infringement (Article 82). Controllers and processors may be held jointly and severally liable, subject to rights of recourse between them.

Criteria for Setting Penalties

Article 83(2), GDPR lists the factors to be taken into account when deciding whether to impose a fine and determining its amount. These include: the nature, gravity and duration of the infringement; the number of data subjects affected and the level of damage; whether the infringement was intentional or negligent; actions taken to mitigate damage; the degree of responsibility, taking into account technical and organisational measures; previous infringements; co-operation with the authority; categories of personal data involved; how the infringement became known; compliance with prior measures; adherence to approved codes of conduct or certification mechanisms; and any financial benefits gained or losses avoided.

Recent CJEU case law (eg, Cases C-683/21 and C-807/21, judgments of 5 December 2023) confirms that administrative fines require a culpable infringement (intentional or negligent conduct) attributable to the controller or processor.

Overall, EU enforcement combines harmonised substantive rules with national procedural frameworks. The system is designed to ensure consistent application across the Union, while preserving judicial oversight by national courts and, ultimately, the CJEU.

In the last 24 months, EU enforcement has centred on the accountability principle. National supervisory authorities and the CJEU increasingly require operational compliance frameworks, not merely formal documentation.

Accountability, Automated Decision-Making and Minimisation

Recent CJEU case law has clarified fault requirements for administrative fines in Deutsche Wohnen (C-807/21) and Nacionalinis visuomenės sveikatos centras (C-683/21), strictly interpreted the prohibition on automated decision-making in SCHUFA (C-634/21), and reinforced data minimisation and purpose limitation in Schrems v Meta (C-446/21). In the latter judgment of 4 October 2024, the court held that large-scale, indefinite processing for behavioural advertising cannot be justified simply because some data was made public. Together, these rulings confirm that accountability and privacy by design/default (Articles 24–25, GDPR) are concrete, documented governance obligations.

Practical takeaway

GDPR compliance must function as embedded, risk-based governance infrastructure, evidenced in practice.

International Data Transfers

After the CJEU invalidated Safe Harbor and Privacy Shield in Schrems (C-362/14) and Schrems II (C-311/18), the Commission adopted the EU–US Data Privacy Framework (Implementing Decision (EU) 2023/1795). In Latombe v Commission (T-553/23, 3 September 2025), the General Court upheld the adequacy decision, accepting that safeguards introduced by Executive Order 14086 and the Data Protection Review Court could ensure essentially equivalent protection.

The EDPB’s Guidelines 05/2021 (adopted 14 February 2023) clarified the cumulative criteria for a “transfer” under Chapter V, GDPR. Concurrently, CJEU case law has reaffirmed that supervisory authorities retain investigative powers even where an adequacy decision exists, consistent with Schrems II.

Practical takeaway

Organisations should map cross-border data flows, rely on adequacy decisions where available, conduct and document transfer impact assessments for non-adequate destinations, and periodically reassess transfers – even where DPF certification applies.

Security, Liability and Regulatory Convergence

In Natsionalna agentsia za prihodite (C-340/21), the CJEU confirmed that fear of misuse after a breach may constitute compensable non-material damage under Article 82 GDPR, increasing litigation exposure. In IAB Europe (C-604/22), the court adopted a broad approach to joint controllership in digital advertising, while in EDPS v SRB (C-413/23 P) it emphasised contextual identifiability in assessing pseudonymised data.

Parallel obligations under the Digital Services Act (Regulation (EU) 2022/2065) and the Data Act (Regulation (EU) 2023/2854, applicable from 12 September 2025) reinforce convergence.

Practical takeaway

Organisations should implement integrated compliance models aligning GDPR governance with platform, advertising and data-sharing obligations.

Overall, enforcement reflects a fundamental-rights-oriented, proportionate and evidence-based standard: demonstrable, risk-based compliance embedded in organisational decision-making.

Mass and Collective Data Privacy Actions

A defining trend is the growth of mass claims, particularly in jurisdictions with procedural mechanisms facilitating collective redress (eg, the Netherlands) and member states implementing the Representative Actions Directive (Directive (EU) 2020/1828). These mechanisms lower procedural barriers and increase strategic litigation risk for large-scale data processing operations.

Damages

Courts across Europe are seeing a rise in compensation claims under Article 82, GDPR, especially for non-material damage – one of the most debated issues in EU data protection law. A central question is whether “loss of control” over personal data is sufficient to establish damage. The CJEU clarified in UI v Österreichische Post AG (C-300/21, 4 May 2023) that mere infringement of the GDPR is not sufficient: claimants must demonstrate actual damage and a causal link, although no minimum seriousness threshold is required. This leaves national courts discretion in interpreting and quantifying non-material harm.

Security

Beyond classic post-breach litigation, claims increasingly scrutinise the adequacy of technical and organisational measures under Article 32, GDPR, as well as incident detection and notification practices under Articles 33-34. In Natsionalna agentsia za prihodite (C-340/21), the CJEU confirmed that the mere occurrence of a breach does not automatically establish non-compliance; courts must assess whether the security measures implemented were appropriate to the risk.

International Data Transfers

Cross-border data transfers remain a significant source of litigation and regulatory exposure. Organisations must navigate adequacy decisions, standard contractual clauses and politically contested frameworks such as the EU–US Data Privacy Framework, which continues to attract scrutiny from privacy activists and regulators. Transfer impact assessments and ongoing monitoring remain essential in practice.

Technology-Driven Disputes

A growing strand of litigation and regulatory action concerns technology-driven processing, particularly where AI, automated profiling and other high-risk algorithmic systems intersect with data protection rights. Regulatory investigations into AI chatbots generating deepfake or harmful content illustrate heightened scrutiny of system design, risk assessments and compliance with GDPR obligations, especially where sensitive data or children’s rights are implicated.

Disputes also arise from large-scale biometric and tracking practices, underscoring continued regulatory focus on facial recognition and mass surveillance technologies.

At EU level, in the past two years, the CJEU has delivered a series of judgments that have significantly structured privacy litigation in the EU, particularly in relation to Article 82, GDPR compensation claims, security obligations and the scope of data subject rights. Many of these decisions refine and consolidate an increasingly coherent line of case law governing civil liability and procedural standards in GDPR litigation.

  • Compensation under Article 82, GDPR – in Österreichische Post (C-300/21, 4 May 2023), the court clarified that compensation requires three cumulative elements: (i) an infringement of the GDPR, (ii) actual damage (material or non-material), and (iii) a causal link between the infringement and the damage. A mere infringement is insufficient to found liability. At the same time, member states may not impose a minimum seriousness threshold for non-material damage. The judgment firmly established Article 82 as a compensatory (not punitive) mechanism and structured the analytical framework now applied by national courts.
  • Security obligations and breach-related liability (Articles 24 and 32 GDPR) – in Natsionalna agentsia za prihodite (C-340/21, 14 December 2023), the court clarified that the mere occurrence of unauthorised disclosure or access does not automatically establish that the controller failed to implement “appropriate” technical and organisational measures under Articles 24 and 32, GDPR. The appropriateness of security measures must be assessed concretely and in light of the risk, taking into account the nature of the processing and the data involved. The CJEU thus rejected any irrebuttable presumption that a breach equals non-compliance. However, it acknowledged that erroneous disclosure by employees may indicate deficiencies in organisational measures if it reflects inadequate risk assessment or internal governance. The judgment also confirmed that fear of misuse may constitute non-material damage, provided that it is well founded and substantiated. This case therefore reinforces the risk-based logic of the GDPR while maintaining a fact-sensitive approach to liability.
  • Fear, loss of control and hypothetical risk (Article 82) – in MediaMarktSaturn (C-687/21, 25 January 2024), the CJEU further clarified the boundaries of non-material damage. It held that the concept of “non-material damage” may, in principle, encompass well-founded fear of future misuse and even temporary loss of control over personal data. However, a purely hypothetical risk is insufficient. Where it is established that an unauthorised third party did not actually become aware of the personal data, the mere fear of possible future dissemination does not, in itself, constitute compensable damage. The judgment reiterates that the data subject must demonstrate actual damage, however minimal. In this respect, MediaMarktSaturn does not depart from earlier case law but refines it by drawing a clearer distinction between abstract risk and substantiated harm.
  • Liability regime and burden of proof (Article 82(3)) – in juris GmbH (C-741/21, 11 April 2024), the CJEU clarified the operation of Article 82(3), confirming that the GDPR establishes a fault-based liability regime with a reversed burden of proof. A controller cannot avoid liability merely by arguing that the damage resulted from negligence of an employee acting under its authority. Since employees act under the controller’s authority within the meaning of Article 29, GDPR, the controller remains responsible unless it proves that it was not in any way responsible for the event giving rise to the damage. The exemption under Article 82(3) therefore applies only where the controller demonstrates the absence of a causal link.
  • Right of access and scope of the “copy” obligation (Article 15 GDPR) – in FT v DW (C-307/22, 26 October 2023), the CJEU strengthened the effectiveness of the right of access. It held that patients are entitled to obtain a first copy of their medical records free of charge, irrespective of the purpose of the request, including where the data is sought for potential litigation. National law cannot impose systematic fees for the first copy. The CJEU also clarified that the right to obtain a “copy” may require full reproduction of documents, including diagnoses and treatment details, where necessary to ensure intelligibility and effective exercise of rights.

Taken together, these decisions have not revolutionised EU privacy litigation but have consolidated a structured and increasingly predictable framework. This jurisprudence now provides national courts with a coherent template for adjudicating GDPR-based civil claims.

At EU level, collective redress in privacy and data protection matters is structured around Article 80, GDPR and the Representative Actions Directive (RAD) (Directive (EU) 2020/1828).

Article 80, GDPR allows not-for-profit organisations representing data subjects to bring complaints and judicial remedies on their behalf. Member states may also permit such organisations to act without an individual mandate. This mechanism has enabled strategic litigation by consumer and privacy associations, particularly in cases involving large-scale tracking, platform practices and data breaches.

The RAD, applicable since 25 June 2023, requires all member states to ensure the availability of representative actions aimed at protecting the collective interests of consumers. It applies to infringements of a broad list of EU legislation set out in Annex I, including data protection rules.

Only designated “qualified entities” may bring representative actions under the RAD. These must generally be non-profit organisations or public bodies pursuing consumer interests and satisfying independence and transparency requirements, including safeguards regarding third-party funding. For cross-border actions, designation criteria are harmonised and subject to mutual recognition across member states.

The RAD requires member states to provide for both injunctive measures and redress measures (including compensation), but it leaves significant procedural discretion at national level. In particular, member states may choose between opt-in or opt-out participation models, or adopt hybrid approaches, especially for redress actions. As a result, the structure and practical reach of collective compensation vary across jurisdictions.

Participation models (opt-in or opt-out) and the availability of collective compensation therefore differ across member states. In practice, injunctions remain more common and procedurally straightforward than collective damages. Recent developments centre on the domestic implementation of the RAD and the gradual emergence of case law applying these new mechanisms, with some jurisdictions (notably the Netherlands and Germany) becoming more active fora for data-related collective litigation.

At EU level, rules on non-personal data are shaped primarily by a set of “data economy” instruments designed to facilitate data circulation, access and market fairness, rather than solely to protect privacy.

The Free Flow of Non-Personal Data Regulation (Regulation (EU) 2018/1807) addresses barriers to the movement of non-personal data within the internal market by prohibiting member states from enforcing data localisation restrictions for non-personal data, allowing it to be stored or processed anywhere in the EU.

The Data Governance Act (Regulation (EU) 2022/868) complements this framework by creating mechanisms to encourage voluntary data sharing. It regulates data intermediation services, introduces a framework for data altruism organisations, and establishes conditions for the reuse of certain protected public sector data.

The Data Act (Regulation (EU) 2023/2854) goes further and provides a broad, cross-sector regime governing access to and use of data generated by connected products and related services, including in the IoT environment. It imposes obligations on data holders to make data available to users and, in defined circumstances, to third parties. It also introduces business-to-business data sharing obligations subject to fair, reasonable and non-discriminatory (FRAND) conditions, establishes rules to facilitate cloud switching and interoperability, and includes safeguards against unlawful access by third-country authorities.

Taken together, these instruments form a layered regulatory architecture in which non-personal data regulation, competition policy and data protection law operate in parallel and, where relevant, cumulatively.

The EU’s data economy instruments are designed to operate alongside, not instead of, the GDPR. They regulate access to and use of data, but do not alter the fundamental rules governing the processing of personal data.

Where datasets are purely non-personal, instruments such as the Free Flow of Non-Personal Data Regulation and the Data Act apply without triggering GDPR obligations. In the case of mixed datasets, the GDPR continues to govern the personal data component, while the data economy framework regulates access, sharing and portability at the level of the dataset.

The Data Act expressly states that it does not affect the application of Union data protection law. Accordingly, any data-sharing obligation must comply with GDPR requirements, including the existence of a lawful basis, respect for purpose limitation and data minimisation, and appropriate security safeguards.

Non-personal data may also be protected under trade secret law, intellectual property rules or contractual confidentiality. The Data Act seeks to balance broader access rights with the protection of commercially sensitive information through confidentiality safeguards.

In this way, the EU framework is layered: data economy legislation promotes access and re-use, while data protection law continues to constrain the processing of personal data.

Across the EU data economy framework, several recurring principles and obligations emerge, distributed across different instruments (Free Flow Regulation, Data Governance Act, Data Act, etc).

Common Structural Features

Although the relevant rules are spread across several instruments, they reflect a shared policy orientation. The EU data framework seeks to promote data mobility and access, reduce structural imbalances in data-driven markets, and prevent technical or contractual lock-in, while preserving confidentiality and legitimate commercial interests.

A first recurring principle is the free circulation of non-personal data within the internal market. The Free Flow Regulation prohibits unjustified data localisation requirements and reinforces the idea that non-personal data may be stored and processed anywhere in the Union.

A second core feature is the promotion of access to and re-use of data. The Data Act establishes access rights in defined contexts, particularly for data generated by connected products and related services, allowing users to obtain and, in certain cases, direct the sharing of such data with third parties. More broadly, EU legislation aims to ensure that data access is not unreasonably withheld where it is economically and socially valuable.

Fairness and non-discrimination are also central themes. In business-to-business settings, data sharing may be subject to fair, reasonable and non-discriminatory (FRAND) conditions. Switching and interoperability obligations for data processing services aim to prevent lock-in and enhance market contestability.

Finally, all instruments recognise the need to safeguard confidential business information. Trade secrets and commercially sensitive data remain protected, and access obligations must be implemented with appropriate confidentiality safeguards.

Instrument-Specific Rights and Duties

While these principles are common across the framework, certain instruments introduce more specific obligations.

For instance, the Data Act imposes duties on “data holders” to make product-generated data accessible to users and, in defined circumstances, to third parties. It also introduces interoperability and switching obligations for data processing service providers.

Similarly, the Data Governance Act regulates data intermediation services and data altruism organisations, requiring neutrality, transparency and organisational safeguards.

Organisational Compliance Considerations

For organisations, compliance begins with identifying their role under the relevant instruments (data holder, user, intermediary or cloud service provider) and mapping relevant data flows, particularly in IoT environments.

Contractual arrangements must be reviewed to ensure alignment with access and FRAND standards, and technical systems assessed for interoperability and switching readiness. Internal processes should safeguard trade secrets and commercially sensitive information.

Where personal data is involved, GDPR obligations apply in parallel.

Under the Data Governance Act and the Data Act, member states are required to designate one or more competent authorities responsible for supervision and enforcement. The institutional choice is left to national law, and approaches differ across the EU.

Some member states have opted to entrust enforcement, at least in part, to their data protection authority, particularly where Data Act obligations overlap with personal data processing (eg, France, Spain). Others have designated digital, communications or competition regulators as lead authorities, reflecting the market-regulatory dimension of the framework (eg, Germany, Netherlands).

This divergence reflects the hybrid nature of EU data economy legislation, which combines elements of data governance, digital market regulation and, in certain cases, competition oversight. Where personal data is involved, co-ordination with GDPR supervisory authorities is necessary. In parallel, disputes involving dominant platforms or data access conditions may also fall within the remit of competition authorities.

Overall, enforcement remains decentralised but increasingly requires co-operation across regulatory domains, mirroring the integrated structure of the EU’s digital regulatory strategy.

Online tracking technologies (including cookies, SDKs, pixels and similar device identifiers) are governed primarily by Article 5(3) of the ePrivacy Directive (Directive 2002/58/EC), as implemented in national law, in conjunction with the GDPR.

Article 5(3) establishes a general opt-in model: storing or accessing information on a user’s device requires prior informed consent, unless the technology is strictly necessary to provide a service explicitly requested by the user. This rule applies regardless of whether the data accessed is personal data. Where personal data is subsequently processed, the GDPR applies in parallel.

Although the core consent requirement is harmonised, implementation and enforcement vary across member states. In practice, most member states require granular, prior consent for analytics and advertising cookies, typically via consent management platforms. Legitimate interests cannot substitute for consent at the device-access stage under Article 5(3), even if they may be relied upon for subsequent processing under the GDPR in limited contexts.

Personalised and targeted advertising in the EU is regulated primarily under the GDPR and the ePrivacy Directive, supplemented, in the case of large online platforms, by the Digital Services Act (DSA).

Under the GDPR, personalised advertising must rely on a valid lawful basis under Article 6 and comply with transparency, purpose limitation and data minimisation principles. In practice, consent is frequently relied upon in online advertising environments, particularly where advertising is based on user-level data. The use of special categories of data for marketing purposes is generally prohibited unless explicit consent is obtained.

The DSA introduces additional constraints for online platforms, notably prohibiting targeted advertising based on profiling using sensitive data and restricting targeted advertising directed at minors based on profiling.

Rules governing unsolicited electronic marketing (such as email or SMS campaigns) derive from the ePrivacy Directive as implemented in national law. While most member states apply an opt-in model for business-to-consumer (B2C) communications, the treatment of business-to-business (B2B) marketing varies. Some jurisdictions extend consent requirements to communications addressed to corporate contacts, whereas others allow opt-out systems for professional communications, subject to national conditions.

Accordingly, although the core data protection standards applicable to personalised advertising are harmonised, practical compliance in online marketing remains partly dependent on national implementation choices.

In the employment context, the GDPR applies as a baseline, but Article 88 permits member states to adopt more specific workplace rules. As a result, data protection obligations are often complemented by national labour laws, works council rights or collective agreements.

Employee monitoring, whether through time-recording systems, IT usage controls or CCTV, must be justified, proportionate and transparent. Employers commonly rely on legitimate interests, but this requires a careful balancing exercise given the imbalance of power in employment relationships. Consent is generally not considered freely given in this setting.

Remote work and bring-your-own-device (BYOD) arrangements require appropriate technical and organisational measures, particularly to ensure data security and a clear separation between professional and private information.

In recruitment, processing must be limited to what is necessary for the role. Special categories of data require a specific legal basis, and information relating to unsuccessful applicants should not be retained longer than necessary.

In many member states, national labour law imposes additional safeguards, especially regarding workplace surveillance.

During due diligence in an M&A transaction, any personal data disclosed to potential buyers must be limited to what is necessary and proportionate. In practice, secure, access-restricted virtual data rooms (VDRs) are used, with encryption, logging and tiered access controls. Sensitive information should be anonymised or redacted where feasible. Disclosure must rely on a lawful basis, typically legitimate interests, and be supported by appropriate confidentiality arrangements, including NDAs and documented access restrictions.

In asset deals, the transfer of personal data to the purchaser requires a lawful basis and must remain compatible with the original purposes of processing. Where a new controller is introduced, transparency obligations under Articles 13 and 14, GDPR may apply. In share deals, although the legal entity remains unchanged, changes in processing practices or governance structures may trigger updated transparency or internal compliance measures.

Cross-border transactions must comply with the international transfer regime under Chapter V GDPR.

Following closing, integration requires alignment of privacy notices, retention schedules and technical and organisational measures. Data must be securely transferred, retained or deleted in accordance with the agreed transaction structure and GDPR principles.

Cross-border transfers of personal data from the EU are governed by Chapter V of the GDPR.

A “transfer” occurs where personal data is disclosed or made accessible to a controller or processor in a third country (ie, outside the EU) or to an international organisation. This includes remote access from outside the EU.

Personal data may be transferred to a third country only if Chapter V conditions are met. The primary route is an adequacy decision under Article 45, GDPR. Where the European Commission has determined that a third country ensures an essentially equivalent level of protection, personal data may flow to that country without additional transfer authorisation. In the absence of adequacy, transfers must rely on appropriate safeguards under Article 46, GDPR, most commonly standard contractual clauses or binding corporate rules. Exporters are required to assess whether the legal framework of the recipient country allows the safeguards to be effective in practice.

Derogations under Article 49, GDPR are available only for specific and occasional situations and cannot be used for systematic or large-scale transfers.

Regarding non-personal data, EU law does not impose a comparable transfer regime. However, the Data Act introduces safeguards aimed at preventing unlawful third-country governmental access to data held by EU data processing service providers.

Under the GDPR, international transfers of personal data do not generally require prior registration, notification or approval by a supervisory authority, provided that a recognised transfer mechanism under Chapter V is used.

Sector-specific frameworks (eg, in financial services, telecommunications or export-controlled industries) may impose separate notification or approval requirements under national or EU law.

EU law promotes the free movement of data within the Union. The Free Flow Regulation prohibits member states from imposing unjustified localisation requirements for non-personal data. For personal data, the GDPR does not impose localisation requirements but restricts transfers to third countries unless Chapter V conditions are met. Remote access from a third country is typically considered a transfer and must comply with GDPR transfer mechanisms.

Sector-specific rules may impose localisation or residency requirements in limited contexts (eg, for certain public sector, health or financial data), but these are exceptions rather than the rule at EU level.

EU law contains rules that may restrict compliance with certain foreign disclosure or discovery orders.

Under Article 48, GDPR, judgments or administrative decisions from third-country authorities requiring the transfer or disclosure of personal data are enforceable in the EU only if based on an international agreement, such as a mutual legal assistance treaty (MLAT). In other words, foreign court orders cannot, by themselves, justify a transfer of personal data from the EU.

Where a foreign authority requests access to personal data, the disclosure must still comply with the applicable transfer mechanism (eg, adequacy or appropriate safeguards).

Recent developments at EU level regarding the regulation of international transfers of personal data continue to be shaped by the aftermath of the judgment in Schrems II (C-311/18). The EU–US Data Privacy Framework (DPF) was adopted on 10 July 2023 to restore a formal transatlantic data transfer mechanism by addressing privacy concerns related to US surveillance and redress options.

Although debate persists as to whether the DPF fully achieves its intended level of protection, the General Court, in its judgment of 3 September 2025, Latombe v Commission (T-553/23), upheld the DPF, accepting that safeguards introduced by US Executive Order 14086 (7 October 2022) and the Data Protection Review Court could ensure a level of protection essentially equivalent to that guaranteed by EU law.

In May 2025, the Irish Data Protection Commission (DPC) imposed a substantial EUR530 million fine on TikTok Technology Limited (“TikTok”) for failing to ensure equivalent protection for personal data transferred to China. The DPC found that, despite conducting a transfer impact assessment (TIA), TikTok had not sufficiently assessed Chinese laws and practices affecting the data and therefore failed to demonstrate “essential equivalence” under the GDPR. This decision raises doubts about the sufficiency of standard contractual clauses (SCCs) for transfers to certain third countries and highlights the practical challenges of compliance where risk assessments depend on third-country legal frameworks rather than solely on the physical location of data.

The CJEU and the EDPB have also provided important clarifications on international transfer rules, including the requirement to conduct transfer impact assessments to determine whether supplementary measures are needed in addition to SCCs (see EDPB Guidelines 05/2021 and Recommendations 01/2020; Schrems I (C-362/14); Schrems II (C-311/18); Ministerstvo zdravotnictví (C-710/23); and Bindl v Commission (T-354/22)). Together, this case law and regulatory guidance consolidate a framework that requires exporters to assess third-country laws carefully and implement effective supplementary safeguards where necessary.

Looking ahead, proposed amendments under the Digital Omnibus Regulation include a more contextual definition of personal data, under which information would not qualify as personal data for a given entity if that entity cannot reasonably identify the data subject. This could reduce the scope of GDPR obligations, including those related to international transfers, by excluding certain data from classification as personal data. In practice, this may ease burdens on certain cross-border data routing arrangements by limiting the application of Chapter V, GDPR.

In EDPS v SRB (C-413/23 P), the CJEU confirmed that whether information constitutes personal data depends on the circumstances, particularly on who holds the additional information necessary for re-identification and whether re-identification is reasonably likely in relation to a particular recipient. This confirms a contextual (or relational) understanding of personal data: information may constitute personal data in the hands of one actor, yet not for another recipient lacking reasonably available means of re-identification.

Gerrish Legal

15 rue de Surène
75008
Paris
France

Kammakargatan 47
11124
Stockholm
Sweden

+33 6 74 02 45 07

info@gerrishlegal.com

www.gerrishlegal.com
