Data Protection & Privacy 2026

Last Updated March 10, 2026

France

Law and Practice

Authors

Nomos is an independent business law firm bringing together approximately 40 lawyers and legal professionals. The firm advises and represents clients in both advisory and litigation matters, in France and internationally. In an evolving legal and regulatory environment, Nomos combines precision, strategic vision and responsiveness. Its practice is grounded in a detailed understanding of sector-specific economic and financial challenges, enabling the firm to support innovative and complex business models. It has particular strength in technology, data protection, IP, entertainment and regulated sectors, positioning the firm at the forefront of digital and market transformations. Nomos adopts a comprehensive, 360-degree approach, integrating sharp sector knowledge, legal creativity and a results-oriented mindset. Its teams are committed, agile and focused on building trust-based, long-term relationships. The firm’s main areas of practice include technology and data protection; media and entertainment; IP; competition and consumer law; corporate/M&A; tax; employment; healthcare and life sciences; business litigation; and non-profit and professional organisations law.

General Legal Framework

France’s data protection and privacy landscape is built upon a two-tier regulatory structure combining European Union law with national legislation. The cornerstone of this framework is the General Data Protection Regulation (Regulation (EU) 2016/679) (GDPR), which has been directly applicable since 25 May 2018 and serves as the primary legal instrument establishing harmonised standards for data protection across all EU member states, including France. Under French domestic law, the main regulatory instrument for the protection of personal data is the French Data Protection Act (FDPA) – known as the “Loi informatique et libertés” – which was originally enacted on 6 January 1978 (Law No 78-17) and has been amended several times since.

The interaction between the EU texts and the FDPA is hierarchical but complementary: the GDPR sets the harmonised standard, while the FDPA:

  • fills gaps left by the GDPR – for example, the FDPA contains rules on the fate of personal data after the death of the data subjects, which are expressly excluded from the GDPR’s scope;
  • introduces specific requirements and protection pursuant to the GDPR’s opening clauses – for example, the FDPA defines a specific regulatory framework to process health data in the context of research projects;
  • transposes the EU Law Enforcement Directive (2016/680), as well as the cookie provisions of the EU ePrivacy Directive (2002/58), into French law; and
  • delineates the powers of the national regulator, the National Commission for Information Technology and Civil Liberties (Commission Nationale de l'Informatique et des Libertés – CNIL).

Sectoral Legal Instruments

Sectoral instruments also play an important role. The Public Health Code (Code de la santé publique) contains particular provisions governing the processing of health data, including requirements for hosting certification (“HDS certification”). The Post and Electronic Communications Code (Code des postes et des communications électroniques – CPCE) governs electronic marketing and the confidentiality of correspondence, transposing the ePrivacy Directive. Furthermore, the Labour Code (Code du travail) regulates employee monitoring, requiring proportionality and prior consultation with employee representative bodies.

Extraterritorial Reach

Both the GDPR and the FDPA have significant extraterritorial reach. Under GDPR Article 3, the regulation applies to organisations established in the EU, regardless of where data processing occurs, and to organisations established outside the EU that offer goods or services to data subjects in the EU or monitor their behaviour. This territorial principle is reproduced in the FDPA. As regards the national rules adopted under the GDPR’s opening clauses specifically, the FDPA provides that the French rules apply as long as the data subject resides in France, including when the data controller is not established in France.

Interplay with Non-Personal Data, Cybersecurity, and AI Regulation

France’s regulatory framework increasingly integrates data protection with rules on non-personal data, cybersecurity and AI.

  • The Data Act (Regulation (EU) 2023/2854), applicable from 12 September 2025, governs data access and sharing – especially for IoT data – and overlaps with the GDPR where personal data is involved; France’s 2024 SREN Law (Loi visant à sécuriser et réguler l’espace numérique) complements this by imposing interoperability and portability duties on cloud providers.
  • The French cybersecurity agency, ANSSI (Agence nationale de la sécurité des systèmes d'information), enforces NIS and NIS2 obligations, requiring incident notifications that may also trigger GDPR breach reporting to the CNIL.
  • The AI Act (Regulation (EU) 2024/1689), phased in until August 2027, applies cumulatively with the GDPR when AI systems process personal data, requiring organisations to comply simultaneously with both AI-specific and data protection requirements.

General Principles for Personal Data Processing

The processing of personal data in France is governed by foundational principles established in the GDPR and reflected in the FDPA. These principles constitute the normative framework within which all data processing must occur.

  • Lawfulness, fairness and transparency: All processing must be lawful, fair and transparent. Lawfulness requires a valid legal basis under GDPR Article 6, which includes consent, contract performance, legal obligation, vital interests, public task or legitimate interests. Fairness requires that processing be carried out honestly and without deception. Transparency mandates providing clear information to data subjects regarding the main aspects of processing.
  • Purpose limitation: Personal data may be processed only for specified, explicit and legitimate purposes. Any use beyond the original purpose requires a compatibility assessment or a new legal basis.
  • Data minimisation: Only personal data that is adequate, relevant, and strictly necessary for the specified purposes may be collected and processed.
  • Accuracy: Personal data must be accurate and kept up to date, with reasonable measures taken to rectify or delete inaccurate data.
  • Storage limitation: Personal data must be kept only as long as necessary for the processing purpose.
  • Security: Personal data must be processed securely and protected against unauthorised access, alteration, loss or destruction through appropriate technical and organisational measures.
  • Accountability: Data controllers must demonstrate compliance with all principles through documentation, records, impact assessments, and governance structures.

Data Subject Rights

The GDPR grants individuals a broad set of enforceable rights over their personal data. These include the right to access and obtain information about how their data is processed, to rectify inaccurate data, and, where legally justified, to request erasure. Data subjects may also restrict processing, receive their data in a portable and machine-readable format, and object to certain types of processing, including marketing or processing based on legitimate interests or public tasks. In addition, they are protected against purely automated decision-making, with the right to seek explanations and human review, and they may lodge complaints with a supervisory authority free of charge.

The FDPA adds a specific data subject right, linked to digital death: Article 85 of the FDPA allows individuals to define general or specific directives concerning the storage, erasure and communication of their personal data after their death.

Organisational Compliance: Essential Checklist

  • Apply privacy by design/default: Embed data protection principles from the design phase of products or services.
  • Establish a legal basis: Identify and document a valid legal basis (GDPR Article 6) for each processing purpose and reassess when purposes or conditions change.
  • Define retention and deletion rules: Establish structured retention periods and ensure systematic, preferably automated deletion backed by documented reviews.
  • Ensure adequate security measures: Implement technical and organisational safeguards appropriate to processing risks, including encryption, access controls, testing, and incident response plans.
  • Provide transparent information: Supply complete privacy information at collection (at first contact or within one month when collection is indirect), covering all mandatory items listed under Article 13 (direct collection) or 14 (indirect collection) of the GDPR.
  • Enable data subject rights: Set up practical processes to exercise rights easily, ensuring effective access, objection, deletion, restriction, etc.
  • Manage third-party processing: Contractually define data protection obligations when working with external processors or controllers.
  • Conduct data protection impact assessments (DPIAs): For high-risk processing, perform DPIAs outlining risks and mitigation measures.
  • Appoint a data protection officer (DPO): Organisations that do extensive or sensitive data processing must designate an independent, qualified DPO who reports directly to senior management.
  • Document compliance.

Prohibition on Processing Special Categories of Personal Data

Under Article 9 GDPR, processing of special categories of personal data (including data revealing racial or ethnic origin, political opinions, religious beliefs, trade union membership, and genetic, biometric, or health data) is prohibited unless a specific derogation applies.

Derogations include, among other things:

  • obtaining the explicit consent of the data subject to the processing of such data;
  • specific obligations and rights in the fields of employment, social security, or social protection law;
  • protection of the vital interests of the data subject when they are physically or legally incapable of giving consent;
  • preventative or occupational medicine, medical diagnosis, and the management of health or social care systems and services – generally under the responsibility of a professional subject to confidentiality obligations;
  • public interest in the field of public health, eg, monitoring epidemics; and
  • archiving in the public interest, for scientific or historical research purposes, or statistical purposes, subject to appropriate safeguards.

Specific hosting requirement for health data

Where health data is hosted for care-related activities, French law requires use of a certified host (“Health Data Hosting Provider” – Article L 1111‑8 of the Public Health Code), with contractual constraints.

Processing data related to criminal convictions

Personal data related to criminal convictions is not among the special categories where processing is prohibited by principle under Article 9 of the GDPR. However, it is also subject to specific rules. Under Article 46 of the FDPA (implementing Article 10 GDPR), processing data regarding criminal convictions is largely a state monopoly, reserved for courts and public authorities. Private entities are strictly limited in their ability to process such data. Employers, for example, cannot conduct blanket criminal background checks; they may only request a criminal record extract (Bulletin No 3) if it is strictly necessary for the job and authorised by law (eg, for security personnel or employees working with minors).

Protecting minors’ data

Under the GDPR as implemented in France, the age of digital consent for the processing of personal data is 15, meaning that children under 15 cannot validly consent to the processing of their data by online services without parental authorisation. Organisations must implement proportionate age verification or parental consent mechanisms and ensure that data collection remains strictly necessary. More broadly, the protection of minors is a crucial and highly topical issue in France. At the time of writing, parliamentary debates are underway regarding a proposed ban on social media for those under the age of 15.

Scientific Research (Including Health Research, Studies and Evaluations)

The GDPR recognises a research-oriented regime: further processing for scientific research is, subject to Article 89 safeguards, not considered incompatible with the initial purpose (Article 5(1)(b)), and member state law may allow limited derogations from certain data-subject rights where necessary and proportionate.

To process health data (see 1.3 Special Categories of Personal Data), controllers commonly rely on the scientific research ground (Article 9(2)(j)) coupled with Article 89 safeguards.

France adds a specific gatekeeping layer for health-sector “research/study/evaluation” processing: such processing must pursue a public-interest purpose and either comply with a CNIL reference methodology (in which case, it can proceed after filing a simple declaration of compliance) or have obtained specific authorisation from the CNIL.

The CNIL reference methodologies include MR‑001 (scientific research requiring patients’ consent), MR‑003 (certain research without patient’s consent but with information and a right to object) and MR‑004 (secondary-use studies/research not involving the person).

Product and Service Development

If R&D is primarily product improvement, commercial analytics or model training, it may not qualify as “scientific research” and should be assessed under ordinary GDPR rules: a valid Article 6 legal basis plus, if health data or other special categories of data are processed, an Article 9(2) condition (often explicit consent). Re-use of consent-based data beyond the original scope generally requires new consent or a new legal basis.

Anonymisation as a Data Protection Safeguard

The legal provisions outlined above governing scientific research and product or service development apply to the processing of “personal” data. Where such data is anonymised in accordance with established principles and technical requirements, ensuring no reasonable risk of re-identification (notably in accordance with WP29 Opinion 05/2014), it ceases to constitute personal data. As a result, it falls outside the scope of the GDPR and may be used freely for research or development purposes.

True anonymisation is therefore a key privacy protection mechanism and is encouraged in all cases. Moreover, when identifiable data is not strictly necessary for the intended processing, anonymisation is required to comply with the GDPR’s data minimisation principle.

Anonymisation must not be confused with pseudonymisation. Pseudonymisation replaces direct identifiers (eg, name, surname) with a code (eg, a number), but the data remains personal as long as a re-identification key exists. A recent CJEU ruling held that, in certain situations, pseudonymised data may be considered anonymous for a third party that cannot reasonably access the key or other re-identifying information.

The European Health Data Space (EHDS)

Regulation (EU) 2025/327 establishing the European Health Data Space (EHDS) will create a harmonised EU framework for accessing and re‑using electronic health data. It will apply from 26 March 2027, with the core secondary‑use regime becoming operational on 26 March 2029. For life sciences companies, the EHDS establishes a structured EU‑wide pathway for secondary use of electronic health data through data permits issued by national Health Data Access Bodies (HDABs) or health data requests, including for scientific research and related development or innovation activities (such as algorithm training and evaluation).

Access must occur exclusively within a secure processing environment with strict technical and organisational controls, including restricted access and mandatory audit logs. Only non‑personal, anonymised outputs may be exported. Certain purposes – such as advertising or marketing – and any form of re‑identification are expressly prohibited. Results of secondary use must be published in anonymised form within 18 months. Individuals may opt out of secondary use of their personal electronic health data; where identifiable, their data may not be used for new permits or newly approved requests, subject to limited exceptions under national law.

Enforcement risks are significant: fines may reach EUR10 million or 2% of global turnover, going up to EUR20 million or 4% of global turnover for serious infringements (including prohibited uses, extraction of personal data from secure environments, and re‑identification). The EHDS applies alongside the GDPR, meaning that parallel GDPR liability may arise.

Applicability of Data Protection Laws to AI

In France, the personal data regulatory framework, comprising the EU GDPR and the FDPA, fully applies to the processing of personal data through AI systems. The GDPR already includes rules that are particularly relevant for AI systems, such as the prohibition – subject to certain exceptions – of fully automated decisions that produce legal effects or significantly affect the individuals concerned (Article 22 of the GDPR).

CNIL’s AI Guidance

The CNIL has positioned itself as a key AI regulator by issuing guidance in 2024–2025 on how the GDPR applies to AI systems. Its recommendations address core issues such as choosing a lawful basis, complying with data minimisation and transparency principles, conducting DPIAs, managing data subject rights, securing AI development, mitigating bias, and ensuring proper documentation and accountability, thereby promoting a risk-based and integrated compliance approach.

Overlay of the EU AI Act

The AI Act (Regulation (EU) 2024/1689) establishes a risk-based compliance framework distinct from, but complementary to, EU data protection law. It introduces prohibitions, transparency duties, and extensive obligations for high-risk AI systems and certain general-purpose models. Application is being phased in between 2025 and 2027 (prohibitions from February 2025; general-purpose model obligations from August 2025). It does not affect the GDPR (Article 2(7)).

  • Transparency: Providers of high-risk systems must prepare technical documentation (Article 11) and clear instructions for use enabling risk assessment and mitigation (Article 13), including, where relevant, a DPIA (Article 26(9)). Specific information duties apply towards individuals, eg, disclosure of AI interaction or synthetic/deepfake content (Articles 50, 26(11)).
  • Data governance: High-risk system providers must implement documented governance for training, validation, and testing data, ensuring relevance, representativeness, quality, traceability, and bias mitigation. Limited processing of special categories of data is permitted strictly for bias monitoring under enhanced safeguards (Article 10).
  • Human oversight: High-risk systems must enable effective human control (interpretation, override, interruption, disregard of outputs) (Article 14). Deployers must assign oversight to competent persons with appropriate authority (Article 26(2)), reinforcing fundamental-rights safeguards, including data protection.

Data Breach Notification

Under Article 33 of the GDPR, data controllers must notify the competent data protection authority of any personal data breach no later than 72 hours after becoming aware of it, unless the breach is unlikely to result in a risk to the rights and freedoms of natural persons. In France, this notification is typically made via the CNIL’s dedicated online portal. Furthermore, if the breach is likely to result in a high risk to the rights and freedoms of natural persons, Article 34 of the GDPR requires controllers to communicate the personal data breach to the data subjects “without undue delay”.
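
As a purely illustrative sketch of the Article 33 timeline (the 72-hour clock runs from the controller's awareness of the breach, not from the incident itself; the function names are invented for illustration):

```python
from datetime import datetime, timedelta, timezone

# Article 33(1) GDPR: notification to the supervisory authority
# "not later than 72 hours after having become aware" of the breach.
GDPR_NOTIFICATION_WINDOW = timedelta(hours=72)

def cnil_notification_deadline(awareness: datetime) -> datetime:
    """Deadline for notifying the CNIL: 72 hours after the controller
    becomes aware of the breach (not after the breach occurred)."""
    return awareness + GDPR_NOTIFICATION_WINDOW

def is_notification_late(awareness: datetime, notified_at: datetime) -> bool:
    """A notification made after the deadline must be accompanied by
    reasons for the delay (Article 33(1), second sentence)."""
    return notified_at > cnil_notification_deadline(awareness)

# Example: awareness on 2 March 2026 at 09:30 UTC.
aware = datetime(2026, 3, 2, 9, 30, tzinfo=timezone.utc)
deadline = cnil_notification_deadline(aware)
print(deadline.isoformat())  # 2026-03-05T09:30:00+00:00
print(is_notification_late(aware, deadline + timedelta(hours=1)))  # True
```

Note that an initial notification within the window may be supplemented in phases where full details are not yet available; the sketch only captures the outer deadline.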

Cross-Reporting

In France, the notification landscape in case of a security incident is fragmented. For example, where an entity is subject to the NIS framework, it must separately notify ANSSI of significant cyber-incidents. For financial entities, the EU Digital Operational Resilience Act (DORA) requires the notification of major ICT-related incidents to the “competent authority”. In France, depending on the type of financial entity concerned, this will generally be the Autorité de contrôle prudentiel et de résolution (ACPR) and/or the Autorité des marchés financiers (AMF). At the time of writing, the European Commission has proposed, as part of a digital regulatory simplification initiative (often referred to as the “Digital Omnibus”), the creation of a single reporting entry point intended to streamline notifications under several EU digital frameworks (including NIS2, the GDPR and DORA). This remains a legislative proposal and is not yet in force.

Action Items for Organisations

  • Immediate assessment: Upon detection, the organisation must assess the nature of the breach, the categories of data (eg, sensitive v non-sensitive), the volume of data subjects affected, and the likely consequences. The goal is to assess the level of risk for data subjects (and thus what notification obligations apply) and determine the remediation measures that must be implemented to contain the incident and avoid its repetition.
  • Notify the CNIL (unless risks for data subjects are unlikely): Submit breach notification to the CNIL within 72 hours via the online portal, with sufficient specificity.
  • Notification to data subjects: If the breach poses a “high risk” to individuals (eg, risk of identity theft, fraud or discrimination), the controller must communicate the breach to the data subjects without undue delay, providing clear advice on how to protect themselves (Article 34 GDPR).
  • Additional actions: Controllers will often also consider filing a criminal complaint and notifying their cyber-insurance provider (typically also within 72 hours).
  • Documentation: All breaches, including those not notified to the CNIL, must be recorded in an internal breach register, detailing the facts, effects and remedial actions taken. This register is often one of the first documents requested during a CNIL inspection.

Recent enforcement actions, such as the EUR5 million fine against France Travail in 2024 and the EUR45 million fine against the internet access provider FREE in 2025 following massive data leaks, underscore the CNIL’s intolerance for failures in basic security hygiene and breach response.

Regulatory Authorities

France employs a multi-regulator model for the digital space, requiring organisations to navigate overlapping jurisdictions.

French Data Protection Authority

The CNIL acts as the French guardian of data privacy. Its missions and powers include:

  • Standards and guidelines – it develops frameworks for sectors like AI, health data, and cookies. While technically “advisory”, these are treated as authoritative by the courts.
  • Public awareness – it provides educational resources and tools (templates, checklists) for the public and small businesses.
  • Investigative procedures – investigations are triggered either by individual complaints (approximately 4,000/year) or self-initiated audits by the CNIL.
  • Enforcement powers – the CNIL has broad legal authority to ensure compliance, including by issuing administrative fines (up to EUR20 million or 4% of the worldwide annual turnover, whichever is higher), public reprimands or injunctions, or revoking certifications.

The CNIL operates within a network of regulators:

  • Domestic – it notably co-ordinates with ANSSI (cybersecurity), the ACPR (finance), and the ARCEP (telecoms).
  • International – as a member of the European Data Protection Board (EDPB), it works with other EU authorities to ensure consistent application of the GDPR across borders.

Sectoral Authorities

  • ANSSI (the National Cybersecurity Agency of France) is the authority dealing with cyber-defence and resilience. It enforces the NIS Directive and certifies security products (eg, SecNumCloud). ANSSI focuses on technical standards and critical infrastructure protection.
  • ARCEP (Autorité de Régulation des Communications Électroniques, des Postes et de la Distribution de la Presse) has been empowered by the SREN Law and the Data Governance Act to regulate data intermediation services and the cloud computing market (specifically regarding switching and interoperability).
  • ARCOM (Autorité de régulation de la communication audiovisuelle et numérique) acts as the digital services co-ordinator for France under the Digital Services Act (DSA) and oversees certain aspects of platform regulation, particularly regarding the protection of minors online.
  • The DGCCRF (Direction générale de la concurrence, de la consommation et de la répression des fraudes) is the Directorate General for Competition, Consumer Affairs and Fraud Control, which focuses on consumer protection aspects of digital platforms, including dark patterns and transparency in online marketplaces.
  • The ACPR (Autorité de contrôle prudentiel et de résolution) is the French Prudential Supervision and Resolution Authority, which supervises financial institutions and addresses data protection and cybersecurity standards alongside financial regulation.
  • The ANSM (L'Agence nationale de sécurité du médicament et des produits de santé) is the National Agency for the Safety of Medicines and Health Products. It oversees medical device manufacturers’ data protection compliance, particularly concerning patient data processing and device security.

These authorities notably co-ordinate through a formal national network established by the SREN Law to ensure consistent application of the DSA.

Investigation Initiation and Procedures

The CNIL generally initiates investigations based on complaints, external reports, data breach notifications, or on its own initiative when it identifies a sensitive risk area or sector. It may investigate any controller or processor through unannounced on-site inspections, document-based reviews, hearings, or online checks of websites and services.

During these investigations, the CNIL collects documents, interviews staff and examines systems to assess compliance with the GDPR and the FDPA. An investigation can close with no further action, or the CNIL’s president may issue a formal notice (mise en demeure) giving the organisation a deadline within which to comply. This notice is not a sanction but a warning phase. If breaches are serious, persistent or unresolved, the president may – whether or not a prior notice was issued – refer the case to the CNIL’s sanctions committee (formation restreinte), which is responsible for imposing sanctions.

Once referred, a formal sanctions procedure begins: the president appoints a rapporteur, who exchanges written submissions with the organisation and may hold a hearing. The case is then transmitted to the sanctions committee for an adversarial decision. The CNIL has no power to settle cases.

Sanctions imposed by the panel are immediately enforceable, and fines are paid to the French Treasury (meaning they do not finance the CNIL). Decisions can be appealed before the Conseil d’État.

For less serious cases, a simplified procedure allows the president of the sanctions committee or a designated member to decide alone, without a public hearing, with lower maximum fines (up to EUR20,000) and limited publicity – enabling the CNIL to handle a higher volume of straightforward infringements.

Available Sanctions and Remedies

Administrative sanctions

  • Warnings: Formal notice of non-compliance without financial penalties, typically for minor violations or first-time offences where organisations demonstrate rapid remediation.
  • Injunctions to cease processing: Orders requiring organisations to halt specific data processing within defined timeframes.
  • Interim injunctions with daily penalties: Injunctions compelling specific corrective actions (to implement data protection policies, delete unlawful data or implement security measures) with daily penalty assessments (up to EUR100,000 per day) for non-compliance.
  • Administrative fines: Monetary penalties up to EUR20 million or 4% of the annual worldwide turnover (whichever is higher) pursuant to GDPR Article 83.
  • Certification revocation: Where organisations hold CNIL certifications, the CNIL may revoke these, effectively prohibiting certification-dependent activities.
  • Public disclosure: CNIL decisions, particularly those imposing significant fines, are typically published (sometimes, but seldom, in anonymised form).
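
The Article 83 fine cap is the higher of the fixed amount and the turnover-based figure, so for large groups the 4% limb governs. A minimal arithmetic sketch (the function name is invented for illustration):

```python
def gdpr_fine_cap(worldwide_annual_turnover_eur: float) -> float:
    """Upper bound for the most serious infringements under Article 83(5)
    GDPR: EUR20 million or 4% of total worldwide annual turnover of the
    preceding financial year, whichever is higher."""
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

# A firm with EUR100 million turnover stays at the EUR20 million floor
# (4% would only be EUR4 million)...
print(gdpr_fine_cap(100_000_000))    # 20000000.0
# ...while a EUR2 billion group faces a 4% cap of EUR80 million.
print(gdpr_fine_cap(2_000_000_000))  # 80000000.0
```

The actual amount within that cap is set using the Article 83(2) criteria discussed later in this chapter; the cap is only the ceiling.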

Civil remedies

Beyond CNIL administrative sanctions, data subjects may pursue civil damages before the French judicial courts (pursuant to GDPR Article 82).

Criminal sanctions

Serious data protection violations may trigger criminal sanctions under the French Criminal Code, which criminalises, among other things:

  • failure to implement security measures (Article 226-17);
  • the processing of special categories of personal data without legal basis (Article 226-19); or
  • failure to notify personal data violations (Article 226-17-1).

Criminal penalties include fines up to EUR300,000; imprisonment for up to five years; and publication of conviction decisions. However, criminal prosecution is very rare in practice. The CNIL refers cases to prosecuting authorities where criminal elements appear evident, but prosecutors exercise prosecutorial discretion in pursuing charges.

Aspects Used to Set Penalty Levels

The CNIL applies the criteria set out in Article 83(2) of the GDPR to determine the amount of the fine, including the seriousness and impact of the infringement, whether the conduct was intentional or negligent, and the controller’s efforts to mitigate damage. It also considers elements such as prior violations, the degree of co-operation with the authority, and the sensitivity of the personal data concerned.

It is important to note that the CNIL does not provide details on how it reaches a specific amount, including when that amount covers infringements of different natures. The absence of any obligation in this respect was confirmed by the French Supreme Administrative Court (Conseil d’État).

Enforcement Trends

Over the last approximately 24 months, data-protection enforcement in France has become both higher volume and higher impact. In 2024, the national regulator reported 331 corrective measures, including 87 financial penalties totalling EUR55,212,400, alongside a record 17,772 complaints. It also received 5,629 personal-data breach notifications (20% more than in 2023) and highlighted a rise in very large-scale incidents, noting that breaches affecting more than one million individuals doubled. In 2025, it issued 83 sanctions with cumulative fines of EUR486,839,500 and identified three recurring enforcement themes: cookies/trackers, employee monitoring, and data security.

Most significant enforcement trends:

  • Cookies and trackers are driving the biggest exposure: there have been repeat findings concerning trackers placed without valid prior consent or without sufficient user information, as well as failures to respect refusals or withdrawals of consent.
  • Workplace surveillance under sustained scrutiny: enforcement emphasised that continuous or disproportionate employee monitoring, notably through video surveillance, is unlawful in the absence of exceptional, well-justified circumstances.
  • Processor compliance in the spotlight: sanctions also addressed breaches of processors’ obligations, recalling in particular the need to –
    1. implement appropriate technical and organisational security measures;
    2. process data only on the controller’s documented instructions; and
    3. delete data at the end of the contractual relationship.
  • Security of processing as a major enforcement driver: the regulator issued particularly heavy sanctions in 2025 for shortcomings in the security of processing, repeatedly underscoring the need to apply strict “baseline hygiene” measures (eg, strong authentication and access control, timely patching, secure configuration, least privilege, and monitoring).
  • More frequent enforcement via streamlined procedures: the regulator attributes a sharp rise in the number of sanction decisions to its simplified process, increasing the likelihood of action against a broader set of organisations.

General Trends

Over the last 24 months, personal data disputes in France have increasingly reached the courts through private enforcement of the GDPR. Two recurring streams stand out: (i) compensation claims following security incidents or unauthorised disclosures; and (ii) litigation seeking compliance with data-subject rights – particularly access requests, including in an employment context (eg, access to professional emails, subject to the rights of others).

Claimants are mainly individuals (customers, users, employees or former employees). Typical remedies include compliance orders (eg, to grant access, erase data, or cease unlawful processing), sometimes subject to penalty payments, as well as damages.

Non-Material Damage

Non-material damage is compensable under Article 82 GDPR, but a mere infringement is insufficient. The claimant must prove:

  • an infringement;
  • actual damage (including non-material); and
  • a causal link (CJEU, C-300/21, Österreichische Post).

No minimum seriousness threshold applies, yet harm must be real and proven.

Negative emotions (fear, anxiety, loss of control, reputational concern, risk of misuse) may qualify if substantiated and causally linked (CJEU, C-340/21, Natsionalna agentsia za prihodite; CJEU, C-655/23, Quirin Privatbank). Purely hypothetical risk or unproven fear is insufficient.

Compensation is strictly compensatory, not punitive. It must reflect the actual damage suffered; fault is irrelevant to quantification once liability is established (CJEU, C-667/21, Krankenversicherung Nordrhein). A separate injunction under national law cannot reduce damages (CJEU, C-655/23, Quirin Privatbank).

The GDPR does not harmonise methods of calculation. Quantification remains subject to national procedural autonomy, constrained by equivalence and effectiveness, and Article 83 fine criteria cannot be transposed to Article 82 (CJEU, C-741/21, juris GmbH).

The court decisions that have most significantly shaped privacy litigation before the courts in recent years (as distinct from the CNIL’s administrative GDPR enforcement proceedings) are described below.

Article 82 GDPR – Conditions and Nature of Compensation

  • CJEU, C-300/21, Österreichische Post (4 May 2023): The court held that compensation requires proof of an infringement, actual damage, and a causal link, and that a mere GDPR breach does not automatically entitle a claimant to damages. Non-material damage need not reach a seriousness threshold, but it must be proven, and compensation must remain purely compensatory under national rules subject to equivalence and effectiveness.
  • CJEU, C-741/21, juris (11 April 2024): The court clarified that infringement of a GDPR right does not in itself constitute non-material damage, although “loss of control” may qualify if genuinely suffered. It further held that the fine criteria in Article 83 are irrelevant when assessing damages under Article 82.
  • CJEU, C-590/22, PS (20 June 2024): The court recognised that proven fear of disclosure, together with its negative consequences, may constitute non-material damage, while reaffirming that Article 82 has no punitive or deterrent function.
  • CJEU, C-200/23, Agentsia po vpisvaniyata (4 October 2024): The court held that temporary public accessibility of personal data may generate compensable “loss of control” even without additional tangible harm, provided actual damage is established.
  • CJEU, C-655/23, Quirin Privatbank (4 September 2025): The court confirmed that negative emotions may amount to non-material damage if proven and causally linked, and it clarified that preventative injunctions cannot replace or reduce compensation under Article 82.

Structure of Liability and Exoneration

  • CJEU, C-667/21, Krankenversicherung Nordrhein (21 December 2023): The court characterised Article 82 as establishing fault-based liability with a strict exoneration standard, under which the controller avoids liability only by proving that it is not in any way responsible for the event causing the damage.
  • CJEU, C-741/21, juris (11 April 2024): The court confirmed that a controller cannot evade liability by attributing the infringement to a person acting under its authority.

GDPR Breach and Unfair Competition

CJEU, C-21/23, Lindenapotheke (4 October 2024): The court ruled that Chapter VIII of the GDPR does not preclude national law from allowing competitors of an undertaking infringing the GDPR to bring civil proceedings against the undertaking for unfair commercial practices, where such infringement confers a competitive advantage on that undertaking.

Representation Under the GDPR

GDPR Article 80 allows individuals to mandate non-profit bodies, organisations or associations to lodge complaints and bring judicial proceedings on their behalf. Member states may also allow certain bodies to act without a mandate.

“Action de Groupe” and the 2025 Reform

France has allowed class-action style procedures (action de groupe) in specific sectors, including consumer matters and data protection. In 2025, France adopted a significant reform aimed at streamlining and expanding group actions (Law No 2025-391 of 30 April 2025). The new framework establishes a unified regime applicable across matters and is designed to facilitate claims for both cessation of unlawful conduct and compensation, subject to French procedural safeguards and typically an opt-in mechanism for claimants.

The EU Data Act

The European Data Act (Regulation (EU) 2023/2854), adopted on 13 December 2023, entered into force on 11 January 2024 and has been applicable since 12 September 2025. It sets EU-wide rules on fair access to and use of data. It focuses on data generated by connected products and related services, and on switching and interoperability for data processing services (including cloud and edge computing services).

The regulation applies to situations involving both personal and non-personal data. Where personal data is concerned, the GDPR prevails in cases of conflict, and the Data Act operates without prejudice to EU data-protection law.

Scope and key mechanisms

  • Product data and related service data: users of connected products are granted access rights to data generated by their use of the product or related service, including necessary metadata.
  • B2B data sharing: data holders must make data available to third parties at the user’s request, subject to safeguards (eg, trade secrets).
  • Public-sector access: public bodies may request data in cases of “exceptional need” under strict conditions.
  • Cloud switching and portability: providers of data processing services must remove unjustified switching barriers and enable portability of customers’ exportable data and digital assets.

The regulation defines key roles, including users, data holders, data recipients, and, for cloud services, customers and providers of data processing services. It also references data intermediation services as defined in the Data Governance Act.

French Context

While the Data Act is an EU-wide regulation, France anticipated it with the “SREN” Law No 2024-449 of 21 May 2024 (Law to Secure and Regulate the Digital Space). The SREN Law establishes interoperability and portability obligations for providers of cloud computing services, complementing Data Act requirements, and addresses specific French concerns regarding data sovereignty and technological resilience.

A core principle of the EU Data Act is that, whenever personal data is involved, the GDPR continues to apply, and prevails in case of conflict. For mixed datasets (containing personal and non-personal data), the GDPR applies to the personal-data component. Where the two are technically or functionally inseparable, GDPR obligations cannot be bypassed and, in practice, the dataset must be handled under GDPR-compliant safeguards.

This interaction creates specific legal constraints. For example:

  • Legal basis – The Data Act grants rights of access and sharing but does not create a standalone GDPR legal basis for processing personal data. Where disclosure involves personal data – especially if the requesting user is not the data subject – a valid GDPR legal basis must independently exist.
  • Trade secrets and IP – The Data Act protects trade secrets and intellectual property. As a rule, access should be enabled with appropriate confidentiality measures; refusal to share on trade secret grounds is limited to narrowly defined and exceptional circumstances and cannot serve as a pretext to block legitimate access.

Access and Sharing of Connected Product Data

The Data Act gives users of connected products and related services the right to access and share certain “readily available” product and related service data, including necessary metadata (Articles 3–7).

Accordingly, data holders must notably:

  • design products for data accessibility (for products placed on the market from 12 September 2026);
  • provide access without undue delay, free of charge, in a structured, commonly used, machine-readable format (including continuous/real-time access where relevant and technically feasible);
  • enable user-directed sharing with third parties under equivalent conditions;
  • respect the GDPR where personal data is involved;
  • apply trade secret protections only where justified, proportionate and safeguarded; and
  • ensure third-party use complies with statutory limitations.

B2B Fairness and Cloud Switching

The Data Act controls unfair terms in B2B data-sharing contracts where clauses are unilaterally imposed by one party, and it regulates the concept of reasonable compensation in certain mandatory data-sharing scenarios, including specific safeguards in defined SME-related situations. To support implementation, the European Commission issued non-binding Model Contractual Terms and Standard Contractual Clauses in November 2025 (Chapter IV).

In relation to switching between data processing services (including cloud and edge computing services), providers must:

  • allow switching with no more than two months’ notice;
  • complete the transition generally within 30 calendar days (subject to exceptions); and
  • eliminate switching charges entirely from 12 January 2027, following a transitional cost-based regime (Chapter VI).

Public Sector Access and Third-Country Requests

The Data Act permits public sector bodies to access privately held data only in situations of “exceptional need”, including certain emergencies, and subject to strict requirements of necessity and proportionality (Chapter V). It also establishes safeguards against unlawful access by third-country authorities to non-personal data held in the EU, with implications for contractual arrangements and cloud governance structures (Chapter VII).

Key Organisational Priorities

Organisations should:

  • identify whether they qualify as data holders and/or providers of data processing services;
  • map “readily available” product/service data;
  • implement compliant access and sharing mechanisms;
  • align contracts (data-sharing and cloud) with fairness and switching rules;
  • update mandatory information and ensure GDPR alignment; and
  • document trade secret and non-discrimination safeguards.

ARCEP is the competent authority for enforcing the cloud-related provisions of the SREN Law. It has investigative powers, can settle disputes, and may impose fines of up to 3% of worldwide annual turnover (or 5% in the case of a repeat infringement within the statutory period).

Regarding the EU Data Act, a French DDADUE bill currently under parliamentary examination proposes to designate ARCEP as the competent authority for enforcement, with the exception of Chapter VII. At the time of writing, this bill has not yet been adopted.

The CNIL remains the supervisory authority for GDPR matters. For example, if a cloud-switching failure results in a personal data breach, the CNIL is competent for the GDPR aspects (eg, security and breach notification), while ARCEP may address any breach of cloud-switching obligations within its remit.

The SREN Law establishes co-ordination mechanisms – ARCEP must consult the CNIL for certain decisions concerning data intermediation services involving personal data and must refer suspected anti-competitive practices in cloud markets to the French Competition Authority. These arrangements ensure institutional co-ordination but do not create a unified tripartite regulatory regime.

Legal Framework: ePrivacy Implemented in France

In France, the placement or reading of cookies and similar trackers on a user’s terminal equipment is governed by the ePrivacy framework (Article 5(3) of Directive 2002/58/EC) as implemented by Article 82 of the FDPA. As a rule, storing information on, or accessing information from, a device requires the user’s prior informed consent, unless a narrow exemption applies (eg, trackers that are strictly necessary to provide a service expressly requested by the user, or to enable core site functionality).

Consent Standard and Practical Design

CNIL guidance has established demanding expectations for consent interfaces: users should be able to refuse as easily as accept; consent must be specific and granular; and withdrawal must be possible at any time with an effect comparable to refusal. Organisations must also be able to evidence consent, document retention periods, and manage third‑party trackers through contract and technical controls (eg, a consent management platform, tag governance, and auditing).

Cookie Walls and Conditional Access

Following French and EU case law developments, cookie walls are not automatically unlawful, but they raise a high bar: the user must have a genuine, free choice and must not be coerced into consenting, taking into account the service’s market position and the availability of alternatives. In practice, many services in France avoid blanket “take it or leave it” consent models and implement alternatives (eg, contextual advertisements, paid access, or reduced tracking options), coupled with clear information.

Enforcement Exposure

Cookies and tracking remain one of the CNIL’s highest enforcement priorities, including for foreign platforms targeting French users. This means that even technically “minor” consent design issues (lack of prominence symmetry between options, unclear purposes, or continued reading of cookies after withdrawal) can lead to significant regulatory risk.

Personalised or targeted advertising is regulated through a combination of the GDPR (eg, legal basis and restriction on profiling where it is associated with automated decision-making), the ePrivacy Directive (consent for trackers and electronic marketing), and the new DSA/SREN Law transparency rules.

Profiling Restrictions

Article 22 of the GDPR strictly regulates decisions based solely on automated processing, particularly profiling, when they produce legal effects or have similar significant impacts on an individual. As a rule, every person has the right not to be subject to such automated decisions. Exceptions apply only when the decision is necessary for a contract, authorised by EU or member state law, or based on the individual’s explicit consent. In these cases, the controller must implement safeguards, including human intervention, the possibility to present one’s viewpoint, and the right to contest the decision. Moreover, automated decisions cannot rely on sensitive data unless very specific legal conditions are met, and reinforced protective measures are in place.

In addition, the SREN Law and DSA prohibit the presentation of advertisements based on profiling using special categories of personal data (eg, political opinions, health). Targeting minors with profiling-based ads is also strictly banned.

Transparency

Under the DSA, online platforms must ensure advertisements are clearly identifiable and provide, directly from the advert itself, meaningful information about the main parameters used to determine why the recipient is being shown the ad, and where applicable, information on how to change those parameters. 

Direct Marketing

Under Article L 34-5 of the French Postal and Electronic Communications Code (CPCE), electronic marketing (email, SMS) requires prior opt-in consent, unless the recipient is an existing customer for similar products/services or is a professional (B2B) contacted on a work address.

In France, there is no single “employment privacy act”. Employment privacy is governed by the GDPR and the FDPA, overseen by the CNIL, and is also framed by French Labour Code rules (notably proportionality and information/consultation requirements). For instance, the Labour Code expressly states that no personal information may be collected through a device that has not been brought to the employee’s attention beforehand, and the employees’ representative body (le comité social et économique – CSE) must be informed and consulted in advance on employee monitoring tools and methods.

Key Themes Related to Privacy in the Workplace

Employee monitoring

Monitoring (eg, time tracking, CCTV, IT logs, geolocation) must be necessary, proportionate and disclosed. For vehicle geolocation, the CNIL indicates that the employer should not track outside working time and that the information collected should not be used to calculate working time where another method already exists. A CNIL fine published on 4 February 2025 (“inactivity” scoring, frequent screenshots, permanent filming) shows where monitoring crosses the line, and high-risk or constant monitoring may trigger a DPIA.

Remote work

Remote work is defined in Labour Code Article L1222-9 and is usually set by a collective agreement or, failing that, an employer charter after consulting the CSE; it may also be agreed individually. The CNIL highlights reinforced security for telework and configuring tools to minimise data collected.

Communication tools at work

The CNIL notes that employers may control internet and email use for security and policy enforcement but must set rules and inform employees (often via an IT charter). For bring your own device (BYOD), the CNIL recommends separating professional from private spaces; remote wiping should be limited to the professional space, not the whole personal device.

Recruitment

The CNIL’s guide requires collecting only data that is strictly necessary to assess candidates, providing GDPR notices, and defining retention. The CNIL indicates that an unsuccessful candidate’s file may be kept for a short period (eg, up to about three months) to manage feedback/contestation, and then up to two years after the last contact only if the candidate is clearly informed and has agreed or can easily object. Background checks must be job-relevant – a criminal record extract (Bulletin No 3) may be requested only in limited, justified cases and should be consulted but not retained; where an authority grants approval, that approval generally suffices.

Under the EU GDPR and the FDPA, there is no “M&A carve-out” – any disclosure, transfer and reuse of personal data must be lawful, transparent, proportionate and secure. The following key principles must be followed:

  • Due diligence/data room – disclose only what is necessary (data minimisation), and use strict access controls, logging and NDAs; avoid special-category data unless strictly needed.
  • Controller/processor set-up – the main parties (auditor and auditee) are usually independent controllers during diligence. However, the due diligence process typically involves data processors, such as electronic data room providers. A data processing agreement compliant with Article 28 of the GDPR must be concluded with such entities.
  • Change of control/transparency – when the buyer becomes the new controller, they must provide updated privacy information; where data is obtained indirectly, Article 14 information must be given “as soon as reasonably possible” and at the latest, within one month (subject to limited exceptions).
  • Post-closing integration – reassess purpose compatibility, retention, security, contracts, and cross-border transfer safeguards before merging datasets.

France adheres to Chapter V of the GDPR regarding international transfers of personal data. Transfers of personal data outside the EEA are prohibited unless the destination country ensures an adequate level of protection.

  • Adequacy decisions: Transfers to countries with an EU adequacy decision (eg, Japan, the UK and the USA under the EU-US Data Privacy Framework) are unrestricted.
  • Appropriate safeguards: For other countries, organisations must use tools like the European Commission’s standard contractual clauses (SCCs) or binding corporate rules (BCRs).
  • Transfer impact assessments (TIAs): Organisations must conduct a TIA to verify if the laws of the recipient country (especially regarding government surveillance) impinge on the effectiveness of the safeguards (ie, this does not concern transfers based on adequacy decisions). In January 2025, the CNIL published a comprehensive guide on TIAs, outlining a five-step methodology for this assessment.

No General Prior Approval for GDPR Transfers

France does not impose a general government notification or approval requirement for transfers of personal data outside France or the EEA beyond the GDPR mechanisms (adequacy, Article 46 safeguards, or Article 49 derogations). In practice, organisations are expected to self-assess and document compliance, and the CNIL may review transfer arrangements in audits or investigations.

Sector-Specific or Secrecy-Driven Constraints

Certain categories of information (eg, defence secrets, classified information, certain regulated critical infrastructure data, professional secrecy such as legal privilege or banking secrecy) may be subject to separate restrictions with regard to disclosure abroad, which operate independently from GDPR transfer rules. These restrictions are case-specific and typically arise from national security law, public law, criminal law or professional regulations rather than from data protection statutes as such.

Non-Personal Data: EU Data Act Context

The EU Data Act (Article 32) sets safeguards against third-country government access to or transfer of non-personal data held in the EU by providers of data processing services. Providers must take adequate technical, organisational and legal (including contractual) measures to prevent such access/transfer where it conflicts with EU or member state law. Third-country decisions requiring access/transfer are enforceable only if based on a relevant international agreement (eg, a mutual legal assistance treaty or MLAT); otherwise, and where compliance would risk such a conflict, access/transfer may occur only under specified procedural conditions, with data minimisation and (generally) customer notification.

No General Localisation Rule

French law does not impose a broad requirement to store personal data in France. Under the GDPR, localisation may be relevant as a risk mitigation measure (eg, to limit international transfers) but is not mandated as such.

Sectoral and Contractual Localisation Drivers

France enforces data localisation for specific strategic sectors.

Health data

Electronic health data must be hosted by a certified health data hosting (hébergeurs de données de santé or HDS) provider. While the law allows hosting within the EEA, there is a strong policy push for hosting in France or on sovereign cloud infrastructures. 

SecNumCloud

The “Cloud de Confiance” doctrine promotes the use of cloud services qualified by ANSSI (SecNumCloud) for public administrations and essential operators (Opérateur d'Importance Vitale/Opérateur de Services Essentiels – OIV/OSE). This qualification imposes strict immunity from extraterritorial laws (like the US CLOUD Act), effectively requiring data to be stored and processed within the EU by entities that are not subject to non-EU jurisdiction.

France has a long-standing blocking statute (Law No 68-678 of 26 July 1968, as amended). Subject to international treaties or agreements, it restricts the communication of certain economic information to foreign public authorities and, more broadly, the gathering/communication of such information for use as evidence in foreign judicial or administrative proceedings.

Covered Information

The statute applies to documents or information of an economic, commercial, industrial, financial, or technical nature where disclosure is likely to harm France’s sovereignty, security, essential economic interests, or public order.

Personal data is not singled out as a standalone category in the law, but datasets that include personal data (eg, customer files) can be treated as covered when they qualify as economic/commercial/technical information and meet the “harm” threshold.

Enforcement Mechanism (SISSE – Since 1 April 2022)

Since 1 April 2022, a decree has designated the Strategic Information and Economic Security Service (SISSE) as the “single point of contact” for notifications. Organisations receiving (i) a request from a foreign public authority to communicate potentially covered information, or (ii) a request relating to the constitution of evidence for foreign proceedings, must notify SISSE without delay and may obtain an administrative opinion on whether Articles 1 and/or 1 bis apply.

Criminal Penalties

Breaches of Articles 1 or 1 bis of the law are criminal offences punishable by up to six months’ imprisonment and a EUR18,000 fine (for individuals). For legal entities, the maximum fine is multiplied by five (ie, EUR90,000).

Exceptions/Lawful Disclosure Routes

The statute allows disclosure where an applicable international treaty or agreement provides for it (eg, co-operation/assistance mechanisms). Article 1 bis also operates subject to applicable laws and regulations. There is no general carve-out simply because the recipient is in the EU or because information is “publicly available” (these factors may be relevant factually case by case, but they are not stated as blanket exceptions in the statute).

Interaction With Data Protection

The blocking statute is distinct from the GDPR, but the two can converge in cross-border evidence collection or regulatory investigations that involve personal data. In such contexts, organisations in France need to reconcile multiple constraints: GDPR Chapter V transfer rules, the blocking statute (including SISSE notification/opinion process), professional secrecy (eg, legal privilege), trade secret protection, and sector-specific secrecy obligations (eg, banking secrecy).

EU-US Data Privacy Framework (DPF)

In September 2025, the General Court of the EU (Latombe case) upheld the validity of the DPF, rejecting a challenge by a French parliamentarian. This provides renewed stability for transfers to certified US companies, although privacy advocates remain vigilant. 

CNIL International Strategy 2025–2028

The CNIL has adopted a strategy aimed at strengthening European co-operation (including more efficient handling of cross-border cases) and promoting high data protection standards internationally through what it refers to as “data diplomacy”.

New Adequacy Developments – Brazil

In January 2026, the EU and Brazil concluded adequacy arrangements intended to facilitate secure and free data flow between the EU and Brazil. This reduces the need for SCCs or BCRs for transfers covered by the decision.

Anticipated Developments

UK adequacy decision

The EU adequacy decisions for the UK were renewed in December 2025 for a further six-year period (until 27 December 2031), subject to a “sunset clause” and ongoing monitoring. While transfers may continue without additional safeguards, divergence between UK and EU data protection law remains a structural risk that organisations should monitor.

Digital sovereignty and sector-specific localisation trends

Discussions on digital sovereignty and strategic autonomy continue at EU and member state level. While the GDPR does not impose a general data residency requirement, sector-specific rules, certification schemes, or public procurement conditions may increasingly influence expectations regarding localisation or EU-controlled processing in sensitive sectors. At present, however, this remains policy-driven and context-dependent rather than a generalised legal obligation under Chapter V of the GDPR.

Nomos

49 avenue de l’Opéra
75002 Paris
France

+33 (0)1 43 18 55 00

contact@nomosparis.com
www.nomosparis.com