Contributed By Nomos
General Legal Framework
France’s data protection and privacy landscape is built upon a two-tier regulatory structure combining European Union law with national legislation. The cornerstone of this framework is the General Data Protection Regulation (Regulation (EU) 2016/679) (GDPR), which has been directly applicable since 25 May 2018 and serves as the primary legal instrument establishing harmonised standards for data protection across all EU member states, including France. Under French domestic law, the main regulatory instrument for the protection of personal data is the French Data Protection Act (FDPA) – known as the “Loi informatique et libertés” – which was originally enacted on 6 January 1978 (Law No 78-17) and has been amended several times since.
The interaction between the EU texts and the FDPA is hierarchical but complementary: the GDPR sets the harmonised standard, while the FDPA exercises the national margins of manoeuvre left open by the GDPR, adapting and supplementing its provisions at domestic level.
Sectoral Legal Instruments
Sectoral instruments also play an important role. The Public Health Code (Code de la santé publique) contains particular provisions governing the processing of health data, including requirements for hosting certification (“HDS certification”). The Post and Electronic Communications Code (Code des postes et des communications électroniques – CPCE) governs electronic marketing and the confidentiality of correspondence, transposing the ePrivacy Directive. Furthermore, the Labour Code (Code du travail) regulates employee monitoring, requiring proportionality and prior consultation with employee representative bodies.
Extraterritorial Reach
Both the GDPR and the FDPA have significant extraterritorial reach. Under GDPR Article 3, the regulation applies to organisations established in the EU, regardless of where data processing occurs, and to organisations established outside the EU that offer goods or services to data subjects in the EU or monitor their behaviour. This territorial principle is reproduced in the FDPA. Specifically, with respect to the national “opening clauses”, the FDPA provides that French rules apply where the data subject resides in France, including when the data controller is not established in France.
Interplay with Non-Personal Data, Cybersecurity, and AI Regulation
France’s regulatory framework increasingly integrates data protection with rules on non-personal data, cybersecurity and AI.
General Principles for Personal Data Processing
The processing of personal data in France is governed by foundational principles established in the GDPR and reflected in the FDPA. These principles constitute the normative framework within which all data processing must occur.
Data Subject Rights
The GDPR grants individuals a broad set of enforceable rights over their personal data. These include the right to access and obtain information about how their data is processed, to rectify inaccurate data, and, where legally justified, to request erasure. Data subjects may also restrict processing, receive their data in a portable and machine-readable format, and object to certain types of processing, including marketing or processing based on legitimate interests or public tasks. In addition, they are protected against purely automated decision-making, with the right to seek explanations and human review, and they may lodge complaints with a supervisory authority free of charge.
The FDPA adds a specific data subject right, linked to digital death: Article 85 of the FDPA allows individuals to define general or specific directives concerning the storage, erasure and communication of their personal data after their death.
Organisational Compliance: Essential Checklist
Prohibition on Processing Special Categories of Personal Data
Under Article 9 GDPR, processing of special categories of personal data (including data revealing racial or ethnic origin, political opinions, religious beliefs, trade union membership, and genetic, biometric, or health data) is prohibited unless a specific derogation applies.
Derogations include, among other things, the data subject’s explicit consent, processing necessary under employment and social protection law, the protection of vital interests, the provision of health or social care, substantial public interest, public health, and archiving, scientific research or statistical purposes (Article 9(2) GDPR).
Specific hosting requirement for health data
Where health data is hosted for care-related activities, French law requires use of a certified host (“Health Data Hosting Provider” – Article L 1111‑8 of the Public Health Code), with contractual constraints.
Processing data related to criminal convictions
Personal data relating to criminal convictions is not among the special categories whose processing is prohibited in principle under Article 9 of the GDPR. It is, however, subject to specific rules. Under Article 46 of the FDPA (implementing Article 10 GDPR), the processing of data regarding criminal convictions is largely a state monopoly, reserved for courts and public authorities. Private entities are strictly limited in their ability to process such data. Employers, for example, cannot conduct blanket criminal background checks; they may only request a criminal record extract (Bulletin No 3) if it is strictly necessary for the job and authorised by law (eg, for security personnel or employees working with minors).
Protecting minors’ data
Under the GDPR as implemented in France, the age of digital consent for the processing of personal data is 15, meaning that children under 15 cannot validly consent to the processing of their data by online services without parental authorisation. Organisations must implement proportionate age verification or parental consent mechanisms and ensure that data collection remains strictly necessary. More broadly, the protection of minors is a crucial and highly topical issue in France. At the time of writing, parliamentary debates are underway regarding a proposed ban on social media for those under the age of 15.
Scientific Research (Including Health Research, Studies and Evaluations)
The GDPR recognises a research-oriented regime: further processing for scientific research is, subject to Article 89 safeguards, not considered incompatible with the initial purpose (Article 5(1)(b)), and member state law may allow limited derogations from certain data-subject rights where necessary and proportionate.
To process health data (see 1.3 Special Categories of Personal Data), controllers commonly rely on the scientific research ground (Article 9(2)(j)) coupled with Article 89 safeguards.
France adds a specific gatekeeping layer for health-sector “research/study/evaluation” processing: such processing must pursue a public-interest purpose and either comply with a CNIL reference methodology (in which case, it can proceed after filing a simple declaration of compliance) or have obtained specific authorisation from the CNIL.
The CNIL reference methodologies include MR‑001 (scientific research requiring patients’ consent), MR‑003 (certain research without patient’s consent but with information and a right to object) and MR‑004 (secondary-use studies/research not involving the person).
Product and Service Development
If R&D is primarily product improvement, commercial analytics or model training, it may not qualify as “scientific research” and should be assessed under ordinary GDPR rules: a valid Article 6 legal basis plus, if health data or other special categories of data are processed, an Article 9(2) condition (often explicit consent). Re-use of consent-based data beyond the original scope generally requires new consent or a new legal basis.
Anonymisation as a Data Protection Safeguard
The legal provisions outlined above governing scientific research and product or service development apply to the processing of “personal” data. Where such data is anonymised in accordance with recognised principles and technical requirements ensuring no reasonable risk of re-identification (notably in accordance with WP29 Opinion 05/2014), it ceases to constitute personal data. As a result, it falls outside the scope of the GDPR and may be used freely for research or development purposes.
True anonymisation is therefore a key privacy protection mechanism and is encouraged in all cases. Moreover, when identifiable data is not strictly necessary for the intended processing, anonymisation is required to comply with the GDPR’s data minimisation principle.
Anonymisation must not be confused with pseudonymisation. Pseudonymisation replaces direct identifiers (eg, name, surname) with a code (eg, a number), but the data remains personal as long as a re-identification key exists. A recent CJEU ruling held that, in certain situations, pseudonymised data may be considered anonymous for a third party that cannot reasonably access the key or other re-identifying information.
The European Health Data Space (EHDS)
Regulation (EU) 2025/327 establishing the European Health Data Space (EHDS) will create a harmonised EU framework for accessing and re‑using electronic health data. It will apply from 26 March 2027, with the core secondary‑use regime becoming operational on 26 March 2029. For life sciences companies, the EHDS establishes a structured EU‑wide pathway for secondary use of electronic health data through data permits issued by national Health Data Access Bodies (HDABs) or health data requests, including for scientific research and related development or innovation activities (such as algorithm training and evaluation).
Access must occur exclusively within a secure processing environment with strict technical and organisational controls, including restricted access and mandatory audit logs. Only non‑personal, anonymised outputs may be exported. Certain purposes – such as advertising or marketing – and any form of re‑identification are expressly prohibited. Results of secondary use must be published in anonymised form within 18 months. Individuals may opt out of secondary use of their personal electronic health data; where identifiable, their data may not be used for new permits or newly approved requests, subject to limited exceptions under national law.
Enforcement risks are significant: fines may reach EUR10 million or 2% of global turnover, going up to EUR20 million or 4% of global turnover for serious infringements (including prohibited uses, extraction of personal data from secure environments, and re‑identification). The EHDS applies alongside the GDPR, meaning that parallel GDPR liability may arise.
Applicability of Data Protection Laws to AI
In France, the personal data regulatory framework, comprising the EU GDPR and the FDPA, fully applies to the processing of personal data through AI systems. The GDPR already includes rules that are particularly relevant for AI systems, such as the prohibition – subject to certain exceptions – of fully automated decisions that produce legal effects or significantly affect the individuals concerned (Article 22 of the GDPR).
CNIL’s AI Guidance
The CNIL has positioned itself as a key AI regulator by issuing guidance in 2024–2025 on how the GDPR applies to AI systems. Its recommendations address core issues such as choosing a lawful basis, complying with data minimisation and transparency principles, conducting DPIAs, managing data subject rights, securing AI development, mitigating bias, and ensuring proper documentation and accountability, thereby promoting a risk-based and integrated compliance approach.
Overlay of the EU AI Act
The AI Act (Regulation (EU) 2024/1689) establishes a risk-based compliance framework distinct from, but complementary to, EU data protection law. It introduces prohibitions, transparency duties, and extensive obligations for high-risk AI systems and certain general-purpose models. Application is being phased in between 2025 and 2027 (prohibitions from February 2025; general-purpose model obligations from August 2025). It does not affect the GDPR (Article 2(7)).
Under Article 33 of the GDPR, data controllers must notify the competent data protection authority of any personal data breach no later than 72 hours after becoming aware of it, unless the breach is unlikely to result in a risk to the rights and freedoms of natural persons. In France, this notification is typically made via the CNIL’s dedicated online portal. Furthermore, if the breach is likely to result in a high risk to the rights and freedoms of a natural person, Article 34 of the GDPR requires controllers to communicate the personal data breach to the data subject “without undue delay”.
Cross-Reporting
In France, the notification landscape in case of a security incident is fragmented. For example, where an entity is subject to the NIS framework, it must separately notify ANSSI of significant cyber-incidents. For financial entities, the EU Digital Operational Resilience Act (DORA) requires the notification of major ICT-related incidents to the “competent authority”. In France, depending on the type of financial entity concerned, this will generally be the Autorité de contrôle prudentiel et de résolution (ACPR) and/or the Autorité des marchés financiers (AMF). At the time of writing, the European Commission has proposed, as part of a digital regulatory simplification initiative (often referred to as the “Digital Omnibus”), the creation of a single reporting entry point intended to streamline notifications under several EU digital frameworks (including NIS2, the GDPR and DORA). This remains a legislative proposal and is not yet in force.
Action Items for Organisations
Recent enforcement actions, such as the EUR5 million fine against France Travail in 2024 and the EUR45 million fine against the internet access provider FREE in 2025 following massive data leaks, underscore the CNIL’s intolerance for failures in basic security hygiene and breach response.
France employs a multi-regulator model for the digital space, requiring organisations to navigate overlapping jurisdictions.
French Data Protection Authority
The CNIL acts as the French guardian of data privacy. Its missions and powers include informing and advising data subjects and professionals, issuing guidance and recommendations, authorising certain processing operations, conducting investigations, and imposing corrective measures and sanctions.
The CNIL operates within a network of regulators.
Sectoral Authorities
These authorities notably co-ordinate through a formal national network established by the SREN Law to ensure consistent application of the DSA.
Investigation Initiation and Procedures
The CNIL generally initiates investigations based on complaints, external reports, data breach notifications, or on its own initiative when it identifies a sensitive risk area or sector. It may investigate any controller or processor through unannounced on-site inspections, document-based reviews, hearings, or online checks of websites and services.
During these investigations, the CNIL collects documents, interviews staff and examines systems to assess compliance with the GDPR and the FDPA. An investigation can close with no further action, or the CNIL’s president may issue a formal notice (mise en demeure) giving the organisation a deadline within which to comply. This notice is not a sanction but a warning phase. If breaches are serious, persistent or unresolved, the president may – whether or not a prior notice was issued – refer the case to the CNIL’s sanctions committee (formation restreinte), which is responsible for imposing sanctions.
Once referred, a formal sanctions procedure begins: the president appoints a rapporteur, who exchanges written submissions with the organisation and may hold a hearing. The case is then transmitted to the sanctions committee for an adversarial decision. The CNIL has no power to settle cases.
Sanctions imposed by the panel are immediately enforceable, and fines are paid to the French Treasury (meaning they do not finance the CNIL). Decisions can be appealed before the Conseil d’État.
For less serious cases, a simplified procedure allows the president of the sanctions committee or a designated member to decide alone, without a public hearing, with lower maximum fines (up to EUR20,000) and limited publicity – enabling the CNIL to handle a higher volume of straightforward infringements.
Available Sanctions and Remedies
Administrative sanctions
Civil remedies
Beyond CNIL administrative sanctions, data subjects may pursue civil damages before the French judicial courts (pursuant to GDPR Article 82).
Criminal sanctions
Serious data protection violations may trigger criminal sanctions under the French Criminal Code (Articles 226-16 et seq), which criminalises, among other things, the processing of personal data in breach of legal requirements, the collection of personal data by fraudulent, unfair or unlawful means, and the diversion of personal data from its intended purpose.
Criminal penalties include fines up to EUR300,000; imprisonment for up to five years; and publication of conviction decisions. However, criminal prosecution is very rare in practice. The CNIL refers cases to prosecuting authorities where criminal elements appear evident, but prosecutors exercise prosecutorial discretion in pursuing charges.
Aspects Used to Set Penalty Levels
The CNIL applies the criteria set out in Article 83(2) of the GDPR to determine the amount of the fine, including the seriousness and impact of the infringement, whether the conduct was intentional or negligent, and the controller’s efforts to mitigate damage. It also considers elements such as prior violations, the degree of co-operation with the authority, and the sensitivity of the personal data concerned.
It is important to note that the CNIL does not provide details on how it reaches a specific amount, including when that amount covers infringements of different natures. The absence of any obligation in this respect was confirmed by the French Supreme Administrative Court (Conseil d’État).
Over roughly the last 24 months, data-protection enforcement in France has become both higher volume and higher impact. In 2024, the national regulator reported 331 corrective measures, including 87 financial penalties totalling EUR55,212,400, alongside a record 17,772 complaints. It also received 5,629 personal-data breach notifications (20% more than in 2023) and highlighted a rise in very large-scale incidents, noting that breaches affecting more than one million individuals doubled. In 2025, it issued 83 sanctions with cumulative fines of EUR486,839,500 and identified three recurring enforcement themes: cookies/trackers, employee monitoring, and data security.
The most significant enforcement trends over this period therefore concern cookies and trackers, employee monitoring, and data security.
General Trends
Over the last 24 months, personal data disputes in France have increasingly reached the courts through private enforcement of the GDPR. Two recurring streams stand out: (i) compensation claims following security incidents or unauthorised disclosures; and (ii) litigation seeking compliance with data-subject rights – particularly access requests, including in an employment context (eg, access to professional emails, subject to the rights of others).
Claimants are mainly individuals (customers, users, employees or former employees). Typical remedies include compliance orders (eg, to grant access, erase data, or cease unlawful processing), sometimes subject to penalty payments, as well as damages.
Non-Material Damage
Non-material damage is compensable under Article 82 GDPR, but a mere infringement is insufficient. The claimant must prove: (i) an infringement of the GDPR; (ii) material or non-material damage; and (iii) a causal link between the infringement and the damage (CJEU, C-300/21, Österreichische Post).
No minimum seriousness threshold applies, yet harm must be real and proven.
Negative emotions (fear, anxiety, loss of control, reputational concern, risk of misuse) may qualify if substantiated and causally linked (CJEU, C-340/21, Natsionalna agentsia za prihodite; CJEU, C-655/23, Quirin Privatbank). Purely hypothetical risk or unproven fear is insufficient.
Compensation is strictly compensatory, not punitive. It must reflect the actual damage suffered; fault is irrelevant to quantification once liability is established (CJEU, C-667/21, Krankenversicherung Nordrhein). A separate injunction under national law cannot reduce damages (CJEU, C-655/23, Quirin Privatbank).
The GDPR does not harmonise methods of calculation. Quantification remains subject to national procedural autonomy, constrained by equivalence and effectiveness, and Article 83 fine criteria cannot be transposed to Article 82 (CJEU, C-741/21, juris GmbH).
The court decisions that have most significantly shaped the framework for privacy litigation before the courts (ie, to be distinguished from the CNIL’s GDPR enforcement cases through administrative proceedings) in recent years are described below.
Article 82 GDPR – Conditions and Nature of Compensation
Structure of Liability and Exoneration
GDPR Breach and Unfair Competition
CJEU, C-21/23, Lindenapotheke (4 October 2024): The court ruled that Chapter VIII of the GDPR does not preclude national law from allowing competitors of an undertaking infringing the GDPR to bring civil proceedings against the undertaking for unfair commercial practices, where such infringement confers a competitive advantage on that undertaking.
Representation Under the GDPR
GDPR Article 80 allows individuals to mandate non-profit bodies, organisations or associations to lodge complaints and bring judicial proceedings on their behalf. Member states may also allow certain bodies to act without a mandate.
“Action de Groupe” and the 2025 Reform
France has allowed class-action style procedures (action de groupe) in specific sectors, including consumer matters and data protection. In 2025, France adopted a significant reform aimed at streamlining and expanding group actions (Law No 2025-391 of 30 April 2025). The new framework establishes a unified regime applicable across matters and is designed to facilitate claims for both cessation of unlawful conduct and compensation, subject to French procedural safeguards and typically an opt-in mechanism for claimants.
The EU Data Act
The European Data Act (Regulation (EU) 2023/2854), adopted on 13 December 2023, entered into force on 11 January 2024 and has been applicable since 12 September 2025. It sets EU-wide rules on fair access to and use of data. It focuses on data generated by connected products and related services, and on switching and interoperability for data processing services (including cloud and edge computing services).
The regulation applies to situations involving both personal and non-personal data. Where personal data is concerned, the GDPR prevails in cases of conflict, and the Data Act operates without prejudice to EU data-protection law.
Scope and key mechanisms
The regulation defines key roles, including users, data holders, data recipients, and, for cloud services, customers and providers of data processing services. It also references data intermediation services as defined in the Data Governance Act.
French Context
While the Data Act is an EU-wide regulation, France anticipated it with the “SREN” Law No 2024-449 of 21 May 2024 (Law to Secure and Regulate the Digital Space). The SREN Law establishes interoperability and portability obligations for cloud computing service providers, complementing Data Act requirements. This law addresses specific French concerns regarding data sovereignty and technological resilience.
A core principle of the EU Data Act is that, whenever personal data is involved, the GDPR continues to apply, and prevails in case of conflict. For mixed datasets (containing personal and non-personal data), the GDPR applies to the personal-data component. Where the two are technically or functionally inseparable, GDPR obligations cannot be bypassed and, in practice, the dataset must be handled under GDPR-compliant safeguards.
This interaction creates specific legal constraints. For example, a data holder may make personal data generated by a connected product available to a user other than the data subject, or to a third party, only where a valid GDPR legal basis exists.
Access and Sharing of Connected Product Data
The Data Act gives users of connected products and related services the right to access and share certain “readily available” product and related service data, including necessary metadata (Articles 3–7).
Accordingly, data holders must notably make readily available data accessible to the user without undue delay, free of charge and, where applicable, continuously and in real time, and must share such data with third parties at the user’s request.
B2B Fairness and Cloud Switching
The Data Act controls unfair terms in B2B data-sharing contracts where clauses are unilaterally imposed by one party, and it regulates the concept of reasonable compensation in certain mandatory data-sharing scenarios, including specific safeguards in defined SME-related situations. To support implementation, the European Commission issued non-binding Model Contractual Terms and Standard Contractual Clauses in November 2025 (Chapter IV).
In relation to switching between data processing services (including cloud and edge computing services), providers must remove pre-commercial, commercial, technical and contractual obstacles to switching, include mandatory switching terms in their contracts, assist customers during migration, and progressively phase out switching charges.
Public Sector Access and Third-Country Requests
The Data Act permits public sector bodies to access privately held data only in situations of “exceptional need”, including certain emergencies, and subject to strict requirements of necessity and proportionality (Chapter V). It also establishes safeguards against unlawful access by third-country authorities to non-personal data held in the EU, with implications for contractual arrangements and cloud governance structures (Chapter VII).
Key Organisational Priorities
Organisations should map the data generated by their connected products and related services, update B2B data-sharing and cloud contracts to reflect the Data Act’s mandatory terms, and implement operational processes for handling user access requests and cloud switching.
ARCEP is the competent authority for enforcing the cloud-related provisions of the SREN Law. It has investigative powers, can settle disputes, and may impose fines of up to 3% of worldwide annual turnover (or 5% in the case of a repeat infringement within the statutory period).
Regarding the EU Data Act, a French DDADUE bill currently under parliamentary examination proposes to designate ARCEP as the competent authority for enforcement, with the exception of Chapter VII. At the time of writing, this bill has not yet been adopted.
The CNIL remains the supervisory authority for GDPR matters. For example, if a cloud-switching failure results in a personal data breach, the CNIL is competent for the GDPR aspects (eg, security and breach notification), while ARCEP may address any breach of cloud-switching obligations within its remit.
The SREN Law establishes co-ordination mechanisms – ARCEP must consult the CNIL for certain decisions concerning data intermediation services involving personal data and must refer suspected anti-competitive practices in cloud markets to the French Competition Authority. These arrangements ensure institutional co-ordination but do not create a unified tripartite regulatory regime.
Legal Framework: ePrivacy Implemented in France
In France, the placement or reading of cookies and similar trackers on a user’s terminal equipment is governed by the ePrivacy framework (Directive 2002/58/EC, Article 5(3)) as implemented by Article 82 of the FDPA. As a rule, storing or accessing information on a device requires the user’s prior informed consent, unless a narrow exemption applies (eg, trackers that are “strictly necessary” to provide a service expressly requested by the user or to enable core site functionality).
Consent Standard and Practical Design
CNIL guidance has established demanding expectations for consent interfaces: users should be able to refuse as easily as accept; consent must be specific and granular; and withdrawal must be possible at any time with an effect comparable to refusal. Organisations must also be able to evidence consent, document retention periods, and manage third‑party trackers through contract and technical controls (eg, a consent management platform, tag governance, and auditing).
Cookie Walls and Conditional Access
Following French and EU case law developments, cookie walls are not automatically unlawful, but they raise a high bar: the user must have a genuine, free choice and must not be coerced into consenting, taking into account the service’s market position and the availability of alternatives. In practice, many services in France avoid blanket “take it or leave it” consent models and implement alternatives (eg, contextual advertisements, paid access, or reduced tracking options), coupled with clear information.
Enforcement Exposure
Cookies and tracking remain one of the CNIL’s highest enforcement priorities, including for foreign platforms targeting French users. This means that even technically “minor” consent design issues (lack of prominence symmetry between options, unclear purposes, or continued reading of cookies after withdrawal) can lead to significant regulatory risk.
Personalised or targeted advertising is regulated through a combination of the GDPR (eg, legal basis and restriction on profiling where it is associated with automated decision-making), the ePrivacy Directive (consent for trackers and electronic marketing), and the new DSA/SREN Law transparency rules.
Profiling Restrictions
Article 22 of the GDPR strictly regulates decisions based solely on automated processing, particularly profiling, when they produce legal effects or have similar significant impacts on an individual. As a rule, every person has the right not to be subject to such automated decisions. Exceptions apply only when the decision is necessary for a contract, authorised by EU or member state law, or based on the individual’s explicit consent. In these cases, the controller must implement safeguards, including human intervention, the possibility to present one’s viewpoint, and the right to contest the decision. Moreover, automated decisions cannot rely on sensitive data unless very specific legal conditions are met, and reinforced protective measures are in place.
In addition, the SREN Law and DSA prohibit the presentation of advertisements based on profiling using special categories of personal data (eg, political opinions, health). Targeting minors with profiling-based ads is also strictly banned.
Transparency
Under the DSA, online platforms must ensure advertisements are clearly identifiable and provide, directly from the advert itself, meaningful information about the main parameters used to determine why the recipient is being shown the ad, and where applicable, information on how to change those parameters.
Direct Marketing
Under Article L 34-5 of the CPCE, electronic marketing (email, SMS) requires prior opt-in consent, unless the recipient is an existing customer for similar products/services or is a professional (B2B) being contacted on a work address.
In France, there is no single “employment privacy act”. Employment privacy is governed by the GDPR and the FDPA, overseen by the CNIL. Importantly, it is also framed by French Labour Code rules (notably proportionality and information/consultation requirements). For instance, the Labour Code expressly states that no personal information may be collected through a device that has not been brought to the employee’s attention beforehand, and the employees’ representative body (comité social et économique – CSE) must be informed and consulted in advance on employee monitoring tools and methods.
Key Themes Related to Privacy in the Workplace
Employee monitoring
Monitoring (eg, time tracking, CCTV, IT logs, geolocation) must be necessary, proportionate and disclosed. For vehicle geolocation, the CNIL indicates that the employer should not track outside working time and information collected should not be used to calculate working time when another method already exists. A CNIL fine published on 4 February 2025 (“inactivity” scoring, frequent screenshots, permanent filming) shows where monitoring crosses the line, and high-risk/constant monitoring may trigger a DPIA.
Remote work
Remote work is defined in Labour Code Article L1222-9 and is usually set by a collective agreement or, failing that, an employer charter after consulting the CSE; it may also be agreed individually. The CNIL highlights reinforced security for telework and configuring tools to minimise data collected.
Communication tools at work
The CNIL notes that employers may control internet and email use for security and policy enforcement but must set rules and inform employees (often via an IT charter). For bring your own device (BYOD), the CNIL recommends separating professional from private spaces; remote wiping should be limited to the professional space, not the whole personal device.
Recruitment
The CNIL’s guide requires collecting only data that is strictly necessary to assess candidates, providing GDPR notices, and defining retention. The CNIL indicates that an unsuccessful candidate’s file may be kept for a short period (eg, up to about three months) to manage feedback/contestation, and then up to two years after the last contact only if the candidate is clearly informed and has agreed or can easily object. Background checks must be job-relevant – a criminal record extract (Bulletin No 3) may be requested only in limited, justified cases and should be consulted but not retained; where an authority grants approval, that approval generally suffices.
Under the EU GDPR and the FDPA, there is no “M&A carve-out”: any disclosure, transfer or reuse of personal data in the course of a transaction must remain lawful, transparent, proportionate and secure.
France adheres to Chapter V of the GDPR regarding international transfers of personal data. Transfers of personal data outside the EEA are prohibited unless the destination country ensures an adequate level of protection or the transfer is otherwise covered by appropriate safeguards or a derogation.
No General Prior Approval for GDPR Transfers
France does not impose a general government notification or approval requirement for transfers of personal data outside France or the EEA beyond the GDPR mechanisms (adequacy, Article 46 safeguards, or Article 49 derogations). In practice, organisations are expected to self-assess and document compliance, and the CNIL may review transfer arrangements in audits or investigations.
Sector-Specific or Secrecy-Driven Constraints
Certain categories of information (eg, defence secrets, classified information, certain regulated critical infrastructure data, professional secrecy such as legal privilege or banking secrecy) may be subject to separate restrictions with regard to disclosure abroad, which operate independently from GDPR transfer rules. These restrictions are case-specific and typically arise from national security law, public law, criminal law or professional regulations rather than from data protection statutes as such.
Non-Personal Data: EU Data Act Context
The EU Data Act (Article 32) sets safeguards against third-country government access to or transfer of non-personal data held in the EU by providers of data processing services. Providers must take adequate technical, organisational and legal (including contractual) measures to prevent such access/transfer where it conflicts with EU or member state law. Third-country decisions requiring access/transfer are enforceable only if based on a relevant international agreement (eg, a mutual legal assistance treaty or MLAT); otherwise, and where compliance would risk such a conflict, access/transfer may occur only under specified procedural conditions, with data minimisation and (generally) customer notification.
No General Localisation Rule
French law does not impose a broad requirement to store personal data in France. Under the GDPR, localisation may be relevant as a risk mitigation measure (eg, to limit international transfers) but is not mandated as such.
Sectoral and Contractual Localisation Drivers
France enforces data localisation for specific strategic sectors.
Health data
Electronic health data must be hosted by a certified health data hosting (hébergeurs de données de santé or HDS) provider. While the law allows hosting within the EEA, there is a strong policy push for hosting in France or on sovereign cloud infrastructures.
SecNumCloud
The “Cloud de Confiance” doctrine promotes the use of cloud services qualified by ANSSI (SecNumCloud) for public administrations and essential operators (Opérateur d'Importance Vitale/Opérateur de Services Essentiels – OIV/OSE). This qualification requires immunity from extraterritorial laws (such as the US CLOUD Act), effectively requiring data to be stored and processed within the EU by entities that are not subject to non-EU jurisdiction.
France has a long-standing blocking statute (Law No 68-678 of 26 July 1968, as amended). Subject to international treaties or agreements, it restricts the communication of certain economic information to foreign public authorities and, more broadly, the gathering/communication of such information for use as evidence in foreign judicial or administrative proceedings.
Covered Information
The statute applies to documents or information of an economic, commercial, industrial, financial, or technical nature where disclosure is likely to harm France’s sovereignty, security, essential economic interests, or public order.
Personal data is not singled out as a standalone category in the statute, but datasets that include personal data (eg, customer files) can be treated as covered where they qualify as economic, commercial or technical information and meet the “harm” threshold.
Enforcement Mechanism (SISSE – Since 1 April 2022)
Since 1 April 2022, a decree has designated the Strategic Information and Economic Security Service (SISSE) as the “single point of contact” for notifications. Organisations receiving (i) a request from a foreign public authority to communicate potentially covered information, or (ii) a request relating to the constitution of evidence for foreign proceedings, must notify SISSE without delay and may obtain an administrative opinion on whether Articles 1 and/or 1 bis apply.
Criminal Penalties
Breaches of Articles 1 or 1 bis of the law are criminal offences punishable by up to six months’ imprisonment and a EUR18,000 fine (for individuals). For legal entities, the maximum fine is multiplied by five (ie, EUR90,000).
Exceptions/Lawful Disclosure Routes
The statute allows disclosure where an applicable international treaty or agreement provides for it (eg, co-operation/assistance mechanisms). Article 1 bis also operates subject to applicable laws and regulations. There is no general carve-out simply because the recipient is in the EU or because information is “publicly available” (these factors may be relevant factually case by case, but they are not stated as blanket exceptions in the statute).
Interaction With Data Protection
The blocking statute is distinct from the GDPR, but the two can converge in cross-border evidence collection or regulatory investigations that involve personal data. In such contexts, organisations in France need to reconcile multiple constraints: GDPR Chapter V transfer rules, the blocking statute (including SISSE notification/opinion process), professional secrecy (eg, legal privilege), trade secret protection, and sector-specific secrecy obligations (eg, banking secrecy).
EU-US Data Privacy Framework (DPF)
In September 2025, the General Court of the EU (Latombe case) upheld the validity of the DPF, rejecting a challenge by a French parliamentarian. This provides renewed stability for transfers to certified US companies, although privacy advocates remain vigilant.
CNIL International Strategy 2025–2028
The CNIL has adopted a strategy aimed at strengthening European co-operation (including more efficient handling of cross-border cases) and promoting high data protection standards internationally through what it refers to as “data diplomacy”.
New Adequacy Developments – Brazil
In January 2026, the EU and Brazil concluded adequacy arrangements intended to facilitate secure and free data flow between the EU and Brazil. This reduces the need for SCCs or BCRs for transfers covered by the decision.
Anticipated Developments
UK adequacy decision
The EU adequacy decisions for the UK were renewed in December 2025 for a further six-year period (until 27 December 2031), subject to a “sunset clause” and ongoing monitoring. While transfers may continue without additional safeguards, divergence between UK and EU data protection law remains a structural risk that organisations should monitor.
Digital sovereignty and sector-specific localisation trends
Discussions on digital sovereignty and strategic autonomy continue at EU and member state level. While the GDPR does not impose a general data residency requirement, sector-specific rules, certification schemes, or public procurement conditions may increasingly influence expectations regarding localisation or EU-controlled processing in sensitive sectors. At present, however, this remains policy-driven and context-dependent rather than a generalised legal obligation under Chapter V of the GDPR.
49 avenue de l’Opéra
75002 Paris
France
+33 (0)1 43 18 55 00
contact@nomosparis.com
www.nomosparis.com