Data Protection & Privacy 2025

Last Updated March 11, 2025

France

Law and Practice

Authors



Jeantet has been one of the leading independent French corporate law firms since 1924, delivering customised, high value-added services and committed to ethics and human values. The firm regularly handles complex cross-border IT, data protection and cybersecurity issues for international companies. It acts on behalf of both IT service providers (publishers, IaaS, PaaS and SaaS providers, etc) and their clients (banking, insurance, industry, tourism or retail) at all stages of IT projects: choice of architecture, and negotiation and drafting of contracts from the simplest to the most complex (outsourcing, maintenance, integration, ERP, migration, cloud services, etc). It has broad experience in IT disputes, especially during court-ordered expert proceedings, and offers a fully integrated external DPO service.

In France, data protection and privacy laws are primarily governed by the General Data Protection Regulation (the “GDPR”) and the French Data Protection Act No. 78-17 of 6 January 1978 as amended by Act No 2018-493 of 20 June 2018 and Ordinance No 2018-1125 of 12 December 2018 (the “FDPA”), hereinafter together “data protection and privacy laws”.

The GDPR as the Baseline: The GDPR serves as the primary legal framework for data protection across the EU, including France. It aims to harmonise data protection standards, ensuring a consistent approach to individual rights and data protection across member states.

The FDPA as a Complement: While the GDPR establishes a baseline for data protection, each EU member state has the option to introduce additional provisions through their national laws. In France, the FDPA provides extra protections in specific contexts, addressing particular national concerns or local customs.

For example, the FDPA includes specific provisions that focus on certain areas not fully addressed by the GDPR, such as genetic and biometric data usage, health data, and data relating to criminal convictions or offences.

In conclusion, France’s data protection and privacy landscape is characterised by the interplay between the GDPR and the FDPA. The GDPR provides a harmonised framework that sets minimum standards for the protection of personal data across the EU, while the FDPA introduces specific adaptations to address national needs and contexts.

In France, the National Commission for Information Technology and Civil Liberties (“CNIL”) is the primary regulator for data protection and privacy. However, other relevant regulators in the broader context of information technology and digital services include the French Information Systems Security Agency (“ANSSI”) and the French Regulatory Authority for Electronic Communications, Postal Services and Press Distribution (“ARCEP”). Furthermore, the French Prudential Supervision and Resolution Authority (“ACPR”), which regulates financial institutions, will take into account whether a regulated entity is meeting data privacy standards.

The CNIL

The CNIL is responsible for overseeing and enforcing data protection and privacy laws. The CNIL’s main functions are the following:

  • Supervision: Monitors compliance with data protection and privacy laws by organisations operating in France.
  • Guidance: Provides advice and guidelines to businesses, public authorities, and individuals on data protection issues.
  • Complaints: Handles complaints from individuals regarding data protection violations.
  • Sanctions: Has the authority to conduct audits and investigations and impose sanctions such as fines for non-compliance with data protection regulations.
  • Public awareness: Promotes understanding of data protection rights and responsibilities among the public.

The CNIL conducts assessments and audits of organisations to ensure compliance with data protection and privacy laws. It issues guidelines and recommendations to clarify legal requirements and best practices, and actively engages with businesses to help them implement robust data protection measures, offering tools and resources. Regarding enforcement, the CNIL can issue warnings, impose corrective measures, and levy fines for serious breaches of data protection and privacy laws.

The ANSSI

The ANSSI is dedicated to enhancing cybersecurity within France. Its responsibilities include protecting networks and offering guidance to improve cybersecurity resilience in the private sector. The agency develops and implements national cybersecurity strategies, provides support during cyber incidents, and certifies secure products. Certain regulated entities must report major cybersecurity incidents to the ANSSI. Additionally, where a security incident involves personal data, notification to the ANSSI is combined with any report to the CNIL.

The ARCEP

The ARCEP regulates electronic communications, postal services, and press distribution in France and ensures that communications networks operate fairly and efficiently. While not focused on data protection per se, the ARCEP can intersect with privacy issues in the communications sector.

The ACPR

The ACPR supervises and regulates France’s banking and insurance sectors. It also plays a key role in resolving failing financial institutions to minimise systemic risk. In essence, the ACPR acts as a guardian of the French financial system’s health and integrity. Major operational or security incidents affecting the information systems of financial entities must be notified to the ACPR (as well as to the CNIL if such security incidents involve the violation of personal data).

These regulators work through a combination of rule-making, guidance publication, compliance assessment, and enforcement activities. The CNIL, in particular, plays a pivotal role in shaping the data protection landscape in France by engaging with stakeholders, responding to technological advancements, and maintaining a balance between privacy rights and innovation.

In France, administrative proceedings relating to data protection and privacy are primarily conducted by the CNIL.

Initiation of Administrative Proceedings

Complaint Submission: Individuals can file complaints with CNIL if they believe their data protection rights have been violated. Complaints can be submitted via CNIL’s website.

Investigations: CNIL can also initiate proceedings on its own initiative without a complaint if it suspects a breach of data protection and privacy laws. This often occurs in response to reported incidents.

Conducting Administrative Proceedings

Investigation process: CNIL can conduct on-site inspections or online investigations of the organisation in question. It may require the organisation to provide relevant documents and information related to the data processing activities.

Cooperation with the organisation: Organisations are expected to cooperate with CNIL during the investigation. CNIL may issue recommendations for compliance before moving to sanctions.

Reporting findings: After the investigation, CNIL compiles its findings and notifies the organisation of any violations found, allowing the entity to respond or rectify issues before formal sanctions are imposed.

Decision-making: CNIL issues formal decisions based on its findings, determining whether a violation occurred and what sanctions (if any) are appropriate.

Calculation of Administrative Fines

  • Criteria for fines: When calculating administrative fines under the FDPA, the CNIL considers various factors, such as:
    1. the nature, severity and duration of the violation;
    2. the number of affected individuals;
    3. the intention or negligence behind the violation;
    4. the categories of personal data involved;
    5. the previous compliance history of the organisation; as well as
    6. the degree of cooperation with CNIL during the investigation.
  • Proportionality and fairness: The CNIL also takes the organisation’s turnover into account so that any fine remains proportionate and dissuasive.
  • Fine limits: The CNIL applies the ceiling set by the GDPR (up to EUR20 million or 4% of total worldwide annual turnover of the preceding financial year, whichever is higher), and sets the actual amount within that ceiling based on the specific context of the violation (a short worked sketch follows this list).
  • Publication of sanctions: Decisions, including fines, are generally published (in an annual report or online), but the specifics of the organisation involved may sometimes be anonymised to protect confidentiality.

This administrative approach underscores the CNIL’s commitment to enforcing data protection and privacy laws effectively while giving organisations an opportunity to come into compliance before facing punitive measures (although nothing prevents the CNIL from imposing sanctions directly).
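
As a purely arithmetical illustration of the fine ceiling described above, the short sketch below (in TypeScript, with invented turnover figures) computes the statutory maximum; the fine actually imposed by the CNIL within that ceiling depends on the contextual criteria listed earlier.

```typescript
// Illustrative only: the GDPR ceiling for the most serious infringements is
// the higher of EUR20 million and 4% of total worldwide annual turnover of
// the preceding financial year. All turnover figures below are invented.

const STATUTORY_CAP_EUR = 20_000_000;
const TURNOVER_RATE = 0.04;

// Maximum fine available; the CNIL sets the actual amount within this ceiling.
function maxFineEur(worldwideAnnualTurnoverEur: number): number {
  return Math.max(STATUTORY_CAP_EUR, TURNOVER_RATE * worldwideAnnualTurnoverEur);
}

console.log(maxFineEur(2_000_000_000)); // 80000000: 4% (EUR80m) exceeds EUR20m
console.log(maxFineEur(100_000_000));   // 20000000: 4% (EUR4m) is below the EUR20m floor
```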

The CNIL has undertaken several notable administrative proceedings in recent years, reflecting its commitment to enforcing data protection and privacy laws and ensuring compliance with them. Here are some of the most significant cases.

Orange (2024)

Background: In November 2024, Orange was fined EUR50 million.

Findings: The company was sanctioned for displaying advertisements in its users’ emails without their consent.

Amazon France Logistique (2023)

Background: In December 2023, Amazon France Logistique was fined EUR32 million.

Findings: CNIL identified violations concerning an excessively intrusive system for monitoring employee activity and performance.

Yahoo! (2023)

Background: In December 2023, Yahoo! received a EUR10 million fine.

Findings: The company was sanctioned for failing to respect web users’ choice to refuse cookies on its “Yahoo.com” site and for failing to allow users of its “Yahoo! Mail” messaging service to freely withdraw their consent to cookies.

Criteo (2023)

Background: In June 2023, Criteo was fined EUR40 million.

Findings: The company was sanctioned for infractions related to processing personal data for targeted advertising without valid user consent.

Discord (2022)

Background: In November 2022, CNIL sanctioned Discord with a EUR800,000 fine.

Findings: The company was penalised for several breaches, including issues with user data retention and password security.

TikTok (2022)

Background: In December 2022, CNIL fined TikTok EUR5 million for cookie-related violations.

Findings: The investigation revealed that TikTok did not allow users to refuse cookies as easily as they could accept them, thus violating French data protection law.

These cases demonstrate CNIL’s ongoing vigilance in enforcing data protection and privacy laws, particularly paying attention to large tech companies and large-scale data processing practices.

Recent developments regarding artificial intelligence (AI) regulation in France reflect ongoing efforts to ensure that AI technologies are used responsibly and ethically, particularly regarding data protection.

EU Artificial Intelligence Act (the “AI Act”) Proposal and Adoption

As part of the European Union’s broader approach to AI regulation, France has supported the AI Act. This legislation classifies AI systems based on risk levels (eg, minimal, limited, high and unacceptable) and imposes stricter requirements for high-risk applications, particularly those that process personal data. The AI Act was published on 12 July 2024, and its implementation will take place in stages until 2 August 2027.

The CNIL’s Involvement

The CNIL is actively participating in the work currently being carried out by the European Data Protection Board (EDPB) on the relationship between the rules applicable to the protection of personal data and the AI Act. The aim of this work is to clarify how the two frameworks interact and to enable a harmonised interpretation between the CNIL and its European counterparts. Moreover, the CNIL has published several guidelines and recommendations on the interplay between data protection and privacy laws and the AI Act.

Integration of AI Systems

The integration of AI systems in various sectors requires firms to ensure compliance with data protection and privacy laws. Organisations utilising AI must assess how their technologies affect the processing of personal data and implement measures to remain compliant, such as those outlined below.

  • Data minimisation: AI systems must adhere to the principle of data minimisation, meaning that only data necessary for the specified purpose should be collected and processed.
  • Transparency: Organisations using AI must clearly inform individuals about how their data is being used by AI systems, including the purposes of processing, data retention periods, and the logic behind automated decision-making.
  • Rights of data subjects: Individuals have the right to access their data, request corrections, object to data processing, and seek the right to be forgotten. These rights protect individuals from potential harms associated with AI systems, such as biased decision-making.
  • Algorithmic accountability: The proposed regulations emphasise accountability mechanisms for AI systems, requiring organisations to ensure that their algorithms are fair and transparent and do not perpetuate bias. This includes regular audits of AI systems to evaluate data handling practices and algorithm performance.
  • Security: The security of AI systems remains an obligation in order to guarantee data protection both during system development and in anticipation of deployment (eg, security measures covering the training data as well as the development and operation of the AI system).
  • Impact assessments: For high-risk AI systems, conducting Data Protection Impact Assessments (DPIAs) is required to evaluate the risks to personal data and the measures needed to mitigate those risks.

Regulations

The evolving regulation of AI in France reflects a balance between fostering innovation and ensuring robust data protection. As AI systems increasingly permeate various sectors, the emphasis on ethical considerations, transparency, and individual rights translates into strong safeguards designed to protect personal data. Thus, organisations must navigate these regulatory landscapes carefully to leverage AI technologies while complying with data protection and privacy laws.

The AI Act significantly impacts data protection in France by integrating principles of privacy and ethical governance into the development and deployment of AI technologies.

Alignment with Data Protection Principles

The AI Act emphasises key principles of data protection, including transparency, accountability, and data minimisation. These principles ensure that AI systems processing personal data are designed and deployed in ways that respect individuals’ rights.

Stricter Requirements for High-Risk Applications

Under the AI Act, high-risk AI applications (eg, in sectors like healthcare and finance) face rigorous requirements, including the necessity for risk assessments.

Enhanced Transparency Obligations

AI systems must provide clear and understandable information to users regarding how personal data is being processed. This aligns with the transparency obligations of data protection and privacy laws, reinforcing users’ rights to be informed about data usage.

Protection Against Bias and Discrimination

AI regulations compel developers to consider fairness and non-discrimination. This is particularly important in the context of data protection and privacy laws, which require adherence to principles of equality and non-discrimination, minimising the risk of biased algorithms adversely affecting individuals.

National Governance

The CNIL is fully committed to giving legal certainty to companies innovating in AI as they apply data protection and privacy laws, and to promoting AI that respects people’s rights over their data.

Moreover, high-risk AI systems already subject to sector-specific regulation will continue to be regulated by competent national regulators, such as the “National Agency for the Safety of Medicines and Health Products” (ANSM) for medical devices.

Data Protection and Privacy under the AI Act

In addition, the CNIL considers that the AI Act can supplement and, on certain well-defined points, take over from data protection and privacy laws, such as the following:

  • the AI Act replaces certain data protection and privacy rules for the use by law enforcement agencies of real-time remote biometric identification in publicly accessible spaces, which it permits only in very exceptional circumstances and under strict conditions (article 5);
  • it exceptionally allows the processing of sensitive data to detect and correct potential biases that could cause harm, if strictly necessary and subject to appropriate safeguards (article 10);
  • it allows the re-use of personal data, including sensitive data, within the “regulatory sandboxes” framework. These sandboxes are intended to facilitate the development of systems of significant public interest (such as improving the healthcare system) and are placed under the supervision of a dedicated authority which must first consult the CNIL and verify compliance with a certain number of requirements (article 59).

The regulation of AI in France and its impact on data protection reflect a comprehensive approach aimed at safeguarding individual rights while fostering innovation. The interplay between data protection and privacy laws and the AI Act demonstrates a cohesive framework that addresses the complexities of AI technology. Together, these laws ensure that the development and deployment of AI systems are conducted ethically, transparently, and in alignment with data protection principles, promoting trust in AI technologies across society.

Recent Trends in Privacy Litigation in France

There has been a rise in privacy-related lawsuits in France, largely driven by heightened awareness of data protection rights following the implementation of data protection and privacy laws.

Many cases relate to data breaches, unauthorised processing of personal information, or mishandling of a request to exercise rights (particularly the right to access). Individuals are increasingly seeking compensation for damages resulting from breaches, specifically emphasising organisations’ accountability to protect personal data.

There has been a notable trend of claims for non-material damage caused by data privacy violations. Plaintiffs are leveraging the provisions on damages in data protection and privacy laws to pursue financial compensation, reflecting a shift in how privacy violations are perceived and litigated.

The CNIL plays a crucial role in shaping privacy litigation. It not only enforces compliance through fines and investigations but also provides guidance that can influence suits brought before courts. Injunctions and sanctions issued by CNIL can lay the groundwork for subsequent legal actions.

As the use of AI and digital technologies proliferates, litigation surrounding challenges such as algorithmic bias, data subject rights in automated decision-making, and transparency requirements is likely to increase.

Impact of Supranational/International Developments on Domestic Litigation

Decisions of the CJEU regarding data protection and privacy, such as those on the validity of the Privacy Shield, the interpretation of fundamental rights under EU law or, more recently, compensation for non-material damage in the event of a breach of data protection and privacy laws, may have direct implications for domestic litigation. Such rulings inform courts in France and help shape legal interpretations concerning privacy matters.

Influence of the European Data Protection Board (EDPB): the EDPB can adopt opinions to ensure consistent application of the GDPR and binding decisions to settle disputes referred to it between EU supervisory authorities.

The European Commission also has a key role to play, especially regarding personal data transfers outside the EU, international cooperation and the EDPB’s missions.

France’s obligations under international treaties, like the European Convention on Human Rights (ECHR), which guarantees the right to respect for private and family life, influence the judicial landscape around privacy litigation. French courts often consider these obligations when adjudicating privacy-related cases.

The activities of major technology companies, particularly concerning the handling of personal data, have sparked litigation not only in France but also across Europe and globally. Developments and regulatory measures taken against these companies (eg, fines or compliance orders) can create a ripple effect domestically, prompting similar litigation.

CJEU Case Law on Article 82 of the GDPR

Article 82 of the GDPR allows individuals to claim damages (both material and non-material) from organisations violating their data protection rights. Recent CJEU rulings clarify that compensation is required for all damages resulting from GDPR infringements. To successfully claim damages under Article 82, three conditions must be met: a fault by the organisation, demonstrable damage to the individual, and a causal link between the fault and the damage. French courts apply this principle, consistent with the French civil code, requiring justification for the claimed damages even when a GDPR violation is established.

CJEU Case Law on the Impact of a GDPR Breach on Unfair Commercial Practices

The CJEU recently ruled that violations of the GDPR can also be considered unfair commercial practices under national laws. This means competitors can sue companies for GDPR violations where those violations confer an unfair competitive advantage. The CJEU’s decision supports existing French case law, in which courts have already considered non-compliance with the GDPR to be a form of unfair competition.

CJEU Case Law on the Retention of Metadata and Login Data

The CJEU ruled that French legislation allowing generalised and indiscriminate storage of traffic and location data by internet service providers violated EU law. The CJEU held that such data retention is only permissible for serious crimes and requires prior authorisation from a court or independent authority, except in emergencies. Subsequently, the French Criminal Procedure Code (article 60-1) and court decisions have been aligned with this CJEU ruling.

The Representative Actions Directive (EU Directive 2020/1828) aims to create a harmonised framework across EU member states for collective redress, particularly in consumer protection cases. France missed the transposition deadline of 25 December 2022 but is now in the process of implementing the Directive into its national legislation. The objective is to enhance the existing framework for collective redress, making it more accessible and aligning it with the provisions set forth by the Directive.

The French System for Collective Redress

The French legislator wanted to avoid the excesses attributed to the American “class action”. As such, it has opted for several restrictions that significantly reduce the effectiveness of class actions in France.

France does not have one single, unified system. In fact, there are six separate class action regimes, including one for personal data.

In addition to this reduced scope of application, the vast majority of class actions do not offer the possibility of claiming compensation for the entirety of damages. For example, Article L. 623-2 of the French Consumer Code states that a consumer class action can only seek compensation for economic loss resulting from material damage suffered by consumers. To benefit from a class action (an opt-in system), claimants must come forward within a set period.

Only consumer associations can proceed with collective redress related to personal data.

As France moves to implement the Representative Actions Directive, the landscape for collective redress is likely to become more accessible and structured, enabling consumers to pursue their rights effectively.

The Data Act, which came into force on 11 January 2024 and will be applicable from 12 September 2025, aims to govern access to and sharing of data, particularly in the Internet of Things (IoT) context. Its primary objective is to create a fairer and more competitive single market for data within the European Union.

The Data Act grants users the right to access data generated by their connected products, which means that manufacturers must provide users with easy and direct access to this data in an interoperable and reusable format. This right applies in particular to data relating to the product’s operation and performance, often crucial for maintenance and repair.

Significant emphasis is placed on data interoperability. Manufacturers are required to design compatible products that allow users to easily transfer data to other services or platforms. This promotes competition and the creation of a more open ecosystem.

In certain cases, manufacturers may be required to share the data they collect with third parties, particularly for reasons of competition, innovation, or public interest.

The Data Act also addresses the issue of industrial data sharing, which is particularly important for industrial IoT. It aims to encourage the sharing of this data to promote innovation and the competitiveness of European businesses.

In short, the Data Act represents a significant change in the regulation of data generated by connected objects. It aims to empower users, encourage competition and innovation, and create a fairer and more open data ecosystem. Its impact on the IoT will be substantial in the medium term, requiring significant adaptations from manufacturers and application developers.

Moreover, the Cybersecurity Act and the Cyber Resilience Act (which will be applicable, in part, in December 2027), both pieces of EU legislation, have significant implications for the IoT. The Cybersecurity Act establishes a framework for managing cybersecurity risks at the infrastructure level, indirectly affecting the IoT, while the Cyber Resilience Act directly addresses the security of IoT products themselves.

The interplay between data regulation on IoT services, such as the Data Act, the Cybersecurity Act, and the Cyber Resilience Act, and data protection requirements in France is complex. It is primarily shaped by data protection and privacy laws (where IoT systems process personal data) and may be complemented by French national laws in the future.

GDPR’s Broad Applicability

The GDPR is the cornerstone. It applies to processing of personal data by organisations established in the EU and to processing that targets or monitors individuals in the EU, regardless of the organisation’s location. This means IoT services operating in or targeting French users fall under the GDPR’s scope, wherever the service provider is based.

French National Laws

While the GDPR sets the baseline, France may adopt specific national laws that further detail or specify certain aspects of data protection within the IoT context. To this end, France has already enacted Law No 2024-449 of 21 May 2024 aimed at securing and regulating the digital space, in anticipation of the Data Act’s obligations. For example, the law sets out interoperability, portability and functional equivalence obligations for cloud computing service providers.

In France, the use of IoT entails specific obligations for organisations operating in this field. Thus, some key aspects of data protection relevant to IoT can be highlighted.

Data minimisation and purpose limitation: IoT devices often collect vast amounts of personal data. Data protection and privacy laws mandate that only necessary data be collected for specified, explicit, and legitimate purposes. This requires careful design and implementation of IoT systems to avoid excessive data collection.

Consent: For IoT devices, obtaining meaningful consent where appropriate can be challenging due to the complexity of the technology and the variety of data collected. This often requires clear, accessible privacy policies and user-friendly consent mechanisms.

Data security: Data protection and privacy laws demand appropriate technical and organisational measures to ensure the security of personal data. IoT devices are often vulnerable to breaches. Hence, strong security protocols, regular updates, and robust incident response plans are crucial.

Data subject rights: Individuals have rights under data protection and privacy laws, including the right to access, rectify, erase, restrict processing, and data portability. IoT service providers must implement mechanisms to allow users to exercise these rights effectively.

Accountability: Data protection and privacy laws place a significant emphasis on accountability. Organisations must be able to demonstrate compliance with the regulation. For IoT, this translates into maintaining detailed records of data processing activities, conducting data protection impact assessments (DPIAs) where appropriate, and implementing appropriate data governance structures.

Data security and privacy by design: Integrating data protection into the design and development process (Privacy by Design) from the outset is paramount. This requires a multidisciplinary approach involving engineers, data scientists, legal experts, and ethics specialists.

Cross-border data transfers: If IoT data is transferred outside the EU, compliance with data transfer mechanisms (eg, standard contractual clauses and binding corporate rules) is necessary.

The regulatory landscape governing IoT services and data processing in France establishes stringent obligations to ensure the protection of personal data, security of IoT devices, and compliance with relevant laws. Organisations must navigate these obligations carefully, implementing necessary compliance measures, enhancing transparency, and prioritising user rights. Failing to comply can result in significant legal and financial repercussions, emphasising the importance of robust data governance in the context of IoT.

In France, several key bodies could be responsible for specifically enforcing data regulation concerning IoT providers, data holders, and data processing services.

The CNIL

This regulator is the primary personal data protection authority in France, overseeing compliance with data protection laws, including those relevant to IoT devices and services. In this regard, the CNIL has published several IoT-related articles and privacy impact assessment (PIA) materials concerning the IoT.

The ARCEP

This authority is responsible for regulating telecommunications operators, including those providing IoT services related to communication networks, and for ensuring compliance with privacy and data protection standards. In this regard, the ARCEP has set up “IoT workshops” to deepen its understanding of IoT services.

The ANSSI

Focusing on cybersecurity, ANSSI ensures that IoT devices and systems are secure and comply with national security standards, protecting data integrity. In this regard, the ANSSI published specific guidelines on the security of the IoT.

Summary

Data regulation related to IoT in France is enforced by the CNIL, where personal data are processed through IoT services. Furthermore, sector-specific regulators like the ARCEP and the ANSSI also play vital roles in overseeing compliance in their respective domains. To date, there is no official regulator in France specifically dedicated to IoT. 

In France, the use of cookies is mainly governed by the FDPA, the ePrivacy Directive (often referred to as the “Cookie Law”, which has been transposed into the FDPA), and specific guidelines on cookies issued by the CNIL.

Consent Requirements

Websites must obtain explicit consent from web users before placing cookies on their devices, except for cookies that are strictly necessary for the website’s functioning. This means that web users must be presented with a clear and affirmative option to accept cookies (opt-in).

Web users must be provided with clear information about the types of cookies used, their purposes, and how long they are stored.

Cookies

Essential cookies are necessary for the website to function correctly (eg, for session management and shopping carts). Consent is not required for these cookies.

Non-essential cookies include cookies that track user behaviour for analytics, advertising, and marketing purposes. Consent is required before using these types of cookies.

For third-party cookies (eg, those from advertisers), the website must obtain web user consent for its own cookies and any cookies placed by third parties.

Websites must also display a cookie banner or pop-up that informs users about cookie usage upon their first visit. This banner should include:

  • clear options for users to accept or reject cookies; and
  • a link to a detailed cookie policy that explains what cookies are used, their purposes, and how users can manage their preferences.

Users must be able to withdraw their consent easily at any time. This should be straightforward and accessible, similar to the process of providing consent.

Organisations must keep records of user consent and cookie preferences to demonstrate compliance with cookie regulations. Furthermore, organisations are encouraged to regularly review their use of cookies, ensuring that consent mechanisms function correctly and that users are informed of any changes in cookie policies.

Websites must provide the web user with a cookie policy that clearly outlines:

  • the types of cookies used (eg, first-party vs. third-party cookies);
  • the purpose of each cookie (eg, functionality, performance, marketing);
  • how long cookies will be stored on the web user’s device; and
  • how web users can manage their cookie preferences, including how to delete cookies or opt out.

In summary, the requirements for using cookies in France emphasise the necessity of obtaining informed and explicit consent from web users, providing transparency about cookie usage, and ensuring that web users can easily manage their cookie preferences. Thus, organisations operating websites within France need to comply with these requirements.
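
To make these requirements concrete, the following is a minimal sketch, in TypeScript, of how a website might gate non-essential cookies behind an opt-in choice and make withdrawal as easy as acceptance. All helper functions declared below are hypothetical placeholders, and operators would typically rely on an established consent management platform rather than hand-rolled code.

```typescript
// Minimal opt-in cookie consent flow. Non-essential cookies are set only
// after an affirmative user choice, and consent can be withdrawn as easily
// as it was given. Helpers declared here are hypothetical placeholders.

declare function showCookieBanner(handlers: { onAcceptAll: () => void; onRejectAll: () => void }): void; // hypothetical UI helper
declare function loadAnalyticsScripts(): void;      // hypothetical loader for analytics cookies
declare function loadAdvertisingTags(): void;       // hypothetical loader for advertising cookies
declare function deleteNonEssentialCookies(): void; // hypothetical cleanup helper

type ConsentState = { analytics: boolean; advertising: boolean; timestamp: string };

const CONSENT_KEY = "cookie-consent";

function loadConsent(): ConsentState | null {
  const raw = localStorage.getItem(CONSENT_KEY);
  return raw ? (JSON.parse(raw) as ConsentState) : null;
}

// Persisting the dated choice doubles as the record of consent mentioned above.
function saveConsent(analytics: boolean, advertising: boolean): void {
  const state: ConsentState = { analytics, advertising, timestamp: new Date().toISOString() };
  localStorage.setItem(CONSENT_KEY, JSON.stringify(state));
  applyConsent(state);
}

function applyConsent(state: ConsentState): void {
  // Strictly necessary cookies (session, shopping cart) load unconditionally;
  // non-essential cookies load only after an affirmative choice.
  if (state.analytics) loadAnalyticsScripts();
  if (state.advertising) loadAdvertisingTags();
}

// Withdrawal must be as easy as acceptance: one action clears the stored
// choice and removes the cookies that were set on the basis of consent.
function withdrawConsent(): void {
  localStorage.removeItem(CONSENT_KEY);
  deleteNonEssentialCookies();
}

// First visit (no recorded choice): show a banner offering equally prominent
// accept and reject options, with a link to the detailed cookie policy.
if (loadConsent() === null) {
  showCookieBanner({
    onAcceptAll: () => saveConsent(true, true),
    onRejectAll: () => saveConsent(false, false),
  });
}
```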

The French legal landscape regarding online personalised advertising is primarily shaped by the FDPA, the ePrivacy Directive and the CNIL’s guidelines on cookie use.

Over the past few years, personalised advertising has been at the heart of numerous complaints lodged with the CNIL. Recently, the CNIL has imposed severe sanctions on organisations using personalised advertising without the consent of web users: Doctissimo (2023) – EUR100,000 and Yahoo! (2023) – EUR10 million.

To respect data protection principles, organisations must, in particular, have a valid lawful basis for processing personal data for personalised online advertising purposes. In such cases, explicit consent is required to track web users across websites (eg, through cookies) in order to provide them with personalised content. Web users must also be informed about the use of such tracking mechanisms, and must be able to exercise their right to object to the use of their personal data for personalised advertising at any time.

The French legal landscape establishes a comprehensive set of regulations companies must navigate when engaging in personalised advertising. Compliance with consent requirements and transparency obligations is essential for advertisers operating in France.

Data protection and privacy laws in France significantly influence the employment relationship, particularly in how employers collect, process, and manage employee personal data.

Regarding transparency, employers are required to provide clear and transparent information to employees about the purposes of data processing, the categories of data collected, the recipients of the data collected, and how long such data will be retained. In this regard, employers must issue privacy notices to employees outlining such information and informing them of their rights concerning their personal data.

Regarding the legal basis for processing, while consent can be a valid basis for processing personal data, it must be freely given, specific, informed, and unambiguous, which can be challenging in employment relationships where power dynamics exist. This is why, in France, consent is not the preferred legal basis for processing employee data. In practice, consent is required for the use of employee images, but other purposes are mainly based on contract performance, legitimate interest or compliance with a legal obligation.

Regarding monitoring, employers may be tempted to monitor employees’ activities (eg, email and internet usage, CCTV, working time, and phone call recording) in the workplace. However, such monitoring must comply with data protection and privacy laws and be justified as necessary and proportionate. In this regard, the CNIL has published a large number of recommendations on employee monitoring over the years.

Regarding the impact on human resources practices, HR departments must develop policies and practices that align with data protection and privacy laws, as well as provide training to HR staff and employees about data protection rights and responsibilities, fostering a culture of privacy compliance.

Non-compliance with data protection and privacy laws can lead to significant penalties, including fines from CNIL and potential civil liabilities. Employees may also have grounds for legal action if their data privacy rights are infringed. For example, in December 2023, the CNIL fined AMAZON FRANCE LOGISTIQUE EUR32 million for implementing an excessively intrusive system for monitoring employee activity and performance. The company was also sanctioned for video surveillance carried out without adequate information and with insufficient security.

Data protection and privacy laws substantially impact the employment relationship in France by establishing comprehensive rights and obligations concerning personal data for both employers and employees. Employers must navigate these legal requirements carefully to create compliant and respectful data handling practices. The focus on employee privacy rights not only aims to protect individuals but also encourages organisations to foster trust and transparency within the workplace, ultimately leading to a more equitable employment environment.

When conducting asset deals in France, parties must pay careful attention to data processing requirements to ensure compliance with applicable data protection and privacy laws.

In summary, the requirements for data processing during asset deals in France involve a thorough understanding of relevant data protection laws and compliance measures. Parties must assess and manage personal data effectively, ensuring lawful processing, conducting due diligence, and respecting the rights of data subjects throughout the transaction. By adhering to these requirements, organisations can mitigate risks and ensure a smooth data transfer during asset deals.

International data transfers of personal information from France are subject to strict regulations under data protection and privacy laws. These regulations establish requirements and restrictions designed to protect individuals’ personal data when it is transferred outside the European Economic Area (EEA).

General Principles of International Data Transfers

Safeguards for international transfers

Adequacy decisions: The FDPA applies the GDPR requirements and allows the transfer of personal data to third countries (non-EEA countries) without further safeguards only if the European Commission has determined that the country ensures an adequate level of data protection. Countries benefiting from adequacy decisions are deemed to provide protection comparable to data protection and privacy laws, allowing unrestricted data transfers.

Special case of data transfers to the USA: Since July 2023, transfers between the European Union and US organisations certified under the “EU-US Data Privacy Framework” are permitted on that basis. This framework follows the CJEU’s invalidation of the previous adequacy decision (the Privacy Shield).

If a third country does not have an adequacy decision, organisations may not transfer personal data there unless they provide appropriate safeguards such as:

Standard Contractual Clauses (SCCs): Organisations can use SCCs, which are pre-approved contract templates provided by the European Commission that outline data protection obligations and rights for parties involved in the data transfer.

Binding Corporate Rules (BCRs): Multinational companies can implement BCRs, which are internal policies enforced across their global operations. BCRs must meet specific requirements and receive approval from relevant data protection authorities.

Transfer Risk Assessment (TRA) of Data Importing Jurisdiction

In cases where personal data is transferred to jurisdictions without an adequacy decision, organisations must carefully assess the data protection practices of the importing country, as follows.

  • Evaluating local laws: Organisations must analyse the local data protection laws and practices in the recipient country to determine whether they provide sufficient protection for the transferred data. This includes assessing:
    1. the comprehensiveness of privacy laws;
    2. the enforcement of privacy rights; and
    3. potential government access to data and surveillance practices.
  • The impact on data subjects’ rights: Organisations should consider how local laws may affect individuals’ rights under data protection and privacy laws. If the importing jurisdiction’s laws pose risks to data subjects’ rights (eg, through excessive government access or lack of recourse), transfers may be precluded unless additional protections are implemented.

Furthermore, the CNIL has issued recommendations emphasising the need for a thorough TRA and highlighting the importance of maintaining documentation of the measures taken to ensure compliance during international data transfers.

Data subjects should be informed if their data will be transferred to a non-EEA country, particularly if that jurisdiction lacks an adequacy decision. This communication should include details about the potential risks and the safeguards implemented.

In summary, international data transfers of personal information from France are regulated rigorously under data protection and privacy laws, with specific restrictions and requirements for assessing international data importing jurisdictions. Organisations must ensure that any transfers comply with applicable regulations, utilising appropriate safeguards and conducting thorough risk assessments to protect data subjects’ rights.

Government notifications or approvals may be required under the “French Blocking Statute” (see 5.4 Blocking Statutes).

The French Public Health Code requires that health data be hosted by an “HDS”-certified hosting provider and exclusively within the European Economic Area (EEA). This localisation requirement provides important guarantees in terms of data protection, helps strengthen the confidence of patients and professionals in digital healthcare, and contributes to the emergence of an ecosystem of European players.

The “SecNumCloud” standard, published by the ANSSI, is a reference framework for cloud service providers. SecNumCloud requires personal data to be stored and processed within the EEA. Moreover, the French government encourages public bodies to host “particularly sensitive” data only on SecNumCloud-qualified cloud offerings.

France’s Blocking Statute (Law No 68-678, strengthened in 1980) restricts the transfer of sensitive economic, commercial, industrial, financial, or technical information to foreign authorities. This restriction applies to French citizens, residents, and companies operating in France unless permitted by international treaties. The law protects information potentially harming French sovereignty, security, essential economic interests, or public order. Since 1 April 2022, French companies receiving such requests must immediately report them to the Strategic Information and Economic Security Service (SISSE).

French Law No 68-678 of 26 July 1968, also known as the “French Blocking Statute”, as modified by Law No 80-538 of 16 July 1980, deals with the communication of certain types of information to foreign entities. This law is notably a response to American courts’ use of the discovery procedure, which allows strategic data to be communicated during legal proceedings with rival companies.

This law aims to protect France’s economic interests by limiting the transmission of certain sensitive information abroad.

It covers two things:

  • the communication of information by French nationals or residents to foreign public authorities that could harm national interests or public order; and
  • the exchange of information to gather evidence for or in the context of foreign judicial or administrative proceedings.

The French Blocking Statute prohibits any individual or legal entity from communicating these types of information to foreign public authorities except within the framework of international treaties or agreements. It requires prior authorisation from the French government to communicate such information (for example, in the context of international judicial cooperation, cross-border mergers and acquisitions, or international banking compliance). Violating this law can result in criminal sanctions, including fines and imprisonment. It applies not only to acts committed in France but also to acts committed abroad by French persons or entities.

In view of the growing use of laws with extraterritorial reach by foreign players and the lack of dissuasive effect of the French Blocking Statute, the decree of 18 February 2022 and the order of 7 March 2022 clarified the procedure for companies and designated the SISSE as the single point of contact. The SISSE assists French companies, in liaison with the various government departments, in responding to the demands of foreign courts. However, in practice, the French Blocking Statute seems to have been only relatively effective in preventing such communications.

Recent developments in France’s regulation of the international transfer of personal data reflect ongoing changes in the European data protection landscape.

Post-Schrems II Context

Schrems II ruling (2020)

The CJEU issued its landmark ruling in July 2020, invalidating the EU-US Privacy Shield framework, which previously allowed for the transfer of personal data between the EU and the United States. This ruling emphasised concerns regarding US surveillance practices and the lack of comparable protection for EU citizens’ data rights.

Implications for Transfers: Following the ruling, organisations faced increased scrutiny and challenges when transferring personal data to the United States and other third countries without an adequacy decision, necessitating appropriate safeguards such as Standard Contractual Clauses (SCCs).

Updated Standard Contractual Clauses (SCCs)

New SCCs

In June 2021, the European Commission adopted new Standard Contractual Clauses, replacing the previous versions. These updated clauses provide a more flexible and comprehensive framework for organisations to establish compliance with data protection and privacy laws when transferring personal data internationally.

CNIL guidance

In France, the CNIL has provided guidance on implementing new SCCs and emphasised the necessity of conducting thorough assessments of data protection laws in third countries when utilising SCCs for international transfers.

Adopting Additional Safeguards

Risk assessments (TRA) are essential for organisations following the invalidation of the Privacy Shield and the implications of the Schrems II ruling. These assessments help evaluate whether the legal framework of the importing country offers adequate data protection.

Organisations are encouraged to implement supplementary measures alongside SCCs or other safeguards when transferring data to jurisdictions considered inadequate. This may include encryption, pseudonymisation, or additional contractual clauses.

New EU-US Data Transfer Framework

Following the invalidation of the Privacy Shield, discussions between EU and US authorities aimed to establish a new transatlantic data transfer framework addressing the concerns raised in the Schrems II ruling.

These efforts culminated in the European Commission’s adequacy decision of July 2023: the EU-US Data Privacy Framework now regulates data transfers between the USA and the European Union, for US organisations certified under that framework.

Jeantet

11 rue Galilée
75116
Paris
France

+33 (0)1 45 05 80 08

+33 (0)1 47 04 20 41

info@jeantet.fr
www.jeantet.fr/en/

Trends and Developments


Authors



LPA Law is a leading French full-service law firm with over 250 lawyers across 14 international offices. The firm offers strategic advisory and litigation services across 11 industries, combining deep sector expertise and a global presence to address complex legal and regulatory challenges. The firm’s IP/IT/Data team, led by Prudence Cadio and supported by senior associate Lobna Boudiaf and two other lawyers, provides comprehensive advice to high-profile clients on matters such as data privacy, AI, and cybersecurity. The team excels in negotiating high-stakes data contracts, ensuring compliance with GDPR and French data protection laws, and managing regulatory investigations, data subject rights disputes, breach incidents, and supervisory authority audits. By integrating risk management and litigation strategies, they effectively address client needs in this dynamic and complex regulatory landscape.

Introduction

Data privacy has taken centre stage in France in recent years, with AI-driven technologies reshaping the debate. As artificial intelligence becomes increasingly integrated into digital services, concerns about transparency, user control, and ethical data processing have intensified. In response, the French data protection authority, the CNIL (Commission Nationale de l’Informatique et des Libertés), has reinforced its role as a key regulatory authority, not only enforcing compliance but also providing pedagogical guidance to help businesses and developers navigate the evolving legal landscape.

From mobile app permissions to AI governance, the CNIL has issued a series of recommendations aimed at ensuring that innovation respects fundamental rights. Whether it’s refining user consent mechanisms, promoting “privacy by design”, or clarifying AI’s legal implications, the CNIL remains at the forefront of data protection in France.

This guide offers a bird’s eye view of the latest updates in French data privacy law, helping organisations comply with current regulations, best practices, and emerging compliance challenges.

AI and Data Privacy: Latest Developments in France

The CNIL is actively addressing the intersection of AI and data privacy.

In February 2025, the CNIL published new recommendations to support responsible AI innovation, emphasising that the General Data Protection Regulation (GDPR) enables the development of innovative and responsible AI in Europe.

Additionally, the CNIL has been collaborating with other data protection authorities to promote an AI governance framework that provides legal certainty and safeguards for individuals, including transparency and respect for fundamental rights.

These initiatives underscore the CNIL’s commitment to ensuring that AI technologies respect individuals’ privacy rights while fostering innovation.

AI and GDPR compliance: CNIL’s practical guidelines for responsible innovation

As AI continues to reshape industries, European regulators strive to establish a legal framework that fosters innovation while upholding fundamental rights. In this evolving landscape, France’s data protection authority, the CNIL, has released new guidelines on AI models, offering a structured approach to compliance with the GDPR. These recommendations provide much-needed clarity amid the increasing complexity of AI governance at both the national and European levels.

A risk-based approach to AI and personal data

The CNIL recognises the growing interconnection between AI and personal data processing, particularly in machine learning models trained on vast datasets. In its new guidelines, the authority emphasises a risk-based approach, urging AI developers and deployers to assess data protection risks at every stage of the AI lifecycle. This aligns with the principles of privacy by design and by default, ensuring that GDPR compliance is not an afterthought but an integral part of AI system development.

A key takeaway from the CNIL's recommendations is the need to distinguish between personal and non-personal data in AI training datasets. While fully anonymised data falls outside the scope of GDPR, pseudonymised data remains subject to its provisions. The CNIL stresses that AI stakeholders must ensure robust anonymisation techniques or justify the necessity of processing personal data under an appropriate legal basis.
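
To make that distinction concrete, the sketch below (TypeScript on Node.js, with invented field names) shows one common pseudonymisation technique, keyed hashing. Because whoever holds the key can re-link the tokens to individuals, the resulting dataset remains personal data within the GDPR’s scope; it is not anonymised.

```typescript
// Pseudonymisation sketch: replace a direct identifier in a training record
// with a keyed hash (HMAC-SHA-256). The mapping is repeatable and re-linkable
// by the key holder, so the output is pseudonymised, NOT anonymised, and
// remains subject to the GDPR. Field names are illustrative only.

import { createHmac, randomBytes } from "node:crypto";

const secretKey = randomBytes(32); // in practice, stored and managed separately from the dataset

function pseudonymise(identifier: string): string {
  return createHmac("sha256", secretKey).update(identifier).digest("hex");
}

const rawRecord = { userId: "alice@example.com", sessionLengthSec: 312 };

// Only the direct identifier is transformed; other attributes pass through,
// which is why residual re-identification risk must still be assessed.
const trainingRecord = {
  userId: pseudonymise(rawRecord.userId),
  sessionLengthSec: rawRecord.sessionLengthSec,
};

console.log(trainingRecord.userId); // the same input always yields the same stable token
```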

Clarifying the role of AI stakeholders

One of the central challenges in AI regulation is identifying responsible parties within the data processing chain. The CNIL provides practical insights into classifying AI developers, deployers, and users as data controllers or processors, depending on their role in determining the purposes and means of data processing.

  • AI model providers who design and train models on personal data will often be considered data controllers with direct obligations under GDPR.
  • Companies integrating AI solutions into their services may also qualify as controllers if they influence how the model processes personal data.
  • Cloud-based AI service providers could act as processors, processing data on behalf of client organisations.

These distinctions are crucial, as they define the extent of regulatory responsibility and the necessary contractual safeguards between parties involved in AI operations.

Managing AI model transparency and data subjects' rights

The CNIL highlights the challenge of ensuring transparency in AI decision-making, particularly for complex models based on machine learning and deep learning. AI systems must be designed to provide meaningful explanations to individuals affected by automated processing, in line with GDPR’s right to information and right to explanation.

The guidelines also stress that AI developers must facilitate data subjects’ rights, including:

  • the right to access and rectify data used by AI models;
  • the right to object to automated decision-making; and
  • the right to deletion, especially in cases where personal data is no longer necessary.

To achieve this, the CNIL encourages data governance mechanisms that ensure traceability of AI decisions and enable users to challenge or correct model outputs when necessary.

Bridging the gap between AI and GDPR: a pragmatic approach

The CNIL’s guidance takes a pragmatic approach to AI compliance under the GDPR by focusing on practical implementation. It offers companies a clear roadmap for mitigating AI-related data protection risks without stifling technological progress.

By providing concrete compliance steps, the CNIL aims to support responsible AI innovation while reinforcing GDPR as a key framework for AI governance. However, as AI regulation evolves, businesses operating in France and across the EU must remain adaptable, ensuring their AI-driven solutions meet both national and European legal standards.

The CNIL’s latest recommendations serve as a valuable reference point for AI developers, businesses, and legal practitioners, helping them navigate the increasingly complex intersection of AI, data privacy, and regulatory compliance.

Joint statement on building trustworthy data governance frameworks to encourage the development of innovative and privacy-protective AI

In a collaborative effort to ensure the responsible development and deployment of AI, five leading data protection authorities – namely, from France, Australia, South Korea, Ireland, and the United Kingdom – have come together to strengthen governance in this rapidly evolving field. At the Paris AI Action Summit, held from 6 to 11 February 2025, these organisations signed a joint statement that underscores their mutual commitment to fostering AI innovation while simultaneously protecting individual privacy rights.

This declaration stresses the importance of creating a legal framework that strikes a balance between AI technologies’ potential and the necessary safeguards to protect individuals from privacy infringements, biases, and discrimination. AI has tremendous potential across a wide range of sectors, including healthcare, education, finance, and transportation, but without robust governance, it could exacerbate issues like misinformation, surveillance, and unfair treatment.

Building a reliable governance framework for trustworthy AI

This initiative aims to promote an AI governance framework that provides legal certainty to stakeholders and guarantees to individuals, particularly in terms of transparency and respect for fundamental rights.

The declaration highlights the numerous opportunities offered by AI in various fields such as innovation, research, the economy, and society. It also warns of several risks related to personal data protection and privacy, discrimination and algorithmic bias, as well as misinformation and AI hallucinations.

Authorities advocate integrating data protection principles from the design phase of AI systems, establishing robust data governance, and proactively managing risks to ensure that AI complies with current regulations.

The declaration also emphasises the growing complexity of data processing through AI in areas such as healthcare, public services, public security, human resources, and education, and highlights the diversity of actors involved and the need for a regulatory framework that keeps pace with technological advancements.

In response to the challenges posed by AI, the key commitments of the authorities in this joint statement are:

  • to clarify the legal bases for data processing within AI;
  • to share information and establish appropriate security measures;
  • to monitor the technical and societal impacts of AI by involving various stakeholders;
  • to encourage innovation while reducing legal uncertainties; and
  • to strengthen cooperation with other relevant authorities (consumer protection, competition, intellectual property).

A global approach to AI regulation

The collaboration of these five data protection authorities highlights the necessity of a global approach to AI regulation. As AI continues to evolve, both innovation and privacy protection must move forward in tandem.

Challenges in Consent and Data Privacy

As digital technologies evolve, consent and data privacy have become pressing issues in France. Companies collect ever more personal data, sometimes without clear user consent, raising concerns about privacy rights. Despite regulations such as the GDPR requiring transparent, informed consent, incidents of unauthorised data collection persist. Recent developments in France highlight the ongoing challenge of ensuring user consent and privacy in the tech industry, with new cases and trends shaping data protection law.

CNIL fines telecom provider

The CNIL has imposed a EUR50 million fine on a major telecommunications provider for inserting advertisements directly into users’ email inboxes without proper consent. The investigation revealed that these ads were displayed in the form of regular emails but were, in fact, targeted advertising, which requires prior user consent under French and EU data protection laws.

The regulatory authority found that the company had failed to obtain valid, informed, and explicit user consent before delivering these ads. Since the ads were not traditional service messages but marketing content, the CNIL ruled that users should have been given a genuine choice to opt in. The lack of transparency and of user control over their data led to a significant penalty.

This decision highlights the need for strict adherence to the GDPR’s consent requirements, especially in digital advertising. Companies that handle user data must ensure they provide clear information and obtain explicit consent from users in order to avoid regulatory penalties.

The controversial collection of voice data by tech giants

A criminal complaint has been filed in France against a large tech company over the allegedly unauthorised collection of audio recordings from user devices. The legal action targets the large-scale collection of voice data without properly informing users, in violation of privacy laws.

The French complaint is largely based on information provided by a former subcontractor (who has been granted whistle-blower status) who worked for a data analysis firm responsible for reviewing recorded audio snippets. The whistle-blower, who accessed thousands of recordings daily, claims that analysts could sometimes link voice data to specific users by cross-referencing additional app information from the device. The data allegedly included highly sensitive content, such as children’s voices, discussions about health issues, political views, or even intimate matters. The whistle-blower was particularly alarmed by recordings of disturbing content and raised concerns with supervisors, though it remains unclear how these reports were handled.

The complaint also alleges that millions of recordings were collected unintentionally, often without users actively triggering the voice assistant. Analysts were reportedly encouraged not to flag recordings as accidental unless absolutely necessary, leading to a sharp increase in the volume of processed voice data. This practice is alleged to be in direct conflict with the GDPR, which requires a valid legal basis – such as clear and informed consent – before personal data is collected and processed.

The whistle-blower had previously alerted privacy regulators, including the CNIL, criticising what they perceived as a lack of enforcement against major tech firms. Meanwhile, the company in question has defended its practices in related US proceedings, insisting that voice data has never been used to build marketing profiles or sold to third parties. However, the French complaint argues that the company’s public messaging on privacy protections is misleading, as its products do not necessarily offer the level of data protection claimed.

This case underscores the growing tension between AI-powered voice assistants and data privacy regulations, raising critical questions about transparency, user control, and corporate accountability.

Summary of CNIL’s recommendations on mobile app permissions

The French data protection authority, CNIL, has issued guidelines to help developers create mobile applications that respect user privacy, particularly regarding app permissions.

Understanding permissions vs consent

Permissions allow apps to access certain device resources (eg, location, contacts, or the microphone) but do not necessarily equate to user consent under the GDPR and French data protection law. While permissions control technical access, they do not regulate how data is used. In some cases, collecting explicit user consent is still required.

Key recommendations for OS providers

The CNIL advises operating system (OS) providers to refine permission systems to enhance user control by:

  • allowing users to grant permissions with varying levels of precision (eg, granting approximate rather than exact location);
  • limiting permissions to specific files instead of entire media libraries; and
  • providing time-limited permissions instead of indefinite access.

Best practices for app developers

App developers must:

  • choose permissions that align with their application’s functionality while minimising data access (a brief sketch follows this list);
  • clearly differentiate when permission alone is sufficient and when additional user consent is needed;
  • implement a Consent Management Platform (CMP) when handling data processing that requires explicit consent; and
  • ensure users understand the connection between permission requests and the actual data processing involved.
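
By way of illustration, the following minimal Android sketch in Kotlin reflects the first of these points: it requests only approximate (coarse) location, so the application never receives a precise position. The activity, feature, and method names are hypothetical, and the AndroidX Activity Result API is assumed.

```kotlin
import android.Manifest
import android.os.Bundle
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

// Hypothetical activity illustrating data minimisation in permission requests:
// only COARSE (approximate) location is requested, so the app never receives
// the user's precise position.
class NearbyStoresActivity : AppCompatActivity() {

    // The system permission dialog is handled through the Activity Result API.
    private val locationPermissionLauncher =
        registerForActivityResult(ActivityResultContracts.RequestMultiplePermissions()) { grants ->
            if (grants[Manifest.permission.ACCESS_COARSE_LOCATION] == true) {
                showNearbyStores()      // approximate location is enough for this feature
            } else {
                showManualCitySearch()  // graceful fallback rather than a broken feature
            }
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Request only the permission the feature actually needs; in a real app
        // this would be triggered when the user invokes the feature.
        locationPermissionLauncher.launch(arrayOf(Manifest.permission.ACCESS_COARSE_LOCATION))
    }

    private fun showNearbyStores() { /* display stores near the approximate position */ }
    private fun showManualCitySearch() { /* let the user search by city instead */ }
}
```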

Balancing permissions and consent

Developers should integrate permission and consent collection seamlessly, ensuring users are not confused. The CNIL suggests that:

  • consent may be requested before or after permission requests, but the process must remain transparent; and
  • if a user denies consent, the corresponding permission should not be requested at all (illustrated in the sketch below).
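
A minimal sketch of the second point follows, assuming a hypothetical ConsentManager wrapper around whatever CMP the application integrates; the interface, purpose string, and function names are illustrative rather than a real library API.

```kotlin
import android.Manifest
import androidx.activity.result.ActivityResultLauncher

// Hypothetical abstraction over the app's CMP; an illustrative interface,
// not a real consent-management library API.
interface ConsentManager {
    fun hasConsent(purpose: String): Boolean
}

// Only trigger the OS permission dialog when the underlying processing purpose
// has been consented to: if the purpose was refused, the technical permission
// would serve no lawful use, so the system prompt is skipped entirely.
fun maybeRequestMicrophone(
    consent: ConsentManager,
    permissionLauncher: ActivityResultLauncher<String>,
) {
    if (consent.hasConsent("voice_personalisation")) {
        permissionLauncher.launch(Manifest.permission.RECORD_AUDIO)
    }
}
```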

These recommendations aim to empower users while ensuring compliance with privacy regulations. Developers and OS providers must work together to strike a balance between functionality and privacy protection.

The CNIL’s Expansive Interpretation of Personal Data

The CNIL has reaffirmed its particularly broad interpretation of the concept of personal data, drawing a clear distinction between personal and anonymous data, in a formal notice addressed to the French search-engine company Qwant. This decision further destabilises an already fragile legal framework, in which regulatory authorities and courts continue to adopt diverging interpretations.

Qwant’s Position: A Privacy-Centric Search Engine

Qwant, a search engine designed to prioritise user privacy, maintained that it did not collect personal data from users conducting searches when displaying advertisements related to their queries.

As part of its advertising model, Qwant transmitted primarily technical data to Microsoft, including truncated or hashed IP addresses used to generate an identifier. This data enabled Microsoft to:

  • display contextual advertisements relevant to users’ search queries;
  • count ad impressions; and
  • provide supplemental search results when Qwant’s own engine could not return sufficient results.

The CNIL’s findings: pseudonymisation does not equate to anonymisation

Following a detailed technical investigation, the CNIL concluded that the data transmitted could not be classified as anonymous but only as pseudonymous.
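
To illustrate why such identifiers are pseudonymous rather than anonymous, the following simplified Kotlin sketch shows the general type of transformation at issue; it is not Qwant’s actual implementation, and the salt and addresses are invented.

```kotlin
import java.security.MessageDigest

// Simplified illustration (not Qwant's actual code): truncate an IPv4 address
// to its /24 network, then hash it with a salt to derive a stable identifier.
fun deriveIdentifier(ip: String, salt: String): String {
    val truncated = ip.substringBeforeLast(".") + ".0"  // 203.0.113.42 -> 203.0.113.0
    val bytes = MessageDigest.getInstance("SHA-256").digest((salt + truncated).toByteArray())
    return bytes.joinToString("") { "%02x".format(it) }
}

fun main() {
    // The transformation is deterministic: the same visitor always yields the
    // same identifier, so requests remain linkable across time.
    println(deriveIdentifier("203.0.113.42", "example-salt"))
    println(deriveIdentifier("203.0.113.42", "example-salt")) // identical output
}
```

Because the mapping is deterministic, the same visitor always produces the same identifier, so the individual can be singled out across requests; and whoever holds the salt can test candidate IP addresses to re-identify records. The data is therefore pseudonymous, not anonymous.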

Furthermore, Qwant’s privacy policy neither disclosed the advertising purpose behind its data transfers to Microsoft nor specified the legal basis for such processing – leading the CNIL to issue a formal notice reminding the company of its legal obligations.

The legal boundaries of anonymisation, a contested issue

At the heart of this case lies a fundamental and long-contested legal question: Can data that uniquely distinguishes an individual – without necessarily allowing for their direct identification – still be classified as personal data?

Data that enables the differentiation of an individual may be deemed to “relate” to that person, thus satisfying one of the core criteria of the GDPR’s definition of personal data. Such processing allows for individualised treatment – such as targeted advertising – even if the person’s identity is not immediately discernible.

However, whether such data meets the equally essential identifiability criterion remains a matter of legal debate, particularly in light of the evolving jurisprudence of the Court of Justice of the European Union (CJEU). The Court has consistently assessed whether an individual’s identity can be reasonably re-established using available means (notably in the Breyer and Scania rulings).

The CJEU is set to revisit this issue in the highly anticipated CRU case, where the Advocate General’s recent opinion offers crucial insights into the Court’s likely reasoning.

The Advocate General’s opinion in the CRU case: a potential shift in the legal framework?

In Single Resolution Board (CRU) v European Data Protection Supervisor (EDPS), Advocate General Dean Spielmann has taken a nuanced position on classifying pseudonymised data and its implications for data controllers and processors.

The case concerns the CRU’s failure to inform shareholders and creditors affected by a bank’s resolution about the transfer of their questionnaire responses to Deloitte, which had been engaged as an independent auditor. These responses were pseudonymised – names were replaced with alphanumeric identifiers – raising the question of whether Deloitte should be considered a recipient of personal data under the GDPR.

Key takeaways from the Advocate General’s opinion are as follows.

  • Pseudonymised data does not automatically constitute personal data. Referring to Recital 26 of the GDPR, the Advocate General emphasises that data should be considered personal only if it allows for the reasonable identification of the individual concerned. In this case, the robustness of the pseudonymisation process should have been assessed in order to determine whether Deloitte could reasonably re-identify individuals.
  • A complex approach to the notion of a “data recipient”. While Deloitte may not have been processing personal data from its own perspective, the Advocate General suggests that it should still be considered a recipient of personal data vis-à-vis the CRU, since the data remained personal from the CRU’s standpoint. This reasoning, while intricate, could have significant implications for how data controllers and processors are classified under the GDPR.
  • The data protection authority bears the burden of proof. The Advocate General also addressed the evidentiary standard for determining whether data is personal, concluding that the CRU had already provided sufficient evidence that Deloitte could not identify individuals. Consequently, the burden shifted to the EDPS to prove otherwise.

The CJEU’s forthcoming ruling in the CRU case is likely to shape the evolving landscape of personal data classification, particularly regarding pseudonymisation and the obligations of data controllers and processors. Its decision will be closely watched, as it could redefine fundamental aspects of data protection law in the EU.

Conclusion

In conclusion, as artificial intelligence reshapes digital environments, France maintains a strong commitment to safeguarding the privacy of individuals, including customers and end users. The CNIL plays an active role in developing data protection frameworks that balance innovation with ethical considerations. As the regulatory landscape evolves, businesses, developers, and organisations must stay abreast of new legal requirements and guidance.

LPA Law

136 avenue des Champs-Elysées
75008 Paris
France

+33 (0)1 53 93 30 00

+33 (0)1 53 93 30 30

paris@lpalaw.com
www.lpalaw.com


Trends and Developments

Authors



LPA Law is a leading French full-service law firm with over 250 lawyers across 14 international offices. The firm offers strategic advisory and litigation services across 11 industries, combining deep sector expertise and a global presence to address complex legal and regulatory challenges. The firm’s IP/IT/Data team, led by Prudence Cadio and supported by senior associate Lobna Boudiaf and two other lawyers, provides comprehensive advice to high-profile clients on matters such as data privacy, AI, and cybersecurity. The team excels in negotiating high-stakes data contracts, ensuring compliance with GDPR and French data protection laws, and managing regulatory investigations, data subject rights disputes, breach incidents, and supervisory authority audits. By integrating risk management and litigation strategies, they effectively address client needs in this dynamic and complex regulatory landscape.
