Data Protection & Privacy 2025 Comparisons

Last Updated March 11, 2025

Contributed By Jeantet

Law and Practice

Authors



Jeantet has been one of the leading independent French corporate law firms since 1924, delivering customised, high value-added services and committed to ethics and human values. The firm is used to dealing with complex cross-border IT, data protection and cybersecurity issues for international companies. It acts on behalf of both IT service providers (publishers, IaaS, PaaS, SaaS, service providers, etc) and their clients (banking, insurance, industry, tourism or retail) at all stages of IT projects: choice of technical architecture, and negotiation and drafting of contracts from the simplest to the most complex (outsourcing, maintenance, integration, ERP, migration, cloud services, etc). It has broad experience in IT disputes, especially during expertise phases, and offers a fully integrated external DPO service.

In France, data protection and privacy laws are primarily governed by the General Data Protection Regulation (the “GDPR”) and the French Data Protection Act No. 78-17 of 6 January 1978 as amended by Act No 2018-493 of 20 June 2018 and Ordinance No 2018-1125 of 12 December 2018 (the “FDPA”), hereinafter together “data protection and privacy laws”.

The GDPR as the Baseline: The GDPR serves as the primary legal framework for data protection across the EU, including France. It aims to harmonise data protection standards, ensuring a consistent approach to individual rights and data protection across member states.

The FDPA as a Complement: While the GDPR establishes a baseline for data protection, each EU member state has the option to introduce additional provisions through their national laws. In France, the FDPA provides extra protections in specific contexts, addressing particular national concerns or local customs.

For example, the FDPA includes specific provisions that focus on certain areas not fully addressed by the GDPR, such as genetic and biometric data usage, health data, and data relating to criminal convictions or offences.

In conclusion, France’s data protection and privacy landscape is characterised by the interplay between the GDPR and the FDPA. The GDPR provides a harmonised framework that sets minimum standards for the protection of personal data across the EU, while the FDPA introduces specific adaptations to address national needs and contexts.

In France, the National Commission for Information Technology and Civil Liberties (“CNIL”) is the primary regulator for data protection and privacy. However, other relevant regulators in the broader context of information technology and digital services include the French Information Systems Security Agency (“ANSSI”) and the French Regulatory Authority for Electronic Communications, Postal Services and Press Distribution (“ARCEP”). Furthermore, the French Prudential Supervision and Resolution Authority (“ACPR”), which regulates financial institutions, will take into account whether a regulated entity is meeting data privacy standards.

The CNIL

The CNIL is responsible for overseeing data protection and privacy laws and, in particular, for enforcing them. The CNIL’s main functions are the following:

  • Supervision: Monitors compliance with data protection and privacy laws by organisations operating in France.
  • Guidance: Provides advice and guidelines to businesses, public authorities, and individuals on data protection issues.
  • Complaints: Handles complaints from individuals regarding data protection violations.
  • Sanctions: Has the authority to conduct audits and investigations and impose sanctions such as fines for non-compliance with data protection regulations.
  • Public awareness: Promotes understanding of data protection rights and responsibilities among the public.

The CNIL conducts assessments and audits of organisations to ensure compliance with data protection and privacy laws. It issues guidelines and recommendations to clarify legal requirements and best practices, and actively engages with businesses to help them implement robust data protection measures, offering tools and resources. Regarding enforcement, the CNIL can issue warnings, impose corrective measures, and levy fines for serious breaches of data protection and privacy laws.

The ANSSI

The ANSSI is dedicated to enhancing cybersecurity within France. Its responsibilities include protecting networks and offering guidance to improve cybersecurity resilience in the private sector. The agency develops and implements national cybersecurity strategies, provides support during cyber incidents, and certifies secure products. It is mandatory to report major cybersecurity incidents to the ANSSI. Additionally, notifications to the ANSSI will be combined with any reports to the CNIL if a security incident involves personal data.

The ARCEP

The ARCEP regulates electronic communications, postal services, and press distribution in France and ensures that communications networks operate fairly and efficiently. While not focused on data protection per se, ARCEP can intersect with privacy issues in the communications sector.

The ACPR

The ACPR supervises and regulates France’s banking and insurance sectors. It also plays a key role in resolving failing financial institutions to minimise systemic risk. In essence, the ACPR acts as a guardian of the French financial system’s health and integrity. Major operational or security incidents affecting the information systems of financial entities must be notified to the ACPR (as well as to the CNIL if such security incidents involve the violation of personal data).

These regulators work through a combination of rule-making, guidance publication, compliance assessment, and enforcement activities. The CNIL, in particular, plays a pivotal role in shaping the data protection landscape in France by engaging with stakeholders, responding to technological advancements, and maintaining a balance between privacy rights and innovation.

In France, administrative proceedings related to data protection and privacy are conducted primarily by the CNIL.

Initiation of Administrative Proceedings

Complaint Submission: Individuals can file complaints with CNIL if they believe their data protection rights have been violated. Complaints can be submitted via CNIL’s website.

Investigations: CNIL can also initiate proceedings on its own initiative without a complaint if it suspects a breach of data protection and privacy laws. This often occurs in response to reported incidents.

Conducting Administrative Proceedings

Investigation process: CNIL can conduct on-site inspections or online investigations of the organisation in question. It may require the organisation to provide relevant documents and information related to the data processing activities.

Cooperation with the organisation: Organisations are expected to cooperate with CNIL during the investigation. CNIL may issue recommendations for compliance before moving to sanctions.

Reporting findings: After the investigation, CNIL compiles its findings and notifies the organisation of any violations found, allowing the entity to respond or rectify issues before formal sanctions are imposed.

Decision-making: CNIL issues formal decisions based on its findings, determining whether a violation occurred and what sanctions (if any) are appropriate.

Calculation of Administrative Fines

  • Criteria for Fines: When calculating administrative fines under the French Data Protection Act, CNIL considers various factors, such as:
    1. the nature, severity and duration of the violation;
    2. the number of affected individuals;
    3. the intention or negligence behind the violation;
    4. the categories of personal data involved;
    5. the previous compliance history of the organisation; as well as
    6. the degree of cooperation with CNIL during the investigation.
  • Proportionality and fairness: The CNIL takes the company’s turnover into account so that any fine imposed is proportionate.
  • Fine limits: The CNIL applies the caps set by the GDPR (up to €20 million or 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher), calibrating the actual fine to the specific context of the violation.
  • Publication of sanctions: Decisions, including fines, are generally published (in an annual report or online), but the specifics of the organisation involved may sometimes be anonymised to protect confidentiality.

This administrative approach underscores the CNIL’s commitment to enforcing data protection and privacy laws effectively while allowing organisations to come into compliance before facing punitive measures (although nothing prevents the CNIL from imposing sanctions directly).
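
For illustration only, the fine ceiling described above is simple arithmetic: the cap is the higher of EUR 20 million or 4% of worldwide annual turnover. A minimal sketch, using hypothetical turnover figures (this is not legal advice, and the actual fine within the ceiling depends on the criteria listed above):

```python
def gdpr_fine_cap(worldwide_annual_turnover_eur: float) -> float:
    """Upper bound for the most serious GDPR infringements: the higher
    of EUR 20 million or 4% of the total worldwide annual turnover of
    the preceding financial year (GDPR, article 83(5))."""
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

# Turnover of EUR 300 million: 4% is EUR 12 million, so the
# EUR 20 million floor applies.
print(gdpr_fine_cap(300_000_000))  # 20000000.0

# Turnover of EUR 2 billion: 4% is EUR 80 million, which exceeds
# the floor and becomes the cap.
print(gdpr_fine_cap(2_000_000_000))  # 80000000.0
```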

The CNIL has undertaken several notable administrative proceedings in recent years, reflecting its commitment to enforcing data protection and privacy laws and ensuring compliance with them. Here are some of the most significant cases.

Orange (2024)

Background: In November 2024, Orange was fined EUR50 million.

Findings: The company was sanctioned for displaying advertisements in its users’ emails without their consent.

Amazon (2024)

Background: In June 2024, Amazon received a EUR15 million fine.

Findings: CNIL identified violations concerning excessive worker monitoring.

Yahoo! (2023)

Background: In December 2023, Yahoo! received a EUR10 million fine.

Findings: The company was sanctioned for failing to respect web users’ choice to refuse cookies on its “Yahoo.com” site and for failing to allow users of its “Yahoo! Mail” messaging service to freely withdraw their consent to cookies.

Criteo (2023)

Background: In June 2023, Criteo was fined EUR40 million.

Findings: The company was sanctioned for infractions related to processing personal data for targeted advertising without valid user consent.

Discord (2022)

Background: In November 2022, CNIL sanctioned Discord with a EUR800,000 fine.

Findings: The company was penalised for several breaches, including issues with user data retention and password security.

TikTok (2022)

Background: In December 2022, CNIL fined TikTok EUR5 million for cookie-related violations.

Findings: The investigation revealed that TikTok did not allow users to refuse cookies as easily as they could accept them, thus violating French data protection law.

These cases demonstrate CNIL’s ongoing vigilance in enforcing data protection and privacy laws, particularly paying attention to large tech companies and large-scale data processing practices.

Recent developments regarding artificial intelligence (AI) regulation in France reflect ongoing efforts to ensure that AI technologies are used responsibly and ethically, particularly regarding data protection.

EU Artificial Intelligence Act (the “AI Act”) Proposal and Adoption

As part of the European Union’s broader approach to AI regulation, France has supported the AI Act. This legislation classifies AI systems based on risk levels (eg, minimal, limited, high, and unacceptable) and imposes stricter requirements for high-risk applications, particularly those that process personal data. The AI Act was published on 12 July 2024, and its implementation will take place in stages until 2 August 2027.

The CNIL’s Role

The CNIL is actively participating in the work currently being carried out by the European Data Protection Board (EDPB) on the relationship between the rules applicable to the protection of personal data and the AI Act. The aim of this work is to clarify how the two frameworks interact while enabling a harmonised interpretation between the CNIL and its European counterparts. Moreover, the CNIL has published several guidelines and recommendations on the interplay between data protection and privacy laws and the AI Act.

Integration of AI Systems

The integration of AI systems in various sectors requires firms to ensure compliance with data protection and privacy laws. Organisations utilising AI must assess how their technologies affect the processing of personal data and implement measures to remain compliant, such as those outlined below.

  • Data minimisation: AI systems must adhere to the principle of data minimisation, meaning that only data necessary for the specified purpose should be collected and processed.
  • Transparency: Organisations using AI must clearly inform individuals about how their data is being used by AI systems, including the purposes of processing, data retention periods, and the logic behind automated decision-making.
  • Rights of data subjects: Individuals have the right to access their data, request corrections, object to data processing, and seek the right to be forgotten. These rights protect individuals from potential harms associated with AI systems, such as biased decision-making.
  • Algorithmic accountability: The proposed regulations emphasise accountability mechanisms for AI systems, requiring organisations to ensure that their algorithms are fair and transparent and do not perpetuate bias. This includes regular audits of AI systems to evaluate data handling practices and algorithm performance.
  • Security: Securing AI systems remains an obligation in order to guarantee data protection both during system development and in anticipation of deployment (eg, security measures covering the training data as well as the development and operation of the AI system).
  • Impact assessments: For high-risk AI systems, conducting Data Protection Impact Assessments (DPIAs) is required to evaluate the risks to personal data and the measures needed to mitigate those risks.

Regulations

The evolving regulation of AI in France reflects a balance between fostering innovation and ensuring robust data protection. As AI systems increasingly permeate various sectors, the emphasis on ethical considerations, transparency, and individual rights translates into strong safeguards designed to protect personal data. Thus, organisations must navigate these regulatory landscapes carefully to leverage AI technologies while complying with data protection and privacy laws.

The AI Act significantly impacts data protection in France by integrating principles of privacy and ethical governance into the development and deployment of AI technologies.

Alignment with Data Protection Principles

The AI Act emphasises key principles of data protection, including transparency, accountability, and data minimisation. These principles ensure that AI systems processing personal data are designed and deployed in ways that respect individuals’ rights.

Stricter Requirements for High-Risk Applications

Under the AI Act, high-risk AI applications (eg, in sectors like healthcare and finance) face rigorous requirements, including the necessity for risk assessments.

Enhanced Transparency Obligations

AI systems must provide clear and understandable information to users regarding how personal data is being processed. This aligns with the transparency obligations of data protection and privacy laws, reinforcing users’ rights to be informed about data usage.

Protection Against Bias and Discrimination

AI regulations compel developers to consider fairness and non-discrimination. This is particularly important in the context of data protection and privacy laws, which require adherence to principles of equality and non-discrimination, minimising the risk of biased algorithms adversely affecting individuals.

National Governance

The CNIL is fully committed to supporting companies innovating in AI in their application of data protection and privacy laws and to promoting AI that respects people’s rights over their data.

Moreover, high-risk AI systems already subject to sector-specific regulation will continue to be regulated by competent national regulators, such as the “National Agency for the Safety of Medicines and Health Products” (ANSM) for medical devices.

Data Protection and Privacy under the AI Act

In addition, the CNIL considers that the AI Act can extend, and take over from, data protection and privacy laws on certain well-defined points:

  • the AI Act replaces certain data protection and privacy law rules for the use by law enforcement agencies of real-time remote biometric identification in publicly accessible spaces, which it permits only in very exceptional circumstances and subject to strict conditions (article 5);
  • it exceptionally allows the processing of sensitive data to detect and correct potential biases that could cause harm, if strictly necessary and subject to appropriate safeguards (article 10);
  • it allows the re-use of personal data, including sensitive data, within the “regulatory sandboxes” framework. These sandboxes are intended to facilitate the development of systems of significant public interest (such as improving the healthcare system) and are placed under the supervision of a dedicated authority which must first consult the CNIL and verify compliance with a certain number of requirements (article 59).

The regulation of AI in France and its impact on data protection reflect a comprehensive approach aimed at safeguarding individual rights while fostering innovation. The interplay between data protection and privacy laws and the AI Act demonstrates a cohesive framework that addresses the complexities of AI technology. Together, these laws ensure that the development and deployment of AI systems are conducted ethically, transparently, and in alignment with data protection principles, promoting trust in AI technologies across society.

Recent Trends in Privacy Litigation in France

There has been a rise in privacy-related lawsuits in France, largely driven by heightened awareness of data protection rights following the implementation of data protection and privacy laws.

Many cases relate to data breaches, unauthorised processing of personal information, or mishandling of a request to exercise rights (particularly the right to access). Individuals are increasingly seeking compensation for damages resulting from breaches, specifically emphasising organisations’ accountability to protect personal data.

There has been a notable trend of seeking compensation for moral prejudice caused by data privacy violations. Plaintiffs are leveraging the damages provisions of data protection and privacy laws to pursue financial compensation, reflecting a shift in how privacy violations are perceived and litigated.

The CNIL plays a crucial role in shaping privacy litigation. It not only enforces compliance through fines and investigations but also provides guidance that can influence suits brought before courts. Injunctions and sanctions issued by CNIL can lay the groundwork for subsequent legal actions.

As the use of AI and digital technologies proliferates, litigation surrounding challenges such as algorithmic bias, data subject rights in automated decision-making, and transparency requirements is likely to increase.

Impact of Supranational/International Developments on Domestic Litigation

Decisions made by the CJEU regarding data protection and privacy, such as those on the validity of the Privacy Shield, the interpretation of fundamental rights under EU law or, more recently, on compensation for moral prejudice in the event of a breach of data protection and privacy laws, may have direct implications for domestic litigation. Such rulings inform courts in France and help shape legal interpretations concerning privacy matters.

Influence of the European Data Protection Board (EDPB): the EDPB can adopt opinions to ensure consistent application of the GDPR and binding decisions to settle disputes between supervisory authorities in the EU referred to it.

The European Commission has a key role to play especially in personal data transfers outside the EU, in international cooperation or in relation to EDPB missions.

France’s obligations under international treaties, like the European Convention on Human Rights (ECHR), which guarantees the right to respect for private and family life, influence the judicial landscape around privacy litigation. French courts often consider these obligations when adjudicating privacy-related cases.

The activities of major technology companies, particularly concerning the handling of personal data, have sparked litigation not only in France but also across Europe and globally. Developments and regulatory measures taken against these companies (eg, fines or compliance orders) can create a ripple effect domestically, prompting similar litigation.

CJEU Case Law on Article 82 of the GDPR

Article 82 of the GDPR allows individuals to claim damages (both material and non-material) from organisations violating their data protection rights. Recent CJEU rulings clarify that compensation is required for all damages resulting from GDPR infringements. To successfully claim damages under Article 82, three conditions must be met: a fault by the organisation, demonstrable damage to the individual, and a causal link between the fault and the damage. French courts apply this principle, consistent with the French civil code, requiring justification for the claimed damages even when a GDPR violation is established.

CJEU Case Law on the Impact of a GDPR Breach on Unfair Commercial Practices

The CJEU recently ruled that violations of the GDPR can also constitute unfair commercial practices under national laws. This means competitors can sue companies for GDPR violations where those violations give them an unfair competitive advantage. The CJEU’s decision supports existing French case law, in which courts have already treated non-compliance with the GDPR as a form of unfair competition.

CJEU Case Law on the Retention of Metadata and Login Data

The CJEU ruled that French legislation allowing generalised and indiscriminate storage of traffic and location data by internet service providers violated EU law. The CJEU held that such data retention is only permissible for serious crimes and requires prior authorisation from a court or independent authority, except in emergencies. Subsequently, the French Criminal Procedure Code (article 60-1) and court decisions have been aligned with this CJEU ruling.

The Representative Actions Directive (EU Directive 2020/1828) aims to create a harmonised framework across EU member states for collective redress, particularly in consumer protection cases. France missed the transposition deadline of 25 December 2022 but is now in the process of implementing the Directive into its national legislation. The objective is to enhance the existing framework for collective redress, making it more accessible and aligning it with the provisions of the Directive.

The French System for Collective Redress

The French legislator wanted to avoid the excesses attributed to the American “class action.” As such, it has opted for several restrictions that significantly reduce the effectiveness of class action in France.

France does not have one single, unified system. In fact, there are six, including one for personal data.

In addition to the reduced scope of application, the vast majority of class actions do not offer the possibility of claiming compensation for the entirety of damages. For example, Article L. 623-2 of the French Consumer Code states that a class action can only claim compensation for economic loss resulting from material damage suffered by consumers. To benefit from a class action (an opt-in system), claimants must come forward within a set period.

Only consumer associations can proceed with collective redress related to personal data.

As France moves to implement the Representative Actions Directive, the landscape for collective redress is likely to become more accessible and structured, enabling consumers to pursue their rights effectively.

The Data Act, which came into force on 11 January 2024 and will apply from 12 September 2025, aims to govern access to and sharing of data, particularly in the Internet of Things (IoT) context. Its primary objective is to create a fairer and more competitive single-data market within the European Union.

The Data Act grants users the right to access data generated by their connected products, which means that manufacturers must provide users with easy and direct access to this data in an interoperable and reusable format. This right particularly applies to data related to the product’s operation and performance, which are often crucial for maintenance and repair.

Significant emphasis is placed on data interoperability. Manufacturers are required to design compatible products that allow users to easily transfer data to other services or platforms. This promotes competition and the creation of a more open ecosystem.

In certain cases, manufacturers may be required to share the data they collect with third parties, particularly for reasons of competition, innovation, or public interest.

The Data Act also addresses the issue of industrial data sharing, which is particularly important for industrial IoT. It aims to encourage the sharing of this data to promote innovation and the competitiveness of European businesses.

In short, the Data Act represents a significant change in the regulation of data generated by connected objects. It aims to empower users, encourage competition and innovation, and create a fairer and more open data ecosystem. Its impact on the IoT will be substantial in the medium term, requiring significant adaptations from manufacturers and application developers.

Moreover, the Cybersecurity Act and the Cyber Resilience Act (which will apply, in part, from December 2027), both pieces of EU legislation, have significant implications for the Internet of Things (IoT). The Cybersecurity Act establishes a framework for managing cybersecurity risks at the infrastructure level, indirectly affecting IoT, while the Cyber Resilience Act directly addresses the security of IoT products themselves.

The interplay between data regulation on IoT services, such as the Data Act, the Cybersecurity Act, and the Cyber Resilience Act, and data protection requirements in France is complex. It is primarily shaped by data protection and privacy laws (where IoT systems process personal data) and may be complemented by French national laws in the future.

GDPR’s Broad Applicability

The GDPR is the cornerstone. It applies to any processing of personal data relating to individuals in the EU, regardless of the organisation’s location. This means IoT services operating in or targeting French users fall under the GDPR’s scope, regardless of where the service provider is based.

French National Laws

While the GDPR sets the baseline, France may adopt specific national laws that further detail or specify certain aspects of data protection within the IoT context. To this end, France has enacted Law No 2024-449 of 21 May 2024 aimed at securing and regulating the digital space, in anticipation of the Data Act’s obligations. For example, the law sets out interoperability, portability and functional equivalence obligations for cloud computing service providers.

In France, the use of IoT entails specific obligations for organisations operating in this field. Thus, some key aspects of data protection relevant to IoT can be highlighted.

Data minimisation and purpose limitation: IoT devices often collect vast amounts of personal data. Data protection and privacy laws mandate that only necessary data be collected for specified, explicit, and legitimate purposes. This requires careful design and implementation of IoT systems to avoid excessive data collection.

Consent: For IoT devices, obtaining meaningful consent where appropriate can be challenging due to the complexity of the technology and the variety of data collected. This often requires clear, accessible privacy policies and user-friendly consent mechanisms.

Data security: Data protection and privacy laws demand appropriate technical and organisational measures to ensure the security of personal data. IoT devices are often vulnerable to breaches. Hence, strong security protocols, regular updates, and robust incident response plans are crucial.

Data subject rights: Individuals have rights under data protection and privacy laws, including the right to access, rectify, erase, restrict processing, and data portability. IoT service providers must implement mechanisms to allow users to exercise these rights effectively.

Accountability: Data protection and privacy laws place a significant emphasis on accountability. Organisations must be able to demonstrate compliance with the regulation. For IoT, this translates into maintaining detailed records of data processing activities, conducting data protection impact assessments (DPIAs) where appropriate, and implementing appropriate data governance structures.

Data security and privacy by design: Integrating data protection into the design and development process (Privacy by Design) from the outset is paramount. This requires a multidisciplinary approach involving engineers, data scientists, legal experts, and ethics specialists.

Cross-border data transfers: If IoT data is transferred outside the EU, compliance with data transfer mechanisms (eg, standard contractual clauses and binding corporate rules) is necessary.

The regulatory landscape governing IoT services and data processing in France establishes stringent obligations to ensure the protection of personal data, security of IoT devices, and compliance with relevant laws. Organisations must navigate these obligations carefully, implementing necessary compliance measures, enhancing transparency, and prioritising user rights. Failing to comply can result in significant legal and financial repercussions, emphasising the importance of robust data governance in the context of IoT.

In France, several key bodies could be responsible for specifically enforcing data regulation concerning IoT providers, data holders, and data processing services.

The CNIL

This regulator is the primary personal data protection authority in France, overseeing compliance with data protection laws, including those relevant to IoT devices and services. In this regard, the CNIL has published several IoT-related articles and provided a privacy impact assessment (PIA) on IoT.

The ARCEP

This authority is responsible for regulating telecommunications operators, including those providing IoT services related to communication networks, and ensuring compliance with privacy and data protection standards. In this regard, the ARCEP set up “ARCEP’s IoT workshops” to learn more about IoT services.

The ANSSI

Focusing on cybersecurity, ANSSI ensures that IoT devices and systems are secure and comply with national security standards, protecting data integrity. In this regard, the ANSSI published specific guidelines on the security of the IoT.

Summary

Data regulation related to IoT in France is enforced by the CNIL, where personal data are processed through IoT services. Furthermore, sector-specific regulators like the ARCEP and the ANSSI also play vital roles in overseeing compliance in their respective domains. To date, there is no official regulator in France specifically dedicated to IoT. 

In France, the use of cookies is mainly governed by the FDPA, the ePrivacy Directive (often referred to as the “Cookie Law”, which has been transposed into the FDPA), and specific guidelines on cookies issued by the CNIL.

Consent Requirements

Websites must obtain explicit consent from web users before placing cookies on their devices, except for cookies that are strictly necessary for the website’s functioning. This means that web users must be presented with a clear and affirmative option to accept cookies (opt-in).

Web users must be provided with clear information about the types of cookies used, their purposes, and how long they are stored.

Types of Cookies

Essential cookies are necessary for the website to function correctly (eg, for session management and shopping carts). Consent is not required for these cookies.

Non-essential cookies include cookies that track user behaviour for analytics, advertising, and marketing purposes. Consent is required before using these types of cookies.

For third-party cookies (eg, those from advertisers), the website must obtain web user consent for its own cookies and any cookies placed by third parties.

Websites must also display a cookie banner or pop-up that informs users about cookie usage upon their first visit. This banner should include:

  • clear options for users to accept or reject cookies; and
  • a link to a detailed cookie policy that explains what cookies are used, their purposes, and how users can manage their preferences.

Users must be able to withdraw their consent easily at any time. This should be straightforward and accessible, similar to the process of providing consent.

Organisations must keep records of user consent and cookie preferences to demonstrate compliance with cookie regulations. Furthermore, organisations are encouraged to regularly review their use of cookies, ensuring that consent mechanisms function correctly and that users are informed of any changes in cookie policies.
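The record-keeping obligation described above can be illustrated with a minimal sketch (the class and field names below are hypothetical, for illustration only): an append-only store that logs each consent event with a timestamp, so an organisation can demonstrate who consented to what and when, and that withdrawal is as easy as consent.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical in-memory store illustrating auditable consent records.
@dataclass
class ConsentEvent:
    user_id: str
    category: str        # eg, "analytics", "advertising"
    granted: bool        # True = consent given, False = withdrawn
    timestamp: datetime

class ConsentStore:
    def __init__(self) -> None:
        self._events: list[ConsentEvent] = []

    def record(self, user_id: str, category: str, granted: bool) -> None:
        # Append-only log: withdrawals are new events, never deletions,
        # so the full history remains available as evidence of compliance.
        self._events.append(
            ConsentEvent(user_id, category, granted, datetime.now(timezone.utc))
        )

    def current_status(self, user_id: str, category: str) -> bool:
        # The latest event for this user/category determines the status;
        # absent any event, consent is treated as not given (opt-in model).
        for event in reversed(self._events):
            if event.user_id == user_id and event.category == category:
                return event.granted
        return False

store = ConsentStore()
store.record("user-42", "advertising", granted=True)   # opt-in
store.record("user-42", "advertising", granted=False)  # later withdrawal
print(store.current_status("user-42", "advertising"))  # False: withdrawn
print(store.current_status("user-42", "analytics"))    # False: never given
```

The append-only design mirrors the regulatory logic: an organisation must be able to evidence the full consent history, not merely the current state.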

Websites must provide the web user with a cookie policy that clearly outlines:

  • the types of cookies used (eg, first-party vs. third-party cookies);
  • the purpose of each cookie (eg, functionality, performance, marketing);
  • how long cookies will be stored on the web user’s device; and
  • how web users can manage their cookie preferences, including how to delete cookies or opt out.

In summary, the requirements for using cookies in France emphasise the necessity of obtaining informed and explicit consent from web users, providing transparency about cookie usage, and ensuring that web users can easily manage their cookie preferences. Thus, organisations operating websites within France need to comply with these requirements.

The French legal landscape regarding online personalised advertising is primarily shaped by the FDPA, the ePrivacy Directive and the CNIL’s guidelines on cookie use.

Over the past few years, personalised advertising has been at the heart of numerous complaints lodged with the CNIL. The CNIL has recently imposed severe sanctions on organisations using personalised advertising without the consent of web users: DOCTISSIMO (2023) – EUR100,000 and YAHOO! (2023) – EUR10 million.

To respect data protection principles, organisations must, in particular, have a valid lawful basis for processing personal data for personalised online advertising purposes. In such cases, explicit consent is required to track web users across websites (eg, through cookies) in order to provide them with personalised content. Web users must also be informed about the use of such tracking mechanisms and must be able to exercise their right to object to the use of their personal data for personalised advertising at any time.

The French legal landscape establishes a comprehensive set of regulations companies must navigate when engaging in personalised advertising. Compliance with consent requirements and transparency obligations is essential for advertisers operating in France.

Data protection and privacy laws in France significantly influence the employment relationship, particularly in how employers collect, process, and manage employee personal data.

Regarding transparency, employers are required to provide clear and transparent information to employees about the purposes of data processing, the categories of data collected, the recipients of the data, and how long such data will be retained. In this regard, employers must issue privacy notices to employees outlining such information and informing them of their rights concerning their personal data.

Regarding the legal basis for processing, while consent can be a valid basis for processing personal data, it must be freely given, specific, informed, and unambiguous, which can be challenging in employment relationships where power dynamics exist. This is why, in France, consent is not the preferred legal basis for processing employee data. In practice, consent is required for the use of employee images, but other purposes are mainly based on contract performance, legitimate interest or compliance with a legal obligation.

Regarding monitoring, employers may be tempted to monitor employees’ activities (eg, email and internet usage, CCTV, working time, and phone call recording) in the workplace. However, such monitoring must comply with data protection and privacy laws and be justified as necessary and proportionate. In this regard, the CNIL has published a large number of recommendations on employee monitoring over the years.

Regarding the impact on human resources practices, HR departments must develop policies and practices that align with data protection and privacy laws, as well as provide training to HR staff and employees about data protection rights and responsibilities, fostering a culture of privacy compliance.

Non-compliance with data protection and privacy laws can lead to significant penalties, including fines from the CNIL and potential civil liabilities. Employees may also have grounds for legal action if their data privacy rights are infringed. For example, in December 2023, the CNIL fined AMAZON FRANCE LOGISTIQUE EUR32 million for implementing an excessively intrusive system for monitoring employee activity and performance. The company was also fined for video surveillance implemented without adequate information and with insufficient security.

Data protection and privacy laws substantially impact the employment relationship in France by establishing comprehensive rights and obligations concerning personal data for both employers and employees. Employers must navigate these legal requirements carefully to create compliant and respectful data handling practices. The focus on employee privacy rights not only aims to protect individuals but also encourages organisations to foster trust and transparency within the workplace, ultimately leading to a more equitable employment environment.

When conducting asset deals in France, parties must pay careful attention to data processing requirements to ensure compliance with applicable data protection and privacy laws.

In summary, the requirements for data processing during asset deals in France involve a thorough understanding of relevant data protection laws and compliance measures. Parties must assess and manage personal data effectively, ensuring lawful processing, conducting due diligence, and respecting the rights of data subjects throughout the transaction. By adhering to these requirements, organisations can mitigate risks and ensure a smooth data transfer during asset deals.

International data transfers of personal information from France are subject to strict regulations under data protection and privacy laws. These regulations establish requirements and restrictions designed to protect individuals’ personal data when it is transferred outside the European Economic Area (EEA).

General Principles of International Data Transfers

Safeguards for international transfers

Adequacy decisions: The FDPA applies the GDPR requirements and allows for the transfer of personal data to third countries (non-EEA countries) only if the European Commission has determined that the country ensures an adequate level of data protection. Countries with adequacy decisions are deemed to provide protection comparable to data protection and privacy laws, allowing unrestricted data transfers.

Special case of data transfers with the USA: Since July 2023, transfers between the USA and the European Union are now governed by the “EU-US Data Privacy Framework.” This framework follows the CJEU’s invalidation of the previous adequacy decision (Privacy Shield).

If a third country does not have an adequacy decision, organisations may not transfer personal data there unless they provide appropriate safeguards such as:

Standard Contractual Clauses (SCCs): Organisations can use SCCs, which are pre-approved contract templates provided by the European Commission that outline data protection obligations and rights for parties involved in the data transfer.

Binding Corporate Rules (BCRs): Multinational companies can implement BCRs, which are internal policies enforced across their global operations. BCRs must meet specific requirements and receive approval from relevant data protection authorities.

Transfer Risk Assessment (TRA) of Data Importing Jurisdiction

In cases where personal data is transferred to jurisdictions without an adequacy decision, organisations must carefully assess the data protection practices of the importing country, as follows.

  • Evaluating local laws: Organisations must analyse the local data protection laws and practices in the recipient country to determine whether they provide sufficient protection for the transferred data. This includes assessing:
    1. the comprehensiveness of privacy laws;
    2. the enforcement of privacy rights; and
    3. potential government access to data and surveillance practices.
  • The impact on data subjects’ rights: Organisations should consider how local laws may affect individuals’ rights under data protection and privacy laws. If the importing jurisdiction’s laws pose risks to data subjects’ rights (eg, through excessive government access or lack of recourse), this may preclude transfers unless additional protections are implemented.

Furthermore, the CNIL has issued recommendations emphasising the need for a thorough TRA and highlighting the importance of maintaining documentation of the measures taken to ensure compliance during international data transfers.

Data subjects should be informed if their data will be transferred to a non-EEA country, particularly if that jurisdiction lacks an adequacy decision. This communication should include details about the potential risks and the safeguards implemented.

In summary, international data transfers of personal information from France are regulated rigorously under data protection and privacy laws, with specific restrictions and requirements for assessing international data importing jurisdictions. Organisations must ensure that any transfers comply with applicable regulations, utilising appropriate safeguards and conducting thorough risk assessments to protect data subjects’ rights.

Government notifications or approvals may be required under the “French Blocking Statute” (see section 5.4 Blocking Statutes).

The French Public Health Code requires that health data be hosted by an “HDS”-certified hosting provider and exclusively within a country of the European Economic Area (EEA). This localisation requirement provides important guarantees in terms of data protection and helps strengthen the confidence of patients and professionals in digital healthcare, as well as contributing to the emergence of an ecosystem of European players.

The “SecNumCloud” standard, published by the ANSSI, is a reference framework for cloud service providers. It requires personal data to be stored and processed within the EEA. Moreover, the French government encourages public bodies to host “particularly sensitive” data only on SecNumCloud-qualified cloud offerings.

France’s Blocking Statute (Law No 68-678, strengthened in 1980) restricts the transfer of sensitive economic, commercial, industrial, financial, or technical information to foreign authorities. This restriction applies to French citizens, residents, and companies operating in France unless permitted by international treaties. The law protects information potentially harming French sovereignty, security, essential economic interests, or public order. Since 1 April 2022, French companies receiving such requests must immediately report them to the Strategic Information and Economic Security Service (SISSE).

French Law No 68-678 of 26 July 1968, also known as the “French Blocking Statute”, as modified by Law No 80-538 of 16 July 1980, governs the communication of certain types of information to foreign entities. The law is notably a response to the use of the discovery procedure by American courts, which allows strategic data to be communicated to rival companies during legal proceedings.

This law aims to protect France’s economic interests by limiting the transmission of certain sensitive information abroad.

It covers two situations:

  • the communication of information by French nationals or residents to foreign public authorities that could harm national interests or public order; and
  • the exchange of information to gather evidence for or in the context of foreign judicial or administrative proceedings.

The French Blocking Statute prohibits any individual or legal entity from communicating these types of information to foreign public authorities except within the framework of international treaties or agreements. It requires prior authorisation from the French government to communicate such information (eg, for international judicial cooperation, cross-border mergers and acquisitions, or international banking compliance). Violating this law can result in criminal sanctions, including fines and imprisonment, and it applies not only to acts committed in France but also to acts committed abroad by French persons or entities.

In view of the growing use of laws with extraterritorial reach by foreign players and the limited dissuasive effect of the French Blocking Statute, the decree of 18 February 2022 and the order of 7 March 2022 clarified the procedure for companies and designated the Strategic Information and Economic Security Service (SISSE) as the single point of contact. The SISSE assists French companies, in liaison with the various government departments, in responding to the demands of foreign courts. In practice, however, the French Blocking Statute has been only relatively effective in preventing such communications.

Recent developments in France’s regulation of the international transfer of personal data reflect ongoing changes in the European data protection landscape.

Post-Schrems II Context

Schrems II ruling (2020)

The CJEU issued its landmark ruling in July 2020, invalidating the EU-US Privacy Shield framework, which previously allowed for the transfer of personal data between the EU and the United States. This ruling emphasised concerns regarding US surveillance practices and the lack of comparable protection for EU citizens’ data rights.

Implications for Transfers: Following the ruling, organisations faced increased scrutiny and challenges when transferring personal data to the United States and other third countries without an adequacy decision, necessitating appropriate safeguards such as Standard Contractual Clauses (SCCs).

Updated Standard Contractual Clauses (SCCs)

New SCCs

In June 2021, the European Commission adopted new Standard Contractual Clauses, replacing the previous versions. These updated clauses provide a more flexible and comprehensive framework for organisations to establish compliance with data protection and privacy laws when transferring personal data internationally.

CNIL guidance

In France, the CNIL has provided guidance on implementing new SCCs and emphasised the necessity of conducting thorough assessments of data protection laws in third countries when utilising SCCs for international transfers.

Prompting Additional Safeguards

Transfer risk assessments (TRAs) are essential for organisations following the invalidation of the Privacy Shield and the implications of the Schrems II ruling. These assessments help evaluate whether the legal framework of the importing country offers adequate data protection.

Organisations are encouraged to implement supplementary measures alongside SCCs or other safeguards when transferring data to jurisdictions considered inadequate. This may include encryption, pseudonymisation, or additional contractual clauses.
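As an illustration of one such supplementary measure, the sketch below (a hypothetical example, not an endorsed implementation) shows keyed pseudonymisation of direct identifiers before export, on the assumption that the secret key remains with the EU-based exporter so the importing party cannot reverse the mapping.

```python
import hashlib
import hmac

# Hypothetical sketch: HMAC-based pseudonymisation of direct identifiers
# before export. The secret key stays with the EU-based exporter, so the
# importer receives only pseudonyms it cannot invert without the key.
SECRET_KEY = b"exporter-held-secret"  # in practice, from a key management system

def pseudonymise(identifier: str, key: bytes = SECRET_KEY) -> str:
    # HMAC-SHA256 is deterministic for a given key, so the same person
    # always maps to the same pseudonym (useful for joins on the importer
    # side) while remaining infeasible to reverse without the key.
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "alice@example.com", "country": "FR", "score": 7}
exported = {**record, "email": pseudonymise(record["email"])}
print(exported["email"])  # a 64-character pseudonym, not the raw email
```

Note that pseudonymised data generally remain personal data under the GDPR as long as the exporter retains the key; the measure reduces transfer risk but does not anonymise the data.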

New EU-US Data Transfer Framework

Following the invalidation of the Privacy Shield, discussions between EU and US authorities aimed to establish a new transatlantic data transfer framework addressing the concerns raised in the Schrems II ruling.

These efforts culminated in the adoption of the EU-US Data Privacy Framework in July 2023, which now regulates data transfers between the USA and the European Union while aligning them with EU data protection standards.

Jeantet

11 rue Galilée
75116
Paris
France

+33 (0)1 45 05 80 08

+33 (0)1 47 04 20 41

info@jeantet.fr
www.jeantet.fr/en/
