The fundamental national legislation applicable to personal data in France is as follows:
Some other complementary national legislation applies to personal data, as follows:
The French Data Protection Authority is the Commission Nationale de l’Informatique et des Libertés (CNIL). It is, under French law, an independent administrative authority whose main missions are to inform, educate, advise, anticipate innovation, investigate and impose sanctions.
The CNIL's investigations may be initiated based on current events, on complaints received by the CNIL or on the CNIL's annual programme of controls (published each year), and may be part of, or related to, previous investigations performed by the CNIL (eg, a follow-up to a formal notice or to previous sanctions); they may also be part of the co-operation programme with other European Data Protection Authorities.
The CNIL's inspections are regulated by the CNIL's internal rules. Online and on-site inspections can be conducted. An official and publicly available list is published each year, identifying the agents of the CNIL authorised to carry out audits and inspections.
During inspections, particularly on-site inspections, CNIL agents may request access to all documents necessary to fulfil their mission (eg, records of processing activities), regardless of the medium, and make copies of them. They can also access software programs and request their transmission by any appropriate means. In addition, they can request that some documents or information be communicated after the inspection.
At the end of the inspection, the CNIL issues an inspection report, and may request additional documents.
Consequences of an Inspection
If the information and documents provided by the data controller during and after the inspection do not call for any particular observations, the inspection procedure is closed.
However, if the inspection leads to the identification of a lack of compliance with the applicable data protection rules, the CNIL can decide to:
Appeal of a Sanction
Once a sanction has been issued by the CNIL, the company concerned can appeal it to the French Administrative Supreme Court (Conseil d'Etat) within two months of the CNIL's sanction.
As a member of the European Union, France has to comply with the European legal and regulatory framework on personal data protection, which includes the GDPR, the Police-Justice Directive and the guidelines of the EDPB. The ePrivacy Regulation is still at the draft stage, but once it is adopted French companies will also have to comply with it.
The major privacy/data protection non-governmental organisations in this field in France are as follows:
In terms of the protection of personal data, the French legal and regulatory framework is one of the most developed and strictest in the world. Before the GDPR, the French Data Protection Act had been in place since 1978 and the CNIL was already one of the most active and influential authorities in Europe. Over the years, it has created a veritable toolbox for both professionals and individuals, with deliberations, recommendations, guidelines and practical guides. It was also the first European authority to sanction one of the "GAFA" companies, through its EUR50 million sanction of Google on 21 January 2019 (currently under appeal).
Key developments in the past 12 months in France include the publication of the following documents (in chronological order):
At the European level, the EDPB published the following guidelines (in chronological order):
Significant topics over the next 12 months in France are as follows:
Appointment of Privacy or Data Protection Officers
Article 37 of the GDPR requires the appointment of a Data Protection Officer (DPO) for public organisations and for organisations whose core activities consist of either operations that require regular and systematic monitoring of individuals on a large scale, or large-scale processing of sensitive data or data relating to criminal convictions and offences.
In case of doubt, the appointment of a DPO is strongly recommended by data protection authorities.
The French Data Protection Act does not provide additional requirements relating to the appointment of a DPO.
Criteria to Authorise Collection, Use or Other Processing
Article 6 of the GDPR defines six legal bases upon which to process personal data:
Application of “Privacy by Design” or “by Default”
The concepts of “privacy by design” and “privacy by default” are included in Article 25 of the GDPR, and the EDPB has published Guidelines 4/2019 on this subject.
The “privacy by design” concept consists of implementing – from the very early stage of the conception of personal data processing – organisational and technical measures to implement the data protection principles and necessary safeguards to meet the GDPR requirements and protect the rights of data subjects.
The “privacy by default” concept requires the implementation of organisational and technical measures for ensuring that, by default and therefore at any stage of the processing, data protection principles are respected and the necessary safeguards applied.
Privacy Impact Analyses
The data protection impact assessment (DPIA) consists of analysing a processing operation and its risks in order to assess and mitigate them. Article 35 of the GDPR requires a DPIA for the following:
In addition, the Article 29 Working Party issued guidelines in which it lists nine criteria to be assessed. If one criterion is satisfied, a DPIA is recommended; if two criteria are satisfied, a DPIA is required. The criteria are as follows:
The CNIL also published a list of processing operations for which a DPIA is required, as well as a list for which it is not required.
Adoption of Internal or External Privacy Policies
The principle of accountability (Article 5-2 of the GDPR) requires organisations to document their compliance. In particular, organisations are required to implement documentation such as information notices, records of processing activities, global privacy policies, data retention policies, a handling procedure for complaints and data subject requests, data breach procedures, security policies, "privacy by design" and "privacy by default" procedures, etc.
In France, the French Toubon Law No 94-665 of 4 August 1994, relating to the use of the French language, and French employment law require policies and procedures to be translated into French in order to be enforceable against French employees. Privacy policies should therefore also be translated into French.
Data Subject Access Rights
Data subjects’ rights under the GDPR include the right to be informed, the right of access, the right to rectification, the right to erasure (“to be forgotten”), the right to the restriction of processing, the right to object to processing, the right to data portability, and the right to lodge a complaint with a supervisory authority.
In France, Law No 2016-1321 of 7 October 2016 for a Digital Republic introduced an additional right for French data subjects, namely the right to define guidelines about the processing of their personal data after their death.
Use of Data Pursuant to Anonymisation, De-identification or Pseudonymisation
De-identification is not a concept used under the GDPR, which refers only to anonymisation and pseudonymisation.
Anonymisation is a processing operation that consists of using a set of techniques in such a way as to make it impossible, in practice, to identify the person by any means whatsoever and in an irreversible manner. The GDPR does not apply to anonymous data because such data is no longer of a personal nature. Before the GDPR entered into application, the Article 29 Working Party published a detailed opinion on anonymisation techniques (Opinion 05/2014 on Anonymisation Techniques adopted on 10 April 2014), which still provides useful guidance on how to anonymise personal data properly.
Pseudonymisation is the processing of personal data in such a way that data relating to a natural person can no longer be attributed to that person without the use of additional information. In practice, pseudonymisation consists of replacing directly identifying data in a dataset with indirectly identifying data (alias, number in a classification, etc). Pseudonymisation thus makes it possible to process data on individuals without being able to identify them directly. In practice, however, it is often possible to trace the identity of individuals using third-party data. For this reason, pseudonymised data remains personal data.
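By way of illustration only (this sketch is not drawn from the GDPR or the CNIL's guidance, and all names in it are hypothetical), pseudonymisation can be pictured as replacing a direct identifier with a keyed alias, where the secret key constitutes the "additional information" that must be kept separately:

```python
import hmac
import hashlib

# Hypothetical secret key: it is the "additional information" that must be
# kept separately from the dataset, since whoever holds it can re-link
# aliases to individuals.
PSEUDONYMISATION_KEY = b"keep-me-outside-the-dataset"

def pseudonymise(direct_identifier: str) -> str:
    """Replace a direct identifier (eg, an email address) with a stable alias."""
    return hmac.new(PSEUDONYMISATION_KEY,
                    direct_identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"email": "jane.doe@example.com", "journeys_made": 12}
pseudonymised_record = {
    "alias": pseudonymise(record["email"]),  # indirectly identifying alias
    "journeys_made": record["journeys_made"],
}
# The result remains personal data: re-identification is still possible with
# the key or by cross-checking with third-party data, unlike anonymised data.
print(pseudonymised_record)
```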
Restrictions on or Allowances for Profiling, Automated Decision Making, Online Monitoring or Tracking, Big Data Analysis, Artificial Intelligence, Algorithms
Article 22 of the GDPR provides a framework for automated decision-making that produces legal or similarly significant effects. It applies to decisions based exclusively on automated processing that "produce legal effects" (a decision has legal effect when it affects a person's rights and freedoms) or that "significantly affect" individuals (a decision can have a significant impact, similar to a legal effect, when it influences a person's environment, behaviour or choices, or leads to a form of discrimination).
In principle, individuals have the right not to be subject to a decision based solely on automatic processing and producing legal effects concerning them or significantly affecting them in a similar manner. However, in specific cases, a data subject may be the subject of a fully automated decision, even if it has a significant legal effect or impact on him or her – for instance, if the decision is based on the explicit consent of the data subject, is necessary for the conclusion or performance of a contract, etc.
The Concept of “Injury” or “Harm”
"Injury" and "harm" caused by the unlawful processing of personal data are not explained by the French Data Protection Act, its implementing Decree or the guidelines of the CNIL. There is currently no sanction in France providing additional information on the notion of harm in terms of personal data protection. Any data subject may therefore seek compensation for any damages he/she has suffered in connection with privacy and data protection by invoking Article 1240 of the French Civil Code, which defines the general principle of responsibility under French law. Also, it is possible to invoke Article 226-1 of the French Criminal Code, which condemns the damage of the privacy, as well as Articles 226-16 to 226-22-1 and Articles R.625-10 to R.625-13 of the French Criminal Code, imposing sanctions for failure to comply with data protection rules.
Under Article 9 of the GDPR, sensitive data includes personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation.
In principle, the collection and processing of such data is prohibited, except under specific circumstances (eg, when the data subjects' consent is collected, when the information is manifestly made public by the data subject, when the processing is necessary for the protection of the vital interests of the data subject, when the processing is justified by the public interest and authorised by the CNIL, etc).
Financial data is not deemed sensitive. Nevertheless, in the CNIL's deliberation No 2018-303 of 6 September 2018, regarding the processing of payment card data for the sale of goods or the provision of services at a distance, financial data, including payment card data, is qualified as "highly personal data", given the serious impact on data subjects that its violation could have.
Health data is sensitive data. Personal data concerning health is data relating to the past, present or future physical or mental health of a natural person (including the provision of healthcare services), which reveals information about the state of health of that person. Hosting providers of such health data in France must apply for a specific certification to host such data, regardless of where the data is located.
Communications data is governed in French law by Article L.34-5 of the PECC, transposing the ePrivacy Directive. See 2.3 Online Marketing.
Voice telephony and text messaging are not deemed sensitive.
The content of electronic communications is not deemed sensitive, but is subject to specific rules and retention periods under French law. See 2.3 Online Marketing.
Children’s or students' data is not deemed sensitive, although minors are considered vulnerable persons according to the CNIL. Where the legal basis for a processing is consent and the data subject is a minor under the age of 15, the data controller must obtain the consent of the minor concerned and the holder(s) of parental authority over that minor, jointly. In addition, the processing of personal data of vulnerable persons (children) is one criterion used by the CNIL to impose the conducting of DPIAs.
Employment data is not deemed sensitive, but the CNIL has issued practical recommendations on the processing of employment data. In addition, the processing of personal data of vulnerable persons, such as employees, is one criterion used by the CNIL to impose the conducting of DPIAs.
Social Security number (NIR): although not listed as sensitive data in the French Data Protection Act, the Social Security number is considered by the CNIL to be a very specific piece of personal data. Its use in France is strictly limited to specific purposes, as detailed in Decree No 2019-341 of 19 April 2019 relating to the use of such data, which provides a detailed list of authorised purposes, limited to sectors such as social protection, healthcare, work and employment, finance, tax and customs, etc.
Internet, Streaming and Video Issues
Cookies and beacons
The data controller must now obtain the prior consent of the user (freely given, specific, informed and unambiguous, by means of a declaration or a clear affirmative act) before storing information on a user's equipment or accessing information already stored there. This applies in particular to cookies related to targeted advertising operations, certain audience measurement cookies, performance cookies and social network cookies generated in particular by share buttons. Certain audience measurement cookies are exempted from consent collection if cumulative criteria defined by the CNIL are met (eg, clear and complete information is provided, an easy objection mechanism is accessible and usable on all browsers and all types of terminals, there is no cross-checking with other processing operations, the retention period is limited, there is no possibility of following the internet user's navigation on other sites, etc).
In any case, all cookies, including strictly necessary cookies, require the provision of information to the user prior to their placement.
Large-scale processing of location data is one of the processing operations for which an impact assessment must necessarily be carried out, according to the list drawn up by the CNIL.
Do Not Track, and tracking technology
Do Not Track (DNT) is a function integrated into web browsers that allows internet users to indicate that they do not want to be tracked for advertising purposes. To date, no regulation requires websites to take this objection into account.
Social media, search engines, large online platforms
French case law considers social networks as hosting providers under Law No 2004-575 of 21 June 2004 on confidence in the digital economy (transposing E-commerce Directive 2000/31/EC), meaning that, regarding user-generated content, liability can only be sought when social networks do not act expeditiously to remove or disable access to the litigious information, upon obtaining such knowledge or awareness.
Dereferencing (“right to be forgotten”) allows the user to remove one or more results provided by a search engine after a query based on the identity (surname and first name) of a person.
On 6 December 2019, the French Supreme Administrative Court (Conseil d'Etat) issued 13 decisions in light of the judgment of the Court of Justice of the European Union dated 24 September 2019 (case C-136/17), in which it applied a proportionality test weighing the data subject's rights against the public's right to information, in the light of three criteria:
The nature of data must also be taken into account in this weighing, and affects the scope and effectiveness of a delisting request.
Sensitive data and data relating to judicial proceedings or criminal convictions and offences benefit from a higher level of protection: such a request may only be refused where access to such data is strictly necessary to inform the public.
On 2 December 2019, the EDPB adopted guidelines on the criteria of the right to be forgotten in search engine cases, in the context of delisting requests. These guidelines provide six grounds upon which data subjects can rely to obtain the delisting of their personal data by search engines, as well as the exceptions that search engines may invoke against them. Furthermore, the guidelines clarify that the right to be forgotten concerns not only the right for data subjects to obtain from search engines the removal of links to websites containing their personal data, but also the right to object to the processing of such data under Article 21 of the GDPR.
Addressing hate speech, disinformation, terrorist propaganda, abusive material, political manipulation, bullying, etc
Pursuant to the French Law on the manipulation of information ("Fake News Law") of 22 December 2018, operators may be ordered to stop the dissemination of "allegations or statements that are inaccurate or misleading in relation to a fact which may affect the upcoming vote's sincerity" for a defined period before general elections and until the publication of the results. The Fake News Law specifically targets operators of online platforms whose activity exceeds a specific threshold of connections from French territory.
In addition, a bill is under discussion to prevent the dissemination of hate speech (“Hate Speech Bill”). It increases the obligations of operators to remove certain illicit content within a very short timeframe (eg, content pertaining to terrorism and child pornography). It is likely to be adopted in the course of Spring 2020.
Article L.34-5 of the PECC defines direct marketing as "any message intended to promote, directly or indirectly, goods, services or the image of a person selling goods or providing services."
This definition is understood very broadly as it covers both the direct promotion (advertising campaigns by email, prospectus or brochure sent by email) and indirect marketing (any material aiming to promote the seller’s brand image) of products, services or the image of a company.
As a matter of principle, any B2C online direct marketing operation requires the data subject's explicit consent to receive direct marketing (opt-in). Opt-in must be obtained at the time of the collection of contact details.
Two exceptions exist, as follows:
Any electronic marketing communication must also provide the individual with a free, simple, direct and easily accessible means to opt out of marketing communications (eg, through a hyperlink at the bottom of the communication).
B2B electronic marketing communications are subject to a lighter regime. To send a B2B electronic marketing communication, the company must, at the time of collecting the contact details, provide (i) information to the recipient that its contact details will be used for marketing purposes, and (ii) a simple, free and easy way to opt out of receiving marketing communications.
For marketing communications carried out by post or by phone, an opt-out regime applies. When collecting the individual's contact details, the company must inform the individual that their details may be used for direct marketing purposes and provide the individual with the possibility to opt out of direct marketing communications at any time (eg, through an unchecked box).
Some provisions of the French Labour Code complement the French data protection legal framework, as follows:
Monitoring Workplace Communications
The CNIL has issued guidelines about HR processing, which underline that employers can control and limit the use of the internet and messaging at work for personal purposes (i) to ensure the security of networks that may be affected by attacks and (ii) to limit the risks of abuse of personal use of the internet at work or a professional mail box.
However, the employer must define the rules for personal/private use of professional devices and the internet, and must inform the employees of such rules. In addition, the employer cannot use keyloggers to remotely record all the actions performed on a computer, except in exceptional circumstances linked to a strong security imperative.
When recording employees' calls, for instance for training purposes or the improvement of the services provided, employers cannot couple a call with a screen capture system of the employee's computer workstation, and may not set up a permanent or systematic listening or recording device, except as provided for by law (eg, for some specific cases, such as emergency services). Employers must provide employees with telephone lines that are not connected to the recording system, or a technical device enabling them to switch off the recording, for personal calls. The same applies to calls made by staff representatives in the exercise of their duties.
Labour Organisations or Works Councils
Works Councils have an active role in France, and some personal data processing carried out by employers may require the information and consultation of the Works Council before their implementation (eg, whistle-blowing schemes, geolocation of employees' vehicles, employees' performance monitoring system, etc).
Whistle-blower Hotlines and Anonymous Reporting
On 10 December 2019, the CNIL published new guidelines relating to whistle-blowing schemes, together with an FAQ, which take into account the new French legal framework relating to anti-bribery and the duty of vigilance. The guidelines are applicable to schemes required by law pursuant to the "Sapin 2" Law (Articles 8 and/or 17) and the French Law relating to the duty of vigilance, as well as to schemes implemented by companies on a voluntary basis to collect alerts relating to non-compliance with the company's code of ethics or code of conduct.
Regarding anonymous reporting, the CNIL maintains its former position and indicates that, even if whistle-blowers can choose to remain anonymous, it is strongly recommended to encourage them to identify themselves. When exceptionally dealing with anonymous reports, companies must deploy specific and additional measures to assess the severity of the reported violation and the special care to be applied.
Discovery proceedings are investigative phases prior to civil and commercial litigation, and are essential to any legal action in the United States. Discovery requests made to French companies may include requests to produce and transfer thousands of emails from employees.
On 23 July 2009, the CNIL adopted a recommendation underlining that the required transfers of information can be subject to the French Blocking Statute and must necessarily be carried out in accordance with the Hague Convention, the only international convention binding France and the United States with regard to legal proceedings. In addition, documents must be exhaustively listed and have a direct and precise link with the object of the conflict in compliance with data protection applicable laws. The CNIL has not issued any updated opinion since then.
A sanction procedure can be initiated against an organisation if a breach of the GDPR or the French Data Protection Act is found following the filing of a complaint, a data breach or an inspection carried out by the CNIL. In most cases, the CNIL will issue a formal notice and offer the company a chance to remedy the infringements identified, although it is not compelled to do so.
A Rapporteur is designated to draft a report, which is communicated to the data controller and the other supervisory authorities concerned if the CNIL acts as lead supervisory authority. The data controller can then submit written observations before the hearing.
After the hearing, if the CNIL acts as lead supervisory authority, it communicates the draft decision to the other supervisory authorities concerned, which must provide the CNIL with their comments within a period of four weeks. The data controller is then notified of the CNIL's decision, which can be public or not.
The data controller can appeal the CNIL's decision before the French Administrative Supreme Court (“Conseil d'Etat”) within two months.
Potential Enforcement Penalties
Non-compliance with the GDPR could result in administrative fines and criminal sanctions, as follows:
Leading Enforcement Cases
Recent major fines issued by the CNIL range from EUR20,000 to EUR50 million. The latest decisions of the CNIL have sanctioned infringements such as the lack of valid consent, the lack of appropriate retention periods, the lack of appropriate security and confidentiality measures, the failure to inform data subjects and to respect their rights, and the failure to co-operate with the supervisory authority.
Article 77 of the GDPR recognises the right of data subjects to lodge a complaint with a national supervisory authority. Complaints in France can be addressed to the CNIL.
In addition, the French Data Protection Act provides two types of class actions:
The leading cases in the prior 12 months are the following:
Law enforcement’s access to data has been deeply modified and reinforced by Law No 2016-731 dated 3 June 2016 for “strengthening the fight against organised crime, terrorism and their financing, and improving the efficiency and safeguards of criminal proceedings”.
Under some circumstances, the Investigating Judge may prescribe the interception, recording and transcription of correspondence sent by electronic communications (Articles 100 to 100-8 of the French Criminal Procedure Code). Such actions are subject to several procedural safeguards and limitations, particularly in terms of duration.
Moreover, the Investigating Judge (or police officers authorised by the Liberties and Detention Judge) can – remotely and without informing the data subject – access correspondence stored via electronic communications accessible by means of a computer identifier, for a maximum period of one month, renewed under strict conditions (Articles 706-95-1 to 706-95-3 of the French Criminal Procedure Code).
The Investigating Judge may also perform investigations at the premises concerned, to make any useful findings or conduct searches (Articles 92 to 99-5 of the French Criminal Procedure Code). All objects, documents or computer data placed under judicial control shall be immediately inventoried and placed under seal.
Law No 2015-912 of 24 July 2015 on intelligence and its implementing Decree No 2016-67 of 29 January 2016 on intelligence gathering techniques define the legal framework that authorises the intelligence services to use information access techniques, particularly means of access to connection data and computer data capture, while guaranteeing the right to privacy.
For certain crimes and offences, the Investigating Judge may authorise the interception, recording and transcription of electronic correspondence. By way of exception and for the most serious crimes, the judge may also authorise, with or without the individual's consent, access to information, remote access to correspondence stored via electronic communications services, the seizure of a device, the copying of the device, or access to data stored on another computer device and previously available on the initially seized device. Procedural safeguards regulating such seizures are provided by Articles 76 and 97 of the French Criminal Procedure Code.
An organisation must not transfer personal data to a foreign government if such a transfer is not compliant with French and European data protection laws. Indeed, the Cloud Act explicitly provides that the service provider from whom the data is requested always has the possibility to object on the grounds that the request would lead to an infringement of the legislation of a foreign country and expose it to sanctions (a conflict of laws situation). France does not yet participate in a Cloud Act agreement with the USA. The Mutual Legal Assistance Treaty between France and the United States is not considered efficient, and discussions have been entered into between the USA and the EU to sign an EU-wide Mutual Legal Assistance Treaty. The provider may also refuse to disclose the requested data on the basis of the principle of international comity recognised by the US courts, according to which the important interests of other countries must be taken into account when applying US law and, where appropriate, US legislation must not be applied or must be applied in a nuanced manner.
Several laws in France are likely to hinder requisitions by the US authorities on data stored in Europe, particularly Articles 44 et seq of the GDPR (which lay down the conditions under which personal data may be transferred to third countries or international organisations), the French blocking statute (Law No 68-678 of 26 July 1968) and French Law No 2018-670 of 30 July 2018 on the protection of business secrecy.
“La Quadrature du net”, the French Data Network and “La Fédération des fournisseurs d'accès à internet associatifs” have challenged the constitutionality of certain provisions contained in Law No 2015-912 of 24 July 2015 on intelligence, before the French Constitutional Council.
The French Constitutional Council abrogated two provisions relating to the following:
Pursuant to Articles 44 to 50 of the GDPR, transfers of personal data outside the European Union are not authorised unless the third country recipient of the personal data ensures an adequate level of protection, or appropriate guarantees are applied.
The European Commission has so far recognised Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, Jersey, New Zealand, Switzerland, Uruguay and the United States of America (limited to the Privacy Shield framework) as providing adequate protection. Adequacy talks are currently ongoing with South Korea.
Where a third state is not recognised as offering an adequate level of protection, appropriate safeguards must be deployed, such as the execution of Standard Contractual Clauses, Binding Corporate Rules, etc.
In the absence of adequacy decisions or appropriate safeguards, a transfer of personal data can rely on some exceptions (article 49 of the GDPR), such as the explicit consent of the data subject, the performance of a contract between the data subject and the controller or the implementation of pre-contractual measures taken at the data subjects' request, etc.
Privacy Shield is a self-certification mechanism for companies established in the United States, which has been recognised by the European Commission as providing an adequate level of protection for personal data transferred to companies established in the United States. Before transferring personal data to a US company claiming to be Privacy Shield-certified, it is necessary to check whether the US company's certification is still active (certifications must be renewed regularly) and covers the processing in question.
Standard Contractual Clauses (SCC) are model contracts for the transfer of personal data adopted by the European Commission. There are two types of SCC: those governing transfers between two data controllers, and those governing transfers between a data controller and a data processor. In the Schrems II case, Maximilian Schrems challenged the sufficiency of the SCC to protect personal data transfers to the United States. According to the opinion of the Advocate General of the Court of Justice of the European Union presented on 19 December 2019, the standard contractual clauses used for data transfers between EU countries and third countries are "valid". It is now necessary to await the position of the European Court of Justice on this subject.
Binding Corporate Rules (BCR) constitute intra-group data protection agreements for the transfer of personal data. They must be evaluated and validated against the EDPB's standards. BCR must be legally binding and implemented by all relevant entities of the group of companies, must expressly confer rights on data subjects regarding the processing of their personal data, and must meet the requirements set out in the GDPR.
No government notifications or approvals are required to transfer data internationally.
The French Data Protection Act does not include provisions requiring personal data to be localised in France. The GDPR rules on transfers of personal data apply in France, and no stricter rules are imposed by the French Data Protection Act.
French law provides that national treasures cannot be exported, and public archives are considered national treasures. Hosting such data outside France is considered an export. The following Articles apply to exporting:
No software codes, algorithms or similar technical details are required to be communicated to the government.
The collection and transfer of personal data in connection with foreign government data requests may be limited in France through Articles 44 et seq of the GDPR pertaining to transfers of personal data outside the European Union, the French blocking statute (Law No 68-678 of 26 July 1968) and French Law No 2018-670 of 30 July 2018 on the protection of business secrecy.
The "French blocking statute" is Law No 68-678 of 26 July 1968 relating to the communication of documents and information of an economic, commercial, industrial, financial or technical nature to foreign individuals or legal entities.
The key measures of this law are as follows:
Big Data Analytics
See 2.2 Sectoral and Special Issues: Cookies and beacons for the regulation of analytics cookies and online trackers.
A profiling processing operation is based on the establishment of an individualised profile, relating to a particular person, in order to evaluate certain personal aspects of that person, with a view to making a judgement or drawing conclusions about him or her.
A fully automated decision is a decision taken with respect to an individual, through algorithms applied to his or her personal data, without any human being intervening in the process.
In practice, the two notions are closely linked: profiling a person frequently leads to a decision being made about them, and many fully automated decisions are made on the basis of profiling.
Article 22 of the GDPR regulates automated decision processing, whether it is based on profiling or not. Pursuant to this article, individuals have the right not to be subject to a decision based exclusively on automated processing and producing legal effects concerning them or significantly affecting them in a similar way. However, in certain specific cases, a person may be the subject of a fully automated decision, even if it has a legal effect or a significant impact on him or her. These exceptions concern (i) the decisions based on the explicit consent of the data subjects; (ii) the decisions necessary for the conclusion or performance of a contract; and (iii) the decisions framed by specific legal provisions.
Article 22 of the GDPR provides a framework for the use of personal data necessary for the operation of algorithms, as mentioned above.
Article 47 of the French Data Protection Act provides an additional exception for individual administrative decisions, provided that the processing does not involve sensitive data. Such decisions must include an explicit statement that the individual decision has been taken on the basis of algorithmic processing, and the data controller must be able to explain in detail and in an intelligible form to the data subject how the processing operation was carried out.
On 19 February 2020, the European Commission published a White Paper on artificial intelligence, which sets out the guidelines of the EU strategy on artificial intelligence and robotics from technological, ethical, legal and socio-economic standpoints. The European Commission calls for a co-ordinated plan on artificial intelligence "made in Europe".
Internet of Things (IoT)
The CNIL has issued several practical guides on IoT and on securing connected devices, and in particular on connected TVs, toys, cars and kitchen robots, to raise awareness and to invite manufacturers to implement data protection by design and by default measures. In this context, the CNIL makes the following recommendations in particular:
See above sections on automated decision-making and artificial intelligence.
In addition, in 2017 the CNIL issued guidelines for connected vehicles and responsible data usage, which analyse three scenarios for the processing of data collected in the vehicle:
The CNIL promotes the first scenario, which involves processing data locally in the vehicle, without transmission to service providers, as it provides for the best data protection guarantees.
The CNIL also states that the licence plate number and the vehicle serial number are personal data, as is any data relating indirectly to a data subject (eg, data relating to the journeys made, the condition of the vehicle's parts, the dates of technical inspections, the number of kilometres driven, driving style, etc).
The EDPB also issued guidelines 1/2020 on Connected Vehicles, which are open to public consultation until 20 March 2020.
Although the CNIL's compliance package currently applies to connected vehicles, two of its obligations have a considerable impact on the development of autonomous vehicles: the implementation of the data protection by design and by default principles, and the security by default underlined by the GDPR.
Regarding "facial recognition", the CNIL considers that a balanced overview is necessary in order to avoid any confusion and any blanket conclusion on this technology as it can host possible uses, from unlocking smartphones to opening bank accounts and recognising a person being sought by police in a crowd, and raises different issues.
Biometric processing, including facial recognition, has been identified by the CNIL as a key challenge under high scrutiny and is regulated by several legal frameworks:
The CNIL issued a position paper on the use of facial recognition techniques, which supported a risk-based approach to determine which risks are not acceptable in a democratic society and which ones can be assumed with appropriate safeguards. It also requires that the proportionality of the means deployed and the special protection afforded to children are both guaranteed. When the legal basis of the processing is the consent of the data subjects, the CNIL specifies that the provision of an alternative means that does not involve biometric devices, without additional cost or particular constraints for the data subjects, is essential in order to guarantee freedom of consent.
The CNIL also specifies that the data controller must regularly ensure that the automatic erasure of biometric templates is effective.
Geolocation is only considered by the CNIL for specific applications (eg, geolocation of employees' vehicles or geolocation to control employees' working time). The CNIL excludes the use of geolocation of employees' vehicles for monitoring compliance with speed limits, the permanent control of employees, the collection of location data outside working hours (including to fight against theft or to check compliance with the conditions of use of the vehicle), or the calculation of employees' working time when another device already exists. In addition, geolocation is considered by the CNIL to be an intrusive means of processing and is likely to require a DPIA.
There is no privacy-specific legal provision in France dedicated to drones, but the data protection and security by default principles apply. French Law No 2016-1428 of 24 October 2016 only contains general provisions pertaining to security and training requirements regarding the use of certain drones.
Many collective and sectoral initiatives have been launched by industry stakeholders to mitigate the risks relating to the use of disruptive technologies and to provide guidance on reasonable uses. The CNIL and industry professionals are developing sectoral codes of conduct to introduce best practices and harmonisation for specific and commonly used processing operations within an industry.
See 2.5 Enforcement and Litigation.
Several privacy-related issues occur in the context of due diligence processes in corporate transactions. Such implications are stressed by the recent statement of the EDPB adopted on 19 February 2020 regarding the privacy implications of mergers. Corporate transactions entail more and more privacy issues, such as the combination of personal data, the disclosure of personal data in the transaction process and securing the transaction process. For instance, the setting up of a data room as part of the due diligence process must comply with the main principles of the GDPR (eg, security measures to be implemented for access to the data room, prohibition of shared accounts, minimisation of communicated personal data, etc).
The Network and Information System Security Directive (NIS) of 6 July 2016, transposed into French law by Law No 2018-133 of 26 February 2018 laying down various provisions adapting national law to European Union law in the field of security, and its Decree No 2018-384 of 23 May 2018 on the security of the networks and information systems of essential service operators and digital service providers, imposes obligations on digital service providers and operators of essential services. More details are provided in the France Law and Practice, and France Trends and Developments chapters of the Cybersecurity guide.
17, avenue Matignon
75378 Paris cedex 08
+33 1 5367 4747
+33 1 5367 4748
www.hoganlovells.com
Guidance from the CNIL on Cookies and Online Trackers
To be in line with the GDPR provisions regarding consent and to anticipate the future ePrivacy EU Regulation, the French Data Protection Authority (CNIL) issued new guidelines on cookies and online trackers in July 2019. These guidelines were followed by publication in January 2020 of practical recommendations on consent collection, which were open to public consultation on the CNIL's website until 25 February 2020. Once the CNIL releases the finalised version of these recommendations, companies will have six months to comply with them.
The CNIL's guidelines change its approach regarding consent: the soft opt-in option that was previously accepted can no longer be used. Users' consent to cookies and online trackers can no longer be deduced from their ongoing browsing or scrolling. Consent must be freely given, specific, informed, unambiguous, auditable and revocable; as a consequence, the CNIL imposes an affirmative action requirement for a valid consent on the collection of data through cookies and online trackers.
The CNIL also provides guidance on the allocation of responsibilities and the qualification of actors. The CNIL stresses that both website/mobile app publishers and third parties can be deemed data controllers or joint controllers. Publishers are the ones in charge of collecting users' consent in accordance with the GDPR.
In addition, the CNIL gives important information with respect to the duration and proof of users' consent to trackers. A period of validity of six months from the expression of the user's choice is now considered as appropriate, compared to 13 months previously (although a longer period seems to be accepted if justified).
Data controllers should also be able to keep individual proof of users' consent through various options (eg, placing the consent collection code in escrow with a third party, keeping screenshots of the website, etc) and to perform regular audits of their consent mechanisms.
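As a purely illustrative sketch (the CNIL does not prescribe any particular technical mechanism, and the names and values below are assumptions), a server-side consent record could store the purposes accepted, a timestamp and the six-month validity period mentioned above, so that it can be produced as proof and audited:

```python
import json
from dataclasses import dataclass, asdict, field
from datetime import datetime, timedelta, timezone
from typing import Dict, Optional

# Assumed validity period, based on the six months considered appropriate by the CNIL.
CONSENT_VALIDITY = timedelta(days=183)

@dataclass
class ConsentReceipt:
    user_id: str                 # pseudonymous identifier of the visitor (hypothetical)
    purposes: Dict[str, bool]    # eg {"ad_targeting": True, "audience_measurement": False}
    collected_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def is_still_valid(self, now: Optional[datetime] = None) -> bool:
        """Consent must be collected again once the validity period has elapsed."""
        now = now or datetime.now(timezone.utc)
        return now - self.collected_at < CONSENT_VALIDITY

    def to_audit_record(self) -> str:
        """Serialise the receipt so it can be archived (or escrowed) as proof of consent."""
        payload = asdict(self)
        payload["collected_at"] = self.collected_at.isoformat()
        return json.dumps(payload, sort_keys=True)

receipt = ConsentReceipt(user_id="visitor-42",
                         purposes={"ad_targeting": True, "audience_measurement": False})
print(receipt.is_still_valid(), receipt.to_audit_record())
```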
Lastly, some exemptions are provided and some cookies and trackers do not require consent, such as those saving choices expressed by the user, enabling authentication of a service, keeping track of the content of a shopping cart or enabling audience measurement under certain conditions, etc.
The modification of the rules regarding cookies and online trackers is a major priority of the CNIL and will have an impact on digital actors that rely on cookies and trackers for their ad-based revenue model. In addition to its guidelines and practical recommendations, the CNIL has already initiated procedures against some major digital actors with respect to the use of and consent for cookies and online trackers (ongoing procedures). The CNIL's guidelines have also been challenged by media and online advertising professional associations, which question their legality, arguing that the CNIL should have waited for the adoption of the ePrivacy Regulation intended to deal with online trackers. The draft of the ePrivacy Regulation published in February 2020 appears to follow a different path and may accept that consent is not the only way to process data from cookies. To date, the CNIL's recommendations have not yet been finalised, and the situation may still evolve in France.
Facial recognition will certainly be one of the most discussed topics in France in 2020, alongside discussions at the European level on this topic.
The European Commission published a White Paper "On artificial intelligence – A European approach to excellence and trust" in February 2020, which includes proposed provisions about facial recognition and contemplates the possibility of a moratorium on facial recognition technologies. In its paper, the European Commission provides that public facial recognition technologies should be considered as high-risk, and should therefore be subject to the mandatory legal requirements set out by the future European regulatory framework.
In France, discussions and governmental and local initiatives to develop applications of facial recognition are already ongoing.
The French Secretary of State in charge of Digital recently advocated for more experimentation with facial recognition technologies. Several local officials have also implemented experiments with facial recognition applications in the cities of Nice and Marseille, and in high schools. The use of facial recognition in high schools was ruled disproportionate by the CNIL, and these experiments had to stop.
The French government is also currently testing a new smartphone application, ALICEM, which relies on facial recognition and enables French citizens to prove their identity over the internet in order to use some online administrative services. Individuals must transmit one of their identity documents to the application, which reads its electronic chip in order to authenticate the person's identity through facial recognition. This application is currently the subject of litigation initiated by consumer protection associations on the grounds that it does not provide its users with alternative solutions to facial recognition, meaning that the users' consent is not freely given, as they have no choice other than to use facial recognition; this does not satisfy the GDPR requirement that consent be freely given.
The CNIL wishes to initiate a public debate on the use of facial recognition, and to define the rules that apply to such experiments. In particular, in November 2019 the CNIL stated that the fundamental principles of personal data protection must be respected where facial recognition is concerned. It emphasised that experiments with facial recognition applications should not violate certain principles (eg, lawfulness and proportionality of the processing, prohibition of certain uses), should be subject to prior consultation of the CNIL, should be centred on the consent of the data subjects, and should remain experimental. Pursuant to Article 9.4 of the GDPR and Article 8.2 (c) of the French Data Protection Act, the CNIL adopted a model regulation imposing compulsory obligations on any employer wishing to implement a biometric system, including facial recognition, in the workplace. Under the model regulation, there is no need to obtain consent from employees, nor to provide an alternative to biometric technology including facial recognition, with the exception of a back-up device for data subjects who do not use the biometric system because they cannot meet the constraints of the biometric device, or in the event of the unavailability of the biometric device.
Retention of and Access to Users' Communications Metadata for Security and Crime-Fighting Purposes
At both French and European levels, 2020 will also be marked by discussions about the issue of retention and access to users' communications metadata for security and crime-fighting purposes.
The ePrivacy Directive 2002/58 gave EU Member States the ability to enjoin electronic communications services providers to retain communications metadata of their users for these purposes. France transposed the ePrivacy Directive by requiring electronic communications services providers to retain their users' metadata for a period of one year.
Since then, these data retention and data access regimes benefiting law enforcement and intelligence agencies, among other public bodies, have been challenged before national and European courts. In two landmark cases, Digital Rights Ireland (C-293/12 and C-594/12) and Tele2 Sverige (Joined Cases C-203/15 and C-698/15), the Court of Justice of the European Union (CJEU) held that the general and indiscriminate retention of metadata for the sole purpose of fighting crime was contrary to EU law, as was the retention of metadata without the necessary safeguards regulating its access and use.
Further national legislative frameworks, including the French one, will be tested before the CJEU in 2020 to assess their legality. Following a complaint by consumer protection associations, the French Supreme Administrative Court (Conseil d'Etat) referred questions to the CJEU on the legality of the French data retention framework (requests for a preliminary ruling lodged in August 2018, C-511/18 and C-512/18). The Advocate General recently issued his opinion, in which he stated that the general and indiscriminate obligations imposed by French law on electronic communications services providers and hosting providers to retain users' communications metadata were contrary to EU law. The CJEU judgment is expected in the coming months, and may have an impact on the future ePrivacy Regulation, which is still under discussion. The draft ePrivacy Regulation also provides for the possibility to impose data retention obligations on electronic communications services providers, a category that now includes OTT service companies (eg, Skype, WhatsApp or Viber) that are only present on the internet and use the internet network to offer communications services (eg, voice, messaging, video, text, images, groups, etc), partly replacing the services of telecom operators.
At the French level, access to metadata by the authorities for the purpose of fighting copyright infringements on peer-to-peer networks is currently being challenged before the French Constitutional Council. The Constitutional Council previously held that similar access by the French Competition Authority and the French Financial Markets Authority, among other public bodies, was unconstitutional due to the absence of appropriate safeguards. As a consequence, in November 2019 the French government released a Decree regarding access to metadata by agents of the competition authority, which provides the required safeguards (Decree No 2019-1247 of 28 November 2019).
Open Data of Court Decisions and Data Protection
As the French government is committed to developing the public availability of court decisions, the implementation of this public policy is raising data protection issues.
In 2016, French Act No 2016-1321 of 7 October 2016 for a Digital Republic provided that decisions from the administrative and judiciary courts shall be made available to the public through an open registry ("Open Data"), subject to respect for the privacy of the parties concerned. So far, only some decisions issued by the French Supreme Administrative and Supreme Judiciary Courts (Conseil d'Etat and Cour de cassation) are freely available to the public, representing just 1% of all French court decisions.
Provisions regarding the Open Data of court decisions were also introduced by the French Act for the Programming and Reform of Justice of March 2019 (Act No 2019-222 of 23 March 2019). It provides that, prior to court decisions being made available to the public, the first names and surnames of natural persons mentioned in the decisions as parties or third parties shall be removed. Where the release of a court decision could prejudice the security or privacy of the parties, the names of third parties, judges, members of the court and all the aforementioned persons shall also be anonymised in the decision. Finally, the personal data of judges and members of the court cannot be reused for the purpose of evaluating, analysing, comparing or predicting their professional practices. These safeguards do not apply to lawyers.
In addition, in January 2020 the Ministry of Justice released a draft version of a Decree implementing the Open Data of court decisions. It identifies in particular the authorities in charge of deciding whether or not court decisions must be anonymised, and provides for a period of six months between the issuance of a court decision and its being made publicly available. However, the draft Decree is silent on the fate of court decisions rendered before its adoption and on whether or not those decisions should be made available to the public. In addition, Ministerial Orders will need to be published regarding the conditions of implementation of the Decree's provisions.
This Decree and the subsequent Ministerial Orders will be heavily debated as they will require a balance of data protection rights with the public right to information.
Experimentation of Data Mining by French Tax and Customs Authorities
The French Finance Act for 2020 (Act No 2019-1479 of 28 December 2019) created a data mining framework for the French tax and customs authorities for the purpose of fighting tax and customs fraud. This processing is an experiment that will run for the next three years.
French tax and customs authorities can now collect and use content that is freely available, not password protected, and manifestly rendered public by their users on the websites of certain categories of online platform operators (as defined in Article L.111-7 2° of the French Consumer Code) that collect, moderate and diffuse online adverts, such as platforms enabling users to buy and sell second-hand products (eg, Le bon coin, which is a well-known C2C online platform in France for second-hand products).
Where the personal data collected is of such a nature as to assist in the finding of an infringement, it may be retained for a maximum period of one year, whereas data not related to tax and customs fraud must be deleted five days after its collection. The data subjects' right to object does not apply to this processing by the French tax and customs authorities. The retention rule is illustrated in the sketch below.
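A minimal sketch, using hypothetical field names and purely illustrative of the retention rule described above (up to one year for data that may assist in finding an infringement, deletion five days after collection otherwise):

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Retention periods described by the French Finance Act for 2020 (illustrative values).
RELEVANT_DATA_RETENTION = timedelta(days=365)  # data that may assist in finding an infringement
OTHER_DATA_RETENTION = timedelta(days=5)       # data unrelated to tax or customs fraud

def must_be_deleted(collected_at: datetime,
                    may_assist_finding_infringement: bool,
                    now: Optional[datetime] = None) -> bool:
    """Return True once an item has exceeded the retention period that applies to it."""
    now = now or datetime.now(timezone.utc)
    limit = RELEVANT_DATA_RETENTION if may_assist_finding_infringement else OTHER_DATA_RETENTION
    return now - collected_at > limit

# Example: an item collected ten days ago that is unrelated to fraud must be deleted.
print(must_be_deleted(datetime.now(timezone.utc) - timedelta(days=10), False))  # True
```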
This experimental framework was heavily criticised by the CNIL, which stated in particular that this processing constitutes a change in the approach to data processing for the purpose of fighting tax and customs fraud. Indeed, such data processing is carried out in a general and global way in order to detect fraud, instead of being targeted at specific individuals where doubts and suspicions of fraud pre-exist, which had so far been the French authorities' usual approach to fighting fraud. The CNIL underlined that such data processing could significantly alter the behaviour of web users and restrain their liberties, in particular their freedom of expression on the internet.
The French Constitutional Council, however, approved most of the experimentation framework provisions, considering that the safeguards deployed were appropriate to ensure a balance between the right to privacy and the constitutional objective of fighting tax and customs fraud.
Smart Cities and Data Sharing
In the context of the 5G rollout in 2020, the development of the Internet of Things and ecological incentives, smart city applications and projects are booming in France. These applications and projects raise many regulatory issues regarding data protection, as the sharing of data with respect to energy consumption, public transport and vehicles is expected to increase with smart grids, open data and connected vehicles.
As an illustration, the CNIL recently addressed formal notices to two major French energy companies with regard to their processing of personal data by intelligent and communicating (smart) meters (CNIL's deliberations No 2019-035 of 10 February 2020 and No 2019-036 of 31 December 2019). In these formal notices, the CNIL enjoined the two companies to implement a compliant data retention policy with respect to the daily and half-hourly energy consumption data collected from their customers, and enjoined one of the companies to collect specific consent from its customers for the processing of their half-hourly energy consumption data.
The recent Law on Mobility (Law No 2019-1428 of 24 December 2019), adopted further to Commission Delegated Regulation 2017/1926 with regard to the provision of EU-wide multimodal travel information services, provides that transport authorities and transport operators shall provide the static and dynamic travel and traffic data and historic traffic data defined in Article 2 of that Regulation to certain designated authorities, which will make them available to transport companies and final users. The Law on Mobility also paves the way for the sharing of connected vehicle data, particularly with road managers, law enforcement, fire and rescue services, authorities organising mobility services and insurance companies, for purposes including the detection and management of incidents and accidents, knowledge of the road, its condition and equipment, and the performance of insurance contracts.
The articulation between these data sharing obligations and data protection regulations will be a key regulatory subject, and guidance from regulators is expected, particularly with the future adoption of the final European Data Protection Board guidelines on connected vehicles (currently open for public consultation).
Processing of Personal Data and Political Communications
The political landscape in France will be marked by municipal elections in 2020, which will see the application of certain laws specifically designed to protect the integrity of political elections (eg, the so-called "Anti Fake-News Act"). Pursuant to the Anti Fake-News Act, hosting service providers may be ordered in expedited hearings by the court to implement measures to stop the dissemination of "allegations or statements that are inaccurate or misleading in relation to a fact which may affect the upcoming vote's sincerity" for a period starting three months before the first day of the month during which general elections are to be held and until the date on which the results are known.
These elections will also be scrutinised by the CNIL with regard to data protection issues. The CNIL has publicly affirmed that it will focus its attention on the processing of personal data in the context of France's 2020 municipal elections. In late 2019, the CNIL published an action plan with respect to political communications, which includes special guidance on political communications published on the CNIL's website, advice to candidates and political parties, an alert platform to report non-compliant practices, and increased supervisory actions by the CNIL.
This action plan follows the revelation in 2018 of the Cambridge Analytica scandal, in which a company managed to access the personal data of a major social network's users for political communications in the context of the 2016 US Presidential Election and the 2016 Brexit Referendum, affecting around 70 million individuals in the United States and nearly 1 million individuals in the UK.
The CNIL will also closely watch the implementation of appropriate measures to secure the electronic voting machines that some cities provide to their constituents, especially after the recent events affecting the integrity of the vote in the Iowa Democratic caucuses.
17, avenue Matignon
75378 Paris cedex 08
+33 1 5367 4747
+33 1 5367 4748
www.hoganlovells.com