Contributed By Hogan Lovells (Paris) LLP
The fundamental national legislation applicable to personal data in France is as follows:
Some other complementary national legislation applies to personal data, as follows:
The French Data Protection Authority is the Commission Nationale de l’Informatique et des Libertés (CNIL). It is, under French law, an independent administrative authority whose main missions are to inform, educate, advise, anticipate innovation, investigate and impose sanctions.
The CNIL's investigations may be initiated based on the news, on complaints received by the CNIL or on the CNIL's annual programme of control (published each year), and may be part of, or related to, previous investigations performed by the CNIL (follow-up further to a formal notice or to previous sanctions); they may also be part of the co-operation programme with other European Data Protection Authorities.
The CNIL's inspections are regulated by the CNIL's internal rules. Online and on-site inspections can be conducted. An official and publicly available list is published each year, identifying the agents of the CNIL authorised to carry out audits and inspections.
During inspections, particularly on-site inspections, CNIL agents may request access to all documents necessary to fulfil their mission (eg, records of processing activities), regardless of the medium, and make copies of them. They can also access software programs and request their transmission by any appropriate processing. In addition, they can request some documents or information to be communicated after the inspection.
At the end of the inspection, the CNIL issues an inspection report, and may request additional documents.
Consequences of an Inspection
If the information and documents provided by the data controller during and after the inspection do not call for any particular observations, the inspection procedure is closed.
However, if the inspection leads to the identification of a lack of compliance with the applicable data protection rules, the CNIL can decide to:
Appeal of a Sanction
Once a sanction has been issued by the CNIL, the company concerned can appeal it to the French Administrative Supreme Court (Conseil d'Etat) within two months of the CNIL's sanction.
As a member of the European Union, France must comply with the European legal and regulatory framework for personal data protection, which includes the GDPR, the Police-Justice Directive and the guidelines of the EDPB. The e-Privacy Regulation is still at the draft stage, but once it is adopted French companies will have to comply with it as well.
The major privacy/data protection non-governmental organisations in this field in France are as follows:
In terms of the protection of personal data, the French legal and regulatory framework is one of the most developed and strictest in the world. Before the GDPR, the French Data Protection Act had been in place since 1978 and the CNIL was already one of the most active and influential authorities in Europe. Over the years, it has created a veritable toolbox for both professionals and individuals, with deliberations, recommendations, guidelines and practical guides. It was also the first European Authority to sanction a GAFA, through its EUR50 million sanction of Google on 21 January 2019 (currently under appeal).
Key developments in the past 12 months in France include the publication of the following documents (in chronological order):
At the European level, the EDPB published the following guidelines (in chronological order):
Significant topics over the next 12 months in France are as follows:
Appointment of Privacy or Data Protection Officers
Article 37 of the GDPR requires the appointment of a Data Protection Officer (DPO) for public organisations and for organisations whose core activities consist of either operations that require regular and systematic monitoring of individuals on a large scale, or large-scale processing of sensitive data or data relating to criminal convictions and offences.
In case of doubt, the appointment of a DPO is strongly recommended by data protection authorities.
The French Data Protection Act does not provide additional requirements relating to the appointment of a DPO.
Criteria to Authorise Collection, Use or Other Processing
Article 6 of the GDPR defines six legal bases upon which to process personal data:
Application of “Privacy by Design” or “by Default”
The concepts of “privacy by design” and “privacy by default” are included in Article 25 of the GDPR, and the EDPB has published Guidelines 4/2019 on this subject.
The “privacy by design” concept consists of implementing – from the very early stages of the design of personal data processing – organisational and technical measures that give effect to the data protection principles and the safeguards necessary to meet the GDPR requirements and protect the rights of data subjects.
The “privacy by default” concept requires the implementation of organisational and technical measures for ensuring that, by default and therefore at any stage of the processing, data protection principles are respected and the necessary safeguards applied.
Privacy Impact Analyses
The data protection impact assessment (DPIA) consists of analysing a processing operation and its risks in order to assess and mitigate them. Article 35 of the GDPR requires a DPIA for the following:
In addition, the Article 29 Working Party issued guidelines in which it lists nine criteria to be assessed. If one criterion is satisfied, a DPIA is recommended; if two criteria are satisfied, a DPIA is required. The criteria are as follows:
The CNIL also published a list of processing operations for which a DPIA is required, as well as a list for which it is not required.
Adoption of Internal or External Privacy Policies
The principle of accountability (Article 5-2 of the GDPR) requires organisations to document their compliance. In particular, organisations are required to implement documentation such as information notices, records of processing activities, global privacy policies, data retention policies, a handling procedure for complaints and data subject requests, data breach procedures, security policies, "privacy by design" and "privacy by default" procedures, etc.
In France, the French Toubon Law No 94-665 of 4 August 1994, relating to the use of the French language, and French employment law require policies and procedures to be translated into French in order to be enforceable on French employees. Privacy policies should also be translated into French.
Data Subject Access Rights
Data subjects’ rights under the GDPR include the right to be informed, the right of access, the right to rectification, the right to erasure (“to be forgotten”), the right to the restriction of processing, the right to object to processing, the right to data portability, and the right to lodge a complaint with a supervisory authority.
In France, Law No 2016-1321 of 7 October 2016 for a Digital Republic introduces an additional right for French data subjects, which is the right to define guidelines about the processing of their personal data after their death.
Use of Data Pursuant to Anonymisation, De-identification or Pseudonymisation
De-identification is a concept not used under the GDPR, which only refers to anonymisation and pseudonymisation.
Anonymisation is a processing operation that consists of using a set of techniques in such a way as to make it impossible, in practice, to identify the person by any means whatsoever and in an irreversible manner. The GDPR does not apply to anonymous data because such data is no longer of a personal nature. Before the GDPR entered into application, the Article 29 Working Party published a detailed opinion on anonymisation techniques (Opinion 05/2014 on Anonymisation Techniques adopted on 10 April 2014), which still provides useful guidance on how to anonymise personal data properly.
Pseudonymisation is the processing of personal data in such a way that the data can no longer be attributed to a specific natural person without the use of additional information. In practice, pseudonymisation consists of replacing directly identifying data in a dataset with indirectly identifying data (an alias, a number in a classification, etc). Pseudonymisation thus makes it possible to process data on individuals without identifying them directly. In practice, however, it is often possible to trace the identity of individuals using third-party data. For this reason, pseudonymised data remains personal data.
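By way of illustration only, the substitution technique described above can be sketched in a few lines of Python (the field names and helper are hypothetical, not drawn from any CNIL or EDPB guidance). The key point the sketch makes concrete is that the correspondence table must be kept separately and securely, and that its mere existence is why the output remains personal data:

```python
import secrets

def pseudonymise(records, direct_identifiers=("name", "email")):
    """Replace direct identifiers with random aliases.

    Returns the pseudonymised records and the correspondence table.
    Because the table permits re-identification, the output is still
    personal data under the GDPR -- unlike anonymised data.
    """
    lookup = {}  # alias -> original identifiers; store separately and securely
    out = []
    for rec in records:
        alias = secrets.token_hex(8)  # indirectly identifying substitute
        lookup[alias] = {k: rec[k] for k in direct_identifiers if k in rec}
        pseud = {k: v for k, v in rec.items() if k not in direct_identifiers}
        pseud["alias"] = alias
        out.append(pseud)
    return out, lookup

data = [{"name": "Alice", "email": "a@example.com", "purchase": 42}]
pseud, table = pseudonymise(data)
# pseud contains no direct identifiers; table alone permits re-identification
```

Deleting the correspondence table (and any third-party data allowing re-identification) is what would move such a dataset towards anonymisation.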
Restrictions on or Allowances for Profiling, Automated Decision Making, Online Monitoring or Tracking, Big Data Analysis, Artificial Intelligence, Algorithms
Article 22 of the GDPR provides a framework for automated decision-making processing that produces legal or significant effects. It applies to processing operations based exclusively on decisions "producing legal effects" (a decision has legal effect when it affects human rights and freedoms) or "significantly affecting individuals" (a decision can have a significant impact, similar to a legal effect, when it has the consequence of influencing a person's environment, behaviour or choices, or leads to a form of discrimination).
In principle, individuals have the right not to be subject to a decision based solely on automatic processing and producing legal effects concerning them or significantly affecting them in a similar manner. However, in specific cases, a data subject may be the subject of a fully automated decision, even if it has a significant legal effect or impact on him or her – for instance, if the decision is based on the explicit consent of the data subject, is necessary for the conclusion or performance of a contract, etc.
The Concept of “Injury” or “Harm”
"Injury" and "harm" caused by the unlawful processing of personal data are not defined by the French Data Protection Act, its implementing Decree or the guidelines of the CNIL. No sanction decision in France has yet provided additional guidance on the notion of harm in terms of personal data protection. Any data subject may therefore seek compensation for any damage he/she has suffered in connection with privacy and data protection by invoking Article 1240 of the French Civil Code, which sets out the general principle of civil liability under French law. It is also possible to invoke Article 226-1 of the French Criminal Code, which punishes invasions of privacy, as well as Articles 226-16 to 226-22-1 and Articles R.625-10 to R.625-13 of the French Criminal Code, which impose sanctions for failure to comply with data protection rules.
Under Article 9 of the GDPR, sensitive data includes personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation.
In principle, the collection and processing of such data is prohibited, except under specific circumstances (eg, when the data subjects' consent is collected, when the information is manifestly made public by the data subject, when the processing is necessary for the protection of the vital interests of the data subject, when the processing is justified by the public interest and authorised by the CNIL, etc).
Financial data is not deemed sensitive. Nevertheless, in the CNIL's deliberation No 2018-303 of 6 September 2018, regarding the processing of payment card data for the sale of goods or the provision of services at a distance, financial data, including payment card data, is qualified as "highly personal data", given the serious impact on data subjects that its violation could have.
Health data is sensitive data. Personal data concerning health is data relating to the past, present or future physical or mental health of a natural person (including the provision of healthcare services), which reveals information about the state of health of that person. Hosting providers of such health data in France must apply for a specific certification to host such data, regardless of where the data is located.
Communications data is governed in French law by Article L.34-5 of the French Postal and Electronic Communications Code (PECC), transposing the e-Privacy Directive. See 2.3 Online Marketing.
Voice telephony and text messaging are not deemed sensitive.
The content of electronic communications is not deemed sensitive, but is subject to specific rules and retention periods under French law. See 2.3 Online Marketing.
Children’s or students' data is not deemed sensitive, although minors are considered vulnerable persons according to the CNIL. Where the legal basis for a processing is consent and the data subject is a minor under the age of 15, the data controller must obtain the consent of the minor concerned and the holder(s) of parental authority over that minor, jointly. In addition, the processing of personal data of vulnerable persons (children) is one criterion used by the CNIL to impose the conducting of DPIAs.
Employment data is not deemed sensitive, but the CNIL has issued practical recommendations on the processing of employment data. In addition, the processing of personal data of vulnerable persons, such as employees, is one criterion used by the CNIL to impose the conducting of DPIAs.
Social Security number (NIR): although not listed as sensitive data in the French Data Protection Act, the Social Security Number is considered by the CNIL to be a very specific piece of personal data. Its use in France is strictly limited to specific purposes, as detailed in Decree No 2019-341 of 19 April 2019 relating to the use of such data, which provides a detailed list of authorised purposes limited to the sectors of welfare protection, healthcare, work and employment, financial, tax and customs, etc.
Internet, Streaming and Video Issues
Cookies and beacons
The data controller must now obtain the prior consent of the user (given freely, and in a specific, informed and unambiguous manner by means of a declaration or a clear affirmative act) before storing information on a user's equipment or accessing information already stored. This applies in particular to cookies related to targeted advertising operations, certain audience measurement cookies, performance cookies, and social network cookies generated in particular by share buttons where they collect personal data without the consent of the data subjects. Certain audience measurement cookies are exempted from the consent requirement if cumulative criteria defined by the CNIL are met (eg, clear and complete information is provided, an easy objection mechanism is accessible and usable on all browsers and all types of terminals, there is no cross-checking with other processing operations, the retention period is limited, and there is no possibility of following the internet user's navigation on other sites).
In any case, all cookies, including strictly necessary cookies, require the provision of information to the user prior to their placement.
Large-scale processing of location data is one of the processing operations for which an impact assessment must necessarily be carried out, according to the list drawn up by the CNIL.
Do Not Track, and tracking technology
Do Not Track (DNT) is a function integrated into web browsers that allows internet users to indicate that they do not wish to be tracked for advertising purposes. To date, no regulation requires websites to take this objection into account.
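For illustration, the signal is carried in the `DNT` HTTP request header, which takes the value "1" when the user objects to tracking. A minimal sketch of how a site could voluntarily honour it (the helper function is hypothetical; as stated above, nothing in French law currently compels this check):

```python
def tracking_allowed(headers):
    """Return False when the browser sends the Do Not Track signal.

    The DNT request header carries "1" when the user opts out of
    tracking. Honouring it is a voluntary courtesy under current
    French law, not a legal requirement.
    """
    # HTTP header names are case-insensitive; normalise keys first.
    normalised = {k.lower(): v for k, v in headers.items()}
    return normalised.get("dnt") != "1"

print(tracking_allowed({"DNT": "1"}))        # False: user opted out
print(tracking_allowed({"User-Agent": "x"}))  # True: no signal sent
```

Note that honouring DNT does not replace the consent requirements for cookies described above; it only adds a further, voluntary restriction.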
Social media, search engines, large online platforms
French case law considers social networks as hosting providers under Law No 2004-575 of 21 June 2004 on confidence in the digital economy (transposing E-commerce Directive 2000/31/EC), meaning that, regarding user-generated content, liability can only be sought when social networks do not act expeditiously to remove or disable access to the litigious information, upon obtaining such knowledge or awareness.
Dereferencing (the “right to be forgotten”) allows a person to have one or more results removed from those provided by a search engine following a query based on that person's identity (surname and first name).
On 6 December 2019, the Conseil d'Etat issued 13 decisions in light of the judgment of the Court of Justice of the European Union dated 24 September 2019 (case C-136/17), in which it applied a proportionality test, weighing the data subject's right to privacy against the public's right to information in the light of three criteria:
The nature of data must also be taken into account in this weighing, and affects the scope and effectiveness of a delisting request.
Sensitive data and data relating to judicial proceedings or criminal convictions and offences benefit from a higher level of protection: a delisting request concerning such data may only be refused where access to the data is strictly necessary to inform the public.
On 2 December 2019, the EDPB adopted guidelines on the criteria of the right to be forgotten in search engine cases, in the context of delisting requests, which set out six grounds upon which data subjects can rely to obtain the delisting of their personal data by search engines, as well as the exceptions that search engines may invoke against them. Furthermore, these guidelines clarify that the right to be forgotten concerns not only the right for data subjects to obtain from search engines the removal of links to websites containing their personal data, but also the right to object to the processing of such data under Article 21 of the GDPR.
Addressing hate speech, disinformation, terrorist propaganda, abusive material, political manipulation, bullying, etc
Pursuant to the French Law on the manipulation of information ("Fake News Law") of 22 December 2018, operators may stop the dissemination of "allegations or statements that are inaccurate or misleading in relation to a fact which may affect the upcoming vote's sincerity" for a defined period before general elections and until the publication of the results. The Fake News Law specifically targets operators of online platforms whose activity exceeds a specific threshold of connections from French territory.
In addition, a bill is under discussion to prevent the dissemination of hate speech (“Hate Speech Bill”). It increases the obligations of operators to remove certain illicit content within a very short timeframe (eg, content pertaining to terrorism and child pornography). It is likely to be adopted in the course of Spring 2020.
Article L.34-5 of the PECC defines direct marketing as "any message intended to promote, directly or indirectly, goods, services or the image of a person selling goods or providing services."
This definition is understood very broadly as it covers both the direct promotion (advertising campaigns by email, prospectus or brochure sent by email) and indirect marketing (any material aiming to promote the seller’s brand image) of products, services or the image of a company.
As a matter of principle, any B2C online direct marketing operation requires the data subject's explicit consent to receive direct marketing (opt-in). Opt-in must be obtained at the time of the collection of contact details.
Two exceptions exist, as follows:
Any electronic marketing communication must also provide the individual with a free, simple, direct and easily accessible means to opt out of marketing communications (eg, through a hyperlink at the bottom of the communication).
B2B electronic marketing communications are subject to a lighter regime. To send a B2B electronic marketing communication, the company must, at the time of collecting the contact details, provide (i) information to the recipient that its contact details will be used for marketing purposes, and (ii) a simple, free and easy way to opt out of receiving marketing communications.
For marketing communications carried out by post or by phone, an opt-out regime applies. When collecting the individual’s contact details, the company must inform the individual that those details may be used for direct marketing purposes and provide the individual with the possibility to opt out of direct marketing communications at any time (eg, through a pre-unchecked box).
Some provisions of the French Labour Code complement the French data protection legal framework, as follows:
Monitoring Workplace Communications
The CNIL has issued guidelines about HR processing, which underline that employers can control and limit the use of the internet and messaging at work for personal purposes (i) to ensure the security of networks that may be affected by attacks and (ii) to limit the risks of abuse of personal use of the internet at work or a professional mail box.
However, the employer must define the rules for personal/private use of professional devices and the internet, and must inform the employees of such rules. In addition, it cannot use key loggers to remotely record all the actions performed on a computer, except in exceptional circumstances linked to a strong security imperative.
When recording employees' calls, for instance for training purposes or to improve the services provided, employers cannot couple a call with a screen-capture system of the employee's computer workstation, and may not set up a permanent or systematic listening or recording device, except as provided for by law (eg, for some specific cases, such as emergency services). Employers must also provide, for personal calls, telephone lines that are not connected to the recording system, or a technical device enabling employees to switch off the recording. The same applies to calls made by staff representatives in the exercise of their duties.
Labour Organisations or Works Councils
Works Councils have an active role in France, and some personal data processing carried out by employers may require the information and consultation of the Works Council before their implementation (eg, whistle-blowing schemes, geolocation of employees' vehicles, employees' performance monitoring system, etc).
Whistle-blower Hotlines and Anonymous Reporting
On 10 December 2019, the CNIL published new guidelines relating to whistle-blowing schemes, together with FAQ, which take into account the new French legal framework relating to anti-bribery and the duty of vigilance. The guidelines apply to schemes required by law pursuant to the “Sapin 2” Law (Articles 8 and/or 17) and the French law relating to the duty of vigilance, as well as to schemes implemented by companies on a voluntary basis to collect alerts relating to non-compliance with the company’s code of ethics or code of conduct.
Regarding anonymous reporting, the CNIL maintains its former position and indicates that, even if whistle-blowers can choose to remain anonymous, it is strongly recommended to encourage them to identify themselves. When exceptionally dealing with anonymous reports, companies must deploy specific additional measures to assess the severity of the reported violation and determine the special care to be applied.
Discovery proceedings are investigations and investigative phases prior to civil and commercial litigation, and are essential to any legal action in the United States. Discovery requests made to French companies may include requests to produce and transfer thousands of e-mails from employees.
On 23 July 2009, the CNIL adopted a recommendation underlining that the required transfers of information can be subject to the French Blocking Statute and must necessarily be carried out in accordance with the Hague Convention, the only international convention binding France and the United States with regard to legal proceedings. In addition, documents must be exhaustively listed and have a direct and precise link with the object of the conflict in compliance with data protection applicable laws. The CNIL has not issued any updated opinion since then.
A sanction procedure can be initiated against an organisation if a breach of the GDPR or the French Data Protection Act is found following the filing of a complaint, a data breach or an inspection carried out by the CNIL. In most cases, the CNIL will issue a formal warning and offer the company a chance to remediate the infringements identified, although it is not compelled to do so.
A Rapporteur is designated to draft a report, which is communicated to the data controller and the other supervisory authorities concerned if the CNIL acts as lead supervisory authority. The data controller can then submit written observations before the hearing.
After the hearing, if the CNIL acts as lead supervisory authority, it communicates the draft decision to the other supervisory authorities concerned, which must provide the CNIL with their comments within a period of four weeks. The data controller is then notified of the CNIL's decision, which can be public or not.
The data controller can appeal the CNIL's decision before the French Administrative Supreme Court (“Conseil d'Etat”) within two months.
Potential Enforcement Penalties
Non-compliance with the GDPR could result in administrative fines and criminal sanctions, as follows:
Leading Enforcement Cases
Recent major fines issued by the CNIL range from EUR20,000 to EUR50 million. The latest decisions of the CNIL have sanctioned infringements such as the lack of valid consent, of appropriate retention periods and of appropriate security and confidentiality measures, as well as the failure to inform data subjects, to respect their rights, and to comply with the supervisory authority's orders.
Article 77 of the GDPR recognises the right of data subjects to lodge a complaint with a national supervisory authority. Complaints in France can be addressed to the CNIL.
In addition, the French Data Protection Act provides two types of class actions:
The leading cases in the prior 12 months are the following:
Law enforcement’s access to data has been deeply modified and reinforced by Law No 2016-731 dated 3 June 2016 for “strengthening the fight against organised crime, terrorism and their financing, and improving the efficiency and safeguards of criminal proceedings”.
Under some circumstances, the Investigating Judge may prescribe the interception, recording and transcription of correspondence sent by electronic communications (Articles 100 to 100-8 of the French Criminal Procedure Code). Such actions are subject to several procedural safeguards and limitations, particularly in terms of duration.
Moreover, the Investigating Judge (or police officers authorised by the Liberties and Detention Judge) can – remotely and without informing the data subject – access correspondence stored via electronic communications accessible by means of a computer identifier, for a maximum period of one month, renewed under strict conditions (Articles 706-95-1 to 706-95-3 of the French Criminal Procedure Code).
The Investigating Judge may also perform investigations at the premises concerned, to make any useful findings or conduct searches (Articles 92 to 99-5 of the French Criminal Procedure Code). All objects, documents or computer data placed under judicial control shall be immediately inventoried and placed under seal.
Law No 2015-912 of 24 July 2015 on intelligence and its implementing Decree No 2016-67 of 29 January 2016 on intelligence gathering techniques define the legal framework that authorises the intelligence services to use information access techniques, particularly means of access to connection data and computer data capture, while guaranteeing the right to privacy.
For certain crimes and offences, the Investigating Judge may authorise the interception, recording and transcription of electronic correspondence. By way of exception and for the most serious crimes, he or she may also authorise, with or without the individual’s consent, access to information, remote access to correspondence stored via electronic communications services, the seizure of a device, copying of the device, or access to data stored on another computer device and previously available on the initially seized device. Procedural safeguards regulating such seizure are provided by Articles 76 and 97 of the French Criminal Procedure Code.
An organisation must not transfer personal data to a foreign government if such a transfer does not comply with French and European data protection laws. Indeed, the Cloud Act explicitly provides that the service provider from whom the data is requested always has the possibility to object on the grounds that the request would lead to an infringement of the legislation of a foreign country and expose the provider to sanctions (a conflict-of-laws situation). France does not yet participate in a Cloud Act agreement with the USA. The Mutual Legal Assistance Treaty between France and the United States is not effective in practice, and discussions have been entered into between the USA and the EU to sign an EU-wide Mutual Legal Assistance Treaty. The provider may also refuse to disclose the requested data on the basis of the common law principle of comity, recognised by the US courts, according to which the important interests of other countries must be taken into account when applying US law and, where appropriate, US legislation must not be applied, or must be applied in a nuanced manner.
Several laws in France are likely to hinder requisitions by the US authorities on data stored in Europe, particularly Articles 44 et seq of the GDPR (which lay down the conditions under which personal data may be transferred to third countries or international organisations), the French blocking statute (Law No 68-678 of 26 July 1968) and French Law No 2018-670 of 30 July 2018 on the protection of business secrecy.
“La Quadrature du net”, the French Data Network and “La Fédération des fournisseurs d'accès à internet associatifs” have challenged the constitutionality of certain provisions contained in Law No 2015-912 of 24 July 2015 on intelligence, before the French Constitutional Council.
The French Constitutional Council abrogated two provisions relating to the following:
Pursuant to Articles 44 to 50 of the GDPR, transfers of personal data outside the European Union are not authorised unless the third country recipient of the personal data ensures an adequate level of protection, or appropriate guarantees are applied.
The European Commission has so far recognised Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, Israel, the Isle of Man, Japan, Jersey, New Zealand, Switzerland, Uruguay and the United States of America (limited to the Privacy Shield framework) as providing adequate protection. Adequacy talks are currently ongoing with South Korea.
Where a third state is not recognised as offering an adequate level of protection, appropriate safeguards must be deployed, such as the execution of Standard Contractual Clauses, Binding Corporate Rules, etc.
In the absence of adequacy decisions or appropriate safeguards, a transfer of personal data can rely on some exceptions (Article 49 of the GDPR), such as the explicit consent of the data subject, the performance of a contract between the data subject and the controller or the implementation of pre-contractual measures taken at the data subjects' request, etc.
Privacy Shield is a self-certification mechanism for companies established in the United States, which has been recognised by the European Commission as providing an adequate level of protection for personal data transferred to such companies. Before transferring personal data to a US company claiming to be Privacy Shield-certified, it is necessary to check that the company's certification is still active (certifications must be renewed regularly) and covers the processing in question.
Standard Contractual Clauses (SCC) are model contracts for the transfer of personal data adopted by the European Commission. There are two types of SCC: those governing transfers between two data controllers, and those governing transfers between a data controller and a data processor. In the Schrems II case, Maximilian Schrems challenged the sufficiency of the SCC to protect personal data transfers to the United States. According to the opinion of the Advocate General of the Court of Justice of the European Union, presented on 19 December 2019, the standard contractual clauses used for data transfers between EU countries and third countries are "valid". It is now necessary to await the position of the Court of Justice itself on this subject.
Binding Corporate Rules (BCR) constitute intra-group data protection agreements for the transfer of personal data. They must be evaluated and validated against the EDPB's standards. BCR must be legally binding and implemented by all relevant entities of the group of companies, must expressly confer rights on data subjects regarding the processing of their personal data, and must meet the requirements set out in the GDPR.
No government notifications or approvals are required to transfer data internationally.
The French Data Protection Act does not include provisions requiring personal data to be localised in France. GDPR rules apply in France to transfers of personal data, and no stricter rules are imposed by the French Data Protection Act.
French law provides that national treasures cannot be exported; public archives are considered national treasures. Hosting data outside France is considered exporting. The following Articles apply to exporting:
No software codes, algorithms or similar technical details are required to be communicated to the government.
The collection and transfer of personal data in connection with foreign government data requests may be limited in France through Articles 44 et seq of the GDPR pertaining to transfers of personal data outside the European Union, the French blocking statute (Law No 68-678 of 26 July 1968) and French Law No 2018-670 of 30 July 2018 on the protection of business secrecy.
The "French blocking statute" is Law No 68-678 of 26 July 1968 relating to the communication of documents and information of an economic, commercial, industrial, financial or technical nature to foreign individuals or legal entities.
The key measures of this law are as follows:
Big Data Analytics
See 2.2 Sectoral and Special Issues: Cookies and beacons for the regulation of analytics cookies and online trackers.
A profiling processing operation is based on the establishment of an individualised profile, relating to a particular person, in order to evaluate certain personal aspects of that person, with a view to making a judgement or drawing conclusions about him or her.
A fully automated decision is a decision taken with respect to an individual, through algorithms applied to his or her personal data, without any human being intervening in the process.
In practice, the two notions are closely linked: profiling a person frequently leads to a decision being made about them, and many fully automated decisions are made on the basis of profiling.
Article 22 of the GDPR regulates automated decision processing, whether it is based on profiling or not. Pursuant to this article, individuals have the right not to be subject to a decision based exclusively on automated processing and producing legal effects concerning them or significantly affecting them in a similar way. However, in certain specific cases, a person may be the subject of a fully automated decision, even if it has a legal effect or a significant impact on him or her. These exceptions concern (i) the decisions based on the explicit consent of the data subjects; (ii) the decisions necessary for the conclusion or performance of a contract; and (iii) the decisions framed by specific legal provisions.
Article 22 of the GDPR provides a framework for the use of personal data necessary for the operation of algorithms, as mentioned above.
Article 47 of the French Data Protection Act provides an additional exception for individual administrative decisions, provided that the processing does not involve sensitive data. Such decisions must include an explicit statement that the individual decision has been taken on the basis of algorithmic processing, and the data controller must be able to explain in detail and in an intelligible form to the data subject how the processing operation has been carried out.
On 19 February 2020, the European Commission published a white paper on artificial intelligence, which sets out the guidelines of the EU strategy on artificial intelligence and robotics, from technological, ethical, legal and socio-economic standpoints. The European Commission calls for a co-ordinated plan on artificial intelligence "made in Europe".
Internet of Things (IoT)
The CNIL has issued several practical guides on IoT and on securing connected devices, and in particular on connected TVs, toys, cars and kitchen robots, to raise awareness and to invite manufacturers to implement data protection by design and by default measures. In this context, the CNIL makes the following recommendations in particular:
See above sections on automated decision-making and artificial intelligence.
In addition, in 2017 the CNIL issued guidelines for connected vehicles for responsible data usage, which analyse three processing scenarios under which data collected in the vehicle:
The CNIL promotes the first scenario, which involves processing data locally in the vehicle, without transmission to service providers, as it provides for the best data protection guarantees.
The CNIL also states that the licence plate number and the vehicle serial number are personal data, as is any data relating indirectly to a data subject (eg, data relating to the journeys made, the condition of the vehicle's parts, the dates of technical inspections, the number of kilometres driven, driving style, etc).
The EDPB also issued guidelines 1/2020 on Connected Vehicles, which are open to public consultation until 20 March 2020.
Although the compliance pack currently applies to connected vehicles, two of its obligations have a considerable impact on the development of autonomous vehicles: the implementation of data protection by design and by default principles, as well as security by default underlined by the GDPR.
Regarding "facial recognition", the CNIL considers that a balanced overview is necessary in order to avoid any confusion and any blanket conclusion on this technology, as it covers a wide range of possible uses, from unlocking smartphones to opening bank accounts and recognising a person being sought by police in a crowd, each raising different issues.
Biometric processing, including facial recognition, has been identified by the CNIL as a key challenge under high scrutiny and is regulated by several legal frameworks:
The CNIL issued a position paper on the use of facial recognition techniques, which supported a risk-based approach to determine which risks are not acceptable in a democratic society and which ones can be assumed with appropriate safeguards. It also requires that the proportionality of the means deployed and the special protection afforded to children both be guaranteed. When the legal basis of the processing is the consent of the data subjects, the CNIL specifies that the provision of an alternative means that does not involve biometric devices, without additional cost or particular constraints for the data subjects, is essential in order to guarantee freedom of consent.
The CNIL also specifies that the data controller must regularly ensure that the automatic erasure of biometric templates is effective.
Geolocation is only considered by the CNIL for specific applications – eg, geolocation of employees' vehicles or geolocation to control employees' working time. The CNIL excludes the use of geolocation of employees' vehicles for monitoring compliance with speed limits, permanent control of employees, the collection of location data outside working hours, including to fight against theft or check compliance with the conditions of use of the vehicle, or to calculate the working time of employees when another device already exists. In addition, geolocation processing is considered by the CNIL as an intrusive means and is likely to require conducting a DPIA.
There is no privacy-specific law provision in France dedicated to drones, but the data protection and security by default principles apply. French Law No 2016-1428 dated 24 October 2016 only contains general provisions pertaining to security and training requirements regarding the use of certain drones.
Many collective and sectoral initiatives are initiated by industry stakeholders to mitigate risks relating to the use of disruptive technologies and to create insights for reasonable uses. The CNIL and industry professionals are elaborating sectoral codes of conduct to introduce best standards and harmonisation to specific and commonly used processing operations within an industry.
See 2.5 Enforcement and Litigation.
Several privacy-related issues occur in the context of due diligence processes in corporate transactions. Such implications are stressed by the recent statement of the EDPB adopted on 19 February 2020 regarding the privacy implications of mergers. Corporate transactions entail more and more privacy issues, such as the combination of personal data, the disclosure of personal data in the transaction process and securing the transaction process. For instance, the setting up of a data room as part of the due diligence process must comply with the main principles of the GDPR (eg, security measures to be implemented for access to the data room, prohibition of shared accounts, minimisation of communicated personal data, etc).
The Network and Information System Security Directive (NIS) of 6 July 2016 – transposed into French law by Law No 2018-133 of 26 February 2018, laying down various provisions adapting national law to European Union law in the field of security, and its Decree No 2018-384 of 23 May 2018 on the security of the networks and information systems of essential service operators and digital service providers – imposes obligations on digital service providers and operators of essential services. More details are provided in the France Law and Practice, and France Trends and Developments chapters of the Cybersecurity guide.
17, avenue Matignon
75378 Paris cedex 08
+33 1 5367 4747
+33 1 5367 4748
www.hoganlovells.com