UK General Data Protection Regulation and Data Protection Act 2018
The Brexit transition period ended on 31 December 2020; from that date the EU General Data Protection Regulation (Regulation (EU) 2016/679) (EU GDPR) no longer applies directly in the UK. Instead, the EU GDPR has been incorporated into UK law (as the UK GDPR) by virtue of section 3 of the European Union (Withdrawal) Act 2018, the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2019 (as amended by the Data Protection, Privacy and Electronic Communications (Amendments etc) (EU Exit) Regulations 2020) (together, the DPPEC Regulations) and the Data Protection Act 2018 (DPA 2018). For practical purposes, the UK’s data protection regime is almost identical to that which applied prior to Brexit, although that position is likely to change over time.
The UK GDPR and the DPA 2018 govern the processing of personal data – ie, data that relates to an identified or identifiable individual. The UK GDPR has extraterritorial effect, so that organisations that offer goods and services to, or monitor the behaviour of, individuals in the UK will be subject to its provisions, even where the collection and processing of personal data takes place outside the UK. Furthermore, the UK GDPR applies to processing that takes place in the context of the activities of a UK establishment (ie, an entity or branch located within the UK).
Generally, the UK GDPR applies to automated processing, but it also governs non-automated processing of personal data that forms part of a “filing system”. A filing system includes manual files that are sufficiently structured such that information about an individual is readily accessible.
“Personal data” is defined broadly to include identifiers such as name, identification number, location data, online identifiers and factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of individuals. Accordingly, an organisation will hold personal data where an individual’s identity is not known but the individual can be differentiated from other individuals – for example, by reference to a cookie identifier. The UK GDPR does not apply to data that has been irreversibly anonymised, but pseudonymised data remains within its scope.
The UK GDPR regulates controllers and processors. “Controllers” are organisations that determine the means and purposes of the processing of personal data; “processors” are organisations that process personal data on behalf of a controller. Processors have no real discretion as to how or why data is processed (although a controller’s instructions may permit limited discretion as to how those instructions are carried out by the processor).
The DPA 2018 supplements the UK GDPR by, among other things, setting out exemptions and derogations, making provision for the processing of personal data by law enforcement agencies (Part 3) and the intelligence services (Part 4), conferring functions and enforcement powers on the ICO, and creating criminal offences.
Following Brexit, the DPA 2018 must be read in light of the DPPEC Regulations.
ePrivacy Directive/Privacy and Electronic Communications (EC Directive) Regulations 2003
The ePrivacy Directive is currently under review by the EU, with negotiations between the Council of the European Union, the European Parliament and the European Commission on a new ePrivacy Regulation currently underway. As of the date of publication, agreement has been reached in some areas, but significant points of disagreement remain. Once the Regulation is finalised, the UK will need to decide whether to adopt similar legislation.
Investigatory Powers Act 2016 and Investigatory Powers (Interception by Businesses etc for Monitoring and Record-keeping Purposes) Regulations 2018
The provisions of the ePrivacy Directive that address confidentiality of communications now fall under the Investigatory Powers Act 2016 (IPA). The Investigatory Powers (Interception by Businesses etc. for Monitoring and Record-keeping Purposes) Regulations 2018 (IPIBR) contain exceptions to the general prohibition on the interception of communications that allow businesses to monitor communications lawfully, subject to satisfying strict requirements (see 2.4 Workplace Privacy).
Overview of Enforcement Environment
The UK’s data protection laws are enforced by the Information Commissioner's Office (ICO) (see 1.2 Regulators), which has a range of investigative, corrective and advisory powers. Long advocating a proportionate approach to enforcement, the ICO is an active regulator that is willing to utilise its investigative and corrective powers as required. In 2022, the largest single fine it imposed was GBP7,552,800 (compared with GBP500,000 in 2021), the total amount of fines imposed under the UK GDPR was GBP13.2 million (GBP535,000 in 2021), and the total amount imposed under PECR was GBP3.22 million (GBP3.27 million in 2021). The ICO has recently completed an extended, industry-wide investigation into data brokers (specifically the provision of offline marketing services by credit reference agencies operating in the UK), and has resumed its investigation into the adtech ecosystem. In September 2022, the ICO announced its provisional intent to fine TikTok Inc and TikTok Information Technologies UK Limited GBP27 million.
The DPA 2018 also creates criminal offences for the unlawful obtaining of personal data (which includes procuring the disclosure of personal data without the controller’s consent), re-identifying information that has been de-identified without the controller’s consent, and altering, blocking, erasing, destroying or concealing personal data to prevent its disclosure to an individual – for example, in response to an access request.
The ICO is the UK’s independent data protection and information rights regulator, responsible for regulatory oversight of, among other things, the UK GDPR, PECR, the Network and Information Systems Regulations 2018 (NIS Regulations), the Environmental Information Regulations 2004 and the Freedom of Information Act 2000.
Under the UK GDPR, the ICO has broad categories of powers, including investigative powers (eg, information notices and assessment notices), corrective powers (eg, enforcement notices and penalty notices of up to GBP17.5 million or 4% of annual worldwide turnover, whichever is higher), and authorisation and advisory powers.
Under PECR, the ICO may impose fines of up to GBP500,000 per breach and has a range of enforcement powers, including the investigation of complaints, requiring information to be provided, a limited form of audit power, requiring steps to be taken or not taken, and requiring that personal data processing ceases.
The extraterritorial scope of the UK GDPR means that the ICO has power to take action against controllers and processors based outside the UK. To date, this power has been used sparingly: the ICO issued an enforcement notice against the Canadian company Aggregate IQ (in the context of the ICO’s Cambridge Analytica investigation), requiring it to delete the personal data of UK citizens, and fined Clearview AI GBP7.5 million.
In November 2022, the UK Information Commissioner set out his strategic approach to regulatory action, which includes publishing all reprimands issued by the ICO on its website unless there is a good reason not to do so (eg, on grounds of national security or where publication would be likely to jeopardise an ongoing investigation).
ICO Investigations and Enforcement
The ICO has power to initiate investigations and act on complaints, utilising tools including information notices, assessment notices, enforcement notices and penalty notices.
The ICO seeks to take a proportionate approach to enforcement, prioritising cases in which there is a significant risk of harm to individuals. The process by which the ICO will assess whether and (if so) when to issue a notice and exercise its enforcement powers is set out in the ICO’s Regulatory Action Policy (currently under review) and its Statutory Guidance on Regulatory Action (recently the subject of a public consultation, which is now closed). There is a right of appeal against an enforcement notice to the First-tier Tribunal (Information Rights) within 28 calendar days.
Brexit and EU
On 31 January 2020, the UK left the EU and entered a transition period, which ended on 31 December 2020. On 28 June 2021, the European Commission published two adequacy decisions in respect of the UK: one for transfers under the EU GDPR and the other for transfers under the Law Enforcement Directive. These decisions contain the Commission’s assessment of the UK’s laws for protecting personal data, as well as the legislation designating the UK as adequate. The EU GDPR adequacy decision confirms that the UK provides adequate protection for personal data transferred from the EU to the UK under the EU GDPR.
The adequacy decisions are expected to last until 27 June 2025, although the Commission may withdraw them before this date if it determines that the UK no longer provides an adequate level of protection for personal data. In 2024, the Commission will decide whether to extend the UK adequacy decisions for a further period of up to four years. In the absence of any extension or earlier withdrawal, the adequacy decisions will expire on 27 June 2025.
Following Brexit, UK organisations must consider whether any of their data processing activities continue to be governed by the EU GDPR (eg, where the UK entity offers goods or services to an individual in the EU, or monitors their behaviour per Article 3(2)). In those circumstances, the UK entity may need to appoint a representative under Article 27 of the UK GDPR. EU organisations will need to undertake a similar analysis to assess whether they are subject to the UK GDPR.
The ePrivacy Regulation is set to replace the ePrivacy Directive in the EU once a final text is agreed. As a regulation, once implemented, it will apply directly in all EU member states, replacing any domestic implementation of the ePrivacy Directive. As the UK is no longer an EU member state, the ePrivacy Regulation will not automatically apply, but the UK may choose to implement similar provisions.
Data Protection NGOs
There are several specialist privacy and data protection NGOs in the UK, the most well-known including Privacy International, the Open Rights Group and Big Brother Watch.
These organisations lobby, participate in government consultations and sometimes engage in litigation.
Industry Self-Regulatory Organisations
Where organisations engage in advertising in the UK, they should be aware of the industry self-regulatory organisations, namely the Advertising Standards Authority (ASA) and the Committee of Advertising Practice (CAP), together with the Internet Advertising Bureau UK (IAB UK).
Following Brexit, the UK’s data protection regime is essentially the same as that of the EU – namely, an omnibus data protection framework. The UK’s implementation of the ePrivacy Directive, PECR, is aligned with equivalent laws across EU member states, subject to permitted national variations. The enforcement of both legal frameworks in the UK is generally proportionate, and the ICO is an active regulator.
See 1.4 Multilateral and Subnational Issues and the discussion of Schrems II under 1.8 Significant Pending Changes, Hot Topics and Issues and 4.2 Mechanisms or Derogations that Apply to International Data Transfers.
As discussed in 1.4 Multilateral and Subnational Issues, the European Commission published two adequacy decisions in respect of the UK, in relation to transfers under the EU GDPR and the Law Enforcement Directive. The EU GDPR adequacy decision confirms that the UK provides adequate protection for personal data transferred from the EU to the UK under the EU GDPR.
In the wake of the Schrems II judgment and the invalidation of the Privacy Shield, UK organisations must undertake a Schrems II transfer risk assessment for transfers of personal data that rely on appropriate safeguards under Article 46 (ie, Standard Contractual Clauses (SCCs) and binding corporate rules). These transfer risk assessments require UK organisations to assess whether an adequate level of protection is provided to personal data in the destination jurisdiction. Where this is not the case, UK organisations must consider whether additional safeguards can be implemented in connection with the transfer that would ensure an adequate level of protection. These may include legal safeguards (eg, additional contractual obligations), technical safeguards (eg, encryption of the data in transit/pseudonymisation) and organisational safeguards (eg, a procedure for handling and challenging government authorities’ requests for access to or disclosure of personal data). If no adequate safeguards can be implemented, the transfer should not take place.
New Standard Contractual Clauses
On 4 June 2021, the European Commission adopted new SCCs for cross-border transfers to non-adequate third countries. The new SCCs took effect on 27 June 2021. There was an 18-month grace period for organisations to transition any arrangements based on the old SCCs to the new SCCs, which ended on 27 December 2022. Therefore, as of 28 December 2022, the old SCCs can no longer be relied upon as a transfer mechanism under the EU GDPR.
Following Brexit, the new SCCs cannot be relied upon to facilitate transfers of personal data by organisations subject to the UK GDPR. On 2 February 2022, the UK Secretary of State published the international data transfer agreement (IDTA) and the international data transfer addendum to the new SCCs (Addendum), which came into force on 21 March 2022. The IDTA and Addendum replace the old SCCs and take into account the requirements of the Schrems II decision. There is a grace period for organisations to transition arrangements based on the old SCCs to the IDTA or Addendum, which will last until 21 March 2024. Until that date, the old SCCs can still be relied upon as a transfer mechanism for existing arrangements entered into before 21 September 2022.
In February 2021, the Council of the European Union agreed its text for the ePrivacy Regulation, paving the way for trilogue negotiations to begin. These negotiations are currently taking place between representatives of the three bodies involved in the EU legislative process: the Commission, the Parliament and the Council. The ePrivacy Regulation currently includes a two-year transitional period. The ePrivacy Regulation will not automatically apply in the UK, but the UK government may choose to implement similar provisions.
In response to the COVID-19 pandemic, the ICO published guidance reminding organisations that while data protection laws continue to apply, these should not stop organisations from using personal data proportionately. The ICO encouraged organisations to focus on the key principles of necessity, minimisation, transparency, fairness and security, and to ensure that individuals could continue to exercise their rights.
Data Protection Principles
Overarching principles are set out under Article 5 of the UK GDPR: lawfulness, fairness and transparency; purpose limitation; data minimisation; accuracy; storage limitation; integrity and confidentiality (security); and accountability.
Under Article 5(2), the principle of accountability requires controllers to implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing complies with the UK GDPR, and to keep those measures under review, updating them as required. Accountability also requires transparency with respect to the individuals whose data is processed. Increasingly, when investigating a complaint, the ICO reviews how controllers have complied with the accountability principle.
The UK GDPR permits a risk-based approach to compliance. Accordingly, controllers have a degree of flexibility, and the measures implemented will differ between organisations based on risk. Generally, the greater the risk presented by a processing activity, and the more intrusive it is to data subjects, the more the controller will need to do to implement robust security and accountability measures. Examples of accountability measures include internal policies and procedures, training, transparency measures, risk assessments, monitoring of internal compliance, senior leadership engagement and oversight, and strong internal enforcement in response to complaints.
To ensure data processing is lawful, the controller must satisfy one of the legal bases provided by Article 6 of the UK GDPR: consent; performance of a contract; compliance with a legal obligation; protection of vital interests; performance of a task carried out in the public interest or in the exercise of official authority; or legitimate interests (where these are not overridden by the interests or fundamental rights and freedoms of the data subject).
Special Category Data
To process special category data and data relating to criminal offences, controllers must not only ensure a lawful basis for processing, but also establish that they can rely on an exemption to the UK GDPR’s general prohibition on the processing of such data under Article 9 (see 2.2 Sectoral and Special Issues).
Data Protection Officer
Article 37 of the UK GDPR requires controllers and processors to appoint a Data Protection Officer (DPO) if they are a public authority or where their core activities consist of processing operations that require regular and systematic monitoring of data subjects on a large scale, or consist of large-scale processing of special category data or data relating to criminal convictions and offences.
Organisations may appoint a DPO voluntarily, in which case they will be held to the same standards with respect to their DPO as organisations that are obliged to appoint one.
The role of a DPO is to monitor data protection compliance, inform and advise the organisation on data protection (including in relation to Data Protection Impact Assessments (DPIAs), as discussed below) and to co-operate with the supervisory authority. For example, the DPO will ensure that appropriate training is provided to employees, and act as a contact point for regulators and for data subjects.
Data Protection Impact Assessments
Where data processing activities are likely to result in a high risk to the rights and freedoms of individuals, having regard to the nature, scope, context and purposes of the processing, controllers must undertake a DPIA under Article 35 of the UK GDPR. A DPIA is typically required where processing involves the use of new technologies, and must be undertaken where the processing involves a systematic and extensive evaluation of personal aspects based on automated processing (including profiling) on which decisions are based that produce legal or similarly significant effects; large-scale processing of special category data or criminal convictions data; or systematic monitoring of a publicly accessible area on a large scale.
A DPIA is a key accountability measure for organisations, intended to assist in identifying and mitigating risk. What constitutes “high risk” is not specified in the UK GDPR, but the ICO has published guidance on the matter. The ICO considers that a DPIA should always be undertaken when profiling is carried out on a large scale, where data is combined, compared or matched from multiple sources, or where profiling, automated decision-making or special category data is used to help make decisions concerning access to a service, opportunity or benefit.
Where a DPIA does not demonstrate that the risks of processing may be adequately mitigated, the controller must consult the ICO under the prior consultation procedure before commencing the processing.
Data Processing Agreements
When a controller engages a processor to perform processing activities on its behalf, it must enter into a data processing agreement that complies with Article 28 of the UK GDPR. Obligations imposed on the processor under any such agreement must, in turn, be flowed down to any subsequent processor in the chain – ie, a “sub-processor”. While not required by the UK GDPR, such contracts generally include an apportionment of liability for data breaches and other violations of data protection laws between the parties.
Article 32 of the UK GDPR requires controllers and processors to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk of their processing. To determine which measures are appropriate, organisations should take into account the state of the art, the costs of implementation and the nature, scope, context and purposes of the processing, as well as the likelihood and severity of the risk for individuals’ rights and freedoms.
The UK GDPR suggests a range of measures that may be appropriate. These include pseudonymisation and encryption, measures that ensure the ongoing confidentiality, integrity, availability and resilience of processing systems, measures that ensure the availability of and access to personal data in the event of a physical or technical incident, and a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring security.
Notably, security measures form an important part of the privacy “by design and by default” approach to data protection that organisations are required to adopt under Article 25 of the UK GDPR, in addition to the measures and obligations set out above.
A personal data breach under the UK GDPR is a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data. Article 33 requires a controller to report a personal data breach to the ICO without undue delay and, where feasible, within 72 hours of becoming aware of the breach. Notification is not required if the breach is unlikely to result in a risk to the rights and freedoms of natural persons. Processors must notify relevant controllers of any breach without undue delay, but are not required to notify the ICO.
Under Article 34, if the breach is likely to result in a high risk of adversely affecting the rights and freedoms of the affected data subjects, the controller must notify those data subjects without undue delay.
Special Category Data
The UK GDPR prohibits the processing of “special category” data (ie, personal data concerning racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data for the purpose of uniquely identifying a natural person, health data and data relating to a natural person's sex life or sexual orientation), unless one of a limited number of exceptions applies. Personal data relating to criminal convictions and offences is not “special category” data, but forms a separate category of personal data. This data is also regarded as sensitive, and may only be processed for limited purposes. Financial data is not a sensitive data category.
Special category personal data may be processed where the individual has provided explicit consent or where another Article 9 condition applies – for example, where the processing is necessary for the purposes of employment, social security or social protection law; to protect vital interests; in connection with legal claims; for reasons of substantial public interest; for health or social care purposes; for public health reasons; or for archiving, research or statistical purposes.
Certain additional grounds for processing special category data are provided in Parts 1, 2 and 3 of Schedule 1 to the DPA 2018, which replicate and extend grounds previously included in the DPA 1998. Many of these grounds now require the controller to implement an Appropriate Policy Document to document how the processing complies with the data protection principles.
Criminal Convictions Data
The processing of criminal convictions data may be carried out under the control of an official authority or where authorised by a UK law providing appropriate safeguards for the rights and freedoms of data subjects. The DPA 2018 effectively provides that such data may be processed on the same basis as special category personal data, as described above.
The processing of children’s data represents an area of regulatory focus in the UK, particularly in the provision of online services. On 2 September 2021, the ICO’s Age Appropriate Design Code (AAD Code) came into force following a 12-month implementation period. The AAD Code consists of 15 design principles to ensure that the interests of children are central to the design and development of online services that they are likely to access. Requirements include the need for settings to be “high privacy” by default, that only the minimum amount of data is collected, that geolocation services default to “off”, and that “nudge techniques” are not utilised to weaken privacy safeguards (eg, by encouraging children to provide unnecessary data).
The ICO will enforce compliance in accordance with its Regulatory Action Policy, requiring organisations to demonstrate compliance. While the failure to comply with the AAD Code is not a breach of the UK GDPR per se, non-compliance with the AAD Code makes it difficult for organisations to demonstrate compliance with the UK GDPR and/or PECR. As such, they may be subject to sanction, including warnings, reprimands, orders to cease processing data and fines.
Data Subject Rights
Chapter III of the UK GDPR provides data subjects with certain rights with respect to their data, as summarised below. The DPA 2018 contains a significant number of exemptions that determine whether the data subject rights apply in individual cases.
Data subjects have a right under Article 22 not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning them or similarly significantly affects them.
Controllers are required to facilitate and comply with requests from data subjects to exercise these rights and must take the action requested without undue delay and, in any event, within one month of receipt of the request. This deadline may be extended by a further two months where necessary.
PECR and Marketing Communications
Electronic marketing communications are governed by PECR in the UK (see 1.1 Laws). PECR requires prior consent to be obtained before any unsolicited electronic communications (eg, emails, texts and social media messages) are sent to individuals in their personal capacity. An exemption, known as the “soft opt-in”, is available where the consumer provides their contact details during a sale (or negotiations for a sale) during which they have an opportunity to opt out of marketing communications. Organisations may then deliver unsolicited marketing for products or services that are similar to those purchased by the consumer, provided there is an opt-out opportunity in each subsequent marketing communication.
Business recipients must be permitted to opt out of further communications, but prior consent to electronic communications is not required.
Telephone and Fax
PECR also governs the use of marketing conducted via telephone and fax. Organisations must not make live marketing calls to numbers registered with the Telephone Preference Service (TPS) or Corporate TPS, unless the person has specifically consented to receive their calls.
Marketing activities will also fall within the scope of the UK GDPR where they involve the processing of personal data, even where the recipient is contacted via their business email address. Accordingly, the requirements of both regimes need to be considered.
PECR and Cookies
PECR also regulates the use of cookies and similar technologies. Under PECR, consent must be obtained before cookies are stored on, or information is accessed from, a user’s device, unless the cookie is strictly necessary for the provision of a service requested by the user.
All monitoring in the workplace must be for a lawful purpose and must be fair and proportionate. Employers must balance their objectives against employees’ right to privacy. While employers have a legitimate interest in securing their systems, this must be balanced against the intrusiveness of any workplace monitoring. Only in limited circumstances can organisations undertake covert monitoring (eg, where there is reasonable suspicion of criminal behaviour).
Monitoring must be conducted in a transparent manner and the nature of the monitoring should be within the reasonable expectation of employees, who must be given notice. The UK GDPR’s other processing principles (eg, purpose limitation and data minimisation) should also be complied with.
Workplace monitoring of electronic communications is regulated by the IPA and the IPIBR. Under this regime, interception of communications in the course of transmission by employers is permitted only where the monitoring relates to business activities and the employer has made all reasonable efforts to inform both parties to the communication that interception may take place. Monitoring should only be conducted on communications systems provided by the employer, although personal communications sent using business systems may sometimes be intercepted.
Whistle-blower hotlines (whether telephone or web-based) typically involve the processing of personal data, and must comply with data protection requirements. Relevant personal data is likely to concern the whistle-blower, the subject of the report, the incident and details of any follow-up. The principles described in 2.1 Omnibus Laws and General Requirements must be met. Additional complexity will arise if the hotline is part of a global reporting tool, and steps will need to be taken to determine whether personal data needs to be transferred abroad (eg, to the USA).
The European Parliament and Council of the European Union adopted the Whistleblower Directive 2019/1937. The Directive introduces a common EU regime for the protection of persons who report breaches of EU law. It sets out, among other things, procedures for reporting channels, follow-up of reports of breaches, prohibition of penalisation and provisions in relation to confidentiality. Member states were required to implement the Directive into their respective national laws by December 2021; however, as of February 2023, only seven member states have passed implementing laws. Following Brexit, the UK is under no legal obligation to implement the Whistleblower Directive and, as of February 2023, it has not chosen to do so and is not expected to do so in the future. The UK was one of the countries that the European Commission had already deemed to provide comprehensive protection for whistle-blowers (under the Public Interest Disclosure Act 1998 as incorporated into the Employment Rights Act 1996). While much of the content of the Whistleblower Directive is already contained in UK law, the UK whistle-blower regime does not cover everything contained in the Directive.
Under the UK GDPR, the ICO has extensive enforcement powers, as described in 1.3 Administration and Enforcement Process. The sanctions imposed on controllers and processors that infringe the UK GDPR will depend on the severity of the breach, considering the nature, gravity and duration of the infringement, as well as whether the infringement was intentional, the degree of responsibility of the organisation for the incident, previous infringements and mitigating steps (see 5.3 Significant Privacy and Data Protection Regulatory Enforcement or Litigation).
Implementation of the UK GDPR has triggered an increase in class actions, which typically (but not exclusively) operate on an opt-in basis. The UK GDPR also explicitly provides that compensation should be available to any data subject who suffers damage (eg, where a data subject has lost money), even where the damage is not material (eg, where a data subject has suffered distress). A recent decision has clarified that compensation is not available for mere “loss of control” of personal data in the absence of material damage.
In addition, the ICO may conduct investigations and is currently investigating the adtech sector generally, signalling that enforcement will follow.
The IPA regulates the targeted and bulk interception and acquisition of electronic communications by law enforcement agencies (such as police forces) in the UK. The IPA prohibits such activities unless they are carried out with “lawful authority”.
Part 2 of the IPA allows a law enforcement agency to obtain a warrant relating to the targeted interception of communications (ie, to a particular person or organisation, or a group of persons who share a common purpose or carry out an activity together).
Part 3 of the IPA allows designated senior officers of certain public authorities (including police forces) to authorise officers to obtain data from any person that relates to a telecommunications system or is derived from a telecommunications system, where it is necessary to obtain that data for the purpose of a specific operation or investigation.
Warrants issued under Part 2 of the IPA are issued by the Secretary of State and must be approved by a Judicial Commissioner appointed under the IPA. The Investigatory Powers Commissioner is responsible for oversight of actions by public authorities carried out under the IPA.
Section 2 of the IPA imposes a general duty in relation to privacy upon public authorities (including police forces) exercising powers authorised by the IPA. In particular, public authorities must have regard to whether what is sought to be achieved by a warrant or authorisation issued under Part 2 or 3 of the IPA could reasonably be achieved by other less intrusive means.
In addition to the IPA, Part 3 of the DPA 2018 imposes a number of obligations on law enforcement agencies in relation to their processing of personal data. For example, law enforcement agencies must comply with six data protection principles (such as that processing of personal data for law enforcement purposes must be lawful and fair), data subjects may exercise rights (such as the right of access) against law enforcement agencies that process personal data, and law enforcement agencies are subject to a number of accountability obligations (such as an obligation to maintain records of personal data processing activities).
The IPA also applies to the obtaining of information by government agencies for intelligence, anti-terrorism or other national security purposes. In addition to obtaining warrants under Part 2 of the IPA or authorisation under Part 3, government agencies may also obtain data in the following ways.
Any decision to issue a retention notice under Part 4, or a warrant under Part 6 or 7, must be approved by a Judicial Commissioner.
Part 4 of the DPA 2018 imposes obligations on intelligence services in relation to their processing of personal data, although those obligations are more limited than those imposed on law enforcement agencies.
There is no specific legal basis in UK law that permits UK organisations to collect and transfer personal data in connection with a foreign government access request. In some instances, UK-based organisations may have a legitimate interest under Article 6 of the UK GDPR (to process personal data) and Article 49 of the UK GDPR (to transfer personal data). Organisations should consider any such collection and transfer of personal data carefully, as in many instances an individual’s rights and freedoms are likely to override any legitimate interest the organisation has in complying with the foreign government access request.
The UK and USA entered into a data sharing agreement on 7 October 2019 in connection with the US CLOUD Act 2018, for the purpose of facilitating cross-border data sharing by law enforcement agencies to counter serious crime.
UK intelligence agencies have extensive powers under the IPA to obtain data (including personal data) for a range of purposes. Debate is ongoing as to whether those powers go beyond what is necessary and proportionate. This will be a key consideration for the European Council in determining whether to approve the European Commission’s draft adequacy decision for the UK. Even if an adequacy decision is given, privacy activists have signalled that they will likely challenge it, arguing that UK intelligence services have data acquisition and retention powers that go beyond what is necessary in a democratic society.
The UK GDPR prohibits the transfer of personal data from the UK to third countries or international organisations unless the provisions of the UK GDPR are complied with. These are discussed in 4.2 Mechanisms or Derogations that Apply to International Data Transfers.
Personal data may lawfully be transferred outside the UK in the following circumstances.
Following the Schrems II case, the Privacy Shield is no longer recognised as a valid data transfer mechanism.
There are no general requirements to obtain government approval for international data transfers, except in the limited scenarios where organisations rely on the compelling legitimate interests derogation.
There are no data localisation requirements under UK law. There are restrictions on how personal data may be transferred internationally (see 4.2 Mechanisms or Derogations that Apply to International Data Transfers).
There are no general requirements under UK law for software code, algorithms or similar technical detail to be shared with the government.
An organisation subject to the UK GDPR seeking to transfer personal data abroad in connection with foreign government data requests, foreign litigation proceedings or an internal investigation will need to meet the requirements of the UK GDPR. There must be an appropriate legal basis for the data processing itself (ie, collating the relevant personal data for the purpose of the request) and an appropriate legal basis for any transfer of the data abroad. Such requests must be examined carefully, on a case-by-case basis. The nature of the data and the proposed purpose of the processing will be relevant, as will the manner in which the data is to be disclosed and used abroad.
The UK does not have any “blocking” statutes in the context of data protection. Specifically, the UK has removed Article 48 of the GDPR from the UK GDPR. Previously, this provision operated as a blocking provision where a court or tribunal in a third country required an EU controller or processor to disclose personal data.
Emerging and developing technologies all raise similar compliance issues, particularly concerning transparency, proportionality and explainability. The UK GDPR and DPA 2018 do not explicitly reference new technologies, except in relation to the obligations to carry out DPIAs (see 2.1 Omnibus Laws and General Requirements).
The ICO has issued guidance on AI and machine learning, entitled Explaining decisions made with AI (developed in collaboration with the Alan Turing Institute). This is also relevant to automated decision-making and linked issues. It emphasises that providing explanations regarding AI-assisted decisions is one way to demonstrate that the individual has been treated fairly and in a transparent manner. This can be challenging, as AI and machine learning systems are often designed to solve problems and spot patterns beyond the capability of humans. The manner in which they achieve this may not be fully understandable to those deploying the relevant technologies, let alone explainable to those whose personal data is utilised. Information provided to individuals must not be so technical that individuals cannot understand the nature of a sophisticated system’s processing activity.
The topic of online harms is currently under government review and a White Paper for consultation was issued in 2020, in which the government proposes specific regulation of a number of areas.
The topic of biometrics, including facial recognition, is not directly addressed in UK law, but see 5.3 Significant Privacy and Data Protection Regulatory Enforcement or Litigation for action taken by the ICO and the courts in respect of biometric data.
UK organisations are starting to develop ethical approaches to the use of personal data. These are usually intended to guide employees, but they also build trust with customers. For example, IBM published an ethical framework for big data analytics dealing with how data will be collected and used, and Vodafone published a set of privacy commitments focusing on transparency and minimising the risks associated with using people’s personal data. These organisational data processing frameworks typically align with the key data protection principles and requirements outlined in the UK GDPR, often with a focus on fairness and transparency.
In 2022, the ICO imposed fines totalling GBP13.2 million (compared with GBP535,000 in 2021). In September 2022, the ICO also announced its provisional intent to impose a fine of GBP27 million on TikTok Inc and TikTok Information Technologies UK Limited.
Voice Recognition Technology
In May 2019, the ICO issued an enforcement notice against Her Majesty’s Revenue and Customs (HMRC) for processing biometric data without a valid legal basis under the GDPR. HMRC introduced a voice verification system and collected the biometric data of 7 million data subjects without satisfying a legal basis for the processing; the consent obtained by HMRC failed to meet the GDPR standard for valid consent. The ICO did not issue a monetary penalty, but required HMRC to delete all biometric data held under the Voice ID system for which it did not have explicit consent.
Automated Facial Recognition Technology
A civil liberties campaigner, Mr Bridges, brought judicial review proceedings after South Wales Police (SWP) launched a project involving the use of automated facial recognition (AFR) technology. SWP deployed AFR technology in certain public locations where crime was considered likely to occur, matching captured images with “watchlists” of wanted persons in police databases using biometric data analysis. Mr Bridges challenged the use of AFR technology as being unlawfully intrusive, including under Article 8 of the European Convention on Human Rights (right to respect for private and family life). On appeal, the Court of Appeal held that the use of AFR technology was unlawful and violated human rights.
In May 2022, the ICO imposed a fine of GBP7.5 million on Clearview AI Inc. In addition, the ICO issued a notice requiring Clearview AI Inc to stop further processing of the personal data of people in the UK and to delete it, following alleged serious breaches of the UK’s data protection laws. The fine relates to a number of issues, including Clearview AI Inc’s use of biometrics for facial recognition.
In June 2021, the ICO published a formal Information Commissioner’s Opinion addressing privacy concerns on the use of live facial recognition technology in public places. The Opinion was informed in part by six ICO investigations into the use, testing or planned deployment of live facial recognition systems.
Class Actions and Collective Redress
There have been a number of class actions alleging privacy infringements, in part due to the GDPR providing that compensation should be available to any data subject who suffers damage, including non-material damage. Furthermore, the GDPR permits representative bodies to pursue actions on behalf of data subjects. Class actions have been brought following cybersecurity breaches, with British Airways, Marriott and easyJet all facing claims. In July 2021, British Airways settled a class action brought by thousands of customers impacted by the cybersecurity incident.
WM Morrisons Supermarkets plc v Various Claimants is a class action arising out of the deliberate and unlawful publication of employee payroll data by a disgruntled employee. Affected employees commenced litigation against Morrisons for breaches of the DPA 1998 (in force at the time of the breach), misuse of private information and breach of confidence. The claimants also argued that Morrisons was vicariously liable for the actions of its disgruntled employee. Morrisons was found not to be vicariously liable, but the Supreme Court left open the possibility of employee class actions following data breaches in future.
In November 2021, the UK Supreme Court issued its judgment in Lloyd v Google. The Supreme Court was asked to consider whether an opt-out style of representative action is permitted (it being accepted that all members of the class suffered the same loss), and whether loss of control over personal data can be compensated in the absence of any other material harm. The Supreme Court ruled in favour of Google, finding that the representative claim against Google should not be allowed to proceed. Moving forward, the decision means that actions for collective redress in relation to violations of data protection law in the UK will likely need to be based on the affirmative opt-in of the represented claimants, and each claimant will need to demonstrate the material damage that they suffered.
Privacy issues often arise in the context of a corporate transaction, notably where the target organisation is a data-heavy business. Due diligence represents a key activity in most corporate transactions, enabling the buyer to assess the value of the target business, having regard to compliance gaps and risks associated with the target company. From a privacy perspective, the buyer will typically use the due diligence phase to examine compliance with applicable data protection laws, usually focusing on the target’s approach to employee data and consumer data, and carefully examining the development and deployment of key data processing technologies.
Most due diligence processes will involve the seller sharing personal data with the buyer, and often with the buyer’s professional advisers and a virtual data room provider. This creates certain risks, including the risk of personal data being inadvertently disclosed to, or accessed by, unauthorised third parties. Sellers and buyers typically take steps to mitigate the risks associated with conducting due diligence, including the following.
There are no non-privacy/data protection-specific laws that mandate disclosure of an organisation’s cybersecurity risk profile or experience.
An emerging trend that is gathering pace is the potential for regulatory overlap between privacy and competition law. The proposed EU Digital Markets Act, the Digital Services Act, the Data Act and the e-Privacy Regulation raise questions as to how the need to regulate important aspects of the digital ecosystem will sit alongside or overlap with the core data protection framework of the UK GDPR. In addition, there is growing convergence between competition regulators, consumer rights regulators and data protection regulators, each with a different focus, seeking to regulate an increasingly similar set of issues.
The UK is attempting to address these issues and, following Brexit, may now have the freedom to create solutions that better meet the UK’s needs. For example, now that the UK is no longer a member of the EU, it is under no legal obligation to implement EU laws, including the Digital Markets Act, the Digital Services Act, the Data Act or the e-Privacy Regulation. As of February 2023, the UK has given no indication that it intends to implement these laws, and it remains to be seen whether the UK will adopt similar laws of its own. While these EU laws do not directly apply to UK companies, they may nevertheless impact UK companies that operate in the EU or have establishments in the EU.
For completeness, operators of essential services (OES) such as utilities, and relevant digital service providers (RDSP) (such as providers of digital marketplaces, online search engines and cloud services) are subject to the EU Network and Information Systems Directive, implemented in the UK by the NIS Regulations. The NIS Regulations seek to ensure common levels of security to safeguard critical infrastructure. The ICO is the competent authority for RDSPs in the UK.
The UK NIS regime also includes an implementing act for digital service providers (known as the DSP Regulation), and specifies security requirements and incident reporting thresholds for certain organisations. While the UK GDPR concerns personal data, the NIS Regulations concern the security of network and information systems. That said, there is a significant crossover between the UK GDPR and the NIS Regulations, due in particular to the UK GDPR’s security requirements. In this respect, the application of the NIS Regulations is broader as it applies to digital data, and not just personal data.
Under the NIS Regulations, RDSPs are required to notify the ICO of any incident having an actual adverse effect on the security of network and information systems. Similar to the UK GDPR, this notification must be made without undue delay and no later than 72 hours after becoming aware of the incident. The NIS Regulations specify the information that must be included.
The ICO’s enforcement powers in respect of the NIS Regulations include issuing enforcement notices, exercising powers of inspection and imposing monetary penalties of up to GBP17 million in the most serious cases.
30 St Mary Axe
+44 (0) 20 7220 5700
+44 (0) 20 7220 5772
info@HuntonAK.com
www.HuntonAK.com
UK Data Protection After Brexit: Challenges and Opportunities
Data protection regulation is exploding around the globe. Driven by the necessity of processing personal data, rapid developments in technology and evolving consumer expectations, countries are increasingly turning to omnibus privacy frameworks, frequently modelled on the EU’s General Data Protection Regulation (GDPR). Within this rapidly evolving regulatory environment, the UK is perhaps uniquely placed. It has a long history of regulation in this area, dating back to the 1960s and 1970s, ratifying the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Treaty 108) and then passing its first comprehensive data protection law, the Data Protection Act 1984. While the UK’s post-Brexit data protection framework is the UK GDPR, now that the UK is no longer a part of the European Union, there is an opportunity for the UK to mastermind its own data protection future. What does this mean for data protection regulation in the UK going forward, and what should companies expect?
UK’s post-Brexit data protection framework
In June 2021, the European Commission adopted a decision recognising the UK’s data protection regime to be adequate. Notably, the European Commission’s decision mandates a review after four years, with adequacy lapsing unless reaffirmed within the four-year timeframe. Any significant changes to the UK’s data protection regime, particularly those that weaken the rights of data subjects, will be considered by the European Commission in reviewing the decision and may result in the decision not being reaffirmed. The UK government has expressed a desire to reform the UK’s post-Brexit data protection regime; to that end, it carried out a consultation on potential reforms to the UK’s regime and published the draft Data Protection and Digital Information Bill. In the coming months and years, the UK will need to balance this potential liberalisation of its data protection regime against the possibility that significant changes may result in the non-renewal of the UK’s adequacy decision and limitations on the free flow of personal data from the EU to the UK.
One of the most discussed post-Brexit data protection topics is data transfers. In the post-Schrems II world, the GDPR data transfer framework continues to challenge businesses. The Information Commissioner’s Office (ICO) has published guidelines on how organisations should assess the risks associated with international data transfers flowing from the Schrems II decision. The ICO also has published an International Data Transfer Agreement (IDTA) and an Addendum to the EU standard contractual clauses (SCCs), which must now be used for any new transfers of personal data outside the UK. The IDTA is the UK’s answer to the EU SCCs and constitutes a form of appropriate safeguards under the UK GDPR. The Addendum may be appended to the EU SCCs by organisations subject to the UK GDPR, and doing so will allow those organisations to rely on the EU SCCs in relation to data transfers under the UK GDPR.
Those operating in the UK need to navigate several issues in this area.
The extent to which the ICO will seek to bring enforcement actions in relation to international data transfers from the UK remains to be seen, but this has not historically been an area of focus of the ICO.
Global privacy framework
The concept of exporting applicable rights and obligations together with data is not unique to the EU; it is also a feature of the Asia Pacific Economic Cooperation (APEC) Privacy Framework’s Cross Border Privacy Rules (CBPR). The approach creates challenges for data importers who must ensure that they can honour these requirements across their platforms, even where the requirements differ from their own domestic data protection laws. This is one of the factors driving interest in the creation of a global data protection standard. The GDPR is becoming widely adopted around the globe, or at least is forming the basis of emerging data protection laws in many countries.
But the GDPR is far from the only approach. Treaty 108 has recently been updated, and the APEC Privacy Framework (based on the OECD Guidelines) continues to expand. Unlike the EU approach, the APEC Privacy Framework does not require members to have the same laws, but requires a consistent approach, based on nine privacy principles. APEC’s Cross-Border Privacy Rules (CBPRs) enable data transfers between signatories to the APEC Privacy Framework. As the limits of the EU’s adequacy approach emerge, notably the challenge of ensuring “essential equivalence”, the search for creative solutions to enable global data flows while ensuring strong data protection will continue. The UK is well-placed to build bridges with other privacy regimes, and may be incentivised to do so given that data has become an inextricable part of global trade negotiations. In August 2021, the UK government published its mission statement entitled “International data transfers: building trust, delivering growth and firing up innovation”, which underlines its intention to ensure the free flow of data between the UK and key global economies, by recognising those countries as adequate or adopting alternative data transfer mechanisms that will facilitate data transfers to those countries. Priority countries identified by the UK for the adoption of adequacy decisions include the US, Australia, Brazil, India and Singapore, although as at February 2023 limited progress has been made and only an adequacy decision in respect of South Korea has been finalised.
One consequence of the Schrems II decision, and the more conservative approach that it signals to EU data exports, has been increased discussion of data localisation and data sovereignty. Anecdotally, vendors are ramping up their EU data processing capacity, creating EU clouds and EU service capability so that EU clients’ personal data does not need to be transferred outside the region. There has been open discussion at a political level of the merits of the EU pursuing a data localisation strategy. While localisation and increased data protectionism are not new and invariably result in discussions about big data analytics, increasingly these issues are on the table in trade negotiations. It is notable that the EU-UK Trade and Cooperation Agreement does not contain a localisation provision. As the UK explores trade opportunities outside the EU, it is hoped that it will seek to maintain, rather than constrain, the free-flow of personal data across borders.
An emerging trend that is gathering pace is the potential for regulatory overlap with data protection principles. The proposed EU Digital Markets Act, the Digital Services Act and the ePrivacy Regulation raise questions as to how the need to regulate important aspects of the digital ecosystem will sit alongside or overlap with the core data protection framework of the GDPR. In addition, there is growing convergence between competition regulators, consumer rights regulators and data protection regulators, each with a different focus, seeking to regulate an increasingly similar set of issues. The UK is attempting to address these issues and, outside of the EU, may now have the freedom to create solutions that better meet the UK’s needs.
There has been a more limited use of the ICO’s fining powers since the significant fines it issued in 2020 against British Airways (GBP20 million), Marriott (GBP18.4 million) and Ticketmaster (GBP1.25 million). In May 2022, the ICO imposed on Clearview AI Inc a fine of GBP7.5 million in relation to its collection of a database of more than 10 billion facial images, which it used for carrying out biometric identification of individuals. The ICO took formal enforcement action on approximately 60 occasions in 2022, but there were no other significant fines and the majority of fines imposed relate to electronic marketing practices. The reason for the ICO’s more limited enforcement action in 2022 compared to previous years is unclear, but it is possible that the reduction in the British Airways and Marriott fines in 2021 has caused the ICO to adopt a more conservative stance in issuing significant monetary penalties, and that the ICO has adopted a more pragmatic approach to enforcement during the COVID-19 pandemic. The ICO has, however, indicated that going forward it intends to more actively use its enforcement powers, and that it is expected to be nimbler now that it is no longer required to consult with EU authorities before taking action.
Private rights of action
Recently, there have been important developments in relation to private rights of action for data protection infringement. Historically, under the old Data Protection Act 1998 (DPA 1998), recovery of damages for breach of data protection was rare. Typically, the loss that arises from data protection infringement is distress rather than financial loss, and it was generally thought that the DPA 1998 required the establishment of financial loss before non-financial loss could be considered by the courts. That position changed in Vidal-Hall v Google Inc (2015) EWCA Civ 311 when the Court of Appeal, noting that the purpose of data protection legislation is not to protect economic rights, determined that financial loss is not a prerequisite to damages in data protection claims. Now, under the UK GDPR, it is clear that compensation can be awarded for both material and non-material damage (such as mental distress).
In 2021, the Supreme Court in Lloyd v Google LLC (2021) UKSC 50 considered the question of whether compensation for damages under UK data protection law can be awarded for mere “loss of control” of personal data in the absence of any material damage (ie, financial loss or distress). The Supreme Court ultimately decided that compensation is recoverable only when violation of the law results in tangible and material damage to impacted individuals. The decision will provide reassurance that technical breaches of data protection law that do not give rise to real harm to individuals will not be sufficient grounds for a compensatory award. In 2022, there were a number of decisions that affirmed the standard in Lloyd and further indicated that, as a general matter, the threshold for proving actionable distress in the context of a data protection claim is high.
Usually, claims for breach of the DPA 2018 are made in conjunction with other claims, such as for breach of privacy, misuse of private information, breach of confidence, defamation and breach of employment rights. Damages awards for breach of the DPA 1998 and the DPA 2018 have varied. The majority to date have been for modest sums, but that position may well evolve in future.
Growth in UK class action litigation
Of interest to potential defendants in the UK is the emergence of class action litigation involving data protection claims. In the UK, these actions generally take one of two forms: group litigation orders and representative actions.
Group litigation orders proceed on an opt-in basis (unlike US class actions, which are opt-out). There has been an increase in class action claims under the GDPR, with most of the larger cybersecurity breaches spawning such claims, and specialist class action law firms ready to facilitate them. Accordingly, British Airways, Marriott and easyJet are all facing claims, among others. The Supreme Court in Lloyd v Google (2021) UKSC 50 severely curtailed the ability of claimants to bring representative actions, finding that a representative action can proceed only when each individual represented in the claim has suffered the same loss, and that Lloyd was unable to demonstrate that each represented claimant had, in fact, suffered the same loss. The decision sets a high bar for collective redress under UK data protection law. Going forward, actions for collective redress generally will need to proceed on an opt-in basis, and each claimant will need to demonstrate the material damage that they have suffered. Representative action claims generally will be appropriate only in circumstances where there is a very high degree of conformity in the damages suffered by each of the represented claimants, such that there is no need for each claimant to individually demonstrate the damage they have suffered. Further developments in this area are expected in the coming years.
The ICO’s role post Brexit
Against the backdrop of Brexit, it is reasonable to consider what the future role of the ICO should be. Following the UK’s departure from the EU, the ICO has no formal role within the European Data Protection Board (EDPB), and is no longer part of the EU GDPR’s consistency mechanism or one-stop-shop. Once the largest data protection regulator in the EU (with a headcount of 822 permanent staff in March 2021), the ICO provided significant support to the work of the EDPB, and was known for its proportionate and pragmatic approach to issues. Now, the ICO must find its feet outside of the EU. Indeed, this task began some time ago, marked by publication of the ICO’s International Strategy in 2017. The ICO is influential within the international community of data protection regulators, currently chairing the Global Privacy Assembly and the International Conference of Information Commissioners. Following Brexit, the ICO is in an important position to bridge the gap between the APEC nations and the GDPR nations, and ideally placed to be the voice of reason challenging some of the more conservative views on regulatory enforcement. There is growing interest in the role of smart regulation in data protection, and in regulatory approaches that do not merely wield a stick, but incentivise responsible data usage to create efficiencies and value, while respecting individual rights. The ICO should be a leading voice in this debate, which is of increasing importance in the context of converging technologies and ever smarter data processing. The debate is also of practical importance to the UK as it navigates new trade opportunities following Brexit.
Just as the UK must seize the opportunity to forge a relevant role on the international stage following Brexit, the ICO must also create a meaningful international role as a data protection regulator with regulatory roots in the EU GDPR. There is a true opportunity for the UK to contribute fresh thinking and creativity to advance (if not solve) some of the more difficult data protection problems, including cross-border data transfers and smart regulation. Organisations with interests in the UK should monitor these developments closely.