Data Protection & Privacy 2024 Comparisons

Last Updated March 12, 2024

Contributed By AWO

Law and Practice

Authors



AWO provides a unique offering in the rapidly developing area of AI, technology, data protection and privacy. By bringing together renowned sector experts and award-winning solicitors, it has grown to be a standout firm in this sector. The team uses its expertise to help a diverse range of clients navigate these complex areas of law in a rights-respecting, socially responsible and commercial way. AWO has created an innovative platform to provide legal services, consultancy and public affairs advice. AWO’s unique model is sought after by clients who wish to understand and shape some of the most significant issues arising from the emergence of new technologies and the mass collection of individuals’ personal data. As a result, AWO is now recognised as a leading firm in this changing and complex space. AWO is rarely absent from the front lines of analysis, commentary and strategic action in both the UK and the EU.

Data Protection Laws in the UK

The UK does not have a codified constitution. The Human Rights Act 1998 incorporates the European Convention on Human Rights. The European Court of Human Rights has acknowledged that the protection of personal data is of fundamental importance to a person’s enjoyment of his or her right to respect for private and family life, home and correspondence, as guaranteed by Article 8 of the Convention.

The UK’s data protection framework further intends to “contribute to the protection of individuals’ fundamental rights and freedoms”. Personal data is thus protected as a fundamental right in the UK, and the rights provided cannot be abridged by, for example, a contract.

Data protection has been termed “the law of everything”. When coupled with the need to ensure “effective and complete” data protection for individuals, this requires considering data protection and privacy issues holistically.

The main laws governing data protection in the UK are the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act (DPA) 2018. Together, they regulate the processing of personal data. The regime applies to “personal data” rather than to individuals as such. “Personal data” is defined broadly, as any information relating to an identifiable person. This is wider than the concept of “personally identifiable information” commonly used in the USA. It covers pseudonymous identifiers, meaning the UK GDPR applies even where an individual’s named identity is not known but the individual can be differentiated from other individuals – for example, by reference to a cookie identifier.

Both controllers (entities determining the “why” and “how” of data processing) and processors (entities processing data on the controller’s behalf, with limited discretion) are covered by the legislation.

The UK GDPR has extraterritorial application, encompassing controllers and processors that are (i) operating in the UK or (ii) offering goods or services to, or monitoring the behaviour of, individuals in the UK, even where the collection and processing of personal data take place outside the UK.

The DPA 2018 supplements the UK GDPR by, for example, providing additional conditions for the processing of special categories of personal data (see 2.2 Sectoral and Special Issues). The DPA 2018 also provides the framework for the powers of the Information Commissioner’s Office (ICO) and individual rights to complain and seek redress in courts. The DPA 2018 also contains provisions on the processing of personal data by law enforcement and intelligence services.

The current UK government has stated its commitment to reform the UK GDPR and DPA 2018 through primary legislation (see 1.8 Significant Pending Changes, Hot Topics and Issues).

The Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) address the operation of digital technology, such as cookies and other tracking technologies, and the processing of traffic and location data. PECR is therefore pertinent to data protection considerations, albeit the rules are under review in Europe given their age. It is unknown whether the UK will adopt a similar approach to updating PECR.

The Investigatory Powers Act (IPA) 2016 contains general privacy protections relating to the unlawful interception of communications and unauthorised access to data and systems. It also gives UK law enforcement, intelligence and other agencies a legal framework for exercising surveillance powers, including intercepting communications and accessing data and computer systems, and establishes a data retention regime.

Privacy Law in the UK

While the focus of this guide is on data protection, individuals can – through civil legal action – assert their privacy rights through various grounds of claim, including defamation, harassment, breach of confidence, and the tort of misuse of private information.

Laws Regulating Artificial Intelligence (AI)

AI systems typically involve the processing of personal data, making data protection law the most relevant regulation concerning AI and the closest thing the UK has to “horizontal” AI regulation. The Equality Act (EA) 2010 addresses discrimination, which is relevant to bias in AI. Sector-specific laws, such as financial regulations, also constrain the use of AI.

Enforcement and Penalty Environment

Data protection laws are enforced by the ICO, the UK’s data protection authority.

Depending on the provisions breached, the ICO can issue fines of up to GBP8.7 million or, in the case of undertakings, 2% of total worldwide annual turnover, whichever is higher; or up to GBP17.5 million or, in the case of undertakings, 4% of total worldwide annual turnover, whichever is higher.
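The two statutory maxima each combine a fixed cap with a turnover-based cap as a “higher of” calculation. A minimal sketch of that arithmetic (the function name and structure are illustrative only, not legal advice):

```python
def max_fine_gbp(turnover_gbp: float, higher_tier: bool) -> float:
    """Illustrative UK GDPR maximum-fine calculation.

    Standard tier: GBP 8.7m or 2% of total worldwide annual turnover, whichever is higher.
    Higher tier:   GBP 17.5m or 4% of total worldwide annual turnover, whichever is higher.
    """
    if higher_tier:
        return max(17_500_000, 0.04 * turnover_gbp)
    return max(8_700_000, 0.02 * turnover_gbp)

# For an undertaking with GBP 1bn turnover, the percentage cap exceeds the fixed cap.
print(max_fine_gbp(1_000_000_000, higher_tier=True))  # 40000000.0
```

For smaller organisations the fixed amounts dominate; the turnover-based caps only bite once 2% or 4% of worldwide turnover exceeds GBP8.7 million or GBP17.5 million respectively.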

Depending on the contravention, an individual affected may seek compensation and/or a compliance order through the civil courts in addition to his or her right to complain to the ICO.

An independent commissioner oversees the use by public authorities of the powers in the IPA 2016.

ICO

The ICO is the UK’s independent authority for information rights. It enforces regulations such as the UK GDPR, the DPA 2018, PECR, the Freedom of Information Act 2000, the Network and Information Systems Regulations 2018 and the IPA 2016.

The ICO has statutory duties, including monitoring and enforcing data protection laws, and promoting good practice, such as creating sector-specific codes of conduct such as the Age-Appropriate Design Code (AADC) (see 2.2 Sectoral and Special Issues).

The ICO maintains a register of data controllers. Organisations processing personal data must pay a data protection fee to the ICO unless an exemption applies (for example, for certain not-for-profit organisations).

The ICO’s powers are set out in 1.3 Administration and Enforcement Process. In sum, it conducts audits and investigations, with the authority to access premises, data and systems. It issues notices, including fines, to ensure compliance.

The UK GDPR’s extraterritorial scope empowers the ICO to act against foreign-based data controllers and processors.

Equality and Human Rights Commission (EHRC)

The EHRC enforces the EA 2010, with powers including investigations, binding agreements, action plans for breaches, and intervening in judicial reviews. It has a stated aim to examine the “impact of digital services and AI on equality and human rights”.

Sector-Specific Regulators With Relevance to AI

AI-related sector-specific laws are enforced by regulators such as the Financial Conduct Authority (FCA) for finance and the Competition and Markets Authority (CMA) for consumer law, each with distinct remits, constitutions and powers.

The ICO may investigate alleged contraventions of the UK GDPR or DPA 2018 in response to a complaint from an individual, or on its own initiative. An individual may request that the ICO exercise its enforcement functions under PECR, but such a request is not necessary for the ICO to act. The regulator has extensive freedom in how it uses its regulatory powers and resources, and does not respond to or resolve every complaint it receives.

The enforcement process of the ICO is regulated by Part 6 of the DPA 2018, which provides the ICO with the power to issue different kinds of notice including:

  • Information notices, which require organisations to disclose information needed to enable the exercise of the ICO’s functions.
  • Assessment notices, which require organisations to allow the ICO to assess their compliance with data protection law.
  • Enforcement notices, which require organisations to act (or abstain from acting) in a certain way – this may include a ban on the processing of personal data.
  • Penalty notices, which require organisations to pay a certain fine to the ICO.

Organisations may appeal to the First-tier Tribunal (Information Rights). If the Tribunal decides that the ICO’s decision was wrong as a matter of law, or that the ICO exercised its discretion wrongly, it can issue a substitute decision notice, which has the same legal status as the notice issued by the ICO. The Tribunal can only consider questions relevant to the law, not any wider dispute. Onward appeals on points of law lie to the Upper Tribunal.

Brexit

Post-Brexit, UK organisations processing EU residents’ data must continue to comply with the EU GDPR, including by appointing an EU representative where required. Similar considerations apply when processing data from residents of countries with strict data protection laws, such as Switzerland.

International Data Transfers

The main mechanisms set out in the EU GDPR (eg, adequacy decisions or EU Standard Contractual Clauses) are maintained in the UK and/or complemented by national decisions and templates (eg, the UK-US Data Bridge, the International Data Transfer Agreement template and the UK Addendum to the EU Standard Contractual Clauses). See more under 4. International Considerations.

The UK has also received two adequacy decisions from the European Commission which recognise that the UK ensures an adequate level of protection for personal data transferred to it under both the EU GDPR and the Law Enforcement Directive.

E-Privacy

PECR implements the EU e-Privacy directive. It covers topics such as cookies (see 2.2 Sectoral and Special Issues) and electronic marketing. The draft EU regulation that would repeal the directive, first proposed in 2017, has been in the negotiation stage at EU level since May 2021. Once adopted by the EU, the UK may decide to implement similar provisions, although it is not bound to as a non-EU member state.

Regional and Global Data Protection Initiatives

Global companies, operating both in the UK and internationally, navigate the challenges of complying with multiple concurrent data protection laws.

Beyond the adoption of laws akin to the EU GDPR in several countries, there are ongoing initiatives such as the modernisation of Council of Europe Convention 108 (CoE 108+), signed but not ratified by the UK, and the APEC Global Cross-Border Privacy Rules Forum. These initiatives aim to offer frameworks for global compliance, yet their impact remains limited compared to the EU’s GDPR.

Non-governmental Sector

Given the breadth of issues that data protection engages, several NGOs operate in this space. These include unions such as UNISON, bodies such as Worker Info Exchange and the Trades Union Congress (TUC), through to institutes such as the Ada Lovelace Institute, the Alan Turing Institute and Connected by Data. Civil society organisations are active in this area, such as:

  • Big Brother Watch, who have brought claims relating to Facial Recognition Technology and government monitoring of social media;
  • Open Rights Group, who have brought claims about the DPA 2018; and
  • Global Witness, who have filed complaints about algorithmic discrimination. 

European organisations are also increasingly active in this area, such as the pan-European consumer rights collective, BEUC.

Industry Self-Regulatory Organisations

Within the advertising industry, the following organisations are relevant:

  • the Committee of Advertising Practice;
  • the Advertising Standards Authority; and
  • the Interactive Advertising Bureau.

UK Data Protection System

The current data protection framework in the UK has largely remained the same after Brexit, although key changes are being discussed by the government (see 1.7 Key Developments and 1.8 Significant Pending Changes, Hot Topics and Issues).

Enforcement

Enforcement of data protection requirements by the ICO (see 1.2 Regulators, 1.3 Administration and Enforcement Process and 5.3 Significant Privacy and Data Protection Regulatory Enforcement or Litigation) is relatively low compared with that of EU regulators. For instance, during 2021-2022, the ICO issued no enforcement notices or criminal prosecutions and only four UK GDPR fines. Enforcement action has increased since the beginning of 2022; however, this mostly relates to breaches of PECR rules rather than UK GDPR infringements. The ICO has also favoured “reprimands” for UK GDPR breaches rather than enforcement notices.

Data Protection and Digital Information Bill

In March 2023, the Data Protection and Digital Information Bill was introduced in Parliament (see 1.8 Significant Pending Changes, Hot Topics and Issues). The stated aim is to facilitate business and increase flexibility in the use of personal data. The Bill has not become an Act at the time of writing.

UK Extension to the EU-US Data Privacy Framework

In October 2023, the UK Extension to the EU-US Data Privacy Framework (the “Data Bridge”) came into force. The framework is an opt-in certification scheme for US organisations, imposing enforceable data protection requirements. Once certified, US organisations on the Data Privacy Framework List can receive UK personal data through the Data Bridge. The framework is overseen by the US Federal Trade Commission and the US Department of Transportation, and administered by the US Department of Commerce.

Online Safety Act

The Online Safety Act (OSA) 2023, akin to the EU Digital Services Act, became law in October 2023. It applies to various services, including user-to-user and search services. The OSA 2023 primarily addresses platforms’ responsibilities concerning illegal content and introduces duties related to lawful but potentially harmful content. It has extraterritorial reach, like the UK GDPR, with stringent provisions, especially regarding child sexual abuse content, allowing the regulator (OFCOM) to mandate the scanning of private communications. OFCOM is tasked with creating enforcement Codes for the OSA 2023.

AI

While the EU focuses on the adoption of the AI Act, the UK has continued working on its national AI strategy, focusing on investment, AI governance and cross-sectoral impact. In March 2023, the government published an AI White Paper (see 5.6 Digital Technology Regulation/Convergence of Privacy, Competition and Consumer Protection Laws (Including AI)), and in November 2023, it hosted the AI Safety Summit, attended by representatives of over 25 countries. It also launched the “AI Safety Institute” to test the safety of emerging types of AI and to collaborate internationally with leading AI companies. In February 2024, the government (i) published its consultation response to the AI White Paper, which confirms that the government does not currently intend to introduce AI-specific legislation, and (ii) as part of its principles-based approach, published a guide to AI assurance techniques.

Digital Markets, Competition and Consumers Bill

See 5.6 Digital Technology Regulation/Convergence of Privacy, Competition and Consumer Protection Laws (Including AI).

Data Protection and Digital Information Bill

If the current version of the Bill is enacted, this will mean several changes to the data protection framework and a lowering of requirements for processing of personal data. Changes include:

  • a narrower definition of personal data, taking some pseudonymised datasets outside the scope of the UK GDPR’s protection;
  • a new legal basis for processing, “recognised legitimate interests”, which will provide predetermined gateways for certain types of processing. The list of such “recognised” bases can be added to by secondary legislation;
  • significant relaxations of the principle of purpose limitation in processing personal data;
  • a new test for the exercise of data subject rights which will make it easier for data controllers to delay and refuse to act on them;
  • the Senior Responsible Individual will replace the role of the Data Protection Officer and be compulsory only for public bodies or if the processing entails high risks for the data subjects;
  • Registers of Processing Activities will only be required for high-risk processing;
  • an increased use of “data bridges” to facilitate international transfers (see 1.4 Multilateral and Subnational Issues and 4. International Considerations). See also 5.6 Digital Technology Regulation/Convergence of Privacy, Competition and Consumer Protection Laws (Including AI); and
  • a power for the Department for Work and Pensions to order the scanning of the bank accounts of all recipients of government welfare support in order to detect fraud and error.

The UK’s data protection regime governs the relationship between those who control data by determining the means and purposes of processing (“data controllers”), those processing data on their behalf (“data processors”), and those who are subject to it (“data subjects”) – the identifiable living individuals to whom personal data relates.

Data Protection Principles

The UK GDPR sets out six general principles that must be complied with to secure the complete and effective protection of data subjects. The principles are interlinked and permeate all other provisions in the legislation.

Lawfulness, fairness and transparency means that controllers must be open with data subjects about how they are processing personal data and ensure that this is being done in compliance with data protection laws and in a way that is fair to data subjects.

Purpose limitation requires controllers to use data for “specified, explicit and legitimate purposes” and not to use data for further, incompatible purposes. In practice, it requires foreseeability for data subjects about how their personal data will be used as it places limitations on how a controller can use personal data. Cases are currently before the European courts which are testing the parameters of this principle, including when a “purpose” is specific enough.

Data minimisation is related to purpose limitation in that it mandates a controller to limit the collection of personal data to what is relevant and necessary to achieve a specified purpose.

Accuracy means that personal data must be kept accurate and up to date, and storage limitation stipulates that personal data must not be processed for longer than necessary to achieve the controller’s specified purposes.

The security principle requires that personal data be processed securely by means of appropriate technical and organisational measures.

These principles are underpinned by the “accountability” principle, which requires the data controller to document and demonstrate compliance with the principles. 

Legal Bases

A data controller must identify a legal basis for each purpose of processing it intends to carry out (see the six legal bases below). The legal bases act as a “gateway” to processing, legitimising the use of data. Without an appropriate legal basis, the processing will be unlawful.

The UK GDPR provides six exhaustive legal bases:

1. Consent: Under the UK GDPR, consent must be a “freely given, specific, informed and unambiguous” agreement to the processing of personal data. It demands a proactive, affirmative act and cannot be implied. Consent is also subject to conditions, including the requirement that it can be withdrawn as easily as it was given. Further conditions include prohibitions on bundling consent with other terms and on making access to a service dependent on consent. These conditions and thresholds may render consent unsuitable in contexts with an imbalance of power between the data controller and the data subject, such as employment.

2. Contract: The controller needs to consider whether processing personal data for a specific purpose is “necessary for the performance of a contract”. This does not mean strictly necessary, but the processing should be more than merely useful. This basis will be most appropriate for close business relationships, such as between a lawyer and a client.

3. Complying with a legal obligation: Where a controller needs to process personal data to comply with a common law or statutory obligation.

4. Protecting vital interests: A very narrow basis, covering only interests essential to a person’s life – in practice, matters of life and death.

5. Public interest: Applies to the performance of a task carried out in the public interest or in the exercise of official authority – in practice, mainly public sector functions such as government and law enforcement.

6. Legitimate interests: Unlike the other bases, legitimate interests can provide a basis for processing even where the controller and the data subject do not have a direct relationship. It is therefore often relied on by organisations such as data brokers. Owing to this expansive application, the basis is not absolute. Rather, the controller must consider (i) whether the processing of personal data is necessary for its legitimate interests (ie, more than merely useful, with less intrusive means considered) and (ii) whether its own interests are outweighed by those of data subjects – a balance that must be struck before any processing begins; if the controller’s interests are outweighed, the processing must not proceed. A controller should carry out a Legitimate Interests Assessment (LIA) to document how this balance has been met (see 2.1 Omnibus Laws and General Requirements).

Records of Processing Activities (ROPAs)

For accountability, data controllers must keep a Record of Processing Activities (ROPA). It should include the controller’s identity and contact details, the DPO’s contact details, the categories of data, the processing purposes, international data transfers and retention periods. Data processors must also keep a more concise ROPA.

Security of Processing

The UK GDPR has a risk-based approach which requires data controllers to identify the most appropriate technical and organisational measures to ensure the security of the personal data they process, in line with broader accountability requirements.

Data Breach Notification and Communication

Any personal data breach must be reported to the ICO within 72 hours unless it is unlikely to result in a risk to the rights and freedoms of individuals. If the breach is likely to result in a high risk to data subjects, data controllers must also inform the affected individuals. Organisations should maintain a register of all breaches for accountability.
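The notification duties scale with the assessed risk: every breach is recorded, ICO notification is required unless risk is unlikely, and affected individuals must be told where the risk is high. A minimal sketch of that decision logic (the names here are illustrative, not statutory terms):

```python
from enum import Enum


class BreachAction(Enum):
    RECORD_ONLY = "record internally"
    NOTIFY_ICO = "notify ICO within 72 hours"
    NOTIFY_ICO_AND_SUBJECTS = "notify ICO within 72 hours and inform affected individuals"


def breach_response(risk_to_individuals: bool, high_risk: bool) -> BreachAction:
    """Map the assessed risk of a personal data breach to the required response.

    All breaches are logged regardless of outcome; notification duties are
    additive as the risk assessment escalates.
    """
    if high_risk:
        return BreachAction.NOTIFY_ICO_AND_SUBJECTS
    if risk_to_individuals:
        return BreachAction.NOTIFY_ICO
    return BreachAction.RECORD_ONLY
```

In practice the risk assessment itself is the hard part; the point of the sketch is only that the two thresholds (any risk, high risk) trigger cumulative obligations.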

Data Processing Agreement

When a data controller engages a processor, it must put in place a contract that meets the requirements of Article 28(3) of the UK GDPR. Processors cannot delegate to sub-processors without the controller’s approval, and they are liable for sub-processors’ compliance.

Data Protection Officer (DPO)

Organisations must appoint a DPO when their core activities involve either:

  • the regular and systematic monitoring of individuals on a large scale; or
  • the processing on a large scale of special categories of data or data relating to criminal convictions and offences.

Either an employee or an external party can be appointed. In both cases, organisations must ensure that the DPO has the necessary knowledge, resources, independence, and access to information and decision-making processes to be able to carry out his or her tasks effectively. Such tasks include:

  • providing information and advice on data protection requirements;
  • monitoring compliance with data protection laws and related corporate policies;
  • participating in DPIA processes (see “Data Protection Impact Assessment (DPIA)” below); and
  • acting as a point of contact and co-operating with the ICO and any other supervisory authorities.

Whenever an organisation appoints a DPO, it should notify the ICO and make its contact details public.

Privacy by Design and by Default

Data protection by design and by default means organisations must “bake” data protection into all their processing activities from the design stage right through the life-cycle.

Data Protection Impact Assessment (DPIA)

A DPIA identifies and mitigates risks to individuals’ rights and freedoms in data processing. It is mandatory for high-risk processing, including automated decision-making, large-scale special category data processing and systematic monitoring of public spaces. The ICO provides a list of operations requiring a DPIA. The DPIA must detail processing specifics, assess necessity and risks, outline mitigations and be regularly reviewed for ongoing compliance.

Legitimate Interests Assessment (LIA)

The ICO considers an LIA to be “best practice” when a data controller relies on legitimate interests.

An LIA is a three-step test where a controller:

1. evaluates the legitimate interest pursued: this may be the exercise of a fundamental right, the pursuit of a broader/public interest, or any other interests that an organisation may legitimately pursue;

2. assesses the necessity of the processing for achieving such legitimate interest: this requires, in substance, the controller to consider whether less intrusive means are viable to achieve the same interest (including without processing personal data); and

3. performs a balancing test, to assess whether the envisaged processing is proportionate to the interference with the interests or fundamental rights and freedoms of the data subject.
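The three steps above are cumulative: failing any one of them means the legitimate interests basis is unavailable. A minimal sketch of an LIA record capturing that structure (the class and field names are hypothetical, for illustration only):

```python
from dataclasses import dataclass


@dataclass
class LegitimateInterestsAssessment:
    """Illustrative record of the three-step LIA described above."""
    purpose_is_legitimate: bool       # step 1: a legitimate interest is identified
    processing_is_necessary: bool     # step 2: no less intrusive means would achieve it
    balance_favours_controller: bool  # step 3: data subjects' interests are not overridden

    def passes(self) -> bool:
        # The basis is only available if all three steps are satisfied.
        return (self.purpose_is_legitimate
                and self.processing_is_necessary
                and self.balance_favours_controller)
```

Documenting each step separately, rather than recording a single yes/no outcome, is what allows a controller to demonstrate accountability if the assessment is later challenged.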

Data Subjects’ Rights

The UK GDPR provides data subjects with various rights over their data including rights:

  • to receive transparent information about data processing;
  • to access personal data;
  • to correct and erase personal data;
  • to restrict or object to the controller’s processing;
  • to port data from one controller to another; and
  • not to be subject to a decision based solely on automated processing.

These rights are complex and subject to extensive case law and regulatory action. Some pertinent examples include:

  • The rights to transparent information and to access data are foundational because if data subjects cannot find out how their data is being processed, they cannot exercise any of their rights. As a result, regulators often act in relation to compliance with transparency requirements and often mandate controllers to reform their privacy notices. For example, the Irish Data Protection Commission has acted against Meta (Facebook) for its privacy notices.
  • The right to access starts with data subjects obtaining confirmation from a data controller as to whether their personal data is being processed and to obtain access to the data being processed. In addition, data subjects can ask for details of the processing, including information about recipients of data, the purposes for processing and the logic and consequences of processing in the case of automated decision-making and profiling.
  • The parameters of the right to object – and its effectiveness – are before the High Court in Tanya O’Carroll v Meta, a case due to be heard in 2025. The case relates to an objection to Facebook’s use of data, including its profiling, for direct marketing purposes. If successful, the case could have consequences for the entire online advertising industry, as each platform would be required to provide users with the chance to opt out from their data being used for targeting purposes. 
  • The right to erasure (often known as the “right to be forgotten”) is often used by individuals to have their personal data erased, including erasure of inaccurate or out-of-date internet search results about them. The High Court considered this right in NT1 and NT2 v Google. Those cases confirmed the fact-specific nature of the right.

Special Category Data

The UK GDPR treats processing of certain data as requiring enhanced protection:

  • data revealing racial or ethnic origin;
  • data revealing political opinions;
  • data revealing religious or philosophical beliefs;
  • data revealing trade union membership;
  • genetic data;
  • biometric data (where used for identification purposes);
  • data concerning health; and
  • data concerning a person’s sex life or sexual orientation.

Such “special category” data can only be processed by fulfilling one of the specific conditions under Article 9 of the UK GDPR. Certain conditions, such as processing for employment and social security, health or social care or public health, contain additional rules in Part 1 of Schedule 1 of the DPA 2018. For processing for substantial public interest, one of the 23 conditions under Part 2 of Schedule 1 of the DPA 2018 must be met.

Children’s Data

Recital 38 of the UK GDPR provides that children “merit specific protection” regarding their personal data.

The AADC aids the interpretation of the UK GDPR. If children are likely to access an online service, compliance with the AADC is mandatory. The AADC includes 15 standards for protecting children’s data rights online, such as defaulting to high privacy settings, minimising data collection, turning geolocation settings off by default, conducting a DPIA, and avoiding nudge techniques.

Where consent is relied on to offer information society services directly to a child, the consent is only valid if the child is at least 13 years old; otherwise, parental consent is required. Companies must take reasonable measures to verify that consent.
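The age rule above reduces to a simple threshold check with a parental-consent fallback. A minimal sketch (function and constant names are illustrative; real services also need age and consent verification, which this does not model):

```python
DIGITAL_CONSENT_AGE_UK = 13  # the UK age threshold for consenting to information society services


def consent_is_valid(age: int, parental_consent: bool) -> bool:
    """A child of 13 or over can consent themselves; below 13,
    verified parental consent is required instead."""
    return age >= DIGITAL_CONSENT_AGE_UK or parental_consent
```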

Criminal Conviction Data

Article 10 of the UK GDPR provides that criminal conviction data can only be processed “under the control of official authority” or if a condition for processing in Schedule 1 of the DPA 2018 is applicable.

Employee Data

There are no employment-specific data protection regulations. Employers are likely to be data controllers, so must identify a lawful basis for processing employees’ personal data under Article 6 of the UK GDPR. If processing special category data, the employer must also identify a condition under Article 9(2) of the UK GDPR and consider the conditions in Parts 1 and 2 of Schedule 1 of the DPA 2018.

The ICO has released guidance for processing employment data (see 2.4 Workplace Privacy), including “Employment practices and data protection: monitoring workers” and “Employment practices and data protection: information about workers’ health”. The ICO currently has two pieces of draft guidance out for public consultation: one on keeping employment records and one on recruitment and selection. Unions have filed submissions to the ICO explaining their own interpretation, how it differs from the ICO’s, and the steps that employers should take to protect employees.

Financial Data

Financial data is not considered a special category of data under the UK GDPR. UK legislation regarding financial services requires companies to have robust data security practices, retain data for a mandated period and maintain confidentiality regarding financial services and transactions. 

Location Data

Location data, which is likely to be personal data under the UK GDPR, is also subject to strict PECR rules. Under PECR regulation 14, public communications providers and value-added service providers may only process location data that has been anonymised, or with the user’s consent where necessary to provide a value-added service; such consent must be freely given, specific, informed and withdrawable at any time.

Cookies and Other Tracking Technologies

Cookies and other tracking technologies have come under increasing scrutiny by the ICO, with the ICO issuing warnings for websites to come into compliance (see 2.3 Online Marketing for legal regulations). The ICO has previously issued clear guidance that organisations must make it as easy for users to “Reject All” advertising cookies as it is to “Accept All”.

Behavioural or Targeted Advertising

Behavioural and targeted advertising technology has faced legal claims and regulatory enforcement. For instance, Meta’s approach to behavioural advertising has been subject to multiple rounds of enforcement by European regulators and the CJEU. In the UK, Meta faces a claim for its approach to individual choices around behavioural advertising (O’Carroll v Meta). Other industries using behavioural advertising, such as the remote gambling industry, are subject to claims and regulatory action.

“Real-time bidding” (RTB) technology which facilitates targeted advertising is also subject to regulatory action in the UK and Europe, with regulators finding that such techniques may not comply with the UK/EU GDPR. The Belgian Data Protection Authority’s decision of 2 February 2022 on the AdTech industry’s “Transparency and Consent Framework” (TCF) was made to “restore order to the online advertising industry”. If upheld on appeal, the decision will affect the entire AdTech ecosystem. A judgment of the European Court of Justice on 7 March 2024 found that IAB Europe, as management body for the TCF, is a joint “data controller” under the GDPR for the TCF. The domestic courts in Belgium can now rule on the matter with the benefit of that judgment.

In June 2019, the ICO issued a report highlighting concerns regarding how RTB and AdTech comply with the GDPR.

The European Commission has published studies looking at alternatives to RTB, as the future of the industry is uncertain in face of the regulatory action.

Online Safety Act

See 1.7 Key Developments.

B2C

For business to consumer (B2C) direct marketing, PECR bars unsolicited electronic communications without prior consent, unless the consumer provided their contact details during a purchase, the marketing relates to a similar product or service, and an opt-out is offered (the “soft opt-in”). Data subjects can object to direct marketing at any time under Article 21.2 of the UK GDPR.

B2B

For business to business (B2B) direct marketing, the requirements are less stringent and such marketing can be done on an “opt out” basis. The rules require that the sender identify itself and provide contact details. Sole traders and some partnerships have the same rights as consumers in B2C.

Data subjects have the right to object at any time (Article 21.2 UK GDPR).

Cookies

The use of cookies and other tracking technologies such as pixels is regulated by PECR. When cookies process personal data, the UK GDPR also applies.

To use cookies, organisations must inform users, explain the cookies’ purpose and obtain GDPR-standard consent, unless the cookies are used solely to transmit a communication (eg, to deliver a website) or are “strictly necessary” for a service the user has requested.

The “strictly necessary” requirement has been narrowly defined as “essential”. The ICO states that “cookies that are helpful or convenient but not essential, or that are only essential for your own purposes, will still require consent”. This approach is consistent with CJEU case law.

The ICO has exclusive jurisdiction over PECR compliance, and individuals cannot bring private actions to force compliance with PECR (contrasted with compliance orders available under the DPA 2018). Individuals can, however, bring civil actions for damages caused by a breach of PECR. 

AdTech

AdTech encompasses online advertising technology involving data processing, cookies and similar tools, making the UK GDPR and PECR applicable. AdTech has faced scrutiny for UK GDPR compliance issues, including challenges in identifying legal bases for processing and ensuring security. The mechanisms the industry has deployed to comply with the data protection regime are also reported to breach the UK GDPR (see 2.2 Sectoral and Special Issues).

Workplace Monitoring

As most forms of employee monitoring involve the processing of personal data, such activities must comply with data protection requirements, as well as other relevant laws, notably the EA 2010.

Requirements

Before monitoring, employers should explore less intrusive methods. Technologies such as keystroke monitoring may be excessive. Employers must establish a lawful basis for processing data, especially special category data, with high-risk processes requiring DPIAs. Clear information must be provided to employees before deployment, with covert surveillance generally prohibited unless suspicions of criminal activity exist.

Article 22 of the UK GDPR contains a right for individuals not to be subject to solely automated decisions, including profiling, which significantly impact them. The consequence of the right is unclear but is often interpreted as requiring a “human in the loop”. This is crucial in employment as e-recruitment and robo-firing practices increase.

An employer must comply with PECR regulation 6 if it accesses, or attempts to access, the information stored on an employee’s computer.

UK Labour Organisations

The TUC asserts that while some monitoring is normal, there are legal limits on intrusive monitoring. With the rise of cheap monitoring technology, the TUC calls for unions to have a legal right to be consulted on electronic monitoring, and demands an update to the Employment Practices Code. The UK GDPR recommends that controllers consult data subjects or their representatives when carrying out a DPIA, which enhances its effectiveness; however, because consultation remains at the controller’s discretion, the right is weak in practice.

Whistleblowing

Whistleblowing typically involves the processing of personal data. In the UK, it must comply with both data protection requirements and the Public Interest Disclosure Act 1998. In March 2023, the government launched a review of the whistleblowing framework.

Private Litigation

Individuals are entitled to pursue a court case for non-monetary relief (in the form of “compliance orders” in Section 167 DPA 2018) and monetary relief for material and non-material (including distress) damages (Article 82 UK GDPR and Section 168 DPA 2018).

The UK and European courts determine damages for UK/EU GDPR breaches on a case-by-case basis. Compensation is not automatic for a GDPR infringement; some damage is required. Low-level distress cases have resulted in low-value damages of GBP250. Most low-level GDPR damages claims therefore go to the county court.

Claims involving psychiatric or psychological injury have received substantial awards but are often combined with other grounds of claim, such as for breach of confidence and misuse of private information, resulting in “global” awards.

Both the contravention and its role in causing any alleged loss or damage must be proved on the balance of probabilities – ie, the court must be satisfied that it is more likely than not to have occurred.

For class actions, see 5.3 Significant Privacy and Data Protection Regulatory Enforcement or Litigation.

ICO Enforcement

In 2022/2023, the ICO carried out 71 audits. These were mostly addressed to the health (22), criminal justice (12) and private sectors (11), resulting in 1,500 recommendations. The main issues addressed by such recommendations included ROPAs, DPIAs, transparency measures (such as information notices) and privacy management. The focus areas for 2023/24 auditing include the use of AI in recruitment, financial services, and data sharing in child protection.

In 2023, the ICO imposed a total of 18 fines, most of which related to enforcement of PECR rules, specifically on unsolicited calls and messages. In the same year, the ICO also issued 37 reprimands and 18 enforcement notices.

The ICO is not empowered to award compensation to individuals.

Part 2 of the IPA 2016 allows law enforcement agencies to issue warrants for intercepting communications, including international mutual assistance agreements. These warrants, issued by the Secretary of State and approved by a Judicial Commissioner (known as the “Double Lock process”), must be necessary, proportionate and for a legitimate purpose.

Under Part 3 of the IPA 2016, law enforcement agencies can also access data directly from a person that relates to a telecommunications system or from a telecommunications system.

Under Part 5 of the IPA 2016, a law enforcement chief can issue a warrant for accessing data by interfering with equipment (ie, hacking). The Double Lock process also applies.

Part 3 of the DPA 2018 regulates personal data processing under these powers.

Beyond the IPA 2016, various statutes provide search warrant provisions allowing law enforcement to compel data production or access, for example, the Police and Criminal Evidence Act 1984.

The IPA 2016 grants intelligence services powers for data access, including targeted equipment interference (Part 5) and bulk interception, acquisition of communications data, and equipment interference (Part 6). The Double Lock process applies.

The Investigatory Powers Commissioner oversees these powers, and the Investigatory Powers Tribunal handles complaints related to IPA 2016 powers.

Under UK law, there is no specific provision allowing UK entities to gather and transmit personal data related to foreign government access requests. However, in certain cases, UK-based entities might justify this under Articles 6 and 49 of the UK GDPR concerning legitimate interests. Nonetheless, they must weigh individual rights against any such interests.

In addition to possessing a legal basis for the transfer, UK entities will need to comply with Chapter V of the UK GDPR (see 4.2 Mechanisms or Derogations That Apply to International Data Transfers).

In 2021, the European Court of Human Rights found that the UK government’s data access regime violated Article 8 of the Convention. The IPA 2016 addressed these concerns by providing more oversight of law enforcement and the intelligence services. Debate continues as to whether these changes are sufficient.

Concerns have arisen about “digital strip searches”, where police access the full content of a phone in criminal investigations, with a 2020 ICO report citing legal clarity issues.

The UK government’s 2023 proposed amendments to the IPA 2016, including expanded internet records access, have also faced criticism from civil society organisations.

Restricted Transfers

Any transfer of personal data outside the UK – that is, to a third country or to an international organisation – is a restricted transfer, subject to specific rules under the UK GDPR and DPA 2018.

Equivalent Level of Data Protection

In general, an international data transfer can only be made if the recipient country or international organisation ensures an equivalent level of protection to that of the UK, including data subjects’ rights and legal remedies (see 4.2 Mechanisms or Derogations That Apply to International Data Transfers).

International Data Transfers From the UK

Countries/regions with adequate data protection levels include the EU, Andorra, Argentina, Faroe Islands, Gibraltar, Guernsey, Isle of Man, Israel, Jersey, New Zealand, Republic of Korea, Switzerland and Uruguay. Transfers to these destinations can proceed without further safeguards.

Partial adequacy decisions exist for Canada, Japan and the USA. Without adequacy, a Transfer Risk Assessment is needed, along with identifying and implementing appropriate safeguards listed in Article 46 of the UK GDPR (eg, UK Binding Corporate Rules, International Data Transfer Agreement, International Data Transfer Addendum). Exceptions may apply under Article 49 of the UK GDPR.

International Data Transfers to the UK

EU to UK international data transfers are permitted under the European Commission’s 2021 adequacy decisions, which run until mid-2025 and are extendable for up to four more years. Other countries with strict data protection laws modelled on the EU GDPR have their own rules for transfers to the UK; the suitable transfer mechanism depends on the circumstances of each case.

Bespoke Contract for a Specific International Transfer

International data transfers may also take place, with the authorisation of the ICO, when the data exporter and the data importer draft a bespoke data transfer contract specific to an international data transfer they wish to carry out (Article 46.3 UK GDPR).

“Compelling Legitimate Interests” Derogation

The ICO needs to be notified when a one-off international data transfer is carried out for a compelling legitimate interest of the data exporter (Article 49.1 UK GDPR).

There are no data localisation requirements under UK law. Rules on restricted transfers apply (see 4.1 Restrictions on International Data Issues, 4.2 Mechanisms or Derogations That Apply to International Data Transfers, 4.3 Government Notifications and Approvals).

UK law does not require that software codes, algorithms, encryption algorithms or keys, or similar technical details be shared with the government.

Most evidence will contain some personal data. When responding to requests for disclosure of personal data, organisations must balance the need for evidence against the protection of personal data (paragraph 5(3)(c) of Schedule 2 DPA 2018).

UK Internal Investigations

Organisations can share personal data to prevent or detect a crime. Requests often come from the Police or other crime prevention entities such as the Department for Work and Pensions Benefit Fraud Section. In such cases, organisations may not have to inform individuals or provide access to shared data if it could compromise an ongoing investigation.

Foreign Government Data Requests or Foreign Litigation Proceedings

Balancing the need for evidence and personal data protection is crucial when receiving requests for disclosing data from abroad in (prospective) legal proceedings. Parties should redact irrelevant personal data, but overzealous redaction might challenge disclosure adequacy and increase litigation costs. Excessive data disclosure may lead to complaints by data subjects to the disclosing organisation, the ICO, or compensation claims in national courts for distress caused by the disclosure.

See also 3.3 Invoking Foreign Government Obligations.

There are no blocking statutes in the UK with a direct effect on data protection rules.

Many new and emerging technologies are dealt with under laws, regulations or guidance issued by the UK government and various regulatory authorities.

Big Data Analytics

There are no specific laws on big data analytics in the UK. In its 2014 report on big data, the ICO recommended that companies using big data analytics employ a combination of tools such as privacy impact assessments (now DPIAs), transparency and data minimisation.

Automated Decision-Making

Under Article 22 of the UK GDPR, a data subject cannot be subjected to solely automated decision-making, including profiling, where it produces a legal or similarly significant effect on them. Children are afforded additional protection against their personal data being used for marketing or to create online profiles.

AI

See 1.7 Key Developments.

Internet of Things (IoT)

The UK government has passed the Product Security and Telecommunications Infrastructure (PSTI) Act 2022, and the PSTI (Security Requirements for Relevant Connectable Products) Regulations 2023 are due to come into force in April 2024. The legislation introduces strict compliance measures for businesses dealing in IoT products, including building cybersecurity into products by design, banning default passwords, mandating the reporting of security issues, and shifting the burden of security from consumers to businesses. It also requires clear, transparent information to be provided to users at the time of buying IoT products.

Autonomous Vehicles

In November 2023, the UK government announced a draft law to regulate autonomous vehicles. The Automated Vehicles Bill is expected to be technologically neutral legislation including provisions such as a threshold defining “autonomous vehicles”, robust safety requirements, liability rules and rules around marketing.

Biometric Data and Facial Recognition

Under the UK GDPR, biometric data is special category data when used for identification. In August 2023, the ICO issued draft guidance on biometric data, defining biometric data and recognition systems such as fingerprint or facial recognition. The guidance mandates DPIAs for high-risk biometric data processing, stating that the use of recognition systems always falls into this category. It emphasises explicit consent for processing biometric data, with exceptions for the prevention or detection of unlawful acts and for research where consent is impractical. Accuracy, discrimination and security are highlighted as the primary risks.

The UK does not currently have a specific law on facial recognition, but existing laws have produced cases and regulatory enforcement concerning the technology. In 2022, the ICO fined Clearview AI GBP7.5 million for serious breaches of data protection law, including the use of biometric data for facial recognition. In a First-tier Tribunal decision of 17 October 2023, Clearview AI won its appeal against the fine on jurisdictional grounds. The ICO has appealed the decision.

In 2020, the Court of Appeal held the use of Automated Facial Recognition (AFR) by South Wales Police to be unlawful. The ICO has also engaged with private AFR providers, such as Facewatch, to ask those companies to bring their technology into compliance with the UK GDPR.

Geolocation Data

If geolocation data and tracking fall within the definition of location data under PECR, providers must comply with the requirements to obtain consent and to provide users with adequate information about the processing.

Location data about individuals usually constitutes personal data. The AADC sets further standards for geolocation tracking of children’s data: companies must switch off geolocation tracking by default unless they can provide a clear reason, children must be given an obvious sign when tracking is on, and any option that displays a child’s geolocation to other users must revert to “off” after each session.

Drones

The use of drones in the UK is regulated by the Civil Aviation Authority (CAA). The CAA provides points to consider when the use of drones may involve processing personal data (photos/videos). Drone users and owners are also required to comply with the UK GDPR and the DPA 2018 where applicable.

Dark Patterns

In 2023, the ICO and CMA released a joint position paper on Dark Patterns which may result in enforcement actions by the ICO and CMA. The position paper discussed specific harmful practices such as “nudge and sludge” techniques, “confirm-shaming”, “biased framing”, “bundled consent” and “default settings”.

Disinformation, Deepfakes or Other Online Harms

See 1.7 Key Developments and 2.2 Sectoral and Special Issues.

Many UK organisations establish protocols for fair data practices and assess risks of disruptive technologies; for example, IBM and Aimia have ethical frameworks.

The Market Research Society’s Fair Data Accreditation Board certifies UK GDPR-aligned data handling. The UK government, in its 2021 National AI Strategy, commits to regulating AI in line with FAIR data principles (ie, Findable, Accessible, Interoperable and Reusable).

Enforcement by the ICO

The ICO’s record of enforcement across 2023 relates mostly to PECR enforcement rather than UK GDPR breaches. The ICO has focused on “reprimands” instead of enforcement action.

Notable ICO data protection enforcement actions include:

  • The ICO fined Clearview AI GBP7.5 million for unlawfully scraping personal data to use in its image recognition database. Clearview appealed this decision to the First-tier Tribunal on jurisdictional grounds. On 17 October 2023, the First-tier Tribunal held that UK data protection law could not have extraterritorial effect on the facts of the case and found for Clearview. The ICO has appealed the decision to the Upper Tribunal.
  • On 23 February 2023, the First-tier Tribunal handed down a decision partly allowing an appeal by credit reference agency Experian against an ICO GDPR enforcement notice issued in October 2020 in relation to data processing operations of Experian’s marketing business. The decision has wider consequences for controllers’ reliance on “legitimate interests” for processing for direct marketing and the provision of information about data processing. The ICO has appealed this decision to the Upper Tribunal.
  • On 4 April 2023, the ICO announced a fine against TikTok of GBP12.7 million for breaching the UK GDPR, in particular for failing to protect children’s privacy. The ICO had previously issued a “notice of intent” to fine TikTok GBP27 million. Despite the reduction, the fine is the third highest issued by the ICO. TikTok has appealed the fine.

Class Actions in Data Protection

In Lloyd v Google, the Supreme Court held that an opt-out representative action could not proceed. Lloyd precluded such representative actions under the DPA 1998 because there was no automatic right to compensation for the mere fact of a contravention of the DPA 1998 or for loss of control of data; instead, an assessment of the damage (most likely, distress) suffered by each affected individual was required. As such, there was no “class” for the purpose of representative actions. Several proposed “representative” actions for data protection breaches were consequently discontinued or otherwise disposed of, such as SMO v TikTok, filed by the former Children’s Commissioner for England on behalf of children in the UK.

In Prismall v Google UK, the High Court struck out an opt-out class action for data misuse. The claim was distinguishable from Lloyd in that it related to the tort of misuse of private information rather than contraventions of the data protection regime. The claimant in Prismall is seeking to appeal the decision.

The position is different in the EU and in flux, due to the Representative Actions Directive. That Directive allows for representative actions for data protection claims.

Data protection due diligence is crucial. Companies must assess compliance with data protection laws, scrutinise data security practices and understand data processing activities. Key focuses for the acquiring company include the target’s compliance with laws and internal policies, past security breaches, data flows, system architecture, internal and external policies, and agreements with third parties.

While there is no specific disclosure guidance in the UK, listed companies are obliged by law to report on principal risks, which should properly include cybersecurity considerations.

The SEC’s new cybersecurity disclosure requirements for US companies are also relevant for UK companies, which are likely to face increased scrutiny if they have US companies as customers.

The UK government has recently published a draft Cyber Governance Code of Practice which, inter alia, proposes establishing formal reporting, so this is an evolving area.

See 1.7 Key Developments.

Digital Markets, Competition and Consumers Bill

In April 2023, the UK government introduced the Digital Markets, Competition and Consumers Bill. The Bill aims to significantly increase the CMA’s powers to enforce UK consumer protection law, extend its investigative and enforcement powers to capture extraterritorial antitrust conduct, and establish an ex-ante regulatory framework for digital markets by imposing conduct requirements based on fair trading objectives and targeted pro-competitive interventions.

AI

In March 2023, the UK government released an AI White Paper focused on ushering in a “pro-innovation” and “context-specific” AI regulatory regime. The White Paper proposes regulating AI through existing regulators such as the ICO, the FCA and the CMA, rather than creating bespoke AI legislation as in the EU. The government’s response to the consultation was released in February 2024.

Data Protection and Digital Information Bill

The Data Protection and Digital Information Bill will, if implemented, update provisions of the UK GDPR, the DPA 2018 and PECR to provide an updated privacy framework for companies that process the personal data of UK residents (see 1.8 Significant Pending Changes, Hot Topics and Issues). Companies processing data in Europe or relating to Europeans should also consider the EU GDPR.

Looking forward, UK companies will need to monitor the forthcoming UK data reforms and ensure they are adhering to existing laws and guidelines – including those that apply to AI. Companies engaged in international transfers of data will have to keep abreast of international developments relating to data transfers and ensure that such transfers are being conducted lawfully. Companies should ensure that they have robust cybersecurity practices in place. Companies involved in online advertising should pay particular attention to their use of cookies, and to UK and EU developments regarding behavioural and targeted advertising.

While class actions in the UK for data protection look increasingly difficult, controllers and data subjects should also keep an eye on a suite of cases going through the courts. These include claims for monetary as well as non-monetary relief. For instance, O’Carroll v Meta will be significant for online advertising. The CJEU will also continue to consider the “law of everything” as part of its case docket, which will undoubtedly affect the UK approach to the GDPR. 

AWO

2 John Street
WC1N 2ES
UK

+44 (0)20 8080 3008

press@awo.agency
www.awo.agency