TMT 2021

Last Updated February 19, 2021

UK

Law and Practice

Author



Deloitte Legal combines market-leading lawyers, consultants and technology experts to provide clients with new solutions to legal problems. Whether that be legal advice delivered in a more effective way, assistance in harnessing the considerable benefits created by advancements in legal technology, or advice on how to create a best-in-class in-house legal function, the firm has the breadth and depth of expertise to advise on the challenges that its clients face.

There is increasing regulation of cloud services through a wide variety of legislative provisions. These do not specifically relate to cloud services but have a considerable impact on such services. Key legislation is set out below.

UK GDPR

The Data Protection, Privacy and Electronic Communications (Amendments, etc) (EU Exit) Regulations 2019 (SI 2019/419) (the “DP Brexit Regulations”) introduced a new UK GDPR, with the GDPR now known as the EU GDPR in the UK. Schedule 1 of the DP Brexit Regulations amends the retained EU law version of the GDPR. Schedule 2 amends the Data Protection Act 2018 (“DPA 2018”), including replacing the definition of the "GDPR" in the DPA 2018 with a definition of the "UK GDPR".

In practice, the UK GDPR also includes the provisions of the applied GDPR, unless the context requires otherwise.

DPA 2018

The DPA 2018 remains in place, effectively subordinate to the UK GDPR. Organisations will need to bear in mind that there are two legal texts to consider, where relevant: the UK GDPR and the DPA 2018.

EU GDPR

From the end of the UK-EU transition period, the retained EU law version of the GDPR applies in the UK, along with the DPA 2018.

As the EU GDPR will continue to have extra-territorial effect, it may continue to apply to UK controllers or processors that have an establishment in the EU, that offer goods or services to data subjects in the EU, or that monitor the behaviour of data subjects in the EU, insofar as that behaviour takes place within the EU.

Legislation Applicable to Cloud Computing in Certain Industries

The Network and Information Systems (NIS) Regulations 2018

These apply to two groups of organisations:

  • "operators of essential services" (energy, transport, health, water and digital infrastructure); and
  • relevant "digital service providers", which
    1. provide online search engines, online marketplaces and/or cloud computing services,
    2. have their head office in the UK,
    3. have more than 50 staff, and
    4. have a turnover of more than EUR10 million.

Financial services

The following legislation only applies to cloud computing providers in the financial services sector.

Sarbanes-Oxley Act 2002 (SOX)

SOX is a US federal law which aims to increase transparency within the investment industry. However, its focus is on reforming internal control processes and the manner in which these are audited.

Markets in Financial Instruments Directive (MiFID II)

The MiFID II Directive (2014/65/EU) and the Markets in Financial Instruments Regulation (Regulation 600/2014) (MiFIR) (retained in UK law by way of statutory instrument as UK MiFIR), collectively referred to as “MiFID II”, impose a wide range of conduct of business and organisational requirements on regulated firms. These include obligations relating to outsourcing, confidential information, record-keeping and business continuity.

Financial Conduct Authority (FCA) Regulations:

  • firms subject to regulation by the FCA also have to comply with FCA security requirements;
  • under the regulatory regime applying to financial institutions, firms have a responsibility to assess the risks of data loss and take reasonable steps to minimise the risks of this loss occurring;
  • the FCA has issued guidance to clarify the requirements on firms when outsourcing to the cloud and other third-party IT services;
  • the European Banking Authority (EBA) has also published guidance for financial services organisations which covers outsourcing to the cloud – see EBA revised guidelines on outsourcing arrangements (25 February 2019);
  • within the insurance sector the Solvency II Directive (2009/138/EC) (“Solvency II” or “Solvency 2”) contains similar requirements.

Key Issues for Cloud Computing

The current key issues to consider when processing personal data within the cloud are as follows.

US cloud providers

On 16 July 2020, the ECJ ruled that the EU-US Privacy Shield personal data transfer mechanism for transfers from the EEA to the US was invalid. Transfers of personal data under the Privacy Shield are now unlawful, and organisations should seek alternative data transfer mechanisms such as the EU standard contractual clauses (SCCs) when working with US cloud providers. 

Adequacy

The UK left the EU without an adequacy decision in place at the end of the UK-EU transition period on 31 December 2020. A temporary "bridging mechanism" is in place for personal data transfers from the EU to the UK: for four months from 1 January 2021 (extended by a further two months unless either party objects), or until an adequacy finding is made for the UK if earlier, the UK will not be treated as a third country for the purposes of personal data transfers from the EU. Recent reports suggest that the EU Commission has made a draft adequacy decision in relation to the UK.

At the end of the temporary bridging mechanism, if an adequacy decision is not in place, the UK's status as a "third country" will have important consequences for incoming data flows from the EU. Appropriate safeguards (see Article 46 of the EU GDPR) will need to be in place in relation to EU/UK data transfers, such as SCCs or binding corporate rules (BCRs).

Required contractual terms

Large cloud providers often do not allow for much flexibility with their standard data processing terms. Although any such terms should contain the minimum protections required to cover any legal data protection requirements, these are often not adequate in relation to the sensitivity of the data. However, if pushed, some movement can be expected on legal terms and operational measures – for example, greater clarity on service locations, data transfers and subcontractors (ie, the opportunity to object). 

Risk and Liability

For public blockchains such as that on which bitcoin operates, the technology tends to be free to use, based on open source and reliant on the community of users to provide computing power by setting up their own nodes. As such, they are usually provided on an “as is” basis and the developer offers very little in the way of representations, warranties or other commitments.

The actual rules that govern the operation of public blockchains can be quite piecemeal and located in, for example, chat forums, operating documents and other agreements.

Other key risks of using a public blockchain include security and confidentiality – its operation cannot be stopped or otherwise controlled by any one user in the event of a breach.

Many of these issues can be avoided by using a private blockchain, as the allocation and determination of risk and liability is more straightforward. The governance model can be pre-defined, and agreements between the developer and participants will allocate the roles, risks and liabilities between the parties.

Intellectual Property (IP)

IP rights issues arising in the context of blockchain are fairly analogous to those arising in traditional software development agreements and relate to the ownership and licensing of IP in:

  • the back-end blockchain software;
  • the user-facing app; and
  • the content or data that sits on the blockchain itself.

The ownership and licensing of IP in these areas will depend on, among other factors, how “off-the-shelf” the development is, how much customisation and/or configuration is done, and the intended use of the blockchain.

Due diligence should be conducted around any use or incorporation of open-source software or other third-party software to avoid unintended infringement claims or incorporation of “copyleft” software into a private blockchain.

Copyleft refers to a variety of open source software which has a viral effect, so that any modifications to it, or works derived from it, that are distributed to third parties must be licensed on the same terms. This includes the obligation to make the source code (including any proprietary aspects of the modifications/derivative works) available to the public. Note that under the GNU AGPL licence, making a modified/derivative work available to users over a network (ie, cloud/software as a service) also triggers the copyleft obligations.

The right of copyright holders to communicate work to the public should be carefully considered when working with blockchain, as the ease of use of blockchain as a file management and file transfer/sharing mechanism can make it a potential conduit for the facilitation of copyright infringement.

Data Privacy

Some of the most fundamental (and most valued) features of blockchain are, prima facie, at odds with some of the principles underpinning UK data protection law.

Key principles such as the data subject's right to erasure and right to rectification are complicated by the fact that data held using blockchain technology is generally immutable. Every node on the network has a copy of the ledger and therefore any data held on it, and there is no centralised “controller” of the data.

However, this is not to say that the two are completely incompatible. Irreversibly encrypting data can have the same or a similar effect as deleting it, and supplementary corrective statements can be used to rectify any inaccuracies.

By anonymising personal data wherever possible or, even better, avoiding storing personal data on the blockchain in the first place, the risks are significantly reduced, as truly anonymised data does not fall within the remit of data protection law.
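By way of illustration only, the minimal Python sketch below shows one commonly discussed approach, sometimes called "crypto-shredding": personal data is encrypted with a per-record key before it is written to an (assumed) immutable ledger, and later deleting the key renders the on-chain record effectively unreadable, which may achieve a similar effect to erasure. The ledger and key-store objects here are hypothetical placeholders, and whether this approach satisfies the right to erasure in any given case remains a question of legal judgement.

```python
# A minimal sketch (not production code) of "crypto-shredding" for ledger data.
# The key store and ledger below are hypothetical stand-ins.
from cryptography.fernet import Fernet

key_store = {}   # off-chain key store (hypothetical); keys can be deleted on request
ledger = []      # stand-in for an append-only blockchain ledger

def write_record(record_id: str, personal_data: bytes) -> None:
    key = Fernet.generate_key()
    key_store[record_id] = key
    ledger.append((record_id, Fernet(key).encrypt(personal_data)))  # only ciphertext is stored "on-chain"

def read_record(record_id: str) -> bytes:
    key = key_store[record_id]  # raises KeyError once the key has been shredded
    ciphertext = next(c for rid, c in ledger if rid == record_id)
    return Fernet(key).decrypt(ciphertext)

def erase_record(record_id: str) -> None:
    # "Erasure": the immutable ciphertext stays on the ledger, but without the key
    # it can no longer be read, which may achieve a similar effect to deletion.
    del key_store[record_id]

write_record("alice-001", b"Alice Example, 1 High Street")
print(read_record("alice-001"))
erase_record("alice-001")
```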

When using a private permissioned blockchain, participants must obtain authorisation before they can access or view the data stored on that blockchain. This means that there is a greater level of oversight and control of where any personal data is going.

Service Levels

As mentioned above, it is very unlikely that concrete service level commitments will be provided in relation to public blockchains as these tend to be provided on an “as is” basis.

The likelihood that vendors setting up private or bespoke blockchains will commit to performance metrics depends, as with other software vendors, on the demands of the users or participants and the operating model of the blockchain.

In a more off-the-shelf solution that is offered on a “one to many” basis, vendors are less likely to offer concrete service levels. However, if a customer is relying on the availability of the blockchain for a crucial part of its business, this approach is unlikely to be acceptable.

Jurisdictional Issues

As the nodes on a blockchain, particularly a permissionless and public blockchain, can theoretically be located anywhere in the world, transactions completed via blockchain frequently cross jurisdictional boundaries. Servers are also decentralised, so when something goes wrong it can be difficult to pinpoint the exact location of the breach or failure and therefore what laws should apply.

Because of the pseudonymous and decentralised nature of blockchains, often involving participants located across multiple jurisdictions, the choice of governing law and forum should be considered very carefully. Unless these issues are agreed in advance, complicated jurisdictional and conflict of laws procedures could mean disputes go unresolved or end up in unwelcome or unfavourable forums for resolution.

Big data, machine learning and artificial intelligence (AI) projects involve working with extremely large data sets and novel technologies; as such, it is important to manage these projects carefully to ensure compliance with legal obligations throughout the entire project lifecycle. Below, we have addressed some of the key legal challenges and potential solutions.

Ethics

Companies performing sophisticated data analytics or using machine learning/AI on personal data must comply with data protection legislation and the Equality Act 2010. Any use of AI must comply with the data protection principle of fairness, but recent reports publicised by the UK’s Centre for Data Ethics and Innovation (CDEI) highlight that eliminating bias and ensuring fairness in algorithms is extremely difficult. In part because of the risk of algorithmic bias, the GDPR and UK GDPR (as applicable) provide a right of review to data subjects who are unhappy with a decision made solely by an algorithm (automated decision-making). Furthermore, companies should be aware that using algorithmic decision-making which disproportionately impacts individuals with a protected characteristic may be a breach of the Equality Act 2010.

Companies should ideally take steps to mitigate algorithmic bias in the design-and-build phase of a project. This may include conducting a data validation exercise to ensure data which has been captured is not redundant, incomplete or inconsistent, or carrying out an ethics assessment in which boundaries for acceptable and unacceptable use cases for an AI are established. Furthermore, companies should also try to ensure that datasets used for training an AI are diverse.
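As an illustration of the kind of data validation exercise described above, the short Python sketch below (using pandas, with hypothetical column names such as "sex" and "hired") surfaces missing values, duplicate rows and obvious imbalance across a protected characteristic. It is a minimal starting point, not a substitute for a full ethics or bias assessment.

```python
# A minimal pre-training data validation sketch; column names are hypothetical.
import pandas as pd

def validate_training_data(df: pd.DataFrame, protected: str, label: str) -> dict:
    """Surface obvious data-quality gaps and imbalance before training."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_by_column": df.isna().sum().to_dict(),
        # How well is each group represented in the dataset?
        "group_counts": df[protected].value_counts(dropna=False).to_dict(),
        # Crude disparity check: positive-outcome rate per protected group.
        "positive_rate_by_group": df.groupby(protected)[label].mean().to_dict(),
    }

df = pd.DataFrame({
    "sex": ["F", "M", "M", "F", "M", None],       # hypothetical protected characteristic
    "years_experience": [5, 3, None, 7, 2, 4],
    "hired": [1, 1, 0, 1, 0, 0],                  # hypothetical outcome label
})
print(validate_training_data(df, protected="sex", label="hired"))
```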

Data Protection

Generally, larger datasets are at higher risk of experiencing a data breach, the costs of which can be substantial. Where data processing is likely to result in a high risk to individuals, a company is required to carry out a Data Protection Impact Assessment (DPIA) under the GDPR and UK GDPR (as applicable). The ICO considers high-risk processing activities to be those which involve large scale collection of "special categories of data" or those which use systematic and extensive profiling or automated decision-making to make significant decisions about people. Where AI or big data is being used for profiling purposes, this is likely to be considered a high-risk processing activity, necessitating a DPIA.

The GDPR and UK GDPR (as applicable) require companies to use technical and organisational measures to safeguard data subjects’ rights and implement the data protection principles enshrined in the GDPR and UK GDPR (as applicable). This is known as "data protection by design". Failure to incorporate data protection by design could render the company vulnerable to a data breach and may be considered an aggravating factor should any litigation arise out of a data breach.

It is important to consider data subjects’ rights from a project’s outset and ensure that projects are designed with data subjects’ privacy at the forefront. Companies can mitigate the risk by using encryption techniques, anonymisation (combined with an assessment of the risk of re-identification), ensuring that information security systems are up to date and adding access controls to the database.
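As a simple illustration of one such technical measure, the Python sketch below applies keyed pseudonymisation (HMAC-SHA256) so that direct identifiers are replaced with consistent tokens while the secret key is held separately under access controls. Note that pseudonymised data may still be personal data under the UK GDPR, so an assessment of the re-identification risk is still required; the field names used are hypothetical.

```python
# A minimal keyed-pseudonymisation sketch using the standard library only.
import hashlib
import hmac
import secrets

PSEUDONYMISATION_KEY = secrets.token_bytes(32)  # in practice, hold this in a key vault with access controls

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a consistent, non-reversible token."""
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane.doe@example.com", "purchase_total": 42.50}  # hypothetical fields
safe_record = {
    "subject_token": pseudonymise(record["email"]),  # token replaces the direct identifier
    "purchase_total": record["purchase_total"],
}
print(safe_record)
```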

Intellectual Property

It is natural that those investing in the development of an AI, a machine-learning technique, or the databases with which these products are trained, would seek to protect their product. Large, high-quality datasets can be costly to generate and bring considerable value to companies. Where there has been a substantial investment in obtaining, verifying or presenting the contents of a database, the database might be protected by a Database Right under the Copyright and Rights in Databases Regulations 1997. It is important to remember that the Database Right will protect the collection of data (ie, the database as a whole) and not its constituent elements, and that the first owner of any Database Right is the maker of the database and not the commissioner. Database Rights protect the owner against unauthorised extraction or re-utilisation of the database contents.

Protecting the intellectual property in an algorithm raises different challenges. Algorithms or abstract computer programs are not patentable in and of themselves. At present, the UK Intellectual Property Office will not recognise an AI itself as an inventor (though this may change in the future). However, systems that use AI or machine-learning systems which solve a "real world" problem can be patented, provided that a human being (rather than an AI) invented the idea.

The Internet of Things (IoT) covers the plethora of internet-connected objects now commonplace in everyday life. The term includes anything from home appliances (lights, fridges, televisions) to wearables (watches, shoes) and industrial connected machinery. With so many devices now in use, concerns around security, monitoring and data protection are under increasing consideration by government and regulatory bodies, though specific legislation remains at an early stage. Notably, in January 2020, the UK government asserted its intention to introduce specific legislation in respect of the IoT for consumer products.

Cybersecurity

Cybersecurity incidents involving connected devices are increasingly common. Many devices still have basic security weaknesses, such as default passwords and inadequate policies on identifying vulnerabilities, partly due to their limited capabilities beyond connectivity. Cybersecurity incidents also present an increasing threat to industrial and governmental systems, meaning heightened security measures are likely to be required to protect potentially vital infrastructure.

To date, the approach of consumer device legislation and regulation has been largely a voluntary, industry-led approach, following the introduction of the Code of Practice for consumer IoT security in October 2018. This formed part of the government’s Secure by Design agenda, aiming for more comprehensive cybersecurity standards to be built into IoT consumer devices as a baseline. Measures include requiring non-default passwords, software updates and secure storage of credentials. Additionally, in February 2019, the European Telecommunications Standards Institute (ETSI) launched the first global industry standard on internet-connected consumer devices with largely corresponding principles to the Code of Practice.

However, continued concerns around weak implementation of standards mean the government is now looking to progress towards specific legislation for IoT cybersecurity. In May 2019, a consultation was launched to develop a statutory baseline for consumer IoT, with the top three proposals dealing with IoT password security, vulnerability disclosure policy and the duration of security update provision. A call for views on the proposal completed in September 2020 and draft legislation is expected to be drawn up in the near future.

Notably, the intention is for obligations to apply to both UK producers and distributors of IoT devices made outside the UK. The proposal also includes a designated regulator and enforcement powers including product suspension or recall, and fines of up to 4% of annual worldwide turnover. The implementation period may also be fairly short, with an estimate of nine months from royal assent to achieve full compliance.

Data Protection

IoT devices increasingly involve significant processing of personal data, as many are designed specifically with interconnectivity and data sharing in mind. This allows the manufacturer to accumulate data on its user base. Devices can capture and record a range of data points such as location, environment, health, habits or preferences, which allows a detailed profile of personal identifiers to be recorded, processed, transferred and stored. This means such devices risk conflicting with core principles of data privacy if such issues are not addressed at each level of the supply chain in an IoT project. The greater the level of personal data collection and transfer, the greater the scale of potential harm should the device be compromised.

The current Code of Practice requires manufacturers, service providers and application developers of IoT devices to comply with data protection legislation, including the GDPR and UK GDPR (as applicable) and the Data Protection Act 2018. Any IoT project which involves the processing of personal data must consider the processing obligations under such legislation; for example, whether the device offers clear information and the ability for the user to explicitly consent and withdraw consent with regard to any personal data the device collects. In some cases, obtaining informed consent may be difficult, as the nature of data collection from devices may be less obvious to users. 

Connected devices which collect children’s personal data, or particularly sensitive data such as that relating to health, will also require more stringent protection. Moreover, any personal data which is not necessary for the specific services the IoT device is intended to provide should not be collected or stored, nor should personal data be held for longer than is necessary.

Other Issues

IoT stakeholders should also be aware of rules and regulations governing data and communications more widely. Communications, be they person-to-person, machine-to-machine or otherwise, may be caught by telecoms and surveillance legislation (for example, the Investigatory Powers Act 2016 includes signals “between a person and a thing or between things”).

IoT infrastructure also tends to be complex, with the potential for many components in the supply chain. This means allocation of risk and liability must be carefully considered on a contractual level to ensure that liability in the event of failure at any level of the chain is properly understood.

Organisations entering into IT service agreements with local organisations will likely be restricted to contracting on one of a handful of standard framework agreements. While the exact content of the agreement will depend on the nature of services being procured, these framework agreements all have very similar base positions in relation to key provisions such as liability, indemnity and termination. This means that the main challenges faced by service providers are common across the range of IT service frameworks.

When considering these main issues, it is worth remembering that the framework agreements are almost exclusively non-negotiable. This is also true of the majority of call-off agreements sitting under the frameworks, where only limited changes may be made, if any. It is only at order level that supplier organisations may be able to include certain provisions or look to descope risks. As these processes are often competitive, service providers will be disinclined to raise too many issues with key terms. This, however, means that the challenges and risks resulting from the agreements must be accepted, insured against or backed off in some other way.

These main challenges and areas of risk in respect of contracting with local organisations are as follows.

Uncapped Liability

Where most service providers would usually negotiate out or restrict areas of uncapped liability, the standard IT service framework contains a number of areas where the service provider’s liability is uncapped. While some of these are standard (for example, intellectual property indemnities), some are not – such as uncapped liability in respect of data protection (see 'Data Protection' below) or, in some cases, a lack of any limit on the supplier’s total liability.

Service providers will also need to bear in mind that the extent of the exposure under liability provisions will often be greater under these framework agreements. Indemnities will often cover all claims, rather than only third-party claims, and standard carve-outs are not always included.

Data Protection

As a matter of policy, service providers are often restricted from transferring a local organisation’s personal data outside the EEA and are not able to use sub-processors without consent. These prohibitions may have additional cost implications in relation to data storage or sub-processor selection.

Until recently, data protection liability (including for fines) was uncapped as standard. However, the Digital Outcomes and Specialists 5 framework has introduced a Data Protection Liability Cap of GBP10 million, which may indicate that future local frameworks will similarly cap these losses. In relation to IT services, the risk implications of uncapped data protection liability vary greatly, depending on whether data is being shared and the nature of that data. In some cases, it may also be possible to mitigate this practically by using dummy data.

Pricing Considerations

Framework and associated call-off agreements often impose a large number of additional obligations on organisations that would not be present in a standard business-to-business agreement (or at least not to the same extent). These should be factored into pricing, and include the following.

  • An obligation to provide assistance on a number of matters for no additional charge, including in relation to data protection, data restoration and migration to replacement suppliers. It is often difficult for organisations to understand the true scope and cost implications of these obligations when tendering (and fixing prices) at the outset.
  • Related to the above, rights are granted (and obligations are owed) to the Crown as well as the relevant local organisation signatory. As such, any licences or rights granted or indemnities given have broader beneficiaries, and the supplier organisation will need to ensure it is able to obtain such rights and back off such obligations.
  • There are significant reporting, performance management and transparency obligations.
  • Finally, there are obligations to comply with a variety of policies, especially in relation to IT services.

Confidentiality

Both the local organisation and the Crown may need to disclose information under the Freedom of Information Act (FoIA) or make public information about the relevant services procured. Service providers need to consider this when disclosing information and may need to identify certain information sets as commercially sensitive so as to be subject to an exemption under the FoIA.

Termination Rights

The local organisation will have very favourable termination for convenience rights in addition to a significant range of termination rights. It is often the case that service providers have very limited termination rights – in some call-off agreements they are not even entitled to terminate for breach or able to recover committed costs if the local organisation terminates; depending on the nature of the IT services provided, this could present a material financial risk.

The UK data protection regime is set out in the UK GDPR and the Data Protection Act 2018 and is regulated in the UK by the Information Commissioner’s Office (ICO). The UK GDPR sets out the following seven key principles that are fundamental to businesses and organisations collecting, storing, sharing, deleting or otherwise using (defined as “processing”) personal data:

  • lawfulness, fairness and transparency – ensuring that you have a valid legal reason for collecting and using personal data and using personal data in a fair, open and honest way;
  • purpose limitation – ensuring you are clear about the reasons why you are processing personal data;
  • data minimisation – ensuring that the personal data you process is relevant and limited to what is strictly necessary for the original purpose you collected it for;
  • accuracy – ensuring any personal data you hold is not incorrect or misleading;
  • storage limitation – only retaining personal data for as long as is necessary for the original purpose you collected it for;
  • integrity and confidentiality – ensuring sufficient security measures are implemented to protect the personal data you hold;
  • accountability – having effective measures in place and maintaining legally required documentation to demonstrate compliance with the above principles.

Complying with the above underlying key principles will assist businesses and organisations when complying with their other obligations under the UK GDPR.

Distinction between Companies/Individuals

The UK data protection regime distinguishes between data about individuals and data about companies. The UK GDPR makes clear that information about a legal entity as opposed to information about an individual will not be considered personal data and will therefore fall outside of the scope of the UK GDPR. In fact, the definition of “personal data” in the UK GDPR makes it clear that the particular information must relate to an identifiable "natural person".

However, information relating to individuals within companies – such as employees and directors and individuals who are sole traders or in a legal partnership – will constitute personal data and will fall within the scope of the UK GDPR. This is because data protection legislation distinguishes between data that relates to the individual, even if in a business context, and data that is solely related to the legal entity. Examples of data relating to individuals within a company include an individual’s name and professional email address, as they clearly relate to a particular individual rather than just the company itself.

General Processing of Data

The UK data protection regime only governs the processing of personal data as opposed to non-personal data. Where information does not contain any personal data or it has been effectively anonymised, it will not fall within the scope of the UK GDPR. The ICO encourages organisations to anonymise data where possible but emphasises the high threshold of data being truly anonymised. For a dataset to be effectively anonymised in accordance with the UK GDPR, any information which relates to an identifiable individual must be completely removed so that individual can never be re-identified from that dataset. However, the act of anonymising personal data itself still involves the processing of personal data so will be subject to the UK GDPR.

Processing of Personal Data

The term “processing” under the UK GDPR is very broad and captures most actions such as collecting, using, documenting, storing, disclosing and deleting personal data. The UK data protection regime splits the key data protection responsibilities and obligations into two main roles known as “controller” and “processor”, depending on the organisation’s role in relation to the personal data being processed. A controller is an organisation or individual that determines how and why the personal data is collected and used and is subject to more onerous responsibilities and obligations. In some scenarios, two controllers may be considered “joint controllers”. A processor is a different organisation or individual which processes personal data on behalf of a controller for their benefit. Processors will also be subject to more limited responsibilities and obligations under the UK GDPR. The UK GDPR necessitates a written contract between a controller and a processor with specific data protection terms.

Employee-Monitoring Activities

The massive use of digital devices has improved labour productivity, reformed many business models and driven innovation. However, the use of technology in the workplace also brings a complex dichotomy: the interest of a company in monitoring its employees’ activity versus the employees’ right to privacy.

Employee-monitoring activities include not only CCTV cameras, but also any continuous supervision of employees’ use of applications and systems, IT equipment, devices, or networks owned or managed by their organisations. On some occasions, this may also include accessing information relating to the employee contained in both internal and external sources including email, or public profiles on social media.

It should be borne in mind that, in order to carry out these activities, a number of fundamental data protection requirements must be considered.

Transparency Principle

First, transparency towards individuals is a key data protection principle, so it is essential that organisations inform their employees that they will be monitored. Employers must comply with the duty of transparency set out in the GDPR and UK GDPR (as applicable). They must provide clear and detailed information on the purposes for which monitoring will be carried out, and how employees may prevent their personal data being collected and used by monitoring tools. Employers may put in place different internal policies that set out their expectations of employees in relation to their use of corporate devices and networks.

Lawful Basis

It is also relevant to determine and inform individuals of the legal basis for processing on which the entities will rely to carry out monitoring activities. Most European data protection authorities, including the European Data Protection Board (EDPB), have determined that obtaining consent from employees may not be the most suitable legal ground to rely on due to the imbalance of power that may exist between employers and employees. Therefore, employers may need to rely on an alternative legal basis for processing, such as legitimate interest or compliance with legal obligations.

Where employers rely on legitimate interest, a legitimate interest assessment is required to confirm that this is the applicable and most appropriate legal basis, that the objective cannot be achieved by other, less intrusive means, and that this legitimate interest is not overridden by the rights and freedoms of individuals.

Data Protection Impact Assessments

Another relevant consideration is whether monitoring and access to employee information is essential in order to carry out the work activity and, if so, what types of data the employer accesses when performing monitoring activities. In this respect, the EDPB has stated that employers should conduct a proportionality test prior to launching any monitoring system, in order to assess what personal data they actually need and whether individuals’ data protection rights are guaranteed.

In order to assess whether proportionality has been taken into account, the data protection authorities advise employers to conduct a DPIA which may help establish any potential risks from a privacy perspective and whether any additional steps may need to be implemented in order to mitigate any severe impact on employees’ privacy.

Remote Work

Organisations should also carefully consider whether and how to monitor employees working from home, which has become particularly relevant due to the COVID-19 outbreak. Employers should prevent any unlawful monitoring, for example by providing alternative means of accessing and using work equipment for private purposes. They should consider the 2017 European Court of Human Rights ruling in Bărbulescu v Romania, which states that employers may only be able to perform online employee monitoring under certain circumstances, such as having informed the employee in advance about the monitoring activity and guaranteeing that no special category data may be accessed unless necessary.

Takeaways

It is fundamental that, prior to starting any monitoring activity, organisations ensure that their purposes and objectives are properly defined and communicated, so that all employees have a clear expectation of such activity based on their relationship with the organisation as employer, and that privacy is considered when implementing any monitoring tool. Furthermore, in line with the GDPR and UK GDPR (as applicable), data controllers should ensure that they can guarantee protection of employees’ data protection rights and that these will not be overlooked.

The European Regulatory Framework has been developed around the regulation of electronic communications networks (ECNs) and services (ECSs) – ie, the infrastructure and services associated with the conveyance of messages (between people or machines) – with technology-neutrality as one of its underlying principles and general authorisations being the default position when providing ECSs rather than a licensing regime.

Radio Spectrum

Radio spectrum is a scarce resource and, even before departing from the EU, the UK has always been able to determine the licensing regime in relation to the use of its electromagnetic spectrum. The Wireless Telegraphy Act 2006 provides that the use of any radio transmitting device must be either licensed or specifically exempted from licensing.

For mobile telephones, the licence of a mobile network operator (MNO) covers the use of transmitters and repeaters under the MNO’s control and user devices are covered under a general exemption. Cellular repeaters, boosters and enhancers are not, as a general principle, exempted devices.

RFID tags fall within a general exemption under the Wireless Telegraphy (Radio Frequency Identification Equipment) (Exemption) (Amendment) Regulations 2007. This permits the use of RFID tags from a use-of-spectrum perspective. Ofcom has clarified that these tags are not otherwise regulated (except in relation to any personal data processed, which remains subject to data protection legislation).

The European Electronic Communications Code does not regulate e-commerce, information society services and the exercising of editorial control over online content or broadcasts.

European Electronic Communications Code (EECC)

On 21 December 2020 – and regardless of Brexit – the European Electronic Communications Code (EECC) was implemented in the UK. DCMS formed the view that many of its provisions reflect best practice; the EECC was therefore transposed through amendments to the Communications Act 2003 and the Wireless Telegraphy Act 2006. It is worth noting that, regardless of the implementation date, Ofcom issued a statement in 2020 that communications providers will have at least 12 months to implement any changes required under the EECC, so that resources could instead be allocated to responding to the COVID-19 crisis.

This is the third revision by the EU of telecommunications regulation this century and is the first time that over-the-top (OTT) services such as WhatsApp have been regulated as a sub-group of ECSs together with ECNs, although there is an exception where OTT services are purely ancillary to a non-communications service. OTT providers are now classified as either number-based interpersonal communications services or number-independent interpersonal communications services.

Once implemented by way of amendments to Ofcom's General Conditions of Entitlement, OTT providers like Skype and WhatsApp will have to comply with some of these conditions, including provisions that will need to be set out in their contracts with consumers, together with equivalence of access for disabled users and appropriate security measures.

Separately, providers of ECSs will also face prohibitions on the locking of devices.

VOIP

The regulation of VOIP (voice-over internet protocol) is a notable departure from Ofcom’s technology-neutral approach to regulation, with specific policy statements issued and provision made in regulation due to a concern that consumers would not be aware of the limitations of a VOIP system, which depend on the type of VOIP system being provided.

Before offering VOIP services in the UK, companies should consider whether the service is a publicly available telephone service (PATS). A PATS is defined as a service:

  • available to the public;
  • for making and receiving national and international calls; and
  • for accessing emergency services, through a national or international phone number on a national or international numbering plan.

The regulation of VOIP should be monitored this year, as Ofcom may decide to pick this regulation up in the interpersonal communication service amendments to the General Conditions of Entitlement.

The Audiovisual Media Services Directive (AVMS Directive) was amended in UK law by the Broadcasting (Amendment) (EU Exit) Regulations 2019. Audio-visual service regulation has typically drawn a distinction between non-linear and linear services, rather than taking a platform-neutral approach. This may be further compounded by the Online Harms regulatory approach, which is likely to become law in the UK in 2021.

For the broadcasting of audio-visual services, a licence from Ofcom is needed based on the service being provided:

  • a DTPS licence is a broadcast licence for a service providing television programmes – generally, a DTPS consists of “normal” television channels (consisting of moving pictures), including their interactive enhancements;
  • a DTAS licence is for a service which usually consists of self-standing text or data services, including teletext services and EPGs;
  • a TLCS licence is for a television service made available using either satellite, a radio multiplex, or an electronic communications network (such as cable);
  • an RTSL-E licence allows the broadcasting of television programmes for a particular event – a Broadcasting Act licence and a Wireless Telegraphy Act licence are both required in order to broadcast a restricted television service.

All licensees must also comply with Ofcom’s Broadcasting Code, advertising standards and any content requirements.

Ofcom does not regulate internet-only radio stations and there is a wide variety of other licences from temporary to longer term that must be applied for in the event radio spectrum will be used.

VSPs

Video-sharing platforms (VSPs) are a type of online video service. They allow users to upload and share videos with other people and engage with a wide range of content and social features. From 1 November 2020, VSPs have been required to have appropriate measures in place to protect children from potentially harmful content, and to protect all users from criminal content and incitement to hatred and violence. VSPs will also need to meet advertising standards. Ofcom has indicated that the government intends VSP regulation to be in place until the new Online Harms regime comes into force. Only VSPs that are UK-based will be within Ofcom’s scope of regulation, as services will usually be regulated by the EU member state in which they are based.

While Ofcom has stated that it will not be responsible for regulating services such as YouTube and Facebook, some VSP services can be expected to fall within UK jurisdiction, including Twitch and Vimeo.

Ofcom’s role is to make sure regulated VSPs take appropriate steps to protect users from harmful content, such as incitement to hatred and violence, and to determine whether the measures VSPs adopt to protect users are appropriate and proportionate. Formal guidance is expected to be published later this year. If VSPs break the rules, Ofcom can impose a financial penalty of up to 5% of their qualifying revenue or GBP250,000 (whichever is greater).

The government’s broader Online Harms legislation is expected to apply to a much wider range of online services, including services which are not based in the UK. It is expected that Ofcom will be the regulator for this broader Online Harms role.

Encryption is a process through which plain text information is scrambled into an unreadable format known as "ciphertext". Encrypted information is only accessible to parties who are able to decrypt the data, typically through the use of an encryption key. Encryption allows senders and recipients of data to keep information private, validate authenticity and track digital signatures to verify the sender as the source of information.

Encryption, therefore, forms a key part of an information security environment, and the secure transmission and storage of information online.
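For illustration, the minimal Python sketch below (assuming the widely used "cryptography" package) shows the two roles described above: keeping information private through symmetric encryption, and verifying the sender through a digital signature. It is an illustration of the concepts only, not a recommendation of any particular scheme or implementation.

```python
# A minimal sketch of the two roles of encryption described above.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Confidentiality: plaintext is scrambled into ciphertext; only holders of the key can recover it.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"account number 12345678")
assert Fernet(key).decrypt(ciphertext) == b"account number 12345678"

# Authenticity: the sender signs with a private key; anyone holding the public key
# can verify that the message came from the sender and was not altered in transit.
signing_key = Ed25519PrivateKey.generate()
message = b"payment instruction"
signature = signing_key.sign(message)
signing_key.public_key().verify(signature, message)  # raises InvalidSignature if tampered with
```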

Requirements Governing Use of Encryption

The UK has not enacted legislation which explicitly requires the use of encryption, but encryption is commonly required:

  • to comply with general security obligations under data privacy laws;
  • to meet information security certifications, such as ISO 27001;
  • to comply with industry standards, such as those relating to payment card information encryption.

These requirements can sometimes overlap or apply indirectly. For example, a contract counterparty may require their supplier to maintain an information security certification, comply with data privacy laws, and/or adopt specific security standards – which will often include encryption.

Strict legal requirement 1 – data privacy laws

UK data privacy laws require organisations to (i) take appropriate steps to protect the personal information they hold and (ii) adopt a "data protection by design" and by default approach to compliance. Read together, this means information security should not come as an afterthought, and that organisations must take adequate steps to protect the personal information they hold. This will commonly warrant the use of encryption, when feasible. Determining appropriate steps in this context requires a reflection on the risks associated with the storage and transmission of information, and whether it would be appropriate to encrypt personal information considering the risks to the individuals to whom the data relates, the state of the art, and the costs of implementation.

In that context, businesses can be held liable for failing to encrypt personal information if it would have been sensible to do so. This can result in large GDPR or UK GDPR fines, reputational damage and class action claims for compensation. The importance of encryption in this context is becoming increasingly apparent in recent decisions by the UK data protection authority, the ICO. For example, in its decisions against British Airways, the Marriott hotel group and DSG Retail, each organisation was held liable for failing to secure its systems, including where encryption would have been necessary to protect payment card information, passport numbers and point of sale systems, respectively.

Strict legal requirement 2 – investigatory requests

Use of encryption in the UK is also governed by the Regulation of Investigatory Powers Act 2000 (RIPA). Under Part III of the Act, UK authorities can compel communications service providers to provide encryption keys or assist with decryption of encrypted communications data, including telephone calls and emails (section 56(3), RIPA). However, UK authorities cannot compel market participants based overseas to provide encryption keys.

Encryption Exemptions

Whilst the UK has not enacted any statutory exemptions to encourage the use of encryption, organisations which use encryption will find it easier to comply with data breach notification requirements and international data transfer considerations under data privacy laws, as detailed below.

Data breach notification – organisations must notify data protection authorities of personal data breaches unless the breach is unlikely to result in a risk to the rights and freedoms of individuals. If a high risk is identified then individuals must also be notified. Use of encryption is a key advantage in these circumstances, as if the information is encrypted and therefore unreadable then the risk to individuals is much lower, and therefore – depending on the facts – a notification requirement may not be triggered. Risks of reputational damage, regulatory scrutiny, and likelihood of consumer class actions are therefore reduced significantly.

International transfers – UK privacy laws contain prohibitions on transferring personal data to jurisdictions which do not guarantee data protection rights and remedies equivalent to UK standards. This commonly arises in markets with extensive surveillance laws, and it is becoming clear from emerging guidance and case law that additional supplementary measures will be required to make lawful transfers to such markets. The range of available supplementary measures in these circumstances is limited, and recent guidance states that encryption (and the withholding of encryption keys) will be one of the few technical measures which organisations can take to effectively mitigate surveillance risks.

In short, encryption is a helpful tool to achieve information security compliance but cannot be used to exempt an organisation from complying with statutory obligations.

COVID-19 has had a potentially adverse effect on parties’ compliance with contractual obligations in the TMT sector. It has often raised the possibility of breach of material provisions in a supply contract. Whilst there have been no TMT-specific government actions, the more general government actions described below do impact the TMT sector.

Guidance on Responsible Contractual Behaviour

The UK government’s main response has been to produce, support and promote the Cabinet Office guidance: “Guidance on responsible contractual behaviour in the performance and enforcement of contracts impacted by the COVID-19 emergency”.

The guidance was published in May 2020 and updated in June 2020. Being guidance, it does not have the force of law, and this is a major weakness. It asks that parties act reasonably and proportionately in a spirit of co-operation, with the aim of achieving practical, just and equitable contractual outcomes, and having regard to the other party, the available financial resources, the protection of public health and the national interest. It also emphasises that responsible and fair behaviour should apply in relation to the conduct of contractual disputes. A non-exhaustive list of behaviours covers payments, damages claims, the exercise of remedies, the provision of information and the initiation and conduct of disputes, as well as the enforcement of judgments.

Because the guidance has no authority of law, there is no legal sanction on a party that does not follow it. The UK government may have hoped that the courts would intervene to apply the guidance, but the courts in England have always tried to avoid "reinterpreting" a contract entered into freely by commercial entities. If the contract leads to an "unfair" outcome, the view of the courts is that it is not their job to intervene to rebalance it. Whilst there have been no decided cases to date, the industry view is that the guidance will remain as guidance and is likely to have very limited effect on actual contractual behaviours.

Corporate Insolvency and Governance Act 2020

One concrete piece of law that has been brought into effect stems from the UK government’s decision to accelerate an overhaul of the UK's insolvency laws. The Corporate Insolvency and Governance Act 2020 was brought onto the statute books earlier than expected, in July 2020. The government’s hope was that the Act would provide more tools to rescue struggling companies as a going concern and help more businesses weather the COVID-19 crisis.

The Act brought into place a number of permanent measures and temporary measures. Whilst under consideration already, these changes were brought into effect quickly to address the COVID-19 situation. The permanent measures can operate together to provide a breathing space for struggling companies – for instance, following a sudden deterioration in cashflow:

  • a new moratorium will prevent creditors from applying for administration orders or petitioning for the winding up of the company while the company seeks a rescue or restructure;
  • companies will be able to enter into a restructuring plan that will bind creditors and allow for the restructuring of debt while injecting fresh rescue finance.

A COVID-19-specific amendment also applies in relation to the (permanent) moratorium described above. In summary, it gives effect to a prohibition on termination (ipso facto) clauses that engage when a company enters an insolvency procedure or a moratorium, or begins a restructuring plan.

In addition, a company’s supplies will be protected during an insolvency process or restructuring. The Act prevents a wide class of essential suppliers from stopping their supply while a company is going through a rescue process. With exceptions for smaller suppliers, the Act includes safeguards to ensure that continued supplies are paid for, and suppliers can be relieved of the requirement to supply if it causes hardship to their business.

Additional Temporary Measures

The additional temporary measures are: (i) temporary suspension of the wrongful trading provisions; and (ii) temporary suspension of winding-up petitions.

Initially, the period of coverage for the temporary measures operated until 30 June 2020, but this period has been extended to 30 March 2021.

Whilst these have been the main developments relating to COVID-19, the government has intervened in many adjacent areas. Examples that may be relevant to TMT include: (i) the UK government's financial assistance programme, and (ii) the government-funded furlough scheme, providing for a contribution towards employee costs in businesses that are impacted by government-ordered lockdowns.

Deloitte Legal

2 New Street Square
London
EC4A 3BZ
UK

+44 20 7303 0435

emmawright@deloitte.co.uk
www2.deloitte.com/uk/en/services/legal

Trends and Developments


Author




Predicting accurate trends year-on-year in the telecoms, media and technology (TMT) sector can be a fairly difficult task, as although this is the very sector that has often wowed us with new developments, equally it is a sector often associated with empty hype. We are also now 12 months into the COVID-19 pandemic – millions of people around the world have become dependent on the internet to carry out many of their everyday activities: working, schooling, socialising and managing household affairs. Never before has our telecoms infrastructure seemed as important as those of gas, electricity, or other essential supplies; when we were told to remain in isolation from one another, the internet still allowed us to work, educate and remain as close as possible to our families. So, what impact will 2020 have on 2021?

Will 2021 Be the Year for 5G?

When the iPhone arrived with near field communication technology from 2014 onwards, the sector watched to see if this provided the necessary boost to the mobile payments market. It certainly heralded the advent of Apple Pay but, arguably, it was not until shops were required to reduce the cash being handled during the pandemic that people were truly driven to move away from cash.

Last year's new iPhone model finally came equipped with 5G technology, while 2020 was also the year when we had an insatiable appetite for data – and not in the offices where it has typically been consumed. Yet it was also the year of 5G conspiracy theories, where claims were made that 5G caused COVID-19 and masts were vandalised. The question is: will 2021 be the year that 5G becomes ubiquitous?

Under the new European Electronic Communications Code (EECC), National Regulatory Authorities, such as Ofcom, must promote greater connectivity along with more competition. The UK is in the process of implementing the EECC, regardless of Brexit, and Ofcom is responsible for encouraging telecoms infrastructure market participants to enter into co-ownership or co-financing arrangements with smaller market entrants when developing 5G and fibre networks, offering reduced regulatory burdens for market participants with significant market power, provided that they enter into such agreements on fair terms.

It is hoped that allowing new partnerships or sharing arrangements like this may avoid the typical bottlenecks, and help build the business cases whose absence has thus far prevented the roll-out of high-speed networks throughout the country. It is also hoped that the EECC will reduce reliance on established participants’ systems and encourage competition. However, there is some concern that these steps will compound some of the market dynamics that already exist.

Over the past 12 months, "over-the-top" (OTT) services have been recognised as being equivalent to more traditional telecom services and regulated as such. This will no doubt increase the costs of compliance for these messaging apps and change the nature of their relationship with their users. The security of these messaging platforms, together with the encryption applied, will no doubt face increased scrutiny.

Alongside this change to telecoms legislation in December 2020, in 2021 we are likely to see a continued focus on e-privacy and cookie compliance. We can also expect developments at the EU legislative level (we are long overdue a harmonised e-privacy regulation) and across more consumer-facing industries, driven by regulatory investigations into the online advertising industry.

E-privacy rules govern the information you need to provide to individuals, and the consents you need to collect, before accessing and reading a user’s device; they apply unless the access is strictly necessary for the service requested by the user. In the UK, these rules are set out in the Privacy and Electronic Communications Regulations, which derive from broader European legislation that has been implemented differently across Europe. This fragmentation has led to the need for the harmonised EU regulation initially scheduled for May 2018.
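
To illustrate the practical effect of these rules on a typical website, the minimal sketch below (in TypeScript, using hypothetical helper names such as getRecordedConsent and setCookie, which are not drawn from any real consent library) only writes non-essential cookies once an opt-in consent has been recorded, while cookies that are strictly necessary for the service the user has requested remain exempt.

```typescript
// A minimal, illustrative sketch only: helper names are assumptions,
// not part of any real consent management library.

type CookiePurpose = "strictly-necessary" | "analytics" | "advertising";

interface ConsentState {
  analytics: boolean;
  advertising: boolean;
}

// In practice this would read the choice recorded by the site's consent
// management platform; here it is stubbed to "no consent given".
function getRecordedConsent(): ConsentState {
  return { analytics: false, advertising: false };
}

function setCookie(name: string, value: string, purpose: CookiePurpose): void {
  const consent = getRecordedConsent();

  const permitted =
    purpose === "strictly-necessary" || // exempt from the consent requirement
    (purpose === "analytics" && consent.analytics) ||
    (purpose === "advertising" && consent.advertising);

  if (!permitted) {
    return; // no consent recorded for this purpose, so nothing is written
  }

  document.cookie = `${name}=${encodeURIComponent(value)}; path=/; SameSite=Lax`;
}

// The session cookie is strictly necessary and is always set; the advertising
// identifier is only written once the user has opted in.
setCookie("session_id", "abc123", "strictly-necessary");
setCookie("ad_id", "xyz789", "advertising");
```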

Organisations are also likely to see an increasing number of complaints relating to the setting of cookies and the tracking of online activity, as consumer awareness grows and the market moves towards more universal adoption of the IAB’s Transparency and Consent Framework v2. This comprehensive framework is raising the expectations of consumers in UK and European markets and driving the implementation of much more complex cookie consent management mechanisms on websites. These changes are disrupting the average user's online experience, leading to increased consumer frustration and complaints. We can expect these trends of increased awareness and activism to continue, and for companies to respond to e-privacy concerns with increased diligence, to avoid being caught in regulatory investigations such as those taking place in the data-brokering and online advertising sectors.

The Digital Services Act (DSA) in its current form also requires online platforms to identify all advertising, together with who is behind the advertising and why it targets certain users, and to report on the content of adverts, their target audience and the number of recipients reached.
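
As a rough illustration only, the sketch below shows the kind of information an online platform might need to capture for each advert under the DSA as currently drafted; the field names are assumptions chosen for illustration, not terms drawn from the legislative text.

```typescript
// Hypothetical record of per-advert transparency information a platform
// might hold; field names are illustrative assumptions only.
interface AdTransparencyRecord {
  adId: string;
  advertiser: string;          // who is behind the advertising
  content: string;             // the content of the advert
  targetingCriteria: string[]; // why the advert targets certain users
  targetAudience: string;      // the audience the advert was aimed at
  recipientsReached: number;   // the number of recipients actually reached
}

// Example entry a platform might surface to users and regulators.
const example: AdTransparencyRecord = {
  adId: "ad-0001",
  advertiser: "Example Retailer Ltd",
  content: "Spring sale on running shoes",
  targetingCriteria: ["interest: running", "location: UK"],
  targetAudience: "Adults in the UK interested in fitness",
  recipientsReached: 120000,
};
```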

All this is before we mention the proposed Digital Markets Act (DMA), aimed at solving the structural problems and competition concerns associated with large platforms; the DSA, designed to protect consumers from false or misleading content; and the UK's Online Safety Bill, which looks to protect the public from "online harms".

How Will Geopolitics Impact?

The sector has faced an unprecedented lack of regulatory certainty in recent times, due to growing protectionist trade positions. As governments realise that their economies are inextricably dependent on tech – and on each other – they are using a wide range of tools to allow them to "dig in" and protect national security when, in fact, much of the approach is driven by digital sovereignty. The distinction between the two can be fine: the former stems from a genuine concern to protect the security and defence of a nation, while the latter emerges from the desire to control a nation’s data – not necessarily motivated by security and defence.

Legislation proposed for the UK telecoms sector under the Telecoms Security Bill places far-reaching requirements on the telecoms supply chain and on the security obligations that apply to telecom operators, with significant fines proposed for non-compliance. It will be interesting to see whether other countries adopt similar stances. The past year has proved just how reliant countries are on key technologies and data, meaning that a lessening of governments' focus seems unlikely. Therefore, 5G, AI and machine learning, advanced surveillance technologies and quantum technologies are all likely to face tightened export controls.

Online advertising and data capture have been the go-to commercial models for platforms and search engines. The approach that the EU is adopting in the DSA has already caused some concern on the other side of the Atlantic – whether this will have a deeper impact on EU-US relations remains to be seen. Some of the EU’s proposed regulations would materially change the operating environment in Europe for many of the large US tech companies – and that is before anyone considers the applicable tax regimes.

In the broader TMT space, a period of more predictable and foreseeable regulation in the markets where companies operate would be welcome.

Tech for Good?

The phrase "tech for good" has a myriad of meanings. Without doubt, however, it means that tech should be designed with accessibility in mind, so that everyone can use it. This ranges from audio enhancements for the hard-of-hearing through to facial-recognition software that works equally well for men and women, regardless of the colour of their skin or ethnicity. The issue facing the TMT sector is that well-intentioned emerging technologies carry unknown, unintended consequences. Furthermore, the sector must grapple with various (sometimes conflicting) interpretations of "good technology".

The recent about-turn on facial recognition (FR) is a good example of this. FR has become an increasingly pervasive technology, partly due to mobile phones using this capability to maintain security. However, its wider roll-out by law enforcement authorities, with seemingly little scrutiny of the applicable legislative frameworks, has led some significant suppliers of FR to stop supplying governments in the past year. In the UK and abroad, there have been calls from parliamentary and civil society groups for a clearer regulatory framework.

Conclusion

The TMT sector generally delivered to its user base in 2020 and enabled many activities that would otherwise have been impossible during national lockdowns. Even so, at least in relation to telecom companies, the cost of continuous network upgrades and the efforts needed to stop "churn" (ie, customer migration) weigh heavily on their valuations. This is unfortunate and is unlikely to change while the weight of securing national secrets falls mainly to them.

It is entirely plausible that our appetite for content may wane post-lockdown, meaning that content providers will sharpen their focus on flagship events. Conversely, will the provisions of the DSA and the Online Harms legislation – the requirements they impose for posting user-generated content (UGC), or the liability that flows from it – mean that there will be less UGC and an increase in controlled content? In either eventuality, it seems the TMT sector has never been more at the mercy of broader national and geopolitical priorities.

Deloitte Legal

2 New Street Square
London
EC4A 3BZ
UK

+44 20 7303 0435

+44 20 7583 1198

emmawright@deloitte.co.uk

www2.deloitte.com/uk/en/services/legal