TMT 2019 Comparisons

Last Updated February 22, 2019

Contributed By Ashurst

Law and Practice

Authors



Ashurst's digital economy group is a global team that operates from hubs in the UK, Australia and Hong Kong. With four partners and 15 qualified lawyers, the London group’s practice areas are digital economy strategic commercial transactions and regulation, data protection and IP, M&A, joint ventures and financings in the sector. The team of specialists understands the key constituencies of the digital economy that are transforming business across all sectors – digital infrastructure, cloud computing, new technologies and data protection. Ashurst advises across all industry sectors, with a key focus currently on TMT, FinTech, InfraTech and PropTech. Its clients include operators and vendors, corporates, banks and funds, as well as regulators and government institutions. Several members of the core team have worked in the industry for over 25 years, including a number who have worked in-house at operators and banks, at the regulator and in tech start-ups.

As there is no consolidated body of cloud computing law and regulation limiting the use of cloud-based services in the UK, providers (cloud providers) and users (cloud users) need to be aware of, and comply with, a patchwork of law and regulation.

Regulatory requirements arising from the use of cloud-based services can vary, with some industry sectors more heavily regulated than others. An example of a sector where use of the cloud is more heavily regulated is the financial services sector.

Outsourcings in financial services are regulated. Accordingly, where a cloud-based service used by a bank or other regulated financial institution amounts to a material outsourcing (broadly, an outsourcing of critical or important operational functions), additional regulatory requirements will apply.

These requirements include obligations on the cloud user relating to due diligence, supervision, record retention, risk management, audit and reporting. They are set out in the Financial Conduct Authority’s (FCA) Handbook and the Prudential Regulation Authority’s Rulebook, and guidance is available in the form of FCA guidelines and European Banking Authority recommendations (which are due to be updated in Q1 2019).

Where a cloud-based service involves the processing of personal data (including its storage or transmission), the UK data protection regime will apply.

In this context, and where personal data is managed through the cloud service, the cloud user will be the Controller and typically the cloud provider will be the cloud user’s Processor of that data. It follows that the usual Controller and Processor duties under the GDPR (see ‘The UK Data Protection Regime’ in 6.1 Core Rules Regarding Data Protection) will apply to the cloud user and the cloud provider respectively.

The nature of many cloud services, particularly those which are provided on a SaaS model (see ‘Licensing Model’ in 5.1 Specific Features) or involve multi-tenant data centres and/or infrastructure, creates particular challenges for cloud users in complying with a number of their GDPR obligations, notably those listed below.

  • Audits: many cloud providers refuse to grant direct audit/inspection rights to their customers, given that customer audits jeopardise security and risk operational disruption. This can create tension with cloud users, who are legally required to impose a contractual obligation on the cloud provider to allow for, and contribute to, the cloud user’s data audits and inspections. A common compromise is to interpret this obligation broadly, such that the cloud user relies on being given access to the cloud provider’s own independent audit reports, rather than a right to conduct physical site inspections.
  • Subprocessing: the cloud user must contractually restrict the cloud provider from appointing a subprocessor (ie, a subcontractor who processes the cloud user’s personal data) without the cloud user’s written authorisation. While the GDPR enables general authorisation to be granted upfront, the cloud user must ensure that the cloud provider is required to inform the cloud user of any proposed new or replacement subprocessor, and the cloud user is granted an opportunity to object. This creates a particular challenge for cloud providers whose typical models are predicated on having unfettered flexibility to adapt their subcontractor pools in order to enable optimal operational efficiency. Most cloud providers approach this obligation by publishing their subcontractor list on their website, and stating that published updates to the list from time to time will be deemed notice to the cloud user. Some cloud providers also state the cloud user can terminate the cloud service if it objects to a subcontractor change (although the cloud user is typically not entitled to a refund of prepaid fees).
  • Data sovereignty: for example, US law enforcement agencies can require US-based cloud providers to disclose customer data, even where that data is stored by the cloud provider outside the USA (under the US Clarifying Lawful Overseas Use of Data Act (CLOUD Act)). Where that data comprises a cloud user’s personal data, the cloud provider may be caught between its disclosure obligation to the enforcement agencies and its obligations to the cloud user under the GDPR to restrict/safeguard transfers of personal data outside of the European Economic Area (EEA). US-based cloud providers’ terms typically serve to protect their position under the CLOUD Act, which may leave cloud users with a residual risk of technical non-compliance with the GDPR.

Cybersecurity

A UK cloud provider who offers cloud computing services in the UK is a digital service provider under the NIS Regulations and is subject to their requirements (unless it is a micro or small business). For further details on the NIS Regulations, see section 8 Scope of Telecommunications Regime, below. A cloud computing service for these purposes is any service which enables access to a scalable and elastic pool of shareable physical or virtual computing resources. It covers SaaS, IaaS and PaaS products, as well as e-mail and online storage services if they are scalable.

Businesses seeking to benefit from the efficiencies of multiple access and transaction settlement capabilities offered by blockchain technology will generally seek to operate a private blockchain due to the limitations and risks associated with public blockchains.

However, there are a number of legal challenges to launching or using a private blockchain in the UK, including, in particular, risk and liability, intellectual property, data privacy, service levels and jurisdictional issues.

It is essential for the participants in a private blockchain to establish a comprehensive and cohesive contractual framework that clearly allocates risk and liability between the parties under each arrangement, in particular:

  • the relationship between participants: the participants will be determined through the governance framework established for the blockchain (and will generally include the node operators) – the relationship between them could take the form of a joint venture, a partnership or a simple contractual relationship between the participants;
  • the arrangements for the technical operation of the blockchain: these will take the form of network/platform agreements, technology licensing or service agreements;
  • the use of the platform by end users: this could take the form of normal contractual platform terms and/or through the use of smart contracts.

All agreements or terms should be aligned so as to avoid any potential ambiguity or conflict. Given the issues over liability, any business adopting blockchain technologies should establish strong governance, risk management strategies and frameworks of control.

IP may be created in a number of elements making up a blockchain-based application, including the source and object code, the application programming interface, the graphical user interface and the database itself. Where the blockchain technology and/or its application is proprietary, consideration can be given to obtaining a patent. While there is a high bar for patentability, many businesses have obtained (or are seeking) patent protection for their implementation of the technology.

Participants and users must therefore ensure that they have appropriate contractual provisions in their agreements to secure ownership and/or rights of use in the blockchain technology and constituent elements of the relevant application. While open source is commonly used in the development of a blockchain application (in which case users must ensure they monitor and comply with the terms of use), participants will likely require the application to be tailored to their specific business needs. The challenge in this scenario is that participants are likely to want IP ownership wherever possible, while vendors catering for blockchain applications are increasingly looking to retain ownership and license use of the IP in order to maximise the return on their investment. In some specialist sectors (such as the financial services sector) businesses are collaborating on developments and sharing IP rights.

In many cases, a generic blockchain will be used by participants to register many different kinds of documents and transactions, involving both non-personal data and personal data. Where personal data is involved, the application of data protection regimes raises issues, in particular those listed below.

  • Data protection jurisdiction: the data protection law applicable to a transaction may not correspond to the contractual law and may need to be established on a transaction-by-transaction basis.
  • Data Controller v Data Processor: the UK data protection regime imposes different obligations on Controllers and Processors (see section 6 Key Data Protection Principles, below). Identifying the roles of each participant involved in the blockchain network is challenging, and in many cases there are likely to be joint Controllers and parties who are both Controllers and Processors.
  • Security: there are risks that the data might be accessed inappropriately, and any data breaches must be reported.
  • Rights to correction and erasure: reconciling the immutability of blockchain records with data protection legislation which grants a right to data subjects to correction and erasure of their personal data is challenging. However, it is possible to roll back transactions in a private blockchain and consideration may need to be given to enable this in order to ensure compliance.

Careful consideration should be given to privacy governance arrangements (including roles of the participants, off-chain storage of data, controlling permissible access), and to setting these out in the agreements between the parties.

For further details on the UK data protection regime, see section 6 Key Data Protection Principles, below.

Ensuring sufficient levels of service in relation to blockchain technology, any other relevant technology being used for its implementation and the network as a whole is essential for the successful commercialisation of a blockchain-based business. The normal contractual framework in terms of service levels in relation to the licensing and/or provision of technology-related services applies equally to blockchain. The blockchain network terms of use should address the service levels guaranteed for the users, the allocation of risk, liability relating to the failure to meet those terms and the consequences of any such failure. In addition, the allocation of responsibility among the blockchain network participants for monitoring compliance and managing the relevant service providers should be agreed in advance.

Choice of law and dispute resolution should be considered at the outset given the cross-border nature of blockchain. The parties should include an express governing law clause in their arrangements. However, applicable local requirements in other jurisdictions may nevertheless apply dependent on, for example, the location of the participants and end users, or where the activities and assets of the business are located. In the UK, consumer and data protection law would apply, for example. As a result, following the most restrictive local rules may be the most cautious approach. A more practical approach may be employing geo-blocking or similar technological measures. Ultimately, a risk-based approach will need to be taken and all agreements and terms should be clear as to what laws apply.

The issues around big data, AI and machine learning are challenging, and any project will involve an inherent level of legal risk.

See 3.3 Artificial Intelligence, below.

Artificial intelligence (AI) systems are (in the most basic terms) designed to ingest substantial amounts of data (input data) through the operation of software (incorporating complex algorithms/code) and then to make informed decisions and produce output (AI-generated work) without the need for human input. In machine learning it may be impossible to understand how the AI arrived at a particular decision.

Causation, Liability and Insurance

Attribution of liability is traditionally driven by causation (ie, once the cause has been determined, blame can be assigned), whether in contract or tort (and up to a point in consumer protection law). This will be clear in relation to machine decisions traceable back to defective programming or incorrect operation, but becomes challenging given the 'black box' nature of AI, where causation is often inexplicable or cannot be traced back to human error.

No legislation has been introduced in the UK to clarify the AI liability regime, other than in relation to fully autonomous driving (under the Automated and Electric Vehicles Act 2018), where the legislation addresses gaps in currently available insurance. In the absence of any agreement between the parties, the courts will need to determine fault among product manufacturers/sellers, AI designers/suppliers and AI purchasers/users. The best approach, therefore, is to consider contractual warranties, indemnities and limitations for each of these organisations.

The application of the Consumer Protection Act 1987 to AI-embedded devices will also raise issues. The European Commission (EC) is currently reviewing the Product Liability Directive (which is the basis of the current UK legislation) in order to reflect the developments of digital technologies, including AI. The uncertainty over Brexit, however, means that it is not clear whether any such changes would be brought into force in the UK.

In the light of the potential uncertainties, in addition to contractual remedies, companies should consider how to demonstrate their AI's decision-making process. This includes consideration of how to document and prove that a function was performed, or a decision was made, as a result of reasonable programming that at the time met current industry standards or best practices. Insurance coverage should also be sought where available.

Intellectual Property

It is well established that algorithms and software may, in principle, be protected under the law of copyright and as confidential information. Patenting of software-related inventions is more difficult in Europe than the USA, due to the European exclusion from patentability of computer programs 'as such', but it is nevertheless possible to obtain patent protection for inventions embodied in software. It is recommended that ownership and right-to-use issues are, where possible, clarified by appropriate contractual provisions, rather than relying on statutory principles which can be difficult to apply in practice.

There are particular issues around the subsistence and ownership of IP in works generated by AI. So far as copyright is concerned, it is unclear how the concept of originality applies to such works, particularly under European law, with its insistence that copyright works be 'the author’s own intellectual creation', which suggests that a human author is required. Determining the ownership of copyright in AI-generated works is also potentially problematic, with a few jurisdictions, such as the UK, making express provision for the ownership of 'computer-generated' works, while in others the absence of a human author may make it difficult (if not impossible) to identify the owner. Again, express contractual provisions dealing with such situations are highly recommended.

Privacy

In the context of use of an AI system, large amounts of personal data may be collected (eg, to profile users and produce predictions on their behaviours). This was given due consideration as part of the development of the UK data protection regime (for further details see section 6 Key Data Protection Principles, below). As such, a business must ensure it has an appropriate data governance framework in place to maintain its compliance with the regime in the design and implementation of its AI system, including appropriate fair processing notices to data subjects to comply with transparency requirements.

Bias and Non-discrimination

The tension between the advantages of AI systems and the risks they may present for fundamental rights is particularly evident in the field of non-discrimination.

Non-discrimination is a fundamental right which protects individuals from discrimination on grounds such as sex, race, colour, language and political opinion. AI systems depend on accurate and representative input data, and it has been seen that such data sets can (often inadvertently) contain bias or be skewed in a particular direction in a way that discriminates against individuals. The lack of diversity and inclusion in such AI systems is therefore a key concern. Businesses must ensure that input data is accurate and representative, and that they have the appropriate mechanisms in place to recognise and address bias within their AI system. They should also consider putting in place diverse AI technical teams (as recommended by the European Union’s Ethics Guidelines, discussed below).

Ethics Guidelines

The European Union (EU) (specifically the High-Level Expert Group on AI) has released Ethics Guidelines in relation to Trustworthy AI, covering three main components which should be met throughout the system's entire life cycle, namely that the AI should be lawful, ethical and robust. These non-binding guidelines set out a human-centric approach to the design and implementation of Trustworthy AI.

The guidance covers four ethical principles: (1) respect for human autonomy, (2) prevention of harm, (3) fairness and (4) explicability. It also sets out seven key requirements for realising Trustworthy AI: (1) human agency and oversight, (2) technical robustness and safety, (3) privacy and data governance, (4) transparency, (5) diversity, non-discrimination and fairness, (6) environmental and societal well-being and (7) accountability.

From a practical perspective, the guidelines state that an organisation should, during its development, deployment and use of AI systems, put in place the non-exhaustive Trustworthy AI assessment list and adapt it to the specific use case in which the system is being applied; noting that a revised assessment list may be released in 2020 following further feedback on the guidelines. An organisation should continuously identify and implement the seven requirements, evaluate solutions and improve outcomes throughout the AI system's life cycle and involve stakeholders in this process. From a reputational perspective, organisations may want to consider signing up to the guidelines to promote customer confidence in their AI systems.

The Internet of Things (IoT) is essentially a network of devices (such as home appliances, electricity meters and cars) that are connected and that sense, actuate, interact and exchange data. While there is currently no specific IoT regulation which governs the use or provision of IoT devices and services in a consolidated manner, the sponsors of any IoT project will need to consider the application of the regulatory regimes which apply to its key elements, namely devices, connectivity networks and data.

Standards

The IoT is in a state of development and one of the key obstacles to the growth of the ecosystem (eg, the creation of smart cities and smart homes) is the lack of common technical standards for interoperability of the devices and the connectivity networks, as well as in relation to data management and security. The EC is seeking to address this in its Rolling Plan For ICT Standardisation, although it is still a work in progress. Any standards developed in this context, if made compulsory, would need to be complied with. Post-Brexit, British Standards developed by the British Standards Institution (BSI) would apply (however, these are likely to reflect European Standards as the BSI has secured a continued role in the relevant European Standards organisations post-Brexit).

In addition, the EC is currently considering introducing an EU-wide voluntary security certification standard for IoT devices to address consumer trust and confidence issues. In the light of Brexit, it is not clear whether the UK will adopt an equivalent certification programme.

IoT Devices

Depending on the devices being connected, a number of obligations will apply in relation to their design, manufacture, import, distribution and sale on the UK market. For further details, see section 8 Scope of Telecommunications Regime, below.

Connectivity Networks

IoT devices are generally connected through wireless networks (although they can be connected by cable), and the operator of the network or provider of the connectivity service (such as mobile, low power wide area (LPWA) or Wi-Fi network services) is subject to obligations under the Electronic Communications regulatory regime. For further details, see section 8 Scope of Telecommunications Regime, below.

Data Privacy and Cybersecurity

IoT devices generate and transmit big data which is processed in the cloud, and this raises additional data privacy and security risks. To the extent that personal data is involved, the UK data protection regime will apply (see section 6 Key Data Protection Principles, below). However, there are additional requirements in relation to the security of data where the data is transmitted over connectivity networks. These are set out in the PECR (for further details on the PECR see section 8 Scope of Telecommunications Regime, below).

In addition, the Department for Digital, Culture, Media & Sport has published a Code of Practice for Consumer IoT Security (October 2018). The code is non-binding but sets out guidelines considered good practice for IoT security. These include ensuring the secure storage of credentials and sensitive data, prompt software updates and resilience of IoT devices to network and power outages. An IoT provider should consider implementing these.

Product Liability

As the ecosystem of interconnected devices grows, there is a recognition that the source of defects and faults will become difficult to identify and, as a result, it will become harder to determine where liability lies. The usual rules of contract and negligence will apply alongside the Consumer Protection Act 1987, but their application is not always clear. The EC is keen to see changes to the Product Liability Directive 1985 to reflect the development of digital technologies, including IoT technology. The uncertainty over Brexit means that it is uncertain whether any such changes would be brought into force in the UK.

An organisation procuring an IT solution to assist in the development of its business can face numerous challenges. With the rise to prominence of digitalisation across all industries, a number of key developments have changed the landscape of technology procurement: in particular, the prominence of cloud computing, the need for speed to market, market fragmentation with the advent of tech vendor start-ups, the rate of technological change and the increasing complexity of IT ecosystems. The combination of these factors is creating a shift towards shorter procurements, procurement of software through 'as a service' solutions and service integration and management (SIAM) contracts.

Most challenges and risks can be mitigated by careful consideration of the organisation’s current and future business requirements during the scoping and negotiation phase.

In particular, the organisation should consider:

  • the purpose and nature of the software or solution being procured;
  • how it will/needs to integrate into and interface with the existing landscape;
  • how the software/solution will be used by the business;
  • whether the software/solution will be business critical;
  • who will use the software/solution, and where they will be located;
  • whether the software/solution will need to be accessed by other related companies or third parties; and
  • what the future business and technology requirements are likely to be.

The answers to the questions above will directly inform the best method of procurement.

Licensing Model

The traditional licensing model, where an organisation pays a periodic fee under a licence agreement for software which it retains on premises, is increasingly less common. Software suppliers are finding greater traction in the software as a service (SaaS) model, whereby centrally hosted software (and possibly other IT infrastructure) is licensed on a subscription basis or as part of a service solution. The SaaS model does not necessarily require the customer to use the software itself: the customer can instead enter into a service agreement with the supplier, which the supplier fulfils through the use of the software application. In a SaaS model the customer’s data is transferred to, and held by, the supplier, raising risks around data confidentiality, security and privacy for personal data, all of which need to be considered in the contractual terms.

Key Issues

While the focus of the procurement has changed, the key issues remain relatively constant. The agreement should contemplate and address the key issues listed below.

  • Licensing: the nature and scope of the licence will be key – fixed term or indefinite? Enterprise or a per-user charge? Separately licensed or integrated in an 'as a service' model? The agreement might be for a licence or a service, or a blend of the two.
  • Customisation, integration, development: how the software is to be adapted for the organisation’s infrastructure, the degree of customisation or development, the manner of that customisation and its time frame, and consequences for failure all need to be provided for.
  • Hosting: where will the application reside (cloud vs on premises)? What is the impact of the difference on the customer’s risk profile? Will personal data reside in the EEA? If cloud features in the solution, is the customer able to segregate liability and other provisions specific to the cloud elements from the wider relationship with the IT vendor?
  • Maintenance and support: what technical support will be provided by the supplier? Will there be a warranty period before maintenance charges kick in? Consider knowledge transfer, training, helpdesk, support response times (agreed SLAs), service availability uptime v interruptions, notice and planning for scheduled maintenance, and exit support in case of unforeseen termination. Again, can cloud terms be considered separately, or is the cloud element integral to the SLA?
  • Data protection regime: if personal data is to be provided to the supplier, then compliant provisions will need to be included in the agreement. Determine the data flows and the processing activities to ensure that terms are customised to the specific scenario. For further details see 'The UK Data Protection Regime' in section 6 Key Data Protection Principles, below.
  • IP Infringement, data liability and other liabilities: an uncapped infringement indemnity should be provided by the supplier in the event an IP infringement claim is brought by a third party. Data security, privacy and confidentiality liability should also be treated separately from standard positions; in particular, where there is a high degree of reliance on the supplier’s system or services for the conduct of the customer’s core business. Other liabilities will generally be capped at between 100 and 200% of annual fees. Consider whether this is an effective remedy, given the counterparty’s covenant strength – might the customer need other protections (eg, an escrow)?
  • Usage rights: what are the limitations on use (eg, number of users and sites where software can be accessed or installed)?
  • Term/termination: consider, in particular, whether specific termination rights might be needed in respect of any failed project for customisation or development. Consider what termination would actually mean for the customer, and what assistance it would need at the end of the term to continue its business effectively.
  • Fee structure: enterprise vs individual licences. How will the cost and any increases be calculated?

As businesses adapt to the digital world and become increasingly tech-centric, IT agreements become substantially more complex.

It is fundamental that the scope of the IT service agreement is clearly defined, that the key issues are addressed and that the terms are well drafted.

See 5.1 Specific Features, above.

The UK data protection regime regulates the treatment of information which relates to an identified or identifiable natural person (personal data) by entities that determine the means and purposes of processing (controllers), and entities that carry out processing on behalf of a controller (processors). It does not apply to corporate data.

This section is restricted to explaining the core principles underlying the UK data protection regime. There are, however, other sectoral laws which also apply to the processing of personal data and data more generally, including in the telecoms industry and in relation to interception (for further details see section 8 Scope of Telecommunications Regime, below), as well as freedom of information requests.

The UK Data Protection Regime

In the UK, the data protection regime is primarily set out in the General Data Protection Regulation 2016/679 (EU) (GDPR) and the Data Protection Act 2018 (DPA 2018).

Application and Scope

Personal data: this is any information which relates to an identified or identifiable natural person (a data subject), assessed objectively. The definition is broad; specific examples include an individual’s name, identification number, location data, online identifiers or factors specific to their physical, physiological, genetic, mental, economic, cultural or social identity.

Controllers and Processors: the regime imposes direct obligations on both Controllers and Processors. Processors have direct liability under the DPA 2018 in relation to any processing in breach of obligations specifically imposed on Processors under the DPA 2018, or caused by processing that is outside, or contrary to, the lawful instructions of the Controller. Controllers/Processors will generally allocate liability as between them in a contract.

Processing: the regime governs processing of personal data by 'automated means' (wholly or partly) or non-automated processing where the personal data forms (or is intended to form) part of a filing system. Processing is defined broadly, capturing most activities conducted on personal data, including collecting, recording, organising/structuring, storing, adapting, altering, retrieving, using, transmitting/sharing, combining, restricting, erasing or destroying the data.


In the UK the DPA 2018 and the GDPR regulate the use of personal data (as defined above), including employee personal data.

Non-personal data (ie, any data which does not fall within the definition of personal data under Article 4 of the GDPR) is regulated by the European Parliament and Council’s Regulation (EU) 2018/1807 on a framework for the free flow of non-personal data in the European Union (which will have effect from May 2019); subject to the type of data, other sectoral laws may also be applicable.

The regulatory framework of non-personal data is not the focus of this summary.

Processing of personal data must meet the fundamental data protection principles listed below.

  • Lawfulness: Controllers must have a lawful basis to process personal data. These are consent, performance of a contract with the data subject, compliance with a legal obligation, protection of vital interests, public interest or legitimate interest of the Controller or a third party.
  • Transparency and fairness: Controllers must clearly notify data subjects of how their personal data is processed. In addition, Controllers must consider what the data subject would reasonably expect in relation to the use of his or her data.
  • Purpose limitation: Controllers must only use personal data for those purpose(s) notified to the data subject at the time their data is collected, or (where certain requirements are met) for new purposes that are not incompatible with the original processing.
  • Data minimisation: Controllers must only collect/process personal data that is relevant and necessary for the purpose(s) for which it is processed.
  • Accuracy: personal data must be kept up to date and accurate. Controllers (and Processors where storing personal data on behalf of Controllers) must have appropriate technical mechanisms to enable inaccurate data to be amended.
  • Storage limitation: personal data must be kept only for the period necessary for the purpose for which it is processed.
  • Integrity and confidentiality: Controllers and Processors must implement appropriate technical and organisational measures to protect the integrity and security of personal data against unauthorised or unlawful processing and against accidental loss, destruction or damage.

Controllers must demonstrate compliance with the above principles. In practice this means understanding what personal data they process, and creating and following internal data governance processes and procedures to enable compliance. The GDPR sets out the specific obligations of a Controller or Processor (including in relation to mandatory breach reporting requirements, international transfer restrictions, implementing technical and organisational measures to protect personal data, maintaining registers of processing, implementing policies, etc), and provides a framework for enforcement and data subject rights.

International Data Transfers

Under the GDPR, transfers of personal data to countries outside the EEA are regulated and restricted in some cases. Transfers of personal data can be made to a recipient in a third country about which the Commission has made an 'adequacy decision' (ie, it has decided the country has an adequate level of data protection), or if the data exporter has implemented appropriate safeguards (commonly in the form of model clauses and/or binding corporate rules), or if an exemption or derogation otherwise applies.

Post-Brexit

The UK government has confirmed that post-Brexit the DPA 2018 will remain in place and the GDPR will be incorporated into UK law. Nevertheless, issues will also need to be dealt with in relation to international data transfers, authorised representatives and EU cross-border infringement/enforcement action.

There is now a vast array of tools available to businesses to monitor the use of company IT resources by staff during work hours.

Legitimate reasons for deploying such tools are numerous, including promoting productivity, detecting data breaches and gauging staff well-being. However, the law regarding monitoring employee use of computer resources is complex, and businesses need to give careful consideration to a number of issues before proceeding to engage monitoring tools in their day-to-day operations.

The UK does not have dedicated laws in place concerning employer monitoring of employee activities and it is neither expressly permitted nor prohibited. Instead, the legal position derives from a number of different sources. This section focuses on the interrelated matters of the right to privacy under the European Convention on Human Rights (ECHR), and the restrictions imposed on monitoring by the UK data protection regime.

The ECHR establishes an individual right to respect for private and family life, home and correspondence. While this right is not absolute, it is a countervailing factor that needs to be balanced against the interests that an employer might be seeking to pursue by undertaking monitoring activities.

The implementation of monitoring practices requires the careful consideration of a number of factors, such as the intrusiveness of the activity on employees’ privacy and the extent to which employees have been informed about the nature and scope of the activity.

On the assumption that monitoring will almost certainly involve processing the personal data of employees, the UK data protection regime is relevant to monitoring activities because it imposes obligations on how and when such processing can occur (for further details on the DPA 2018, see 6 Key Data Protection Principles, above).

The fundamental principle that such processing can only be carried out where there is a lawful basis for it needs to be considered. In the context of monitoring activities, the principal legal bases on which an employer is likely to rely are that the processing is necessary for one of the following reasons:

  • for the purposes of the employer's legitimate interests; or
  • for compliance with a legal obligation to which the employer is subject.

The first reason permits processing so long as those legitimate interests are not overridden by the interests of the employees. The second reason would apply where it is necessary for an employer to monitor an employee’s e-mails in order to comply with its legal obligations; for example, for the purpose of determining whether there had been a breach of insider trading laws. However, it would be unusual for such a reason to justify systematic monitoring, as opposed to a short-term measure in response to a particular issue.

In order to take account of the right to privacy and data protection issues referred to above, the following key matters should be considered by any business looking to introduce monitoring activities:

  • whether the employer has provided legitimate reasons to justify why the monitoring is necessary;
  • the extent of the monitoring, and the degree of intrusion into the employee’s private life that it will cause (eg, monitoring the content of communications in contrast to monitoring the flow of communications);
  • whether employees have been appropriately notified of the possibility of their use of computer resources being monitored (ie, through policies/procedures/privacy notices);
  • whether less intrusive means are available that would enable the employer to achieve its aims; and
  • whether 'special category data' may be processed as a result of the monitoring, as more prescriptive conditions apply to the processing of such data.

Thorough consideration of the above matters prior to introducing any monitoring practices, including the performance of a data protection impact assessment, should help to ensure that businesses are able to reap the benefits of monitoring tools without falling foul of the legal protections for individuals that are in place.

The UK telecoms rules were intentionally designed to be technology neutral, although certain aspects apply only to wireless technologies. It is rather the nature of the activities undertaken in relation to the devices/equipment, networks and services that defines what requirements need to be complied with.

Requirements Prior to Bringing a Product/Service to the Market

The manufacturers, distributors and sellers of equipment and devices (whether telecoms network equipment or consumer devices such as smart phones, RFID tags and digital assistants) are required to comply with a number of regulations; in particular, the Waste Electrical and Electronic Equipment Regulations 2013 (the WEEE Regulations) and the Radio Equipment Regulations 2017 (RER).

A business that places telecoms and IT electrical and electronic equipment (and certain other equipment) on the UK market for the first time by importing, manufacturing, rebranding or selling to end users in the UK will be subject to obligations (set out in the WEEE Regulations) in relation to the financing of the collection, recovery and recycling of waste. It will also be required to register with the Environment Agency directly or via a Producer Compliance Scheme.

In addition, and specifically in relation to wireless equipment, while no direct regulatory approval is required, the RER imposes a number of requirements on manufacturers in relation to the design and manufacture of the equipment. Before bringing a product onto the market, a manufacturer must:

  • engage a Conformity Assessment Body to carry out an assessment on the radio equipment proposed to go to market – this body will issue an 'EU-type examination certificate' if the radio equipment is compliant;
  • ensure the equipment is designed to meet certain requirements in relation to harmful interference and electromagnetic compatibility, and to be capable of operation in at least one EU Member State without infringing the applicable spectrum requirements in that state;
  • draw up (and keep) specified technical documentation (including an EU declaration of conformity), label the equipment with the 'CE' marking and manufacturer details, and provide instructions for use and safety information.

The RER also sets out separate obligations for authorised representatives of manufacturers, importers and distributors of the particular radio equipment. While there are clearly costs associated with compliance with RER, no regulatory fees apply.

Network and Service Provider Obligations

Whatever the type of technologies used, if an organisation is operating a communications connectivity network or providing certain services over a network, the Electronic Communications (ECS) regulatory regime will apply. This is principally enshrined in the UK in the Communications Act 2003 and the Wireless Telegraphy Act 2006 (the Acts), the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), the Security of Network & Information Systems Regulations 2018 (NIS Regulations) and interception laws (the Regulation of Investigatory Powers Act 2000 (RIPA) and the Investigatory Powers Act 2016 (IPA)).

Scope of Regulated Networks and Services

It will be important to first identify whether the activities of the organisation fall within the scope of the regime. Essentially, regardless of the specific technologies used to provide a network or service, it will be regulated under this regime if it is an Electronic Communications Network or an Electronic Communications Service.

Electronic Communications Network or ECN: in simple terms, this is a transmission system for conveying messages ('signals') of any kind. ECNs cover a wide range of networks using different technologies, including wireless networks (such as mobile or Wi-Fi networks) and cable networks (eg, IP broadband networks) whether they are operated for public or private use.

Electronic Communications Service or ECS: this is a service, the principal feature of which is the conveyance of messages by means of an ECN. The definition of an ECS excludes content services and information society services (eg, YouTube, online shopping or information services). An ECS includes phone calls, VoIP services such as Skype and text messages, as well as messaging services such as instant messaging, WeChat and Instagram.

No individual licences are required unless the technology operates via wireless spectrum. All operators and service providers are automatically authorised to provide ECNs and ECSs and there is no requirement to notify the regulator, Ofcom. While there are no upfront costs to launching an ECN or ECS, an annual administrative charge is payable if the provider’s relevant revenues meet a specified threshold (currently set at GBP5 million). If an operator requires the use of wireless frequencies, additional requirements in terms of licences, timing and fees may apply (under the Wireless Telegraphy Act 2006). However, use of frequencies to provide certain services such as Wi-Fi and RFID is licence-exempt.

Obligations on Providers

ECN and ECS providers are subject to a number of obligations under the regime. The principal obligations on providers are as listed below.

General Conditions of Entitlement: an ECN or ECS provider, even though not issued with a licence, must nevertheless comply with certain requirements specified by Ofcom. These are the 'General Conditions of Entitlement' (GCs). The GCs include a number of obligations, such as network functioning (including the use of standardised equipment and specifications), consumer protection provisions (including access to emergency services) as well as numbering and technical conditions. These will apply differently dependent on the type of network and/or end user. If the GCs are not complied with, Ofcom can prohibit the operator from continuing to provide services.

Privacy obligations: the PECR sits alongside the obligations imposed under the general UK data protection regime. It contains specific rules which apply in relation to marketing calls, e-mails, texts, cookies and similar technologies, keeping communications services secure, and customer privacy as regards traffic and location data, itemised billing, line identification and directory listings. In particular, under the PECR appropriate technical and organisational measures must be taken to safeguard the security of the service, and customers are required to be informed of any significant security risks. The EC is proposing to replace the PECR but, in the light of Brexit, it is not certain whether the UK would implement or adopt any replacement.

Providers of essential services: obligations are imposed under the NIS Regulations on operators of essential services (including water, electricity, transport, health care and financial services) who rely on networks and IT systems to provide their services. They are required to comply with more stringent security measures given the nature of their services, as well as more stringent breach reporting requirements, and there are substantial fines for breach of the regulations.

Interception: telecom operators are subject to specific obligations in relation to interception, primarily under the RIPA and the IPA. These include obligations in relation to compliance with authorised requests to intercept communications over their networks, as well as for the retention of, and access to, communications data. Note that following two cases referred to the Court of Justice of the European Union, RIPA and the IPA need to be amended to ensure compliance with EU law. The government is intending to issue a regulation to achieve this in April 2019.

The main requirements for providing TV and radio broadcasting (as well as 'TV-like' on-demand services) from the UK are set out in the Broadcasting Acts 1990 and 1996 and the Communications Act 2003. These services are principally regulated by Ofcom.

The type of regulation which applies to an audiovisual media service depends upon the type of service being provided. The BBC is principally regulated through the Royal Charter, under which it is constituted, and an Agreement with the Secretary of State for Digital, Culture, Media and Sport. These give Ofcom responsibility for regulating the content standards of the BBC's TV, radio and on demand services. Other terrestrial public service TV channels (including Channels 3, 4 and 5) are licensed by Ofcom.

There are also a variety of other television broadcasting activities licensed by Ofcom. These include the operation of a Digital Terrestrial Television Multiplex (Mux), Digital Terrestrial Television Programme Services (DTPS), Local Digital Terrestrial Programme Services (L-DTPS) and Television Licensable Content Services (TLCS) (ie, linear television channels provided via satellite, radio multiplexes and electronic communications networks, such as cable). A parallel licensing structure exists for radio. Applicants for Ofcom licences must be 'fit and proper' and must not fall within one of the excluded categories in Schedule 2 of the Broadcasting Act 1990. Ofcom licences bring with them obligations to comply with a range of rules, including rules in relation to editorial content.

The Audiovisual Media Services Directive 2010/13/EU (AVMS Directive) provides an EU-wide framework for the provision of TV broadcasting and on-demand services. In particular, it provides that, in order for an EU originated service to be authorised for reception throughout the EU, the service provider need only comply with the law and regulation of the country of origin. UK providers currently benefit from this country of origin rule.

Broadcasting Services

Mux licences and L-DTPS licences are granted under a process operated by Ofcom. DTPS and TLCS licences are generally available by application to Ofcom, though to qualify for a DTPS licence the applicant must have a carriage agreement in place with a Mux licensee. The time frame and cost for each type of licence varies. For example, Ofcom aims to issue TLCS and DTPS licences within 25 days of receipt of an application. The current application fee for both TLCS and DTPS licences is GBP2,500. The annual fee is revised each year, with the current minimum being GBP1,000 for the charging year.

On-demand and streaming services do not require broadcast licences; however, there is 'light touch' regulation of such services where they qualify as On-Demand Programme Services (ODPS) (ie, where, among other things, they are 'TV-like' and there is editorial responsibility for their content). Where there is no editorial responsibility (eg, the platform consists entirely of non-curated, user-generated video clips), the service will be regulated under the general law (including regulation of advertising by the Advertising Standards Authority) and not by Ofcom.

While there is no obligation to hold a licence, the provider of an ODPS is required to notify Ofcom before the service begins and also if the service closes or undergoes significant changes. The ODPS is subject to a more limited set of editorial rules than broadcast services and is, also, subject to co-regulation by the Advertising Standards Authority. Providers of ODPS are required to retain copies of programming and co-operate fully with Ofcom.

The EU’s Regulation on Cross-border Portability of Online Content Services in the Internal Market (2017/1128) (Portability Regulation), which has direct effect in the UK, allows a consumer who has paid for an ODPS service in the UK to access it when visiting another EU Member State (and vice versa).

Post-Brexit

The AVMS Directive and its country-of-origin principle will no longer apply to the UK if there is a no-deal Brexit. As a result, the UK government has advised service providers to assess whether their Ofcom licences would be acceptable in other EU Member States in which their content is made available and, also, whether services licensed in other EU Member States would still be lawfully available in the UK. Broadcasting services may be permissible to and from the 20 countries (including the UK) that have signed and ratified the European Convention on Transfrontier Television 1989, but that Convention is an imperfect solution. It is out of date and, for example, does not cover video-on-demand services.

In addition, following the UK's exit from the EU in a no-deal scenario, the Intellectual Property (Copyright and Related Rights) (Amendment) (EU Exit) Regulations 2018 provide that the Portability Regulation will not be preserved in UK law.

Encryption is the process of converting data into an unrecognisable form, thereby preventing unauthorised access. Data such as text, sounds or images is encoded so that it can only be accessed by those who have a key to unscramble it.

There are two commonly used methods of encryption (illustrated in the short sketch following this list), namely:

  • symmetric encryption: which uses the same key for encryption and decryption, meaning that securely transferring the key is imperative;
  • asymmetric encryption: which uses a pair of different keys (a public key to encrypt and a private key to decrypt) and underpins end-to-end encryption. WhatsApp’s end-to-end encryption, for example, is designed to ensure that only the sender and the person they are communicating with can read what is sent, and nobody in between, including WhatsApp.
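
By way of illustration only, the short sketch below shows both methods side by side using the open-source Python 'cryptography' package. The choice of library, the sample message and the variable names are assumptions made for the purposes of the example; it is a minimal sketch of the two techniques, not guidance on any particular product or configuration.

```python
# Minimal sketch: symmetric vs asymmetric encryption using the Python
# "cryptography" package (illustrative only; names and message are made up).

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

message = b"personal data in transit"

# Symmetric: one shared key both encrypts and decrypts, so the key itself
# must be exchanged securely between the parties.
shared_key = Fernet.generate_key()
f = Fernet(shared_key)
assert f.decrypt(f.encrypt(message)) == message

# Asymmetric: anyone may encrypt with the recipient's public key, but only
# the holder of the matching private key can decrypt. This key-pair model
# is what underpins end-to-end encrypted messaging.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)
ciphertext = private_key.public_key().encrypt(message, oaep)
assert private_key.decrypt(ciphertext, oaep) == message
```

The practical difference is where the secret sits: in the symmetric case the shared key must be protected whenever it is exchanged, whereas in the asymmetric case only the private key needs to stay secret, which is why key-pair techniques are used for end-to-end encrypted messaging.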

Legal Requirements for Use of Encryption

While there is no legal requirement within the UK to encrypt data, failure to do so may lead to breaches of data protection and confidentiality obligations. In particular, the UK data protection regime requires that appropriate technical and organisational measures are in place to protect data (see 6 Key Data Protection Principles, above). While encryption is specifically cited as an appropriate means to achieve this, it is not, however, a requirement. Companies therefore need to weigh the available technological developments and the costs involved when assessing whether to implement encryption. Companies should, in any event, consider encryption technology alongside other technical and organisational security measures to protect against specific risks related to managing data.

Is Encryption Appropriate?

A company’s individual circumstances will therefore be highly relevant in determining whether or not encryption is appropriate. This will be influenced by factors including the risks posed to individuals’ rights and freedoms if data is breached, the sort of processing that is undertaken and what technology is available to assist a company.

In particular, the GDPR’s express citation of encryption as an appropriate means to protect data suggests that encryption will be needed where the risks associated with protecting the data are great enough. As an example, Heathrow Airport was fined last year by the Information Commissioner’s Office (ICO) for failing to secure 'sensitive personal data' after an employee lost a memory stick containing confidential information. The ICO’s head of enforcement highlighted encryption’s importance, stating: “If this data had been encrypted then the information would have stayed secure.”

Additionally, while there are best practice methodologies, including the International Standard 27001, there is no UK legislation setting out minimum or maximum standards for encryption, nor are there licensing requirements for providers of encryption products and services.

Companies should also be aware of any relevant sector-specific guidance which may require the use of encryption in particular circumstances. For instance, the FCA guidance on data security supports the ICO’s position that it is not appropriate for customer data to be taken offsite on laptops or other portable devices which are not encrypted. The FCA further expects data backup procedures to be regularly reviewed and threats considered from all angles.

Holding encrypted data does not exempt organisations from otherwise relevant laws. Under RIPA (see 8 Scope of Telecommunications Regime, above), certain law enforcement agencies can require those holding encrypted information to produce the data in an intelligible format or to provide the key.

Additionally, the IPA (see 8 Scope of Telecommunications Regime, above) allows the Secretary of State to impose a 'technical capability notice' on service providers where it is 'necessary' and 'proportionate', provided Judicial Commissioner authorisation is obtained. A notice could include an additional obligation to remove the encryption applied on communications.

Ashurst LLP

London Fruit & Wool Exchange
1 Duval Square
London E1 6PW
United Kingdom

+44 (0)20 7638 1111

+44 (0)20 7638 1112

Amanda.Hale@ashurst.com
www.ashurst.com