Contributed By Ashurst
As there is no consolidated body of cloud computing law and regulation governing the use of cloud-based services in the UK, providers (cloud providers) and users (cloud users) will need to be aware of, and comply with, a patchwork of law and regulation.
Regulatory requirements arising from the use of cloud-based services can vary, with some industry sectors more heavily regulated than others. An example of a sector where use of the cloud is more heavily regulated is the financial services sector.
Outsourcings in financial services are regulated. Accordingly, where a cloud-based service used by a bank or other regulated financial institution amounts to a material outsourcing (broadly, an outsourcing of critical or important operational functions), additional regulatory requirements will apply.
These requirements include obligations on the cloud user relating to due diligence, supervision, record retention, risk management, audit and reporting. They are set out in the Financial Conduct Authority’s (FCA) Handbook and the Prudential Regulation Authority’s (PRA) Rulebook, and guidance is available in the form of FCA guidance and European Banking Authority recommendations (which are due to be updated in Q1 2019).
Where a cloud-based service involves the processing of personal data (including its storage or transmission), the UK data protection regime will apply.
In this context, and where personal data is managed through the cloud service, the cloud user will be the Controller and typically the cloud provider will be the cloud user’s Processor of that data. It follows that the usual Controller and Processor duties under the GDPR (see ‘The UK Data Protection Regime’ in 6.1 Core Rules Regarding Data Protection) will apply to the cloud user and the cloud provider respectively.
The nature of many cloud services, particularly those which are provided on a SaaS model (see ‘Licensing Model’ in 5.1 Specific Features) or involve multi-tenant data centres and/or infrastructure, creates particular challenges for cloud users in complying with a number of their GDPR obligations, in particular those listed below.
A UK cloud provider which offers cloud computing services in the UK is a digital service provider under the NIS Regulations and subject to their requirements (unless it is a micro or small business). For further details on the NIS Regulations, see 8 Scope of Telecommunications Regime. A cloud computing service for these purposes is any service which enables access to a scalable and elastic pool of shareable physical or virtual computing resources. It covers SaaS, IaaS and PaaS products, as well as e-mail and online storage services if they are scalable.
Businesses seeking to benefit from the efficiencies of multiple access and transaction settlement capabilities offered by blockchain technology will generally seek to operate a private blockchain due to the limitations and risks associated with public blockchains.
However, there are a number of legal challenges to launching or using a private blockchain in the UK, including, in particular, risk and liability, intellectual property, data privacy, service levels and jurisdictional issues.
It is essential for the participants in a private blockchain to establish a comprehensive and cohesive contractual framework that clearly allocates risk and liability between the parties under each arrangement, in particular:
All agreements or terms should be aligned so as to avoid any potential ambiguity or conflict. Given the issues over liability, any business adopting blockchain technologies should establish strong governance, risk management strategies and frameworks of control.
IP may be created in a number of elements making up a blockchain-based application, including the source and object code, the application programming interface, as well as the graphical user interface and the database itself. Where the blockchain technology and/or its application is proprietary, consideration can be given to obtaining a patent. While there is a high bar for patentability, many businesses have obtained (or are seeking) patent protection for their implementation of the technology.
In many cases, a generic blockchain will be used by participants to register many different kinds of documents and transactions, involving both non-personal data and personal data. Where personal data is involved, the application of data protection regimes raises issues, in particular those listed below.
Careful consideration should be given to privacy governance arrangements (including roles of the participants, off-chain storage of data, controlling permissible access), and to setting these out in the agreements between the parties.
For further details on the UK data protection regime, see section 6 Key Data Protection Principles, below.
Choice of law and dispute resolution should be considered at the outset, given the cross-border nature of blockchain. The parties should include an express governing law clause in their arrangements. However, applicable local requirements in other jurisdictions may nevertheless apply depending on, for example, the location of the participants and end users, or where the activities and assets of the business are located. In the UK, consumer and data protection law would apply, for example. As a result, following the most restrictive local rules may be the most cautious approach. A more practical approach may be to employ geo-blocking or similar technological measures. Ultimately, a risk-based approach will need to be taken and all agreements and terms should be clear as to what laws apply.
The issues around big data, AI and machine learning are challenging, and any project will involve an inherent level of legal risk.
See 3.3 Artificial Intelligence, below.
Artificial intelligence (AI) systems are (in the most basic terms) designed to ingest substantial amounts of data (input data) through the operation of software (incorporating complex algorithms/code) and subsequently make informed decisions in respect of output data (AI-generated work), without the need for human input. In machine learning it may be impossible to understand how the AI arrived at a particular decision.
Causation, Liability and Insurance
Attribution of liability is traditionally driven by causation (ie, once the cause has been determined, blame can be assigned), whether in contract or tort (and up to a point in consumer protection law). This will be clear in relation to machine decisions traceable back to defective programming or incorrect operation, but becomes challenging given the 'black box' nature of AI, where causation is often inexplicable or cannot be traced back to human error.
No legislation has been introduced in the UK to clarify the AI liability regime, other than in relation to fully autonomous driving (under the Automated and Electric Vehicles Act 2018), where the legislation addresses gaps in currently available insurance. In the absence of any agreement between the parties, the courts will need to determine fault among product manufacturers/sellers, AI designers/suppliers and AI purchasers/users. The best approach, therefore, is to consider contractual warranties, indemnities and limitations for each of these organisations.
The application of the Consumer Protection Act 1987 to AI-embedded devices will also raise issues. The European Commission (EC) is currently reviewing the Product Liability Directive (which is the basis of the current UK legislation) in order to reflect the developments of digital technologies, including AI. The uncertainty over Brexit, however, means that it is not clear whether any such changes would be brought into force in the UK.
In the light of the potential uncertainties, in addition to contractual remedies, companies should consider how to demonstrate their AI's decision-making process. This includes consideration of how to document and prove that a function was performed, or a decision was made, as a result of reasonable programming that at the time met current industry standards or best practices. Insurance coverage should also be sought where available.
It is well established that algorithms and software may, in principle, be protected under the law of copyright and as confidential information. Patenting of software-related inventions is more difficult in Europe than in the USA, due to the European exclusion from patentability of computer programs 'as such', but it is nevertheless possible to obtain patent protection for inventions embodied in software. It is recommended that ownership and right-to-use issues are, where possible, clarified by appropriate contractual provisions, rather than relying on statutory principles which can be difficult to apply in practice.
There are particular issues around the subsistence and ownership of IP in works generated by AI. So far as copyright is concerned, it is unclear how the concept of originality applies to such works, particularly under European law, with its insistence that copyright works be 'the author’s own intellectual creation', which suggests that a human author is required. Determining the ownership of copyright in AI-generated works is also potentially problematic, with a few jurisdictions, such as the UK, making express provision for the ownership of 'computer-generated' works, while in others the absence of a human author may make it difficult (if not impossible) to identify the owner. Again, express contractual provisions dealing with such situations are highly recommended.
In the context of use of an AI system, large amounts of personal data may be collected (eg, to profile users and produce predictions on their behaviours). This was given due consideration as part of the development of the UK data protection regime (for further details see section 6 Key Data Protection Principles, below). As such, a business must ensure it has an appropriate data governance framework in place to maintain its compliance with the regime in the design and implementation of its AI system, including appropriate fair processing notices to data subjects to comply with transparency requirements.
Bias and Non-discrimination
The tension between the advantages of AI systems and the risks they may present for fundamental rights is particularly evident in the field of non-discrimination.
Non-discrimination is a fundamental right which protects individuals from discrimination on grounds such as sex, race, colour, language and political opinion. AI systems depend on input data being accurate and representative; it has been seen that data sets can (often inadvertently) contain bias or be skewed in a particular direction, which will discriminate against individuals. The lack of diversity and inclusion in such AI systems is therefore a key concern. Businesses must ensure that input data is accurate and representative, and that they have the appropriate mechanisms in place to recognise and address bias within their AI system. They should also consider putting in place diverse AI technical teams (as recommended by the European Union’s Ethics Guidelines, discussed below).
The European Union (EU) (specifically the high-level expert group on AI) has released Ethics Guidelines in relation to Trustworthy AI, covering three main components which should be met throughout the system's entire life cycle, namely that the AI should be lawful, ethical and robust. These non-binding guidelines set out a human-centric approach to the design and implementation of Trustworthy AI.
The guidance covers four ethical principles being (1) respect for human autonomy, (2) prevention of harm, (3) fairness and (4) explicability, and seven key requirements to realise Trustworthy AI, being (1) human agency and oversight, (2) technical robustness and safety, (3) privacy and data governance, (4) transparency, (5) diversity, non-discrimination and fairness, (6) environmental and societal well-being and (7) accountability.
From a practical perspective, the guidelines state that an organisation should, during its development, deployment and use of AI systems, put in place the non-exhaustive Trustworthy AI assessment list and adapt it to the specific use case in which the system is being applied; noting that a revised assessment list may be released in 2020 following further feedback on the guidelines. An organisation should continuously identify and implement the seven requirements, evaluate solutions and improve outcomes throughout the AI system's life cycle and involve stakeholders in this process. From a reputational perspective, organisations may want to consider signing up to the guidelines to promote customer confidence in their AI systems.
The Internet of Things (IoT) is essentially a network of devices (such as home appliances, electricity meters and cars) that are connected and sense, actuate, interact and exchange data. While there is currently no specific IoT regulation which governs the use or provision of IoT devices and services in a consolidated manner, the sponsors of any IoT project will need to consider the application of the regulatory regimes which apply to its key elements, namely devices, connectivity networks and data.
The IoT is in a state of development and one of the key obstacles to the growth of the ecosystem (eg, the creation of smart cities and smart homes) is the lack of common technical standards for interoperability of the devices and the connectivity networks, as well as in relation to data management and security. The EC is seeking to address this in its Rolling Plan for ICT Standardisation, although it is still a work in progress. Any standards developed in this context, if made compulsory, would need to be complied with. Post-Brexit, British Standards developed by the British Standards Institution (BSI) would apply (however, these are likely to reflect European Standards as the BSI has secured a continued role in the relevant European Standards organisations post-Brexit).
In addition, the EC is currently considering introducing an EU-wide voluntary security certification standard for IoT devices to address consumer trust and confidence issues. In the light of Brexit, it is not clear whether the UK will adopt an equivalent certification programme.
Depending on the devices being connected, a number of obligations will apply in relation to their design, manufacture, import, distribution and sale on the UK market. For further details, see section 8 Scope of Telecommunications Regime, below.
IoT devices are generally connected through wireless networks (although they can be connected by cable) and the operator of the network or provider of the connectivity service (such as mobile or low power/wide area (LPWA) Wi-Fi network services, for example), is subject to obligations under the Electronic Communications regulatory regime. For further details, see section 8 Scope of Telecommunications Regime, below.
Data Privacy and Cybersecurity
IoT devices generate and transmit big data processed in the cloud and this raises additional data privacy and security risks. To the extent that personal data is involved, the UK data protection regime will apply (see section 6 Key Data Protection Principles, below). However, there are additional requirements in relation to the security of data where the data is transmitted over connectivity networks. These are set out in the PECR (for further details on the PECR see section 8 Scope of Telecommunications Regime, below).
In addition, the Department for Digital, Culture, Media & Sport has published a Code of Practice for Consumer IoT Security (October 2018). The code is non-binding but sets out guidelines considered good practice for IoT security. These include ensuring the secure storage of credentials and sensitive data, prompt software updates and resilience of IoT devices to network and power outages. An IoT provider should consider implementing these.
As the ecosystem of interconnected devices grows, there is a recognition that the source of defects and faults will become difficult to identify and, as a result, it will be difficult to determine where liability lies. The usual rules of contract and negligence will apply alongside the Consumer Protection Act 1987, but their application is not always clear. The EC is keen to see changes to the Product Liability Directive 1985 to reflect the development of digital technologies, including IoT technology. The uncertainty over Brexit means that it is uncertain whether any such changes would be brought into force in the UK.
An organisation procuring an IT solution to assist in the development of its business can face numerous challenges. With the rise to prominence of digitalisation across all industries, a number of key developments have changed the landscape of technology procurement: in particular, the prominence of cloud computing, the need for speed to market, market fragmentation with the advent of tech vendor start-ups, the rate of technological change and the increasing complexity of IT ecosystems. The combination of these factors is creating a shift towards shorter procurements, procurement of software through 'XXX as a Service' solutions and service integration and management (SIAM) contracts.
Most challenges and risks can be mitigated by careful consideration of the organisation’s current and future business requirements during the scoping and negotiation phase.
In particular, the organisation should consider:
The answers to the questions above will directly inform the best method of procurement.
The traditional licensing model, where an organisation pays a periodic fee under a licence agreement for software which it retains on premises, is increasingly less common. Software suppliers are finding greater traction in the software as a service (SaaS) model, whereby centrally hosted software (and possibly other IT infrastructure) is licensed on a subscription basis or as part of a service solution. The SaaS model does not necessarily require the customer to license the software itself, since it can instead enter into a service agreement with the supplier which is fulfilled through the use of the software application. In a SaaS model the customer’s data is transferred to, and held by, the supplier, raising risks around data confidentiality, security and privacy for personal data, all of which need to be addressed in the contractual terms.
While the focus of the procurement has changed, the key issues remain relatively constant. The agreement should contemplate and address the key issues listed below.
As businesses adapt to the digital world and become increasingly tech-centric, IT agreements become substantially more complex.
It is fundamental that the scope of the IT service agreement is clearly defined, the key issues are addressed and that the terms are well drafted.
See 5.1 Specific Features, above.
The UK data protection regime regulates the treatment of information which relates to an identified or identifiable natural person (personal data) by entities that determine the means and purposes of processing (controllers), and entities that carry out processing on behalf of a controller (processors). It does not apply to corporate data.
The purpose of this section is restricted to explaining the core principles underlying the UK data protection regime. There are, however, other sectoral laws which also apply to the processing of personal data and data more generally, including in the telecoms industry and in relation to interception (for further details see section 8 Scope of Telecommunications Regime, below) as well as freedom of information requests.
The UK Data Protection Regime
In the UK, the data protection regime is primarily set out in the General Data Protection Regulation (EU) 2016/679 (GDPR) and the Data Protection Act 2018 (DPA 2018).
Application and Scope
Personal Data is any information which relates to an identified or identifiable natural person (a data subject), with identifiability assessed objectively. This is a broad definition; specific examples include an individual’s name, identification number, location data, online identifiers or factors specific to their physical, physiological, genetic, mental, economic, cultural or social identity.
Controllers and Processors: the regime imposes direct obligations on both Controllers and Processors. Processors have direct liability under the DPA 2018 in relation to any processing in breach of obligations specifically imposed on Processors under the DPA 2018, or caused by processing that is outside, or contrary to, the lawful instructions of the Controller. Controllers/Processors will generally allocate liability as between them in a contract.
Processing: the regime governs processing of personal data by 'automated means' (wholly or partly) or non-automated processing, where the personal data forms (or is intended to form) part of a filing system. Processing is defined broadly, capturing most activities conducted on personal data, including collecting, recording, organising/structuring, storing, adapting, altering, retrieving, using, transmitting/sharing, combining, restricting, erasing or destroying the data.
In the UK the DPA 2018 and the GDPR regulate the use of personal data (as defined above), including employee personal data.
Non-personal data (ie, any data which does not fall within the definition of personal data under Article 4 of the GDPR) is regulated by Regulation (EU) 2018/1807 of the European Parliament and of the Council on a framework for the free flow of non-personal data in the European Union (which will have effect from May 2019) and, depending on the type of data, other sectoral laws may also be applicable.
The regulatory framework of non-personal data is not the focus of this summary.
Processing of personal data must meet the fundamental data protection principles listed below.
Controllers must demonstrate compliance with the above principles. In practice this means understanding what personal data they process, and creating and following internal data governance processes and procedures to enable compliance. The GDPR sets out the specific obligations of a Controller or Processor (including mandatory breach reporting requirements, international transfer restrictions, requirements for technical and organisational measures to protect personal data, maintaining registers of processing, implementing policies, etc), and provides a framework for enforcement and data subject rights.
International Data Transfers
Under the GDPR, transfers of personal data to countries outside the EEA are regulated and restricted in some cases. Transfers of personal data may be made to a recipient in a third country about which the Commission has made an 'adequacy decision' (ie, it has decided the country has an adequate level of data protection), or if the data exporter has implemented appropriate safeguards (commonly in the form of model clauses and/or binding corporate rules), or if an exemption or derogation otherwise applies.
The UK government has confirmed that post-Brexit the DPA 2018 will remain in place, and the GDPR will be incorporated into UK law. Nevertheless, issues will need to be dealt with in relation to international data transfers, authorised representatives and EU cross-border infringement/enforcement action.
There is now a vast array of tools available to businesses to monitor the use of company IT resources by staff during work hours.
Legitimate reasons for deploying such tools are numerous, including promoting productivity, detecting data breaches and gauging staff well-being. However, the law regarding monitoring employee use of computer resources is complex, and businesses need to give careful consideration to a number of issues before proceeding to engage monitoring tools in their day-to-day operations.
The UK does not have dedicated laws in place concerning employer monitoring of employee activities and it is neither expressly permitted nor prohibited. Instead, the legal position derives from a number of different sources. This section focuses on the interrelated matters of the right to privacy under the European Convention on Human Rights (ECHR), and the restrictions imposed on monitoring by the UK data protection regime.
The ECHR establishes an individual right to respect for private and family life, home and correspondence. While this right is not absolute, it is a countervailing factor that needs to be balanced against the interests that an employer might be seeking to pursue by undertaking monitoring activities.
The implementation of monitoring practices requires the careful consideration of a number of factors, such as the intrusiveness of the activity on employees’ privacy and the extent to which employees have been informed about the nature and scope of the activity.
On the assumption that monitoring will almost certainly involve processing the personal data of employees, the UK data protection regime is relevant to monitoring activities because it imposes obligations on how and when such processing can occur (for further details on the DPA 2018, see 6 Key Data Protection Principles, above).
The fundamental principle that such processing can only be carried out where there is a lawful basis for it needs to be considered. In the context of monitoring activities, the principal legal bases on which an employer is likely to rely are that the processing is necessary for one of the following reasons:
The first reason permits processing so long as those legitimate interests are not overridden by the interests of the employees. The second reason would apply where it is necessary for an employer to monitor an employee’s e-mails in order to comply with its legal obligations; for example, for the purpose of determining whether there had been a breach of insider trading laws. However, it would be unusual for such a reason to justify systematic monitoring, as opposed to a short-term measure in response to a particular issue.
In order to take account of the right to privacy and data protection issues referred to above, the following key matters should be considered by any business looking to introduce monitoring activities:
Thorough consideration of the above matters prior to introducing any monitoring practices, including the performance of a data protection impact assessment, should help to ensure that businesses are able to reap the benefits of monitoring tools without falling foul of the legal protections for individuals that are in place.
The UK telecoms rules were intentionally designed to be technology neutral, although certain aspects only apply specifically to wireless technologies. It is rather the nature of the activities undertaken in relation to the devices/equipment, networks and services which define what requirements need to be complied with.
Requirements Prior to Bringing a Product/Service to the Market
The manufacturers, distributors and sellers of equipment and devices (whether telecoms network equipment or consumer devices such as smart phones, RFID tags and digital assistants) are required to comply with a number of regulations; in particular, the Waste Electrical and Electronic Equipment Regulations 2013 (WEEE) and the Radio Equipment Regulations 2017 (RER).
A business that places telecoms and IT electrical and electronic equipment (and certain other equipment) on the UK market for the first time by importing, manufacturing, rebranding or selling to end users in the UK, will be subject to obligations (set out in the WEEE) in relation to the financing of the collection, recovery and recycling of waste. It will also be required to register with the Environment Agency directly or via a Producer Compliance Scheme.
In addition and specifically in relation to wireless equipment, while no direct regulatory approval is required, the RER imposes a number of requirements on manufacturers in relation to the design and manufacture of the equipment. Before bringing a product onto the market, a manufacturer must:
The RER also sets out separate obligations for authorised representatives of manufacturers, importers and distributors of the particular radio equipment. While there are clearly costs associated with compliance with RER, no regulatory fees apply.
Network and Service Provider Obligations
Whatever the type of technologies used, if an organisation is operating a communications connectivity network or providing certain services over a network, the Electronic Communications (ECS) regulatory regime will apply. This is principally enshrined in the UK in the Communications Act 2003 and the Wireless Telegraphy Act 2006 (the Acts), the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR), the Network and Information Systems Regulations 2018 (NIS Regulations) and interception laws (the Regulation of Investigatory Powers Act 2000 (RIPA) and the Investigatory Powers Act 2016 (IPA)).
Scope of Regulated Networks and Services
It will be important to first identify whether the activities of the organisation fall within the scope of the regime. Essentially, regardless of the specific technologies used to provide a network or service, it will be regulated under this regime if it is an Electronic Communications Network or an Electronic Communications Service.
Electronic Communications Network or ECN: in simple terms, this is a transmission system for conveying messages ('signals') of any kind. ECNs cover a wide range of networks using different technologies, including wireless networks (such as mobile or Wi-Fi networks) and cable networks (eg, IP broadband networks) whether they are operated for public or private use.
Electronic Communications Service or ECS: this is a service, the principal feature of which is the conveyance of messages by means of an ECN. The definition of an ECS excludes services which are content services and information society services (eg, YouTube, online shopping or information services). An ECS will include phone calls, VoIP services such as Skype, and text messages, but also messaging services such as instant messaging, WeChat and Instagram.
No individual licences are required unless the technology operates wirelessly. All operators and service providers are automatically authorised to provide ECNs and ECSs and there is no requirement to notify the regulator, Ofcom. While there are no upfront costs to launching an ECN or ECS, an annual administrative charge is payable if the provider’s relevant revenues meet a specified threshold (currently set at GBP5 million). If an operator requires the use of wireless frequencies then additional requirements in terms of licences, timing and fees may apply (under the Wireless Telegraphy Act 2006). However, use of frequencies to provide certain services such as Wi-Fi and RFID is licence-exempt.
Obligations on Providers
ECN and ECS providers are subject to a number of obligations under the regime. The principal obligations on providers are as listed below.
General Conditions of Entitlement: an ECN or ECS provider, even though not issued with a licence, must nevertheless comply with certain requirements specified by Ofcom. These are the 'General Conditions of Entitlement' (GCs). The GCs include a number of obligations, such as network functioning (including the use of standardised equipment and specifications), consumer protection provisions (including access to emergency services) as well as numbering and technical conditions. These will apply differently dependent on the type of network and/or end user. If the GCs are not complied with, Ofcom can prohibit the operator from continuing to provide services.
Privacy obligations: the PECR sits alongside the obligations imposed under the general UK data protection regime. It contains specific rules which apply in relation to marketing calls, e-mails, texts, cookies and similar technologies, keeping communications services secure, as well as customer privacy as regards traffic and location data, itemised billing, line identification and directory listings. In particular, under the PECR appropriate technical and organisational measures must be taken to safeguard the security of the service, and customers are required to be informed of any significant security risks. The EC is proposing to replace the PECR but, in the light of Brexit, it is not certain whether the UK would implement or adopt any replacement.
Providers of essential services: obligations are imposed under the NIS Regulations on operators of essential services (including water, electricity, transport, health care and financial services) who rely on networks and IT systems to provide their services. They are required to comply with more stringent security measures given the nature of their services, as well as more stringent breach reporting requirements and there are substantial fines for breach of the regulations.
Interception: telecom operators are subject to specific obligations in relation to interception, primarily under the Regulation of Investigatory Powers Act 2000 (RIPA) and the Investigatory Powers Act 2016 (IPA). These include obligations to comply with authorised requests to intercept communications over their networks, as well as for the retention of, and access to, communications data. Note that, following two cases referred to the Court of Justice of the European Union, RIPA and the IPA need to be amended to ensure compliance with EU law. The government intends to issue regulations to achieve this in April 2019.
The main requirements for providing TV and radio broadcasting (as well as 'TV-like' on-demand services) from the UK are set out in the Broadcasting Acts 1990 and 1996 and the Communications Act 2003. These services are principally regulated by Ofcom.
The type of regulation which applies to an audiovisual media service depends upon the type of service being provided. The BBC is principally regulated through the Royal Charter, under which it is constituted, and an Agreement with the Secretary of State for Digital, Culture, Media and Sport. These give Ofcom responsibility for regulating the content standards of the BBC's TV, radio and on-demand services. Other terrestrial public service TV channels (including Channels 3, 4 and 5) are licensed by Ofcom.
There are also a variety of other television broadcasting activities licensed by Ofcom. These include the operation of a Digital Terrestrial Television Multiplex (Mux), Digital Terrestrial Television Programme Services (DTPS), Local Digital Terrestrial Programme Services (L-DTPS) and Television Licensable Content Services (TLCS) (ie, linear television channels provided via satellite, radio multiplexes and electronic communications networks, such as cable). A parallel licensing structure exists for radio. Applicants for Ofcom licences must be 'fit and proper' and must not fall within one of the excluded categories in Schedule 2 of the Broadcasting Act 1990. Ofcom licences bring with them obligations to comply with a range of rules, including rules in relation to editorial content.
The Audiovisual Media Services Directive 2010/13/EU (AVMS Directive) provides an EU-wide framework for the provision of TV broadcasting and on-demand services. In particular, it provides that, in order for an EU originated service to be authorised for reception throughout the EU, the service provider need only comply with the law and regulation of the country of origin. UK providers currently benefit from this country of origin rule.
Mux licences and L-DTPS licences are granted under a process operated by Ofcom. DTPS and TLCS licences are generally available on application to Ofcom, though to qualify for a DTPS licence the applicant must have a carriage agreement in place with a Mux licensee. The time frame and cost for each type of licence vary. For example, Ofcom aims to issue TLCS and DTPS licences within 25 days of receipt of an application. The current application fee for both TLCS and DTPS licences is GBP2,500, and the annual fee is revised each charging year, with the current minimum being GBP1,000.
On-demand and streaming services do not require broadcast licences; however, there is 'light touch' regulation of such services where they qualify as On-Demand Programme Services (ODPS) (ie, where, among other things, they are 'TV-like' and there is editorial responsibility for their content). Where there is no editorial responsibility (eg, the platform consists entirely of non-curated, user-generated video clips), the service will be regulated under the general law (including regulation of advertising by the Advertising Standards Authority) and not by Ofcom.
While there is no obligation to hold a licence, the provider of an ODPS is required to notify Ofcom before the service begins and also if the service closes or undergoes significant changes. The ODPS is subject to a more limited set of editorial rules than broadcast services and is also subject to co-regulation by the Advertising Standards Authority. Providers of ODPS are required to retain copies of programming and co-operate fully with Ofcom.
The EU’s Regulation on Cross-border Portability of Online Content Services in the Internal Market (2017/1128) (Portability Regulation), which has direct effect in the UK, allows a consumer who has paid for an ODPS service in the UK to access it when visiting another EU Member State (and vice versa).
The AVMS Directive and its country-of-origin principle will no longer apply to the UK if there is a no-deal Brexit. As a result, the UK government has advised service providers to assess whether their Ofcom licences would be acceptable in other EU Member States in which their content is made available and, also, whether services licensed in other EU Member States would still be lawfully available in the UK. Broadcasting services may be permissible to and from the 20 countries (including the UK) that have signed and ratified the European Convention on Transfrontier Television 1989, but that Convention is an imperfect solution. It is out of date and, for example, does not cover video-on-demand services.
In addition, following the UK's exit from the EU in a no-deal scenario, the Intellectual Property (Copyright and Related Rights) (Amendment) (EU Exit) Regulations 2018 provide that the Portability Regulation will not be preserved in UK law.
Encryption is the process of converting data into an unrecognisable form, thereby preventing unauthorised access. Data such as text, sounds or images is encoded so that it can only be accessed by those who have a key to unscramble it.
There are two commonly used methods of encryption: symmetric encryption, in which the same key is used both to encrypt and to decrypt the data; and asymmetric (or public key) encryption, in which data encrypted with a publicly available key can only be decrypted with a separate, private key.
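By way of illustration only, the symmetric model can be sketched in Python as a toy one-time pad, where a single random key as long as the message both scrambles and unscrambles the data. This is a conceptual sketch, not production cryptography; real-world systems use vetted algorithms such as AES, typically via an established cryptographic library.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of the data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

plaintext = b"cardholder record"
# One-time pad: the key must be truly random and as long as the message
key = secrets.token_bytes(len(plaintext))

ciphertext = xor_bytes(plaintext, key)  # unintelligible without the key
recovered = xor_bytes(ciphertext, key)  # the same key unscrambles the data

assert recovered == plaintext
```

Asymmetric encryption differs in that the encrypting (public) key cannot reverse the process; only the separate private key can, which is why the statutory disclosure powers discussed below focus on whoever holds the decryption key.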
Legal Requirements for Use of Encryption
While there is no legal requirement within the UK to encrypt data, failure to do so may lead to breaches of data protection and confidentiality obligations. In particular, the UK data protection regime requires that appropriate technical and organisational measures are in place to protect personal data (see 6 Key Data Protection Principles, above). While encryption is specifically cited as an appropriate means of achieving this, it is not a requirement. Companies therefore need to weigh the state of the art and the costs of implementation when assessing whether to encrypt, and should, in any event, consider encryption technology alongside other technical and organisational security measures to protect against the specific risks involved in managing data.
Is Encryption Appropriate?
A company’s individual circumstances will therefore be highly relevant in determining whether or not encryption is appropriate. This will be influenced by factors including the risks posed to individuals’ rights and freedoms if data is breached, the type of processing undertaken and the technology available to the company.
In particular, the GDPR’s express citation of encryption as an appropriate means of protecting data suggests that encryption will be expected where the risks to that data are sufficiently high. As an example, Heathrow Airport was last year fined by the Information Commissioner’s Office (ICO) for failing to secure 'sensitive personal data' after an employee lost a memory stick containing confidential information. The ICO’s head of enforcement highlighted encryption’s importance, stating: “If this data had been encrypted then the information would have stayed secure.”
Additionally, while there are best practice methodologies, such as the international standard ISO/IEC 27001, there is no UK legislation setting out minimum or maximum standards for encryption, nor are there licensing requirements for providers of encryption products and services.
Companies should also be aware of any relevant sector-specific guidance which may require the use of encryption in particular circumstances. For instance, the FCA guidance on data security supports the ICO’s position that it is not appropriate for customer data to be taken offsite on laptops or other portable devices which are not encrypted. The FCA further expects data backup procedures to be regularly reviewed and threats considered from all angles.
Holding encrypted data does not exempt organisations from otherwise relevant laws. Under RIPA (see 8 Scope of Telecommunications Regime, above), certain law enforcement agencies can require those holding encrypted information to produce the data in an intelligible format or to provide the key.
Additionally, the IPA (see 8 Scope of Telecommunications Regime, above) allows the Secretary of State to impose a 'technical capability notice' on service providers where it is 'necessary' and 'proportionate', provided Judicial Commissioner authorisation is obtained. A notice could include an obligation to remove encryption applied to communications.