Over the past decade, many businesses have moved away from traditional software, hardware and infrastructure arrangements in favour of flexible, dynamic 'as a service' offerings.
Generally speaking, contracts in Australia are governed by the common law; however, there are a number of statutory laws, regulations and standards that (directly or indirectly) regulate the substance, validity or execution of contracts. These include:
There are also a number of industry-specific laws and standards in Australia that are particularly relevant to IT services agreements and cloud computing agreements. For example, banks and similar financial institutions, insurance companies and superannuation funds (ie, pension funds) will need to comply with certain mandatory regulatory standards that have implications for their IT and cloud services arrangements. These include:
Mandatory Laws or Specific Legal Exclusions
Unfair contract terms
Under the unfair contracts regime in the ACL, a term of a consumer contract or small business contract will be void if the term is unfair and the term is found in a standard form contract. This regime potentially applies to many cloud computing services. For instance, an end user licence agreement (EULA) fits all of the criteria relevant for the unfair contracts regime to apply, namely:
In a contract for cloud computing services, a term is unfair if it would cause a significant imbalance in the parties’ rights and obligations arising under the contract; it is not reasonably necessary to protect the legitimate interests of the party advantaged by the term; and it would cause detriment (whether financial or otherwise) if relied upon. Some specific examples of clauses that have been flagged by the Australian Competition and Consumer Commission (ACCC) as potentially being unfair include unilateral termination rights, unreasonable limitations of liability, wide indemnities and automatic renewal clauses.
The ACCC or an affected party may commence proceedings and obtain a declaration that a term is unfair. If a term is declared unfair, it is void in every contract containing it with a business or consumer that has suffered loss as a result; each aggrieved party does not need to separately challenge the term's validity. This can be a significant risk for businesses that have thousands of customers on the same standard form contract.
Non-excludable Consumer Guarantees
The ACL sets out some non-excludable consumer guarantees that apply to the supply of goods and services in Australia. Any attempt to contract out of these consumer guarantees is invalid (and can leave a party exposed to a claim of misleading and deceptive conduct), although a supplier can limit their liability for a breach of these consumer guarantees.
Under this regime, a person is a 'consumer' when they acquire goods or services costing up to AUD40,000, or goods or services costing more than AUD40,000 that are of a kind ordinarily acquired for personal, domestic or household use. This means that smaller-value business contracts can be subject to this regime. Relevantly for the supply of cloud or other IT services, the consumer guarantees include guarantees that services will be provided with due care and skill, will be reasonably fit for any purpose specified by the consumer and will be supplied within a reasonable time.
Processing Personal Information
In Australia, organisations that enter into a contract that involves the collection or disclosure of personal information should have regard to the Australian Privacy Principles (APPs), which form part of the Privacy Act 1988 (Cth) (Privacy Act). The Privacy Act and related regulations apply to Australian government agencies and to organisations that are registered in, or conduct business in, Australia with an annual turnover of more than AUD3 million, or that hold health information, trade in personal information or conduct a credit reporting business.
In contrast to the data protection laws of the EU, the Privacy Act does not distinguish between 'data controllers' and organisations that merely act as 'data processors' on behalf of other organisations. This difference can be challenging for European organisations entering into IT services agreements with Australian businesses. In Australia, all entities that meet one of the criteria set out above must comply with the full range of obligations under the Privacy Act, as set out in the data protection section of this guide.
Any organisation using an overseas cloud storage provider should consider whether the provision of data to that provider constitutes a cross-border disclosure under APP 8, which would oblige the organisation to take reasonable steps to ensure that the cloud storage provider complies with the APPs, or whether it is merely a 'use' for which such steps are not required. This will depend on the nature of the particular arrangement with the cloud storage provider, including the extent to which the storage provider has any control over the data.
Notably, the 'use' exception has only been articulated in guidance published by the OAIC (rather than being expressly set out in the statute). Given this, and because it can be relied on only in very limited circumstances, cloud and IT services agreements involving the offshore disclosure of personal information often include, at a minimum, an obligation on the provider to comply with the APPs, protect the information and notify the organisation in the event of a data breach.
Securing Personal Information
The OAIC has published a Guide to Securing Personal Information to assist organisations to comply with APP 11, which requires that organisations take reasonable steps to protect personal information from loss, misuse and unauthorised access and disclosure (among other things). The guide suggests that organisations that enter into cloud computing arrangements should ensure that their contracts include security controls, reporting obligations, and measures to ensure that eligible and suspected data breaches are assessed and notified in accordance with the mandatory data breach notification regime established under Part IIIC of the Privacy Act (known as the NDB scheme).
Where personal information has been disclosed to a cloud provider and that cloud provider suffers a data breach, both the discloser and the cloud provider will be required to assess the breach and (if it is an 'eligible' data breach), make the relevant notifications. That said, compliance by one entity will be taken to constitute compliance by both entities.
At its most basic, a blockchain can be described as a decentralised or distributed database, which provides a shared and verified record of transactions or data. There are many different kinds of 'blockchain' project, each with its own set of legal issues. One fundamental legal consideration is whether a blockchain network is open/'public' or closed/private. A public blockchain is more likely to raise issues of confidentiality and privacy, as all data and transactions on the chain are viewable. If a blockchain requires confidentiality between network members or parties, a private, permissioned chain may be more appropriate, ensuring data is viewable only by a subset of participants. The use of public blockchains also raises challenging questions of liability, as it can be difficult (or impossible) to identify any party responsible for the network and its operation.
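To make the 'shared and verified record' concrete, the tamper-evident structure of a blockchain can be sketched in a few lines of Python. This is a simplified illustration only (real networks add consensus, signatures and peer-to-peer replication); all names here are hypothetical.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list, data: str) -> None:
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def verify_chain(chain: list) -> bool:
    """Any retrospective edit breaks every later block's prev_hash link."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, "Alice pays Bob 5 units")
append_block(chain, "Bob pays Carol 2 units")
assert verify_chain(chain)

# Tampering with an earlier record is detectable by every participant
# holding a copy of the chain.
chain[0]["data"] = "Alice pays Bob 500 units"
assert not verify_chain(chain)
```

On a public chain, every participant can run this kind of verification over every record, which is precisely why all data on the chain is viewable and confidentiality becomes a design problem.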
It is also important to consider that blockchains are often transnational, and enable data to be held in multiple jurisdictions simultaneously. As such, legal jurisdiction for certain actions, or digital assets, can be challenging to identify. This is particularly the case in public chains, which often express themselves to be 'jurisdictionless'. The boundaries of the application of Australian law to blockchain activities therefore need to be considered on a project-by-project basis.
Recently, regulators in Australia have been closely considering the manner in which existing frameworks for banking, commodities, securities and consumer protection regulate the use of cryptocurrencies and crypto assets, which are often utilised in blockchain implementations or are native to blockchain networks. For example, recent amendments to the Anti-Money Laundering and Counter-Terrorism Financing Act 2006 mean that digital currency exchanges (ie, exchanges that convert cryptocurrency into fiat or vice-versa) are subject to obligations such as enrolling with AUSTRAC and conducting customer due diligence.
The corporate regulator, the Australian Securities and Investment Commission (ASIC), has been monitoring this space closely. To date, it has focused its attention on consumer protection and ensuring that any individuals or organisations using blockchain technology (and associated smart contracts or cryptocurrencies and crypto assets) are not misleading consumers. ASIC has also made public pronouncements in this space to provide guidance to the market. See for example, INFO 219 (in relation to start-ups that are considering operating market infrastructure, or providing financial or consumer credit services, using distributed ledger technology) and INFO 225 (guidance in relation to initial coin offerings and cryptocurrency).
Financial services regulation
It is critical for any project which involves the trading or management of crypto assets to consider the potential application of existing financial services regulation (as set out in Chapter 7 of the Corporations Act 2001 (Cth), and other relevant legislation such as the Payment Systems (Regulation) Act 1998 (Cth)). While ASIC has issued guidance that it does not consider the cryptocurrency bitcoin to be a financial product or service, many other projects may potentially fall within the existing regulatory framework.
The Australian Taxation Office (ATO) has also been monitoring the use of cryptocurrencies. In 2017, amendments to the GST Act were made to treat cryptocurrency in a manner similar to money, rather than goods. Recent ATO statements suggest that it may take a more interventionist approach in the near future when cryptocurrencies are used to evade tax.
Blockchain technology is also increasingly being combined with 'smart contracts', to create automatically executing computer code that is hosted on blockchain networks, with the data from the execution of the smart contracts stored on the blockchain. Smart contracts offer the potential to automatically execute aspects of contractual relationships and obligations. This raises the question of how they connect with 'legal' contracts. On their own, smart contracts have a number of limitations. For example, smart contracts do not work well for non-operational clauses (eg, a clause providing that a party must act in good faith). The interaction between smart contract code and overarching legal agreements, or the extent to which the code embodies an agreement, will be ongoing questions to consider in the implementation of this technology.
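The distinction drawn above between operational and non-operational clauses can be illustrated with a toy model. Real smart contracts are deployed as on-chain code (commonly in languages such as Solidity), but the self-executing, condition-triggered logic can be sketched in Python; everything in this example (the escrow scenario and all names) is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EscrowContract:
    """Toy model of a smart contract: payment releases automatically
    once a pre-agreed, machine-checkable condition is met."""
    buyer: str
    seller: str
    amount: int
    delivered: bool = False
    released: bool = False

    def confirm_delivery(self) -> None:
        self.delivered = True
        self._execute()

    def _execute(self) -> None:
        # An 'operational' clause: release on delivery is a condition a
        # machine can evaluate. A non-operational clause (eg, 'the parties
        # must act in good faith') has no such machine-checkable test and
        # cannot be encoded this way.
        if self.delivered and not self.released:
            self.released = True

contract = EscrowContract(buyer="B", seller="S", amount=100)
assert not contract.released    # nothing happens before the condition is met
contract.confirm_delivery()
assert contract.released        # execution is automatic, with no further human step
```

The gap between what this code enforces and what the parties' overarching legal agreement actually requires is exactly the interaction question flagged above.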
The delegation of decision-making processes to computers raises the question of liability for harm or damage that results from those decisions. Depending on how the AI is being used, liability may arise under contract, in tort or under statute (including under privacy, competition, consumer or anti-discrimination laws). Where the harm has not been caused by any identifiable human error, it is unclear where liability may fall.
Some commentators have suggested that more stringent liability should apply to the human operator of the AI technology, either through an extension of the doctrine of vicarious liability or statutory intervention, and it is anticipated that this will be the approach taken by regulators in the near term. However, as AI technology further develops to involve complex machine learning algorithms that can make near-autonomous decisions, it may raise questions as to whether the doctrines of 'legal personhood' and liability should be extended to the technology itself.
The collection, aggregation and transformation of raw data into derived or value-added data sets also raises questions of ownership under intellectual property law. In certain cases, a data aggregator may be entitled to protections under Australian intellectual property law. It has been recognised in Australia that where data has been compiled into a database it may be protected by copyright. The owner of the database must be able to demonstrate 'originality' of the database for copyright to attach, which depends on the labour and expense of collecting, verifying, recording and assembling the data. Since copyright attaches to the database rather than the underlying data, it will be the creator of the database that is the owner – which may not necessarily be the party that contributed the data.
In Europe, a database may also be protected by 'database rights'. The database right can apply to a database even if it is not 'original', as the purpose of the right is to protect 'investment' in obtaining, verifying or presenting a database. Currently, no such intellectual property right is available in Australia.
In some cases, courts and regulators (primarily the ACCC) have intervened to require companies to license their intellectual property rights to data sets, in circumstances where access to data is essential for competitors to compete in the market.
Discussions about AI in the intellectual property context tend to focus on patent protection, given that patents are intended to protect new and disruptive inventions like AI technologies. However, copyright can also protect software-based AI technologies, as copyright can subsist in computer source code as a 'literary work' under Australian law. The question of whether copyright can protect AI-created works is more vexed. Under the Copyright Act 1968 (Cth), the concept of authorship is one of human intellectual effort. This poses no problem when machines are used by humans as tools; but when a machine makes independent decisions, the copyright position becomes unclear. A court may consider works created by AI without sufficient human intervention to be free of copyright. Alternatively, if such works were protected by copyright law, it is not clear who the relevant author would be: the person who operated the machine, the person who created the AI's algorithms, or even the machine itself. Given that New Zealand and the United Kingdom have expressly amended their copyright laws to deal with computer-generated works, it may not be long before Australia enacts similar laws.
See 3.1 Big Data.
No response provided.
Choice of law and jurisdiction in respect of AI-related claims may also raise complex questions. Where an AI algorithm is developed and continually updated in one jurisdiction, operated from another and applied to end-users in a third, courts may be required to reconsider traditional applications of choice of law rules under Australian law.
In recent years, consumer expectations of transparency and control as to the use and sharing of their data have risen. Where big data involves personal information, the collection, use and disclosure of that data will be regulated by the Australian Privacy Principles (APPs). The Office of the Australian Information Commissioner (OAIC) has published guidelines to assist organisations that handle large aggregated datasets to make decisions about the level of de-identification necessary for big data activities to fall outside the scope of the Privacy Act. At a bare minimum, for information to be considered appropriately de-identified, there must be no reasonable likelihood of re-identification. This usually requires both the removal of direct personal identifiers (and any other information that may allow an individual to be identified) and the implementation of safeguards in the data access environment to appropriately manage the risk of re-identification.
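The first of those two steps, removing direct identifiers, can be sketched as follows. This is purely illustrative, with hypothetical field names: as the OAIC guidance makes clear, stripping named columns is only a starting point, because quasi-identifiers and the data access environment must also be managed before re-identification ceases to be reasonably likely.

```python
# Hypothetical field names, for illustration only.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "date_of_birth"}

def strip_direct_identifiers(record: dict) -> dict:
    """Remove direct identifiers from a record.

    This performs only the first step described above. Quasi-identifiers
    (eg, postcode combined with age) remain, so individuals may still be
    reasonably identifiable without further controls on the data and its
    access environment.
    """
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {"name": "Jane Citizen", "email": "jane@example.com",
          "postcode": "2000", "purchase_total": 129.50}
deidentified = strip_direct_identifiers(record)
assert "name" not in deidentified
assert deidentified["postcode"] == "2000"  # quasi-identifier remains: residual risk
```

The residual postcode field illustrates why the guidance requires environmental safeguards in addition to identifier removal.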
Where the APPs do apply, there are many legal considerations for organisations handling big data.
Complying with these requirements and mitigating the relevant risks can involve the adoption of any number of measures, including developing internal data protection policies and an overarching data governance strategy, undertaking privacy impact assessments at the outset of big data projects, encryption (including anonymisation of data) as a means of de-identification and managing the risk of re-identification, and contracting with service providers on terms which address privacy, security and regulatory compliance obligations.
Although big data itself is not specifically regulated in Australia, it engages both privacy and intellectual property law (see below), as well as confidentiality and other issues associated with the proprietary ownership of data.
Artificial intelligence (AI) is another tech phenomenon that is growing rapidly in Australia. AI is the simulation or undertaking of 'human' processes by machines. Machine learning is one application of AI, whereby machines use data to train themselves to complete a task with little human intervention.
Australian law is slowly evolving to try to keep pace with the rapid development in big data, AI and machine learning technologies. The challenge for lawyers is to sensibly navigate projects, use cases and contracts in this space where the rules of play are still being defined.
Where personal information will be collected, used, disclosed or otherwise handled in the course of developing an AI algorithm or using an AI system, the relevant developers, providers and trainers will need to comply with the Privacy Act when handling that information. For example, if the data used to develop the AI system includes personal information, consent of the relevant individuals may be required in order to use their data for that purpose. In addition, certain applications of AI, such as facial recognition technology, may also be subject to State-based surveillance laws. Unlike under the GDPR, there are currently no provisions under Australian law dealing directly with automated decision-making, although this will be an area to watch in the coming years as the technology becomes more prevalent.
For projects involving big data, AI or machine learning, it is crucial to establish a robust data governance framework from the outset. This may involve forming an ethics committee to monitor uses of data, developing a code of conduct and a set of ethical use principles for the use of the technology, and outlining an escalation process for decisions relating to data collection, use and disclosure.
For AI projects, in many cases it may be appropriate for decisions which are not clear-cut or have ethical implications to be escalated to a human for determination, with the outcome of such decisions to be fed back into the technology to further advance the sophistication of automated decision-making.
In the case of AI and big data, the technology and its use should be regularly monitored, especially in the early stages, to ensure compliance with the organisation’s agreed code of conduct and ethical use principles.
Further, these documents and policies should be 'living' documents, regularly considered and audited afresh to enable data and AI governance principles and processes to evolve as the technology, its use and related outcomes are better understood.
See 3.2 Machine Learning.
Connected devices, otherwise known as the Internet of Things (IoT), span a wide range of potential use cases and sectors. What all IoT devices have in common is that they involve a physical device, or a network of physical devices (including vehicles, home appliances and other items), embedded with software, sensors and Internet connectivity that enable them to collect and exchange data. With the growth of connected devices, and of projects that integrate, implement or use such devices, various areas of Australian law may impact such projects.
Underlying Communications Infrastructure and Data Assets
Projects involving IoT devices or services tend to have many stakeholders, often including third parties responsible for administering the infrastructure on which IoT projects rely (eg, network communication providers, data sources and data storage infrastructure). Any project must consider the underlying arrangements to provide network connectivity services and ensure access to sufficient and accurate data. The availability and reliability of such services and data will be a critical component in any IoT-enabled service.
In particular, broadband infrastructure and radio communications spectrum availability are likely to become bigger issues for IoT projects in Australia as the volume of IoT systems increases. The current plan to roll out 5G across Australia in 2019 and 2020 will reduce pressure on current broadband infrastructure, but projects with users located in non-metropolitan areas should consider the implications for their IoT project in less connected areas.
Development, Security and 'Privacy by Design'
At the outset of any IoT project, it is important to clearly identify any relevant intellectual property rights, whether in underlying software, designs in a specially designed IoT device or any database created.
Data protection, the management of personal information in accordance with the Privacy Act 1988 (Cth) (if the IoT service is used in Australia) and security will also be critical factors to consider. Everything from individual software API access and security to cross-channel communication, encryption (and the management of access to encryption keys), and data collection and notification practices is critical to the success of an IoT project. Security and privacy expectations and requirements should be clearly factored into the design of an IoT-enabled device and service, as well as the project framework.
Implementation and Operation
The co-ordination, implementation and operation of any IoT device will inevitably raise its own legal considerations.
Managing a complex supply chain
A project to deliver an IoT-enabled service or device will usually involve multiple service providers. This means consideration will need to be given to supply chain management and ensuring appropriate contractual risk allocation between suppliers. One of the most difficult issues that a project may face is likely to be identification of legal liability for the failure of an IoT device comprising third-party supplied devices, third-party software and third-party data sources, potentially including licensed public sector data.
In relation to consumer-facing products, consideration may need to be given to Australia's patchwork regulation of surveillance law, with State-based regimes (that are similar but not consistent) governing the use of devices for video and audio recording as well as GPS-tracking and collection of location data.
Privacy and data
IoT devices are often designed to collect data about their users automatically, which will usually engage the Privacy Act 1988 (Cth), whether in relation to personal information and/or sensitive information such as health information (with health records legislation varying from state to state) and/or biometric data (which may include behavioural data).
Given the dependence of IoT devices on data, clearly mapping and establishing rights to use the data collected (whether for the specific product, or a larger commercial purpose) will be critical, whether from stakeholders or end consumers. This is relevant even where the data is not personal information, given the growing recognition of the value of data generally and the dependence of its full value on quality and accuracy. For more detail on privacy and data protection generally, see the data protection section of this guide.
Consumer law issues
Finally, where an IoT project is consumer-facing, it will also need to have regard to Australian consumer law, which regulates the sale of all consumer goods. Product liability to users of IoT should be assessed, and consumer guarantees and representations monitored.
The key focus of data regulation in Australia is the protection of personal information (being information about individuals, rather than companies or other non-human entities).
'Personal information' under the Privacy Act 1988 (Cth) (Privacy Act) means information or an opinion about an identified individual, or an individual who is reasonably identifiable, whether or not the information or opinion is true and whether or not the information or opinion is recorded in a material form.
Examples include a person’s name, address, telephone number and date of birth or more complex information like a resume or personnel file. Generally, anonymised data is not regulated as personal information, unless the relevant individuals can be re-identified by reference to other information.
Australian privacy principles
The Australian Privacy Principles (APPs) set out the minimum standard for the handling of personal information in Australia. The APPs are binding on 'APP entities', being federal government agencies and private entities with an annual turnover of more than AUD3 million, or that provide health services to individuals and hold health information, trade in personal information or conduct a credit reporting business.
The APPs govern the collection, use, disclosure and secure handling of personal information; stipulate how entities must deal with access and correction requests; and contain specific rules regarding certain matters, such as the disclosure of personal information overseas, the use of personal information for direct marketing and the use of government identifiers (eg, driving licence numbers).
A breach of any one of the 13 APPs is an 'interference with the privacy of an individual', which may attract the exercise of the regulatory powers under the Privacy Act, as outlined below.
Part IIIA of the Privacy Act also regulates the handling of personal information in the course of consumer credit reporting.
Notifiable data breaches scheme
The Notifiable Data Breaches Scheme (NDB Scheme) requires APP entities to notify the Australian privacy regulator (the OAIC) and affected individuals, where any personal information which the APP entity holds is subject to an 'eligible data breach'.
An 'eligible data breach' occurs where:
The relevant regulatory authority is the Office of the Australian Information Commissioner (OAIC). The OAIC is responsible for the investigation of privacy-related complaints and has a variety of powers, and can conduct investigations on its own initiative or following a complaint. As for available remedies, it can make determinations in response to individual complaints, accept enforceable undertakings from entities found to be in breach or pursue pecuniary penalties of up to AUD420,000 for individuals and AUD2.1 million for corporations for serious or repeated interferences with privacy.
Other Commonwealth laws relevant to the handling of personal information include the:
Each Australian State and Territory also administers privacy legislation, which generally applies to State government agencies.
Over the past 12 months, the Australian data protection landscape has witnessed the following key policy developments.
See 6.1 Core Rules Regarding Data Protection.
See 6.1 Core Rules Regarding Data Protection.
See 6.1 Core Rules Regarding Data Protection.
In Australia, workplace surveillance is governed by State and Territory-based laws that regulate the use of surveillance devices in each jurisdiction. Only New South Wales (NSW) and the Australian Capital Territory (ACT) regulate computer surveillance in an employment context, while other jurisdictions have device-specific legislation that applies more broadly. Queensland does not have any dedicated surveillance legislation.
The Workplace Surveillance Act 2005 (NSW) (NSW Act) and the Workplace Privacy Act 2011 (ACT) (ACT Act) (together, the WS Acts) apply to both overt and covert computer surveillance (surveillance of the input, output or other use of a computer by an employee), as well as camera surveillance and tracking surveillance (eg, GPS devices on vehicles) carried out in any place where an employee (including employees, independent contractors, outworkers, persons doing a work experience placement and volunteers) is working.
As a general rule, employers must provide their employees with 14 days’ written notice before any type of overt surveillance is undertaken. The notice must outline the type of surveillance to be carried out, how and when it will be conducted and whether it will be continuous or intermittent and for a specified period or ongoing. New employees must be notified before they start work.
Additionally, surveillance of e-mails, computers or Internet usage may only be carried out in accordance with a stated employer policy on computer surveillance. Employees must be notified in advance of that policy in such a way that it is reasonable to assume they have understood it.
Covert surveillance without an employee’s knowledge is only permitted under the WS Acts where the employer suspects the employee is engaged in unlawful activity and has obtained approval from a magistrate (who must be satisfied that there is reasonable suspicion of wrongdoing).
Blocking Internet Access and E-mails
It is also an offence under the WS Acts to block access to an Internet site or prevent the delivery of an e-mail sent by or to an employee, unless the employer is acting in accordance with its stated policy on e-mail and Internet use notified to the employee in advance. Employees must also be notified as soon as practicable after an e-mail is blocked, other than where the e-mail or attachment is spam, may damage the computer or network or would be reasonably considered harassing or offensive. Company policy cannot prevent the delivery of an e-mail or access to a website merely because it relates to industrial matters.
Other Jurisdictions and Related Laws
General surveillance laws in other jurisdictions govern the recording of individuals through optical or listening devices but do not deal specifically with computer surveillance (and only the Victorian Surveillance Devices Act 1999 specifically refers to listening devices installed in the workplace). All of these laws require that express or implied consent be obtained from individuals (including employees) prior to surveillance being undertaken.
Various Commonwealth, State and Territory laws also prohibit unauthorised access to computer systems or data on computer systems protected by password or other forms of access control without the consent of the system controller. Employers should therefore exercise particular caution in respect of app, browser or cookie data analytics technologies which might facilitate unauthorised access to employee-owned devices, such as a personal smartphone or a home computer used for work purposes.
Otherwise, in the absence of specific workplace and/or computer surveillance provisions, employers should also consider the application of other relevant laws, including the Privacy Act 1988 (Cth) (Privacy Act). Notably, employers are exempt from the rules for handling personal information under the Privacy Act to the extent they are dealing with 'employee records' directly related to a current or former employment relationship. Whether personal information relating to employees has been collected via computer software, e-mail or other electronic means is a question of fact determined on a case-by-case basis.
Carriers and Carriage Service Providers
The Telecommunications Act 1997 (Cth) (Telco Act) delineates obligations between two types of service providers:
Examples include Internet service providers (ISPs), VoIP service providers and providers who resell services on a carrier’s network.
Under the Telco Act, carriers must hold a licence issued by the Australian Communications and Media Authority (the ACMA). Carriers are subject to both a one-off application fee (currently AUD2,122) and an annual fee calculated by the ACMA, based on a carrier’s eligible revenue. CSPs are not subject to a licensing regime.
Legislative Requirements and Powers
Both carriers and CSPs are required to comply with numerous legislative obligations, which govern the way they operate their business and provide services to the public.
Carriers are required to act in accordance with good engineering practice and recognised industry standards when inspecting land and installing or maintaining a facility. Subject to narrow statutory exceptions, carriers must protect the confidentiality of information relating to the contents of communications, carriage services and the affairs of other people. They must do their best to prevent telecommunications networks and facilities being used to commit offences, and give the authorities such help as is reasonably necessary for the purposes of enforcing the criminal law, protecting the public revenue and safeguarding national security. The Telco Act also imposes a range of obligations on carriers in relation to access to their telecommunications infrastructure, including telecommunications transmission towers and underground facilities.
Carriers and CSPs
In most cases, carriers will also be regarded as CSPs and will therefore have additional obligations. Where a CSP supplies a standard telephone service (eg, a fixed line or mobile telecommunications service), the Telco Act requires that they comply with the Standard Service Provider Rules. In particular, CSPs must provide operator services and directory assistance services to end-users of standard telephone services. Itemised billing must be provided to customers, and priority assistance must be given to people with life-threatening medical conditions.
Under the Telecommunications (Consumer Protection and Service Standards) Act 1999, CSPs which provide standard telephone services (and, in certain circumstances, even CSPs that do not) must also provide access to an emergency call service.
Additionally, the Telecommunications Consumer Protections Code 2015 regulates how CSPs can advertise and handle customer complaints. For example, CSPs are required to provide appropriate detail in advertising, to ensure the principal message and main terms are captured in the body of the advertisement.
Carriers and CSPs must also comply with the Telecommunications (Interception and Access) Act 1979 (TIA Act), which imposes data retention obligations. In particular, carriers and CSPs are required to keep certain information and documents relating to any communication carried by means of the telecommunications service for no less than two years. The types of information to be kept include information relating to relevant subscribers, accounts and services; the source and destination of communications; and the date and time of communications. The TIA Act also imposes various reporting obligations on carriers and CSPs, including an obligation to create an interception capability plan and provide it to the Communications Access Coordinator.
The Telco Act provides for pecuniary penalties to be paid for contraventions of civil penalty provisions, including for breaches of the TIA Act. Where a carrier has failed to comply with its licence conditions, it may be subject to other enforcement mechanisms such as written directions, formal warnings, enforceable undertakings or infringement notices.
In December 2018, the Telecommunications and Other Legislation Amendment (Assistance and Access) Act 2018 was passed into law, amending the Telco Act. This legislation obliges 'designated communications providers' (which may include carriers and CSPs) to provide assistance to law enforcement and security agencies with a warrant for information. For more detail, see the encryption requirements section of this guide.
The Radiocommunications Act 1992 (Cth) regulates the way in which spectrum is divided and licensed to entities, including carriers, Internet providers and government entities. Under the Radiocommunications Act, there are three primary, and distinct, types of licences: spectrum licences, apparatus licences and class licences.
Spectrum licences permit a person to operate devices within a specific geographic area and frequency band. Once spectrum bands have been designated by the Minister for Communications, licences can be issued by the ACMA, typically by way of auction. Once issued, they have a maximum term of 15 years, and may only be assigned to another entity with the ACMA’s approval. There is a statutory presumption that spectrum licences will be returned to the market, and not automatically renewed, at the end of the relevant licence term.
Apparatus licences permit a person to operate certain categories of devices (eg, receivers, base stations, remote control stations) at a particular location. Unlike spectrum licences, there is a general presumption that an apparatus licence will be renewed on provision of a renewal application and payment of licence fees. They have a maximum term of five years.
Class licences set out the common technical and operational parameters that persons and devices must comply with, and are granted in respect of:
Spectrum and apparatus licences are granted to persons on application, and are subject to the payment of fees and compliance with licence conditions. Conversely, class licences are not issued to persons and do not require an application or payment of fees.
In 2015 the government announced it would implement the recommendations of a Spectrum Review, including replacement of the current legislative arrangements for radio communications with new legislation that removes prescriptive processes and streamlines licensing for a simpler and more flexible framework. An exposure draft of the new legislation has been released and submissions received. The government has not yet acted on this matter.
Audiovisual services receive different regulatory treatment depending on their delivery platform, which is not appropriate in an age of converged media and changed consumption habits. The situation has been compounded by many years of successive governments' inaction, with only piecemeal regulatory change. This is despite a plethora of independent reports, including from the expert regulator (the ACMA), the Australian Law Reform Commission and the Convergence Review, all recommending that broken concepts be remade to reflect market realities.
As matters currently stand, three Commonwealth Acts (the Radiocommunications Act 1992, the Broadcasting Services Act 1992 (BSA) and the Telco Act) provide the framework for analysis of particular audiovisual services. The Telco Act includes mechanisms for regulating communications content services providers; however, it is primarily focused on carriers and CSPs, as set out in 8 Scope of Telecommunications Regime, above.
Traditional audiovisual broadcasting services using radiofrequency spectrum (currently known as the broadcasting services bands) are required to comply with the BSA – in particular, content regulation requirements – including Australian content obligations.
There has been an increase in the popularity of online (over the top) content providers, which give consumers access to content via Internet streaming, such as Netflix, Amazon Prime, Kayo, Stan and Freeview (Streaming Services). Streaming Services are excluded from the definition of broadcasting services under the BSA, giving rise to inconsistency in, for example, content and classification regulation across various platforms.
Through piecemeal amendments to the BSA – in particular, the introduction of Schedule 5 (Regulation of prohibited online content hosted outside Australia), Schedule 7 (Regulation of illegal and offensive content provided online and via convergent devices) and Schedule 8 (Regulation of gambling promotional content provided on an online content service) – a partial legislative framework has been established for certain online content, together with a complaints-based mechanism for assessing online content. However, the concepts used are frequently inconsistent and there is no overall coherent regulatory structure.
Regulatory Framework for Television Services
The BSA requires commercial and community television services using the broadcasting services bands to hold licences. Fees for such licences are determined on a case-by-case basis; the amount payable depends on spectrum location, geographic location, the amount of spectrum occupied and the coverage area of the licence.
Subscription television services are also licensed under the BSA on the basis of one licence per stream of programming.
Commercial television and subscription television broadcasters are also subject to further regulatory requirements; namely, the anti-siphoning list (which protects certain sports from being behind a pay television wall); the Commercial Television Code of Practice (a self-regulatory code adopted by free-to-air (FTA) broadcasters, and registered by the ACMA); the Subscription Television Broadcasting Code of Practice (a self-regulatory code, registered by the ACMA); and minimum Australian content requirements which apply to FTA and subscription television services.
Requirements for Streaming Services
Streaming Services are not required to apply for licences or pay any fees to offer services to customers in Australia. Nor are they subject to the further regulatory requirements referred to above.
Outside the complaint procedures in Schedule 5 (Online services) of the BSA, which solely regulates the conduct of Internet Service Providers in relation to content hosted outside Australia, Streaming Services are only regulated in Australia if the service is a 'content service' (ie, a service delivered by means of a carriage service, or that can be accessed by a carriage service). For completeness, this does not include Freeview, as it is a retransmitted broadcasting service carved out from the definition of content service. To fall within the scope of the regulation, the content provided (or a link to the content provided) must also be hosted in Australia or, if a live stream, must be provided from Australia. For example, if a Streaming Service is provided from servers in the USA, then there is no Australian connection. If there is storage on servers in Australia, including on caching servers, then there is likely an Australian connection. Note that any storage of content on a highly transitory basis is probably not considered to be storage in Australia.
Streaming Services and online platforms will, on the basis that they have an Australian connection, be subject to the take-down regime under Schedule 7 of the BSA. This regime requires “prohibited” or “potential prohibited” content (determined by reference to separate laws and guidelines) to be removed if it is subject to a takedown notice. Content is prohibited content where it has been classified as:
Where Streaming Services charge a fee for their offerings and offer MA15+ content, they must have in place a restricted access system. The Streaming Service must, therefore, display warnings as to the nature of the MA15+ content and require an individual to apply for access to the content by making a declaration that they are over 15 years of age.
Additionally, the Internet Industry Code of Practice requires that, where stored content has not been classified by the Australian Classification Board (ACB) and the Streaming Service reasonably considers the content substantially likely to be “prohibited” or “potential prohibited” content, the Streaming Service must ensure that the content is assessed by a 'Trained Content Assessor'. Situations where this could arise include where user-generated content is hosted by a Streaming Service or where content from overseas will be stored and offered.
The Broadcasting Services (Online Content Service Provider Rules) 2018 prohibit gambling advertising during live sport that is streamed online between 5.00 am and 8.30 pm. At all other times, gambling advertising is also restricted. The new rules were developed under Schedule 8 of the BSA (regulation of online gambling promotions).
Additional Considerations – Copyright protection
In the USA, safe-harbour provisions protect Internet content hosts and content service providers from liability for copyright infringement. However, Streaming Service providers in Australia should be aware that the Copyright Act 1968 (Copyright Act) does not provide such protection in Australia, meaning that Streaming Services which host user material are at risk of infringing a third party’s copyright through their actions or inactions.
Potential Legislative Changes
In 2012, the Australian Law Reform Commission recommended that a new Classification of Media Content Act be enacted, which would be platform-neutral and incorporate all classification obligations applying to all media content, including online and mobile content subject to the regulatory regime under Schedules 5 and 7 of the BSA and broadcast and subscription audiovisual content regulated under the BSA. The government did not act on those recommendations.
Also in 2012, the final report of the Convergence Review was published, recommending coherent platform- and technology-neutral regulation in three areas: media ownership and control; media standards across all platforms; and the production and distribution of Australian content. The recommendations of the Convergence Review have not been acted upon.
Consistent with a piecemeal approach, the Department of Communications recently announced that it had approved, after a two-year trial, a tool developed by Netflix that produces Australian classifications and consumer advice for films and television programmes available online via Netflix Australia. The tool combines Netflix’s viewer-recommendation technology with Australian classification standards, and is said to provide quicker access to classification for this particular Streaming Service.
The ACMA has historically been eager to ensure that it can adapt to any convergence in the area of media and communications. However, it is hamstrung by the BSA and the current regulatory structure. Given the current market and the movement away from traditional media consumption, it is possible that Streaming Services might become more heavily regulated; for example, by the imposition of Australian content requirements.
The BSA and its Schedules may also be the subject of further amendments following the 2018 Digital Platforms Inquiry and review of the Enhancing Online Safety Act 2015 (Cth), which will also take into account Schedules 5 and 7 of the BSA. There has been no indication to date as to whether any potential amendments to the BSA and its Schedules will directly affect Streaming Services. Despite this, the Digital Platforms Inquiry may lead to new regulatory frameworks and oversight for digital platforms (including Streaming Services), including in relation to the delivery of content in Australia (with the production and delivery of news and journalistic content being a particular area of scrutiny), advertising take-down standards and codes of practice.
See 9.1 Main Requirements.
Australia does not have a comprehensive legal regime relating to the use of encryption. For example, individuals have no general right to use encryption or encrypted services to protect their information, and there are no mandatory minimum standards regarding the strength of encryption that may be used. However, several Commonwealth laws do touch on, and in some cases undermine, the use of encryption in Australia:
The use of encryption does not directly exempt an organisation from any Australian laws, but may in effect exempt an organisation from the Privacy Act. Where an organisation handles encrypted communications which contain personal information, but does not have the private key to decrypt that information, an individual will not be 'reasonably identifiable' from the information and it therefore will not constitute 'personal information' for the purposes of the Privacy Act. In this way, encryption can be a means of effectively de-identifying personal information. Once decrypted, however, the Privacy Act would again apply to such information.