Contributed By Fasken
There are no laws that specifically address the metaverse and associated services and products in Canada. As a result, legal considerations regarding the metaverse arise from the application of existing laws of general application to metaverse services and products, such as private sector privacy laws, consumer protection laws, competition laws, and copyright and other intellectual property laws.
The following is an illustrative list of such considerations.
Privacy and Cybersecurity
Canadian private sector privacy laws apply to metaverse products and services. It is possible to comprehensively log and record an individual’s activity in the metaverse to an even greater degree than in the physical world. Canada’s privacy laws remain consent-based, and proposed federal reforms and recent reforms in the province of Quebec will increase the transparency and other requirements attached to that consent (though, in the case of federal reform, new and broader exceptions to consent may assist the development of the metaverse). These reforms also empower regulators to impose significant penalties, an enhanced authority that is already in place in Quebec.
In addition, the coming into force of the new Quebec law has introduced new requirements on individual profiling and the collection of personal information through technical means. Organisations operating in the metaverse will have to carefully design their processes for obtaining consent, and ensure their processing of personal information remains within the bounds of that consent and the other requirements of Canadian privacy laws.
Responsibility for Cyber-Attacks and Security Breaches
To the extent that organisations collect a greater volume of potentially more sensitive personal information in the metaverse, the risks of harm from cyber-attacks and security incidents will increase. The safeguards that Canadian privacy laws require organisations to establish and maintain for personal information will be correspondingly more stringent, and it will be even more critical for organisations to identify, mitigate and remediate incidents. It may also be difficult to determine where responsibilities lie in respect of breach notification to users, the Office of the Privacy Commissioner of Canada (OPC) or other applicable regulators as metaverse users move through different metaverse platforms.
Artificial Intelligence Regulations
Many human interactions within the metaverse will be enabled by artificial intelligence (AI). Seamless, AI-driven human/system interaction (particularly any AI interpreting or mimicking human behaviour) may fall within future AI legislation and regulations as well as automated decision-making provisions under privacy laws. These points are discussed in 4. Artificial Intelligence.
Consumer Protection and Civil Liability
Most Canadian provinces have prescriptive consumer protection laws that regulate marketing practices and consumer contracts. For example, organisations in the metaverse must comply with restrictive requirements on internet contracts, even though interactions in the metaverse may be more akin to in-person interactions. Further, the application to the metaverse of tort law and other common and statutory law governing in-person public places has not yet been tested.
Commercial Contracting
Metaverse industry participants need to co-operate in order to create a seamless experience for consumers and users moving through different metaverse platforms. Common intellectual property-sharing agreements and confidentiality clauses will need to account for these novel circumstances, and metaverse businesses will need to consider how to apportion responsibility for privacy and security risks in contracts.
In Canada, digital services and digital markets are not subject to a specific regulatory regime such as those established under the European Union’s Digital Services Act and Digital Markets Act. Instead, legal considerations regarding digital markets arise from the application of laws of general application to digital services and products, such as communications laws, consumer protection laws, competition laws and private sector privacy laws.
In addition, the Canadian Parliament amended the Broadcasting Act in 2023 to (among other things) enable Canada’s broadcasting and telecommunications regulator – the Canadian Radio-television and Telecommunications Commission (CRTC) – to implement a regulatory framework that would apply to digital streaming services that qualify as “online undertakings”. The CRTC has initiated several proceedings designed to implement its new mandate under the Broadcasting Act. These proceedings are expected to result in certain online undertakings being subject to obligations designed to support the creation, distribution, promotion and discoverability of Canadian and Indigenous audio and video content through monetary and non-monetary contributions and investments in Canada.
The amended Broadcasting Act provides the CRTC with the authority to impose penalties for violations of certain regulatory obligations, and grants the CRTC explicit information-gathering powers with respect to online undertakings.
In addition to the amended Broadcasting Act, the Canadian Parliament also enacted the Online News Act, which aims to require dominant digital platforms to compensate news businesses when news content is made available on those platforms’ services. To achieve this goal, the Online News Act creates a bargaining framework to ensure that dominant digital platforms provide fair compensation to news businesses, and encourages those platforms to enter into commercial agreements with a range of news businesses in Canada.
Any business seeking to trade in digital services or markets in Canada should consult legal experts to determine whether their activity may be subject to existing generally applicable laws and regulations.
Laws and Regulations
Except in Quebec, there are no private sector laws of general application in Canada focused specifically on the provision of cloud services to the private sector. Quebec’s Act to establish a legal framework for information technology (AELFIT) sets limits on the liability of cloud service providers and other such intermediaries (search engine providers, internet service providers) where the provider has limited control over user-generated content. AELFIT also targets biometric information and biometric databases, and imposes disclosure requirements in connection with such activities. Other Canadian laws of general application, and certain sector-specific regulations targeted at cloud services, apply to the provision and use of cloud services. Applicable laws include private sector privacy laws and regulations which impose industry-specific requirements, such as:
Regulated Industries
The Office of the Superintendent of Financial Institutions (OSFI) is the Canadian federal regulator that supervises and regulates federally regulated banks and insurers, trust and loan companies, and private pension plans subject to federal oversight. OSFI has issued Guideline B-10, Outsourcing of Business Activities, Functions and Processes, which specifies certain OSFI expectations for federally regulated financial institutions (FRFIs) that outsource one or more of their business activities to a service provider. This Guideline applies to all outsourcing arrangements, including cloud services, under which FRFIs are expected to:
The Guideline also contains a list of specific terms that OSFI expects an FRFI to address in a cloud service contract. While Guideline B-10 is directed at federal entities, it has also been voluntarily adopted by many provincially regulated entities in the financial sector.
In April 2023, OSFI released a final version of a revised Guideline B-10 (the “Revised Guideline”), which sets out enhanced expectations for FRFIs in managing an expanded scope of third-party risks, and places greater emphasis on governance and risk management plans, and on specific outcomes and principles. The Revised Guideline expands the application of Guideline B-10 to “third-party arrangements”, which include any business or strategic arrangement with external entities. Examples of arrangements that will be subject to the Revised Guideline are:
The Revised Guideline replaces the “materiality” threshold in the current guideline, and introduces a new “risk-based approach”, which requires a more comprehensive risk-management framework that accounts for the level of risk and the “criticality” associated with individual third-party arrangements. It also includes more specific requirements for FRFIs to develop cloud-specific requirements and consider cloud portability in their contracting arrangements. The Revised Guideline will come into effect on 1 May 2024, and OSFI indicates that this transition period is intended to provide FRFIs with sufficient time to self-assess and build third-party risk-management programmes that comply with the new requirements.
Under the Bank Act, the Trust and Loan Companies Act, the Insurance Companies Act and the Cooperative Credit Associations Act, certain records of federally regulated financial organisations carrying on business in Canada must be maintained in Canada. In addition, an FRFI is expected to ensure that OSFI can access, in Canada, any records necessary to enable OSFI to fulfil its mandate.
In addition to Guideline B-10, OSFI has also released an advisory on Technology and Cybersecurity Incident Reporting, setting out OSFI’s expectations in relation to the immediate and ongoing reporting of cybersecurity incidents, which FRFIs should account for in their agreements with cloud providers. These expectations are in addition to the mandatory breach notification requirements under Canadian privacy laws.
In July 2022, OSFI released a final Guideline B-13, Technology and Cyber Risk Management, which is intended to serve as a complement to existing guidelines, including Guideline B-10. Guideline B-13 is expected to be read, and implemented, from a risk-based perspective to allow FRFIs to compete effectively and take full advantage of digital innovation, while maintaining sound technology risk management. Guideline B-13 provides FRFIs with technologically neutral guidance to produce key “outcomes” in three domains:
Guideline B-13 became effective on 1 January 2024.
Personal Information Processing in Canada – Overview
A comprehensive review of all privacy obligations in Canada is beyond the scope of this summary. However, generally the following applies:
In Canada, privacy and personal information are regulated by both federal and provincial legislation.
The Personal Information Protection and Electronic Documents Act (PIPEDA) is the federal privacy law for private sector organisations. The Office of the Privacy Commissioner of Canada (OPC) oversees compliance with PIPEDA. The OPC has issued a number of guidelines and case summaries that provide non-binding guidance on the OPC’s interpretation of PIPEDA’s obligations. As of January 2024, PIPEDA continues to benefit from an adequacy decision by the European Commission, which recognises that personal data is processed in a manner consistent with the General Data Protection Regulation (GDPR) and can therefore be transferred from the EU to Canada without additional data protection safeguards, such as standard contractual clauses, to the extent PIPEDA applies to such data.
PIPEDA applies in all provinces and territories in Canada to organisations engaged in commercial activities, except within a province or territory that has enacted substantially similar private sector legislation and that is subject to an exemption under PIPEDA (though PIPEDA continues to apply in those provinces in connection with federal “works, undertakings and businesses” such as airlines, banks and telecommunications companies, and in connection with the interprovincial or international processing of personal information). British Columbia, Alberta and Quebec have their own legislation that regulates the collection, use and disclosure of personal information by private sector organisations in those provinces. In addition, most provinces have provincial legislation that regulates the collection, use and disclosure of personal health information. There are also federal and provincial public sector privacy laws that apply to the public sector.
In September 2021, Quebec’s legislature passed comprehensive reforms to the province’s privacy law. Quebec’s revised law is the first Canadian private sector privacy law to specifically address de-identification and anonymisation, automated decision-making, technology-based profiling, and data portability. Importantly, it introduces significant administrative penalties and fines, including:
The revised Quebec law also requires organisations to undertake privacy impact assessments prior to transferring personal information outside Quebec.
In June 2022, the Canadian government tabled Bill C-27, the Digital Charter Implementation Act, 2022 – legislation that would, among other things, enact the Consumer Privacy Protection Act (CPPA) to replace the privacy provisions of PIPEDA. As of the time of writing, Bill C-27 remains under consideration in the parliamentary process. The proposed CPPA retains the principles-based and consent-based approach of PIPEDA. Among other things, the CPPA would:
Bill C-27 would also create a new Personal Information and Data Protection Tribunal, which would consider decisions and recommendations of the Federal Privacy Commissioner.
The CPPA would allow the Federal Privacy Commissioner to recommend, and the Personal Information and Data Protection Tribunal to impose, penalties of up to the greater of CAD10 million or 3% of an organisation’s annual global revenues. It would also provide for significantly expanded offences, with fines of up to the greater of CAD25 million or 5% of annual global revenues, and for a private right of action permitting recourse to the courts in certain circumstances.
In response to stakeholder feedback, in October 2023 the federal government proposed amendments to the CPPA that would, among other things:
Virtually every aspect of privacy legislation can have some impact on the provision or use of cloud services.
The Personal Information Protection and Electronic Documents Act (PIPEDA)
Under PIPEDA, personal information means information about an identifiable individual. PIPEDA provides that an organisation is responsible for personal information in its control, or in its possession or custody, including information that has been transferred to a third party for processing. An organisation that transfers personal information to a cloud service provider remains primarily responsible for that personal information, and will want to ensure that the cloud services contract contains appropriate provisions to address the organisation’s responsibilities in relation to the personal information transferred to and processed by the cloud service provider.
OPC guidance clarifies that an organisation must take all reasonable steps to protect personal information from unauthorised uses and disclosures while it is in the hands of the third-party processor, regardless of whether the information is being processed in Canada or a foreign country. An organisation must be satisfied that the third party has policies and processes in place, including training for its staff and effective security measures, to ensure that the information in its care is properly safeguarded at all times, and should also secure an audit right.
PIPEDA requires that personal information be protected by security safeguards appropriate to the sensitivity of the information. The security safeguards must protect personal information against:
The nature of the safeguards will vary depending on:
More sensitive information should be safeguarded by a higher level of protection, particularly where large volumes of information are involved. The methods of protection should include physical, organisational and technical measures.
PIPEDA case summaries provide non-binding guidance on the OPC’s interpretation of these obligations.
An organisation will want to address the detail of a service provider’s security safeguards in the cloud services contract. When it investigates security breaches, the OPC will closely examine the safeguards in place at the time of the breach, and the contractual requirements to implement and maintain such safeguards. The cloud provider’s obligations in the case of a breach of security safeguards should be included in the cloud services contract.
As noted previously, the Canadian government re-introduced proposed legislation to replace PIPEDA’s personal information provisions with a new law, the Consumer Privacy Protection Act (CPPA), and this legislative reform remains under consideration by Parliament.
Cross-Border Transfers
PIPEDA does not prohibit an organisation from transferring personal information to an organisation outside Canada for processing. However, the OPC expects organisations to assess the risks to the integrity, security and confidentiality of personal information when it is transferred to third-party service providers operating outside Canada. The OPC also expects organisations to advise their customers that their personal information may be sent to another jurisdiction for processing and that, while the information is in another jurisdiction, it may be accessed by the courts, law enforcement and national security authorities of that jurisdiction.
Alberta’s private sector privacy law requires organisations that transfer personal information outside Canada to maintain policies on:
Quebec’s privacy law now requires organisations, prior to personal information being transferred out of Quebec, to conduct a privacy impact assessment and to determine whether the transferred information will receive protection in accordance with “generally accepted best practices respecting the protection of personal information” in the target jurisdiction. In September 2023, Quebec’s privacy regulator released a guide and template on conducting privacy impact assessments in compliance with the new law.
Artificial Intelligence and Data
AI systems often rely on large datasets. Organisations that develop AI systems using large datasets in Canada must balance the need to maximise the value of large datasets with the requirements of Canadian privacy laws. Holding large amounts of personal information can lead to issues surrounding consent, transparency, accountability, and the requirement to limit the collection of personal information to that needed for the purposes identified by the collecting organisation.
Additionally, holding large volumes of personal information requires organisations to implement more stringent safeguards in order for them to be considered appropriate under Canadian privacy laws. Holding greater amounts of personal information about a greater number of individuals also increases the risks of a class action in the event of a data breach, and the liability that would result from a breach.
To limit these risks, organisations may use anonymised and synthetic data. Where there is no “serious possibility” that the information, alone or in combination with other information, can identify an individual (ie, be re-identified), anonymised and synthetic data are not personal information and are thus not subject to existing Canadian privacy laws outside Quebec (see below). However, the potential for re-identification increases as datasets grow and other datasets become available for matching, and statistical and other methods that can re-identify data are becoming increasingly sophisticated.
Canada’s federal privacy law does not currently include a definition of de-identified or anonymised data, or define what it means to de-identify or anonymise data; though the proposed Consumer Privacy Protection Act being considered by Parliament would introduce such definitions. Revisions to Quebec’s privacy law have introduced anonymisation and de-identification as separate concepts. Under the new Quebec law, anonymised information is information that:
De-identification is a less stringent standard: de-identified information is information that no longer allows the direct identification of an individual. A business handling de-identified information is required to take “reasonable steps to avoid re-identification”.
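The practical gap between de-identification and anonymisation can be illustrated with a simple linkage example. The sketch below (in Python, using entirely hypothetical names and records) shows how a dataset stripped of direct identifiers can still be matched against a second dataset on shared quasi-identifiers, which is why the legal tests above focus on whether information “in combination with other information” can identify an individual.

```python
# Illustrative sketch only (not legal guidance): removing direct identifiers
# ("de-identification") may still leave a serious possibility of
# re-identification once another dataset is available for matching.
# All names and records below are invented for the example.

released = [
    # A "de-identified" release: names (direct identifiers) are dropped,
    # but quasi-identifiers (postal prefix, birth year) remain.
    {"postal_prefix": "M5V", "birth_year": 1984, "diagnosis": "asthma"},
    {"postal_prefix": "H2X", "birth_year": 1990, "diagnosis": "diabetes"},
]

public_registry = [
    # A separate, publicly available dataset sharing the same quasi-identifiers.
    {"name": "A. Tremblay", "postal_prefix": "H2X", "birth_year": 1990},
    {"name": "B. Singh", "postal_prefix": "M5V", "birth_year": 1975},
]

def link(released_rows, registry_rows):
    """Re-identify released rows whose quasi-identifiers match exactly
    one registry entry (a simple linkage attack)."""
    matches = []
    for row in released_rows:
        hits = [r for r in registry_rows
                if r["postal_prefix"] == row["postal_prefix"]
                and r["birth_year"] == row["birth_year"]]
        if len(hits) == 1:  # a unique match identifies the individual
            matches.append((hits[0]["name"], row["diagnosis"]))
    return matches

# The H2X/1990 record is re-identified despite the absence of any name.
print(link(released, public_registry))
```

Anonymisation, by contrast, would require generalising or suppressing the quasi-identifiers themselves so that no such unique match remains possible.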
Artificial Intelligence
As mentioned in 3.1 Highly Regulated Industries and Data Protection, in June 2022 the Canadian government tabled Bill C-27, the Digital Charter Implementation Act, 2022. This legislation would, among other things, enact the Artificial Intelligence and Data Act (AIDA) as well as comprehensive reforms to Canada’s federal private sector privacy law regime. Bill C-27 remains in the legislative process and under consideration by the House of Commons at this time (January 2024).
If passed, AIDA would regulate the design, development and use of AI systems in the private sector with a focus on mitigating the risks of harm and bias in the use of “high-impact” AI systems. AIDA would set out positive requirements for AI systems, as well as monetary penalties and new criminal offences for certain unlawful or fraudulent conduct in respect of AI systems. In November 2023, the minister responsible for AIDA presented a set of proposed amendments to the parliamentary committee studying the new law. Among other things, the proposed amendments would do the following.
Canadian privacy law requirements regarding consent, openness and transparency are areas of concern with regard to AI and machine learning (ML). Meaningful consent and transparency require that organisations identify the purposes of the collection and use of personal information. This is more difficult in the case of AI and ML, since those purposes can evolve over time as ML algorithms and models make discoveries and predictions based on data.
Since September 2023, Quebec’s privacy law has required that individuals be informed when their personal information is used to make a decision about them using automated processing, and that they be offered the opportunity to make observations about the decision.
Organisations that use AI might also encounter issues with Canadian intellectual property laws. Canadian copyright law does not protect databases where the creation of a database or other compilation is not an exercise of “skill and judgement”, and it does not protect individual data elements removed from a database or other compilation (for example, where those data elements are mere facts, such as street addresses). In October 2023, the Canadian government launched a consultation related to authorship of AI-generated content and issues related to the use of copyright-protected content in the training of AI services and tools. The government characterised the purpose of the consultation as focusing on “the impacts of recent developments in AI on the creative industries and the economic impacts that these technologies have, or could have, on Canadians, and [looking] at whether change is required to further improve or reinforce copyright policy for a modern, evolving Canadian economy”. The consultation closed for public comment in January 2024.
Furthermore, an ML algorithm or model is not considered an “inventor” under the Patent Act or an “author” under the Copyright Act. Under Canadian copyright law, the choice of the ML algorithm and training data, and the conduct of the training, would have to be an exercise of skill and judgement for the ML model and its output to be potentially considered an original work eligible for copyright protection (subject to the output or model otherwise being proper subject matter of copyright).
There are no laws that specifically address internet of things (IoT) services and devices in Canada. As a result, any legal considerations regarding the IoT arise from the application of general laws to IoT services and devices.
Canada’s private sector privacy laws apply where the use of IoT devices and services allows organisations to collect personal information about individuals. In Alberta, British Columbia (BC) and Quebec, provincial private sector privacy laws will also apply to the use of IoT devices in the workplace, while the federal law (PIPEDA) will apply to federally regulated workplaces (ie, to federal works, undertakings and businesses) across Canada. The Alberta, BC and federal privacy laws are notice-based in connection with employee personal information and the use of personal information to manage the employee-employer relationship: employers must provide notice to employees of the use of IoT devices that collect employee personal information, and of the subsequent use and disclosure of that information.
Even where employers give notice, however, the processing of personal information must also be for purposes a reasonable person would consider appropriate in the circumstances. Thus, the use of IoT devices to collect employee personal information for inappropriate purposes – for example, location tracking or video surveillance where less intrusive measures could be used – may still run afoul of Canadian laws even if employees were provided notice of the tracking.
Other than for limited exceptions, Canada’s private sector privacy laws are consent-based. From a privacy perspective, the IoT poses a challenge in obtaining meaningful consent as it allows passive information collection that may be less obvious to individuals and more difficult to explain. Transparency is particularly important if the IoT service provider is contemplating secondary uses of personal information (ie, uses in addition to providing the services) – for example, marketing or advertising. The ability of IoT devices to collect large amounts of data must be weighed against requirements to limit the collection of personal information.
The OPC has released guidance targeted towards manufacturers of IoT devices. In particular, the guidance recommends as a best practice that organisations perform a privacy impact assessment before releasing IoT products.
Since September 2023, Quebec law has required organisations that use technologies capable of identifying, locating or profiling individuals to have those functions deactivated by default. IoT devices and services must comply with this requirement.
IoT devices and services are also seeing growing use in the healthcare sector. Most Canadian provinces have enacted health privacy legislation regulating the use of personal health information by healthcare providers. Depending on the province, health privacy legislation may apply to the healthcare provider, or to both the healthcare provider and its service provider.
Audio-Visual Services
All traditional audio-visual services (television, radio, cable, etc) operating in Canada are required to be licensed or exempt from licensing by the CRTC under the Broadcasting Act.
The CRTC issues licences for terms that are indefinite or fixed, and those licences are subject to conditions of service and certain orders and regulations that it deems appropriate for the implementation of Canada’s broadcasting policy. Licensees are generally subject to a variety of regulatory obligations, including requirements relating to the exhibition of Canadian content and to programme expenditures and/or contributions.
Television and radio stations that use radio spectrum are also required to obtain authorisation from the Department of Innovation, Science and Economic Development Canada (ISED) in accordance with the Radiocommunication Act. Applications to obtain a broadcasting licence must be filed with the CRTC, and the CRTC is required to hold a public hearing to consider the application. The process typically takes between eight and 18 months to conclude.
To be eligible to hold a broadcasting licence, a company must be owned and effectively controlled by Canadians. Broadcasting licensees will also generally be required to pay broadcasting fees under the Broadcasting Fees Regulations, which are expected to become effective on 1 April 2024. The broadcasting fees will be a licensee’s pro rata share of the annual cost of the CRTC’s operations.
In addition to licensing, the CRTC has the authority to exempt classes of traditional broadcasting undertakings from holding a licence, and it has exercised this authority in a number of circumstances, including with respect to small satellite-to-cable (discretionary) services and small cable distributors. Exemption orders issued by the CRTC contain terms and conditions that apply to an entire class of broadcasting undertakings, and do not require a company to pay any broadcasting fees or to obtain any further authorisations from the CRTC to operate in Canada.
Online Video-Sharing Platform Services
Entities that operate online video-sharing platforms (including those that offer user-generated content) and other online streaming services in Canada are permitted to do so under the amended Broadcasting Act, and may be subject to regulatory obligations imposed by the CRTC, including:
Certain online undertakings will also be required to pay broadcasting fees under the proposed Broadcasting Fees Regulations. There are, however, no Canadian ownership and control requirements applicable to online undertakings, as operators are not required to hold licences for their online undertakings.
In May 2023, the CRTC initiated three separate proceedings to begin implementing its new mandate under the amended Broadcasting Act. Two of those proceedings have concluded, and resulted in requirements for certain online streaming services to register with the CRTC and provide information about their activities in Canada, and to comply with modest conditions of service, which include a requirement to make content available in a way that is not tied to a specific mobile or internet service.
The third proceeding, which included a three-week hearing beginning in November 2023, considered the contributions online streaming services will need to make to support Canadian and Indigenous content. A decision in that proceeding is expected to be issued in spring 2024, and will be followed by subsequent proceedings considering whether to impose other regulatory obligations – financial and non-financial – for online undertakings.
Among other reforms, the amended Broadcasting Act also provides the CRTC with the authority to impose administrative monetary penalties (AMPs) on all types of broadcasting undertakings (both traditional and online) for violations of certain regulatory obligations imposed under the Act, and provides the CRTC with more explicit information-gathering powers.
Telecommunications
The Telecommunications Act regulates telecommunications common carriers and telecommunications service providers. It does not regulate technologies.
The Telecommunications Act defines a telecommunications common carrier as a person who owns or operates a “transmission facility” used by that person or another person to provide telecommunications services to the public for compensation. A transmission facility means “any wire, cable, radio, optical or other electromagnetic system, or any similar technical system for the transmission of intelligence between network termination points, but does not include an exempt transmission apparatus”.
A telecommunications service means a service provided by means of telecommunications facilities, which in turn is broadly defined to include any facility or thing that is used or capable of being used for telecommunications or for any operation directly connected with telecommunications, including a transmission facility. A telecommunications service provider (TSP) is defined as “a person who provides basic telecommunications services” and includes telecommunications service resellers.
Telecommunications regulation in Canada is therefore technology-agnostic, and there are no restrictions on the use of new technologies by carriers or service providers. Certain services are, however, subject to registration and other regulatory requirements. For example:
The CRTC does not charge for registering as a TSP, but it operates a contribution fund to which carriers and TSPs are required to contribute a percentage of their Canadian telecommunications revenues once those revenues reach CAD10 million or more. Money from this fund is used to finance video relay services and the extension of broadband facilities to rural and remote parts of Canada.
VoIP service providers need to register with the CRTC as a carrier or reseller, depending on whether they own transmission facilities. VoIP service providers also require a basic international telecommunications services (BITS) licence, which entails an application to the CRTC. No fees are applicable for these registrations, approvals or licences other than contribution to the fund referenced above. VoIP service providers that provide access or egress to or from the public switched telephone network (PSTN), and that use North American Numbering Plan (NANP) telephone numbers to route calls, require CRTC approval of their 911 emergency calling services before providing service in Canada.
The provision of instant messaging is regulated if it involves the use of transmission facilities owned or leased by a carrier or TSP providing the messaging service. Registration as a reseller or non-dominant carrier will be required, as will a BITS licence. No fees are applicable other than contribution to the fund referenced above. The provision of an app over the public internet without transmission services is generally not regulated.
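As an illustration only, the revenue threshold for the contribution fund described above can be sketched in code. The CAD10 million eligibility threshold comes from the text; the contribution rate itself is set by the CRTC and changes over time, so the rate used below is a hypothetical placeholder, not the actual figure.

```python
# Illustrative sketch of the CRTC contribution-fund eligibility test.
# The CAD 10 million threshold is stated in the text above; the rate
# below is a hypothetical placeholder (the actual percentage is set
# by the CRTC and is not taken from this guide).

THRESHOLD_CAD = 10_000_000
HYPOTHETICAL_RATE = 0.005  # placeholder only, not the real CRTC rate

def contribution_owed(canadian_telecom_revenues_cad: float) -> float:
    """Return an illustrative contribution amount for a carrier or TSP.

    Providers below the CAD 10 million revenue threshold owe nothing;
    above it, the contribution is a percentage of Canadian
    telecommunications revenues.
    """
    if canadian_telecom_revenues_cad < THRESHOLD_CAD:
        return 0.0
    return canadian_telecom_revenues_cad * HYPOTHETICAL_RATE

print(contribution_owed(5_000_000))   # below threshold: 0.0
print(contribution_owed(20_000_000))  # above threshold: 100000.0
```

The point of the sketch is simply that the obligation is an all-or-nothing eligibility test at the revenue threshold, followed by a percentage-of-revenues calculation.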
Radiocommunications
The Radiocommunication Act regulates spectrum, and the Minister of Innovation, Science and Industry is empowered to issue radio or spectrum licences, or to exempt frequencies from the requirement for a licence. The Minister oversees the department of Innovation, Science and Economic Development Canada (ISED), and is empowered to charge fees for radio or spectrum licences or to hold competitive bidding auctions.
Radio apparatus must meet ISED standards and certification requirements before they can be marketed, sold, offered for sale or imported into Canada. Certifications from specified countries can be used as the basis for obtaining Canadian certification, but certification from foreign regulators such as the FCC does not serve as authorisation to market, sell, offer for sale, or import radio apparatus in Canada.
IT Services
As with cloud services, there are no private sector laws of general application focused primarily on the provision of IT services to the private sector in Canada, though other Canadian laws will apply to the provision of such services.
Applicable laws include:
All Canadian provinces and territories other than Quebec operate under a common law regime, and the law with respect to contracts will be broadly similar to that of other common law jurisdictions such as the United States and the United Kingdom, subject to Canadian jurisprudence and provincial laws concerning contracts. Quebec is a civil law jurisdiction subject to the Civil Code of Quebec, including provisions of the code that address service contracts.
The issues discussed in 3. Cloud and Edge Computing regarding cloud services also apply to IT services.
Provincial Laws and Regulations Governing the Use of Electronic Signatures
The use of electronic signatures is governed by provincial statute and regulation, and the requirements and conditions for use vary considerably from province to province; as such, this summary is not intended to be comprehensive. As an example, Ontario’s Electronic Commerce Act (the “Ontario ECA”) is generally permissive of the use of electronic signatures, which it defines as “electronic information that a person creates or adopts in order to sign a document and that is in, attached to or associated with the document”.
The Ontario ECA, however (like other provincial statutes and regulations), forbids the use of electronic signatures for particular types of documents; in the case of the Ontario ECA, these include wills, powers of attorney and documents that constitute negotiable instruments. The Ontario ECA also requires the electronic signature to meet any prescribed technology standards or requirements.
The Ontario ECA does not apply to the use of biometric information as an electronic signature or digital personal identifier, although such information can be used if all parties to a transaction agree to its use.
Federal Laws and Regulations Related to Financial Institutions
Federal laws and regulations that impose “know your customer” requirements may also apply to financial institutions and other entities involved in the provision of services involving the transfer of money.
For example, the Proceeds of Crime (Money Laundering) and Terrorist Financing Act, and its underlying regulations, prescribe certain acceptable methods that a reporting entity (eg, a bank, trust and loan company, securities dealer, money service business, etc) can use to identify an individual, corporation or entity other than a corporation. These methods include:
Guidance issued by the Financial Transactions and Reports Analysis Centre of Canada (FINTRAC), Canada’s financial intelligence unit, contemplates the ability of a reporting entity to use the government-issued photo identification method when a person is not physically present. This requires the reporting entity to have a process in place to authenticate the government-issued photo identification document. At this time, FINTRAC guidance provides few details with respect to the specifications or technical requirements for such a process, other than that the process must be able to determine that a government-issued photo identification document is authentic, valid and current. The guidance does, however, indicate that merely viewing a person and their government-issued photo identification document through a video conference or another type of virtual application is not sufficient.
Bay Adelaide Centre
333 Bay Street, Suite 2400
PO Box 20
Toronto, ON
M5H 2T6
Canada
+1 416 366 8381
+1 416 364 7813
toronto@fasken.com
www.fasken.com