Constitutional Rights
The United States Constitution does not explicitly include a right to privacy. The Bill of Rights does, however, protect certain aspects of privacy. For example, the First Amendment protects the privacy of beliefs, the Third Amendment protects the privacy of the home by prohibiting the quartering of soldiers in private homes without the owner's consent, the Fourth Amendment prohibits unreasonable searches and seizures, and the Fifth Amendment protects against compelled self-incrimination, safeguarding the privacy of personal information in criminal and civil legal proceedings. Moreover, the Ninth Amendment provides that the enumeration of certain rights in the Bill of Rights cannot be construed to deny the existence of other rights, and some commentators interpret it as affirming the existence of rights beyond those expressly protected by the Bill of Rights. Finally, certain Supreme Court decisions indicate that the right to privacy, especially in marital relations, is part of the liberty interest protected by the 14th Amendment.
Sector-Specific Data Protection Legislation
There is currently no single, all-encompassing federal legislation covering privacy and the protection of personal information generally in the USA. Instead, legislation at the federal level primarily protects data in specific sectors, such as healthcare, education, communications and financial services or, in the case of online data collection, that of children. Examples of such laws include:
The sectoral approach adopted by US federal law to address privacy and data protection means that each state may enact its own laws governing privacy and data protection. As a result, privacy requirements differ from state to state, and cover different areas. Where a federal statute covers a specific topic, it may pre-empt a similar state law on the same topic.
The Federal Trade Commission
The Federal Trade Commission (FTC) is an independent US law enforcement agency charged with protecting consumers and enhancing competition across broad sectors of the economy. The FTC's primary legal authority comes from Section 5 of the Federal Trade Commission Act, which prohibits "unfair or deceptive acts or practices" in the marketplace. The FTC has taken the view that "unfair or deceptive acts or practices" include, for example, a company's failure to adhere to its own published privacy notice and the company's failure to provide an adequate level of security for the personal information it holds, as well as the use of deceptive marketing practices. If a company violates an FTC order, the FTC can seek civil monetary penalties for the violations. The FTC can also seek civil monetary penalties for violations of certain privacy statutes and rules. This broad authority allows the FTC to address a wide array of practices affecting consumers, including those that emerge with the development of new technologies and business models.
The FTC
The FTC, an independent US federal law enforcement agency charged with protecting consumers, has become the primary privacy and security enforcement agency in the USA. The FTC's primary legal authority comes from Section 5 of the Federal Trade Commission Act, under which the FTC's jurisdiction is limited to challenging privacy violations by organisations whose information practices are considered "deceptive" or "unfair". For example, when a company claims that it will safeguard the personal information of its customers, the FTC may bring an enforcement action to ensure that the company lives up to that promise. On this basis, the FTC has brought legal actions against organisations that have violated consumers' privacy rights, misled them by failing to maintain security for sensitive consumer information, or caused substantial consumer injury.
In addition to its authority to take action against deceptive or unfair trade practices, the FTC has the authority to enforce several sector-specific laws, which include the CAN-SPAM Act, COPPA, the FCRA and the TCFAPA, among others. Since the FTC’s enforcement actions nearly always result in settlement agreements with companies, the contents of those agreements are used by companies looking for guidance in developing privacy practices.
The FTC may start an investigation on its own based on publicly available information, at the request of another agency, or based on complaints from consumers or competitors.
Other Agencies
Other agencies at the federal and state levels, as well as state consumer protection regulators (usually the state Attorneys General), may also exercise regulatory authority in relation to privacy. At the federal level, examples include the Office of the Comptroller of the Currency, the Department of Health and Human Services, the Federal Communications Commission, the Securities and Exchange Commission, the Consumer Financial Protection Bureau and the Department of Commerce.
State Attorneys General
State Attorneys General have the power to bring enforcement actions based on unfair or deceptive trade practices. These powers typically derive from state laws prohibiting "unfair or deceptive acts and practices" and authorising the state Attorney General to initiate enforcement actions.
Recent privacy events have seen increased co-operation and co-ordination in enforcement among state Attorneys General, whereby multiple states will jointly pursue actions against companies that experience data breaches or face other privacy allegations. Co-ordinated actions among state Attorneys General often exact greater penalties from companies than a single enforcement authority would typically obtain. In recent years, Attorneys General in states such as California, Connecticut and Maryland have formally created units charged with the oversight of privacy, and the State of New York has created a unit to oversee the internet and technology.
California Privacy Protection Agency
In November 2020, voters in the State of California approved Proposition 24, also known as the California Privacy Rights Act of 2020 (CPRA). The CPRA added new privacy protections to the existing California Consumer Privacy Act of 2018 (CCPA) and created a new agency, the California Privacy Protection Agency (CPPA), to implement and enforce the CCPA and the CPRA. The CPPA may bring enforcement actions related to the CCPA or the CPRA before an administrative law judge. The California Attorney General retains civil enforcement authority over the CCPA and the CPRA.
Adjudication
The FTC determines in an adjudicative proceeding whether a practice violates the law. As mentioned previously, pursuant to Section 5(b) of the FTC Act, the FTC may challenge “unfair or deceptive” acts or practices. When the FTC has “reason to believe” that a violation of the law has occurred, the FTC may issue a complaint setting forth its charges. If the respondent elects to settle the charges, it may sign a consent agreement (without admitting liability), consent to entry of a final order, and waive all right to judicial review. If the FTC accepts the proposed consent agreement, it places the order on the record for thirty days of public comment (or for such other period as the FTC may specify) before determining whether to make the order final.
If instead the respondent elects to contest the charges, the complaint is adjudicated before an administrative law judge (ALJ) in a trial-type proceeding conducted under the FTC’s Rules of Practice. A “complaint counsel”, who is a staff member from the relevant bureau or a regional office, conducts the prosecution of the matter. Upon conclusion of the hearing, the ALJ issues an “initial decision” setting forth their findings of fact and conclusions of law, and recommending either the entry of an order to cease and desist, or the dismissal of the complaint. Either complaint counsel or respondent, or both, may appeal the initial decision.
Upon appeal of an initial decision, the FTC receives briefs, holds oral argument, and thereafter issues its own final decision and order. The FTC’s final decision is appealable by any respondent against which an order is issued. The respondent may file a petition for review with any US Court of Appeals within whose jurisdiction the respondent resides or carries on business or where the challenged practice was used. If the Court of Appeals affirms the FTC’s order, the Court enters its own order of enforcement. The party losing in the court of appeals may seek review by the Supreme Court.
Enforcement
An FTC order generally becomes final (ie, binding on the respondent) 60 days after it is served, unless the order is stayed by the FTC or by a reviewing court. Divestiture orders become final after all judicial review is complete (or if no review is sought, after the time for seeking review has expired). If a respondent violates a final order, it is liable for a civil penalty for each violation. The penalty is assessed by a federal district court in a suit brought to enforce the FTC’s order.
Where the FTC has determined in a litigated administrative adjudicatory proceeding that a practice is unfair or deceptive, and has issued a final cease and desist order, the FTC may obtain civil penalties from non-respondents who thereafter violate the standards articulated by the FTC. To accomplish this, the FTC must show that the violator had “actual knowledge that such act or practice is unfair or deceptive and is unlawful” under Section 5(a)(1) of the FTC Act. To prove “actual knowledge”, the FTC typically shows that it provided the violator with a copy of the FTC’s determination about the act or practice in question, or a “synopsis” of that determination.
APEC’s CBPR System
The USA participates in the Asia-Pacific Economic Cooperation’s (APEC) Cross-Border Privacy Rules (CBPR) system. At this stage, around twenty US companies are certified under the CBPR system and are therefore required to implement privacy policies and practices that are consistent with the CBPR programme requirements. The objective of the CBPR system is to bridge national privacy laws within APEC and reduce barriers to the flow of information. Certified businesses also demonstrate their commitment to consumer privacy through this system.
Global CBPR Forum
In April 2022, US Secretary of Commerce Gina Raimondo announced a key development in international collaboration: the newly created Global Cross-Border Privacy Rules Forum (the "Global CBPR Forum"). Participant countries include Canada, Japan, the Republic of Korea, the Philippines, Singapore and Chinese Taipei. In August 2022, the Australian Government announced that Australia had joined the Global CBPR Forum.
According to the Global CBPR Declaration, the framework establishes a certification system to help companies in participating jurisdictions demonstrate compliance with internationally recognised privacy standards, with the aim of fostering interoperability and international data flows. The Global CBPR Forum will replace the existing APEC Cross-Border Privacy Rules (APEC CBPR) and Privacy Recognition for Processors (PRP) certification systems, enabling non-APEC countries to participate.
Transfers From the EEA: the Privacy Shield and SCCs
Data transfers from the European Economic Area (EEA) to countries outside the EEA may only occur if the destination country offers an "adequate" level of data protection, which generally means a level essentially equivalent to that of the EU General Data Protection Regulation (GDPR). The European Commission has determined that several countries ensure an adequate level of protection due to their domestic law or the international commitments they have entered into. Pursuant to EU data protection law, the USA is not considered to offer an "adequate" level of data protection in relation to transfers of data. However, the EU Commission considered data transfers to US organisations that were certified under the EU-US Privacy Shield framework to be adequate. The EU Commission's adequacy decision on the Privacy Shield framework was adopted on 12 July 2016, and the Privacy Shield framework became operational on 1 August 2016.
On 16 July 2020, the Court of Justice of the European Union (CJEU) issued a ruling invalidating the EU-US Privacy Shield framework and setting out stricter criteria for using other safeguards such as standard contractual clauses (SCCs) or binding corporate rules (BCR). In particular, the CJEU pointed to the far-reaching possibilities of surveillance that exist under US national security laws. The CJEU identified Section 702 of the Foreign Intelligence Surveillance Act (FISA), Executive Order 12333 and Presidential Policy Directive 28 (PPD-28), which allow US intelligence agencies to collect data on foreign nationals, as inconsistent with rights guaranteed in the Charter of Fundamental Rights of the European Union.
On 4 June 2021, the European Commission issued an updated set of SCCs for data transfers from controllers or processors located in the EEA (or otherwise subject to the GDPR) to controllers or processors established outside the EEA (and not subject to the GDPR). Since then, there have been decisions, such as from the Austrian or French data protection authorities in relation to Google Analytics, which have invalidated certain data transfers from the EEA to the USA due to concerns surrounding the potential accessibility of the data by intelligence services.
After nearly two years of negotiations, the United States and the European Commission announced the Trans-Atlantic Data Privacy Framework on 25 March 2022, designed to address the concerns raised by the Court of Justice of the European Union when it struck down the Commission's adequacy decision underlying the EU-US Privacy Shield framework in 2020. In particular, under the proposed framework, the United States committed to strengthen the privacy and civil liberties safeguards governing US signals intelligence activities, establish a new redress mechanism with independent and binding authority, and enhance its existing rigorous and layered oversight of signals intelligence activities.
Following such announcement, President Biden signed on 7 October 2022 an Executive Order to implement the EU-US Data Privacy Framework. Among other things, the new framework will allow individuals in the EU to seek redress through an independent Data Protection Review Court made up of members outside the US government. That body “would have full authority to adjudicate claims and direct remedial measures as needed”. In addition, the Executive Order provides that US signals intelligence activities may only be conducted following a determination that they are “necessary to advance a validated intelligence priority”, and “only to the extent and in a manner that is proportionate to the validated intelligence priority for which they have been authorized”. The Executive Order also specifies certain “legitimate objectives” and “prohibited objectives” for which US signals intelligence activities may be carried out. The Executive Order also requires US intelligence agencies to review their policies and procedures to implement these new safeguards.
On 13 December 2022, the European Commission announced that it had launched the process for the adoption of an adequacy decision for the EU-US Data Privacy Framework. By way of reminder, an adequacy decision is one of the tools provided under the GDPR to transfer personal data from the EU to third countries that, in the view of the European Commission, offer a level of protection of personal data comparable to that of the European Union.
As part of the adequacy process, the European Data Protection Board (EDPB) will issue an opinion on the draft adequacy decision, although that opinion is non-binding. The review is expected to take at least six months. Once finalised, the European Commission will put the proposal before a committee of EU member state representatives, which will have the final say. In addition, the European Parliament has a right of scrutiny over adequacy decisions. As of the date of publication of this chapter (9 March 2023), the adequacy process was not yet complete.
In the meantime, it can be noted that the European Commission has already stated in relation to the Executive Order that “all the safeguards that have been put in place by the US Government in the area of national security (including the redress mechanism) will be available for all transfers to companies in the US under the GDPR, regardless of the transfer mechanisms used”.
There are a number of non-governmental organisations (NGOs) in the USA that are focused on privacy and data protection issues.
The USA and the EU have a fundamentally different approach to privacy law. Generally, the EU member states view privacy as a fundamental human right and freedom. In particular, Article 8 of the EU Charter of Fundamental Rights proclaims that “everyone has the right to the protection of personal data concerning him or her”, and also that “everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.” In addition, even before the GDPR was adopted, the European approach was to use a comprehensive or omnibus approach to data protection law, where an overarching law covers the collection of all information and data relating to all EU data subjects.
By contrast, the US Constitution contains no express right to privacy. Moreover, rather than create an overarching privacy regulation, the USA has enacted various privacy laws as and when a need for them arises, based on a sectoral approach. As discussed in 1.1 Laws, there are a number of laws covering specific sectors: for example, health information is regulated under HIPAA, financial information is regulated under the GLBA and FCRA, and marketing can be regulated under the TCPA or CAN-SPAM regulations.
Moreover, information relating to an individual is typically referred to as "personally identifiable information" (PII) or "personal information", in contrast to the concept of "personal data" found in the European framework. Under US law, the scope of PII or personal information is not uniform, as the information protected varies across statutes and states. In particular, certain types of data may be protected for a given purpose under a specific framework, but not for another. "Personal data", in the context of the GDPR, covers a much wider range of information than PII. In other words, all PII is personal data, but not all personal data is PII.
Unsurprisingly, terminology and concepts introduced in the GDPR, such as “processor”, “controller”, “data subject” or “sensitive personal data”, are generally not applicable in the USA, among (many) other differences.
A number of key developments have taken place in the past 12 months affecting US businesses, as described throughout this chapter. See in particular 1.4 Multilateral and Subnational Issues in relation to developments on cross-border data transfers. Other significant developments include the following.
Brexit
Brexit has also impacted data transfers to the USA, since, in addition to the GDPR, organisations now need to consider the UK General Data Protection Regulation (UK GDPR), as tailored by the UK Data Protection Act 2018. For instance, due to Brexit, the revised SCCs mentioned above, published by the EU Commission, do not directly apply in the UK, since the UK is no longer an EU member state.
On 2 February 2022, the Secretary of State laid before Parliament an international data transfer agreement (IDTA), an international data transfer addendum to the European Commission’s standard contractual clauses for international data transfers (Addendum) and a document setting out transitional provisions. The documents were issued under Section 119A of the Data Protection Act 2018 and, following Parliamentary approval, came into force on 21 March 2022.
The UK government has also announced that it is working in partnership with a number of priority countries to reach a UK adequacy arrangement in the future with such countries. The United States is one of the countries on the UK’s list of priority destinations for adequacy.
State Legislation
California Consumer Privacy Act, as amended by the California Privacy Rights Act
At the state level, the California Consumer Privacy Act of 2018 (CCPA) came into effect on 1 January 2020, introducing one of the most comprehensive privacy laws in the USA. The CCPA established new rights for California residents, additional protections for children's data and rules around the sale of personal information. The CCPA also included the right for California residents to opt out of the sale of their personal information and the right to non-discrimination in terms of price and services when a consumer exercises a privacy right under the CCPA. The Office of the Attorney General issued regulations in June 2020 clarifying how to implement the CCPA's requirements. The regulations address topics such as how businesses should provide notice to individuals, verify the identity of requesters, and handle requests for the exercise of privacy rights (eg, right to know, right to access, right to delete, right to opt out, etc). The regulations went into effect on 14 August 2020. Additional amendments to the regulations went into effect on 15 March 2021.
In November 2020, voters in the State of California approved Proposition 24, also known as the California Privacy Rights Act (CPRA), which went into effect on 1 January 2023. The CPRA amends and expands the existing CCPA. In particular, the CPRA created the California Privacy Protection Agency (CPPA), which has the authority to bring an administrative enforcement action against businesses that violate the CCPA or the CPRA. The California Attorney General retains enforcement authority over the CCPA and the CPRA. Changes introduced in the CCPA, as amended by the CPRA, include:
Virginia Consumer Data Protection Act
On 2 March 2021, the Virginia Consumer Data Protection Act (VCDPA) was signed into law and became effective on 1 January 2023. This made Virginia the second state to enact a consumer privacy and data security law, following in the footsteps of California. The VCDPA applies to businesses that conduct business in Virginia, or produce products or services that target Virginia residents, and that:
“Consumer” is defined as a natural person who is a resident of Virginia, acting only in an individual or household context. The definition explicitly excludes individuals acting in a commercial or employment context.
The VCDPA grants Virginia consumers the rights to access, correct, delete, know, and opt out of the sale of their personal information and its processing for targeted advertising purposes, similar to the CCPA and CPRA. However, the VCDPA is not a replica of the CPRA; instead, it takes inspiration from the GDPR in a few key areas. For example, it requires covered organisations to perform Data Protection Assessments (not to be confused with Data Protection Addendums), which resemble the GDPR's Data Protection Impact Assessments (DPIAs), and the VCDPA further adopts terminology similar to that used in the GDPR (ie, "controller" and "processor"). The Attorney General may initiate enforcement actions and seek civil penalties of up to USD7,500 per violation of the VCDPA. There is no private right of action for consumers under the VCDPA.
Colorado Privacy Act
The Colorado Privacy Act (CoPA) was enacted on 8 July 2021 and is set to take effect on 1 July 2023. CoPA applies to legal entities that conduct business or produce commercial products or services that are intentionally targeted to Colorado residents and that either:
Similar to the VCDPA, CoPA’s definition of consumer does not include individuals acting in commercial or employment contexts. Instead, it is designed to protect the “consumer”, defined in CoPA as “an individual who is a Colorado resident acting only in an individual or household context; and does not include an individual acting in a commercial or employment context, as a job applicant, or as a beneficiary of someone acting in an employment context”. Importantly, CoPA also uses similar terminology to the GDPR (ie, “personal data”, “controller” and “processor”).
Among other things, CoPA grants consumers the following:
Connecticut’s Data Privacy Act
On 10 May 2022, Governor Ned Lamont signed Senate Bill 6, An Act Concerning Personal Data Privacy and Online Monitoring (CDPA), into law. Most provisions of the law will go into effect alongside the Colorado Privacy Act on 1 July 2023.
It applies to businesses that, during the preceding calendar year:
The CDPA includes many of the same rights, obligations and exceptions as the data privacy laws in California, Colorado and Virginia. It draws heavily from CoPA and the VCDPA, with many of its provisions either mirroring or falling somewhere between the two, but it contains a few notable distinctions that should be factored into an entity's compliance efforts.
Utah’s Consumer Privacy Act
In March 2022, Governor Spencer Cox signed the Utah Consumer Privacy Act (UCPA) into law, which takes inspiration from the VCDPA, CoPA and CPRA.
The UCPA applies to both data controllers and processors that generate over USD25 million in annual revenue and either:
Contrary to the CPRA, VCDPA or CoPA, the UCPA does not grant individuals the right to opt out of profiling or the right to correct inaccuracies in their data.
Significant pending changes, hot topics and issues on the horizon over the next 12 months include the following.
Following the Schrems II decision, SCCs remain a valid EU-US data transfer mechanism but require companies to self-assess the recipient country's level of protection (ie, by conducting a Transfer Impact Assessment) and adopt supplementary measures where the third country does not provide "a sufficient level" of safeguards in accordance with EU data protection law. Since President Biden's Executive Order to implement the EU-US Data Privacy Framework (see 1.4 Multilateral and Subnational Issues for additional background information) has taken effect, companies currently conducting Transfer Impact Assessments should reference and discuss this Executive Order when analysing the impact of US surveillance laws on the proposed transfers. Once the EU-US Data Privacy Framework is available for companies to self-certify, the costs and uncertainty associated with conducting a Transfer Impact Assessment and adopting supplementary measures are likely to be reduced. In the meantime, the European Commission has indicated that "all the safeguards that have been put in place by the US Government in the area of national security (including the redress mechanism) will be available for all transfers to companies in the US under the GDPR, regardless of the transfer mechanisms used". In any case, international data transfers are expected to remain a hot topic for the foreseeable future.
Several state legislatures have followed California's lead and enacted their own privacy laws, namely Virginia, Colorado, Connecticut and Utah. As mentioned in 1.7 Key Developments, 2023 is the year when these new US state privacy laws enter into effect, and more states are expected to adopt their own privacy laws. In turn, the enactment of various state privacy laws is likely to increase pressure to enact a comprehensive US federal privacy law, as organisations struggle to comply with the requirements of the various state laws, each imposing slightly different requirements. Simply put, organisations currently have to comply with no fewer than five different state privacy laws, with more expected to be enacted in the near future. However, while there was a push in 2022 for a federal privacy bill, the proposed text (the American Data Privacy and Protection Act) was ultimately not successful.
As mentioned in 1.1 Laws, there is currently no federal legislation protecting personal information generally across the country. Rather, there are many laws at the federal level protecting personal information in specific sectors, and, in addition, the various privacy laws enacted at state level must be taken into account.
The State of California has traditionally taken a leadership role in the USA in relation to cybersecurity and the protection of the personal information of California residents. For example, California was one of the first states in the nation to provide an express right of privacy in the California Constitution, giving each citizen an “inalienable right” to pursue and obtain “privacy”. California also was the first US state to enact, in 2002, a data breach notification law requiring organisations to notify all impacted individuals “in the most expedient time possible and without unreasonable delay, consistent with the legitimate needs of law enforcement”, whenever information relating to a California resident may have been compromised. The CCPA is the first state-level omnibus privacy law imposing broad obligations on businesses to provide state residents with transparency and control over their personal information. The CPRA, which entered into effect in January 2023, amends and further extends the CCPA’s requirements.
Territorial Scope
Organisations established in other jurisdictions may be subject to both federal and state privacy laws if they collect, store, transmit, process or share personal information of US residents.
Principles
The FTC has issued various guidance documents addressing principles such as transparency, lawfulness of processing, purpose limitation, data minimisation, proportionality, retention and recommending privacy-by-design practices. The FTC staff has also issued guidance on online behavioural advertising, emphasising core principles such as giving meaningful disclosure and choice to consumers, limiting data retention and obtaining consent where information is intended to be used in a manner that differs from the disclosures made when the data were collected.
Privacy Policy
Certain states have enacted laws requiring the publication of a privacy policy. The California Online Privacy Protection Act (CalOPPA), the first state law in the nation to require commercial websites and online services to post a privacy policy, went into effect in 2004. CalOPPA was later amended in 2013 to require certain disclosures regarding tracking of online visits.
CalOPPA applies to any person or company whose website or online service collects personal information from California consumers. It requires the website to feature a conspicuous privacy policy stating exactly what information is collected and with whom it is shared. Sectoral laws may impose certain requirements. For example, financial institutions covered by the Gramm-Leach-Bliley Act must tell their customers about their information-sharing practices and explain to customers their right to “opt out” if they do not wish their information shared with certain third parties.
Individual Rights
There is no general right of access, rectification, deletion, objection or restriction recognised across the country for all types of personal information. Instead, the existence of these rights depends on each specific statute (there is no common general approach across the country). For example, COPPA provides that parents have a right to review and delete the personal information relating to their children. Pursuant to HIPAA, individuals are entitled to request copies of medical information held by a health services provider. Pursuant to the FCRA, individuals may receive a copy of their credit report maintained by a reporting agency. In relation to state law, the CCPA grants California residents several rights in relation to personal information held by a business relating to that resident, such as the right of access, right of deletion, right to restrict processing, right to data portability, etc. The CPRA further extends the CCPA, recognising the right to correct inaccurate information.
Registration Requirements
Some states (such as California and Vermont) require data brokers to register with the state Attorney General. For example, California’s data broker law applies to “data brokers”, which are defined as businesses that knowingly collect and sell to third parties the personal information of consumers with whom the businesses do not have direct relationships. Data brokers must also pay an annual registration fee. Any data broker that fails to register may be subject to a civil penalty of USD100 for each day it remains unregistered, as well as other penalties, fees and costs.
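As a rough illustration of how the late-registration exposure described above accrues, the daily penalty can be sketched as follows. This is a hypothetical helper, not legal advice: the USD100-per-day figure comes from the summary above, and the sketch ignores the additional penalties, fees and costs that may also apply.

```python
from datetime import date

# Civil penalty per day unregistered, per the summary above (assumption:
# baseline figure only; other penalties, fees and costs are ignored).
CA_DAILY_PENALTY_USD = 100

def late_registration_exposure(deadline: date, registered_on: date) -> int:
    """Estimate the baseline civil penalty for a data broker that
    registered after the deadline. Returns 0 for timely registration."""
    days_late = (registered_on - deadline).days
    return max(days_late, 0) * CA_DAILY_PENALTY_USD

# Example: registering 30 days after a (hypothetical) deadline
print(late_registration_exposure(date(2023, 1, 31), date(2023, 3, 2)))  # 3000
```

The example dates are illustrative only; the actual registration deadline and any accrued amounts would be determined under the statute and by the enforcing authority.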
Data Protection Officer
There are no specific requirements to appoint a formal privacy officer or data protection officer in the USA. However, certain regulated entities (eg, those covered by statutes such as HIPAA or the GLBA) are required to comply with certain privacy and security obligations. Some states may also require the formal appointment of an employee to maintain the organisation’s information security programme. New York’s SHIELD Act, which became effective on 21 March 2020, identifies the required components of a data security programme that, if implemented, are deemed to satisfy the reasonableness standard under New York law – the designation of “one or more employees to coordinate the security program” is expressly listed as a “reasonable administrative safeguard”. In any case, appointing a chief privacy officer and a chief information security officer is a best practice that is common among larger organisations and increasingly also among mid-sized ones.
International Transfers
The USA does not generally restrict the transfer of personal information to other countries.
Data Security and Data Breaches
Certain federal and state laws impose obligations to ensure the security of personal information. The FTC has stated that a company’s security measures must be reasonable. In addition, some federal and state laws establish breach notification requirements. State statutes require the reporting of data breaches to a state agency or Attorney General under certain circumstances.
In light of recent cyber-attacks, certain states have begun enacting laws that provide a liability exemption for companies that adopt industry-recognised cybersecurity frameworks, such as the National Institute of Standards and Technology’s (NIST) Cybersecurity Framework and the Center for Internet Security’s (CIS) Critical Security Controls. These laws are intended to give companies an incentive to follow nationally recognised cybersecurity standards by granting a “safe harbour” against certain state tort law claims in the event of a data breach. Both Utah (March 2021) and Connecticut (July 2021) adopted such cybersecurity safe harbour statutes for businesses impacted by a data breach, following in the footsteps of Ohio, which enacted such legislation in 2018.
In the USA, certain statutes (such as the GLBA and the FCRA) impose additional requirements for sensitive information.
Financial Information
The GLBA regulates the collection, safekeeping and use of private financial information by financial institutions. For example, according to the GLBA’s Safeguards Rule, if an entity meets the definition of a financial institution, it must adopt measures to protect the customer data in its possession. Financial institutions are required to notify customers of their data practices and privacy policies, limit the disclosure of personal information to third parties and establish appropriate safeguards to secure personal information.
Where the personal information of customers is impacted by a security breach, financial institutions must notify the relevant regulators and the customers involved. There are a number of regulators that can enforce consumer privacy rights under the GLBA including the Federal Reserve, the Federal Deposit Insurance Corporation, the Office of the Comptroller of the Currency, the Securities and Exchange Commission, the Consumer Financial Protection Bureau and the FTC (for non-bank financial institutions).
Health Information
For organisations operating in the healthcare industry, the Department of Health and Human Services (HHS) enforces compliance with HIPAA and HITECH. HIPAA applies to a range of organisations, such as those that administer health plans, healthcare clearing houses, healthcare providers, service providers that require access to protected health information (PHI) and providers of employee medical insurance. In order to safeguard electronically stored health information, HIPAA requires organisations to enter into business associate agreements with vendors who will require access to PHI. Such agreements restrict the vendors’ use and disclosure of PHI to the purposes set out in the agreement and require the vendors to protect the confidentiality and integrity of the data. HIPAA’s Breach Notification Rule requires data breaches to be reported to the HHS and imposes civil and criminal penalties on organisations that fail to adequately protect PHI with appropriate information security standards. In addition, HIPAA’s Security Rule requires organisations to maintain appropriate administrative, physical and technical measures to protect the confidentiality, integrity and security of electronic PHI.
Communications Data
Communications data is governed by a number of federal laws, such as the Electronic Communications Privacy Act (ECPA) and the Telephone Consumer Protection Act (TCPA).
Children’s and Students’ Information
Information relating to children is protected by the Children’s Online Privacy Protection Act (COPPA), which imposes requirements on operators of websites or online services directed to children under the age of 13, and on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under the age of 13. Among other requirements, operators of websites or online services must post a complete privacy policy online, notify parents directly about their information collection practices, and get verifiable parental consent before collecting personal information from their children or sharing it with others. The FTC is responsible for the enforcement of COPPA.
The Family Educational Rights and Privacy Act (FERPA) is a federal law that protects the privacy of student education records. The law applies to all schools that receive funds under an applicable programme of the US Department of Education. It gives parents or eligible students more control over their educational records, and it prohibits educational institutions from disclosing “personally identifiable information in education records” without the written consent of an eligible student, or, if the student is a minor, the student’s parents.
Video Viewing Information
The Video Privacy Protection Act (VPPA), passed by Congress in 1988, is intended to prevent a “video tape service provider” from “knowingly” disclosing an individual’s “personally identifiable information” (PII) to third parties where that individual “requested or obtained… video materials” such as “pre-recorded video cassette tapes or similar audio visual materials”. When passing the law, Congress had in mind rental providers of visual materials such as VHS tapes. While the text of the VPPA may appear outdated today, the statute has been at the centre of a number of high-profile lawsuits in recent years, since its broad language has been applied to digital video materials, such as online video-streaming services. The VPPA creates a private right of action and allows a court to award statutory damages of at least USD2,500 per violation.
Credit and Consumer Reports
Credit and consumer reports are governed by the FCRA, as amended by the Fair and Accurate Credit Transactions Act 2003, which promotes accuracy, fairness and privacy of the information contained in consumer credit reports and aims to protect consumers from identity theft. The law regulates the way credit reporting agencies can collect, access, use and share the data they collect in individuals’ consumer reports. For example, the FCRA grants consumers the right to request and access all the information a reporting agency has about such a consumer. Enforcement of the FCRA is shared between the FTC and federal banking regulators.
Online Behavioural Advertising
The FTC staff has issued guidance on online behavioural advertising, emphasising a number of principles to protect consumer privacy interests, including transparency and consumer control, reasonable security and limited retention of consumer data, and affirmative express consent for the use of sensitive information.
However, the FTC has not indicated that opt-in consent for the use of non-sensitive information is necessary in behavioural advertising.
The CAN-SPAM Act, a law that sets the rules for commercial email, requires commercial messages to contain a method for recipients to opt out or unsubscribe from such communications without incurring any costs. Despite its name, the CAN-SPAM Act does not apply just to bulk email. It covers all commercial messages, which the law defines as “any electronic mail message the primary purpose of which is the commercial advertisement or promotion of a commercial product or service”, including email that promotes content on commercial websites. The law makes no exception for business-to-business email. That means all email – for example, a message to former customers announcing a new product line – must comply with the law. However, emails that are informational, transactional or relationship-oriented are exempt from CAN-SPAM.
There are federal and state laws that apply to telemarketing communications and that vary in the restrictions imposed, including restricted calling times, do-not-call registers, opt-out requests, mandatory disclosures and prohibitions on the use of auto-diallers or pre-recorded messages.
The FTC’s Telemarketing Sales Rule established a national do-not-call register that prevents most unsolicited telemarketing calls, although there are some exceptions. Political parties, charities, debt collectors, healthcare providers and organisations issuing informational (rather than sales) messages may still call telephone numbers that feature on the register. Where an individual has an established business relationship with an organisation, the register does not prevent unsolicited calls for a period of 18 months after the relationship has ended.
Under the TCPA, an individual’s express written consent must be obtained before certain marketing texts may be sent to a mobile phone, including messages sent using an auto-dialler. The TCPA and CAN-SPAM Act apply to both business-to-consumer and business-to-business electronic direct marketing. The FTC, the FCC and the state Attorneys General are active enforcers in this area. California’s Shine the Light Act requires businesses that disclose personal information to third parties for those third parties’ direct marketing purposes to provide notice and access to certain information.
Federal Legislation
Broadly, employee monitoring in the USA is legal and mostly unregulated. Because an employer-provided computer system is the employer’s property, the employer may listen to, watch and read employees’ workplace communications and, in some cases, personal messages. While the Fourth Amendment of the US Constitution protects the privacy rights of federal, state and local government employees, this protection does not extend to employees in the private sector.
Digital privacy is covered by the Electronic Communications Privacy Act (ECPA), which protects against the interception (in transit) of digital and electronic communications. It also includes the Stored Communications Act (SCA), which, as the name suggests, covers the disclosure of stored communications and records. The ECPA permits employers to monitor the verbal and written communications of their employees, provided there is a legitimate business reason for such monitoring or the employer has obtained the employee’s consent. The SCA has a broader exception, allowing employers to access stored information outside the ordinary course of business.
State Legislation
There are some state laws that regulate the monitoring of employee communications. Connecticut and Delaware require employers to notify employees in advance of any monitoring of email or internet use; in Connecticut, the notice must be in writing and describe the monitoring methods that will be used. In California, Florida, Louisiana and South Carolina, a state constitutional right to privacy makes employee monitoring more difficult for employers.
Video Surveillance
The National Labor Relations Board has stated that video surveillance introduced in the workplace is a condition of employment and, as such, should be agreed with trade unions and be subject to collective bargaining unless previously agreed. The Board recommends that the roll-out of any surveillance or monitoring programme always be subject to trade union scrutiny. Monitoring employees in relation to their trade union activities raises particularly significant issues.
Whistle-Blowing
US employees are protected from retaliation by their employers if they make a protected disclosure under the Whistleblower Protection Act. Federal employees usually make disclosures to an Inspector General through a confidential hotline. The Inspectors General may not disclose the identity of the disclosing employee unless disclosure is unavoidable or mandated by a court order. For non-federal employers, it is recommended that hotlines allow for anonymous reporting. Further, the Sarbanes-Oxley Act 2002 introduced a requirement for publicly traded companies to implement a mechanism for employees to make anonymous reports of financial irregularities.
FTC Enforcement
The FTC is active in regulating data security and privacy issues. For example, on 1 February 2021, the FTC announced it had finalised a settlement with Zoom over allegations that the company misled consumers about the level of security it provided in its software for online meetings. The FTC order also requires the company to implement a comprehensive security programme, review any software updates for security flaws prior to release and ensure the updates will not hamper third-party security features. The company must also obtain biennial assessments of its security programme by an independent third party, which the FTC has authority to approve, and notify the FTC if it experiences a data breach.
The possible enforcement penalties available to the FTC include injunctions and damages, although the FTC places greater reliance on consent decrees, under which it monitors the organisation for further violations, with any violation of the decree incurring financial penalties. On 14 December 2020, the FTC announced that it had issued orders to nine social media and video streaming companies, requesting information on how these companies collect, use and present personal information, their advertising and user engagement practices, and how their practices affect children and teens.
To date, the largest fine imposed by the FTC for violating consumers’ privacy is the USD5 billion fine imposed on Facebook on 24 July 2019 for violations of an earlier FTC order. It is one of the largest penalties ever assessed by the US government for any violation. As part of the new FTC order, the company is required to conduct a privacy review of every new or modified product, service or practice before it is implemented, and to document its decisions about user privacy.
On 4 September 2019, the FTC imposed a total fine of USD170 million on YouTube to settle allegations by the FTC and the New York Attorney General that the video-sharing service illegally collected personal information from children without their parents’ consent. Specifically, the settlement comprised a fine of USD136 million payable to the FTC and USD34 million payable to New York for allegedly violating the Children’s Online Privacy Protection Act (COPPA) Rule. The USD136 million penalty is currently the largest amount imposed in a COPPA case.
Enforcement by Other Regulators
The FTC is not the only regulator actively enforcing privacy. On 6 August 2020, the Office of the Comptroller of the Currency imposed a USD80 million civil money penalty against Capital One, a large national bank. It was alleged that the bank failed to establish effective risk assessment processes before migrating information technology operations to a public cloud environment and that it failed to correct the issues that arose in a timely manner.
On 22 July 2020, the New York Department of Financial Services (NYDFS) announced that it had filed administrative charges against First American, an insurance company, pursuant to the NYDFS Cybersecurity Regulation, marking the agency’s first enforcement action since the rules went into effect in March 2017. The NYDFS alleges that the insurer failed to fix a vulnerability on its public-facing website, resulting in the exposure of millions of documents containing consumers’ sensitive personal information, including bank account numbers, mortgage and tax records, social security numbers, wire transaction receipts and drivers’ licence images.
On 24 November 2020, a multistate coalition of Attorneys General announced that Home Depot, a large home improvement retailer, agreed to pay USD17.5 million and implement a series of data security practices in response to a data breach the company experienced in 2014, with the USD17.5 million payment to be divided among the 46 participating states and the District of Columbia.
On 16 August 2021, the US Securities and Exchange Commission announced that Pearson plc, a London-based public company that provides educational publishing and other services to schools and universities, agreed to pay USD1 million to settle charges that it misled investors about a 2018 cyber-incident involving the theft of student data and administrator credentials.
According to the SEC, the company made a reference in its semi-annual report filed in July 2019 (Form 6-K) to a data privacy incident as a hypothetical risk, when the cyber-incident had in fact already occurred. The SEC further alleged that the company subsequently indicated in a media statement that the breach may have included dates of birth and email addresses when such records were in fact known to have been stolen, and that the company did not patch a critical vulnerability for six months after being notified of it. The SEC also held that the company’s disclosure controls and procedures were not designed to ensure that those responsible for making disclosure determinations were informed of certain information about the circumstances surrounding the incident.
This settlement highlights once again the importance of carefully assessing the materiality of a cyber-attack and of providing adequate and accurate disclosures in company filings. In June 2021, the SEC found that another company, First American Financial Corporation, had made inaccurate disclosures regarding a cybersecurity incident, reflecting inadequate disclosure controls and procedures. These cases show the increased focus on cybersecurity issues and the importance of disclosure controls and procedures that escalate cyber-incidents in a timely manner and support any cybersecurity response plans.
Private Litigation
In addition to enforcement from regulatory entities, individuals may bring private rights of action and class actions for privacy and security violations that relate to credit reporting, marketing, electronic communications and call recording, under the respective legislation. Pursuant to the CCPA (California), individuals may bring a private right of action to claim statutory damages where their unencrypted personal information was not adequately protected by an organisation. Class actions in this area are likely to increase.
Employees may also bring a private right of action under the common law, where previous cases have established a precedent regarding the invasion of their privacy by their employer’s workplace monitoring. Employees would need to demonstrate that there was an expectation of privacy in relation to the specific information that has been monitored by an employer.
The Fourth Amendment of the US Constitution protects the privacy of a person and possessions from unreasonable searches and seizures by federal or state law enforcement authorities. This right is triggered where an individual has a reasonable expectation of privacy.
The Fourth Amendment provides safeguards to individuals during searches and detentions, and prevents unlawfully seized items from being used as evidence in criminal cases. The degree of protection available in a particular case depends on the nature of the detention or arrest, the characteristics of the place searched, and the circumstances under which the search takes place.
The reasonableness standard generally requires a warrant supported by probable cause. The search and seizure must also be conducted reasonably. When law enforcement officers violate an individual’s constitutional rights under the Fourth Amendment, and a search or seizure is deemed unlawful, any evidence derived from that search or seizure will almost certainly be kept out of any criminal case against the person whose rights were violated.
The Foreign Intelligence Surveillance Act
The Foreign Intelligence Surveillance Act (FISA) permits the US government to access personal data for national security purposes. Pursuant to FISA, the government can obtain information, facilities or technical assistance from a broad range of entities. National Security Letters (NSLs) offer an additional investigative tool for limited types of entities. The Foreign Intelligence Surveillance Court (FISC), a federal court staffed by independent, life-tenured judges, approves and oversees FISA activities.
FISA was originally intended to govern surveillance activities targeting individuals inside the USA. In 2008, however, Section 702 was enacted to authorise the acquisition of foreign intelligence information about non-US persons located outside the USA (a non-US person is anyone who is not a US citizen or permanent US resident).
Section 702 operates differently to the “traditional” FISA provisions, which require the government to obtain orders on an individualised basis and demonstrate probable cause in each case. However, pursuant to Section 702, the government does not have to specify which non-US persons will be targeted.
Under Section 702, the Attorney General (AG) and Director of National Intelligence (DNI) submit written certifications to the FISC that jointly authorise surveillance activities for up to a year. A Section 702 certification must be based on specific criteria determined annually by the AG and DNI, pending review and approval by the FISC, and must be accompanied by (and the FISC must approve) targeting procedures defining how the government determines which specific persons’ communications may be acquired.
In practice, the government sends the providers “selectors” (account identifiers such as telephone numbers or email addresses) that are associated with specific “targets” (the individuals or legal entities about whom foreign intelligence is sought). Thus, in the terminology of Section 702, people (eg, non-US persons reasonably believed to be located outside the USA) are “targeted”, whereas “selectors” (eg, email addresses or telephone numbers) are “tasked”. The targeting procedures approved by the FISC are binding on the government and must specify how a “selector” may be “tasked” to acquire the type of foreign intelligence specified in the certification.
Executive Order 12333
Originally issued in 1981, Executive Order 12333 on US Intelligence Activities (EO 12333) was intended to, among other things, “enhance human and technical collection techniques [of the US government], especially those undertaken abroad, and the acquisition of significant foreign intelligence, as well as the detection and countering of international terrorist activities and espionage conducted by foreign powers.”
In broad terms, EO 12333 provides the foundational authority by which US intelligence agencies collect foreign “signals intelligence” information, being information collected from communications and other data passed or accessible by radio, wire and other electromagnetic means. Unlike FISA’s Section 702, EO 12333 does not authorise the US government to require any company or person to disclose data.
In a press release issued in 2013, the NSA indicated that:
“Executive Order 12333 is the foundational authority by which NSA collects, retains, analyzes, and disseminates foreign signals intelligence information. The principal application of this authority is the collection of communications by foreign persons that occur wholly outside the United States. To the extent a person located outside the United States communicates with someone inside the United States or someone inside the United States communicates with a person located outside the United States those communications could also be collected. Collection pursuant to EO 12333 is conducted through various means around the globe, largely from outside the United States, which is not otherwise regulated by FISA. Intelligence activities conducted under this authority are carried out in accordance with minimization procedures established by the Secretary of Defense and approved by the Attorney General.”
The NSA further indicated that this process will often involve the collection of communications metadata:
“For instance, the collection of overseas communications metadata associated with telephone calls – such as the telephone numbers, and time and duration of calls – allows NSA to map communications between terrorists and their associates. This strategy helps ensure that NSA’s collection of communications content is more precisely focused on only those targets necessary to respond to identified foreign intelligence requirements.”
Similar to FISA’s Section 702, EO 12333 requires procedures to minimise how an agency collects, retains or disseminates US person information. These procedures must be approved by the Attorney General and can be found in documents such as United States Signals Intelligence Directive SP0018 (USSID 18).
Presidential Policy Directive 28
Presidential Policy Directive 28 (PPD-28), a presidential directive in effect since 2014, sets certain binding requirements for SIGINT (ie, signals intelligence) activities. As a formal presidential directive, it has the force of law within the executive branch, and compliance is mandatory.
PPD-28 declares that “all persons should be treated with dignity and respect regardless of their nationality or wherever they might reside, and all persons have legitimate privacy interests in the handling of their personal information.” The order recognises that the same protections and safeguards applicable to Americans (ie, requiring that surveillance take place only for defined and legitimate purposes) also apply to citizens of foreign countries. In particular, PPD-28 delimits the use of SIGINT collected in bulk to detecting and countering six types of threat: (i) espionage and other threats from foreign powers; (ii) terrorism; (iii) threats from weapons of mass destruction; (iv) cybersecurity threats; (v) threats to US or allied forces; and (vi) transnational criminal threats, including illicit finance and sanctions evasion related to the other purposes named in this section.
PPD-28 further provides that “In no event may signals intelligence collected in bulk be used for the purpose of suppressing or burdening criticism or dissent; disadvantaging persons based on their ethnicity, race, gender, sexual orientation, or religion; affording a competitive advantage to U.S. companies and U.S. business sectors commercially; or achieving any purpose other than those identified in this section.” It also requires each intelligence agency to adopt new policies and procedures allowing the retention or dissemination of personal information, regardless of nationality, only if retention or dissemination of “comparable information concerning U.S. persons would be permitted”.
On 7 October 2022, President Biden signed an Executive Order to implement the EU-US Data Privacy Framework. Among other things, the new framework will allow individuals in the EU to seek redress through an independent Data Protection Review Court made up of members outside the US government. That body “would have full authority to adjudicate claims and direct remedial measures as needed”. In addition, the Executive Order provides that US signals intelligence activities may only be conducted following a determination that they are “necessary to advance a validated intelligence priority”, and “only to the extent and in a manner that is proportionate to the validated intelligence priority for which they have been authorized.” The Executive Order also specifies certain “legitimate objectives” and “prohibited objectives” for which US signals intelligence activities may be carried out. The Executive Order also requires US intelligence agencies to review their policies and procedures to implement these new safeguards.
In the meantime, the European Commission has already stated in relation to the Executive Order that “all the safeguards that have been put in place by the US Government in the area of national security (including the redress mechanism) will be available for all transfers to companies in the US under the GDPR, regardless of the transfer mechanisms used”.
The CLOUD Act
The US Clarifying Lawful Overseas Use of Data Act (CLOUD Act) was passed in 2018, mooting the then pending US Supreme Court case, United States v Microsoft (Ireland), in which Microsoft challenged a warrant from the US government requiring it to produce emails that were electronically stored on servers located in Ireland. The CLOUD Act amended an existing US law, the Stored Communications Act (SCA), to allow US law enforcement, through a warrant, subpoena or court order, to access communications data stored electronically outside the USA, as long as the information sought is relevant and material to an ongoing criminal investigation.
The CLOUD Act explicitly states that it applies to providers of an electronic communication service or remote computing service who hold or store data or other information “pertaining to a customer or subscriber”, “regardless of whether such communication, record, or other information is located within or outside of the United States.” Accordingly, even if data is stored outside the USA, the US government may seek access to it from any such provider, as long as the provider is subject to US jurisdiction.
CLOUD Act Agreements
In addition, the CLOUD Act enables the USA to enter into executive agreements with foreign countries, whereby countries that enter into such agreements may request data directly from companies based in the other country. In this respect, the CLOUD Act supplements rather than eliminates mutual legal assistance treaties (MLATs), which remain another method by which evidence in criminal cases is made available to authorities from other countries.
The first known country to have entered into an executive agreement pursuant to the CLOUD Act is the UK: on 3 October 2019, the USA and the UK signed a bilateral agreement under the CLOUD Act, allowing law enforcement bodies from both countries direct access to electronic data stored by companies in the other country. The agreement allows law enforcement, when armed with appropriate court authorisation, to go directly to companies based in the other country to access electronic data, rather than having to go through the other country’s government (which can take years).
“Quashing” CLOUD Act Warrants
In addition, the CLOUD Act provides for a procedure for service providers to file a motion to “quash” (ie, annul) or modify a CLOUD Act warrant, in limited circumstances and subject to several conditions: the CLOUD Act provides that “a provider of electronic communication service to the public or remote computing service, including a foreign electronic communication service or remote computing service, that is being required to disclose” the contents of a communication
“may file a motion to modify or quash the legal process where the provider reasonably believes:
a. that the customer or subscriber is not a United States person and does not reside in the United States; and
b. that the required disclosure would create a material risk that the provider would violate the laws of a qualifying foreign government.”
According to the US Department of Justice:
“[a] request to issue a warrant must be submitted to an independent judge for approval. The judge cannot authorize the warrant unless he or she finds that the government has established by a sworn affidavit that ‘probable cause’ exists that a specific crime has occurred or is occurring and that the place to be searched, such as an email account, contains evidence of that specific crime. Further, the warrant must describe with particularity the data to be searched and seized; fishing expeditions to see if evidence exists are not permitted.”
The Safe Harbour arrangement for data transfers between the EU and the USA was introduced in 2000. In the Schrems I case in 2015, the Court of Justice of the European Union (CJEU) invalidated the arrangement based on concerns around government access and inadequate protection of the personal data of EU citizens. Following that decision, the EU and the USA negotiated the Privacy Shield framework in 2016, which was designed to provide a more robust interoperability mechanism to manage transfers of EU personal data between the USA and the EU. The agreement was welcomed on both sides of the Atlantic, with the then vice-president of the European Commission stating that “businesses, especially the smallest ones, have the legal certainty they need to develop their activities across the Atlantic”. Likewise, the then chairwoman of the US Federal Trade Commission (FTC) stated that it was essential to ensure that “consumer privacy is protected on both sides of the Atlantic”.
However, as mentioned in 1.4 Multilateral and Subnational Issues and 1.7 Key Developments, there have been a number of significant developments in the area of international data transfers. The invalidation by the CJEU on 16 July 2020 of the Privacy Shield framework has prompted several changes, including a new set of Standard Contractual Clauses issued by the European Commission and, more recently, the Executive Order to implement the EU-US Data Privacy Framework. As mentioned previously, the Executive Order is not specific to the EU-US Data Privacy Framework: the European Commission has stated that “all the safeguards that have been put in place by the US Government in the area of national security (including the redress mechanism) will be available for all transfers to companies in the US under the GDPR, regardless of the transfer mechanisms used”.
The fact that the topic of international data transfers between the EU and the USA is again being revisited shows the ongoing tension regarding US government access and the protection of personal information under US law, leaving any replacement transfer mechanisms equally vulnerable to a legal challenge before the CJEU.
Furthermore, when the US government issues a warrant to an organisation for access to personal data, a secrecy order may compel that organisation to keep the warrant confidential from the target. Some commentators have expressed concerns that such secrecy is becoming routine and creates difficulties for organisations that are unable to inform customers that their data has been accessed.
There are no restrictions on international data transfers of personal information under US law. However, data transfer restrictions introduced by other jurisdictions, such as those pursuant to EU law, restrict the transfer of personal data relating to EU residents into countries such as the USA that are not deemed to offer an “adequate” level of protection. In order to remedy this situation, companies have to commit to EU principles by entering into arrangements such as binding corporate rules and standard contractual clauses (SCCs) to facilitate the data transfer, and implement supplementary safeguards. On 12 November 2020, the European Commission released a draft set of new SCCs for personal data transferred from the EU to a third country. On 7 October 2022, President Biden signed an Executive Order to implement the EU-US Data Privacy Framework. The Executive Order is not limited to the EU-US Data Privacy Framework, however: as stated by the European Commission, “all the safeguards that have been put in place by the US Government in the area of national security (including the redress mechanism) will be available for all transfers to companies in the US under the GDPR, regardless of the transfer mechanisms used”.
As mentioned in 1.4 Multilateral and Subnational Issues, 1.7 Key Developments and 3.4 Key Privacy Issues, Conflicts and Public Debates, both the EU-US and the Swiss-US Privacy Shield frameworks are no longer a valid transfer mechanism for data transfers to the USA. However, the use of binding corporate rules and the EU Commission’s SCCs remain valid mechanisms, as long as supplemental safeguards are implemented to protect the data. On 7 October 2022, an Executive Order to implement the EU-US Data Privacy Framework was signed. The Executive Order is not limited to the EU-US Data Privacy Framework, however: as stated by the European Commission, “all the safeguards that have been put in place by the US Government in the area of national security (including the redress mechanism) will be available for all transfers to companies in the US under the GDPR, regardless of the transfer mechanisms used”.
The USA participates in the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules system. The system allows US companies to demonstrate their compliance with internationally recognised data privacy protections, and the framework has been recognised in a number of trade agreements among Canada, Mexico and the USA.
In April 2022, the US Department of Commerce Secretary Gina Raimondo announced a key development in international collaboration with the newly created Global Cross-Border Privacy Rules Forum (the “Global CBPR Forum”). Participant countries include Canada, Japan, the Republic of Korea, the Philippines, Singapore and Chinese Taipei. In August 2022, the Australian Government announced that Australia had joined the Global CBPR Forum.
According to the Global CBPR Declaration, the framework establishes a certification system to help companies in participating jurisdictions demonstrate compliance with internationally recognised privacy standards, with the aim of fostering interoperability and international data flows. The Global CBPR Forum will replace the existing APEC Cross-Border Privacy Rules (APEC CBPR) and Privacy Recognition for Processors (PRP) certification systems, enabling non-APEC countries to participate.
US law does not require any government notifications or approvals in order to transfer personal information internationally.
There are no data localisation requirements under US federal law. However, the transfer of sensitive personal information belonging to US citizens is an emerging issue in the USA. In 2020, the National Security and Personal Data Protection Act was introduced in the US Congress (although it did not pass). The Act sought to address growing concerns about sensitive personal information being transferred, through social media platforms (eg, TikTok) or for data storage purposes, to countries where the information is accessible to intelligence services. In addition, on 18 September 2020, the Department of Commerce announced prohibitions on transactions relating to the mobile applications WeChat and TikTok to “safeguard the national security of the United States”, further alleging that these apps “collect vast swaths of data from users, including network activity, location data, and browsing and search histories”.
Certain public procurement contracts impose domestic data storage as a requirement. For example, in Google’s agreement with the City of Los Angeles, Section 1.7 provides that “Google agrees to store and process Customer’s email and Google Message Discovery (GMD) data only in the continental United States. As soon as it shall become commercially feasible, Google shall store and process all other Customer Data, from any other Google Apps applications, only in the continental United States.”
There is no law generally requiring software code, algorithms or similar technical details to be shared with the US government. This does not mean, however, that organisations have never been requested to share such information, for example on the grounds of national security. Such requirements may also exist in certain public procurement contracts.
A mutual legal assistance treaty (MLAT) is the most common mechanism by which foreign enforcement agencies request assistance with cross-border issues and access to information located within another jurisdiction. However, organisations are not compelled to comply with such requests; the MLAT simply provides a formal mechanism for processing information requests.
In 2018, the USA introduced the CLOUD Act as an alternative request mechanism to streamline requests and data exchange between jurisdictions. Under the CLOUD Act, service providers under US jurisdiction may be prevented from disclosing communications to foreign governments unless there is a CLOUD Act agreement in place. However, these executive agreements only lift the blocking statute (the Stored Communications Act) and permit companies to comply with foreign government requests; companies are not required to comply with such requests.
The Stored Communications Act (SCA) operates as a “blocking statute”, as it prohibits service providers in the USA from disclosing communications to a foreign government (subject to limited exceptions that do not apply to foreign government requests) unless there is a CLOUD Act agreement in place. The SCA applies even where the information sought by the foreign government relates to the communications of one of its own nationals, and even where it relates to the investigation of criminal behaviour. Furthermore, the SCA prevents disclosure of such data even where the foreign government has obtained an order under its own national laws requiring the provider to produce the information.
Artificial Intelligence
There are no specific laws in the USA regarding Artificial Intelligence (AI). In 2019, Executive Order No 13,859 was issued, which acknowledged that the US government must facilitate AI research and development and introduced the “American AI Initiative”, guided by five core principles. Those principles are as follows:
Technical standards are to be developed by the National Institute of Standards and Technology (NIST) to support the development of reliable AI systems.
On 26 January 2023, the US National Institute of Standards and Technology (NIST) released the Artificial Intelligence Risk Management Framework (AI RMF 1.0), a voluntary guidance document for managing and mitigating the risks of designing, developing, deploying and using AI products and services. NIST also released a companion playbook for navigating the framework, a roadmap for future work, and mappings of the framework to other standards and principles, both domestic and international.
Connected TVs
California was the first state in the USA to regulate the collection and use of voice data through connected televisions (ie, smart TVs). Section 22948.20 of the Business & Professions Code provides that a “person or entity shall not provide the operation of a voice recognition feature within this state without prominently informing, during the initial setup or installation of a connected television, either the user or the person designated by the user to perform the initial setup or installation of the connected television.” In short, this section requires manufacturers to provide notice of voice-control features during the initial set-up of a connected television. Sections 22948.20 (b) and (c) also restrict the sale or use of voice data for advertising purposes.
Internet of Things (IoT)
California is also the first state in the nation to enact a cybersecurity law for connected devices, with Senate Bill 327 signed into law in September 2018 (effective 1 January 2020). This law, also known as the “Internet of Things (IoT) Law”, requires device manufacturers to equip connected devices with reasonable security features. Notably, the IoT Law does not appear to be limited to consumer devices: any device that connects to the internet, regardless of the type of information processed, appears to be covered by this law. Even though California may have taken the first step to address the topic of device security, other states in the USA may not be far behind (eg, Oregon has already adopted a similar law).
The Oregon law specifies requirements for “reasonable security features” that are similar to those in the California law. Similar to California’s law, the law in Oregon also provides that a reasonable security feature may consist of “compliance with requirements of federal law or federal regulations that apply to security measures for connected devices”. Oregon’s law does, however, include some notable differences from California’s law. Under Oregon’s HB 2395, a “connected device” is restricted to a device that “is used primarily for personal, family or household purposes”, thereby excluding from its scope devices used or sold for business-to-business purposes. In addition, Oregon’s law applies to a narrower range of entities. In Oregon, a “manufacturer” is defined as “a person that makes a connected device and sells or offers to sell the connected device in this state”. In comparison, California’s law defines manufacturers to include any entity that “contracts with another person to manufacture” the connected device on the person’s behalf.
In December 2020, a federal law, the IoT Cybersecurity Improvement Act of 2020, was signed into law, requiring the NIST to develop and publish standards and guidelines on addressing issues related to the development, management, configuring, and patching of IoT devices for use by federal agencies. In November 2021, NIST released its IoT Device Cybersecurity Guidance for the Federal Government (SP 800-213).
Biometrics and Facial Recognition
In the USA, there is no single federal law that regulates the collection and use of biometric data, although there are state-specific laws in place. The State of Illinois introduced the Biometric Information Privacy Act (BIPA) in 2008, which regulates how private entities may collect, use and share biometric information and biometric identifiers, and imposes certain security requirements to protect this data. Notably, the Illinois Supreme Court held in Rosenbach v Six Flags Entertainment Corp (2019) that the BIPA does not require persons whose fingerprints or other biometric identifiers are stored without compliance with the law to prove anything more before being able to sue for the statutory damages prescribed by the statute.
The State of Texas introduced a statute similar to the BIPA in 2009, which prevents the collection of biometric identifiers for a commercial purpose unless prior consent has been obtained from the individual. Furthermore, the State of Washington introduced biometric data privacy provisions in 2017 that prevent the use of biometric data without providing prior notice to individuals, obtaining their consent and implementing a mechanism to prevent commercial use of the data. Only Illinois’ BIPA currently provides a private right of action.
More recently, three federal legislative proposals were introduced in 2020 regarding the use of biometric and facial recognition technology: the Ethical Use of Facial Recognition Act, the Facial Recognition and Biometric Technology Moratorium Act of 2020 and the National Biometric Information Privacy Act of 2020. This reflects the increased importance of this area. At the state level, for instance, on 6 January 2021, a bipartisan group of New York state lawmakers introduced Assembly Bill 27, known as the Biometric Privacy Act or BPA, the latest version of proposed privacy legislation that would allow consumers to sue companies for improperly using or retaining their biometric data.
Chatbots
On 1 July 2019, California’s Bolstering Online Transparency Act (BOT Act) came into effect as a reaction to growing concerns that, as technology improves, bots are getting increasingly better at influencing consumers and voters. The BOT Act defines a bot as an “automated online account where all or substantially all of the actions or posts of that account are not the result of a person.” The BOT Act makes it “unlawful for any person to use a bot to communicate or interact with another person in California online, with the intent to mislead the other person about its artificial identity… in order to incentivise a purchase or sale of goods or services in a commercial transaction or to influence a vote in an election.” There is no liability, however, if the person discloses its use of a bot in a manner that is “clear, conspicuous, and reasonably designed to inform persons with whom the bot communicates or interacts.”
While the BOT Act does not provide details as to the specific form of disclosure, the legislative history of the BOT Act points to the FTC’s “.com Disclosures: How to Make Effective Disclosures in Digital Advertising”. Under the FTC’s guidance, scrolling and pop-up disclosures are discouraged and affirmative statements should be provided (ie, a statement such as “I am a bot”, as opposed to “I am not a person”). The BOT Act only applies to bots that interact with California residents, but there is currently no indication that the law is limited to California businesses only.
US law does not require organisations to establish protocols for digital governance or fair data practice review boards or committees to address the risks of emerging or disruptive digital technologies; organisations may, however, establish such protocols or bodies on a voluntary basis, and some have done so. Microsoft, for example, has implemented a technology and corporate responsibility team to provide guidance on ethical business practices, privacy and cybersecurity, as well as a separate internal board to navigate the issues raised by the use of AI. Other companies, such as Walmart and AIG, have implemented board-level technology committees responsible for monitoring technological trends and developments in cybersecurity in order to manage and oversee disruptive digital technologies.
Data breaches – in 2020, some of the world’s largest businesses experienced data breaches in connection with the spread of COVID-19 and remote work. The US government also suffered a major data breach. Unsurprisingly, some of these data breaches have resulted in class actions or shareholder derivative litigation. There have also been several settlements resolving data breach cases from previous years.
The TCPA – 2020 also brought significant litigation under the Telephone Consumer Protection Act, highlighting an important division among the circuit courts of appeals.
The CCPA – see 1.8 Significant Pending Changes, Hot Topics and Issues.
The BIPA – 2020 was another year of active litigation under the Illinois Biometric Privacy Act (BIPA), which recognises a private right of action. COVID-19 also resulted in new types of BIPA litigation around health screening and remote work.
Other relevant cases relate to COPPA infringements and children’s privacy, among other areas.
The acquirer’s due diligence investigation should at least consider the following (this is not intended to be an exhaustive list).
If any pre-close sharing of data is taking place, then a data transfer agreement will need to be put in place. Typically, this will cover the acquiring company’s obligations around the handling of such data – eg, requiring the acquiring company to:
There are no express US laws mandating disclosure of an organisation’s cybersecurity risk profile or experience. However, the Securities and Exchange Commission (SEC) has issued guidance stating that publicly traded companies should consider cybersecurity risks and incidents when preparing the disclosures required under the Securities Act of 1933 and the Securities Exchange Act of 1934, including the periodic and current reports required under the Exchange Act. The purpose of these disclosures is to provide timely, comprehensive and accurate information regarding a company that investors would consider as part of an investment decision; on this basis, the SEC considers cybersecurity issues relevant to such disclosures. Furthermore, the SEC considers that it may be necessary to disclose cybersecurity issues as a significant risk factor where they render an investment speculative or high-risk. Companies must evaluate their cybersecurity risks based on prior security incidents, the severity and frequency of those incidents, and all other available information.
On 9 July 2021, President Joe Biden signed an executive order to promote competition in the American economy. The executive order announces a shift towards a greater scrutiny of mergers, “especially by dominant internet platforms, with particular attention to the acquisition of nascent competitors, serial mergers, the accumulation of data, competition by ‘free’ products, and the effect on user privacy”. It also encourages the US Federal Trade Commission to establish rules on surveillance, data accumulation and “barring unfair methods of competition on internet marketplaces”.
There are no other significant issues in US data protection practice not already addressed in this article.
2650 Birch Street
Suite 100
Palo Alto, CA 94306
USA
+1 650 313 2361
Paul.Lanois@fieldfisher.com
www.fieldfisher.com

Introduction
Following a very busy 2022 in the data privacy space, 2023 is already gearing up to be yet another pivotal year for the sector. New or updated data privacy laws had, by January 2023, already entered into effect (the California Privacy Rights Act and the Virginia Consumer Data Protection Act). Additional data privacy laws and regulations will enter into effect throughout 2023 (the Colorado Privacy Act and the Connecticut Data Privacy Act in July 2023, and the Utah Consumer Privacy Act on 31 December 2023). Organisations, both large and small, will have to prepare for such new laws (along with their associated implementing regulations in certain cases) and increased regulatory enforcement.
This article introduces some of the key areas that are currently of note.
State Legislation
California Consumer Privacy Act and California Privacy Rights Act
The California Consumer Privacy Act of 2018 (CCPA) came into effect on 1 January 2020, introducing one of the most comprehensive privacy laws in the USA. The CCPA established new rights for California residents and additional protections for children’s data, as well as rules surrounding the “sale” of personal information.
California law has a broad definition of “sale” that extends well beyond the ordinary meaning of the word. A sale is defined as the “selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.” In other words, as long as a company derives any benefit (a “valuable consideration”) from the disclosure of individuals’ personal information, that is a “sale” under California law. Organisations are required to provide certain information in relation to such sale of personal information, and California residents have the right to opt out of the sale of their personal information. The CCPA also included the right to non-discrimination in terms of price and services when a consumer exercises a privacy right under the CCPA.
In November 2020, voters in the State of California approved Proposition 24, also known as the California Privacy Rights Act (CPRA), which went into effect on 1 January 2023. The CPRA amends and expands the existing CCPA. In particular, the CPRA created the California Privacy Protection Agency (CPPA), which has the authority to bring an administrative enforcement action against businesses that violate the CCPA, as amended by the CPRA. The Attorney General retains enforcement authority over the CCPA, as amended by the CPRA.
Changes introduced in the CPRA include the following:
In the first public CCPA enforcement action, the California Attorney General announced on 24 August 2022 a USD1.2 million settlement, including injunctive relief terms, with cosmetics giant Sephora, stating that “both the trade of personal information for analytics and the trade of personal information for an advertising option constituted sales under the CCPA.” In particular, the Attorney General alleged that the company had failed to disclose to consumers that it was selling their personal information, and had further failed to process user requests to opt out of sale via user-enabled global privacy controls, in violation of the CCPA.
In addition to the payment of USD1.2 million in penalties, the company was required to clarify its online disclosures and privacy policy to indicate that it sells data, and provide mechanisms for consumers to opt out of the sale of their personal information, including via the Global Privacy Control. Finally, the company was required to conform its service provider agreements to the CCPA’s requirements and provide reports to the Attorney General relating to its compliance status.
Virginia Consumer Data Protection Act
On 2 March 2021, the Virginia Consumer Data Protection Act (VCDPA) was signed into law and became effective on 1 January 2023. This made Virginia the second state to enact a consumer privacy and data security law, following in the footsteps of California. The VCDPA applies to businesses that conduct business in Virginia, or produce products or services that target Virginia residents, and that:
“Consumer” is defined as a natural person who is a resident of Virginia, acting only in an individual or household context. The definition explicitly excludes individuals acting in a commercial or employment context.
The VCDPA grants Virginia consumers rights to access, correct, delete and know their personal information, and to opt out of its sale and its processing for targeted advertising purposes, similar to the CCPA and CPRA.
However, the VCDPA is not a replica of the CPRA; instead, it takes inspiration from the GDPR in a few key areas. For example, it requires covered organisations to perform data protection assessments (not to be confused with data protection addenda), which resemble the GDPR’s data protection impact assessments (DPIAs), and it adopts terminology similar to that used in the GDPR (ie, “controller” and “processor”). The Attorney General may initiate actions and seek fines of up to USD7,500 per violation of the VCDPA. There is no private right of action for consumers under the VCDPA.
Colorado Privacy Act
The Colorado Privacy Act (CoPA) was enacted on 8 July 2021 and is set to take effect on 1 July 2023. CoPA applies to legal entities that conduct business or produce commercial products or services that are intentionally targeted at Colorado residents and that either:
Similar to the VCDPA, CoPA’s definition of consumer does not include individuals acting in commercial or employment contexts. It is designed to protect the “consumer”, defined in CoPA as “an individual who is a Colorado resident acting only in an individual or household context; and does not include an individual acting in a commercial or employment context, as a job applicant, or as a beneficiary of someone acting in an employment context.” Importantly, CoPA also uses terminology similar to the GDPR (ie, “personal data”, “controller” and “processor”).
Among other things, CoPA grants consumers the following:
Connecticut’s Data Privacy Act
On 10 May 2022, Governor Ned Lamont signed Senate Bill 6, An Act Concerning Personal Data Privacy and Online Monitoring (CDPA), into law. Most provisions of the law will go into effect alongside the Colorado Privacy Act on 1 July 2023.
It applies to businesses that, during the preceding calendar year:
The CDPA includes many of the same rights, obligations and exceptions as the data privacy laws in California, Colorado, and Virginia. It draws heavily from CoPA and the VCDPA, with many of the law’s provisions either mirroring or falling somewhere between CoPA and the VCDPA, but contains a few notable distinctions that should be factored into an entity’s compliance efforts.
Utah’s Consumer Privacy Act
In March 2022, Governor Spencer Cox signed the Utah Consumer Privacy Act (UCPA) into law; the UCPA takes inspiration from the VCDPA, CoPA and the CPRA.
The UCPA applies to both data controllers and processors that generate over USD25 million in annual revenue and either:
Contrary to the CPRA, VCDPA or CoPA, the UCPA does not grant individuals the right to opt out of profiling or the right to correct inaccuracies in their data.
The Push for a Federal Privacy Law - the American Data Privacy and Protection Act
After years of unsuccessful attempts to introduce data privacy legislation that would apply nationwide across the United States, H.R. 8152 – also known as the American Data Privacy and Protection Act (ADPPA) – was introduced in 2022. The ADPPA would have been the first comprehensive US federal privacy bill to regulate how organisations keep and use consumer data and would have included protections intended to address potentially discriminatory impacts of algorithms. For example, the ADPPA would have required companies to evaluate certain artificial intelligence tools and submit those evaluations to the Federal Trade Commission (FTC).
A federal privacy law would create a single set of requirements governing data privacy nationwide, as opposed to the patchwork of state privacy laws, such as those listed above. Small and medium-sized organisations in particular often struggle to keep up with the various applicable state privacy laws, which do not always contain the same requirements. However, while the ADPPA received bipartisan support, its opponents include California authorities, such as the California Privacy Protection Agency, who were concerned that the ADPPA would undermine California’s state privacy law by replacing it with weaker privacy protections.
On 28 February 2023, California Governor Gavin Newsom, Attorney General Rob Bonta, and the California Privacy Protection Agency sent a joint letter to Congress opposing pre-emption language in the ADPPA. In particular, they called on Congress to set the floor and not the ceiling in any federal privacy law, and to allow states to provide additional protections in response to changing technology and data privacy practices.
Although the ADPPA ultimately failed to pass during the 2022 legislative session, it evidences the push for a comprehensive data privacy law in the United States. In addition, at a time when artificial intelligence (AI) is pervading almost every domain, the ADPPA would have constituted a first attempt at federal regulation of AI, with provisions targeting algorithmic accountability and bias as well as data privacy and security.
EU-US Data Privacy Framework
After nearly two years of negotiations, the United States and the European Commission announced on 25 March 2022 the Trans-Atlantic Data Privacy Framework, designed to address the concerns raised by the Court of Justice of the European Union when it struck down, in 2020, the Commission’s adequacy decision underlying the EU-US Privacy Shield framework. In particular, under the proposed framework, the United States made commitments to strengthen the privacy and civil liberties safeguards governing US signals intelligence activities, establish a new redress mechanism with independent and binding authority and enhance its existing rigorous and layered oversight of signals intelligence activities.
Following this announcement, President Biden signed on 7 October 2022 an Executive Order to implement the EU-US Data Privacy Framework. Among other things, the new framework will allow individuals in the EU to seek redress through an independent Data Protection Review Court made up of members outside the US government. That body “would have full authority to adjudicate claims and direct remedial measures as needed.”
In addition, the Executive Order provides that US signals intelligence activities may only be conducted following a determination that they are “necessary to advance a validated intelligence priority”, and “only to the extent and in a manner that is proportionate to the validated intelligence priority for which they have been authorized.” The Executive Order also specifies certain “legitimate objectives” and “prohibited objectives” for which US signals intelligence activities may be carried out, and requires US intelligence agencies to review their policies and procedures to implement these new safeguards.
On 13 December 2022, the European Commission announced that it had launched the process for the adoption of an adequacy decision for the EU-US Data Privacy Framework. As of the date of publication of this chapter (9 March 2023), the adequacy process has not been completed.
In the meantime, the European Commission has already stated in relation to the Executive Order that “all the safeguards that have been put in place by the US Government in the area of national security (including the redress mechanism) will be available for all transfers to companies in the US under the GDPR, regardless of the transfer mechanisms used.”
Global Cross-Border Privacy Rules Forum
In April 2022, US Secretary of Commerce Gina Raimondo announced a key development in international collaboration with the newly created Global Cross-Border Privacy Rules Forum (the “Global CBPR Forum”). Participant countries include Canada, Japan, the Republic of Korea, the Philippines, Singapore and Chinese Taipei. In August 2022, the Australian government announced that Australia had joined the Global CBPR Forum.
According to the Global CBPR Declaration, the framework establishes a certification system to help companies in participating jurisdictions demonstrate compliance with internationally recognised privacy standards, with the aim of fostering interoperability and international data flows. The Global CBPR Forum will replace the existing APEC Cross-Border Privacy Rules (APEC CBPR) and Privacy Recognition for Processors (PRP) certification systems, enabling non-APEC countries to participate.
Surge in Session Replay Lawsuits
To better understand what users are viewing, clicking on or hovering over, a growing number of website operators have turned to “session replay” technologies, which capture a consumer’s interactions with a website at regular time intervals. This can be used, for example, to identify broken links and to provide better error messages to support IT teams.
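In simplified form, a session replay tool can be thought of as an event buffer that timestamps each user interaction so the session can be reconstructed later. The following sketch is purely illustrative; the class and element names are hypothetical and do not correspond to any particular session replay product:

```python
from dataclasses import dataclass, field
import time

@dataclass
class InteractionEvent:
    """A single user interaction captured during a browsing session."""
    timestamp: float
    kind: str                 # e.g. "click", "scroll", "error"
    target: str               # identifier of the page element involved
    detail: dict = field(default_factory=dict)

class SessionRecorder:
    """Buffers timestamped interaction events for later replay."""
    def __init__(self):
        self.events: list[InteractionEvent] = []

    def record(self, kind: str, target: str, **detail) -> None:
        self.events.append(InteractionEvent(time.time(), kind, target, detail))

    def replay(self) -> list[InteractionEvent]:
        """Return events in the order they occurred."""
        return sorted(self.events, key=lambda e: e.timestamp)

# Example: capturing a short session and replaying it
recorder = SessionRecorder()
recorder.record("click", "#signup-button")
recorder.record("error", "#checkout-form", message="broken link detected")
for event in recorder.replay():
    print(event.kind, event.target)
```

Because every interaction, including keystrokes or form entries, can be captured this way, such recordings may amount to interception of a “communication” under state wiretapping statutes, which is the theory underlying the lawsuits discussed below.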
In recent months, there has been a surge in consumer class action lawsuits alleging that certain businesses and their cloud service providers violated state anti-wiretapping statutes (for example, in California and Florida) and invaded consumers’ privacy rights based on the websites’ (or applications’) use of session replay technologies without obtaining sufficient consent. Such state anti-wiretapping statutes require all parties to a communication to consent to the communication being recorded. Accordingly, collecting proper consent for online tracking (including for the use of session replay technologies) will help reduce the risk of litigation arising from the use of such technologies.