Constitutional Rights
The United States Constitution does not explicitly include a right to privacy. The Bill of Rights does, however, protect certain aspects of privacy. For example:
Moreover, the Ninth Amendment provides that the enumeration of certain rights in the Bill of Rights shall not be construed to deny the existence of other rights retained by the people. Some commentators interpret the Ninth Amendment as affirming the existence of rights beyond those expressly protected by the Bill of Rights. Finally, certain judicial decisions indicate that the right to privacy, especially in marital relations, forms part of the liberty interest protected by the Fourteenth Amendment.
Sector-Specific Data Protection Legislation
There is currently no single, all-encompassing federal legislation covering privacy and the protection of personal information generally in the USA. Instead, legislation at the federal level primarily protects data in specific sectors (such as healthcare, education, communications and financial services) or, in the case of online data collection, that of children. Examples of such laws include the following:
The sectoral approach adopted by US federal law to address privacy and data protection means that each state may enact its own laws governing privacy and data protection. As a result, privacy requirements differ from state to state, and cover different areas. Where a federal statute covers a specific topic, it may pre-empt a similar state law on the same topic.
The Federal Trade Commission
The Federal Trade Commission (FTC) is an independent US law enforcement agency charged with protecting consumers and enhancing competition across broad sectors of the economy. The FTC’s primary legal authority comes from Section 5 of the Federal Trade Commission Act, which prohibits “unfair or deceptive practices” in the marketplace. The FTC has taken the view that “unfair or deceptive practices” include, for example, a company’s failure to adhere to its own published privacy notice and the company’s failure to provide an adequate level of security for the personal information it holds, as well as the use of deceptive marketing practices. If a company violates an FTC order, the FTC can seek civil monetary penalties for the violations. The FTC can also seek civil monetary penalties for violations of certain privacy statutes and rules. This broad authority allows the FTC to address a wide array of practices affecting consumers, including those that emerge with the development of new technologies and business models.
The FTC
In addition to its authority to take action against deceptive or unfair trade practices (as described in 1.1 Laws), the FTC has the authority to enforce several sector-specific laws, including the CAN-SPAM Act, COPPA, the FCRA and the TCFAPA. Since the FTC’s enforcement actions nearly always result in settlement agreements, companies frequently look to the contents of those agreements for guidance in developing their own privacy practices.
The FTC may start an investigation on its own based on publicly available information, at the request of another agency, or based on complaints from consumers or competitors.
Other Agencies
Other agencies at the federal and state levels, as well as state consumer protection regulators (usually the state Attorneys General), may also exercise regulatory authority in relation to privacy. At the federal level, examples include:
The State of California has created an agency with full administrative power dedicated to privacy and data protection, the California Privacy Protection Agency (CPPA), which is unprecedented in the United States – see below for more details.
State Attorneys General
State Attorneys General have the power to bring enforcement actions based on unfair or deceptive trade practices. The sources of these powers are typically state laws prohibiting “unfair or deceptive acts and practices” and authorising the state Attorney General to initiate enforcement actions.
Recent years have seen increased co-operation and co-ordination in enforcement among state Attorneys General, with multiple states jointly pursuing actions against companies over data breaches or other alleged privacy violations. Co-ordinated actions among state Attorneys General often exact greater penalties from companies than a single enforcement authority would typically obtain.
California Privacy Protection Agency
In November 2020, voters in the State of California approved Proposition 24, also known as the California Privacy Rights Act of 2020 (CPRA). The CPRA added new privacy protections to the existing California Consumer Privacy Act of 2018 (CCPA) and created a new agency, the California Privacy Protection Agency (CPPA), to implement and enforce the CCPA (as amended by the CPRA). The CPPA may bring enforcement actions related to the CCPA. The California Attorney General retains civil enforcement authority over the CCPA.
In addition, the CPPA has been given the power to adopt regulations under the CCPA, including rules that further implement consumers’ rights and the responsibilities of businesses with the goal of strengthening consumer privacy. The CPPA has already adopted CCPA regulations, with further regulations on the way. The CPPA has so far announced two major settlements as a result of its enforcement actions:
Adjudication
The FTC determines in an adjudicative proceeding whether a practice violates the law. As mentioned previously, pursuant to Section 5(b) of the FTC Act, the FTC may challenge “unfair or deceptive” acts or practices. When the FTC has “reason to believe” that a violation of the law has occurred, the FTC may issue a complaint setting forth its charges. If the respondent elects to settle the charges, it may sign a consent agreement (without admitting liability), consent to entry of a final order, and waive all right to judicial review. If the FTC accepts the proposed consent agreement, it places the order on the record for 30 days of public comment (or for such other period as the FTC may specify) before determining whether to make the order final.
Enforcement
An FTC order generally becomes final (ie, binding on the respondent) 60 days after it is served, unless the order is stayed by the FTC or by a reviewing court. Divestiture orders become final after all judicial review is complete (or if no review is sought, after the time for seeking review has expired). If a respondent violates a final order, it is liable for a civil penalty for each violation. The penalty is assessed by a federal district court in a suit brought to enforce the FTC’s order.
Where the FTC has determined in a litigated administrative adjudicatory proceeding that a practice is unfair or deceptive, and has issued a final cease-and-desist order, the FTC may obtain civil penalties from non-respondents who thereafter violate the standards articulated by the FTC. To accomplish this, the FTC must show that the violator had “actual knowledge that such act or practice is unfair or deceptive and is unlawful” under Section 5(a)(1) of the FTC Act. To prove “actual knowledge”, the FTC typically shows that it provided the violator with a copy of the FTC’s determination about the act or practice in question, or a “synopsis” of that determination.
Global CBPR Forum
The USA participates in the Asia-Pacific Economic Cooperation’s (APEC) Cross-Border Privacy Rules (CBPR) system. In April 2022, US Secretary of Commerce Gina Raimondo announced a key development in international collaboration: the newly created Global Cross-Border Privacy Rules Forum (the “Global CBPR Forum”). Participant countries include Canada, Japan, the Republic of Korea, the Philippines, Singapore and Chinese Taipei. In August 2022, the Australian government announced that Australia had joined the Global CBPR Forum. In July 2023, the Global CBPR Forum announced that it had welcomed the United Kingdom (UK) as an Associate, further expanding participation outside the Asia-Pacific region.
According to the Global CBPR Declaration, the framework establishes a certification system to help companies in participating jurisdictions demonstrate compliance with internationally recognised privacy standards, with the aim of fostering interoperability and international data flows. The Global CBPR Forum will replace the existing APEC Cross-Border Privacy Rules (APEC CBPR) and Privacy Recognition for Processors (PRP) certification systems, enabling non-APEC countries to participate.
Transfers From the EEA: the Privacy Shield and SCCs
Data transfers from the European Economic Area (EEA) to countries outside the EEA may only take place if the destination offers an “adequate” level of data protection, which generally means a level essentially equivalent to that of the EU General Data Protection Regulation (GDPR).
On 4 June 2021, the European Commission issued an updated set of standard contractual clauses (SCCs) for data transfers from controllers or processors located in the EEA (or otherwise subject to the GDPR) to controllers or processors established outside the EEA (and not subject to the GDPR). Since then, decisions such as those of the Austrian and French data protection authorities in relation to Google Analytics have invalidated certain data transfers from the EEA to the USA due to concerns surrounding the potential accessibility of the data by US intelligence services.
After nearly two years of negotiations, on 25 March 2022 the United States and the European Commission announced the Trans-Atlantic Data Privacy Framework, designed to address the concerns raised by the Court of Justice of the European Union when, in 2020, it struck down the Commission’s adequacy decision underlying the EU-US Privacy Shield framework. In particular, under the proposed framework, the United States made commitments to:
Following that announcement, on 7 October 2022 President Biden signed an Executive Order to implement the EU-US Data Privacy Framework. Among other things, the new framework allows individuals in the EU to seek redress through an independent Data Protection Review Court made up of members from outside the US government. That body “would have full authority to adjudicate claims and direct remedial measures as needed”. In addition, the Executive Order provides that US signals intelligence activities may only be conducted following a determination that they are “necessary to advance a validated intelligence priority”, and “only to the extent and in a manner that is proportionate to the validated intelligence priority for which they have been authorised”. The Executive Order also specifies certain “legitimate objectives” and “prohibited objectives” for which US signals intelligence activities may (or may not) be carried out, and requires US intelligence agencies to review their policies and procedures to implement these new safeguards.
The adequacy decision on the EU-US Data Privacy Framework was adopted by the European Commission on 10 July 2023. This decision concludes that the United States ensures an adequate level of protection, comparable to that of the EU, for personal data transferred from the EU to US companies participating in the EU-US Data Privacy Framework.
A number of non-governmental organisations (NGOs) in the USA are focused on privacy and data protection issues, including the following.
The USA and the EU have a fundamentally different approach to privacy law. Generally, the EU member states view privacy as a fundamental human right and freedom. In particular, Article 8 of the EU Charter of Fundamental Rights proclaims that “everyone has the right to the protection of personal data concerning him or her”, and also that “everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified”. In addition, even before the GDPR was adopted, the European approach was to use a comprehensive or omnibus approach to data protection law, where an overarching law covers the collection of all information and data relating to all EU data subjects.
By contrast, the US Constitution contains no express right to privacy. Moreover, rather than create an overarching privacy regulation, the USA has enacted various privacy laws as and when a need for them arises, based on a sectoral approach. As discussed in 1.1 Laws, there are a number of laws covering specific sectors – for example:
Moreover, information relating to an individual is typically referred to as “personally identifiable information” (PII) or “personal information”, in contrast to the concept of “personal data” found in the European framework. Under US law, the scope of PII or personal information is not uniform, as the information protected varies across statutes and states. In particular, certain types of data may be protected for a given purpose under one framework but not under another. Personal data, in the context of the GDPR, covers a much wider range of information than PII. In other words, all PII is personal data, but not all personal data is PII.
In the absence of comprehensive federal legislation regulating data privacy, the United States has taken a patchwork approach, with each individual state able to enact its own set of rules – and several states have already done so. Between 2020 and 2022, the states of California, Virginia, Colorado, Connecticut and Utah passed comprehensive state privacy laws. In 2023, a further eight states enacted comprehensive state privacy laws (Iowa, Indiana, Tennessee, Montana, Texas, Florida, Delaware and Oregon), which are scheduled to take effect between July 2024 and January 2026. Each US state may ultimately create its own set of privacy rules, just as each state has its own rules in relation to data breach notification – indeed, all 50 states have their own data breach reporting laws, each with different requirements for determining whether a breach has occurred and for the notices that must be given.
A number of key developments have taken place in the past 12 months affecting US businesses, as described throughout this chapter. See in particular 1.4 Multilateral and Subnational Issues in relation to developments on cross-border data transfers. Other significant developments include the following.
UK Transfers
In September 2023, the UK government announced the launch of the UK-US data bridge, thereby recognising the USA as offering an adequate level of data protection where the transfer is to a US organisation that:
Supporting this decision, on 18 September 2023 the US Attorney General designated the UK as a “qualifying state” under Executive Order 14086. This gives all UK individuals whose personal data has been transferred to the USA under any transfer mechanism (ie, including those set out under UK GDPR Articles 46 and 49) access to the newly established redress mechanism if they believe that their personal data has been accessed unlawfully by US authorities for national security purposes.
State Legislation
The California Consumer Privacy Act, as amended by the California Privacy Rights Act
In November 2020, voters in the State of California approved Proposition 24, also known as the California Privacy Rights Act (CPRA), which went into effect on 1 January 2023. The CPRA amends and expands the existing California Consumer Privacy Act of 2018 (CCPA). In particular, the CPRA created the California Privacy Protection Agency (CPPA), which has the authority to bring administrative enforcement actions against businesses that violate the CCPA (as amended by the CPRA). The California Attorney General retains civil enforcement authority over the CCPA and the CPRA.
The Virginia Consumer Data Protection Act
On 2 March 2021, the Virginia Consumer Data Protection Act (VCDPA) was signed into law and became effective on 1 January 2023. This made Virginia the second state to enact a consumer privacy and data security law, following in the footsteps of California.
The VCDPA grants Virginia consumers rights to access, correct, delete and know their personal information, and to opt out of its sale and its processing for targeted advertising purposes, similar to the CCPA and CPRA. However, the VCDPA is not a replica of the CPRA; instead, it takes inspiration from the GDPR in a few key areas. For example, it requires covered organisations to perform Data Protection Assessments (not to be confused with Data Protection Addendums), which resemble the GDPR’s Data Protection Impact Assessments (DPIAs), and it adopts terminology similar to that used in the GDPR (ie, “controller” and “processor”). The Attorney General may initiate enforcement actions and seek civil penalties of up to USD7,500 per violation of the VCDPA. There is no private right of action for consumers under the VCDPA.
The Colorado Privacy Act (CoPA)
Similar to the VCDPA, CoPA’s definition of “consumer” excludes individuals acting in commercial or employment contexts: it protects “an individual who is a Colorado resident acting only in an individual or household context; and does not include an individual acting in a commercial or employment context, as a job applicant, or as a beneficiary of someone acting in an employment context”. Importantly, CoPA also uses terminology similar to the GDPR’s (ie, “personal data”, “controller” and “processor”).
Among other things, CoPA grants consumers the following:
Connecticut’s Data Privacy Act
On 10 May 2022, Governor Ned Lamont signed Senate Bill 6, An Act Concerning Personal Data Privacy and Online Monitoring (CDPA), into law. The law became effective on 1 July 2023.
The CDPA includes many of the same rights, obligations and exceptions as the data privacy laws in California, Colorado and Virginia. It draws heavily from CoPA and the VCDPA, with many of its provisions either mirroring or falling somewhere between those two laws, but it contains a few notable distinctions that should be factored into an entity’s compliance efforts.
Utah’s Consumer Privacy Act
In March 2022, Governor Spencer Cox signed the Utah Consumer Privacy Act (UCPA) into law, which takes inspiration from the VCDPA, CoPA and CPRA. It entered into effect on 31 December 2023.
Unlike the CPRA, the VCDPA and CoPA, the UCPA does not grant individuals the right to opt out of profiling or the right to correct inaccuracies in their data.
US State Privacy Laws Entering Into Effect in 2024
On 1 July 2024, Florida’s Digital Bill of Rights, Oregon’s Consumer Privacy Act, and Texas’ Data Privacy and Security Act take effect. On 1 October 2024, Montana’s Consumer Data Privacy Act takes effect. While these laws share similarities with the other privacy state laws that have been introduced, they are not identical.
Here are some of the highlights of these new state laws.
Significant pending changes, hot topics and issues on the horizon over the next 12 months include the following.
Several states saw their privacy laws enter into effect during 2023, namely California (ie, the CPRA, which updates the CCPA), Virginia, Colorado, Connecticut and Utah.
As mentioned in 1.7 Key Developments, 2024 is the year when additional state privacy laws enter into effect: on 1 July 2024, Florida’s Digital Bill of Rights, Oregon’s Consumer Privacy Act, and Texas’ Data Privacy and Security Act take effect; and on 1 October 2024, Montana’s Consumer Data Privacy Act takes effect.
Additional state privacy laws are scheduled to take effect in 2025 – for example:
The list is likely to grow, as more states are expected to adopt their own privacy laws. In turn, the enactment of various state privacy laws is likely to increase pressure for a comprehensive US federal privacy law, as organisations struggle to comply with the various state laws, each imposing slightly different requirements.
As mentioned in 1.1 Laws, there is currently no federal legislation protecting personal information generally across the country. Rather, there are many laws at the federal level protecting personal information in specific sectors; and, in addition, the various privacy laws enacted at state level must be taken into account.
The State of California has traditionally taken a leadership role in the USA in relation to cybersecurity and the protection of the personal information of California residents. For example, California was one of the first states in the nation to provide an express right of privacy in the California Constitution, giving each citizen an “inalienable right” to pursue and obtain “privacy”. In 2002, California was also the first US state to enact a data breach notification law requiring organisations to notify all impacted individuals “in the most expedient time possible and without unreasonable delay, consistent with the legitimate needs of law enforcement”, whenever information relating to a California resident may have been compromised.
The CCPA was the first state-level omnibus privacy law imposing broad obligations on businesses to provide state residents with transparency and control over their personal information. The CPRA, which entered into effect in January 2023, amends and further extends the CCPA’s requirements. Since the CCPA was introduced, other states have introduced their own state privacy laws that take inspiration from the CCPA. While these laws share many similarities and provide for similar privacy rights (eg, the rights of access, correction, deletion and opting out of the sale of personal information, etc), they are not identical, and there are a number of significant differences that organisations should be mindful of. A summary of some of the key similarities and differences is listed in 1.7 Key Developments.
Territorial Scope
Organisations established in other jurisdictions may be subject to both federal and state privacy laws if they collect, store, transmit, process or share personal information of US residents.
Principles
The FTC has issued various guidance documents addressing principles such as:
The FTC staff has also issued guidance on online behavioural advertising, emphasising core principles such as:
Privacy Policy
Certain states have enacted laws requiring the publication of a privacy policy. The first state law in the nation to require commercial websites and online services to post a privacy policy – the California Online Privacy Protection Act (CalOPPA) – went into effect in 2004. CalOPPA was later amended in 2013 to require certain disclosures regarding tracking of online visits.
CalOPPA applies to any person or company whose website or online service collects personal information from California consumers. It requires the website to feature a conspicuous privacy policy stating exactly what information is collected and with whom it is shared. Sectoral laws may impose additional disclosure requirements. For example, financial institutions covered by the Gramm-Leach-Bliley Act must tell their customers about their information-sharing practices and explain their right to “opt out” if they do not wish their information to be shared with certain third parties.
On 21 February 2024, the California Attorney General announced a settlement with DoorDash, resolving allegations that the company violated the CCPA and CalOPPA. According to the Attorney General, the company sold its California customers’ personal information between 2018 and 2020 without providing notice or an opportunity to opt out of that sale, in violation of both the CCPA and CalOPPA. The Attorney General also alleged that the company violated CalOPPA by failing to state in its posted privacy policy that it disclosed personally identifiable information.
Individual Rights
There is no general right of access, rectification, deletion, objection or restriction recognised across the country for all types of personal information. Instead, the existence of these rights depends on each specific statute. For example, COPPA provides that parents have a right to review and delete the personal information relating to their children. Pursuant to HIPAA, individuals are entitled to request copies of medical information held by a health services provider. Pursuant to the FCRA, individuals may receive a copy of their credit report maintained by a reporting agency.
In relation to state law, the CCPA grants California residents several rights in relation to personal information held by a business relating to that resident, such as the right of access, right of deletion, right to restrict processing, right to data portability, etc. The CPRA further extends the CCPA, recognising the right to correct inaccurate information. Other states have since adopted comprehensive state privacy laws, as further explained in 1.7 Key Developments.
Registration Requirements
Some states (such as California and Vermont) require data brokers to register with the state Attorney General. For example, California’s data broker law applies to “data brokers”, which are defined as businesses that knowingly collect and sell to third parties the personal information of consumers with whom the businesses do not have direct relationships. Data brokers must also pay an annual registration fee. Any data broker that fails to register may be subject to a civil penalty of USD100 for each day it remains unregistered, as well as to other penalties, fees and costs.
On 10 October 2023, Senate Bill 362, often referred to as the Delete Act, was signed into law. It requires data brokers to register with the California Privacy Protection Agency (which will enforce the law) and to disclose the types of personal information they collect. The law also requires the CPPA to create a free, simple mechanism for Californians to direct all data brokers to delete their personal information. Data brokers that fail to adhere to the law will face civil penalties and administrative fines set by the CPPA.
Data Protection Officer
There are no specific requirements to appoint a formal privacy officer or data protection officer in the USA. However, certain regulated entities (eg, those covered by statutes such as HIPAA or the GLBA) are required to comply with certain privacy and security obligations. Some states may also require the formal appointment of an employee to maintain the organisation’s information security programme. In any case, appointing a chief privacy officer and a chief information security officer is a best practice that is common among larger organisations and increasingly among mid-sized ones.
International Transfers
The USA does not impose general restrictions on the transfer of personal information to other countries.
Data Security and Data Breaches
Certain federal and state laws impose obligations to ensure the security of personal information. The FTC has stated that a company’s security measures must be reasonable. In addition, some federal and state laws establish breach notification requirements. State statutes require the reporting of data breaches to a state agency or Attorney General under certain circumstances.
In the USA, certain statutes (such as the GLBA and the FCRA) impose additional requirements for sensitive information.
Financial Information
The GLBA regulates the collection, safekeeping and use of private financial information by financial institutions. For example, according to the GLBA’s Safeguards Rule, if an entity meets the definition of a financial institution, it must adopt measures to protect the customer data in its possession. Financial institutions are required to:
Health Information
For organisations operating in the healthcare industry, the Department of Health and Human Services (HHS) enforces compliance with HIPAA and HITECH. HIPAA requires organisations to enter into business associate agreements with vendors that will require access to protected health information (PHI). Such agreements restrict the vendors’ use and disclosure of the PHI except as set out in the agreement, and ensure the confidentiality and integrity of the data. HIPAA’s Breach Notification Rule requires breaches of unsecured PHI to be reported to the HHS, and HIPAA imposes civil and criminal penalties on organisations that fail to adequately protect PHI with appropriate information security standards. In addition, HIPAA’s Security Rule requires organisations to maintain appropriate administrative, physical and technical measures to protect the confidentiality, integrity and security of electronic PHI.
Communications Data
Communications data is governed by a number of federal laws, such as:
Children’s and Students’ Information
Information relating to children is protected by the Children’s Online Privacy Protection Act (COPPA), which imposes requirements on operators of websites or online services directed to children under the age of 13, and on operators of other websites or online services that have actual knowledge that they are collecting personal information online from a child under the age of 13. Among other requirements, operators of websites or online services must:
On 20 December 2023, the Federal Trade Commission (FTC) published its Notice of Proposed Rulemaking to update COPPA. If ultimately approved, the new rules would require regulated companies that direct online services to children under the age of 13 (or that have actual knowledge that they are collecting personal information from a child under the age of 13) to implement significant changes to their business operations. Such proposed changes include:
The Family Educational Rights and Privacy Act (FERPA) is a federal law that protects the privacy of student education records. The law applies to all schools that receive funds under an applicable programme of the US Department of Education. It gives parents or eligible students more control over their educational records, and prohibits educational institutions from disclosing “personally identifiable information in education records” without the written consent of an eligible student, or, if the student is a minor, the student’s parents.
Video-Viewing Information
The Video Privacy Protection Act (VPPA), passed by Congress in 1988, is intended to prevent a “video tape service provider” from “knowingly” disclosing an individual’s “personally identifiable information” (PII) to third parties where that individual “requested or obtained… video materials” such as “pre-recorded video cassette tapes or similar audio-visual materials”. When passing the law, Congress had in mind rental providers of visual materials such as VHS tapes. While the text of the VPPA may appear outdated today, the statute has been at the centre of a number of high-profile lawsuits in recent years, since its broad language is applied to digital video materials, such as online video-streaming services. The VPPA creates a private right of action and allows a court to award statutory damages of at least USD2,500 per violation.
The VPPA has recently made a bit of a comeback as plaintiffs are now using it to challenge the use of pixel technology across a variety of websites that provide online video content. For example, a lawsuit alleging a violation of the VPPA has been allowed to proceed against a gaming and entertainment website which “hosts prerecorded video-streaming content”, although many such claims have also been dismissed where plaintiffs failed to adequately allege either a relationship with the business (such as registration or any subscription commitment) or access to restricted content.
Credit and Consumer Reports
Credit and consumer reports are governed by the FCRA, as amended by the Fair and Accurate Credit Transactions Act of 2003, which promotes the accuracy, fairness and privacy of the information contained in consumer credit reports and aims to protect consumers from identity theft. The law regulates the way credit-reporting agencies can collect, access, use and share the data they hold in individuals’ consumer reports. For example, the FCRA grants consumers the right to request and access all the information a reporting agency holds about them. Enforcement of the FCRA is shared between the FTC and federal banking regulators.
Online Behavioural Advertising
The FTC staff has issued guidance on online behavioural advertising, emphasising the following principles to protect consumer privacy interests:
The CAN-SPAM Act, a law that sets out the rules for commercial email, requires commercial messages to contain a method for recipients to opt out or unsubscribe from such communications without incurring any costs. Despite its name, the CAN-SPAM Act does not apply just to bulk email. It covers all commercial messages, which the law defines as “any electronic mail message, the primary purpose of which is the commercial advertisement or promotion of a commercial product or service”, including email that promotes content on commercial websites. The law makes no exception for business-to-business email. That means that all email – for example, a message to former customers announcing a new product line – must comply with the law. However, emails that are informational, transactional or relationship-oriented are exempt from the CAN-SPAM Act.
There are federal and state laws that apply to telemarketing communications, which vary in the restrictions imposed, including:
Under the TCPA, an individual’s express written consent must be obtained before certain marketing texts may be sent to a mobile phone, which includes messages sent using an auto-dialler. The TCPA and CAN-SPAM Act apply to both business-to-consumer and business-to-business electronic direct marketing. The FTC, FCC and the state Attorneys General are active enforcers in this area. California’s Shine the Light Act requires businesses that disclose personal information to third parties for the direct marketing purposes of those third parties to provide notice and access to certain information.
Federal Legislation
Broadly, in the USA, employee monitoring is legal and mostly unregulated. As an employer-provided computer system is the employer’s property, the employer may listen to, watch and read employees’ workplace communications and, in some cases, personal messages. While the Fourth Amendment of the US Constitution protects the privacy rights of federal, state and local government employees, this protection does not extend to employees in the private sector.
Digital privacy is covered by the Electronic Communications Privacy Act (ECPA), which protects against the interception (in transit) of digital and electronic communications. It also includes the Stored Communications Act (SCA), which, as the name suggests, covers the disclosure of stored communications and records. The ECPA permits employers to monitor the verbal and written communications of their employees, provided there is a legitimate business reason for such monitoring or the employer has obtained the employee’s consent. The SCA contains a broader exception, permitting the provider of a communications service – which can include an employer that provides its own email or messaging system – to access communications held in storage, even outside the ordinary course of business.
State Legislation
Some state laws regulate the monitoring of employee communications. Connecticut and Delaware require employers to give employees prior written notice of any monitoring of email or internet use; in Connecticut, the notice must also describe the monitoring methods that will be used. In California, Florida, Louisiana and South Carolina, a state constitutional right to privacy makes employee monitoring more difficult for employers.
In July 2023, California Attorney General Rob Bonta announced an investigative sweep, through inquiry letters sent to large California employers requesting information on the companies’ compliance with the CCPA with respect to the personal information of employees and job applicants.
Video Surveillance
The National Labor Relations Board has stated that video surveillance introduced in the workplace is a condition of employment and, as such, should be agreed with trade unions and subject to collective bargaining unless previously agreed. The Board recommends that the roll-out of any surveillance or monitoring programme always be subjected to the scrutiny of trade unions. Monitoring employees in relation to trade union activities raises particularly significant issues.
Whistle-Blowing
US employees are protected from retaliation by their employers if they make a protected disclosure under the Whistleblower Protection Act. Federal employees usually make disclosures to an Inspector General through a confidential hotline. Inspectors General may not disclose the identity of the disclosing employee unless disclosure is unavoidable or mandated by a court order. For non-federal employers, it is recommended that hotlines allow for anonymous reporting. Further, the Sarbanes-Oxley Act 2002 requires publicly traded companies to implement a mechanism for employees to make anonymous reports of financial irregularities.
FTC Enforcement
The FTC is active in regulating data security and privacy issues. The possible enforcement remedies available to the FTC include injunctions and damages, although the FTC places greater reliance on consent decrees, under which the organisation is monitored by the FTC and further violations incur financial penalties.
Over the course of 2023, the FTC announced numerous cases involving consumers’ sensitive health data, alleging violations of both Section 5 of the FTC Act and the FTC’s Health Breach Notification Rule. Children’s privacy was another key area of focus for the FTC in 2023, with the FTC finalising record-setting penalties against Epic Games, creator of the popular video game Fortnite. The company was required to pay USD275 million for violations of COPPA, following allegations that Epic had collected children’s personal information without parental consent and set voice and text chat features to “on” by default, as well as USD245 million over allegations that the company used dark patterns and other deceptive practices to trick players into making unwanted purchases. In 2023, the FTC also announced:
In addition to children’s privacy, the FTC took action against home security camera company Ring, requiring the company to pay USD5.8 million for allegedly compromising its customers’ privacy by allowing any employee or contractor to access consumers’ private videos, and by failing to implement basic privacy and security protections, enabling hackers to take control of consumers’ accounts, cameras and videos.
Private Litigation
In addition to enforcement from regulatory entities, individuals may bring private rights of action and class actions for privacy and security violations that relate to credit reporting, marketing, electronic communications and call recording, under the respective legislation. Pursuant to the CCPA (California), individuals may bring a private right of action to claim statutory damages where their unencrypted personal information was not adequately protected by an organisation. Class actions in this area are likely to increase.
Employees may also bring a private right of action under the common law, where previous cases have established a precedent regarding the invasion of their privacy by their employer’s workplace monitoring. Employees would need to demonstrate that there was an expectation of privacy in relation to the specific information that has been monitored by an employer.
Over the course of 2023, there was a wave of privacy class actions, alleging that the use of website session replay, chatbot, pixel and similar technologies constitute “wiretapping” and therefore violate state wiretap statutes such as the California Invasion of Privacy Act (CIPA). Wiretap statutes prohibit wiretapping, eavesdropping and non-consensual telephone call recordings. Violations of CIPA are particularly attractive for plaintiffs because a successful suit can result in a USD5,000 statutory penalty per violation. Companies should therefore update their privacy policies, terms of use and relevant disclosures to consumers, both on their websites and in their chatbot features, to ensure transparency, and should obtain proper consent prior to the use of chat transcripts and related content.
The Fourth Amendment of the US Constitution protects the privacy of a person and possessions from unreasonable searches and seizures by federal or state law enforcement authorities. This right is triggered where an individual has a reasonable expectation of privacy.
The Fourth Amendment provides safeguards to individuals during searches and detentions, and prevents unlawfully seized items from being used as evidence in criminal cases. The degree of protection available in a particular case depends on:
The reasonableness standard generally requires a warrant supported by probable cause. The search and seizure must also be conducted reasonably. When law enforcement officers violate an individual’s constitutional rights under the Fourth Amendment, and a search or seizure is deemed unlawful, any evidence derived from that search or seizure will almost certainly be kept out of any criminal case against the person whose rights were violated.
The Foreign Intelligence Surveillance Act
The Foreign Intelligence Surveillance Act (FISA) permits the US government to access personal data for national security purposes. Pursuant to FISA, the government can obtain information, facilities or technical assistance from a broad range of entities. National Security Letters (NSLs) offer an additional investigative tool for limited types of entities. The Foreign Intelligence Surveillance Court (FISC), a federal court staffed by independent, life-tenured judges, approves and oversees FISA activities.
Executive Order 12333
Originally issued in 1981, Executive Order 12333 on US Intelligence Activities (EO 12333) was enacted to, among other things, “enhance human and technical collection techniques [of the US government], especially those undertaken abroad, and the acquisition of significant foreign intelligence, as well as the detection and countering of international terrorist activities and espionage conducted by foreign powers”.
In broad terms, EO 12333 provides the foundational authority by which US intelligence agencies collect foreign “signals intelligence” information, being information collected from communications and other data passed or accessible by radio, wire and other electromagnetic means. Unlike FISA’s Section 702, EO 12333 does not authorise the US government to require any company or person to disclose data.
Similar to FISA’s Section 702, EO 12333 requires procedures to minimise how an agency collects, retains or disseminates US person information. These procedures must be approved by the Attorney General and can be found in documents such as United States Signals Intelligence Directive SP0018 (USSID 18).
Presidential Policy Directive 28
Presidential Policy Directive 28 (PPD-28), a Presidential Directive in effect since 2014, sets certain binding requirements for SIGINT (ie, signals intelligence) activities.
Executive Order 14086
On 7 October 2022, President Biden signed an Executive Order to implement the EU-US Data Privacy Framework. Among other things, the new framework will allow individuals in the EU to seek redress through an independent Data Protection Review Court made up of members outside the US government. That body “would have full authority to adjudicate claims and direct remedial measures as needed”. In addition, the Executive Order provides that US signals intelligence activities may only be conducted following a determination that they are “necessary to advance a validated intelligence priority”, and “only to the extent and in a manner that is proportionate to the validated intelligence priority for which they have been authorised”. The Executive Order also specifies certain “legitimate objectives” and “prohibited objectives” for which US signals intelligence activities may be carried out. The Executive Order also requires US intelligence agencies to review their policies and procedures to implement these new safeguards.
The CLOUD Act
The US Clarifying Lawful Overseas Use of Data Act (the “CLOUD Act”) was passed in 2018, mooting the then pending US Supreme Court case, United States v Microsoft (Ireland), in which Microsoft challenged a warrant from the US government requiring it to produce emails that were electronically stored on servers located in Ireland. The CLOUD Act amended an existing US law, the Stored Communications Act (SCA), to allow US law enforcement, through a warrant, subpoena or court order, to access communications data stored electronically outside the USA, as long as the information sought is relevant and material to an ongoing criminal investigation.
The CLOUD Act explicitly states that it applies to providers of an electronic communications service or remote computing service who hold or store data or other information “pertaining to a customer or subscriber”, “regardless of whether such communication, record or other information is located within or outside the United States”. Accordingly, even if data is stored outside the USA, the US government may still seek access to it, as long as the service provider is subject to US jurisdiction.
CLOUD Act Agreements
In addition, the CLOUD Act also enables entry into executive agreements with foreign countries, whereby countries who enter into such agreements may request data directly from companies based in the other country. In this respect, the CLOUD Act supplements rather than eliminates mutual legal assistance treaties (MLATs), which remain another method by which evidence in criminal cases is made available to authorities from other countries.
“Quashing” CLOUD Act Warrants
In addition, the CLOUD Act provides for a procedure for service providers to file a motion to “quash” (ie, annul) or modify a CLOUD Act warrant, in limited circumstances and subject to several conditions. The CLOUD Act provides that “a provider of electronic communication service to the public or remote computing service, including a foreign electronic communication service or remote computing service, that is being required to disclose” the contents of a communication, may file a motion to modify or quash the legal process where the provider reasonably believes:
According to the US Department of Justice:
“[a] request to issue a warrant must be submitted to an independent judge for approval. The judge cannot authorise the warrant unless he or she finds that the government has established by a sworn affidavit that ‘probable cause’ exists that a specific crime has occurred or is occurring and that the place to be searched, such as an email account, contains evidence of that specific crime. Further, the warrant must describe with particularity the data to be searched and seized; fishing expeditions to see if evidence exists are not permitted.”
As mentioned in 1.4 Multilateral and Subnational Issues and 1.7 Key Developments, there have been significant developments in the area of international data transfers. The invalidation by the CJEU on 16 July 2020 of the Privacy Shield framework has led to a number of changes, including a new set of Standard Contractual Clauses issued by the European Commission and, more recently, the Executive Order to implement the EU-US Data Privacy Framework. As mentioned previously, the Executive Order is not specific to the EU-US Data Privacy Framework: the European Commission has stated that “all the safeguards that have been put in place by the US government in the area of national security (including the redress mechanism) will be available for all transfers to companies in the USA under the GDPR, regardless of the transfer mechanisms used”.
There are no restrictions on international data transfers of personal information under US law. However, data transfer restrictions introduced by other jurisdictions, such as those pursuant to EU law, restrict the transfer of personal data relating to EU residents into countries such as the USA that are not deemed to offer an “adequate” level of protection. In order to remedy this situation, companies have to commit to EU principles by entering into arrangements such as binding corporate rules and standard contractual clauses (SCCs) to facilitate the data transfer, and must implement supplementary safeguards.
On 4 June 2021, the European Commission issued modernised standard contractual clauses under the GDPR for data transfers from controllers or processors in the EU/EEA (or otherwise subject to the GDPR) to controllers or processors established outside the EU/EEA (and not subject to the GDPR). These modernised SCCs replace the three sets of SCCs that were adopted under the previous Data Protection Directive 95/46.
On 7 October 2022, President Biden signed an Executive Order to implement the EU-US Data Privacy Framework. The Executive Order is not limited to the EU-US Data Privacy Framework, however: as stated by the European Commission, “all the safeguards that have been put in place by the US government in the area of national security (including the redress mechanism) will be available for all transfers to companies in the USA under the GDPR, regardless of the transfer mechanisms used”.
As mentioned in 1.4 Multilateral and Subnational Issues, 1.7 Key Developments and 3.4 Key Privacy Issues, Conflicts and Public Debates, the EU-US Data Privacy Framework is now in place.
See under Global CBPR Forum in 1.4 Multilateral and Subnational Issues.
US law does not require any government notifications or approvals in order to transfer personal information internationally.
There are no data localisation requirements under US federal law. However, the transfer of sensitive personal information belonging to US citizens is an emerging issue in the USA.
Certain public procurement contracts impose domestic data storage as a requirement.
There is no law formally requiring software code, algorithms or similar technical detail to be shared with the US government. This does not mean, however, that organisations have never been requested to share such information, for example on the grounds of national security. Such requirements may, for instance, exist in certain public procurement contracts.
A mutual legal assistance treaty (MLAT) is the most common method of foreign enforcement agencies requesting assistance with cross-border issues and access to information located within another jurisdiction. However, organisations are not compelled to comply with such requests. The MLAT simply provides a formal mechanism for processing information requests.
Under the CLOUD Act, service providers under US jurisdiction may be prevented from disclosing communications to foreign governments unless there is a CLOUD Act agreement in place. However, these executive agreements only lift the blocking statute (the Stored Communications Act) and permit companies to comply with foreign government requests; companies are not required to comply with such requests.
The Stored Communications Act (SCA) operates as a “blocking statute” as it prohibits service providers in the USA from disclosing communications to a foreign government (subject to limited exceptions that do not apply to foreign government requests) unless there is a CLOUD Act agreement in place. The SCA will apply where the information sought by the foreign government relates to the communications of one of its own nationals, even where it relates to the investigation of criminal behaviour. Furthermore, the SCA prevents disclosure of such data even where the foreign government is subject to an order under its own national laws to obtain the required information.
Artificial Intelligence
There is no comprehensive federal law in the USA specifically regulating artificial intelligence (AI).
On 26 January 2023, the US National Institute of Standards and Technology (NIST) released the Artificial Intelligence (AI) Risk Management Framework (AI Risk Management Framework 1.0), a voluntary guidance document for managing and mitigating the risks of designing, developing, deploying and using AI products and services. NIST also released a companion playbook for navigating the framework, a roadmap for future work, and crosswalks mapping the framework to other standards and principles, both domestic and international.
At the state level, a number of state legislatures have introduced laws to combat discriminatory AI practices. For example, Colorado requires insurers to disclose and conduct risk management of any use of algorithms and predictive modelling in order to better guarantee equitable insurance coverage. New York City passed a law in 2021 that restricts the use of automated decision systems in the screening of candidates by requiring employers to conduct bias audits, publish results and notify candidates of the use of such tools, subject to civil penalty. States such as California have also issued regulations in relation to automated decision-making practices.
Connected TVs
California was the first state in the USA to regulate the collection and use of voice data through connected televisions (ie, smart TVs). Section 22948.20 of the Business and Professions Code provides that a “person or entity shall not provide the operation of a voice recognition feature within this state without prominently informing, during the initial set-up or installation of a connected television, either the user or the person designated by the user to perform the initial set-up or installation of the connected television”. In short, this section requires manufacturers to provide notice of voice-control features during the initial set-up of a connected television. Sections 22948.20 (b) and (c) also restrict the sale or use of voice data for advertising purposes.
Internet of Things (IoT)
California is also the first state in the nation to enact a cybersecurity law for connected devices: Senate Bill 327, which took effect on 1 January 2020. This law, also known as the Internet of Things (IoT) Law, requires device manufacturers to consider and implement security features for all functionality stages of connected devices. Notably, the IoT Law does not appear to be limited to consumer devices: any device that connects to the internet, regardless of the type of information processed, appears to be covered by this law. Other states, such as Oregon, have adopted similar laws.
In December 2020, a federal law, the IoT Cybersecurity Improvement Act of 2020, was signed into law, requiring NIST to develop and publish standards and guidelines on addressing issues related to the development, management, configuring and patching of IoT devices for use by federal agencies.
Biometrics and Facial Recognition
In the USA, there is no single federal law that regulates biometric data use and collection, although there are state-specific laws in place. For example, the State of Illinois enacted the Biometric Information Privacy Act (BIPA) in 2008, which regulates how private entities can collect, use and share biometric information and biometric identifiers, and which imposes certain security requirements to protect this data. In particular, the Illinois Supreme Court held in Rosenbach v Six Flags Entertainment Corp (2019) that individuals whose fingerprints or other biometric identifiers are collected or stored without compliance with the BIPA need not prove any actual injury beyond the statutory violation itself in order to sue for the statutory damages prescribed by the statute.
Chatbots
On 1 July 2019, California’s Bolstering Online Transparency Act (the “BOT Act”) came into effect as a reaction to growing concerns that, as technology improves, bots are getting increasingly better at influencing consumers and voters. The BOT Act defines a bot as an “automated online account where all or substantially all of the actions or posts of that account are not the result of a person”. The BOT Act makes it “unlawful for any person to use a bot to communicate or interact with another person in California online, with the intent to mislead the other person about its artificial identity… in order to incentivise a purchase or sale of goods or services in a commercial transaction or to influence a vote in an election”. There is no liability, however, if the person discloses its use of a bot in a manner that is “clear, conspicuous and reasonably designed to inform persons with whom the bot communicates or interacts”.
The BOT Act only applies to bots that interact with California residents, but there is currently no indication that the law is limited to California businesses only.
US law does not require organisations to establish protocols for digital governance, or fair data practice review boards or committees, to address the risks of emerging or disruptive digital technologies. However, organisations may establish such protocols or bodies on a voluntary basis.
See 1.8 Significant Pending Changes, Hot Topics and Issues.
The acquirer’s due diligence investigation should at least consider the following (note: this list is not intended to be exhaustive).
If any pre-closing sharing of data takes place, a data transfer agreement will need to be put in place. Typically, this will cover the acquiring company’s obligations around the handling of such data – eg, requiring the acquiring company to:
In July 2023, the Securities and Exchange Commission (SEC) adopted rules requiring registrants to disclose material cybersecurity incidents they experience, and to disclose on an annual basis material information regarding their cybersecurity risk management, strategy and governance. The SEC also requires foreign private issuers to make comparable disclosures.
On 9 July 2021, President Joe Biden signed an Executive Order to promote competition in the American economy. The Executive Order announces a shift towards greater scrutiny of mergers, “especially by dominant internet platforms, with particular attention to the acquisition of nascent competitors, serial mergers, the accumulation of data, competition by ‘free’ products, and the effect on user privacy”. It also encourages the US Federal Trade Commission to establish rules on surveillance, data accumulation and “barring unfair methods of competition on internet marketplaces”.
There are no other significant issues in US data protection practice not already addressed in this article.
2650 Birch Street
Suite 100
Palo Alto, CA 94306
USA
+1 650 313 2361
Paul.Lanois@fieldfisher.com
www.fieldfisher.com

At the risk of sounding like a broken record, 2023 was another busy year in the data privacy space, and this is likely to continue in 2024. New or updated data privacy laws entered into effect during 2023 (for California, Virginia, Colorado, Connecticut and Utah), with more privacy laws entering into effect during 2024. For example:
It is not just new state privacy laws making the headlines, but also a number of recent enforcement actions. This article introduces some key areas that are currently of note.
State Legislation
Previously covered topics included the changes introduced by the state privacy laws, which entered into effect in 2023 (ie, the California Consumer Privacy Act as amended by the California Privacy Rights Act, the Virginia Consumer Data Protection Act, the Colorado Privacy Act, the Connecticut Data Privacy Act and the Utah Consumer Privacy Act), so these will not be revisited here.
In the absence of comprehensive federal privacy legislation, more and more state legislatures have considered introducing a state privacy law to regulate the handling of personal information relating to state residents, as well as providing state residents with privacy rights (such as the right of access, right of correction, right of deletion, right to opt out of targeted advertising or the sale of their personal information) that are gradually becoming standard rights. Here are some of the key highlights of the new privacy laws that will enter into effect in 2024.
Florida’s Digital Bill of Rights
This takes effect on 1 July 2024, with a limited jurisdictional scope: it primarily applies to businesses with an annual global revenue exceeding USD1 billion. In addition to providing for consumer privacy rights that have now become a standard in the recent US state privacy laws (eg, the right to confirm that their personal data is being processed, the right of access/obtaining a copy of their personal data, the right of correction, the right of deletion and the right to opt out of the sharing of their data for targeted advertising purposes), Florida’s Digital Bill of Rights grants consumers the right to opt out of the collection of their personal information through voice or facial recognition features.
In addition, if a controller engages in selling sensitive personal data, the controller must provide the following notice: “NOTICE: This website may sell your sensitive personal data.”
Similarly, if a controller engages in selling biometric personal data, the controller must provide the following notice: “NOTICE: This website may sell your biometric personal data.”
These notices must be posted in the same location and in the same manner as a covered company’s privacy notice. Finally, for providers of an online service, product, game or feature likely to be predominantly accessed by individuals under the age of 18 (“online platforms”), the Bill:
Oregon’s Consumer Privacy Act
This takes effect on 1 July 2024. It does not have a threshold based on an entity’s annual revenue. It applies to organisations that conduct business in Oregon or that provide products or services to Oregon residents, and that during a calendar year control or process the personal information of at least 100,000 Oregon residents or control or process the personal information of 25,000 Oregon residents and derive more than 25% of their gross revenue from selling personal information.
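The cumulative thresholds described above can be easy to misread. Purely as an illustrative sketch (not legal advice, and not statutory text), the applicability test might be expressed as a simple check – the function name and parameters below are hypothetical:

```python
def oregon_cpa_applies(consumers_processed: int,
                       gross_revenue: float,
                       revenue_from_selling_data: float) -> bool:
    """Hypothetical check mirroring the numeric thresholds described above.

    Applies if the business processes personal data of at least 100,000
    Oregon residents, OR processes data of at least 25,000 residents AND
    derives more than 25% of gross revenue from selling personal data.
    """
    if consumers_processed >= 100_000:
        return True
    if consumers_processed >= 25_000 and gross_revenue > 0:
        return revenue_from_selling_data / gross_revenue > 0.25
    return False
```

For example, a business processing data of 30,000 Oregon residents with 30% of its gross revenue from selling personal data would be covered, while the same business at 20% would not.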
In addition to providing for consumer privacy rights that have now become a standard in the recent US state privacy laws (such as the right to confirm whether a controller is processing the consumer’s personal data, the right to access their personal data, the right to correct inaccuracies, the right to delete their personal data, the right to opt out of the processing of the personal data for purposes of targeted advertising, selling and profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer), Oregon’s Consumer Privacy Act requires a controller’s privacy notice to describe “all categories of third parties with which the controller shares at a level of detail that enables the consumer to understand what type of entity each third party is and, to the extent possible, how each third party may process personal data”.
Texas’ Data Privacy and Security Act
This takes effect on 1 July 2024. Unlike most other US state privacy laws, it has no applicability threshold based on annual revenue or on the volume of data processed. However, Texas’ Data Privacy and Security Act provides an exemption for small businesses, as defined by the US Small Business Administration, unless they sell sensitive data, in which case they must obtain consumer consent in advance.
Texas’ Data Privacy and Security Act also provides for consumer privacy rights that have now become a standard in the recent US state privacy laws (such as the right to confirm whether a controller is processing their personal data and to access that data, the right to correct inaccuracies in their personal data, the right to request deletion of their data, the right to opt out of processing for the purposes of targeted advertising, sale of personal data or profiling, and the right to appeal a controller’s refusal to take action on a consumer request to exercise their rights). Like the laws in Colorado, Connecticut, California and Montana, Texas’ Data Privacy and Security Act will require covered businesses to recognise universal opt-out mechanisms for the sale of personal data and targeted advertising in 2025.
Montana’s Consumer Data Privacy Act
This takes effect on 1 October 2024. It is similar to Oregon’s Consumer Privacy Act as it also does not have a revenue threshold. It applies to businesses that conduct business in the state or that produce products or services targeted at state residents, and that control or process the personal data of at least 50,000 Montana residents or control or process the personal information of 25,000 Montana residents and derive more than 25% of their gross revenue from selling personal data.
Montana has one of the lowest thresholds, since most other state privacy laws apply to businesses that control or process the personal data of at least 100,000 state residents. Like the above state privacy laws, Montana’s Consumer Data Privacy Act provides Montana residents with:
Surge in Session Replay Lawsuits
To better understand what users are viewing, clicking on or hovering over, a growing number of website operators have turned to “session replay” technologies, which capture the consumer’s interactions with the website at regular time intervals. This can be used, for example, to collect information on broken links and to provide better error messages to support IT teams.
Over the course of 2023, there was a wave of privacy class actions alleging that the use of website session replay, chatbot, pixel and other similar technologies constitutes “wiretapping” and therefore violates state wiretap statutes such as the California Invasion of Privacy Act (CIPA). Wiretap statutes prohibit wiretapping, eavesdropping and non-consensual telephone call recordings, and typically require all parties to a communication to consent to the communication being recorded. Violations of CIPA are particularly attractive for plaintiffs because a successful suit can result in a USD5,000 statutory penalty per violation.
The Video Privacy Protection Act (VPPA) has also made a bit of a comeback through such lawsuits as plaintiffs are using it to challenge the use of pixel technology across a variety of websites that provide online video content. For example, a lawsuit alleging a violation of the VPPA was allowed to proceed against a gaming and entertainment website which “hosts pre-recorded video streaming content”, although many such claims have also been dismissed where plaintiffs failed to adequately allege either a relationship with the business (such as registration or any subscription commitment) or access to restricted content.
On this basis, companies should update their privacy policies, terms of use, and relevant disclosures to consumers, both on their websites and in their chatbot features, to ensure transparency, and should obtain proper consent prior to the use of chat transcripts and related content.
Enforcement in Relation to the Sale of Data
In February 2024, the California Attorney General announced a settlement with DoorDash, whereby the company was required to pay a USD375,000 civil penalty to resolve allegations that the company violated the California Consumer Privacy Act (CCPA) and the California Online Privacy Protection Act (CalOPPA). The investigation by the California Department of Justice claimed that DoorDash sold its California customers’ personal information without providing notice or an opportunity to opt out of that sale, in violation of both the CCPA and CalOPPA. The sale allegedly occurred in connection with DoorDash’s participation in a marketing co-operative, where businesses contribute the personal information of their customers in exchange for the opportunity to advertise their products to each other’s customers.
In addition to the financial penalty, the company is required to comply with California requirements that apply to businesses that sell personal information, including reviewing contracts with marketing and analytics vendors as well as use of technology to evaluate whether it is selling or sharing consumer personal information. The DoorDash agreement is the second CCPA enforcement settlement so far – the previous being the settlement in August 2022 involving the cosmetics retailer Sephora, after allegations that the company did not tell consumers it was selling their personal information and did not process customer opt-out requests.
The Federal Trade Commission (FTC) has also taken enforcement actions against companies for providing data to third parties. For example, the FTC announced a proposed consent agreement that would require UK-based software company Avast to pay USD16.5 million and would prohibit the company from selling browsing data for advertising purposes. According to the FTC complaint, the company did not inform consumers that it collected and sold their browsing data. Moreover, the FTC alleges that the company failed to prohibit some of its data buyers from re-identifying users. The FTC also alleges that the company deceived users by claiming that the software would protect consumers’ privacy by blocking third-party tracking, while failing to adequately inform consumers that it would sell their detailed, re-identifiable browsing data.
In order to mitigate risks, companies should consider the following.
Children’s Protection
Children’s privacy was another key area of focus for the FTC in 2023, with the FTC finalising record-setting penalties against Epic Games, creator of the popular video game Fortnite. The company was required to pay USD275 million for violations of the Children’s Online Privacy Protection Act of 1998 (COPPA) (over allegations that Epic had collected children’s personal information without parental consent, and that it had set voice and text chat features to “on” by default), as well as USD245 million over allegations that the company used dark patterns and other deceptive practices to trick players into making unwanted purchases.
In 2023, the FTC also announced:
Moreover, in 2023 the FTC also announced further action against Meta, including a blanket prohibition against monetising data of children and teens under 18, on the basis that the company had allegedly misled parents about their ability to control with whom their children communicated through its Messenger Kids app, and had misrepresented the access it provided certain app developers to private user data.
Data Security
In 2023, the FTC finalised its order against education technology provider Chegg Inc for its data security practices, which the FTC found to be “careless” and “lax” and which exposed sensitive information about millions of Chegg’s customers and employees (including social security numbers, email addresses and passwords).
The FTC also finalised a separate order against online alcohol marketplace Drizly and its CEO over security failures by the company, which the FTC said led to a data breach exposing the personal information of about 2.5 million consumers. The FTC also took action against home security camera company Ring, and required the company to pay USD5.8 million for allegedly compromising its customers’ privacy by allowing any employee or contractor to access consumers’ private videos and by failing to implement basic privacy and security protections, enabling hackers to take control of consumers’ accounts, cameras and videos.
On 20 December 2023, the FTC published its Notice of Proposed Rulemaking to update COPPA, the latest step in a process that started in 2019 with an FTC Request for Comments regarding proposed updates to COPPA.
After consideration of more than 175,000 submissions in response to the Request for Comments, the FTC published several significant proposed changes to COPPA that, if ultimately approved, would require regulated companies that direct online services to children under the age of 13 (or that have actual knowledge that they are collecting personal information from a child under the age of 13) to implement significant changes to their business operations. Such proposed changes include:
2650 Birch Street
Suite 100
Palo Alto, CA 94306
USA
+1 650 313 2361
Paul.Lanois@fieldfisher.com
www.fieldfisher.com