Data privacy is regulated in the US by various legal authorities, including the US Constitution, federal and state statutes and regulations, and local law.
US Constitution
The First Amendment, in some circumstances, protects people’s right to speak or engage in other protected activities anonymously. The Fourth Amendment requires law enforcement, when investigating a crime, to obtain a warrant, issued by a judge or magistrate based on a showing of “probable cause”, that specifically identifies the places to be searched and the persons or things to be seized.
Federal Statutes
Federal statutes regulate data privacy in certain sectors, and the Federal Trade Commission (FTC), which is the principal federal privacy regulator, also has authority to bring enforcement actions related to data privacy and security.
General consumer protection
The FTC uses Section 5 of the FTC Act, which prohibits unfair or deceptive acts or practices in or affecting commerce, to bring enforcement actions against companies with regard to their privacy and security practices.
Financial institutions and monetary affairs
The Fair Credit Reporting Act (FCRA) governs data used to evaluate consumers for extension of credit, employment, insurance and certain other matters.
The Gramm-Leach-Bliley Act (GLBA) and the Safeguards Rule govern protection of non-public consumer personal information and disclosures by certain financial institutions to third parties.
The Right to Financial Privacy Act imposes certain data privacy obligations on particular financial institutions.
Children’s privacy
The Children’s Online Privacy Protection Act (COPPA) regulates online collection, use and disclosure of personal information from children under the age of 13, and generally requires notice and verifiable parental consent before doing so.
Education privacy
The Family Educational Rights and Privacy Act governs access, use and disclosure of “education records” and students’ personally identifiable information.
Health information
The Health Insurance Portability and Accountability Act (HIPAA) regulates health information privacy and security, but applies only to certain “covered entities” and, in some cases, covered entities’ service providers, known as “business associates”.
The Confidentiality of Substance Use Disorder Patient Records rule regulates substance use disorder records generated by certain federally conducted or assisted programmes.
Communications and media
The Cable Act prohibits cable operators’ disclosure of personally identifiable information of subscribers to cable and other services, unless authorised by the Act or by specific court orders with notice to subscribers, including the opportunity for subscribers to contest certain orders.
The Video Privacy Protection Act (VPPA) prohibits providers of physical and digital audio-visual materials from disclosing information identifying video rentals or usage without specific advance subscriber consent, subject to limited exceptions.
The Electronic Communications Privacy Act (ECPA) includes the Wiretap Act, which prohibits the interception of wire, oral and electronic communications without consent; the Stored Communications Act, which governs access to and disclosure of stored communications and records held by service providers; and the Pen Register Act, which regulates the collection of dialling, routing and signalling information.
The Telephone Consumer Protection Act protects consumers from unwanted “robocalls” and text messages made using auto-diallers and/or an artificial, synthetic or prerecorded voice.
The CAN-SPAM Act protects consumers from unwanted unsolicited commercial emails.
The Communications Act prohibits telephone companies from disclosing certain data about consumers’ telecommunications services, communications metadata and bills without customer consent, subject to limited exceptions.
Driver’s licence information
The Driver’s Privacy Protection Act prohibits states from selling information acquired from individuals while issuing driver’s licences and automobile registrations.
Information in computer systems
The Computer Fraud and Abuse Act prohibits unauthorised access, or exceeding authorised access, to certain computer systems.
Biometric data
The FTC can bring enforcement actions under the FTC Act against commercial entities regarding their handling of biometric data. The FTC defines biometric data to include facial, eye or fingerprint data that identifies individuals.
Government access to personal information
For information held by the federal government, the Privacy Act of 1974 regulates the collection, maintenance, use and dissemination of personal information in federal agencies’ systems of records.
For law enforcement and intelligence community access to personal information, the ECPA and the Foreign Intelligence Surveillance Act (FISA) impose procedural requirements on, and limits to, government access to communications and related records.
State Statutes
In the absence of comprehensive federal privacy legislation, states began to enact their own omnibus consumer privacy statutes. States have also enacted laws that protect particular types of sensitive personal information (eg, biometric information) and information in certain industry sectors.
State General Privacy Laws
Twenty US states have enacted broad consumer privacy laws. These laws provide similar consumer rights and impose similar obligations on data controllers and processors.
California Consumer Privacy Act (CCPA)
The CCPA applies to for-profit entities doing business in California that determine the purposes and means of data processing and meet the jurisdictional threshold – ie, that have annual gross revenues above USD25 million, annually buy, sell or share the personal information of 100,000 or more consumers or households, or derive 50% or more of their annual revenue from selling or sharing consumers’ personal information.
These entities are “businesses”, similar to “controllers” under the GDPR. Among other things, and subject to exceptions, businesses must provide notice at or before the point of collection, maintain a privacy policy, honour consumer rights requests, provide opt-out mechanisms (including “Do Not Sell or Share My Personal Information” links where applicable) and impose contractual restrictions on service providers and contractors.
Under the CCPA, consumers (California residents) have the rights to know what personal information a business collects about them and how it is used and shared, to access and delete that information, to correct inaccurate information, to opt out of the sale or sharing of their personal information, to limit the use and disclosure of their sensitive personal information, and not to be discriminated against for exercising these rights.
The California Privacy Protection Agency (CPPA) and the California attorney general enforce the CCPA. The CPPA has authority to promulgate implementing regulations. The CCPA provides a limited private right of action for consumers in the event of a data breach caused by inadequate security safeguards.
In addition to the CCPA, the California Online Privacy Protection Act requires operators of websites and online services that collect personally identifiable information from California residents to post a privacy policy that contains certain information. The California “Shine the Light” Law requires businesses that have disclosed certain personal information of California consumers to third parties for those third parties’ own direct marketing purposes to give consumers the right to receive information about those disclosures and those third parties.
The CCPA compared with other state privacy statutes
While the CCPA is like other state privacy statutes in many respects, there are some important differences, as follows.
The CCPA protects the personal information of employees and individuals in an employment and business context, whereas the other state privacy laws apply only to personal information in a personal or household context.
With a few exceptions noted below, other state laws do not use the amount of a company’s annual revenue as a jurisdictional threshold. Instead, they generally use the number of state residents whose personal data an entity collects or the revenue the entity derives from the sale of personal data.
While the CCPA uses the terms “business” and “service provider” (and “contractor”), the other state laws use the terms “controller” and “processor”, which are roughly equivalent to the same terms in the GDPR. All state privacy laws, however, use the term “third party” to describe entities that are none of these.
The CCPA gives consumers the right to limit the processing of their sensitive personal information that is used to infer characteristics about them, while most other states require entities to obtain consent from consumers before processing their sensitive personal information. States generally define “sensitive” personal data to include information revealing racial or ethnic origin, religious beliefs, a mental or physical health diagnosis, sexual orientation, and citizenship or immigration status, as well as genetic or biometric data and the personal data of a known child.
Some states add additional categories to the foregoing, such as precise geolocation, philosophical beliefs, sex life, union membership and – in California and Colorado – neural data (among others).
Thus far, only the laws of California, Colorado, Florida and New Jersey authorise rulemaking to implement their privacy laws.
While most state laws expressly exempt entities covered by sector-specific privacy laws (eg, financial institutions regulated under the GLBA), the CCPA exempts the information, rather than the entities, governed by sector-specific statutes.
The CCPA has a limited private right of action for certain data breaches, whereas none of the other state privacy laws has a private right of action.
These other state laws, with a few exceptions, provide the same rights to consumers and impose the same obligations on controllers, though each has its own unique provisions, as described below.
Colorado
The Colorado Privacy Act (CPA) uses as a jurisdictional threshold the amount of consumers’ personal data processed annually (at least 100,000 consumers, or 25,000 if the controller derives revenue, or receives a discount on goods or services, from the sale of personal data). Colorado controllers include non-profit entities. The CPA requires controllers to perform a data protection assessment for processing that presents a heightened risk of harm, such as targeted advertising. Colorado controllers must have an appeals process for consumers who object to how their requests are handled.
Connecticut
The Connecticut Data Privacy Act (CTDPA), as amended, has applicability thresholds like the CPA. The CTDPA does not apply to non-profits, however. The CTDPA prohibits, without consent, targeted advertising to, and the sale of personal data of, consumers whom the controller has actual knowledge, or wilfully disregards, are under 18 years of age (parental consent is required for children under 13).
Virginia
The Virginia Consumer Data Protection Act (VCDPA) is like the CPA, but does not apply to non-profits. The VCDPA also gives controllers a right to cure non-compliance before enforcement.
Utah
The Utah Consumer Privacy Act (UCPA) applies to businesses that both have annual revenue of USD25 million or more and either control or process the personal data of 100,000 or more Utah consumers annually, or derive over 50% of gross revenue from the sale of personal data and control or process the personal data of 25,000 or more consumers.
Also, the UCPA gives consumers the right to opt out of (rather than opt in to) sensitive data processing, omits the right to correct, does not require data protection assessments and defines a “sale” as an exchange of personal data for monetary consideration only.
Iowa
The Iowa Consumer Data Protection Act (IACDPA) defines a “sale” as an exchange of personal data with a third party for monetary consideration only, and, like the Utah law, it gives consumers the right to opt out of (not opt in to) sensitive data processing.
Nebraska
The Nebraska Data Privacy Act (NDPA) applies to entities that conduct business in Nebraska or produce products or services consumed by Nebraska residents, that process or engage in the sale of personal data, and that are not small businesses as defined by the US Small Business Administration (although small businesses must still obtain consent before selling sensitive data).
The NDPA does not apply to non-profits.
Delaware
The Delaware Personal Data Privacy Act (DPDPA) largely follows the Colorado model but applies to controllers that control or process the personal data of at least 35,000 Delaware consumers. The DPDPA applies to non-profits and has no HIPAA entity-level exemption.
New Hampshire
The New Hampshire Expectation of Privacy statute is similar to Delaware’s law, except that it does not apply to non-profits.
New Jersey
The New Jersey Data Privacy Act (NJDPA) roughly follows the Colorado model, except that the NJDPA expressly authorises rulemaking to implement the law and includes financial information among the categories of sensitive personal data.
Oregon (effective 1 July 2024 generally, and 1 July 2025 for non-profits)
The Oregon Consumer Privacy Act (OCPA) follows the Colorado model, and, like the CPA, applies to non-profit entities, with some exceptions. The OCPA has no GLBA or HIPAA entity-level exemptions. The OCPA gives consumers the right to obtain the names of specific third parties to which the controller has disclosed the consumers’ personal information, but the controller can respond by disclosing the names of the specific third parties to which it has disclosed personal data generally. Oregon requires controllers to obtain consent (from parents, for children under 13) before selling or using the personal information of children under 16 for targeted advertising.
Texas
The Texas Data Privacy and Security Act (TDPSA), like the Nebraska law, has no per-consumer collection threshold and instead applies to entities that conduct business in Texas or produce products or services consumed by Texas residents, that process or engage in the sale of personal data, and that are not small businesses as defined by the US Small Business Administration (although small businesses must obtain consent before selling sensitive data).
Controllers that sell sensitive personal data must prominently disclose the following: “NOTICE: We may sell your sensitive personal data.”
Similarly, controllers that sell biometric personal data must provide the following: “NOTICE: We may sell your biometric personal data.”
Montana
The Montana Consumer Data Privacy Act (MTCDPA) prohibits, without consent, the sale of personal data of consumers the controller has actual knowledge are under 16 and the use of their personal information for targeted advertising (parental consent is required for consumers under 13 years of age).
Minnesota (effective 31 July 2025 generally, and 31 July 2029 for post-secondary institutions)
The Minnesota Consumer Data Privacy Act (MCDPA) is like the Colorado law, but gives broader rights to consumers who are subject to profiling by automated decision-making technology in certain circumstances. Such consumers have the right to question the result of the profiling, to be informed of the reason the profiling resulted in the decision, to review the personal data used in the profiling and, if the data is inaccurate, to have it corrected and the decision re-evaluated.
The MCDPA, like the Oregon law, also gives consumers the right to a list of the specific third parties to which the controller has disclosed their personal information, or a list of all specific third parties to which the controller has disclosed personal information generally.
Indiana (effective 1 January 2026)
The Indiana Consumer Data Protection Act (INCDPA) is similar to the Virginia law and defines a data sale as the exchange of personal data for monetary consideration only.
Kentucky (effective 1 January 2026)
The Kentucky Consumer Data Protection Act (KCDPA) also resembles the Virginia law. The KCDPA provides a right to cure, does not require controllers to recognise browser opt-out signals, and defines a “sale” as the exchange of personal data for monetary consideration only.
Tennessee (effective 1 July 2025)
The Tennessee Information Protection Act (TIPA) largely resembles the Virginia law by, among other things, adopting the narrow definition of a data sale as the exchange of data for “valuable monetary consideration”. The TIPA is unique in allowing controllers and processors to assert an affirmative defence to claims that their data practices are inadequate if they adopt and comply with a written privacy programme that “reasonably conforms” to the National Institute of Standards and Technology privacy framework or a similar framework.
Rhode Island (effective 1 January 2026)
The Rhode Island Data Transparency and Privacy Protection Act (RI-DTPPA) has the same general applicability threshold as Delaware (35,000 consumers).
Maryland (effective 1 October 2025)
The Maryland Online Data Privacy Act (MODPA) has the same low applicability thresholds as Delaware and similar consumer rights and controller obligations, but it also includes some unique provisions that make compliance more difficult. For example, the MODPA prohibits the sale of sensitive personal data outright, limits the collection and processing of sensitive data to what is strictly necessary to provide a requested product or service, and prohibits the sale of personal data of, and targeted advertising to, consumers the controller knows or should know are under 18.
Florida
The Florida Digital Bill of Rights (FDBR) applies to only a few very large companies – ie, those that have USD1 billion or more in annual revenue and that either obtain at least 50% of their revenue from digital advertisement sales, operate an app store or other digital distribution platform with at least 250,000 applications, or operate a consumer smart speaker. Only a handful of companies meet these requirements.
The FDBR otherwise has controller obligations and consumer rights that resemble those of the Virginia law, with some exceptions.
Sector-Specific State Privacy Statutes: Health Data
The Washington My Health My Data Act
The Washington My Health My Data Act (MHMD) is not a generally applicable state privacy law but is broad enough to affect many companies that process data not typically regarded as health data.
Nevada
Nevada’s Consumer Health Data Law (NVCHDL) is like Washington’s MHMD, except that it has no private right of action.
Sector-Specific State Privacy Statutes: Biometric Data
Three US states (Illinois, Texas and Washington) have laws that govern the collection, use, disclosure and storage of biometric data. Such data typically includes retina or iris scans, fingerprints, voiceprints and scans of hand or face geometry. All of these statutes impose notice and consent obligations on covered entities, although specific requirements vary.
Illinois
The Biometric Information Privacy Act (BIPA) prohibits collection of biometric data without specific advance notice and express consent in writing. It prohibits the selling, leasing or trading of, or profiting from, biometric data under any circumstances, without any exception for consent. The BIPA also uniquely requires companies to provide a publicly available policy that includes a retention schedule and destruction guidelines for biometric data. The BIPA provides a private right of action allowing for the recovery of statutory and actual damages.
Texas
The Capture or Use of Biometric Identifier Act (CUBI) prohibits the capture of biometric data for a commercial purpose without advance notice and express consent. The CUBI prohibits the sale, lease or other disclosure of biometric data to third parties unless one of several very narrow exceptions applies.
Washington
Washington’s law prohibits the enrolment of biometric data for a commercial purpose – ie, for marketing products that are unrelated to the initial transaction in which the data was collected – without advance notice, consent or a mechanism that notifies consumers of the subsequent use of the biometric data for a commercial purpose.
Other Sector-Specific State Privacy Statutes
State privacy laws also cover additional issues, such as the following.
Wiretapping/electronic eavesdropping
All 50 states prohibit surreptitious interception of private electronic communications and monitoring or recording of private in-person and electronic communications without the consent of at least one of the participants in the communication, subject to exceptions. Twelve states require the consent of all participants.
Student privacy laws
Most states have enacted laws that limit how operators of websites, applications and online services that market and provide their products and services to K-12 schools and school districts collect, use and disclose the personal information of students.
Data breach notification and data security
All US states and most US territories have enacted data breach notification laws. These laws generally require entities that own, license or maintain personal information of state residents to notify individuals in the event of unauthorised access to or acquisition of personal information about those individuals. Such laws typically apply to a core set of personal information, such as an individual’s name in combination with a Social Security number, driver’s licence or state identification number, or financial account or payment card number (together with any required access code).
Some such laws also apply to certain types of medical and health insurance information, certain usernames and passwords, and biometric data. Most of these laws also require notification to the state attorney general or other state agency.
Many states also have enacted data security laws, which generally require entities to protect personal information from unauthorised access, acquisition or other misuse. Generally, these requirements are broadly stated and require entities to maintain “reasonable” security measures. Some of these laws also specifically require contractual obligations to impose reasonable security measures on any third parties to which an entity discloses personal information, and to securely delete personal information when no longer needed.
Several states (eg, Massachusetts, New York and Oregon) have more detailed requirements. Those state laws require various administrative, technical and physical safeguards for personal information, such as a written information security programme, designated responsible personnel, risk assessments, access controls, encryption of personal information in transit and on portable devices, employee training, oversight of service providers and incident response procedures.
Health information confidentiality
States generally govern the confidentiality of health information through laws governing medical records held by healthcare providers and insurers, professional licensing and confidentiality rules, and statutes protecting particularly sensitive categories of information (eg, HIV status, mental health and genetic information).
Some states (notably California) also prohibit certain healthcare providers from responding to in-state or out-of-state warrants for data on use of reproductive healthcare services.
Data brokers
Four states – California, Vermont, Texas and Oregon – require data brokers to register with state agencies. Definitions vary, but generally “data brokers” are businesses that collect and sell or license the personal data of individuals with whom the business does not have a direct relationship. In addition, California’s Delete Act directs the CPPA to develop a mechanism that enables consumers – with one request – to delete personal information held by all data brokers registered with the state.
Disposal of records containing personal information
Most states have enacted laws that require businesses to securely destroy or dispose of personal information that is no longer needed. Acceptable methods typically include shredding or burning paper records and other media and altering electronic records to make them unreadable.
State unfair or deceptive practices statutes
State consumer protection statutes that prohibit companies from engaging in unfair or deceptive acts or practices are often used to protect consumers’ privacy interests.
State privacy torts
Most states recognise common law privacy torts such as “intrusion upon seclusion” and “publication of private facts”. (Some states have codified these torts in statutes.) The elements of these torts vary, but in general, if an intrusion into private spaces or affairs, or a publication of the private facts, would be “highly offensive to a reasonable person”, the person harmed may be able to sue for monetary damages.
Local Level Overview
Smaller jurisdictions within states, such as counties, townships and cities, have enacted local laws to address specific privacy issues.
The New York City Biometric Identifier Information Act has two distinct components. The law requires commercial establishments that collect customers’ biometric identifier information to post conspicuous notice of that collection near entrances, and it prohibits selling, sharing or otherwise profiting from transactions involving biometric identifier information.
Violations are enforceable by a private right of action.
More than a dozen local governments have banned or significantly limited the use of facial recognition by government agencies. The City of Portland, Oregon, was the first to extend such regulation to private entities.
A number of regulators at the federal and state level have investigative and enforcement authority. Some also have authority to promulgate rules to implement privacy laws.
Federal Trade Commission (FTC)
The FTC requires entities under its jurisdiction to refrain from unfair or deceptive acts or practices regarding data privacy and security – for example, to honour the commitments made in their privacy policies and to maintain reasonable data security measures.
In addition, the FTC is responsible for protecting children’s privacy rights under COPPA and has certain enforcement responsibilities under other federal privacy statutes, including the FCRA and the GLBA.
Other Federal Agencies
Other federal agencies have authority to enforce privacy laws and regulations under their respective jurisdictions. Examples include the Consumer Financial Protection Bureau (CFPB), which enforces federal financial privacy laws; the Federal Communications Commission (FCC), which enforces the privacy obligations of telecommunications carriers; and the Department of Health and Human Services, which enforces HIPAA.
State Agencies
State attorneys general and/or state consumer protection agencies generally have authority to enforce state privacy laws and regulations, and some state consumer protection laws give consumers a private right of action. State attorneys general also have authority to enforce certain federal privacy laws, such as COPPA and HIPAA, when violations of those laws have an impact on state residents. Finally, California is the first state to establish a standalone privacy regulator, the California Privacy Protection Agency (CPPA).
Regulators (such as the FTC) generally initiate enforcement proceedings when a particular issue comes to the agency’s attention, whether from press reports (for example, reports of data breaches), complaints from private parties or inquiries from other governmental entities.
Proceedings generally begin with a request for information, such as a civil investigative demand or subpoena (both legally binding) or a less formal “letter of inquiry”. These requests can require recipients to answer questions and produce records relevant to the inquiry.
Once the data-gathering phase is complete, the agency determines whether to initiate a formal enforcement proceeding. Prior to initiating a formal proceeding, most agencies will discuss the matter with the potential target to determine if a settlement can be reached. Agencies – and the FTC in particular – enter into settlements more frequently than they litigate formal enforcement proceedings. Settlements often include agreed-upon payments in the nature of fines.
Fines for alleged privacy violations vary. The FTC has negotiated fines as high as USD5 billion against a company that violated the privacy-related consent order it was operating under, as well as fines in the hundreds of millions of dollars against companies for alleged COPPA violations. Smaller fines are more common, however. For example, AT&T recently agreed to pay USD13 million to the FCC to settle claims that it failed to adequately protect consumer account information and call metadata, and Verizon and AT&T were recently fined USD47 million and USD57 million respectively for violations of their obligation to protect the location information of their wireless customers.
State regulators impose fines as well. In August 2022, the California Attorney General announced a USD1.2 million settlement with Sephora, a retailer of personal care and beauty products, resolving allegations that Sephora had not disclosed that it was selling consumers’ personal information and that it had not recognised consumers’ opt-out preference signals regarding the sale of their personal information.
In February 2024, the California Attorney General entered into a USD375,000 settlement with DoorDash regarding DoorDash’s alleged sale of consumer personal information without providing notice or an opt-out opportunity.
States have enacted several AI-specific laws.
The Utah Artificial Intelligence Policy Act requires deployers of generative AI technology to provide notice that the consumer is interacting with generative AI technology – upon request in the case of most businesses, and proactively and prominently in the case of persons in state-regulated occupations (eg, healthcare providers).
The Colorado Artificial Intelligence Act (CAIA), effective on 1 February 2026, imposes notice, disclosure, risk mitigation and opt-out requirements on deployers and developers of high-risk AI systems, and requires some disclosures for all AI systems that engage with consumers. High-risk systems are those that interact with consumers and that make, or are a substantial factor in making, consequential decisions regarding employment, insurance, housing, credit, education and healthcare.
The California Generative Artificial Intelligence Training Data Transparency Act, effective on 1 January 2026, will require developers of generative AI systems made publicly available to California consumers to publicly post disclosures regarding data used to train those systems, including whether the datasets include personal information.
California’s “unlawful use of bots” law requires notice that a bot is being used to communicate or interact with another person in California online in order to incentivise a purchase of goods or influence a vote in an election. Bots are defined as automatic online accounts where posts or actions are not “the result of a person”.
In addition to AI-specific regulations, the general consumer protections in federal and state general privacy laws as described in 1.1 Overview of Data and Privacy-Related Laws also regulate AI technology through their broad scope.
FTC
The FTC’s jurisdiction over unfair or deceptive acts or practices applies to AI technology just as it does to other services and industries. Examples of FTC enforcement and priorities related to AI technology include actions against companies for allegedly deceptive claims about AI capabilities, an action alleging unfair deployment of facial recognition technology, and warnings that models trained on unlawfully obtained data may be subject to deletion or “disgorgement” remedies.
State General Privacy Laws
State privacy and consumer protection laws also have an impact on the development and deployment of AI technology to the extent that models are trained on personal information or are fine-tuned with personal information. Among others, issues that arise include how consumers’ deletion, correction and opt-out rights apply to personal information reflected in trained models, whether notice and consent obligations attach to the collection of training data, and how requirements governing sensitive data and profiling apply to model outputs.
The privacy litigation landscape has expanded greatly in the past 18 months to include new theories relating to the use of third-party vendors to augment companies’ online presence and the collection and sharing of certain data with those vendors. These relationships have spawned new theories relating to video-viewing habits of consumers and the reinterpretation of wiretapping claims. Meanwhile, data breach litigation continues, along with new collective actions that seek to use companies’ terms of service against them.
Wiretapping, Pen Registers, and Tap and Trace Litigation
Much of the so-called “wiretapping” litigation that became pervasive in 2024 invokes the California Invasion of Privacy Act (CIPA), which was originally enacted in the 1960s and was designed to prohibit the interception or recording of telephone calls without consent. Plaintiffs have brought claims under CIPA to allege that the use of third-party vendors to operate chat functions, improve website functionality or provide advertising metrics to the company somehow constitutes an illegal wiretap. Plaintiffs have also claimed that a related provision designed to allow law enforcement to install pen registers or trap and trace devices on suspected criminals’ phone lines applies to any website’s procurement of an IP address – a far-fetched notion considering that an exchange of IP addresses is required for internet operability.
Plaintiffs argue that these violations of CIPA result in statutory damages of USD5,000 per violation. While companies have had some success in defeating these claims at the pleading stage, often the cases are settled for nuisance value and before an appellate decision is issued that could foreclose the claims moving forward. One set of enterprising plaintiffs alleged that customer voice-authentication systems used by financial institutions violated CIPA’s provisions that prohibit examining or recording a person’s voiceprint or voice stress patterns to determine “the truth or falsity of statements made by such person”.
Biometrics
More than 2,000 cases have been litigated under the Illinois BIPA. “Fingerprint” timekeeping systems, facial or voice recognition authentication systems, and in-store security systems have all drawn multiple claims, some resulting in “ruinous” damage awards that prompted significant legislative amendments to BIPA in 2024.
The Texas Attorney General has brought and settled multiple claims against online platforms under CUBI for alleged unconsented use and processing of biometric data and unauthorised disclosures.
Children’s Privacy Laws
Social media and technology companies have worked together to challenge, on First Amendment grounds, state laws designed to restrict the use of algorithms to deliver content to minors, obtaining injunctions in some instances to delay implementation and enforcement. The cases are still developing, with litigation continuing and some states choosing to modify the statutes in a manner more likely to survive legal challenges.
Data Breach Litigation
Consumer data breach class actions continue to cause companies headaches long after the incident has passed. While defendants have had some success at the pleading stage and in defeating motions for class certification, the settlement value of cases involving statutory damages under statutes such as the CCPA has climbed higher, with plaintiffs typically insisting on non-reversionary funds with cash awards to persons covered by the statutory claims. Plaintiffs have been willing to hold out for greater settlements based in large part on the reluctance of defendants to litigate these matters, especially where they have insurance coverage. Moreover, plaintiffs’ recent success in obtaining favourable class certification orders over the past year has further increased the value of these cases, even if those orders were not as broad as plaintiffs requested.
VPPA Litigation
Websites of all sorts that link to video content and use third-party vendors to assist with website operations or analytics continue to find themselves in litigation, facing allegations that they have violated the VPPA. Like the claims under CIPA, these VPPA claims assert that websites that link to video content and share certain information with business partners (usually by “pixels”) violate the VPPA. Companies had enjoyed some amount of success in obtaining dismissals at the pleading stage, but the US Court of Appeals for the Second Circuit’s broad definition of what constitutes a “consumer” and “subscriber” under the statute will ensure that the filings continue.
Internet Privacy Litigation
The US Court of Appeals for the Sixth Circuit overturned an FCC reclassification of internet access service as a “telecommunications” service that would have subjected internet service providers (ISPs) to potentially expansive privacy rules, drastically limiting their collection, use and processing of internet users’ data.
The USA has always been the legal standard-bearer in allowing collective or class actions, and that continues to be true in ongoing privacy litigation matters. Attorneys representing consumers have added another arrow to their quiver in bringing privacy claims over the past few years, turning mandatory arbitration provisions and class action waivers contained in companies’ terms of service against them. Specifically, by assembling hundreds or thousands of consumers through online advertising campaigns to bring CIPA, VPPA or other privacy claims with statutory damages attached, these attorneys have used the threat of mass arbitrations to force exorbitant settlements based on the prohibitive cost of paying for individualised arbitrations – costs that largely fall on the company. Even with claims that are dubious on the merits, settlement often makes sense because reaching the merits requires advancing significant fees to the arbitral forum.
While companies have been fighting back by modifying their terms to allow for grouped or batched arbitrations where mass claims are threatened, the arbitral bodies have been slow to adjust to plaintiffs’ increased willingness to weaponise the arbitration process, and courts that have reviewed the new provisions have expressed scepticism about their enforceability.
The USA has not enacted a federal law like the EU Data Act, which aims to foster innovation and support the provision of services by making data more accessible and usable. Federal agencies and state legislatures have been active in this area, however. For instance, the CFPB recently issued its Personal Financial Data Rights rule (also frequently referred to as the “Open Banking” rule or the 1033 rule after the section of the Consumer Financial Protection Act that it implements), which requires certain financial institutions to make transaction data available to consumers (and third parties acting with consumers’ authorisation) in a standardised format that would enable use of that data by other entities in the financial services ecosystem. In addition, state privacy laws typically give consumers the right to obtain their personal data free of charge and in a format that enables portability so that they can transfer their personal data to another service. These laws are designed to foster both competition and innovation in the digital economy.
Regarding regulations governing internet of things (IoT) providers, the USA has focused more on the security of IoT devices than on the ability of such devices to make data available for use by others. For instance, the IoT Cybersecurity Improvement Act of 2020 directed the National Institute of Standards and Technology (NIST) to develop standards and guidelines for the federal government on the appropriate use and management of IoT devices “owned or controlled by an agency and connected to information systems owned or controlled by an agency, including minimum information security requirements for managing cybersecurity risks associated with such devices”. While this legislation regulates federal government procurement practices, it will nonetheless have an impact on the consumer marketplace as manufacturers that sell such devices to the federal government adjust their practices according to NIST guidelines.
In addition, two states – California and Oregon – have passed legislation mandating that manufacturers of IoT devices sold in those states ensure, among other things, that such devices have “reasonable security features” to protect the device and any information “from unauthorised access, destruction, use, modification or disclosure”. Other states have proposed similar legislation.
In the USA, some laws and regulations designed to foster competition in the digital information ecosystem also impose data privacy obligations, to ensure that such data will be protected even as the data is made available for new products and services. For instance, as previously noted, the CFPB’s Personal Financial Data Rights rule seeks to promote competition among various providers in the financial technology ecosystem by giving consumers the right – free of charge – to request the transfer of their personal financial data in usable format to third parties and to allow third parties to access such data with consumer authorisation. By requiring certain financial institutions, defined as “data providers” (what would be “data holders” under the EU Data Act), to make this data available, consumers may be able to switch between financial institutions more easily, potentially increasing competition and improving service offerings.
The CFPB’s rule is also intended to spur innovation in the fintech marketplace by enabling greater interoperability among banks and various fintech providers. At the same time, the rule also imposes privacy and data protections, such as requiring data providers and third parties to limit the purposes for which consumer data is used and disclosed, and prohibiting the sale of consumer data or its use for targeted advertising and cross-selling. The rule also imposes various data security obligations of the GLBA on data providers and third parties.
Similarly, the state laws that give consumers the right to obtain their personal data in a portable and readily usable format, when technically feasible, are privacy laws that require companies to give consumers certain privacy rights and protections. These laws also generally impose data minimisation requirements on companies, limiting the amount of personal data that they can process to what is necessary, reasonable and proportionate for the purposes disclosed to the consumer. These data minimisation requirements may limit the amount of personal information that is ultimately available for transfer to or access by another entity.
Data-processing services, including cloud service providers and similar service providers, are subject to the laws and regulations described in the foregoing sections and that apply generally to controllers and processors of personal information. To that end, consumers have the right under state privacy laws to request that data-processing service providers give them a portable and readily usable copy of their personal data, or otherwise enable the transfer of such data directly to another provider.
While the CFPB has authority to enforce the Personal Financial Data Rights rule, there has been no enforcement of the rule as of this article’s publication, since the first of several compliance dates is not until 1 April 2026. The CFPB’s enforcement authority flows from the Consumer Financial Protection Act, which allows the CFPB to file an action in federal court or to initiate an administrative adjudication proceeding in response to violations of its regulations.
Unlike in the EU, businesses are not required – except in limited circumstances – to obtain opt-in consent from consumers in the USA before allowing cookies to collect their personal data. (The one exception is for trackers present on websites or platforms displaying or allowing access to video content.) Therefore, cookie banners are not required in the USA, although businesses increasingly use them. Instead, US state laws require businesses to provide consumers with a mechanism to opt out of the disclosure of their personal information to third-party cookies when such disclosures are for monetary or other valuable consideration (called a “sale” under many state privacy laws) or for targeted advertising (ie, advertising based on consumers’ online activities over time and across unaffiliated websites).
Most state privacy laws do require businesses to obtain opt-in consent from consumers before allowing third-party cookies to collect certain sensitive personal data, before “selling” personal data (ie, making such data available to certain types of third-party cookies) collected from consumers known to be minors, or before allowing third-party cookies to collect minors’ data for targeted (or personalised) advertising.
As previously noted, comprehensive state privacy laws generally require controllers to give consumers the opportunity to opt out of the processing of their personal information for “targeted advertising”. State laws generally define “targeted advertising” as “displaying to a consumer an advertisement that is selected based on personal data obtained or inferred over time from the consumer’s activities across non-affiliated websites, applications or online services to predict consumer preferences or interests”.
The CCPA uses different terminology but similarly requires businesses to give consumers the chance to opt out of “cross-context behavioural advertising”, which it defines as “the targeting of advertising to a consumer based on the consumer’s personal information obtained from the consumer’s activity across businesses, distinctly branded websites, applications or services, other than the business, distinctly branded website, application or service with which the consumer intentionally interacts”. The CCPA is slightly more restrictive because it treats entities as a single “business” only if they are under common control and share common branding. Therefore, an entity that uses personal information obtained from a consumer’s activity across its differently branded affiliates’ websites for targeted advertising may need to provide the consumer with a mechanism to opt out of such advertising.
State privacy laws generally require controllers to provide an opt-out link in the footer of their website homepages, and many state privacy laws require controllers to recognise browser-based opt-out signals that consumers can configure to signal their requests to opt out of sales and for sharing of personal data for targeted or cross-context behavioural advertising.
As previously noted, some states require opt-in consent to process the personal data of minors for targeted advertising, and most states with comprehensive privacy laws require opt-in consent to process sensitive personal data. Therefore, controllers will need to obtain opt-in consent before engaging in targeted advertising in some circumstances.
The USA does not have a comprehensive federal employee privacy law, but an employer’s handling of employees’ personal information may be subject to sector-specific federal and state laws designed to protect the confidentiality of certain information (eg, FCRA, the Americans with Disabilities Act, BIPA (biometrics) and various state personnel file laws). There are also federal and state constitutional, statutory and common law protections for privacy in the employment context, as follows.
If the employer is a government entity, the Fourth Amendment limits searches of employees’ workspaces, devices and communications where employees have a reasonable expectation of privacy, and some state constitutions provide additional privacy protections.
If the employer is a private entity, the Fourth Amendment does not apply. However, state wiretapping and eavesdropping statutes may restrict the monitoring of employee communications, several states require notice to employees of electronic monitoring, and common law privacy torts may apply to particularly intrusive forms of workplace surveillance.
In all cases, employees may have specific privacy rights established in a written employment contract.
Control of personal data is typically transferred between corporate entities as part of a merger, acquisition, asset purchase or other corporate transaction. In many cases, receipt of such personal data by the acquiring entity may be necessary for the acquiring entity to provide services or may have substantial value on its own (for example, for identifying and marketing to potential or former customers).
Before a merger, acquisition or other corporate transaction is closed, the acquiring entity will typically engage in due diligence of the target (or selling) entity’s data privacy and security practices. This typically involves a review of the target entity’s policies and procedures, and obtaining information from the target’s subject matter experts. As part of the transaction, the acquiring entity will typically receive representations and warranties from the target or selling entity related to data privacy and security.
State data privacy laws permit a target entity to transfer personal data to an acquiring entity as part of a proposed or actual merger, acquisition or other transaction without triggering a “sale” of personal data under those laws. Companies therefore need not offer consumers a right to opt out. The CCPA, however, expressly prohibits the acquiring entity from using or disclosing this personal data in a manner that is materially inconsistent with the commitments made to consumers by the target entity unless the acquiring entity provides adequate notice of the change in practices.
While US laws generally do not impose restrictions on the transfer of personal information outside the USA, restrictions have recently been imposed for national security reasons on transfers of certain US personal data, with a particular focus on transfers to China. While these restrictions are aimed at data brokers selling personal information to foreign governments and affiliated companies, international data transfers to governments or state-controlled entities in politically sensitive countries will need to be evaluated given the potential scope of these new measures.
For instance, the Protecting Americans’ Data from Foreign Adversaries Act of 2024 (PADFA) prohibits data brokers from selling, licensing, renting, trading, transferring, releasing, disclosing, providing access to or otherwise making available personally identifiable sensitive data of a US individual (ie, a person residing in the USA) to any foreign adversary country or any entity controlled by a foreign adversary. Sensitive data includes government-issued identifiers, biometric information, genetic information and precise geolocation information. Such data is considered personally identifiable if it identifies or is reasonably linkable to (alone or in combination with other data) an individual or their device. Foreign adversary countries are currently defined as China, Iran, North Korea and Russia.
Most recently, the US Supreme Court upheld the Protecting Americans from Foreign Adversary Controlled Applications Act, which required TikTok to cease operating in the USA unless its controlling interest was severed from Chinese control. The Court upheld the law based on national security considerations, including preventing China from controlling a communications platform that allowed it to collect sensitive personal data associated with 170 million US TikTok users.
Separately, new DOJ regulations that cover a wide range of transactions restrict foreign access to sensitive US data by “countries of concern” and other “covered entities”, including private persons and entities that are subject to the control or jurisdiction of “countries of concern”. The regulations implement the Biden administration’s Executive Order on Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern. The restrictions apply to transactions involving sensitive personal data that exceeds certain bulk volume thresholds. The current “countries of concern” are China, Cuba, Iran, North Korea, Russia and Venezuela. Sensitive personal data includes precise geolocation information, biometric identifiers, human genomic data, personal health data, personal financial data and personal identifiers (eg, names linked to advertising IDs).
The Committee on Foreign Investment in the United States (CFIUS), an interagency committee chaired by the US Department of the Treasury, can block transactions that could allow foreign adversaries access to US sensitive personal data. Indeed, in September 2022, President Biden signed the Executive Order on Ensuring Robust Consideration of Evolving National Security Risks by the Committee on Foreign Investment in the United States, which, among other things, directed CFIUS to scrutinise transactions involving sensitive personal data.
There are no general localisation requirements for personal data under US law. However, sector-specific laws and regulations may have the effect of requiring certain types of information to be kept within US territory.
See 5.2 Government Notifications and Approvals.
See 5.1 Restrictions on International Data Transfers.
Data Protection in the USA: an Introduction
As we step into 2025, the landscape of US data privacy law continues to be dominated by a patchwork of state-level privacy laws, with several states, such as California, already having a comprehensive consumer privacy law in place (the California Consumer Privacy Act, as amended by the California Privacy Rights Act), while other states are implementing or enacting their own privacy legislation.
By way of reminder, the first wave of US state data privacy laws came into effect in 2023 (California, Virginia, Colorado, Connecticut and Utah), with more data privacy laws coming into effect during the course of 2024 (Florida, Oregon, Montana and Texas). In January 2025, US state data privacy laws also entered into effect in Delaware, Iowa, Nebraska, New Hampshire and New Jersey – and more will enter into effect in the coming months, as described below.
As usual, it is not just new US state data privacy laws making the headlines, with a number of recent enforcement actions also dominating the news. This article introduces some of the key areas that are currently of note.
State legislation
The changes introduced by the US state data privacy laws that entered into effect in 2023 and 2024 have already been covered in previous Trends & Developments chapters, so will not be revisited here. Washington’s My Health My Data Act (MHMDA) is the first comprehensive US state law to protect “consumer health data” outside the scope of the Health Insurance Portability and Accountability Act (HIPAA). It came into effect on 31 March 2024 and is enforceable by the Washington Attorney General and via a private right of action.
In the absence of comprehensive federal data privacy legislation, more and more state legislatures have been considering introducing a state data privacy law to regulate the handling of personal information relating to state residents and to give state residents privacy rights (such as the right of access, the right of correction, the right of deletion and the right to opt out of targeted advertising or the sale of their personal information) that are gradually becoming standard.
The following US state data privacy laws have already taken effect in 2025: Delaware, Iowa, Nebraska, New Hampshire and New Jersey (all in January 2025).
Later this year, the following US state data privacy laws are due to take effect: Tennessee (1 July 2025), Minnesota (31 July 2025) and Maryland (1 October 2025).
Finally, the following US state data privacy laws are due to take effect in January 2026: Indiana, Kentucky and Rhode Island (all on 1 January 2026).
Each of the above US state data privacy laws applies to an entity that conducts business in the relevant state and fulfils one of two thresholds: it controls or processes the personal data of a minimum number of state residents annually (commonly 100,000, though lower in some states – eg, 35,000 in Delaware, New Hampshire, Maryland and Rhode Island); or it controls or processes the personal data of a smaller number of residents (commonly 25,000) and derives a specified portion of revenue from the sale of personal data.
The exceptions are Minnesota and Nebraska’s laws, which apply generally to all businesses processing personal data in each state, except small businesses as defined by the US Small Business Administration.
Like the other US state data privacy laws, these laws apply to personal information collected from a natural person who is a resident of the state and, like most other US state data privacy laws (other than California), they expressly exclude personal information collected or processed from a natural person in an employment or commercial context (eg, business-to-business activities). Personal data is defined in these laws as any information that is “linked or reasonably linkable to an identified or identifiable individual” and excludes de-identified data and publicly available information.
They also include typical exemptions in line with most other US state data privacy laws, such as any information or data regulated by existing federal privacy laws, including HIPAA, the Children’s Online Privacy Protection Act (COPPA) and the Gramm-Leach-Bliley Act (GLBA). In this respect, some US state data privacy laws (such as Delaware and New Jersey) include an entity-level exemption under the GLBA, whereas other US state data privacy laws (eg, Minnesota) provide only a data-level GLBA exemption. Similarly, most new state privacy laws provide only data-level exemptions in relation to HIPAA. This means that organisations subject to federal laws such as HIPAA or GLBA may not be out of scope of certain US state data privacy laws.
Each of these laws provides for consumer privacy rights that have now become standard in the recent US state data privacy laws: the right to access personal data, the right to correct inaccuracies, the right to delete personal data, the right to data portability, and the right to opt out of targeted advertising, the sale of personal data and certain profiling.
The laws also give consumers the right to appeal decisions regarding their consumer rights requests, and give data controllers 45 days to comply with a consumer privacy rights request, with an additional 45-day extension to the extent reasonably necessary.
Most US states also impose certain obligations in relation to the processing of “sensitive data”, although the scope of what constitutes “sensitive data” differs across US states.
Finally, most US states require controllers to conduct data protection assessments for activities that present a heightened risk of harm to consumers – eg, in relation to targeted advertising, the sale of personal data, the processing of sensitive data and certain kinds of profiling.
Growing enforcement by US state attorneys general
In February 2024, the California Attorney General announced a settlement with DoorDash, whereby the company was required to pay a USD375,000 civil penalty to resolve allegations that it violated the California Consumer Privacy Act (CCPA) and the California Online Privacy Protection Act (CalOPPA). The investigation by the California Department of Justice claimed that DoorDash sold the personal information of its California customers without providing notice or an opportunity to opt out of that sale in violation of both the CCPA and CalOPPA. The sale allegedly occurred in connection with DoorDash’s participation in a marketing co-operative, where businesses contribute the personal information of their customers in exchange for the opportunity to advertise their products to each other’s customers.
In addition to the financial penalty, the company is required to comply with California requirements that apply to businesses that sell personal information, including reviewing contracts with marketing and analytics vendors and reviewing the use of technology to evaluate whether it is selling or sharing consumer personal information.
In June 2024, the California Attorney General announced a USD500,000 settlement with Tilting Point Media LLC, resolving allegations that the company violated the CCPA and COPPA by collecting and sharing children’s data without parental consent in its popular mobile app game “SpongeBob: Krusty Cook-Off”. According to the California Attorney General, the app was first investigated by the Children’s Advertising Review Unit (CARU), a division of the Better Business Bureau National Programs that investigates potential deceptive or inappropriate data collection from children online. CARU found that the privacy and advertising practices of the SpongeBob app failed to comply with COPPA and CARU’s industry guidelines.
Although Tilting Point took some corrective action, a joint investigation by the California Department of Justice and the Los Angeles City Attorney’s Office found that Tilting Point was in violation of the CCPA and COPPA in connection with how the mobile app handled children’s data. In particular, the age screen did not ask for age in a neutral manner, meaning that children were not encouraged to enter their age correctly so that they could be directed to a child version of the game. It was also found that inadvertently misconfigured third-party software development kits (SDKs) resulted in the collection and sale of children’s data without parental consent.
In September 2024, the Texas Attorney General announced a settlement with a Dallas-based artificial intelligence healthcare technology company called Pieces Technologies, resolving allegations that the company deployed its products at several Texas hospitals after making a series of false and misleading statements about the accuracy and safety of its products. An investigation conducted by the Texas Attorney General found that the company had made deceptive claims about the accuracy of its healthcare AI products. The settlement agreement includes requirements related to disclosures in connection with the marketing and advertising of the company’s products or services, prohibitions against misrepresentations (including regarding the independence of endorsers or reviewers of the company’s products or services) and documentation obligations concerning potentially harmful uses of its products or services.
A coalition of 14 state attorneys general, led by California and New York, has also sued TikTok under state unfair and deceptive acts and practices laws and COPPA, alleging that the service is harmful for young users’ mental health and knowingly collects the personal information of children under 13.
Finally, the California Privacy Protection Agency announced an investigative sweep concerning data broker compliance with the registration requirements of the California Delete Act. A few weeks later, it announced settlements with Growbots, Inc. and UpLead LLC for allegedly failing to register and pay the annual data broker registration fee. Growbots agreed to pay USD35,400 to resolve the claims, and UpLead agreed to pay USD34,400.
Federal enforcement actions
The Federal Trade Commission (FTC) announced significant settlements in 2024, resolving allegations of the unlawful collection, sale and use of precise location information by data brokers (X-Mode, InMarket Media, Mobilewalla and Gravy Analytics), and this trend is expected to continue. Following the change of administration, Commissioner Andrew Ferguson has assumed the chairmanship of the FTC. His concurring and dissenting statement in December 2024 on the Mobilewalla and Gravy Analytics cases suggests that we will likely continue to see unfairness claims against organisations engaging in the unlawful collection and sale of location data, whereas claims alleging that organisations unfairly categorised consumers based on sensitive characteristics are less likely.
Data transfers
On 27 December 2024, the US Department of Justice issued a final rule carrying out Executive Order (EO) 14117 on “Preventing Access to Americans’ Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern”. The EO charged the US Department of Justice with establishing and implementing a new regulatory programme to address the national security threat posed by “countries of concern” (and covered persons that they can leverage) accessing and exploiting Americans’ bulk sensitive personal data and certain US government-related data. The “countries of concern” include China (including Hong Kong and Macao), Russia, Iran, North Korea, Venezuela and Cuba.
The rule defines six categories of “sensitive personal data”: covered personal identifiers, precise geolocation data, biometric identifiers, human genomic data, personal health data and personal financial data.
Data excluded from the definition of “sensitive personal data” includes public or non-public data that does not relate to an individual (eg, trade secrets and proprietary information), data that is already lawfully publicly available from government records or widely distributed media, and personal communications and certain informational materials. These exclusions apply to each of the categories of sensitive data.
According to the US Department of Justice, the final rule is intended to address the vulnerability of bulk sensitive data, as such data may be used to develop and enhance AI capabilities and algorithms that, in turn, enable the use of large datasets in ways detrimental to US national security – for example, to identify US persons whose links to the federal government would otherwise be obscured in a single dataset and who can then be targeted for espionage or blackmail.