Data Protection & Privacy 2026

Last Updated March 10, 2026

USA – Florida

Trends and Developments


Authors



Jones Walker LLP has a privacy, data strategy and artificial intelligence team that helps clients with a full spectrum of data privacy, data protection and AI solutions, including identifying, preventing and responding to data incidents, contracting and transactional support, emerging technology guidance, and litigation and dispute resolution, all while managing and mitigating related risks. Its interdisciplinary team brings together highly experienced attorneys with professional backgrounds in a wide range of industries, including banking and financial services, healthcare, technology, telecommunications, energy, petrochemical, maritime, consulting, government, digital commerce and retail. Staying on top of legal, compliance and regulatory obligations under the myriad of burgeoning global privacy, data protection, and AI-related laws, regulations, frameworks and standards can be challenging for any organisation, large or small. Jones Walker has the knowledge and practical experience to help clients navigate these laws, obligations, frameworks and standards. Timely insights can be found at www.AILawBlog.com.

Florida’s Privacy Landscape: A Targeted Approach

Unlike broad consumer privacy laws adopted by other states, Florida’s framework is intentionally narrow, targeted and politically shaped. Florida focuses on Big Tech monopolies, foreign ownership risks and AI accountability rather than generalised privacy compliance. The result: many businesses operate without comprehensive privacy obligations, while a select few face intense scrutiny. As 2026 unfolds, Florida’s strategy is generating significant enforcement actions and legislative momentum that warrant close attention.

With a selective approach to privacy, many businesses are outside the scope of Florida’s omnibus consumer privacy rights law, the Florida Digital Bill of Rights (FDBR), but still face breach notification and security duties under the Florida Information Protection Act (FIPA), and they may have additional obligations if their online platforms or services are directed at, or substantially used by, minors. Florida has no general privacy notice requirement for most companies, no mandated consumer rights mechanisms, and no affirmative data governance obligations outside the narrow FDBR context. Certain categories, however, do face heavy scrutiny: Big Tech platforms subject to the FDBR, foreign-owned healthcare companies collecting biometric or demographic data and operators of AI systems interacting with children. Understanding whether an organisation falls into one of these targeted categories is essential for 2026 compliance planning.

The Florida Digital Bill of Rights: A Big Tech Law

The FDBR took effect in July 2024 and targets the world’s largest technology platforms. The law applies exclusively to controllers with more than USD1 billion in global gross annual revenue that also meet one of three criteria: (i) derive 50% or more of revenue from online advertising, (ii) operate a smart speaker or voice command service not integrated into vehicles, or (iii) operate an app store with at least 250,000 applications. This threshold limits coverage to a small group of global platforms such as Google, Meta, Apple and Amazon, while exempting mid-market businesses entirely. (Importantly, Florida imposes no comprehensive privacy programme, notice or data rights obligations on mid-market or small businesses.)
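The two-part applicability test above can be sketched in code. This is an illustrative simplification only (the statutory definitions are more nuanced, and the function name and inputs are hypothetical), but it shows why mid-market businesses fall outside the law: the revenue gate must be cleared before any of the three criteria matter.

```python
def fdbr_covered(global_revenue_usd: float,
                 ad_revenue_share: float,
                 operates_smart_speaker: bool,
                 app_store_app_count: int) -> bool:
    """Rough sketch of the FDBR applicability test described above.

    A controller is covered only if it has more than USD1 billion in
    global gross annual revenue AND meets at least one of the three
    criteria. Illustrative only, not legal advice.
    """
    if global_revenue_usd <= 1_000_000_000:
        return False  # revenue threshold not met: outside the FDBR entirely
    return (
        ad_revenue_share >= 0.5            # (i) 50%+ of revenue from online advertising
        or operates_smart_speaker          # (ii) smart speaker/voice service (non-vehicle)
        or app_store_app_count >= 250_000  # (iii) app store with 250,000+ applications
    )
```

For example, a USD2 billion platform deriving 60% of revenue from advertising is covered, while a USD500 million company meeting all three criteria is not.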

For covered entities, the obligations are substantial. The FDBR provides Florida consumers with access, correction, deletion and portability rights, plus the ability to opt out of targeted advertising, sales and profiling activities. Controllers must respond to consumer requests within 45 days, with a possible one-time extension of 15 days when reasonably necessary. The law requires controllers to provide at least two accessible methods for consumers to submit requests, accommodating how consumers normally interact with the controller.

Controllers must conduct data protection assessments for high-risk processing activities. These assessments must identify and weigh the benefits and risks of processing, considering the use of de-identified data, reasonable consumer expectations and the relationship between controller and consumer. Required assessments cover targeted advertising, sale of personal data, profiling that presents reasonably foreseeable risks, processing of sensitive data and other processing involving heightened risk. These assessments must be documented and made available to the Florida Department of Legal Affairs (the “Department”) upon request.

Enforcement rests exclusively with the Department, which can assess penalties of up to USD50,000 per violation, tripled for violations involving children or failures to honour deletion and opt-out requests. The law provides a 45-day cure period for alleged violations, though this opportunity does not apply to violations that knowingly involve children. If, after notification of an alleged violation, the entity cures the violation within 45 days and provides proof to the Department, no enforcement action can proceed, though the Department may issue a letter of guidance stating that no future cure period will apply.

First Enforcement: The Roku Action

In October 2025, Attorney General (AG) James Uthmeier brought Florida’s first FDBR enforcement action against video streaming provider Roku. The complaint alleged that Roku collected and sold children’s sensitive data without obtaining required consent and notice, enabled reidentification of children’s data and failed to provide effective privacy controls and opt-out mechanisms, violating both the FDBR and the Florida Deceptive and Unfair Trade Practices Act.

This enforcement action signals several enforcement priorities for the AG’s office. First, protection of children’s data receives heightened attention, consistent with the statute’s tripled penalties for violations involving minors. Second, transparency in data sales practices is non-negotiable; controllers must clearly disclose when they sell personal data and provide effective opt-out mechanisms. Third, the AG expects covered entities to obtain affirmative consent before collecting children’s information, not merely rely on age-neutral consent mechanisms.

The timing of the Roku action, coming barely 15 months after the FDBR’s effective date, signals that Big Tech platforms subject to the law should expect active enforcement. Covered entities should review their practices with regard to collection of children’s data, data sales disclosures and opt-out mechanisms to ensure compliance.

The CHINA Prevention Unit: Targeting Foreign-Owned Healthcare Companies

While FDBR enforcement has begun targeting platform behaviour, Florida’s privacy regulators are also targeting a different category of risk: foreign ownership and sensitive data collection.

In February 2026, Uthmeier announced the creation of the CHINA Prevention Unit, a specialised enforcement initiative within the AG’s office investigating foreign corporations, particularly those with Chinese ownership, that collect consumer data from Florida residents. The AG’s office has identified healthcare as a primary enforcement focus, citing concerns that sensitive health data may be shared with foreign adversaries. The unit has already issued subpoenas to medical device manufacturers, including demands for audits to identify ties to China and verify whether biometric or patient data is being transmitted to foreign servers.

This development creates immediate compliance risks for healthcare technology companies with foreign ownership or investment. The CHINA Prevention Unit’s mandate encompasses Florida’s data breach notification law, data security requirements and consumer protection statutes. Its creation reflects concerns that foreign adversaries could exploit health data access for intelligence purposes, individual targeting or infrastructure vulnerability assessment.

Telehealth platforms, health and fitness apps, genetic testing companies and medical device manufacturers with foreign ties should conduct ownership structure assessments and data transfer audits. Key compliance steps include documenting the corporate ownership chain, mapping data flows to determine whether personal information is transferred internationally and assessing whether business relationships with foreign entities create exposure under the CHINA Prevention Unit’s mandate.

The CHINA Prevention Unit reflects broader concerns about foreign access to sensitive health information. As healthcare increasingly relies on technology platforms, the collection of biometric identifiers, genetic data and detailed health metrics through wearable devices, mental health apps and genetic testing services has grown dramatically. Florida’s enforcement strategy suggests these practices will face heightened scrutiny when foreign entities are involved, either through direct ownership or through data sharing arrangements.

Artificial Intelligence Regulation: From Transparency to Rights

Florida is pursuing comprehensive AI regulation through initiatives addressing government use, consumer protection and children’s safety.

The Florida Artificial Intelligence Bill of Rights (SB 482), currently advancing in the 2026 legislative session (building on Gov. Ron DeSantis’ 2025 AI safeguards initiative), would establish significant consumer-facing AI rules, particularly for companion chatbots and data handling. The bill would require AI companies to provide parental controls allowing parents to access conversations their children have with companion chatbot platforms and large language models. This provision addresses growing concerns about children developing relationships with AI systems, receiving advice on sensitive topics without parental knowledge, or being exposed to inappropriate content through conversational AI interfaces.

The parental access requirement would create significant technical and operational challenges. Companies would need to implement age verification systems, develop secure portals for parental access and establish clear data retention policies.

The legislation would also restrict the sale or sharing of personal identifying information with third parties unless properly de-identified and would limit state agencies from using AI providers with certain foreign affiliations, reflecting national security and data sovereignty concerns similar to those underlying the CHINA Prevention Unit’s creation.

If enacted, SB 482 would position Florida alongside Colorado and California as a leader in AI regulation. Colorado’s AI Act focuses on algorithmic discrimination in consequential decisions. California’s various AI initiatives address deepfakes, automated decision-making and AI safety. Florida’s approach emphasises children’s protection and professional services boundaries rather than discrimination prevention.

Florida’s 2024 legislation established AI transparency requirements for government systems and mandated disclosures when AI creates or modifies political advertising, addressing deepfake concerns in elections. Under the FDBR, consumers can opt out of profiling that produces legal or similarly significant effects, addressing AI-driven automated decision-making in contexts like credit decisions, insurance underwriting, employment screening and access to essential services.

Industry-Specific Implications

Healthcare

Healthcare companies face the most immediate compliance pressure. Beyond CHINA Prevention Unit scrutiny, healthcare entities must navigate HIPAA requirements, state breach notification laws and potential AI regulation affecting telehealth and mental health applications. Foreign-owned companies or those transferring data internationally should expect enforcement attention in 2026.

Companies collecting biometric data, genetic information or detailed demographic profiles should conduct comprehensive risk assessments. Key evaluation points include documenting corporate ownership structures to identify foreign investors or parent companies, mapping data flows to determine whether health information crosses international borders, reviewing cloud service provider arrangements and data storage locations, and assessing whether business partnerships with foreign entities create exposure under the CHINA Prevention Unit’s mandate. Additionally, Florida law restricts state and local government contracts with entities owned or controlled by foreign countries of concern, creating further compliance considerations for foreign-owned healthcare companies seeking government contracts or partnerships. Healthcare companies should also evaluate whether their data security practices meet the “reasonable measures” standard under Florida law, particularly for cross-border transfers.

Fintech and financial services

Financial services firms generally remain outside the FDBR’s scope but face sectoral regulations under the Gramm-Leach-Bliley Act. However, fintech companies with foreign investment and those collecting health or biometric data for insurance or lending decisions should assess CHINA Prevention Unit exposure. Foreign-owned fintech companies should also note that, effective 1 July 2025, Florida law prohibits governmental entities from contracting with companies owned or controlled by countries of concern if such contracts involve access to personal identifying information. The intersection of financial services and healthcare data creates potential vulnerability. Companies processing payment data internationally should evaluate whether their ownership structures and data flows create compliance risks.

Education technology

Educational technology companies must consider how SB 482’s parental control requirements would affect learning platforms using AI tutoring or assessment tools. Ed tech companies should monitor interactions with student privacy law and AI transparency requirements, particularly regarding algorithmic decision-making in admissions, placement or disciplinary contexts.

Telecommunications

Telecommunications providers typically fall outside the FDBR’s scope but remain subject to data breach notification and security requirements. However, telecoms companies with foreign ownership or those offering health-related services through their platforms should evaluate CHINA Prevention Unit implications. As 5G networks enable more healthcare applications, the line between telecommunications and healthcare data collection becomes more blurred.

Data Breach Notification and Security Requirements

FIPA continues to impose notification obligations when breaches create a risk of financial harm. The statute’s risk-based approach requires covered entities to investigate breaches and consult with relevant law enforcement agencies before determining whether notification is necessary. If, after investigation and consultation, the entity reasonably determines that the breach will not cause financial harm to affected individuals, notification may be avoided. However, this determination must be reasonable, documented in writing, maintained for five years and provided to the Department within 30 days of the determination.

For breaches affecting 500 or more Florida residents, entities must notify the Department within 30 days of determining that a breach has occurred. If more than 1,000 residents are affected, entities must also notify all nationwide consumer reporting agencies. The notification to the Department must include specific information about the breach, the number of individuals affected and steps taken to address the incident. Third-party agents that maintain personal information on behalf of another entity must notify that entity as expeditiously as practicable but no later than ten days following determination of a breach or reason to believe a breach occurred.
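The notification triggers above can be summarised as a simple threshold check. This sketch (hypothetical function name and labels) assumes the risk-of-harm exemption does not apply, i.e. individual notice is already required:

```python
def fipa_notifications(residents_affected: int) -> set:
    """Sketch of FIPA's tiered notification triggers described above.

    Returns the parties to notify, given the number of Florida residents
    affected. Illustrative only, not legal advice.
    """
    notify = {"affected individuals"}
    if residents_affected >= 500:
        # 500+ residents: notify the Department within 30 days of the determination
        notify.add("Florida Department of Legal Affairs")
    if residents_affected > 1_000:
        # more than 1,000 residents: also notify nationwide consumer reporting agencies
        notify.add("nationwide consumer reporting agencies")
    return notify
```

A breach affecting 600 residents thus triggers Department notice but not credit bureau notice; one affecting 1,500 triggers both.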

Penalties can reach USD500,000 per breach, assessed at USD1,000 daily for the first 30 days, then USD50,000 per 30-day period up to 180 days. Notably, penalties apply per breach rather than per individual, creating more predictable liability than per-record penalty structures. The statute requires reasonable security measures to protect personal information and mandates secure disposal of records no longer needed for business purposes. The CHINA Prevention Unit’s focus on foreign-owned healthcare companies suggests security practices will face particular scrutiny where cross-border data transfers occur.
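The per-breach penalty schedule is mechanical enough to express as arithmetic. The sketch below (hypothetical function name) applies the rates described above, treating any violation continuing beyond 180 days as exposed to the USD500,000 statutory maximum; it is illustrative only:

```python
def fipa_civil_penalty(days_uncured: int) -> int:
    """Sketch of FIPA's per-breach penalty schedule described above.

    USD1,000 per day for the first 30 days, then USD50,000 per 30-day
    period (or portion thereof) through 180 days, capped at USD500,000.
    Illustrative arithmetic only, not legal advice.
    """
    CAP = 500_000
    if days_uncured <= 0:
        return 0
    if days_uncured <= 30:
        return 1_000 * days_uncured
    if days_uncured <= 180:
        # ceiling division: each started 30-day period after day 30 adds USD50,000
        extra_periods = -(-(days_uncured - 30) // 30)
        return min(30_000 + 50_000 * extra_periods, CAP)
    return CAP  # beyond 180 days: statutory maximum exposure
```

For instance, a violation uncured for 31 days yields USD80,000 (USD30,000 plus one USD50,000 period), and the schedule reaches USD280,000 at day 180 before the cap applies.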

Florida’s State Cybersecurity Act requires state agencies to implement cybersecurity measures aligned with the National Institute of Standards and Technology (NIST) Cybersecurity Framework, including risk assessments, incident response protocols and mandatory reporting of cybersecurity incidents and ransomware attacks. While these requirements apply to government entities rather than private businesses, they signal the state’s overall approach to data security and may influence expectations for reasonable security measures under the breach notification statute.

Looking Ahead

Florida’s privacy landscape in 2026 reflects targeted enforcement rather than comprehensive regulation. The CHINA Prevention Unit’s creation signals aggressive action on foreign ownership concerns, particularly in healthcare. The potential passage of SB 482 would establish substantial AI governance requirements focused on protecting minors and preventing unauthorised professional services through automated systems. The Roku enforcement action demonstrates that Big Tech companies subject to the FDBR should expect active enforcement with regard to protection of children’s data, transparency obligations and consumer opt-out rights.

This approach contrasts sharply with comprehensive privacy laws in California, Virginia, Colorado and other states. By focusing on specific perceived harms rather than establishing broad privacy frameworks, Florida is testing whether targeted, issue-specific regulation can protect consumers while minimising compliance burdens on most businesses. Florida’s model mirrors broader state-level scepticism about foreign access to data and the risks posed by generative AI, reflecting national political dynamics around data sovereignty and technological competition with adversary nations. The model appeals to business interests by limiting regulatory obligations for the vast majority of companies while addressing concerns about data practices by dominant platforms, data collection by foreign entities and AI risks to children.

Florida’s narrow model could face pressure to evolve if federal privacy or AI legislation advances in 2026–2027. Congressional proposals for comprehensive federal privacy legislation would pre-empt some state laws, though the extent of pre-emption remains contested. Federal AI regulation focusing on algorithmic discrimination, transparency or safety could establish baseline requirements that affect Florida’s enforcement priorities. Additionally, if other states adopt more comprehensive frameworks and businesses seek regulatory harmonisation, Florida may face pressure to align with emerging national standards.

The practical impact of Florida’s approach will depend significantly on enforcement intensity. The CHINA Prevention Unit’s success in pursuing foreign-owned healthcare companies will determine whether other states adopt similar strategies. SB 482’s passage and implementation will demonstrate whether targeted AI regulation focused on children and professional services can effectively address AI risks without comprehensive algorithmic governance frameworks. The Roku case and subsequent FDBR enforcement actions will establish whether Florida can effectively regulate Big Tech platforms while exempting mid-market businesses.

Healthcare companies with foreign ownership, Big Tech platforms under the FDBR, and any company deploying AI systems that interact with Florida children should prioritise compliance preparation in 2026. For these entities, Florida’s targeted enforcement creates significant liability risk despite the state’s generally business-friendly reputation. For most other businesses, Florida’s light-touch approach means privacy obligations remain limited to breach notification and reasonable security measures rather than comprehensive privacy programme requirements. This dual reality (intense scrutiny for some, minimal obligations for most) defines Florida’s distinctive privacy law landscape. Florida’s 2026 enforcement and legislative activity will shape whether the state maintains its targeted model or moves toward broader privacy legislation in 2027.

Jones Walker LLP

Brickell World Plaza
600 Brickell Avenue
Suite 3300
Miami, Florida, 33131
USA

+1 404 870 7531

jloring@joneswalker.com
www.joneswalker.com