Contributed By Holland & Knight LLP
The US digital economy operates primarily under a patchwork of federal and state laws. This framework evolved as technology outpaced legislation, creating both opportunities and challenges for businesses navigating this space. Understanding these overlapping rules helps companies avoid costly missteps.
Intellectual Property Protection
Digital companies rely heavily on IP laws to protect their innovations and content. Copyright law governs digital content and software, while patent law covers technological inventions (though software patentability remains contentious). Trade mark law protects brand identity online, and trade secret law safeguards proprietary algorithms and data. These protections form the foundation of most digital business models.
Data Privacy and Security
Unlike Europe’s General Data Protection Regulation (GDPR), the United States lacks a single federal privacy law. Instead, sector-specific regulations exist: the Health Insurance Portability and Accountability Act (HIPAA) for health data, the Children’s Online Privacy Protection Act (COPPA) for kids’ information, and various state laws such as the California Consumer Privacy Act (CCPA) for consumer privacy. The Federal Trade Commission (FTC) also enforces against deceptive data practices under its consumer protection authority. This fragmented approach means compliance requirements vary significantly.
E-Commerce and Platform Regulation
Section 230 of the Communications Decency Act provides liability protection for online platforms regarding user-generated content. The CAN-SPAM Act regulates commercial email, while state laws address e-commerce issues. Antitrust authorities increasingly scrutinise major tech platforms, though enforcement approaches may shift dramatically from one administration to the next.
Industry Self-Regulation
Many digital sectors supplement legal requirements with voluntary codes of conduct. Trade associations often establish best practices for advertising, privacy and content moderation. While typically not legally binding, these standards influence business practices and shape future regulation. As the landscape continues to shift, companies need to stay informed and adaptable.
The United States faces ongoing debates about how to regulate digital markets without stifling innovation. Unlike the EU’s proactive regulatory approach, American policy tends towards reactive enforcement, which leaves significant gaps and creates uncertainty for both established companies and start-ups trying to plan ahead.
Competition and Platform Access
Antitrust enforcers are pushing hard on how traditional competition law applies to gatekeeper platforms, app stores and self-preferencing behaviour. Merger review has become more aggressive, and remedies increasingly focus on data portability. The friction points are predictable: access to platform infrastructure and APIs, terms for distribution and ranking, and platforms mining business-user data to launch competing products.
Content Moderation and Liability
Section 230 still shields platforms from most user-content liability, except for federal crimes and some IP violations, but states are chipping away at its edges through new laws that courts have not yet fully sorted. The Digital Millennium Copyright Act’s (DMCA) notice-and-takedown system anchors copyright online, though generative artificial intelligence (AI) has thrown new complications into the mix for training data and fair use. Meanwhile, states are passing youth-safety and age-verification laws that have raised First Amendment questions.
Privacy Patchwork
The FTC goes after deceptive data practices, breach notification is mandatory everywhere, and California-style privacy laws give users access, deletion and opt-out rights. Public companies now face SEC disclosure rules for material cyber incidents. What has emerged is a de facto national standard: clear disclosures, meaningful user controls and documented risk processes, even though the specific state requirements still vary.
Emerging Technology Governance
AI, cryptocurrency and Web3 technologies often operate in regulatory grey zones. Agencies are stretching existing authorities to cover new technologies, but clear guidelines remain elusive. Companies often cannot determine what is permissible until after enforcement actions occur.
Federal Taxation
The United States does not have a federal-level sales or similar tax on digital goods or services. Instead, each state and/or locality sets its own rules regarding what constitutes a taxable digital good or service, how it is sourced, and whether exemptions apply, producing a highly fragmented and often inconsistent system.
State and Local Taxation
For sales tax purposes, some states tax nearly all digital products and services, while other states exempt most digital goods and services. For instance, Washington and Connecticut apply sales tax to nearly all digital products and often classify Software as a Service (SaaS) as a taxable retail service, resulting in a wide tax base. Other jurisdictions, such as California and Florida, often exempt digital goods unless they fall into very specific statutory provisions.
Apart from sales taxes, there may also be specific state or local taxes aimed at digital goods or services – such as digital advertising taxes (which are discussed more thoroughly in 1.4 Taxation of Digital Advertising).
Compliance Challenges
Companies face several significant challenges in managing tax compliance as laws struggle to keep pace with modern business models and technologies, especially with inconsistencies across state and local taxing statutes. Many of these laws were written decades ago and were never designed with today’s digital economy and technologies in mind. As a result, taxing authorities often attempt to apply outdated statutory language to new and evolving technologies, creating uncertainty and inconsistent interpretations.
These challenges are compounded by the ongoing need to educate departments of revenue about the nature of digital products and services. Because technology evolves rapidly, tax administrators may not always have a clear understanding of how certain digital offerings function or how they should be classified within existing state or local tax frameworks. This often leads to protracted audits, conflicting guidance and uneven enforcement.
Adding to the complexity is the fact that states and localities frequently take divergent approaches when determining the taxability of digital goods and services. Even when jurisdictions attempt to modernise their tax codes, they do so in different ways and on different timelines. As a result, companies face a patchwork of rules governing similar transactions, increasing both compliance burdens and the risk of inadvertent non-compliance. This lack of uniformity makes it difficult for businesses to implement consistent tax policies, automate compliance processes or predict their overall tax exposure.
In the United States, there is no special tax at the federal level on digital advertising revenues. Instead, states have begun experimenting with gross receipts taxes or expanding sales tax bases to include digital advertising services.
An Evolving Landscape
Maryland was the first state to impose a digital advertising tax. The tax applies to companies with global annual revenue greater than USD100 million and at least USD1 million in Maryland digital advertising revenue. The tax was immediately met with numerous legal challenges, mainly under the Internet Tax Freedom Act (ITFA) and the United States Constitution’s Commerce Clause and First Amendment. In August 2025, the Fourth Circuit ruled that Maryland’s digital advertising tax violated the First Amendment. However, for the time being, the tax remains in effect. Apart from Maryland, other states have considered and continue to consider implementing digital advertising taxes.
Apart from implementing specific special taxes, some states have instead expanded their sales tax base to include digital advertising services. For instance, in October 2025, Washington began taxing digital advertising services as part of an expanded sales tax base. Again, legal challenges were filed raising ITFA concerns.
Companies and practitioners should continue to monitor the outcome of the legal challenges in Maryland and Washington and expect more states to adopt special digital advertising taxes or expanded sales taxes that include digital goods and services.
Compliance
Compliance with digital advertising taxes requires a multifaceted approach given the rapidly evolving nature of applicable law.
Physical and economic thresholds
Companies should begin by determining whether they have nexus – physical or economic – with a given state and/or locality. Because several states rely on economic nexus thresholds tied to gross receipts, companies must assess whether their digital advertising revenue sourced to a state meets applicable thresholds, as this will trigger registration, filing and remittance responsibilities.
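As a simplified illustration of the threshold analysis described above, a nexus screen can be modelled as a comparison of state-sourced revenue against each state's trigger. The threshold figures below are purely hypothetical placeholders – actual economic nexus thresholds vary by state and may also look to transaction counts or physical presence.

```python
# Hypothetical sketch of an economic-nexus screen for state-sourced
# digital advertising revenue. Threshold values are illustrative only;
# real thresholds vary by state and change over time.

HYPOTHETICAL_THRESHOLDS = {
    "MD": 1_000_000,  # illustrative in-state revenue trigger
    "WA": 100_000,    # illustrative only
}

def has_economic_nexus(state: str, in_state_revenue: float) -> bool:
    """Return True if sourced revenue meets the state's assumed threshold,
    which would trigger registration, filing and remittance duties."""
    threshold = HYPOTHETICAL_THRESHOLDS.get(state)
    if threshold is None:
        return False  # no threshold on file; flag for manual review
    return in_state_revenue >= threshold

print(has_economic_nexus("MD", 1_500_000))  # True
print(has_economic_nexus("MD", 250_000))    # False
```

In practice, the mapping of revenue to a state (the sourcing step discussed below) is the hard part; the comparison itself is mechanical.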
Taxable digital advertising activities
Next, companies should identify which of their digital advertising activities are taxable, recognising that definitions vary widely across jurisdictions. If there are taxable services, companies should then evaluate the proper sourcing of revenue. Such sourcing rules often look to the location of the user who views or interacts with the advertisement. This frequently necessitates enhanced data-tracking capabilities and co-ordinated efforts between tax, marketing and technology teams. Companies should foster collaboration between tax and advertising teams to ensure accurate reporting, and should integrate new tax requirements into contract terms, pricing models and data-collection practices.
Navigating uncertainty: additional considerations
Companies should be mindful of the various compliance uncertainties. While reviewing potential taxes, companies should also evaluate the legality of new digital advertising taxes and expanded sales tax bases to determine whether to challenge or comply with such laws. Additionally, since this is an evolving area, companies should monitor legislative changes, tracking bills that may impact their business and adjusting practices accordingly.
Digital goods and services are subject to regulations designed to safeguard consumer interests. In addition to the FTC Act, key federal laws also include COPPA, which restricts the collection of personal information from children under 13, and the CAN-SPAM Act, which regulates commercial email communications. The Telephone Consumer Protection Act (TCPA) and the Truth in Advertising laws further protect consumers from misleading marketing and unwanted communications. Many states have enacted their own privacy and consumer protection statutes, such as California’s CCPA, which grants consumers rights over their personal data.
Upholding Consumer Rights
Companies operating in the TMT sector must take proactive steps to uphold consumer rights. These include:
Regular training for staff on compliance requirements and consumer rights is also recommended.
The FTC provides a complaint process for consumers, and many states offer similar mechanisms through their own consumer protection offices. Companies are encouraged to establish internal dispute resolution procedures, such as dedicated complaint handling teams and escalation protocols, and alternative dispute resolution (ADR) methods. However, companies should be aware that plaintiffs’ counsel have increasingly weaponised ADR through tactics such as mass-arbitration filings, using the very mechanisms designed for efficiency to generate substantial pressure and extract settlements. As a result, businesses must carefully structure their dispute resolution provisions and ensure they remain compliant, balanced and resistant to misuse.
Best Practices for Consumer Disputes
Best practices for handling consumer disputes in the digital economy include:
TMT companies should also be aware of industry-specific codes of conduct and standards, such as those issued by the Interactive Advertising Bureau or the Digital Advertising Alliance. Adhering to these guidelines can enhance consumer trust and demonstrate commitment to ethical business practices.
Cryptocurrency in the Digital Economy
Cryptocurrency provides an alternative method to transfer value, using a variety of blockchains. Cryptocurrencies can thus be involved in any legal issues or disputes involving a transfer of value. The borderless nature of cryptocurrencies makes cross-border and international legal issues, including jurisdiction and choice of law, more common than in contexts involving conventional payments.
Cross-Border Accessibility
Cryptocurrencies make cross-border financing and payments more accessible; they can also make identifying and holding bad actors accountable challenging. The wide variety and quality of blockchain technology, smart contracts, centralised and decentralised exchange interfaces, and wallet hardware and software – and the fact that many cryptocurrency services include all of these technologies in tandem – can introduce a large number of technological points of failure and attack vectors when using cryptocurrencies. Most blockchains’ cumbersome software update processes mean they are generally slow to adapt to known vulnerabilities.
Regulatory Uncertainty
The complexity of and regulatory uncertainty surrounding blockchains and cryptocurrencies present legal opportunities in the United States, as entities seeking to serve the lucrative US market – or access US capital – require legal counsel in navigating the United States’ regulatory framework.
GENIUS Act and “Market Structure” Legislation
In 2025, the United States passed the federal GENIUS Act, which defines a specific category of US dollar-denominated, fiat-backed stablecoins – “payment stablecoins” – and provides a high-level framework for regulating their issuance and use. Federal agencies overseeing the implementation of the GENIUS Act, including the Department of the Treasury, the Federal Deposit Insurance Corporation, the Federal Reserve and the National Institute of Standards and Technology, are currently creating implementing rules and regulations.
The GENIUS Act takes effect on 18 January 2027, or 120 days after implementation regulations are issued, whichever comes first. Under the GENIUS Act, payment stablecoins are not considered a security or a commodity. Other types of cryptocurrencies are subject to a patchwork of overlapping federal and state laws and regulations.
Depending on the context, cryptocurrencies are regulated under federal and state securities laws; commodities laws; money transmission laws; and consumer protection statutes, among others. As of the time of writing, draft legislation whose purpose is to create a uniform regulatory approach to cryptocurrencies – the so-called “market structure bill” – is being debated in Congress.
Cloud and Edge Computing Laws
Key federal requirements include the Federal Information Security Modernization Act (FISMA) for government contractors, the Federal Risk and Authorization Management Program (FedRAMP) for cloud service providers serving federal agencies, and export control regulations (EAR/ITAR) restricting where certain data may be located.
The FTC enforces against unfair or deceptive business practices regarding cloud security claims. Industry codes include the Cloud Security Alliance’s Security, Trust, Assurance, and Risk (STAR) Registry and ISO/IEC 27017/27018 standards for cloud security and privacy.
Restrictions on Regulated Industries
Heavily regulated sectors face significantly stricter requirements. Financial services institutions (including insurance businesses) must comply with the Gramm–Leach–Bliley Act (GLBA) and implementing regulations governing cybersecurity and data privacy for information systems. In addition, many states have implemented laws for this sector, such as the New York Department of Financial Services’ cybersecurity regulation (23 NYCRR 500), which mandates specific cloud vendor oversight.
Healthcare providers and their cloud vendors must satisfy stringent security and privacy safeguards under HIPAA, including required Business Associate Agreements with all cloud vendors. Other sectors with specific requirements for cybersecurity and/or privacy include education, telecommunications providers, some portions of the transportation and energy sectors, and defence contractors. All of these sector-specific requirements typically require extensive vendor due diligence, audit rights and incident notification provisions.
In addition, an increasing number of federal and state laws are addressing the privacy and safety of personal information and services associated with minors.
Data Protection in Cloud Computing
Cloud computing raises specific privacy concerns under the comprehensive privacy laws that have been passed in approximately 20 states, beginning with the CCPA.
Critical issues include:
Evolving Regulatory Landscape
While AI is quickly reshaping the regulatory landscape, the disruption has only begun. There is still no universally accepted framework for regulating the use of AI. As with any major technological shift, lawmakers, regulators and practitioners are still debating how existing and emerging laws should and will apply. Legislators across the United States are actively considering new measures aimed at safeguarding consumers and organisations in this evolving environment.
Existing Legal Frameworks Influencing AI
A range of existing laws already influences the development and deployment of AI tools, including non-discrimination and employment laws, intellectual property frameworks, as well as privacy and defamation principles. In the United States, we are also seeing a growing set of state-level initiatives directed at preventing bias and promoting fairness. Many of these focus on areas such as employment decisions, housing, lending and healthcare.
Proposed and Emerging AI Regulations
Beyond those categories, additional types of regulation under consideration include:
Deepfake-Related Legal Issues
On the issue of deepfakes, many states have passed or proposed laws requiring labelling of intentionally fabricated images or videos, often motivated by concerns about public misinformation – especially during election cycles. While these laws are typically designed to protect the public rather than individual subjects, existing causes of action such as defamation, rights of publicity, and sexual-image-related civil statutes remain available for individuals harmed by AI-generated content. Because of free speech protections, regulators cannot prohibit the publication of fabricated or satirical content outright, but they may be able to require appropriate disclosures to reduce confusion and deception.
AI in Transportation
Autonomous vehicles, commercial drones and drone-delivery systems have been heavily regulated for years, pre-dating the explosion of AI regulation. For example, vehicle safety rules strictly govern what can be introduced to the market, and aviation and drone operations are subject to stringent licensing and operational requirements. Numerous states have passed legislation setting forth testing and deployment procedures for autonomous vehicles. As regulation evolves, it remains to be seen whether AI-specific rules will emerge in these spaces, but existing frameworks already create significant guardrails.
Liability and Cross-Cutting Legal Issues
Finally, when assessing liability, transparency, insurance, data protection, intellectual property and fundamental rights concerns in connection with AI, the analysis typically relies on longstanding legal doctrines rather than AI-specific rules. Some of these topics – such as machine-to-machine communications, IoT-related secrecy and security, compliance and data-sharing obligations – overlap with broader technology governance issues and often require interdisciplinary legal expertise.
IoT Laws and Regulations
The FTC actively enforces consumer protection laws against IoT manufacturers for inadequate security practices and deceptive privacy claims. The California IoT Security Law requires “reasonable security features” for connected devices. The IoT Cybersecurity Improvement Act of 2020 establishes minimum security standards for IoT devices purchased by federal agencies. State privacy laws apply to personal information processed in connection with IoT devices.
Other Regulations
Sources of industry self-regulation include:
Sector-specific regulations which also cover IoT security include:
IoT Challenges
Companies deploying IoT solutions in the United States will find some challenges in managing the fragmented and evolving regulatory landscape. In particular:
Governance Frameworks
The National Institute of Standards and Technology (NIST) Cybersecurity Framework 2.0 is likely the most widely used and respected security standard in the United States, and NIST has offered specific guidance in connection with IoT devices, such as “IoT Device Cybersecurity Guidance for the Federal Government: Establishing IoT Device Cybersecurity Requirements” and “Profile of the IoT Core Baseline for Consumer IoT Products”.
Organisations should also consider a Data Protection Impact Assessment for IoT devices that will process potentially sensitive personal information.
IoT Legal Requirements
IoT companies face various data sharing obligations depending on jurisdiction and sector. In particular:
Sector-specific rules include:
Thresholds for Data Sharing Requirements
The thresholds and types of companies subject to these data sharing requirements can vary by law and by state, but typically include:
Non-profits are exempt from many but not all such laws.
Sector-specific requirements include:
Sensitive Personal Information
Many states have more stringent requirements for “sensitive” personal information, such as requiring opt-in consent for processing and imposing stricter limitations on the use or sale of such information. Such information typically includes precise geolocation data, biometric data, health and genetic data, sexual orientation, racial/ethnic origin, children’s data and financial information.
Complex Regulatory Structures
Questions about audiovisual media services often touch on complicated telecoms-related regulatory structures. Determining whether a service – such as TV, radio, a video-sharing platform or a streaming service like YouTube, Spotify or Netflix – falls within a regulated category depends on the underlying transmission medium and the nature of the service.
Licensing and Consumer Protection
Services operating over licensed airwaves, radio frequencies, satellite transmissions or other limited public communication resources may be subject to FCC licensing. By contrast, streaming platforms that offer bundled channels resembling linear programming (virtual MVPDs), as well as services that distribute proprietary or user-generated content, generally fall outside direct FCC regulation.
Regardless of licensing status, all consumer-facing media and entertainment services remain subject to consumer protection laws, intellectual property rules, and heightened obligations relating to children’s safety. Regulators are increasingly focused on mixed-use platforms to ensure minors’ experiences are safe and developmentally appropriate.
User-Generated Content and Paid Media Services
When a platform hosts user-generated content or charges consumers for access to media, consumer protection law will govern the business model and user interactions. These frameworks play a significant role in shaping operational and compliance planning in the media space.
Dependence on Communication Pathways
Telecommunications-related regulatory questions often hinge on how a product uses communications pathways. There is no single rule that governs when a telecommunications product must be licensed prior to market introduction. Instead, products become subject to specific requirements when they rely on regulated, limited or high-value communication channels.
Licensing Requirements
For example, any product using licensed radio spectrum, satellite transmissions or certain protected broadcast signals typically requires formal licensing. In contrast, consumer technology that merely incorporates communications capabilities without using restricted pathways may not require specialised telecoms licensing.
Security Requirements
Security expectations also play a significant role in this space. Regulators across jurisdictions have coalesced around the concept of “reasonable security”, which places the responsibility on companies to identify the risks associated with their products and data practices and to implement safeguards proportional to those risks. While the laws avoid prescribing a single, one-size-fits-all security framework, they empower regulators – including the FCC – to take enforcement action when companies fail to meet reasonable security standards.
Critical Infrastructure Requirements
Regulators have shifted, particularly under the Biden administration, to treating internet service providers as critical infrastructure. This designation brings enhanced security and resilience obligations, reflecting the reality that widespread internet outages – or targeted attacks on telecoms infrastructure – could disrupt governmental operations, defence functions, law enforcement and essential services.
Shift in Regulatory Priorities
Net neutrality was a highly debated topic several years ago, but that intensity has largely subsided. After multiple rounds of federal rule changes during the Trump and Biden administrations, many stakeholders have stepped back from prioritising federal net neutrality battles. Multiple large internet providers have voluntarily committed not to engage in the behaviours that net neutrality proposals sought to curb, even as federal regulations have shifted back and forth.
Impact on Legal Risk Assessments
Telecommunications is, at its core, a technology-driven field, and to reduce risk and ensure compliance, companies in this sector must maintain robust internal processes to understand how their products work, how they affect consumers and what risks the products introduce. New products and new technologies are just the canvas on which those dynamics play out.
Data Protection and Privacy Implications
Both the proliferation of IoT devices and enhanced connectivity for devices via 5G networks generate exponential increases in personal data collection. Under regulations such as GDPR in Europe, the CCPA in California and similar statutes globally, companies face heightened obligations regarding:
Liability and Accountability Frameworks
AI-driven decision-making in telecommunications contexts – such as network optimisation, customer service and fraud detection – creates novel liability questions:
Internal vs Consumer-Facing Implications
These technologies can affect both consumer-facing offerings and internal operations. For in-house counsel, this means evaluating risks not only in outward-facing deployments but also in the systems the organisation uses internally.
Challenges in Technology Agreements
Organisations entering technology agreements in the United States face significant challenges:
Mandatory Rules
Liability limitations are generally enforceable, but courts may invalidate caps for gross negligence, wilful misconduct, fraud, or breaches of confidentiality. Some states also restrict or prohibit limitations for personal injury or statutory violations, such as consumer protection or privacy laws with private rights of action. Because enforceability varies by state, parties often debate governing law and forum selection.
Certain states require specific contractual privacy provisions when sharing personal data with service providers or third parties, and agreements must also account for statutory data-retention obligations (eg, tax, employment and healthcare records). While price adjustments are generally permitted, unconscionability doctrines and consumer protection laws may limit unilateral modifications, particularly in consumer-facing agreements.
Sector-Specific Requirements
Many regulated industries (such as financial services, healthcare, government contractors) must impose security, privacy and other obligations on their downstream service providers. These requirements can include data residency restrictions, audit rights, reporting obligations, and prescriptive administrative, physical and technical safeguards.
Key Elements in Telecoms Service Agreements
Key elements that should be included in telecommunications service agreements include:
Negotiating Favourable Terms
Companies should weigh their respective risk positions, pricing and term commitments, and potential exclusivity and market considerations to assess their relative negotiating leverage and reach the compromises necessary for a mutually beneficial relationship.
Considerations in Interconnection Agreements
Interconnection agreements are subject to the Telecommunications Act of 1996, which imposes specific obligations on incumbent local exchange carriers (ILECs). Agreements must be filed with state Public Utility Commissions and may be subject to arbitration. The agreement will generally need to address industry-standard protocols (eg, SS7, SIP, TDM), compensation mechanisms, traffic exchange points (such as points of interconnection and physical collocation), quality and performance measures, number portability obligations (Local Number Portability) under FCC rules, security and fraud prevention (eg, robocall mitigation) and liability allocation for outages, misrouted traffic, regulatory non-compliance and unauthorised use.
E-Signatures and Digital ID Rules
Electronic signatures are widely accepted in the United States under the federal Electronic Signatures in Global and National Commerce Act (ESIGN) (15 U.S.C. § 7001 et seq), together with the Uniform Electronic Transactions Act (UETA), which has been substantively adopted by 47 states plus DC; the remaining states have similar laws in effect.
These laws recognise electronic signatures and records as legally valid and enforceable, provided that parties consent to execute an agreement electronically, understand they are entering into a binding contract, and suitable electronic records are preserved. There are, however, limited exceptions (sometimes dependent on jurisdiction) such as for wills, adoption documents, court orders, divorce decrees, notices of utility cancellation and certain UCC transactions.
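The validity conditions described above can be summarised as a simple pre-execution checklist: party consent, intent to be bound, suitable record retention and no excluded document type. The sketch below is a conceptual model only – the category names are illustrative, not statutory terms, and the exceptions are jurisdiction-dependent.

```python
# Conceptual checklist reflecting the ESIGN/UETA conditions described in
# the text: consent, intent to be bound, record retention, and no
# excluded document category. Category names are illustrative.

EXCLUDED_CATEGORIES = {
    "will", "adoption_document", "court_order", "divorce_decree",
    "utility_cancellation_notice",
}

def e_signature_supported(doc_category: str, consented: bool,
                          intends_to_be_bound: bool,
                          record_retained: bool) -> bool:
    """Return True if the basic ESIGN/UETA validity conditions appear met."""
    if doc_category in EXCLUDED_CATEGORIES:
        return False  # jurisdiction-dependent exceptions apply
    return consented and intends_to_be_bound and record_retained

print(e_signature_supported("sales_contract", True, True, True))  # True
print(e_signature_supported("will", True, True, True))            # False
```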
Trust Services
The United States lacks comprehensive trust services regulation, but there are some digital trust schemes. For example, most states now recognise remote online notarisation. Likewise, no comprehensive federal digital identity framework exists, but the NIST Digital Identity Guidelines (SP 800-63) provide voluntary standards for identity proofing, authentication and federation, widely adopted by federal agencies and incorporated into FedRAMP requirements. Various state digital driver’s licence initiatives are emerging as well.
A Decentralised Regulatory Framework
Unlike many jurisdictions with unified national gaming statutes, the United States relies primarily on state-level regulations to govern gambling, wagering, sweepstakes, contests and other promotional devices, supported by targeted federal laws that reinforce areas such as anti-money laundering, consumer protection and interstate wagering.
In recent years, this framework has tightened considerably as states reassess the legality of sweepstakes casinos and online wagering formats amidst concerns about consumer risk and regulatory gaps.
Identifying the Game
Gambling
Most states define “gambling” based on three core elements: consideration, chance and prize. Traditional casino gambling, sports wagering and online real-money gaming are permitted only in states that authorise and license these activities.
Sweepstakes and promotional contests
By contrast, sweepstakes and promotional contests rely on compliance structures designed to avoid being classified as gambling. These models commonly use a free “alternative method of entry” and employ dual-currency systems – such as gold coins for entertainment and sweepstakes coins redeemable for prizes – to distinguish themselves from direct wagering. However, the rapid growth of these sweepstakes casinos between 2020 and 2024 prompted increasing legislative scrutiny.
Increased scrutiny
In 2025 and 2026, several states – including California, New York, Montana, Connecticut, New Jersey and Nevada – implemented explicit bans or statutory definitions that sweep these models into their gambling prohibitions. Other states issued attorney-general opinions, cease-and-desist orders or subpoenas to operators, signalling significant enforcement momentum nationwide.
Federal law also shapes the broader regulatory framework, though it does not directly license or prohibit sweepstakes or wagering models. The Unlawful Internet Gambling Enforcement Act restricts payment processing for unlawful online gambling, reinforcing state prohibitions by limiting how transactions can be facilitated. The FTC also plays a central role by enforcing rules on advertising transparency, consumer disclosures, and avoiding unfair or deceptive practices – obligations that apply to sweepstakes, contests and any promotional model using digital engagement. Skill-based competitions and fantasy sports typically fall outside the definition of gambling at the federal level, but states may still impose their own licensing, age-gating or consumer protection requirements.
Managing the Risks
Litigation
Recent class actions against sweepstakes platforms have alleged deceptive marketing, illegal gambling and misleading prize-redemption structures. These suits add civil exposure even when criminal or administrative enforcement may be limited. Marketing practices – including influencer promotions – have also drawn scrutiny where platforms fail to disclose paid relationships or where content appears directed towards minors.
These trends reflect regulators’ expectations that transparent design, responsible-gaming practices and clear disclosures are now baseline compliance features.
Compliance whiplash
From an operational standpoint, companies face significant complexity. Compliance programmes must adapt to varying state definitions of gambling, rapidly evolving legislation, and enforcement actions that may occur with little warning. Many operators now rely on geo-blocking, enhanced Know-Your-Customer procedures, and systems that clearly separate free play from prize-based entry. Businesses using contests or sweepstakes for marketing must ensure their promotions do not inadvertently satisfy a state’s test for gambling. Well-designed internal reviews – covering mechanics, disclosures, prize structures and user-experience elements – are essential to mitigate these risks.
Companies in the TMT sector must track state-level developments closely and incorporate compliance into product features from the outset. Those that adopt transparent practices, prioritise consumer protection and respond proactively to regulatory signals will be best positioned to operate lawfully in this fast-changing landscape.
Regulation of gambling, wagering, sweepstakes and promotional contests in the United States is primarily driven by state governments, with federal authorities playing a supporting role. Each state maintains its own regulatory structure, licensing requirements and enforcement procedures.
Regulatory and Enforcement Authority
At the state level, authority may rest with a variety of regulators, each with a distinct mandate depending on the state’s gaming framework.
These bodies oversee licensing, ensure consumer protection, investigate potential violations, and enforce penalties ranging from administrative sanctions to civil and, in some cases, criminal consequences. Their jurisdiction typically extends to land-based casinos, online sports wagering, fantasy sports, charitable gaming and, increasingly, sweepstakes-style and promotional gaming platforms.
The Critical Role of State Attorneys General
In addition to gaming-specific regulators, state attorneys general (AGs) play an increasingly central role, particularly in assessing whether sweepstakes and promotional contests violate state gambling laws. Throughout 2025 and 2026, AGs in states such as New York, Tennessee, Louisiana and Mississippi initiated broad enforcement campaigns targeting unlicensed sweepstakes casinos. Their enforcement tools are far-reaching and include cease-and-desist letters, subpoenas, civil investigative demands, negotiated settlements, and referrals for criminal prosecution when warranted by statute. AG offices are often the first to react when operators exploit perceived legal grey areas or when consumers raise concerns about deceptive practices, unfair disclosures or inaccessible redemption processes.
Monitoring state bills, tracking AG opinions and maintaining open communication with regulators can help mitigate enforcement risk. In an environment where agencies increasingly co-ordinate across jurisdictions and enforcement activity is accelerating, companies that adopt proactive compliance strategies are better positioned to operate lawfully and maintain consumer trust.
IP Challenges for Game Developers
Game developers face IP challenges at every stage. Pre-launch clearance is critical, as music, fonts, engines and other third-party assets must be properly licensed to avoid infringement claims.
Protecting original content is equally complex. Copyright covers code, art and audiovisual elements, but gameplay mechanics are generally unprotectable unless expressed in a sufficiently specific manner. Patents may protect novel systems but are costly and time-consuming, while trade secrets offer protection only if confidentiality is maintained.
Enforcement presents ongoing difficulties. Successful games often attract clones, while fan projects occupy legal grey areas. Streaming and “Let’s Play” content can implicate rights but are often tolerated as marketing. Because enforcement is expensive and outcomes uncertain, developers typically must prioritise which infringements to pursue.
Protecting IP in Virtual Spaces
Creators own the copyright to their original work the moment it is created, whether the work consists of models, textures, animations or sound files. Registration, however, is required before filing a federal infringement suit, and timely registration unlocks statutory damages and attorneys’ fees.
Ownership becomes less straightforward when development occurs under employment or contractor agreements that assign those rights to someone else – for example, through work-made-for-hire provisions.
Platform licences can complicate IP ownership. Contributions to virtual worlds or mod platforms typically require granting the platform rights to host and display the work, with the scope of those rights varying from narrow to broadly permissive, depending on the terms of service.
Moral rights protections are limited in the United States compared to Europe, allowing licensees greater freedom to modify or destroy properly licensed works without legal consequence.
Copyright in Digital and Virtual Assets
Digital assets are protected by copyright upon creation, but only as to their specific expression, not the underlying ideas. Independent creation of a similar asset – for example, a comparable weapon model – does not constitute infringement.
Format matters for enforcement. Game assets are typically stored in files on players’ devices, making them susceptible to extraction and unauthorised reuse. Although this generally constitutes infringement, it is widespread and difficult to police.
Derivative works complicate matters quickly. A player who creates a custom skin based on a studio’s character model may technically need permission, but most rights holders enforce only when derivative works compete commercially or harm the brand.
Copyright registration timing is critical. Registering before infringement occurs or within three months of publication preserves eligibility for statutory damages and attorneys’ fees. In the absence of timely registration, rights holders are typically limited to proving actual damages, which can be challenging for digital assets.
Trade Mark Rules for Virtual Goods
Trade mark protection extends to virtual goods when they function as source identifiers – if consumers recognise a branded virtual item as originating from a particular source, it is generally protectable, as courts have confirmed across virtual platforms.
Actual use in commerce is required: a mark must identify source and cannot be purely decorative. For example, a brand name on a virtual storefront is more likely protectable than a logo used only as background decoration.
Unauthorised virtual luxury goods are widespread in games and metaverse platforms. Although brand owners pursue takedowns and litigation, and although platform terms typically prohibit such uses, enforcement remains inconsistent.
User-Generated Content and IP Implications
User-generated content drives engagement but highlights areas where IP law lacks clarity. Players who create mods or skins typically own the copyright in their original contributions, but those works are also derived from the underlying game, which generally requires permission from the game’s creator. This is usually addressed through platform terms of service that permit creation while preserving the studio’s core rights, making careful drafting essential.
Clear third-party IP policies and licensing provisions are critical. While platforms may rely on DMCA safe-harbour protections, doing so requires prompt responses to takedown requests and termination of repeat infringers.
Rather than a single comprehensive digital-platform regime, the regulation of social media in the United States is composed of federal consumer protection enforcement, state consumer protection laws, industry self-regulation and platform-specific rules.
This decentralised structure means companies in the TMT sector must navigate a multilayered framework focused on transparency, fair business practices, advertising integrity and user safety. Increasingly, regulators are attentive not only to data-use practices but also to how platforms shape user experience, promote content and influence vulnerable populations.
The FTC’s Role
At the federal level, the FTC is the primary regulator of social media practices. Under Section 5 of the FTC Act, it polices deceptive or unfair conduct across platforms, including misleading advertising; failure to disclose material connections in endorsements; dark-pattern interfaces that manipulate user choice; and inadequate labelling of sponsored or algorithmically promoted content.
The FTC’s Endorsement Guides apply broadly to influencers, brands and platforms, requiring clear and conspicuous disclosure of compensation, gifts or relationships that could affect the credibility of endorsements.
In addition, the National Advertising Division (NAD), a self-regulatory body under BBB National Programs, reviews social media advertising for misleading claims, improper testimonials, inadequate substantiation and disclosure failures. While NAD decisions are not binding, they are frequently adopted by the FTC or state attorneys general when matters escalate to formal enforcement.
State Consumer Protection Laws
State consumer protection laws add further obligations. Many state regulators use their general consumer protection statutes to police advertising practices, especially where marketing targets minors, seniors or other vulnerable groups. Several states have introduced or are considering laws that impose duties related to algorithmic recommendations to minors, age-appropriate design, and protections against predatory or manipulative advertising.
Industry Self-Governance
Platforms also impose their own rules governing advertising, influencer marketing and promotional campaigns, including strict internal policies for sweepstakes and contests. These typically require clear disclosure of material terms, prohibit misleading claims, and mandate compliance with applicable federal and state laws. Violations can result in removal of content, suspension of accounts, or termination of advertising privileges.
Though platforms are generally protected from liability for user-generated content, they can still be held accountable for misleading or harmful practices associated with the presentation, curation or amplification of content – especially when such practices contradict public statements about safety or integrity. Regulators are increasingly evaluating how platform design choices impact exposure to harmful content, misinformation and targeted advertising towards protected or vulnerable groups.
For companies creating or leveraging social media, the regulatory landscape demands advertising transparency, endorsement compliance, protection of minors, and adherence to both statutory requirements and platform-specific rules.
Enforcement Priorities
Regulatory and compliance risks for social media companies largely arise from how federal, state and self-regulatory bodies enforce the framework outlined in 10.1 Laws and Regulations for Social Media. Enforcement now extends well beyond privacy into advertising integrity, consumer protection and the prevention of predatory or manipulative practices. Regulators increasingly expect companies to document, monitor and substantiate how their platforms present information, deliver advertising and safeguard vulnerable users.
Federal Enforcement
The FTC continues to lead federal enforcement. Recent matters have focused on failures to disclose paid relationships, improper use of endorsements or testimonials, deceptive marketing aimed at children and seniors, and dark-pattern interfaces that obscure material terms or steer users towards monetised outcomes. The FTC has also pursued cases involving misleading “organic” content that is in fact sponsored, as well as inadequate enforcement of platform rules on influencer disclosures or sweepstakes promotions.
State Enforcement
State attorneys general are active enforcers as well. Their actions frequently target predatory advertising, manipulation of minors through targeted content, misleading “free trial” or subscription-renewal practices, and misrepresentations tied to influencer marketing. States with broader consumer-protection statutes have applied them to address algorithmic recommendation practices and to scrutinise marketing directed at seniors, medically vulnerable populations and children.
Platforms’ Standards and Obligations
Compliance obligations also arise from platform-specific advertising and promotional rules, which frequently impose stricter requirements than federal law. Platforms typically mandate clear disclosures for paid or sponsored content; prohibit promotional tactics that target minors deceptively; and impose detailed eligibility, disclosure and record-keeping requirements for sweepstakes, contests and giveaways. Failure to follow these rules can result in removal of content, loss of monetisation privileges, or account termination.
As enforcement expands, companies must ensure that their advertising and promotional practices are substantiated, transparent, and consistent with platform policies and consumer-protection expectations. This includes monitoring influencers for compliance, reviewing content for adequate disclosures, ensuring that sweepstakes and promotions meet both legal and platform standards, and avoiding dark-pattern designs that regulators increasingly view as unfair or deceptive.
Key Telecoms Data Privacy Laws
The Communications Act imposes baseline privacy obligations on telecommunications carriers regarding customer proprietary network information (CPNI), such as call detail records, location information and service features. Carriers must obtain opt-in consent before using CPNI for marketing non-communications services, implement safeguards to protect CPNI, notify customers and law enforcement of CPNI breaches, and file annual compliance certifications with the FCC.
The Telephone Consumer Protection Act (TCPA) imposes restrictions on telemarketing calls, text messages and auto-dialled communications, requiring prior express written consent for marketing using automated systems.
Comprehensive state privacy statutes in approximately 20 states (eg, CCPA) can apply to telecommunications providers’ non-CPNI data activities, including device sales, non-communications services, and internet services not classified as telecommunications.
Data Privacy Challenges for Telecoms Companies
The complexity of managing notices, consents, compliance and audit trails across these different sets of laws and regulations can be operationally challenging for telecoms operators. Due to the nature of telecoms operations, such providers may also find challenges in meeting data minimisation principles, accommodating consumer privacy rights such as access and deletion, and providing meaningful privacy notices that simplify complex circumstances and practices for consumers.
Cross-Border Compliance for Telecoms Operators
US telecoms operators must balance complex legal and operational considerations when handling international traffic. They may need to find lawful means for addressing foreign cross-border data transfer requirements such as under GDPR, and they may need to meet localisation requirements under certain federal and state contracts, as well as under certain commercial customer agreements.
Balancing Surveillance and Privacy
Telecoms operators are particularly likely to receive government demands for access to data under various laws, including the Communications Assistance for Law Enforcement Act (47 U.S.C. § 1001 et seq) and the Stored Communications Act (18 U.S.C. §§ 2701-2712). As a result, they will typically have a robust process directed by legal counsel for vetting and lawfully responding to legal demands while mitigating risks associated with conflicting laws (such as from foreign jurisdictions) as well as contractual obligations, public privacy representations and reputational considerations.
Third-Party Vendors and Cloud Service Providers
Telecom providers rely extensively on third parties for network infrastructure, cloud services, data analytics, billing and CRM, and content delivery such as edge computing. All of these hardware, software, cloud and networking services need to be factored into the privacy compliance programme, security assessments and other risk management functions.
Impact of Evolving Regulations
Telecom providers often maintain robust legal and privacy functions in order to monitor the ongoing development and adoption of more advanced telecom technology and services, which then need to be reconciled with evolving federal and state data privacy regulations across different aspects of their suite of services (eg, CPNI, internet services, IoT, integrated third-party services, commercial data centres). Privacy compliance demands may disproportionately burden smaller operators, who may lack similarly robust compliance functions.
Challenges in User Data Protection
Digital media and streaming platforms have to navigate fragmented US privacy laws, including comprehensive state privacy statutes (eg, CCPA/CPRA, Virginia CDPA, Colorado CPA, Connecticut CTDPA, Utah UCPA), each with varying thresholds, requirements and consumer rights. Platforms must implement technical systems for access, deletion, correction and portability requests across distributed content delivery and analytics infrastructure. Platforms also need to address more targeted laws, such as the Children’s Online Privacy Protection Act (COPPA) for children’s privacy and the Video Privacy Protection Act (VPPA) for video viewing records. All jurisdictions expressly or through case law expect businesses to maintain “reasonable security” measures appropriate to data sensitivity, and all states have breach notification obligations with different timelines and requirements.
Privacy-by-Design and Security-by-Design in Digital Media
Privacy-by-design practices often include data minimisation efforts, privacy-oriented default settings (eg, opt-in for targeted advertising), purpose limitation controls, privacy dashboards for user control, and data protection impact assessments for high-risk processing activities. Digital media and streaming services often implement robust cybersecurity risk management programmes aligned with industry standards (eg, ISO 27001, NIST CSF) that increasingly include threat intelligence and modelling, secure coding practices, automated security testing, zero trust architecture, encryption (at rest, in transit, end-to-end for sensitive data), DDoS protection, WAFs, MFA, passwordless authentication and anomalous login detection. Security operations centres provide 24/7 monitoring with incident response playbooks.
Cyber Risks From Third‑Party Data Sharing
Platforms share data with demand-side platforms, supply-side platforms, data management platforms, ad exchanges and measurement providers. These activities are typically within the scope of state privacy laws, so compliance measures are necessary to address activities that may constitute a “sale” or “sharing” of information, and third-party targeted advertising, frequently through user opt-in/opt-out rights, Global Privacy Control signals and other related mechanisms. These third-party relationships are governed by an array of data processing agreements that include use restrictions, consent management, security requirements, audit rights and sub-processor management.
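Where the paragraph above mentions Global Privacy Control (GPC) signals: under the GPC specification, participating browsers send a `Sec-GPC: 1` request header when the user has enabled the preference. The following is a minimal Python sketch of honouring that signal server-side; the function name and the plain-dict header lookup are illustrative assumptions, and production systems would use case-insensitive header access and connect the result to the platform’s actual opt-out processing.

```python
def gpc_opt_out(headers: dict) -> bool:
    """Return True if a request signals a Global Privacy Control opt-out.

    The GPC specification defines a `Sec-GPC` request header whose value
    is "1" when the user has enabled the preference. Real HTTP stacks
    treat header names case-insensitively; this plain-dict lookup is a
    simplification for illustration only.
    """
    return headers.get("Sec-GPC", "").strip() == "1"


# A GPC-enabled request is treated as a "do not sell/share" opt-out,
# as several state privacy laws (eg, CCPA/CPRA) require.
print(gpc_opt_out({"Sec-GPC": "1"}))  # True
print(gpc_opt_out({}))                # False
```

In practice, the boolean returned here would be recorded alongside any user-submitted opt-out preferences and propagated to downstream advertising partners under the data processing agreements described above.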
Regulation and Impact on Media Platforms
Prescriptive cybersecurity regulations specific to digital media platforms have not emerged, but the risk exposure from data breaches, along with tenets of cybersecurity referenced in evolving privacy laws, has prompted such platforms to implement robust cybersecurity programmes and often promulgate extensive contractual requirements across their service provider and third-party ecosystems. Obligations include compliance burdens and data use limitations that increase protections but may inhibit innovation, especially with regard to AI training and use.
701 Brickell Avenue, Suite 3300
Miami, FL 33131
USA
305.374.8500
305.789.7799
www.hklaw.com