HB 3875, the TERMS Act, would compel online service providers to publicly disclose their acceptable use policies within 180 days and to detail how they enforce those policies, including any third parties involved and whether off-site conduct can trigger actions. It also requires advance written notice before restricting a user, with limited exceptions, and allows a public disclosure option for those notices.
Finally, providers would publish an annual, open‑license report describing enforcement actions and appeals.
In addition, the bill would empower the Federal Trade Commission to enforce these provisions as unfair or deceptive practices, set forth guidance to aid compliance, and establish a framework for categorizing enforcement actions and alert sources. The overarching aim is to give consumers, businesses, and organizations clearer information about platform moderation so they can make informed decisions and compare platforms on how they handle restrictions and appeals.
At a Glance
What It Does
Not later than 180 days after enactment, online service providers must publicly disclose a concise acceptable use policy describing prohibited acts, enforcement methods (including third parties used), appeal options, and whether off-site conduct can justify restrictions. The policy must also address the notice requirements described in Section 5.
Who It Affects
Public-facing websites, apps, and online services engaged in interstate commerce must publish their AUPs; users and organizations relying on those platforms are directly affected by the increased transparency.
Why It Matters
This establishes a baseline of transparency for moderation practices, enabling users to understand what counts as a violation and how actions are enforced, which supports informed decision-making and market competition.
What This Bill Actually Does
The TERMS Act would require online service providers to publish a public Acceptable Use Policy (AUP) within 180 days of enactment. The policy must clearly describe what conduct is prohibited, how enforcement works (including any third parties used), and whether acts outside the platform can lead to restrictions.
It must also explain whether users have an appeal process and how the provider handles notices about potential enforcement, including whether off-site behavior can be a basis for action.
A key feature is the notice requirement: providers must give advance written notice before restricting a user, with a good‑faith effort to notify the user at least seven days prior where feasible. If contact information is unavailable, providers must use on-site notices or other prominent disclosures.
The bill also lets users opt to have such notices publicly disclosed on the provider’s site. Exceptions apply for court orders or imminent serious harm, in which case restrictions can occur without prior notice but must be disclosed promptly afterward.

The act also requires an annual enforcement report, open-license and machine-readable, detailing instances of potential policy violations, actions taken (terminations, suspensions, or other restrictions), and appeal outcomes. This reporting must categorize actions by the exact policy provision violated and by the alert source. Enforcement is assigned to the FTC, with authority extended to nonprofit organizations under certain conditions, and the FTC must issue compliance guidance within 180 days.

Taken together, the TERMS Act aims to illuminate how platforms moderate user content and use restrictive tools, providing a data-driven basis for evaluating platform practices and consumer protections without prescribing platform-specific outcomes.
The Five Things You Need to Know
The bill requires public disclosure of the acceptable use policy within 180 days.
The policy must detail prohibited acts, enforcement methods, appeal options, and whether off-site conduct can trigger restrictions.
Advance written notice is required before restricting a user, with a 7‑day lead time where feasible.
Providers must offer a public-disclosure option for written notices.
An annual enforcement report, published under an open license in machine-readable form, must detail enforcement actions and appeals.
Section-by-Section Breakdown
Purpose of the Act
Section 2 explains that the goal is to ensure consumers, businesses, and organizations have sufficient information about platform standards, processes, and policies when a provider terminates, suspends, or restricts a user. The aim is to promote informed choices and a competitive marketplace by clarifying how online services manage restrictions.
Definitions
Section 3 defines key terms: Commission (FTC), nonprofit organizations, online service provider (including the requirement to create a user account and be engaged in interstate or foreign commerce), and the meaning of restrict and user. The definitions anchor who is covered and how actions are categorized.
Disclosure of Acceptable Use Policies
Section 4 requires a public disclosure of an acceptable use policy within 180 days and specifies what information the policy must include: prohibited acts, enforcement methods (including third parties used), appeal rights, and whether off-site conduct can justify restrictions. It also ties the policy to how notices are described in Section 5.
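Section 4 specifies what an acceptable use policy must disclose, not how it must be formatted. As a purely illustrative sketch, a provider could represent those required disclosures as structured data; every field name and value below is a hypothetical example, not drawn from the bill:

```python
# Illustrative structure for the disclosures Section 4 requires an
# acceptable use policy to contain. All keys and values are hypothetical.
acceptable_use_policy = {
    "prohibited_acts": ["spam", "harassment", "malware distribution"],
    "enforcement_methods": {
        "in_house_review": True,
        "third_parties": ["ExampleModerationCo"],  # third parties must be disclosed
    },
    "appeal_rights": "Users may appeal any restriction within 30 days.",
    "off_site_conduct_considered": True,  # whether off-site acts can justify restrictions
    "notice_procedures": "Advance written notice per Section 5.",
}

# A simple compliance check: verify every required disclosure is present.
required = {
    "prohibited_acts",
    "enforcement_methods",
    "appeal_rights",
    "off_site_conduct_considered",
    "notice_procedures",
}
missing = required - acceptable_use_policy.keys()
print("Missing disclosures:", sorted(missing))
```

A structured representation like this would also make it easier to audit policies across providers, which is the comparison the bill's transparency goal envisions.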
Advance Written Notice Prior to Termination or Suspension
Section 5 mandates advance written notice before restricting a user, with a seven‑day good‑faith lead time when feasible. The notice must describe the restricted act, violation of policy, appeal options, and the possibility of publicly disclosing the notice. Exceptions exist for court orders and imminent harm, with contemporaneous disclosure requirements in such cases.
Reporting Requirements
Section 6 requires an annual enforcement report, open-license and machine-readable, detailing the number of potential violations, actions taken (including terminations and suspensions), appeals, reversals, and categorization by the specific policy provision violated and the alert source.
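The bill does not prescribe a file format for the annual report, only that it be open-license and machine-readable. The sketch below shows one plausible shape such a report could take, with categorization by policy provision and alert source; all field names and values are invented for illustration:

```python
import json

# Hypothetical machine-readable enforcement report of the kind Section 6
# describes. Every field name and value here is an assumed example.
report = {
    "provider": "example.com",
    "reporting_year": 2024,
    "enforcement_actions": [
        {
            "policy_provision": "AUP 3.2 (spam)",    # exact provision violated
            "alert_source": "automated_detection",   # how the violation was flagged
            "action": "suspension",                  # termination, suspension, or other restriction
            "appealed": True,
            "appeal_outcome": "reversed",
        },
    ],
}

# Aggregate actions by policy provision and alert source, mirroring the
# bill's categorization requirement.
totals = {}
for action in report["enforcement_actions"]:
    key = (action["policy_provision"], action["alert_source"])
    totals[key] = totals.get(key, 0) + 1

print(json.dumps(report, indent=2))
```

Publishing the raw records alongside such aggregates would let researchers recompute the categorized totals independently, which is one way the "machine-readable" requirement pays off.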
Enforcement
Section 7 assigns enforcement to the FTC, treating violations as unfair or deceptive acts or practices. It authorizes FTC powers consistent with the FTC Act, with special provisions for nonprofit organizations. The FTC must issue guidance within 180 days, and guidance cannot create rights or constrain enforcement beyond the statute.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Individual users of online services gain clearer expectations about moderation and a transparent path to appeals and public notices.
- Small and medium-sized businesses relying on online platforms can anticipate how restrictions are enforced, aiding risk management.
- Nonprofit organizations and advocacy groups benefit from clearer moderation standards and access to enforcement data.
- Consumer protection agencies and policymakers gain clearer metrics through annual enforcement reporting.
- Researchers and market analysts obtain structured data to study platform moderation practices.
Who Bears the Cost
- Online service providers must publish, maintain, and periodically update AUPs, and prepare annual enforcement reports, incurring compliance costs.
- Legal and compliance teams within providers must manage documentation, notices, and appeals processes.
- Regulators (FTC) may require additional resources to issue guidance and enforce the new regime, elevating oversight costs.
- Smaller platforms with limited resources may face disproportionate burdens from reporting and disclosure requirements.
Key Issues
The Core Tension
The central dilemma is balancing transparency and user rights against platforms' need for timely, safety-driven moderation. Mandatory advance notice and public disclosure may slow or complicate enforcement, while robust reporting and FTC guidance could improve accountability at the cost of higher compliance burdens and greater risk aversion among platforms.
The TERMS Act casts a broad coverage net by defining online service providers to include any public-facing website or app that offers a product or service and engages in interstate commerce. While transparency is a legitimate goal, the implementation creates potential burdens for smaller platforms, which may struggle to publish comprehensive AUP disclosures, sustain ongoing notices, and compile annual reports.
The requirement to publicly disclose written notices (at the user’s option) adds a new dimension to platform moderation data, raising questions about privacy, reputational risk, and the possible chilling effect on enforcement. There is also a tension between the need for rapid action in the case of imminent harm and the statutory focus on notice and appeal timing.
The reliance on third parties for enforcement (as disclosed in the policy) could complicate accountability and lead to blurred responsibility across different actors in the moderation stack.