The TERMS Act would require online service providers to publicly disclose their acceptable-use policies, provide written notice before restricting a user’s account, and publish an annual enforcement report. It sets a clear timeline for disclosure, defines what the policy must include, and requires mechanisms for appeals and for public disclosure of notices.
The bill also gives the Federal Trade Commission authority to enforce these provisions and to issue guidance for providers.
At a Glance
What It Does
The Act requires online service providers to publicly disclose an acceptable-use policy within 180 days, detailing prohibited acts, enforcement methods (including third-party reliance), appeals options, and limits on considering off-site conduct. It also requires annual enforcement reporting and, for terminations, advance written notice to affected users.
Who It Affects
Public-facing online platforms with consumer accounts, including websites, apps, and services engaged in interstate commerce, plus their users who are subject to restrictions.
Why It Matters
It standardizes platform moderation practices, increases transparency for users, and provides regulators with data to assess compliance and potential consumer harms.
What This Bill Actually Does
The TERMS Act creates a baseline for how online services moderate and restrict user activity. It starts by requiring every online service provider to publish an acceptable-use policy that clearly explains what is forbidden, how the provider enforces those rules, and whether appeals are available.
The policy must also describe how third parties might assist in enforcement and how off-platform behavior can influence on-platform decisions. If a provider makes a material change to its policy, it must give users advance notice before those changes take effect.
In addition to the policy, the Act requires advance written notice to any user who is about to be restricted for policy violations, detailing the specific conduct, how it violated the policy, and whether the user can appeal. The provider must also offer an option for public disclosure of this notice to users who choose it.
The legislation also obligates providers to publish an annual report describing enforcement actions, including counts of restrictions, appeals, reversals, and the sources of alerts, in both human- and machine-readable formats. The FTC would enforce these provisions, and providers would receive guidance to help with compliance.
Taken together, the measure aims to increase transparency and accountability in how online services govern user access, while giving consumers clearer information and recourse when restrictions occur.
The Five Things You Need to Know
180-day disclosure deadline for acceptable-use policies.
Policies must detail prohibited acts, enforcement methods, and appeals rights.
Advance written notice required before restricting a user, with a seven-day good-faith notice window.
Option for public disclosure of notices to users who choose it.
Annual enforcement reports published in human- and machine-readable formats and FTC-enforced.
Section-by-Section Breakdown
Short Title
Sets forth the name of the act as the Transparency in Enforcement, Restricting, and Monitoring of Services Act (TERMS Act). The title signals the bill’s focus on transparency and moderation practices by online service providers.
Purpose
Articulates the goal: to ensure consumers and organizations have sufficient information about how online service providers set business standards and make decisions to restrict access, with the aim of supporting informed purchasing and a competitive marketplace.
Definitions
Defines the Commission as the FTC, defines ‘online service provider’, and clarifies terms such as ‘restrict’ and ‘user’ to determine the scope of coverage. The definitions establish who must comply and who benefits from the transparency and notice requirements.
Disclosure of Acceptable Use Policies
Requires providers to publicly disclose an acceptable-use policy within 180 days, detailing prohibited acts, enforcement mechanisms (including third-party involvement), appeal rights or the lack thereof, off-platform considerations, and how the notice interacts with the policy. This section sets the substantive content providers must publish.
Advance Written Notice Prior to Termination or Suspension
Mandates advance written notice before restricting a user, including the conduct that triggered the restriction, how it violated the policy, and whether an appeal is available and how to pursue it. It provides a seven-day good-faith-notice window and allows public disclosure of the notice if the user consents.
Reporting Requirements
Requires annual enforcement reporting in accessible formats, detailing alerts of potential violations, the actions taken (termination, limitation, or suspension), appeal counts and outcomes, and categorization by policy provision. Reports must be machine-readable and open-licensed.
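To make the machine-readable requirement concrete, a report of this kind might be published as structured JSON along the following lines. This is a minimal illustrative sketch: the field names, categories, and counts are assumptions for demonstration, not a schema drawn from the bill's text.

```python
import json

# Hypothetical machine-readable annual enforcement report.
# All field names and figures below are illustrative assumptions,
# not a format specified by the TERMS Act.
report = {
    "reporting_year": 2024,
    # Sources of alerts of potential violations
    "alerts_by_source": {
        "user_flag": 1250,
        "automated_detection": 980,
        "third_party": 120,
    },
    # Actions taken: termination, limitation, or suspension
    "actions_taken": {
        "terminated": 75,
        "limited": 310,
        "suspended": 140,
    },
    # Appeal counts and outcomes
    "appeals": {"filed": 90, "reversed": 22},
    # Categorization by policy provision
    "by_policy_provision": {
        "spam": 400,
        "harassment": 95,
        "impersonation": 30,
    },
}

print(json.dumps(report, indent=2))
```

A structured format like this would let regulators and researchers aggregate figures across providers without hand-parsing prose reports.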
Enforcement
Authorizes FTC enforcement under the FTC Act, with rules and guidance to assist providers in compliance. It preserves existing authority and allows guidance to inform enforcement while ensuring it does not bind providers beyond the statute.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Individual users and business customers gain clearer expectations about restrictions and a path to appeal.
- Online platforms benefit from standardized practices that reduce ad-hoc moderation disputes and increase user trust.
- Regulators and policymakers gain access to systematic enforcement data through annual reports, aiding oversight and potential rulemaking.
- Digital rights groups and researchers can analyze platform practices through transparent disclosures and open data.
Who Bears the Cost
- Online service providers incur costs to publish the required policies and maintain annual reporting systems.
- Smaller platforms with limited compliance resources bear a higher relative burden than larger incumbents.
- Some costs may arise from enabling the public disclosure option for notices and maintaining appeal processes or documentation systems (including third-party enforcement collaborations).
Key Issues
The Core Tension
The central dilemma is whether requiring explicit, publicly searchable policies and notices improves consumer protections and competition without imposing excessive regulatory costs or chilling effects on legitimate moderation.
The bill creates a clear framework for transparency and accountability, but it also imposes ongoing compliance costs that could be burdensome for smaller platforms. The open-records aspect of annual enforcement reporting invites scrutiny that might reveal moderation decisions that vary by provider and by jurisdiction.
Another tension is the balance between transparency and the risk of over-correction or chilling effects in moderation, particularly if reports or notices are publicized widely. Finally, the definition of “online service provider” is broad and could pull in a wide range of entities, raising questions about scope and implementation across diverse business models.