
Algorithmic Transparency and Choice Act requires platforms to give minors default non‑personalized feeds and disclosures

Sets a federal baseline: covered platforms must disclose how their recommendation systems work for minors and offer opt‑outs and an input‑transparent default, with violations enforceable by the FTC as unfair or deceptive practices.

The Brief

The Algorithmic Transparency and Choice Act mandates that any public-facing online platform using a personalized recommendation system must give minors clear notice the first time they interact with such a system, publish accessible terms describing how the system works, and provide user-facing options to switch algorithms or limit recommendation categories. Crucially, the bill requires platforms to set an "input‑transparent" algorithm as the default for minors—i.e., a system that does not use inferred or device-history user data unless the minor expressly supplies it.

The statute creates specific disclosure requirements (what inputs and features power the recommender, how user-specific data is collected or inferred, and which engagement metrics the system optimizes), preserves trade‑secret and privileged protections, grants the Federal Trade Commission enforcement authority by treating violations as unfair or deceptive practices, and preempts state laws that would cover the same requirements. The result is a clear compliance mandate for product, engineering, legal, and privacy teams at platforms that host user-generated content and target or serve minors.

At a Glance

What It Does

The bill requires covered online platforms to present a clear notice to minors when they first encounter a personalized recommendation system, to publish plain-language disclosures in their terms about inputs and optimization goals, and to provide toggles that let minors switch between a personalized recommender and an input‑transparent algorithm or limit categories of recommendations. It also mandates the input‑transparent algorithm as the default for minors.

Who It Affects

Public-facing websites and mobile apps that provide forums for user-generated content and employ personalized recommendation systems—social networks, video-sharing services, and similar platforms—are directly affected, as are their product, engineering, legal, and compliance teams. The Federal Trade Commission gets enforcement authority; state legislatures lose the ability to impose overlapping requirements.

Why It Matters

By making non‑personalized defaults and disclosure the norm for minors, the bill reshapes product design choices tied to engagement metrics and advertising. It creates a federal baseline that simplifies compliance across states but raises product, IP, and enforcement trade-offs platforms must manage.


What This Bill Actually Does

The Act draws a bright line around minors who create accounts on platforms that use recommendation engines: when a minor first interacts with a personalized recommendation system, the platform must surface a clear, conspicuous notice that personal data drives recommendations. Beyond that initial notice, platforms must maintain a plain‑language explanation in their terms and update it whenever there is a material change.

That explanation must describe which features, inputs, and parameters power the recommender; how user‑specific data is collected or inferred; any options available to modify a profile or opt out; and which quantities (for example, time on site or engagement metrics) the system is designed to optimize and their relative importance.
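One way a compliance team might track those required elements internally is a structured record like the following; this is a hypothetical sketch, and every type and field name is illustrative rather than statutory.

```typescript
// Hypothetical internal record tracking the disclosure elements the Act
// requires; all names here are illustrative, not language from the bill.
interface RecommenderDisclosure {
  featuresAndInputs: string[];     // features, inputs, and parameters powering the recommender
  collectionMethods: string[];     // how user-specific data is collected or inferred
  userControls: string[];          // profile-modification and opt-out options offered
  optimizationTargets: Array<{
    metric: string;                // e.g., time on site, clicks, shares
    relativeImportance: "primary" | "secondary";
  }>;
  lastMaterialChange: string;      // ISO date the terms explanation was last updated
}
```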

Operationally, the Act requires platforms to offer two user-facing controls for minors: a simple switch between the platform's standard personalized recommender and an "input‑transparent" algorithm, and an option to limit the kinds of recommendations the system will surface. The input‑transparent algorithm is defined so that it will not rely on inferred characteristics or the history of a device unless the minor expressly supplies particular inputs (search terms, saved preferences, or precise geolocation supplied by the user).
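In code, that statutory line might look something like the following minimal TypeScript sketch; the signal taxonomy and filtering function are hypothetical illustrations, not an implementation the Act prescribes.

```typescript
// Hypothetical sketch: in input-transparent mode, only signals the minor
// expressly supplied may feed the recommender; inferred characteristics
// and device history are dropped.
type Signal =
  | { kind: "search_term"; expresslySupplied: true; value: string }
  | { kind: "saved_preference"; expresslySupplied: true; value: string }
  | { kind: "precise_geolocation"; expresslySupplied: true; lat: number; lon: number }
  | { kind: "inferred_interest"; expresslySupplied: false; value: string }
  | { kind: "device_history"; expresslySupplied: false; urls: string[] };

function permittedSignals(signals: Signal[], inputTransparentMode: boolean): Signal[] {
  if (!inputTransparentMode) return signals; // standard personalized mode
  // Keep only expressly supplied inputs (search terms, saved preferences,
  // user-supplied precise geolocation).
  return signals.filter((s) => s.expresslySupplied);
}
```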

The statute also carves out common safety and security systems: spam filters, fraud detection, and similar tools are not treated as covered recommenders.

Enforcement is delegated to the Federal Trade Commission by treating violations as unfair or deceptive acts or practices under the FTC Act, giving the agency its usual investigative and remedial toolkit. The bill protects platforms from having to disclose trade secrets, privileged materials, or confidential business information when making the required disclosures.

Finally, the statute contains an express federal preemption clause: states may not enact or enforce laws that cover the same disclosure and choice requirements described in the Act.

The Five Things You Need to Know

1

The Act takes effect one year after enactment, giving covered platforms a one‑year window to implement UI and terms changes.

2

It requires platforms to set an input‑transparent algorithm as the default for any minor who registers an account or creates a profile.

3

The statute defines "input‑transparent algorithm" to exclude use of device history or inferred attributes; those remain off limits for determining recommendations unless the minor expressly supplies the specific data for that purpose (e.g., a search term or saved preference).

4

Violations are treated as unfair or deceptive acts or practices and enforced by the FTC with the agency's full investigatory and remedial powers under the FTC Act.

5

The law expressly preserves protection for trade secrets, confidential business information, and privileged materials and preempts state laws that would impose overlapping requirements.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 1

Short title

Gives the Act its official short title: the "Algorithmic Transparency and Choice Act." The section is purely organizational, but it signals the statute's focus on transparency and choice for recommendation systems used with minors.

Section 2(a)

Notice, disclosures, and user controls for minors

This subsection contains the operative obligations. Platforms must show a clear, conspicuous notice at a minor’s first interaction with a personalized recommendation system and maintain an accessible terms explanation that is updated on material change. Practically, compliance means product teams must implement a first‑use banner or modal, revise terms of service and privacy disclosures into plain language, and expose UI toggles that let minors switch algorithms or narrow recommendation categories. The clause requiring disclosure of the quantities the system optimizes forces platforms to state whether the recommender prioritizes time on site, clicks, shares, or other engagement signals and to describe their relative weight in general terms.
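As a rough sketch of the first‑use notice piece of that checklist, assuming a hypothetical session object and notice renderer (none of these names appear in the bill):

```typescript
// Hypothetical sketch: gate a minor's first contact with the personalized
// recommender behind a clear, conspicuous notice.
interface Session {
  isMinor: boolean;
  hasAcknowledgedRecommenderNotice: boolean;
}

async function beforeFirstRecommenderView(
  session: Session,
  showNotice: () => Promise<void> // renders the clear, conspicuous notice
): Promise<void> {
  if (session.isMinor && !session.hasAcknowledgedRecommenderNotice) {
    await showNotice(); // explains that personal data drives recommendations
    session.hasAcknowledgedRecommenderNotice = true; // persist server-side in practice
  }
}
```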

Section 2(a)(1)(C)–(D) and (a)(2)

Input‑transparent default and switching mechanics

The Act requires an "input‑transparent" algorithm as the default setting for minors and mandates an easy way to switch to it. For engineers, this means maintaining two behaviorally distinct pipelines or feature‑flagged variants: the platform's existing personalized recommender and a version that omits inferred features and device‑history signals for minors unless they opt to provide specific inputs. Product teams must also build controls to limit recommendation categories, which can take the form of filters or preference settings. Platforms should document how the toggle affects ranking, content diversity, and personalization features to meet the disclosure requirement.
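One way to picture that routing layer, under the assumption of two ranking functions and a per‑user opt‑in flag (all names hypothetical):

```typescript
// Hypothetical sketch: minors default to the input-transparent pipeline
// unless they expressly opt in; category limits apply in either mode.
interface Item { id: string; category: string; }

interface RecRequest {
  isMinor: boolean;
  personalizedOptIn?: boolean;     // undefined until the user flips the toggle
  blockedCategories: Set<string>;  // categories the minor chose to limit
}

declare function personalizedRank(req: RecRequest): Item[];      // existing recommender
declare function inputTransparentRank(req: RecRequest): Item[];  // expressly supplied inputs only

function recommend(req: RecRequest): Item[] {
  // Adults keep the personalized default; minors must expressly opt in.
  const usePersonalized = req.isMinor ? req.personalizedOptIn === true : true;
  const ranked = usePersonalized ? personalizedRank(req) : inputTransparentRank(req);
  return ranked.filter((item) => !req.blockedCategories.has(item.category));
}
```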

Section 2(b)

FTC enforcement and penalties

The subsection makes violations actionable as unfair or deceptive acts or practices under the Federal Trade Commission Act, bringing the full suite of FTC investigative powers, civil penalties, and remedial authorities to bear. That means the FTC can investigate, seek injunctive relief, and assess civil penalties consistent with its statutory authority; platforms should expect the agency to interpret disclosure adequacy and the functionality of the input‑transparent option in enforcement actions.

Section 2(c)–(e)

Limitations, definitions, and preemption

The statute limits disclosures so platforms are not forced to reveal trade secrets, confidential business information, or privileged materials. It includes detailed definitions: a "minor" is an individual under 18, and a "personalized recommendation system" is one that uses user‑specific data not expressly supplied by the user. It also draws lines around what counts as expressly provided data, listing search terms, saved preferences, user‑supplied precise geolocation, and, as its own category, current approximate geolocation. The Act also explicitly preempts state laws that would cover the same requirements, producing a single federal compliance baseline but foreclosing state‑level variants.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Minors and caregivers — minors who register accounts gain a default non‑personalized experience and UI controls to limit recommendation categories; caregivers obtain clearer disclosures to assess platform behavior toward children.
  • Child‑safety researchers and advocates — required disclosures about inputs and optimization metrics create auditable, plain‑language data points researchers can use to study algorithmic effects on youth.
  • Compliance and legal teams at platforms — a single federal standard and explicit preemption reduce the complexity of navigating a patchwork of state obligations and clarify what the FTC will enforce.

Who Bears the Cost

  • Platform product and engineering teams — must build and maintain separate algorithmic pipelines or feature flags, update user interfaces for first‑use notices and toggles, and revise terms and documentation.
  • Ad‑supported businesses and publishers — defaulting minors to non‑personalized feeds may reduce engagement signals that underpin targeted advertising and content monetization.
  • Smaller platforms and startups — the engineering and legal work to comply (UI, terms, audit trails, and potential FTC engagement) creates nontrivial overhead that scales less easily for small operators, even though the law defines covered platforms by mechanism rather than revenue.

Key Issues

The Core Tension

The central dilemma is balancing meaningful transparency and user choice for minors against the need to protect platform trade secrets and preserve the signals that power content moderation and revenue models. Requiring non‑personalized defaults and detailed disclosures protects minors and increases accountability, but it may undermine the technical inputs platforms rely on for safety and personalization, or force companies to reveal or over‑describe proprietary systems.

Several implementation ambiguities and trade‑offs could complicate compliance. The statutory definitions hinge on distinctions between data "expressly provided" by a minor and data inferred or derived by platforms; in practice, feature engineering often blends explicit inputs with inferred features (demographic inferences, behavioral clusters, or device‑level signals).

Determining what counts as an impermissible inference versus permissible engineering could generate litigation and shifting enforcement expectations. Similarly, the requirement to disclose which "quantities" the system optimizes and their "relative importance" is useful in principle but vague in practice: platforms may struggle to give meaningful, non‑misleading summaries without revealing proprietary ranking logic, and high‑level descriptions may be too vague to satisfy regulators.

The trade‑secret carve‑out protects core IP, but tension remains: regulators and researchers want informative disclosures, while companies want to avoid revealing proprietary model architecture or signal weightings. The law’s preemption clause simplifies federal compliance but prevents states from experimenting with stronger protections; that could generate political pushback and leave gaps if the federal standard proves insufficient.

Finally, the input‑transparent default may reduce signals that help platforms surface harmful content quickly, so platforms might respond by shifting moderation burdens onto human reviewers or by designing workarounds that preserve safety signals while claiming compliance—both outcomes impose cost and operational complexity.
