
SCREEN Act: Age verification for online platforms

Requires covered interactive platforms to use technology-based verification measures to verify user age and block minors from content deemed harmful to them.

The Brief

The SCREEN Act would require certain interactive platforms to adopt and operate technology verification measures to determine whether a user is a minor and to prevent minors from accessing content that is harmful to them. The measure would mandate a process for age verification, forbid relying solely on user self-declaration, and obligate platforms to publicly disclose their verification processes.

It also permits use of third-party verifiers while keeping the platform ultimately responsible for compliance and data security. Enforcement would be through the Federal Trade Commission, with audits, guidance, and a future GAO review.

At a Glance

What It Does

Starting one year after enactment, covered platforms must implement technology verification measures to verify user age and to block access by minors to content that is harmful to minors. The measures may rely on various technologies and must be publicly disclosed.

Who It Affects

Large interactive platforms operating in interstate or foreign commerce that host or make available content (including pornographic content) and earn revenue from that content; their users; and third-party verification providers.

Why It Matters

This represents a shift from blocking and filtering software toward verifiable age checks, aiming to reduce minors' exposure while imposing a defined compliance framework and a formal enforcement regime on platforms.


What This Bill Actually Does

The bill creates a federal framework requiring major interactive platforms to verify users' ages and prevent minors from accessing content that is harmful to them. It defines what counts as a covered platform and sets out the mechanisms platforms may use to verify age, including allowing third-party providers while holding the platform responsible for compliance and data security.

The requirement takes effect one year after enactment, with a mandate to publicly disclose the verification process and to apply verification to users' IP addresses unless the user is determined to be outside the United States. The FTC would enforce the rules through audits and penalties, with ongoing rulemaking guidance and a later GAO evaluation of effectiveness, compliance, data-security practices, and broader side effects.
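
As a purely illustrative sketch, the IP-address scoping might reduce to a check like the one below. Nothing here is specified by the bill: `lookup_country` is a placeholder for whatever geolocation service a platform actually uses, and the US-only gate is an assumption drawn from the bill's summary.

```python
# Hypothetical sketch of the bill's IP-address scoping: verification applies
# to a user's IP address unless the user is determined to be outside the US.
# lookup_country is a stand-in for a real geolocation database or API.

def lookup_country(ip_address: str) -> str:
    """Placeholder geolocation: return an ISO country code for an IP."""
    # A real platform would query a geolocation service here.
    return "US"

def verification_required(ip_address: str) -> bool:
    """True if age verification must be applied to this request."""
    return lookup_country(ip_address) == "US"

if __name__ == "__main__":
    for ip in ("203.0.113.7", "198.51.100.22"):
        print(ip, "->", "verify" if verification_required(ip) else "skip")
```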

The bill anticipates tensions around privacy, user experience, cross-border enforcement, and potential overreach, and it establishes a severability clause to preserve remaining provisions if any part is struck down.

The Five Things You Need to Know

1. Platforms must implement technology-based age verification within one year of enactment.

2. Self-declaration that a user is not a minor is insufficient on its own for compliance.

3. Users' IP addresses are subject to verification unless the user is determined to be outside the United States.

4. Third-party verification providers may be used, but platforms remain liable for compliance.

5. FTC audits, guidance, and a GAO report are required to assess effectiveness and impact.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 2

Findings and Sense of Congress

This section lays out the rationale for the act. It cites historical attempts to shield minors from online sexual content and notes the perceived ineffectiveness of blocking and filtering software. It frames age verification as a necessary evolution in policy and asserts that age-verification technology, if implemented narrowly and efficiently, is a permissible means to protect minors.

Section 3

Definitions

Key terms are defined: minor, child pornography, information content providers and interactive computer services, and, crucially, technology verification measure and verification data. It also defines what qualifies as ‘harmful to minors’ and establishes that verification data can include IP-derived information, while clarifying constraints on what may be collected and retained.
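
As one hypothetical illustration of those collection-and-retention constraints, a platform might derive only a minimal non-minor flag from verification inputs and discard the rest. The 18-year threshold and the record fields below are assumptions for the sketch, not provisions of the bill.

```python
# Hypothetical data-minimization pattern: retain only the derived non-minor
# flag and the check date, never the underlying verification inputs.
# The under-18 threshold is an assumed definition of "minor".
from datetime import date

def derive_verification_record(user_id: str, birth_date: date) -> dict:
    """Keep only what compliance needs: user, non-minor flag, check date."""
    today = date.today()
    # Subtract one year if this year's birthday has not yet occurred.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    # birth_date itself is not stored; only the derived flag is retained.
    return {
        "user_id": user_id,
        "non_minor": age >= 18,
        "checked": today.isoformat(),
    }

if __name__ == "__main__":
    print(derive_verification_record("user-123", date(2000, 5, 17)))
```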

Section 4

Technology Verification Measures

This is the core requirement. Beginning one year after enactment, covered platforms must adopt technology verification measures to determine whether users are minors and to prevent minors from accessing content harmful to minors. It allows platform discretion in choosing specific verification methods, provided they meet the statutory criteria and prevent minors from seeing prohibited content.
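
To make that compliance shape concrete, here is a hypothetical sketch of such a gate. The `ThirdPartyVerifier` interface and the explicit refusal to honor self-declared age are illustrative assumptions about how a platform might satisfy the requirement; the statute specifies outcomes, not implementations.

```python
# Hypothetical platform-side age-verification gate. Everything below is an
# illustrative assumption; the bill mandates outcomes (verify age, block
# minors, no reliance on self-declaration alone), not this design.
from dataclasses import dataclass

@dataclass
class VerificationResult:
    is_minor: bool
    method: str  # e.g. "document_check" or "third_party"

class ThirdPartyVerifier:
    """Stand-in for an external verification provider; the platform
    remains responsible for compliance even when delegating this step."""

    def verify(self, user_id: str) -> VerificationResult:
        # A real provider would run a document or data check here.
        return VerificationResult(is_minor=False, method="third_party")

def can_access_restricted_content(
    user_id: str,
    self_declared_adult: bool,  # deliberately ignored: insufficient alone
    verifier: ThirdPartyVerifier,
) -> bool:
    """Gate access to content harmful to minors on a real verification."""
    result = verifier.verify(user_id)
    return not result.is_minor

if __name__ == "__main__":
    print(can_access_restricted_content("user-123", True, ThirdPartyVerifier()))
```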

Section 5

Consultation Requirements

The Commission must consult with technical experts, child-safety advocates, privacy and data-security specialists, and providers of verification technologies. The aim is to inform standards and metrics used to determine non-minor status and to shape enforcement approaches.

Section 6

Commission Requirements

The FTC is charged with audits of covered platforms, disclosure of audit terms, and establishing submission processes to demonstrate compliance. It must issue guidance within 180 days, though such guidance cannot bind third parties or unreasonably constrain enforcement. It also outlines how enforcement actions will be pursued and the relationship to existing FTC authorities.

Section 7

Enforcement

Violations are treated as unfair or deceptive practices under FTC law. The Commission has the standard enforcement powers of the FTC and may impose penalties consistent with existing statutory penalties. This section preserves the FTC’s broader authority to enforce related provisions.

Section 8

GAO Report

Two years after compliance begins, the Comptroller General must deliver a comprehensive report evaluating effectiveness, compliance rates, data security, and societal impacts, plus recommendations for improvements to both Congress and the Commission.

Section 9

Severability

If any provision is struck down as unconstitutional, the remainder of the act stays in force.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Minors receive reduced exposure to online content that is deemed harmful, due to enforced age verification and blocked access to prohibited material.
  • Parents and guardians gain assurances about their children’s online safety and the platforms’ accountability.
  • Large interactive platforms operating in interstate or foreign commerce gain a clear compliance framework and a potential reduction in claims over unlawful access by minors.
  • Child-safety advocacy organizations and privacy-conscious consumer groups gain procedural access to standards and audits that support safety and privacy.
  • The FTC gains a stronger enforcement mandate with audit visibility and public guidance.
  • Verification technology providers may see increased demand for compliant solutions.

Who Bears the Cost

  • Platforms will incur ongoing development, privacy, security, and auditing costs to implement and maintain verification measures.
  • Platforms may face data-security responsibilities and potential liability for breaches of verification data.
  • Third-party verification providers will bear integration and service costs but carry no guarantee of universal adoption.
  • Smaller platforms with limited resources may struggle to meet ongoing compliance demands without scale advantages.
  • Government agencies may see increased enforcement costs tied to audits and ongoing oversight.

Key Issues

The Core Tension

The central dilemma is whether robust age-verification technology can protect minors without introducing disproportionate privacy risks or creating barriers to legitimate access, particularly for users outside the United States or in regions with limited verification infrastructure.

The act shifts policy from blocking and filtering toward verifiable age checks, raising questions about privacy, data handling, and the risk of false positives or location misclassification. The reliance on IP addresses and potentially sensitive verification data must be weighed against existing privacy protections and data-security capabilities.

Operationally, platforms must balance user experience with verification rigor, and there is potential for cross-border enforcement complications and strategic avoidance by users located outside the United States. The scope and effectiveness of audits, as well as the impact on smaller platforms, remain areas for careful monitoring and possible legislative refinement.
