Codify — Article

Safe Messaging for Kids Act (H.R.6257) bans ephemeral DMs for minors, mandates parental DM controls

Federal bill would prohibit ephemeral messaging for known minors, require accessible parental direct‑messaging controls, impose app‑store warnings, and preserve strong encryption.

The Brief

The Safe Messaging for Kids Act of 2025 (SMK Act, H.R.6257) would bar social media providers from offering ephemeral messaging features to any user the provider knows (or would know but willfully disregards) is a minor, and require platforms to provide parents with clear, usable controls to manage direct messaging for those minors. It also directs app stores to display warnings when a parent requires verifiable parental consent before a minor may download a messaging-enabled app.

This is a product‑design and compliance bill more than a criminal statute: it imposes behavioral requirements on platforms, creates an enforcement path through the Federal Trade Commission and state attorneys general, preempts state laws, and includes explicit language to avoid forcing platforms to weaken end‑to‑end encryption. For product, compliance, and legal teams at platforms and app stores, the bill would require technical changes, new user‑flows, and documentation of parental‑consent mechanisms.

At a Glance

What It Does

The bill prohibits social media platforms from providing ephemeral messaging features to covered users (minors or users the platform should have known were minors) and requires platforms to offer parental direct‑messaging controls that a parent can activate via verifiable parental consent. It also requires app stores to show a clear warning to parents when a download involves an app with direct messaging and a parental‑consent gate is in place.

Who It Affects

Major social media platforms and any apps with integrated direct‑messaging features, app stores that distribute those apps, parents and guardians of minors, the Federal Trade Commission (FTC), and state attorneys general who may bring parens patriae suits.

Why It Matters

The bill sets a federal product safety floor for how platforms must treat minors’ messaging: design changes (no ephemeral DMs for minors), parental gatekeeping tools, and compliance obligations. It also signals regulators will treat violations as unfair or deceptive practices enforceable by the FTC, while affirming that encryption protections remain intact.


What This Bill Actually Does

The SMK Act defines the key players and features it regulates. A “covered user” is anyone the platform actually knows is a minor or would know is a minor but for willful disregard; “minor” is under age 17.

An “ephemeral messaging feature” is a messaging function that makes sent content effectively inaccessible—for example, disappearing after a recipient views it, after a set time, or when exiting a chat—but the bill excludes manual deletions and ordinary caching. The bill targets social media platforms as defined by a combination of profile, network, user‑generated content, and persistent interpersonal communication features.

The core operational rule is simple: platforms may not offer ephemeral messaging features to covered users. Separately, platforms that offer any direct‑messaging features to covered users must provide parental direct‑messaging controls that parents can activate and manage through verifiable parental consent (the bill ties that concept to COPPA’s definition).
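As an illustration of how the coverage rule and the core prohibition might map onto product logic, here is a minimal Python sketch; all names (`User`, `ignored_minor_signals`, and so on) are hypothetical and not drawn from the bill text:

```python
from dataclasses import dataclass
from typing import Optional

MINOR_AGE_CUTOFF = 17  # the bill defines "minor" as under age 17

@dataclass
class User:
    age: Optional[int]            # age the platform actually knows, if any
    ignored_minor_signals: bool   # signals of minority the platform disregarded

def is_covered_user(user: User) -> bool:
    """A covered user is one the platform actually knows is a minor,
    or would know but for willful disregard of available signals."""
    actually_known_minor = user.age is not None and user.age < MINOR_AGE_CUTOFF
    willful_disregard = user.age is None and user.ignored_minor_signals
    return actually_known_minor or willful_disregard

def ephemeral_features_allowed(user: User) -> bool:
    # Core operational rule: no ephemeral messaging features for covered users.
    return not is_covered_user(user)
```

Actual knowledge and willful disregard are fact-intensive legal questions; the point of the sketch is only that the feature gate must key off what the platform knew or ignored, not merely a self-declared age.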

Those controls must be clearly accessible (via the child’s profile settings and any parental portal), user‑friendly, and prominently offered to likely parents. Functionally, the controls must allow parents—by default unless they opt otherwise—to receive notifications about direct‑message requests from unapproved contacts and to approve or deny those requests before messaging can occur, manage an approved‑contacts list, be alerted to age changes on a profile, and disable direct messaging entirely for the child.

The bill creates a strict default for younger children: platforms must disable direct messaging for profiles of covered users under 13 unless and until a parent provides verifiable consent to enable it.

Platforms must also take reasonable measures to prevent children from easily bypassing parental controls, and they may not degrade unrelated features as a result of parental controls. App stores have a separate duty to display a clear, conspicuous warning to a parent when a minor attempts to download or purchase an app that includes direct messaging and the parent has required verifiable consent for that download.

Enforcement is delegated to the Federal Trade Commission by treating violations as unfair or deceptive acts or practices under the FTC Act; the FTC gets the same enforcement tools and remedies it has today.

States may bring parens patriae suits on behalf of residents, but the statute gives the FTC intervention rights and bars state actions against defendants already named in a pending federal action. The Act also forbids construing its requirements to force platforms to weaken or bypass encryption and directs implementation to preserve privacy and security.

Finally, the Act preempts state laws on the same subject and phases in compliance: general effect 180 days after enactment, one year to implement parental controls, and 18 months for app‑store warnings.

The Five Things You Need to Know

1. The bill bans “ephemeral messaging features”—functions that render sent messages inaccessible after viewing or a set time—for any covered user (a user the platform knows is a minor, or would know absent willful disregard).

2. Platforms must provide parental direct‑messaging controls, accessible from a child’s profile and any parental portal, that let parents approve or deny incoming DM requests from unapproved contacts and manage an approved‑contacts list.

3. As a default, platforms must disable direct messaging for covered users under age 13; a parent may enable DMs only by providing verifiable parental consent (the bill references the COPPA definition).

4. Violations are enforced by the FTC as unfair or deceptive practices with the FTC’s full powers; states can sue as parens patriae but may be limited while a federal action against the same defendants is pending.

5. The statute explicitly prohibits construing the bill to require breaking, weakening, or ensuring the ability to decrypt strong encryption, and directs that implementation must preserve user privacy and security.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 2

Definitions that shape coverage and technical scope

This section’s definitions determine who and what the rest of the Act applies to: a covered user includes anyone the platform actually knows is a minor or would know but willfully disregards, and a minor is under 17. The definition of ephemeral messaging is functional (messages made permanently inaccessible by design) and expressly excludes manual deletions and routine caching—an important line for engineers and product teams deciding which features to change or leave alone.

Section 3

Prohibition on ephemeral messaging for covered users

Section 3 forbids social media providers from offering ephemeral messaging features to covered users. Practically, this means disappearing messages, view‑once media, and auto‑delete after exit or a timer cannot be enabled for profiles a platform knows are minors. Legal teams will need to map product variants to the statutory definition and document how features are disabled for covered users.

Section 4

Parental direct messaging controls—design, defaults, and anti‑circumvention

Section 4 requires easily accessible, usable parental controls with specific functional minima: parental notification of unapproved‑contact requests with approve/deny flows, an approved‑contacts manager, alerts on profile age changes, and the ability to turn off direct messaging. It mandates user‑friendly placement (profile settings and any parental portal), prominent presentation to likely parents, and reasonable anti‑circumvention measures. The most consequential product rule is the default disable of direct messaging for under‑13 accounts, which can be reversed only after verifiable parental consent—platforms must implement robust consent flows and logging to show compliance.
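The under‑13 default and the approve/deny gate can be sketched as simple product logic; this is a hypothetical Python illustration (names like `ChildProfile` and `parental_consent_on_file` are invented, not from the statute):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChildProfile:
    age: int
    parental_consent_on_file: bool = False   # verifiable parental consent (COPPA-style)
    dm_disabled_by_parent: bool = False      # parent used the "turn off DMs" control
    approved_contacts: List[str] = field(default_factory=list)

def direct_messaging_enabled(profile: ChildProfile) -> bool:
    """Under-13 profiles default to DMs off until verifiable parental
    consent is on file; a parent can also disable DMs at any age."""
    if profile.dm_disabled_by_parent:
        return False
    if profile.age < 13 and not profile.parental_consent_on_file:
        return False
    return True

def request_needs_parent_approval(profile: ChildProfile, sender_id: str) -> bool:
    # DM requests from unapproved contacts trigger a parental notification
    # and an approve/deny flow before any messaging can occur.
    return sender_id not in profile.approved_contacts
```

A compliance-grade implementation would also persist an audit log of consent grants and revocations, which this sketch omits; that record is what demonstrates compliance if the FTC or a state comes asking.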

Section 5

App store warning requirement

Section 5 places a duty on app stores to show a clear, conspicuous warning to a parent when a covered user tries to download or purchase an app with direct messaging and the parent has set a consent requirement. App stores must build UI hooks into parental‑consent settings and ensure the warning triggers reliably—this creates engineering and UX work beyond platform changes, and potentially new parental consent APIs between stores and developers.
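The warning trigger itself reduces to a conjunction of three conditions; a minimal sketch (hypothetical function and parameter names, not drawn from the bill):

```python
def must_show_parental_warning(app_has_direct_messaging: bool,
                               downloader_is_minor: bool,
                               parent_requires_consent: bool) -> bool:
    """App-store duty under Section 5: warn the parent when a minor tries to
    download or purchase a DM-capable app and the parent has required
    verifiable consent for that download."""
    return (app_has_direct_messaging
            and downloader_is_minor
            and parent_requires_consent)
```

The hard engineering problem is not this predicate but populating its inputs reliably: `app_has_direct_messaging` implies accurate listing metadata, which likely means new disclosure requirements or consent APIs between stores and developers.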

Section 6

Enforcement via the FTC and state parens patriae actions

The Act treats violations as unfair or deceptive acts under the FTC Act, giving the FTC its usual investigative and remedial toolkit. States may bring civil actions on behalf of residents, but must notify the FTC and face limits while a federal action against the same defendants is pending. Practically, enforcement will rely on FTC rulemaking or complaint investigations plus state litigation, making coordination and evidentiary standards important for both regulators and defendants.

Section 7

Encryption protections and implementation constraints

Section 7 is a shielding clause: the Act cannot be read to require platforms to decrypt communications, to prevent end‑to‑end encryption, or to build capabilities that weaken user security. It also tells agencies and companies to implement requirements in a way that preserves encryption and avoids systemic monitoring. That language limits certain enforcement approaches and forces regulators and platforms to find compliance methods that do not rely on breaking encryption.

Section 8

Federal preemption of state law

Section 8 preempts state and local laws “relating to” the Act’s subject matter. For states that have considered or adopted their own messaging or parental‑control mandates, this removes legal variance and centralizes the standard federally, but it also prevents states from imposing different or stronger protections on platforms operating within their borders.

Section 10

Effective date and compliance deadlines

The Act generally takes effect 180 days after enactment, but platforms get one year to implement parental controls and 18 months to implement app‑store warnings. Those windows matter for product roadmaps, procurement, and vendor contracting: companies must plan staged rollouts, testing, and recordkeeping to demonstrate compliance within the statutory windows.
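To make the phase‑in concrete, here is a small sketch that computes the three deadlines from a hypothetical enactment date (the date is illustrative only; nothing in the bill fixes it):

```python
from datetime import date, timedelta

def add_months(d: date, months: int) -> date:
    """Calendar-month addition, clamping to the end of shorter months."""
    y, m = divmod(d.month - 1 + months, 12)
    y += d.year
    m += 1
    leap = y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)
    last_day = [31, 29 if leap else 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31][m - 1]
    return date(y, m, min(d.day, last_day))

enactment = date(2025, 6, 1)                         # hypothetical enactment date
general_effective = enactment + timedelta(days=180)  # general effect: 180 days
parental_controls_due = add_months(enactment, 12)    # parental controls: 1 year
app_store_warnings_due = add_months(enactment, 18)   # app-store warnings: 18 months
```

Note that the 180‑day general effective date lands well before the parental‑control and app‑store deadlines, so the ephemeral‑messaging prohibition bites first on any realistic roadmap.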


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Parents and legal guardians — gain explicit, statutory tools to control who can DM their minor children, including notifications, approval flows, and the ability to disable DMs entirely.
  • Younger minors (especially under 13) — receive an automatic default that disables direct messaging, reducing exposure to unmoderated private interactions and ephemeral content that can’t be reviewed.
  • Child welfare and safety organizations — get a uniform federal standard that platforms must follow, potentially simplifying outreach and coordination with platforms on safety features and reporting.
  • Vendors of parental‑control and consent technologies — will likely see new demand as platforms build verifiable parental consent flows, parental portals, and contact‑management features.
  • Privacy and security advocates — retain an explicit statutory commitment that platforms are not required to weaken encryption, reducing the risk of backdoor proposals tied to enforcement.

Who Bears the Cost

  • Social media platforms and app developers — must change product designs (remove or disable ephemeral messaging for minors), implement parental‑consent UIs, add anti‑circumvention measures, update policies, and maintain compliance logs.
  • App stores — must add UI warnings and potentially consent‑integration hooks, which requires engineering, QA, and policy work across millions of listings and localized store fronts.
  • Parents — take on the ongoing monitoring and administrative burden of approving contacts and managing settings; families with limited time or tech literacy may see less effective protection, or simply more friction.
  • Small and niche platforms — face disproportionate compliance costs to rework messaging features, which could push some to limit features or exit certain markets rather than invest in bespoke parental‑control systems.
  • State governments — lose the ability to impose supplemental or alternative requirements on platforms within their jurisdictions because the Act preempts state law.

Key Issues

The Core Tension

The central dilemma pits protecting minors from transient, hard‑to‑audit messaging against preserving privacy, encryption, and adolescent autonomy. Effective parental controls and anti‑circumvention measures risk pushing platforms toward more invasive monitoring or burdensome identity checks, yet the statute forbids weakening encryption. Regulators, platforms, and technologists must therefore reconcile child safety with technical and privacy constraints, and no available path is free of risk.

The bill leaves several implementation details unresolved that can create operational and enforcement difficulties. “Covered user” hinges on a platform’s actual knowledge or willful disregard that someone is a minor; litigating willful disregard will require showing what a platform knew and ignored. Similarly, verifiable parental consent is referenced to COPPA, but COPPA‑style mechanisms (credit card verification, government ID, or knowledge‑based checks) vary in usability and privacy tradeoffs; platforms must balance strong proof with not collecting excessive parent data.

The Act’s insistence that encryption not be weakened creates tension with enforcement and anti‑circumvention requirements. Platforms must prevent minors from bypassing parental controls, but the statute bars creating capabilities that undermine encryption—forcing technical solutions that work at the UX or account level rather than by monitoring message content.

Enforcement is assigned to the FTC and the states, but the practical deterrent depends on investigatory resources, proof standards, and remedies; the Act specifies no particular civil penalties and no private right of action, relying instead on the FTC Act framework. Finally, preemption simplifies the federal floor but removes state experimentation, and the one‑year and 18‑month compliance windows will pressure product schedules while leaving open whether smaller players can realistically meet the requirements without disproportionate cost.
