Codify — Article

Sammy’s Law: third‑party safety APIs and rules for large social platforms

Creates an FTC‑regulated process for parents or children to delegate account controls to accredited safety apps, with strict data, residency, and audit requirements.

The Brief

Sammy’s Law establishes a legal framework that lets a child (13+) or that child’s parent or guardian delegate account‑management rights on qualifying large social media services to an accredited third‑party safety software provider via real‑time application programming interfaces (APIs). The bill sets deadlines for API availability, defines which platforms are covered, and ties registration, data‑handling, and security obligations for third‑party providers to Federal Trade Commission (FTC) oversight.

The measure matters because it converts a parental control idea into a regulatory routine: platforms must enable technical access, and third‑party safety vendors must meet U.S.‑centric security and audit rules before they can operate. That combination reshapes how child safety tools access platform data, reallocates operational burdens among platforms, vendors, and the FTC, and raises tradeoffs between child protection, privacy, and market access for smaller vendors.

At a Glance

What It Does

The bill directs covered platforms to publish and maintain real‑time APIs that allow a child or guardian to delegate the child’s account controls to a registered third‑party safety provider and to push user data in machine‑readable form to that provider. It limits what delegated managers can change (settings and protections) and requires platforms to support frequent secure transfers of data.

Who It Affects

Covered entities are 'large social media platforms' — services that allow sharing with people discovered through the service and that exceed 100 million monthly global users or $1 billion annual revenue. Directly affected actors include those platforms, parent/guardian and child users, third‑party safety software vendors that must register with the FTC, and the FTC itself as regulator and auditor.

Why It Matters

Sammy’s Law creates a uniform, national technical and regulatory path for safety vendors to operate with platform access — potentially expanding the market for parental‑control and safety tooling while imposing specific security, data‑residency, and audit obligations that could exclude non‑U.S. firms and raise compliance costs for platforms and vendors.


What This Bill Actually Does

The bill defines the covered universe tightly. A 'large social media platform' is a service that children may use, that lets children share images or content with users they meet through the service, and that either exceeds 100 million monthly global active users or generates over $1 billion in annual gross revenue (indexed for inflation).

Several categories are carved out (pure news sites, services focused on selling professional services or products, and certain messaging features that do not present the same open‑discovery risks).
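The bright-line portion of the coverage test can be sketched as a simple check. The threshold figures come from the bill text; the function name and the inflation-adjustment parameter are illustrative assumptions, since the statute leaves the indexing mechanics to implementation.

```python
# Sketch of the bright-line coverage test. Threshold figures come from
# the bill; the function name and inflation input are illustrative.

USER_THRESHOLD = 100_000_000        # monthly global active users
REVENUE_THRESHOLD = 1_000_000_000   # annual gross revenue, USD

def is_large_social_media_platform(monthly_active_users: int,
                                   annual_gross_revenue: float,
                                   inflation_multiplier: float = 1.0) -> bool:
    """Return True if either bright-line threshold is exceeded.

    The revenue threshold is adjusted annually for inflation; the
    multiplier stands in for whatever index is ultimately applied.
    """
    adjusted_revenue_threshold = REVENUE_THRESHOLD * inflation_multiplier
    return (monthly_active_users > USER_THRESHOLD
            or annual_gross_revenue > adjusted_revenue_threshold)
```

Note that the functional prongs of the definition (child use, open discovery, sharing) still gate coverage; the numeric test is only the easiest prong to evaluate.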

Covered platforms must publish third‑party‑accessible, real‑time APIs and provide any necessary developer information within 30 days of the law becoming effective for existing covered services or within 30 days of crossing the coverage thresholds for later entrants. When a delegation is made by a child (13+) or a parent/guardian, the platform must keep the APIs available to the approved third party until revocation, account deletion, the third party’s rejection, or a change in the third party’s required attestations.

Platforms must also implement reasonable policies and technical practices to secure transfers of user data, and they must disclose to the child (and to the parent or guardian, if the parent or guardian made the delegation) that a delegation exists, along with a summary of the data transferred and any subsequent changes.

The bill narrows what 'user data' and delegated management entail: user data means the items needed to maintain a profile, plus content created by or sent to the child while a delegation is in effect, and only for a rolling 30‑day window beginning when the content is created or received. Platforms must allow secure transfers of that data in a commonly used, machine‑readable format and may not limit transfer frequency to less often than once per hour.
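The two scope rules (content tied to an active delegation, and a rolling 30‑day window) compose into a simple filter. This is a minimal sketch under stated assumptions: the item representation and function names are hypothetical, not drawn from the bill.

```python
from datetime import datetime, timedelta, timezone

WINDOW = timedelta(days=30)  # rolling window from creation or receipt

def eligible_items(items, delegation_start, now):
    """Return the items a platform may transfer under the scope rule:
    created or received while the delegation is in effect, and no older
    than the rolling 30-day window measured back from 'now'.

    Each item is assumed to carry a timezone-aware 'timestamp'.
    """
    cutoff = now - WINDOW
    return [
        item for item in items
        if item["timestamp"] >= delegation_start
        and item["timestamp"] >= cutoff
    ]
```

Content predating the delegation, or older than 30 days, simply drops out of the transfer set; the statute grants no historical access beyond that.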

Delegated management is limited to actions that protect the child from harm — e.g., optimizing privacy, age, and marketing settings — not general moderation or platform governance beyond the child's account rights.

Third‑party safety software providers must register with the FTC before accessing the APIs. Registration requires affirmations and a security review demonstrating U.S. incorporation, absence of foreign control, limited use and disclosure of transferred data, domestic storage and processing of data, and prompt deletion timelines (generally within 14 days of receiving the data, with special handling for disclosed records and a 30‑day deletion window after cancellation of the third‑party account).

Registered providers must commission annual independent audits, submit audit reports to the FTC (with a public summary), and are subject to adverse actions — including denial, suspension, or de‑registration — for willful misconduct, gross negligence, material misrepresentations, or unresolved unusual audit findings. The FTC gains enforcement authority by treating violations as unfair or deceptive acts or practices under the FTC Act; it must also issue compliance and authentication guidance and perform biannual compliance assessments.

The law preempts contrary state laws that would impose their own API‑access requirements while preserving state consumer‑protection, tort, and anti‑fraud laws. The Act takes effect once the FTC issues its guidance required under the statute.

The Five Things You Need to Know

1

A platform meets the 'large social media' threshold if it has >100,000,000 monthly global active users or >$1,000,000,000 in annual gross revenue (adjusted annually for inflation).

2

Platforms must make APIs and related developer information available either within 30 days of the Act’s effective date (if already covered) or within 30 days of becoming covered thereafter.

3

User data accessible via delegation is limited to content created by or sent to the child while a delegation is active and only for a 30‑day window from creation or receipt; platforms must allow transfers at least hourly.

4

Third‑party safety providers must be U.S. companies not controlled by foreign persons, store and process delegated user data on hardware in the United States, delete received user data within 14 days (subject to specific exceptions), and undergo annual independent audits submitted to the FTC.

5

The FTC enforces the law by treating violations as unfair or deceptive practices under the FTC Act, may deny or suspend third‑party registrations, and must publish guidance and conduct biannual compliance reviews; platforms that follow the law in good faith are indemnified against third‑party transfer liability in private suits.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 3

Who and what the law covers (definitions and thresholds)

Section 3 sets the technical scope. It defines 'child' (under 17), 'large social media platform' (functional definition plus bright‑line user/revenue thresholds), 'third‑party safety software provider,' 'user data,' and other operative terms. Practically, the statutory definitions gate which platforms must publish APIs and which actors must register with the FTC; the revenue/user thresholds are the easiest compliance levers for platforms to evaluate coverage.

Section 4(a)

Platform responsibilities and API access mechanics

This subsection requires covered platforms to create, maintain, and make available real‑time APIs and developer information to registered third‑party safety vendors within defined 30‑day timelines. It prescribes ongoing access after a delegation is made until revocation or other stopping events, requires reasonable security policies for transfers, and directs platforms to notify the child (and parent/guardian where applicable) and provide a summary of transferred data. It also constrains delegated actions to protective account‑level settings and related safety measures, not broad moderation authority.
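The access lifecycle described above amounts to a small state machine: APIs stay open from delegation until one of the statutory stop events occurs. The sketch below models only that lifecycle; the class and event names are illustrative, not statutory terms.

```python
from enum import Enum, auto

class StopEvent(Enum):
    """Events that end a platform's duty to keep the APIs open
    for a given delegation."""
    REVOCATION = auto()          # child or guardian revokes
    ACCOUNT_DELETION = auto()    # the child's account is deleted
    PROVIDER_REJECTION = auto()  # the third party declines the delegation
    ATTESTATION_CHANGE = auto()  # the provider's required attestations change

class Delegation:
    """Minimal state holder: access stays open from delegation
    until the first stop event is recorded."""
    def __init__(self) -> None:
        self.api_access_open = True
        self.stop_reason: StopEvent | None = None

    def record(self, event: StopEvent) -> None:
        if self.api_access_open:
            self.api_access_open = False
            self.stop_reason = event
```

A real implementation would also drive the notification and data-summary flows off these transitions.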

Section 4(a) — user data & transfer rules

Scope, timing, and format of data transfers

The Act narrowly defines 'user data' as profile or content related to the child created or received while a delegation is active and only for a 30‑day window from the time the content is produced or received. Transfers must be in a commonly used, machine‑readable format and platforms cannot limit transfer frequency to less than once per hour. That combination is intended to balance timeliness (near real‑time monitoring and intervention) with a statutory limit on historical access.
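The cadence floor and format requirement are both mechanically checkable. A minimal sketch, assuming JSON as the machine-readable format (the bill names no specific format) and with illustrative function names:

```python
import json
from datetime import timedelta

MAX_INTERVAL = timedelta(hours=1)  # floor: at least one transfer per hour

def cadence_compliant(platform_min_interval: timedelta) -> bool:
    """A platform may not force a registered provider to wait more
    than one hour between transfers."""
    return platform_min_interval <= MAX_INTERVAL

def export_payload(items: list[dict]) -> str:
    """Serialize eligible items in a commonly used machine-readable
    format; JSON is one plausible choice, not mandated by the bill."""
    return json.dumps({"items": items}, default=str, sort_keys=True)
```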

Section 4(b)

Registration, attestations, and data residency for third‑party providers

Third‑party safety software providers must register with the FTC and make several specific attestations: U.S. incorporation, absence of foreign control, limited use/disclosure of user data, domestic processing and storage, and deletion commitments (generally within 14 days, with particular rules for disclosed records and 30‑day deletion upon account cancellation). Registration also triggers a security review requirement; the FTC can require an independent auditor’s written report to verify compliance.
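The attestation set reads like a checklist, so a registration intake could validate completeness before any security review begins. The labels below paraphrase the statutory attestations; the function and key names are illustrative.

```python
REQUIRED_ATTESTATIONS = frozenset({
    "us_incorporation",
    "no_foreign_control",
    "limited_use_and_disclosure",
    "domestic_storage_and_processing",
    "deletion_commitments",
})

def registration_ready(attestations: set[str]) -> tuple[bool, list[str]]:
    """Check an applicant's attestations against the required set;
    return completeness plus any missing items for remediation."""
    missing = REQUIRED_ATTESTATIONS - set(attestations)
    return len(missing) == 0, sorted(missing)
```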

Section 4(b)(2) — audits and FTC powers

Annual audits, reporting, and corrective authority

Registered third parties must retain independent auditors annually and submit audit reports to the FTC (with public summaries). The FTC reviews those audits, may direct remediation for unusual findings, and can take adverse actions — deny registration, suspend, or permanently de‑register — for willful misconduct, material misrepresentations, gross negligence, or failure to remediate. The statute builds an administrative pathway for ongoing oversight rather than a one‑time vetting.

Section 4(f)

Narrowly enumerated permitted disclosures of user data

Third parties may disclose user data only in limited circumstances: lawful government requests (warrants, subpoenas), disclosures required by law, to the child or delegating parent/guardian with a requirement to limit disclosure to narrowly relevant harms, to prevent an imminent threat to safety, or to public‑health/child‑abuse authorities. The bill adds a reporting obligation to notify the child/parent unless notification would itself create risk or is legally prohibited.
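Because the permitted disclosures are a closed list, a provider's disclosure path can be gated on an enumerated ground. The ground labels below paraphrase the statute; this sketch omits the accompanying notification duty.

```python
PERMITTED_GROUNDS = frozenset({
    "lawful_government_request",              # warrant or subpoena
    "required_by_law",
    "child_or_delegating_guardian",           # limited to narrowly relevant harms
    "imminent_safety_threat",
    "public_health_or_child_abuse_authority",
})

def disclosure_permitted(ground: str) -> bool:
    """A third party may disclose user data only on an enumerated
    ground; anything else is denied by default."""
    return ground in PERMITTED_GROUNDS
```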

Sections 5–7

Enforcement, preemption, and effective date mechanics

The FTC enforces the statute by treating violations as unfair or deceptive acts under the FTC Act, with the full suite of FTC remedies and procedures. The law preempts state laws that would impose their own API‑access requirements but preserves state consumer‑protection, tort, and anti‑fraud statutes. The Act becomes effective when the FTC issues guidance required under the statute, and the FTC must also publish implementation and authentication guidance and run biannual compliance assessments.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Parents and legal guardians — gain an avenue to delegate account controls and receive summaries of transferred data, enabling external safety services to act on privacy and settings quickly on behalf of a child.
  • Children at risk — the regime increases the ability of specialized safety tools to intervene or alert caregivers based on near real‑time data and account settings, potentially reducing exposure to harms the statute enumerates.
  • Accredited safety‑software vendors that meet the rules — those that can comply with the U.S. residency and audit requirements gain privileged technical access to platforms and a clearer regulatory status to market services to caregivers.

Who Bears the Cost

  • Large social media platforms — must build and maintain secure, documented real‑time APIs, implement data‑transfer and notification flows, and operationalize revocation controls and compliance processes within tight timelines.
  • Small or foreign safety vendors — face an exclusionary hurdle: the registration requires U.S. incorporation, no foreign control, domestic data processing, and annual independent audits, which raises onboarding costs and may bar startups or international specialists.
  • The Federal Trade Commission — must expand regulatory, audit‑review, and enforcement capacity (guidance, registration, audit review, biannual assessments), creating administrative burdens and resource allocation questions.

Key Issues

The Core Tension

The central dilemma is balancing rapid, delegated intervention to protect children against robust limits on data sharing and tight security controls. Empowering third parties to act fast requires granting them access to sensitive, near‑real‑time data, but concentrating that access risks privacy breaches, market concentration, and new single points of failure that could undermine the very safety the law aims to improve.

The bill packs detailed operational rules into a public‑safety goal, but several implementation tensions remain. Authentication is central: ensuring a third party’s right to act on behalf of a child requires robust identity and consent verification, yet the statute leaves the technical approach to FTC guidance.

That gap creates both privacy risks (overly permissive verification could enable hijackings) and safety risks (overly strict verification could block legitimate interventions). The data scope limitations — a 30‑day window and hourly transfer floor — narrow historical access but may not provide enough context for some safety analyses (patterns often emerge over months), raising a trade‑off between data minimization and analytic usefulness.

The law also recasts liability and market structure. Platforms receive indemnification in private suits if they comply in good faith, which shifts practical legal exposure toward third‑party providers even as those vendors must shoulder the operational and audit burdens.

The U.S.‑only residency and no‑foreign‑control requirements limit the pool of qualified vendors and could concentrate the market among larger domestic players able to bear audit and infrastructure costs; that outcome could reduce competition and innovation in safety tooling. Finally, the FTC is asked to perform dual roles — gatekeeper for registration and active regulator for audits and compliance reviews — which assumes significant administrative bandwidth and technical expertise that the statute does not fund or phase in.
