Sammy’s Law directs large social media platforms to expose a set of third-party-accessible, real-time application programming interfaces (APIs) so that a child, or a parent of a child under 13, can delegate permission to an approved third-party safety software provider to manage that child’s account, interactions, and settings, and so that user data can be transferred to that provider at least hourly. The bill sets a 180-day compliance window for existing large platforms, defines the covered platforms and data, and limits what third-party providers may do with transferred data, including a ban on sale and short deletion timelines.
Why it matters: the bill builds an enforceable, national channel for parental controls and third-party safety tools while creating a new FTC registration, audit, and oversight regime for those providers. Its mechanics touch core technical issues (real-time APIs, secure transfers, end-to-end messaging exceptions, and data-deletion rules) and will impose engineering, legal, and compliance burdens on the largest platforms and safety-app vendors alike.
At a Glance
What It Does
The bill compels platforms with >100 million monthly users or >$1 billion annual revenue to provide third‑party real‑time APIs that let a child or parent delegate account control and to push user data to approved safety apps at least once per hour. It creates FTC registration and annual audit requirements for safety software providers, prescribes deletion timelines, and bans the sale of data collected under the Act.
Who It Affects
Major social platforms that permit child accounts, third‑party safety software vendors seeking access, parents and children (defined as persons under 17), and the Federal Trade Commission as the enforcement and registration authority.
Why It Matters
It establishes a single federal standard that forces technical interoperability between platforms and safety tools, effectively creating a regulated market for parental‑control services while preempting state laws that would require similar APIs; the law also raises immediate security, privacy, and operational questions for implementers.
What This Bill Actually Does
The bill defines a “child” as anyone under 17 with an account on a covered platform and defines a “large social media platform” by features and size thresholds (more than 100 million monthly global active users or more than $1 billion in annual revenue). A covered platform that allows children to use the service must provide third-party-accessible, real-time APIs so a child, or a parent of a child under 13, can delegate to an approved safety app the ability to manage account settings, content, and online interactions in the same way the child could.
Platforms must also support secure transfers of “user data” to the safety app in a commonly used machine-readable format, with transfers allowed at least once per hour.
The bill tightly limits what counts as user data for these transfers: data created or received while a delegation is active, and only for a 30-day window starting on the date the content was created or received. Platforms must adopt state-of-the-art safeguards for data in transit, must disclose to the delegating child or parent when transfers occur, and must provide summaries of the transferred data.
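To make that data-scoping rule concrete, here is a minimal sketch, in Python, of how a platform might filter content down to the statutory slice before an hourly push. The field names (item_id, created_at, payload) and the JSON envelope are illustrative assumptions; the bill specifies no format beyond “commonly used machine-readable.”

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
import json

# Per the bill summary: content is transferable only for 30 days
# after it was created or received.
WINDOW = timedelta(days=30)

@dataclass
class ContentItem:
    item_id: str
    created_at: datetime   # when the content was created or received
    payload: dict

def transferable_items(items: list[ContentItem],
                       delegation_start: datetime,
                       now: datetime) -> list[ContentItem]:
    """Items that qualify as 'user data': created or received while the
    delegation was active, and no more than 30 days old at transfer time."""
    return [i for i in items
            if i.created_at >= delegation_start and now - i.created_at <= WINDOW]

def build_hourly_transfer(items: list[ContentItem],
                          delegation_start: datetime) -> str:
    """Serialize eligible items into a 'commonly used machine-readable
    format'; JSON is one obvious candidate (our choice, not the bill's)."""
    now = datetime.now(timezone.utc)
    eligible = transferable_items(items, delegation_start, now)
    return json.dumps({
        "transferred_at": now.isoformat(),
        "items": [{"id": i.item_id,
                   "created_at": i.created_at.isoformat(),
                   "payload": i.payload} for i in eligible],
    })
```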
Messaging features that are designed to make platform access to message content technically infeasible (for example, end-to-end encryption) are treated differently: platforms do not have to expose message content through the API, but they must provide a contemporaneous transmission channel to a registered, user-designated recipient without altering or accessing the encrypted content themselves.

Third-party safety software providers must register with the Federal Trade Commission to use the API. Registration requires attestations including that the provider is not owned or operated by a covered nation, will use the data only to protect children from harm, will not sell the data, and will delete transferred user data promptly (generally within five days, with narrow retention exceptions for disclosures).
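The deletion obligation reduces to a deadline calculation. Below is a minimal sketch under two assumptions drawn from this summary: a general five-day clock from receipt, and a fifteen-day clock from revocation (noted later in this article). How the two clocks interact is not spelled out here, so the min() is a guess.

```python
from datetime import datetime, timedelta

GENERAL_DELETION = timedelta(days=5)   # "generally within five days"
POST_REVOCATION = timedelta(days=15)   # 15 days after a delegation is revoked

def deletion_deadline(received_at: datetime,
                      revoked_at: datetime | None = None) -> datetime:
    """Latest date by which a provider must delete a transferred item.
    Narrow retention exceptions for legally required disclosures are
    not modeled here."""
    general = received_at + GENERAL_DELETION
    if revoked_at is not None:
        # Assumption: the earlier of the two deadlines governs.
        return min(general, revoked_at + POST_REVOCATION)
    return general
```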
Registered providers must commission annual independent audits, submit reports to the FTC, and are subject to FTC oversight that can deny, suspend, or revoke registration. The FTC enforces the statute as an unfair or deceptive act or practice, will accept complaints, and will conduct biannual compliance assessments of covered platforms.
The law creates federal preemption for state laws that would require similar APIs while preserving other state consumer‑protection and tort authorities.
The Five Things You Need to Know
A “large social media platform” is any service that allows child use, enables sharing of images/text/video with users discovered via the service, and meets either threshold: more than 100,000,000 monthly global active users or more than $1,000,000,000 in annual gross revenue (adjusted for inflation).
Existing covered platforms must deliver third-party-accessible real-time APIs and documentation within 180 days of enactment; platforms that reach the thresholds later must comply within 30 days of becoming covered (the date sketch after this list makes both clocks concrete).
Platforms must allow secure transfers of applicable user data to a delegated third‑party safety provider at least once per hour; transferred user data is limited to content created or received while a delegation is active and only for a 30‑day window after creation/receipt.
Third-party safety software providers must register with the FTC; attest that they are not operated by a covered nation, will not sell the data, and will delete transferred data within 5 days (15 days after revocation); and submit an annual independent audit to the FTC.
The FTC enforces violations as unfair or deceptive acts or practices, can deregister or suspend third‑party providers for willful misconduct or material security risks, and the statute preempts state laws requiring equivalent API obligations while preserving unrelated state consumer and tort laws.
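The compliance clocks noted above (180 days for existing platforms, 30 days for newly covered ones) reduce to simple date arithmetic. A minimal sketch; the function names are ours, not the bill’s.

```python
from datetime import date, timedelta

def api_deadline_existing(enactment: date) -> date:
    """Existing covered platforms: APIs and documentation due in 180 days."""
    return enactment + timedelta(days=180)

def api_deadline_newly_covered(coverage_date: date) -> date:
    """Platforms that cross the thresholds later: 30 days from coverage."""
    return coverage_date + timedelta(days=30)
```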
Section-by-Section Breakdown
Short title: ‘Sammy’s Law’
This single provision gives the Act its popular name. It has no operational effect but is how the statute will be cited in later regulatory and enforcement materials.
Definitions that drive scope and limits
Section 2 sets the critical thresholds and working definitions: who is a child (under 17), what counts as a large social media platform (feature set + size/revenue thresholds), what “user data” may be transferred (only data created/received while a delegation is active and only for 30 days), and who counts as a third-party safety software provider. These definitions narrow the Act’s reach to very large consumer platforms and constrain data portability to a bounded temporal slice, which matters both for implementation effort and legal interpretation.
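Read as code, the coverage definition is a conjunction of a feature prong and a size prong. A sketch using the thresholds listed in this article; the parameter names are illustrative, and the inflation adjustment is omitted.

```python
def is_large_social_media_platform(allows_children: bool,
                                   enables_discovery_sharing: bool,
                                   monthly_global_active_users: int,
                                   annual_gross_revenue_usd: float) -> bool:
    """Illustrative coverage test: the service must have the covered
    features AND exceed either size threshold."""
    feature_prong = allows_children and enables_discovery_sharing
    size_prong = (monthly_global_active_users > 100_000_000
                  or annual_gross_revenue_usd > 1_000_000_000)
    return feature_prong and size_prong
```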
Platform obligations: APIs, transfers, revocation, and data security
This section requires covered platforms to build and make available real‑time APIs and necessary documentation so delegated safety apps can manage a child’s interactions and initiate secure transfers of user data at least hourly. Platforms must keep the interfaces available while a delegation is in effect and honor revocation events (including parental revocation, account deletion, or when the child turns 17). Platforms must implement reasonable, state‑of‑the‑art safeguards for transferred data and must notify the delegating child or parent and provide summaries of transferred data. The provision also carves out messaging features designed to prevent platform access (e.g., E2E encryption): platforms do not have to expose those messages via API but must offer a contemporaneous transmission channel that delivers the messages to a registered, user‑designated recipient without the platform itself accessing message content.
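The revocation triggers lend themselves to a simple state check before every delegated API call or transfer. The event labels below are our invention; the bill enumerates the triggers (parental revocation, account deletion, the child turning 17) but no data model.

```python
from datetime import date

# Illustrative event labels; the statute names the triggers, not a schema.
REVOKING_EVENTS = {"parental_revocation", "account_deleted"}

def age_in_years(birth_date: date, today: date) -> int:
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def delegation_active(events: set[str], birth_date: date, today: date) -> bool:
    """A delegation stays in effect only while no revoking event has
    occurred and the child is still under 17."""
    if events & REVOKING_EVENTS:
        return False
    return age_in_years(birth_date, today) < 17
```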
Third‑party provider registration, audits, and oversight
This subsection creates the FTC registration regime for safety app vendors. Applicants must make multiple attestations, including no control by a covered nation, purpose-limited use of data, non-sale of user data, prompt deletion timelines, and consumer disclosures, and must engage independent auditors annually. Registrants submit audit reports to the FTC; the FTC publishes summary findings and can require remediation. The Commission may deny, suspend, or permanently deregister providers for willful misconduct, gross negligence, material misrepresentations, or failure to remediate audit findings.
Liability shield for platforms and limits on third‑party disclosures
Section 3(c) provides that platforms acting in good faith under the statute are insulated from civil damages in private suits arising from transferring user data to a registered third‑party provider. Section 3(d) tightly restricts onward disclosures by safety apps: they may not sell user data and may only disclose to lawfully authorized government requests, required legal obligations, parents/children in narrowly defined harm categories, to prevent imminent serious harm, or to appropriate child‑abuse reporting authorities. Registrants must notify delegating parents/children of disclosures unless notice would create risk or is legally prohibited.
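Section 3(d) is essentially an allowlist, which the sketch below encodes directly. The category names mirror this summary; their exact statutory contours would need legal review before anyone built on them.

```python
from enum import Enum, auto

class DisclosureReason(Enum):
    LAWFUL_GOVERNMENT_REQUEST = auto()
    LEGAL_OBLIGATION = auto()
    PARENT_OR_CHILD_HARM_CATEGORY = auto()
    PREVENT_IMMINENT_SERIOUS_HARM = auto()
    CHILD_ABUSE_REPORTING = auto()
    SALE = auto()   # never permitted under the Act

PERMITTED = frozenset({
    DisclosureReason.LAWFUL_GOVERNMENT_REQUEST,
    DisclosureReason.LEGAL_OBLIGATION,
    DisclosureReason.PARENT_OR_CHILD_HARM_CATEGORY,
    DisclosureReason.PREVENT_IMMINENT_SERIOUS_HARM,
    DisclosureReason.CHILD_ABUSE_REPORTING,
})

def disclosure_allowed(reason: DisclosureReason) -> bool:
    return reason in PERMITTED

def notice_required(notice_creates_risk: bool, notice_prohibited: bool) -> bool:
    """Registrants must notify the delegating parent/child of a disclosure
    unless notice would create risk or is legally prohibited."""
    return not (notice_creates_risk or notice_prohibited)
```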
Enforcement, FTC procedures, and federal preemption
Section 4 folds enforcement into the FTC’s unfair or deceptive practices authority, requires the FTC to set up complaint procedures and biannual compliance assessments, and preserves the agency’s full powers. Section 5 preempts state and local laws that would impose equivalent API obligations but explicitly preserves state consumer laws of general applicability, tort and contract law, and laws addressing fraud, unauthorized access, and required breach notifications.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Parents and legal guardians — gain a regulated mechanism to delegate account management to third‑party safety apps and receive summaries of transferred activity to monitor serious risks to children.
- Children at risk — third‑party tools can act on delegated authority to change privacy settings, block interactions, and surface warning signs to caregivers, potentially reducing exposure to harm.
- Third‑party safety software providers — obtain a federal pathway and predictable access to platform data and controls, creating a market opportunity with FTC‑backed legitimacy for compliant vendors.
- FTC and consumer protection officials — receive explicit statutory authority, complaint processes, and audit reporting to oversee a new class of services focused on child safety.
Who Bears the Cost
- Large social media platforms — face significant engineering and operational costs to design, document, secure, and maintain real‑time APIs and to support hourly data transfers and specialized interfaces for encrypted messaging features.
- Safety software providers — must satisfy registration attestations, invest in security and privacy controls, commission annual independent audits, and absorb compliance costs that could raise barriers to small entrants.
- Federal Trade Commission — will need resources to review registrations and audits, conduct biannual assessments, run complaint mechanisms, and handle enforcement actions under expanded responsibilities.
- Children’s privacy advocates and encryption proponents — may bear indirect costs if the law’s mechanisms increase parental surveillance, pressure providers to weaken technical controls, or create incentives to route message content through third parties.
Key Issues
The Core Tension
The bill’s central dilemma is straightforward: enable parents and vetted third-party tools to intervene on behalf of children (improving safety and accountability) while avoiding undue erosion of children’s privacy, weakened encryption protections, and new vulnerabilities or commercial gatekeepers. It is a conflict between empowerment and protection with no mechanically perfect policy solution.
The bill advances a familiar policy goal, giving parents tools to protect children online, but it packs several implementation and policy trade-offs into a compact statutory framework. Technically, the requirement to provide “real-time” APIs and hourly transfers is feasible but nontrivial: platforms will need to catalog the data elements that qualify as “user data” under the 30-day window, implement robust rate-limiting and authentication, and design secure channels that interoperate across many vendors.
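On the rate-limiting and authentication point, a token-bucket limiter keyed to each registered provider is one plausible shape. The sketch below uses arbitrary placeholder rates that appear nowhere in the bill.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter."""
    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec
        self.capacity = float(burst)
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

buckets: dict[str, TokenBucket] = {}

def check_request(provider_id: str, registered_providers: set[str]) -> bool:
    """Gate API calls on FTC registration first, then on per-provider rate."""
    if provider_id not in registered_providers:
        return False
    bucket = buckets.setdefault(provider_id,
                                TokenBucket(rate_per_sec=1.0, burst=10))
    return bucket.allow()
```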
The messaging carve‑out attempts to respect end‑to‑end encryption but forces platforms to build a separate contemporaneous transmission mechanism to deliver encrypted content to a user‑designated third party; that introduces operational complexity and potential new attack surfaces if not designed carefully.
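One way to read the carve-out is as a blind relay: the platform forwards ciphertext to the registered, user-designated recipient at transmission time without ever decrypting it. The sketch below shows that shape only; key management, recipient registration, and delivery semantics are all left open by the bill.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class EncryptedMessage:
    sender_id: str
    ciphertext: bytes   # opaque to the platform; never decrypted here

def relay_contemporaneously(msg: EncryptedMessage,
                            designated_recipient: str | None,
                            deliver: Callable[[str, bytes], None]) -> None:
    """Forward the unaltered ciphertext to the user-designated recipient
    at the time of transmission; the platform neither reads nor modifies it."""
    if designated_recipient is None:
        return  # no registered recipient on file; nothing to relay
    deliver(designated_recipient, msg.ciphertext)
```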
On the governance side, the registration plus annual audit model creates a regulated market for safety apps but also establishes recurring costs and entry hurdles that may advantage well‑capitalized firms and raise questions about competition and innovation. The prohibition on registrants being operated by or supplying data to covered nations addresses national security concerns but could complicate cross‑border services and vendor selection.
Finally, the indemnity for platforms — shielding them from civil suits so long as they acted in good faith under the Act — reallocates risk from private litigation to federal enforcement, which may be efficient in some respects but places a high burden on the FTC to police misuse and to act where harms slip through the regulatory net.