The STOP CSAM Act of 2025 amends multiple provisions of Title 18 to (1) expand and clarify court protections for children and former-child victims in federal proceedings, (2) change restitution and victim‑payment rules, and (3) impose new reporting, preservation, transparency, and liability obligations on online service providers and app stores. The bill also creates a new standalone civil remedy for victims against platforms that host, promote, or facilitate child sexual exploitation.
This matters because the bill moves beyond narrow criminal enforcement: it imposes time-bound reporting and preservation duties to NCMEC, establishes meaningful civil exposure for platforms (including damages and punitive remedies), scales fines to service size, authorizes court‑supervised trusts for restitution, and injects transparency requirements into large online services’ operations. Compliance, enforcement, and evidentiary interactions between law enforcement, NCMEC, courts, and industry will change materially if enacted.
At a Glance
What It Does
The bill broadens the statutory definition of covered victims and protected information in 18 U.S.C. §3509, creates a presumption against public disclosure of that information, and funds guardian ad litem services. It requires providers to submit detailed CyberTipline reports to NCMEC within 60 days of obtaining actual knowledge, preserve reported material, and deliver annual transparency reports for qualifying services. The Act adds criminal and civil penalties for providers that knowingly fail to meet these obligations, creates new criminal liability for services that host or facilitate child pornography, and establishes a new private cause of action against platforms and app stores.
Who It Affects
Large online platforms and app stores (annual revenue > $50M and more than 1,000,000 monthly users for annual-reporting triggers), interactive computer service providers generally (new criminal exposure under section 2260B), NCMEC and the CyberTipline, federal courts and probation offices, multidisciplinary child‑abuse teams, guardians ad litem, prosecutors, and victims seeking restitution or civil damages.
Why It Matters
The bill changes where risk sits: it pairs expanded victim protections in court with operational mandates and transparency obligations that push technological and compliance costs onto platforms, while narrowing some of the safe‑harbor space services have relied on. It also creates concrete pathways for victims to recover damages and for courts to manage restitution for minor victims.
What This Bill Actually Does
The bill is a package of interlocking changes that treats court protections for children and platform responsibilities as two halves of the same policy. First, it revises 18 U.S.C. §3509 (protecting child victims and witnesses) by expanding the statutory vocabulary (adding “covered person,” “protected information,” and an explicit definition of “psychological abuse”), creating a legal presumption that public disclosure of a covered person’s protected information is harmful, and authorizing courts to designate additional categories of protected information by order.
It also requires guardians ad litem to gather and present victim impact information in child‑appropriate forms and authorizes $25 million annually to the federal courts to support guardian ad litem appointments.
Second, the bill tightens restitution and victims’ rights statutes. It revises multiple restitution provisions to broaden which offenses count as “child pornography production” and “trafficking,” clarifies that restitution may be ordered for certain related offenses (including those under 1466A), and authorizes courts to appoint trustees or fiduciaries to hold and disburse restitution for minor, incapacitated, or foreign‑resident victims; Congress authorizes $15 million annually to support that trustee mechanism.
The net effect is to make it administratively simpler to hold and route restitution to vulnerable victims while spelling out trustee duties, conflict‑of‑interest rules, and payment options.

Third, the bill rewrites the provider reporting regime (amending section 2258A and related sections). It imposes a 60‑day reporting deadline: once a provider obtains actual knowledge of apparent child pornography or facts pointing to certain child exploitation offenses, the provider must report to NCMEC’s CyberTipline and preserve the material.
The statute specifies what must be in a report—technical identifiers, contextual communications, historic timestamps, whether the depiction is machine‑generated, and whether content was publicly accessible—and directs NCMEC to forward reports to appropriate federal, state, local, or designated foreign law enforcement. The bill also adds criminal and civil fines for knowing failures to report or preserve, with dollar amounts scaled to a provider’s monthly active user base and enhanced penalties when an individual is harmed.

Finally, the Act alters civil liability.
It expands the private‑right‑of‑action statute for victims and creates a new section (2255A) that enables civil suits against interactive computer services and app stores for intentional, knowing, or reckless promotion, aiding and abetting, or hosting of child sexual exploitation; remedies include a $300,000 liquidated‑damages floor (or actual damages), attorney’s fees, punitive damages, injunctive relief, and no statute of limitations. The bill preserves defenses for compliance with lawful process, shields good‑faith research uses and certain provider actions under a limited safe harbor scheme, and sets out an encryption‑aware evidentiary regime and an affirmative, time‑limited removal defense for providers that act promptly.
The Five Things You Need to Know
Providers must submit a CyberTipline/NCMEC report within 60 days after obtaining actual knowledge of apparent child pornography or facts suggesting specified child‑exploitation offenses, and must preserve material identified in those reports.
Criminal fines for a knowing failure to report or to preserve evidence are scaled by size: up to $850,000 (initial) or $1,000,000 (repeat) for providers with ≥100M monthly active users, with lower maximums for smaller services; fines double if an individual is harmed.
Qualifying providers (more than 1,000,000 unique monthly visitors/users and >$50,000,000 in prior‑year revenue) must file an annual transparency report to DOJ and the FTC covering CyberTipline counts, policy frameworks, safety‑by‑design practices, detection tools, and prevalence trends; those reports are publishable with redactions.
The bill creates a new civil cause of action (new 2255A) allowing victims to sue platforms and app stores for hosting, promoting, or aiding child sexual exploitation, with a $300,000 liquidated‑damages floor, costs and attorney fees, punitive damages, injunctive relief, and no statute of limitations.
Courts may appoint trustees or fiduciaries to hold and disburse restitution for minor, incapacitated, or foreign victims; the bill authorizes $15M annually to support implementation and sets duties, fee rules, and restrictions on trustee profits.
Section-by-Section Breakdown
Expanded definitions, presumptions, and guardian ad litem funding
This section replaces and expands key definitional language: it adds “covered person” and a detailed definition of “protected information” (names, identifiers, medical and educational records, and any court‑designated protected material). It creates a rebuttable presumption that public disclosure of protected information would be detrimental and instructs courts to deny disclosure requests unless a party demonstrates a compelling public interest that outweighs harm to the covered person. Practically, this raises the bar for media or third‑party requests to access victim identifiers and gives judges explicit statutory authority to expand protections. The section also mandates guardian ad litem duties that include collecting child‑appropriate victim impact statements and authorizes $25 million per year to the federal courts to implement these appointments, signaling Congressional intent to resource representation for child victims in federal proceedings.
Broader restitution triggers and court‑supervised trust mechanism
Section 3 harmonizes and broadens definitions related to “child pornography production” and “trafficking” across the restitution statutes so more offenses are actionable for restitution. Importantly, it authorizes courts to appoint trustees or fiduciaries to hold restitution payments for minors, incompetent victims, or certain foreign victims when necessary to protect safety or ensure access. The bill specifies trustee duties (no conflicts, no profiteering beyond ordered fees), allows courts to order defendants to pay trustee fees, and authorizes $15 million annually to support the courts’ administration of these trusts. For practitioners, the change reduces practical frictions in delivering court‑ordered payments to vulnerable or cross‑border victims but creates a new administrative layer and fee schedule that courts must manage.
Mandatory 60‑day reports to NCMEC, preservation duties, and graded penalties
This extensive rewrite of 18 U.S.C. §2258A imposes a concrete duty to report to NCMEC’s CyberTipline within 60 days after obtaining actual knowledge of apparent child pornography or facts indicative of specified crimes. The statute prescribes the report contents (technical identifiers, contextual communications, timestamps, whether content was machine‑generated, and whether it was publicly accessible) and directs providers to follow CyberTipline formatting. NCMEC is instructed to forward reports to appropriate law enforcement. The bill also creates criminal and civil penalties for knowing failures to report or preserve, with higher maximum fines for very large services and enhanced fines if an individual is harmed. It adds an annual reporting obligation for larger providers (1M+ monthly users and >$50M revenue) describing safety‑by‑design practices, detection tools, policies, prevalence metrics, and gaps—subject to limited redaction for law‑enforcement sensitivity and trade secrets.
Expanded victim suits and a new private cause of action against platforms
Section 5 expands the existing §2255 private‑right‑of‑action to cover more offense categories and victims depicted in child pornography or in 1466A visual depictions. Critically, it creates §2255A, a new, narrower but robust civil remedy aimed squarely at online services and app stores: victims can sue for intentional, knowing, or reckless hosting, promotion, or aiding and abetting of certain child sexual‑exploitation offenses. Remedies include actual or liquidated damages ($300,000 floor), attorney’s fees, punitive damages, and injunctive relief; there is no statute of limitations. The provision preserves a defense for compliance with lawful process, provides a narrow removal/disablement safe harbor for providers that act quickly (48 hours for larger providers; 2 business days for smaller ones), and sets specific rules on the admissibility of encryption‑related evidence.
Non‑preemption and severability clauses
These concluding sections make clear that if any provision is held unconstitutional the rest survives, and that nothing in the Act preempts or diminishes existing federal, state, or tribal remedies that are at least as protective of victims. In other words, the Act is written to layer additional protections and private remedies on top of existing law rather than to replace them, which matters for parallel civil litigation and state enforcement.
Who Benefits and Who Bears the Cost
Who Benefits
- Covered persons and child victims — stronger presumptions against public disclosure, expanded definitions of protected information, court‑authorized guardians ad litem, and easier judicial routing of restitution through trustees improve privacy, participation, and payment access.
- Victim service organizations and multidisciplinary child‑abuse teams — clarified evidence‑preservation and reporting rules should produce more actionable referrals and better forensic material, and authorized funding (guardian ad litem and trustee administration) supports service capacity.
- Prosecutors and law enforcement — standardized, richer CyberTipline reports and preserved forensic materials will streamline investigations and cross‑jurisdictional referrals; NCMEC’s mandate to compare hashes and forward matches centralizes investigative leads.
- Civil litigants who are victims — a new, explicit private cause of action (2255A) creates a direct, high‑value civil pathway to recover damages from platforms and app stores that intentionally, knowingly, or recklessly host or facilitate child sexual exploitation.
- Courts (in theory) — statutory clarity on protected information, presumptions, and trustee mechanisms gives judges clearer tools for balancing transparency, victim safety, and restitution administration.
Who Bears the Cost
- Large online platforms and app stores — new compliance obligations (60‑day reporting, preservation, record‑keeping), annual transparency reporting, potential criminal fines up to seven figures and civil penalties, and exposure to multi‑million‑dollar civil judgments and punitive damages.
- Smaller platforms and start‑ups — while some thresholds target very large services, the statutory definitions and criminal liability provisions create legal risk and compliance costs, and smaller providers may face difficult choice points around detection tools and content moderation.
- Technology and product teams — mandated “safety by design” disclosures and developer‑facing changes (detection, age‑identification, parental controls) mean substantial engineering and privacy trade‑offs, and possible redesigns to avoid certain liabilities.
- Federal judiciary and Administrative Office of the U.S. Courts — the Act authorizes funding but also creates administrative work (guardian ad litem administration, trustee supervision, more in‑camera protections) and likely increases docket complexity.
- Providers’ users and privacy advocates — increased reporting and retention may raise concerns about data collection, retention periods, and how preserved material is used for research or forwarded to law enforcement.
Key Issues
The Core Tension
The central dilemma is stark: the Act presses platforms to become more active gatekeepers to protect children—by requiring detection, reporting, preservation, and transparency—while also demanding that victims’ privacy and safety be preserved, and that encryption and lawful‑process compliance be protected. Strengthening victim privacy and enabling civil recovery both require more data and more centralized handling of sensitive content; simultaneously, expanding liability pushes platforms toward more aggressive content controls or feature restrictions, which can undermine privacy, encryption, and lawful speech. Nothing in the bill simultaneously minimizes data collection, preserves full end‑to‑end encryption, and eliminates misuse of platforms for exploitation without creating gaps in victim identification or shifting burdens in ways that reshape online services.
The bill concentrates power and responsibility along two axes—courts/justice actors and private platforms—creating implementation frictions. Operationalizing the 60‑day reporting and preservation obligation will force providers to calibrate between rapid reporting and the risk of under‑ or over‑reporting; “actual knowledge” and what is “reasonably available” information are fact‑dependent standards that will generate litigation.
NCMEC’s expanded clearinghouse role and the requirement to forward reports to designated law enforcement increase reliance on a private nonprofit to gate criminal referrals, which raises questions about capacity, governance, and oversight of triage decisions.
On the civil and enforcement side, the Act ties penalties to platform scale and doubles fines when individuals are harmed, but it leaves open contested questions about what constitutes harm, how to prove recklessness or promotion by a platform, and which design choices (e.g., end‑to‑end encryption, metadata minimization) are protected. The annual reporting regime demands operational detail (detection tools, prevalence metrics, safety‑by‑design practices), but the statute tries to protect trade secrets and law‑enforcement sensitive information through a redaction regime—those carveouts will produce disputes and possibly inconsistent public disclosures.
Cross‑border enforcement is also unresolved: obligations and liability apply to services that operate in or affect U.S. commerce, but evidence preservation and restitution for foreign victims will involve diplomatic, privacy, and enforcement complexities.