The STOP CSAM Act of 2025 restructures how federal courts, prosecutors, nonprofit clearinghouses, and online services handle crimes involving child sexual exploitation. It rewrites definitions and courtroom protections for child victims, expands restitution tools (including court‑supervised trustees for payment), and directs new duties and reporting standards for online providers that discover apparent child sexual exploitation on their services.
On the technology side, the bill requires providers to submit richer CyberTipline reports to NCMEC within fixed timeframes, to preserve material for law enforcement, and to make annual public disclosures if they are large platforms; it also creates new civil remedies against platforms and criminal liability for interactive computer services that host or facilitate child sexual exploitation. The measure mixes victim‑focused procedural fixes with aggressive compliance and transparency obligations for industry, and backstops them with specified fines and statutory damages.
At a Glance
What It Does
Revises 18 U.S.C. 3509 to create a presumption protecting a "covered person's" identifying and sensitive information in federal proceedings; expands restitution statutes and authorizes court‑appointed trustees; rewrites provider reporting duties to NCMEC (including a 60‑day filing deadline and required data elements); adds criminal and civil liability for platforms and app stores; and mandates annual safety reports by large providers.
Who It Affects
Large online platforms, app stores, interactive computer services, domain registrars, NCMEC (and successor CyberTipline functions), federal prosecutors and courts, multidisciplinary child abuse teams, and victims of online child sexual exploitation (including those depicted in imagery).
Why It Matters
It tightens the data pipeline from platforms to law enforcement and creates novel private causes of action and criminal exposure for providers — shifting compliance costs and legal risk onto tech companies while broadening procedural protections and compensation paths for victims.
What This Bill Actually Does
The bill broadens courtroom protections for people who were children when victimized. It recasts statutory terms (for example, defining “psychological abuse,” “exploitation,” and “protected information”) and creates a new category, “covered person,” to trigger presumptions against public disclosure of identifying and sensitive records.
Courts must now treat disclosure of a covered person’s protected information as presumptively harmful and deny public release unless that presumption is rebutted. The bill also modernizes testimony recording language (replacing “videotape” with “video recording”), requires recordings of adult attendants present with testifying children, and authorizes guardian ad litem funding and appointments with a new $25 million annual appropriation for courts to implement guardian protections.
On restitution and victims’ remedies, the Act expands the universe of offenses that trigger mandatory restitution (explicitly adding conduct under section 1466A where an identifiable minor is involved) and redraws definitions for “child pornography production” and “trafficking” to capture production and distribution scenarios across new and existing statutes. It authorizes courts to appoint trustees or fiduciaries to hold and administer restitution funds for minors, incapacitated victims, or certain foreign/stateless victims; sets parameters for trustee duties and fees; and authorizes $15 million annually to support trustee implementation.

The technology‑facing provisions overhaul 18 U.S.C. chapter 110 reporting and liability rules.
Providers who obtain actual knowledge of apparent child sexual exploitation or apparent child pornography must report to NCMEC’s CyberTipline within 60 days and include a fixed set of data elements when reasonably available (technical hashes, account identifiers, whether media is software‑generated, whether it was publicly available, contextual messages, timestamps, and other locating information). NCMEC remains the clearinghouse and may forward reports to appropriate federal, state, local, or designated foreign law enforcement.
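The bill lists the data elements a report must carry but prescribes no wire format; a provider's internal representation might look something like the following sketch. Every field name here is hypothetical, chosen only to mirror the elements enumerated above.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

# Illustrative only: the Act enumerates required data elements ("when
# reasonably available") but does not define a submission format.
@dataclass
class TiplineReport:
    provider_name: str
    account_identifiers: list[str]   # usernames, emails, IP addresses on hand
    media_hashes: list[str]          # industry hash values for the material
    software_generated: bool         # whether the media appears AI-generated
    publicly_available: bool         # whether the material was publicly accessible
    contextual_messages: list[str]   # surrounding chats/messages, if any
    event_timestamp: str = ""        # when the material was uploaded or found
    discovered_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def to_submission(self) -> dict:
        """Serialize for filing; 'reasonably available' fields may be empty."""
        return asdict(self)

report = TiplineReport(
    provider_name="ExampleHost",
    account_identifiers=["user123", "203.0.113.7"],
    media_hashes=["d41d8cd98f00b204e9800998ecf8427e"],
    software_generated=False,
    publicly_available=True,
    contextual_messages=[],
)
payload = report.to_submission()
```

The point of the sketch is that the statute turns reporting into a structured-data obligation: omitting an available element is no longer a formatting choice but a compliance question.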
Providers must preserve materials; failures to report or preserve can trigger criminal fines (tiered by provider size and doubled/tripled if an individual is harmed) and civil penalties. The bill also creates an annual public reporting obligation for providers meeting thresholds (>1,000,000 monthly unique users and >$50,000,000 revenue), requiring disclosure of how providers receive and handle reports, safety‑by‑design practices, prevalence and trends, and assessments of tools and technologies — subject to limited redactions for law enforcement sensitivity and trade secrets.

Finally, the Act creates new private and public liability pathways.
It adds a new civil cause of action targeting providers and app stores that intentionally, knowingly, or recklessly promote, host, store, or otherwise facilitate child sexual exploitation; plaintiffs can recover actual damages or statutory damages and attorney’s fees, and there is no statute of limitations. It also narrows limited‑liability protections for providers that engage in preservation, reporting, or research for abuse prevention, while imposing a new criminal offense on interactive computer services that intentionally host child pornography or knowingly promote or facilitate enumerated child‑sex offenses, with fines scaled for seriousness and harm.
The bill preserves other federal, state, and tribal remedies and clarifies severability.
The Five Things You Need to Know
Providers must submit a CyberTipline report to NCMEC within 60 days after obtaining actual knowledge of apparent child pornography or facts suggesting an imminent or planned sex‑exploitation offense, and the report must include specified account identifiers, timestamps, technical hashes, and whether imagery is software‑generated.
Criminal penalties for knowingly failing to report or preserve material are tiered by platform size: initial fines up to $850,000 for services with ≥100M monthly active users (up to $600,000 if smaller), with higher maxima for repeat violations and doubling of fines if an individual is harmed.
Large providers (more than 1,000,000 unique monthly users and >$50M revenue) must submit annual, disaggregated reports covering CyberTipline activity, safety‑by‑design practices, prevalence and trends, enforcement policies, and an efficacy assessment; the Attorney General and FTC publish those reports subject to redaction rules.
The bill creates a new private cause of action (added section 2255A) allowing victims to sue platforms or app stores for intentional, knowing, or reckless promotion, hosting, or facilitation of certain child‑sex offenses; statutory damages are $300,000 (or actual damages), plus fees, and there is no statute of limitations.
Section 2260B makes it a federal crime for an interactive computer service to intentionally host or make child pornography available or knowingly promote/facilitate specified child‑sex offenses, with fines up to $1,000,000 (and up to $5,000,000 if reckless risk or actual harm results).
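The tiered fine structure summarized in the points above can be expressed as simple arithmetic. This is a sketch of the initial-violation maxima only; the bill's higher repeat-violation caps are not modeled here.

```python
def max_initial_fine(monthly_active_users: int, individual_harmed: bool) -> int:
    """Maximum fine for an initial knowing failure to report or preserve:
    $850,000 for services with >= 100M monthly active users, $600,000
    otherwise, doubled where an individual is harmed. Repeat violations
    carry higher maxima under the bill but are omitted from this sketch."""
    cap = 850_000 if monthly_active_users >= 100_000_000 else 600_000
    return cap * 2 if individual_harmed else cap

# A 150M-MAU service, no individual harmed: $850,000 cap.
assert max_initial_fine(150_000_000, False) == 850_000
# A 5M-MAU service where an individual is harmed: $600,000 doubled.
assert max_initial_fine(5_000_000, True) == 1_200_000
```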
Section-by-Section Breakdown
Courtroom protections, definitions, and guardian ad litem funding
This provision rewrites definitions used in federal proceedings to favor privacy and trauma‑informed handling of child victims. It adds a “covered person” class (anyone under 18 when victimized or who testified as a child), defines “protected information” broadly (identifiers, health, educational and juvenile records), and creates a rebuttable presumption that public disclosure is harmful. It also modernizes language around recorded testimony, requires video recording of adult attendants present with child witnesses, and authorizes $25 million annually to fund guardian ad litem appointments — shifting implementation duties and resources to the federal courts and multidisciplinary teams.
Expanded restitution scope and court‑supervised trustees for victims
The amendments expand mandatory restitution coverage to include offenses under 18 U.S.C. §1466A when an identifiable minor is involved and recast which offenses qualify as child‑pornography production or trafficking. Importantly, courts may now appoint a trustee or fiduciary to hold restitution payments for minor, incapacitated, or (in limited findings) foreign/stateless victims; the statute prescribes trustee duties, fee controls, conflict‑of‑interest limits, and allows courts to order defendants to pay trustee fees with implementation rules mirroring fine collection procedures. The statute authorizes $15 million per year to support trustee administration, creating a standing federal infrastructure for long‑term victim compensation.
CyberTipline reporting, required data elements, preservation, and forwarding
This is the bill’s largest operational change for industry. Providers must report to NCMEC’s CyberTipline within 60 days of gaining “actual knowledge” of apparent child sexual exploitation or apparent child pornography, and include detailed data (account identifiers, IP and geolocation metadata, timestamps, chats/messages, industry hash values, whether material was publicly accessible, and an indicator if media is AI‑generated). Providers must also preserve material; NCMEC keeps its clearinghouse role and may compare hashes against its repository and forward matches to appropriate federal, state, local, or designated foreign law enforcement. The provision also tightens limits on how providers may share CyberTipline elements with other entities and explicitly permits NCMEC to include matching depictions from its database when forwarding reports.
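The clearinghouse hash comparison this section describes reduces, at its core, to a set-membership check. The sketch below is purely illustrative: real deployments use perceptual hashes such as PhotoDNA rather than a cryptographic digest like MD5, and the "repository" here is a toy set.

```python
import hashlib

# Toy stand-in for a clearinghouse repository of known hash values.
# Seeded with md5(b"hello") for demonstration purposes only.
known_hashes = {"5d41402abc4b2a76b9719d911017c592"}

def matches_repository(file_bytes: bytes) -> bool:
    """Hash the reported material and check it against the repository.
    Real systems use perceptual hashing so that re-encoded or slightly
    altered copies of the same image still match; an exact cryptographic
    hash like MD5 would miss those variants."""
    digest = hashlib.md5(file_bytes).hexdigest()
    return digest in known_hashes

assert matches_repository(b"hello") is True
assert matches_repository(b"other content") is False
```

The design point the statute leans on is that hash forwarding lets NCMEC route a report to the right jurisdiction without re-transmitting the underlying imagery more than necessary.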
Criminal and civil penalties for reporting failures and mandatory annual disclosures
The bill imposes criminal penalties for knowing failures to report or preserve material, with tiered maximum fines tied to platform size and increased penalties for repeat violations or if an individual is harmed. Separate civil fines apply for knowingly submitting false reports, failing to preserve material, or missing the annual disclosure requirement. Large providers meeting user and revenue thresholds must file an annual report (by March 31) to the Attorney General and FTC describing CyberTipline volumes, safety‑by‑design practices, tools deployed, prevalence metrics, and gaps — subject to redaction rules for law‑enforcement sensitivity and trade secrets. Enforcement proceeds (criminal fines and civil penalties) are directed to a victims’ reserve.
2255A — private suits against platforms and app stores; remedies and defenses
The bill inserts a new section creating a standalone private right against providers and app stores that intentionally, knowingly, or recklessly promote, host, store, or facilitate enumerated child‑sex crimes. Plaintiffs can seek actual damages or $300,000 statutory damages, attorney’s fees, punitive damages and injunctive relief, and there is no statute of limitations. The provision expressly states that Section 230 of the Communications Act does not bar these claims, but it carves out good‑faith compliance with legal process and establishes an evidentiary safe harbor for encryption‑related design choices while allowing those facts into evidence where relevant.
Criminal liability for interactive computer services
Added as a new section, §2260B criminalizes intentional hosting or making child pornography available and knowingly promoting or facilitating certain child‑sex offenses by interactive computer services. Penalties are fines up to $1,000,000, and up to $5,000,000 where there is a conscious or reckless risk of serious personal injury or where an individual is harmed. The text includes a narrow construction protecting actions taken to comply with valid legal process or for preservation requested by law enforcement.
Preserving other remedies and severability
The bill affirms that its provisions do not preempt or displace existing federal, state, or tribal causes of action or remedies for child sexual exploitation and contains a standard severability clause to preserve the remainder of the statute if portions are held unconstitutional.
Who Benefits and Who Bears the Cost
Who Benefits
- Covered persons (individuals who were minors when victimized): The bill creates presumptive protections for identifying and sensitive information, increases access to guardian ad litem representation, and strengthens restitution pathways (including court‑supervised trustees) to secure and deliver compensation.
- Multidisciplinary child abuse teams and children’s advocacy centers: The statute formalizes their role in sentencing/restitution reporting and funds guardian ad litem activities, increasing federal support for coordination between medical, social, and prosecutorial actors.
- Law enforcement and prosecutors: They receive more structured, standardized, and richer data from providers (hashes, timestamps, message context, preservation), plus a legal funnel through NCMEC for prioritizing cross‑jurisdictional investigations and evidence preservation.
Who Bears the Cost
- Large online platforms and app stores: Must invest in expanded detection, preservation, reporting pipelines, compliance personnel, and produce annual public disclosures; they also face increased civil and criminal exposure with specified fines and new private lawsuits.
- Intermediate and smaller providers: Even if below annual‑report thresholds, the 60‑day reporting and preservation duties plus potential liability create compliance and legal burdens. Platforms that aggregate user content must decide resource allocation to avoid criminal or civil penalties.
- Federal courts and Administrative Office of the U.S. Courts: Will absorb implementation duties (guardian ad litem appointments, trustee supervision, handling new evidentiary regimes, and more civil cases), requiring funding and procedural adjustments despite the targeted appropriations.
Key Issues
The Core Tension
The central dilemma is straightforward: the Act substantially increases protections, transparency, and remedies for child victims by forcing platforms to disclose, preserve, and report detailed user‑level data. Doing so raises privacy, operational, and legal burdens for platforms, and it risks swamping NCMEC and law enforcement with data or incentivizing overly cautious engineering choices (including limits on encryption or product features) to avoid liability. The bill solves one set of problems (victim identification, evidence preservation, and accountability) at the cost of hard trade‑offs between privacy, platform design, and enforcement capacity.
The bill ties a victim‑centric presumption against public disclosure to a broad statutory definition of “protected information,” but leaves courts with a non‑trivial balancing test when a public interest is asserted. That balancing will generate litigation over what counts as a “compelling public interest” and which public‑interest alternatives could suffice without disclosure.
On the CyberTipline side, the Act mandates a large set of data elements be provided when reasonably available and permits NCMEC to forward matches from its repositories — but it does not create an explicit federal standard for how providers must detect or classify AI‑generated imagery, nor does it set technical standards for hash formats or retention length. These gaps will force technical and legal consensus processes between industry, NCMEC, and law enforcement.
Liability, enforcement, and operational capacity are closely coupled. The bill increases exposure — criminal fines, civil penalties, and a new private statutory remedy — while also requiring providers to preserve and, in some cases, make materials available for inspection.
That creates a risk of over‑preservation and over‑reporting (to minimize liability) that could overwhelm NCMEC and law enforcement with noise; conversely, aggressive enforcement could chill product features or encryption choices. The Act attempts to thread this needle by protecting good‑faith preservation and research and by allowing encryption‑related design choices to be considered at trial, but those defenses are fact‑intensive and uncertain.
Finally, cross‑border cases remain problematic: the bill contemplates limited relief for foreign or stateless victims via trustees, and authorizes forwarding to designated foreign law enforcement — but it does not resolve extraterritorial enforcement limits or data‑transfer conflicts with foreign privacy regimes.