AB 1137 creates a private civil remedy targeted at commercial sexual exploitation of minors and nonminor dependents. It gives courts discretion to amplify penalties against individual wrongdoers and establishes a standalone cause of action against social media platforms that 'facilitate, aid, or abet' exploitation.
The bill ties monetary recovery directly to victims — penalties go to the victim (and may be held in trust until adulthood) — and conditions a platform safe harbor on routine third‑party audits, timely mitigation, and corporate disclosure. For compliance officers and counsel, the bill shifts attention away from content takedown alone and toward product design, governance, and auditability as elements of legal risk.
At a Glance
What It Does
Authorizes courts to increase fines and civil remedies against people who commercially exploit minors and permits courts to award statutory damages against social media platforms that knowingly facilitate exploitation. The bill creates a limited safe harbor for platforms that submit to independent design and algorithm audits and take specified mitigation steps.
Who It Affects
Adult perpetrators of commercial sexual exploitation; operators and boards of social media platforms as defined by California law; third‑party auditors and trust and safety teams; minor victims and their representatives. It also implicates platform engineers and product teams responsible for features and affordances.
Why It Matters
This statute reframes liability from isolated content incidents to platform design and corporate governance choices, making product features and audit practices potential evidence in civil suits and a determinative factor for damages exposure.
What This Bill Actually Does
The bill applies only to civil actions brought by or for the benefit of minors or nonminor dependents who were commercially sexually exploited by an adult, or in cases where a social media platform facilitated the exploitation. When a court or other factfinder has discretion over the size of a fine or remedial penalty, the bill requires consideration of specific aggravating factors (multiple victims, substantial harm, and whether the defendant knew or should have known the victim’s status) when setting amounts. It also removes consent as a defense to monetary remedies.
If a statutory framework does not already authorize a civil penalty for the conduct, the bill empowers courts to award a standalone civil penalty per act of exploitation and directs that any penalty ordered be paid to the victim. Courts may, at their discretion, hold those funds in trust for underage victims until they reach adulthood or are emancipated.
On the platform side, the bill exposes social media companies to civil liability when their systems, designs, features, or affordances are a substantial factor in causing minors to become victims of commercial sexual exploitation.
The text creates a conditional compliance path: platforms can avoid liability by engaging independent auditors, acting to mitigate identified risks within set deadlines, and providing audit materials to corporate boards and the public with limited redaction for trade secrets.
Definitions appear in the statute to align covered conduct with existing Penal Code offenses and to carve out certain services — notably end‑to‑end encrypted direct messaging and specified nonprofit platforms — from the definition of 'social media platform.' The bill also supplies a mechanics‑focused definition of when a platform is deemed to have knowledge of problematic material, tying that determination to persistent reporting through an external reporting mechanism referenced elsewhere in the code.
The Five Things You Need to Know
When a factfinder makes an affirmative finding on one or more aggravating factors, it may impose penalties up to three times the amount otherwise authorized for an act of commercial sexual exploitation.
If no statute authorizes a civil penalty for the conduct, the court may impose a civil penalty per act between $10,000 and $50,000.
For each act of commercial sexual exploitation facilitated, aided, or abetted by a social media platform, a court may award statutory damages between $1,000,000 and $4,000,000; the sketch after this list works through how these bands compound per act.
A platform avoids liability under the statute only if it: submits its designs, algorithms, practices, affordances, and features to an independent third‑party audit twice a year; takes mitigation action within 30 days of audit completion; and provides both board-level and public copies of the audit (trade secrets redacted) within 90 days.
The bill deems a platform to have knowledge of problematic material if that material was reported via the reporting mechanism specified in Section 3273.66 for four consecutive months, meets the criteria in that section, and was first displayed, stored, or hosted on the platform after January 1, 2025.
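To make the arithmetic concrete, here is a minimal sketch of how exposure compounds across acts. The dollar figures come from the bill text; everything else in the scenario — the act counts, the aggravating finding, the assumption that penalties stack per act, and the assumption that the treble enhancement can apply to the fallback band — is an illustrative reading, not settled law.

```python
# Hypothetical exposure arithmetic under AB 1137's damages bands.
# Dollar figures come from the bill text; the scenario (act counts,
# an aggravating finding, per-act stacking, and trebling the fallback
# band) is our illustrative reading, not a holding of any court.

def individual_penalty(acts: int, base_penalty: float | None,
                       aggravating_finding: bool) -> tuple[float, float]:
    """Rough (low, high) exposure for an individual wrongdoer."""
    if base_penalty is None:
        low, high = 10_000, 50_000    # fallback band when no statute supplies a penalty
    else:
        low = high = base_penalty     # penalty authorized by another statutory scheme
    if aggravating_finding:
        high *= 3                     # up to treble the otherwise authorized amount
    return acts * low, acts * high

def platform_exposure(facilitated_acts: int) -> tuple[float, float]:
    """(low, high) statutory damages band for a platform, per facilitated act."""
    return facilitated_acts * 1_000_000, facilitated_acts * 4_000_000

# Example: five acts, no pre-existing statutory penalty, aggravating finding.
print(individual_penalty(5, None, True))  # (50000, 750000)
print(platform_exposure(5))               # (5000000, 20000000)
```

On this reading, even a handful of facilitated acts puts a platform into eight‑figure worst‑case territory, which is why the audit safe harbor described above carries so much weight.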
Section-by-Section Breakdown
Scope: who the statute covers
This subsection limits the statute to civil actions for victims who are minors or nonminor dependents, with status measured at the time of the defendant’s act. Practically, that means a claim turns on whether the victim fell within one of those categories when the exploitation occurred, not on the victim’s age when suit is filed.
Enhanced remedies and fallback civil penalties
Subdivision (b) instructs factfinders to weigh three aggravating factors and authorizes penalties of up to treble the otherwise authorized amount when an affirmative finding is made. Subdivision (c) supplies a fallback penalty range where no statutory civil penalty otherwise exists. Together they create both an enhancement mechanism within existing statutory schemes and a standalone penalty vehicle where legislated remedies are absent.
Payment, trusts, and evidentiary rules
All penalties go directly to victims; courts may order amounts held in trust for minors until they attain majority or emancipation. The statute explicitly bars consent as a defense to monetary penalties. For practitioners, this centralizes financial relief for victims while removing a defense that has historically limited recovery.
Platform liability standard and conditional safe harbor
This is the operative platform section. It bars platforms from knowingly facilitating commercial sexual exploitation and imposes a heavy statutory damages band for violations. It also establishes a compliance path: twice‑annual independent audits of designs and algorithms, mitigation steps following audits, board notification, and public disclosure of audits (with trade‑secret redaction). Additionally, it specifies when a platform will be treated as having knowledge based on repeated reporting through an identified reporting mechanism, and it confines 'facilitate, aid, or abet' to the deployment of systems, designs, features, or affordances that are a substantial factor in causing victimization.
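For compliance teams trying to see the moving parts of the knowledge rule, here is a minimal sketch of the deeming test as this analysis reads it. The 'YYYY-MM' report representation and the month-counting logic are our assumptions; the substantive criteria live in Section 3273.66, outside this bill’s text, and nothing here should be mistaken for the statutory test itself.

```python
from datetime import date

# Sketch of the knowledge-deeming test as this analysis reads it.
# The "YYYY-MM" report format and the month arithmetic are assumptions;
# Section 3273.66's substantive criteria live outside this bill.

def _month_index(ym: str) -> int:
    """Map a 'YYYY-MM' string to a linear month count."""
    year, month = map(int, ym.split("-"))
    return year * 12 + month

def deemed_to_know(report_months: list[str],
                   meets_3273_66_criteria: bool,
                   first_hosted: date) -> bool:
    """True if all three statutory conditions appear satisfied."""
    if first_hosted <= date(2025, 1, 1):
        return False  # material must first appear on the platform after Jan 1, 2025
    if not meets_3273_66_criteria:
        return False  # must meet the criteria set out in Section 3273.66
    # Reports received in four consecutive calendar months via the mechanism.
    idxs = sorted({_month_index(m) for m in report_months})
    run = 1
    for prev, cur in zip(idxs, idxs[1:]):
        run = run + 1 if cur == prev + 1 else 1
        if run >= 4:
            return True
    return False

# Example: reports May through August 2025 on material first hosted April 2025.
print(deemed_to_know(["2025-05", "2025-06", "2025-07", "2025-08"],
                     True, date(2025, 4, 10)))  # True
```

The point of the sketch is to show which facts a plaintiff would need to establish: the hosting date, satisfaction of the external criteria, and an unbroken four-month reporting run.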
Key definitions and carve-outs
This subsection aligns the statutory term 'commercial sexual exploitation' with specific Penal Code offenses and adopts the Welfare & Institutions Code definition for 'nonminor dependent.' It imports the California definition of 'social media platform' but expressly excludes end‑to‑end encrypted standalone direct messaging services and certain nonprofit platforms. These definitions will govern whether a particular service or incident falls within the statute’s reach.
Waiver unenforceable
The statute declares any waiver of its provisions void as against public policy. That bar prevents defendants or platforms from contracting away victims’ statutory protections, which affects settlement posture and release language in civil litigation.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Victims (minors and nonminor dependents): The statute directs penalties to victims and permits courts to hold funds in trust for minors, creating a direct financial remedy and structured access to recovery.
- Victim advocacy organizations and attorneys representing victims: The availability of trebled or standalone statutory damages increases leverage in litigation and settlement negotiations and creates a clearer path to compensatory relief.
- Independent auditors and trust & safety consultants: The safe harbor regime creates repeat demand for specialized audits of product designs, algorithms, and affordances, and for remediation consulting services.
Who Bears the Cost
- Social media platforms and their shareholders: Platforms face the prospect of seven‑figure statutory damages per act unless they satisfy the audit-and-mitigation requirements, and they must periodically fund independent audits and remedial engineering work.
- Platform product and engineering teams: Product design choices, feature rollouts, and experiment pipelines will require additional documentation, risk assessment, and possibly changes to roadmaps to meet auditors’ and courts’ expectations.
- Defense counsel, insurers, and corporate boards: Boards must receive and review audit reports, legal teams will confront new discovery and risk of adverse jury findings tied to product design, and insurers may face exposure leading to higher premiums or coverage disputes.
Key Issues
The Core Tension
The bill forces a trade-off between robust victim remedies and heavy, design‑level platform liability: holding platforms accountable can incentivize safer product design and protect children, but it also risks over‑deterring legitimate features, imposing large compliance costs, and spawning expensive, technical litigation over causation and proprietary information.
Several implementation frictions and legal uncertainties merit scrutiny. First, the bill imports a reporting‑based knowledge rule that depends on a separate reporting mechanism (Section 3273.66); the statute’s practical effect hinges on how that reporting system functions, who can submit reports, and how platforms log and aggregate them.
Second, the audit-and-disclosure safe harbor raises a classic transparency-versus-trade‑secret problem: public release of audits increases accountability but may reveal proprietary algorithmic detail; the statute permits trade‑secret redaction but gives no standard for what may be redacted, inviting litigation.
Third, the statutory definition of 'facilitate, aid, or abet' centers on whether a system or feature is a 'substantial factor' in causing exploitation. That phrase will generate fact-intensive inquiries into product causation — from recommendation algorithms to UI affordances — and could lead to inconsistent judicial outcomes unless courts articulate a clear causation standard.
Finally, the burden and frequency of twice‑annual third‑party audits and the required board and public disclosures create ongoing compliance costs and institutional responsibilities that may particularly squeeze smaller platforms or services that straddle the defined carve-outs.