AB 392 creates a statutory regime that forces operators of pornographic internet websites to verify that sexually explicit material does not depict nonconsenting persons or minors. The bill requires uploaders to submit a signed certification (not under penalty of perjury) that everyone shown was an adult when the material was created, consented to the depiction, and consented to the upload; operators must verify the uploader’s email and retain the certifications for seven years.
The bill couples recordkeeping and verification duties with civil and infraction penalties: uploaders who knowingly lie face a $1,000 infraction, depicted individuals can sue for damages up to $75,000 per violation (plus punitive damages and fees), and public prosecutors can seek $25,000 civil penalties. The measure shifts verification risk onto websites and creates potential operational, privacy, and enforcement challenges for platforms and creators alike.
At a Glance
What It Does
Requires pornographic website operators to exercise ordinary care and reasonable diligence to prevent the display of nonconsensual or underage sexually explicit material; mandates user-submitted consent certifications before upload, email verification, and seven-year retention of records. Establishes both private civil remedies and prosecutor-initiated civil penalties for violations, and makes a knowingly false uploader certification an infraction.
Who It Affects
Commercial adult-hosting sites and platforms that solicit or allow public uploads of sexually explicit imagery (including AI-generated imagery), the individuals who upload that content, content-moderation and compliance teams, and attorneys or prosecutors who may bring enforcement actions. Pure email, direct messaging, and basic cloud storage services are excluded from the definition of internet website for this chapter.
Why It Matters
The bill reallocates the burden of proof and verification onto platform operators and uploaders rather than victims, creates substantial statutory damages and per‑day liability for lingering content, and requires platforms to collect and store identifying certifications—shaping moderation workflows, vendor needs (ID/consent verification), and privacy risk profiles for any service that hosts or solicits adult content.
What This Bill Actually Does
AB 392 builds a new statutory framework around what it calls “pornographic internet websites.” It defines key terms: a “depicted individual” is someone shown nude or engaged in sexual conduct who either did not consent to the depiction, was a minor when the material was made, or did not consent to the upload. “Sexually explicit content” explicitly covers visual imagery, including AI-generated or substantially digitized images, that depict sexual conduct and lack serious literary, artistic, political, or scientific value. The bill excludes ordinary email and basic cloud-storage services from the website definition.
Under the bill, operators must exercise ordinary care and reasonable diligence to ensure sexually explicit items on their sites do not include a depicted individual. Practically, that obligation is linked to a procedural requirement: before allowing an upload, the operator must obtain from the uploader a written certification that each person depicted was an adult at creation, consented to the depiction, and consented to the upload.
The operator must verify the uploader’s email address and keep the certification and contact information in a readily available form for seven years; the operator may require that the submission be made through a particular mechanism.

Liability and enforcement combine private and public remedies. If an operator fails to obtain the uploader’s certification, the operator is presumed to have violated its duty of care; it can rebut that presumption only by showing, by a preponderance of the evidence, that it took other reasonable verification steps consistent with its duty.
A depicted individual can sue the operator and the uploader for violations; the statute authorizes actual or statutory damages up to $75,000 per violation (whichever is greater), punitive damages, attorney’s fees, and injunctive relief. A public prosecutor may also bring a civil action with a $25,000 civil penalty per violation and other equitable remedies.
The bill treats each full calendar day that material remains accessible beyond an identified 48‑hour removal window as a separate violation, substantially increasing potential exposure for platforms.

The text mixes clear operational mandates (email verification, seven‑year retention, a specific uploader statement) with open standards (ordinary care; reasonable diligence) that will require platforms to set procedures and documentation to rebut presumptions of liability. It also brings AI-generated imagery expressly into scope and creates both statutory and infraction-level penalties for uploaders who knowingly submit false certifications. The overall framework tilts compliance costs toward website operators and raises new data protection concerns, because sites will hold long-lived records tied to sexually explicit uploads.
The Five Things You Need to Know
The bill defines a “depicted individual” to include anyone shown without consent, anyone who was a minor when the content was created, or anyone who did not consent to the upload.
Before permitting an upload, platforms must obtain a non‑perjury written certification from the uploader that every person depicted was an adult at creation, consented to the depiction, and consented to the upload; knowingly falsifying that statement is an infraction carrying a $1,000 fine.
Operators must verify the uploader’s email address prior to upload and retain the certification and contact information in a readily available format for at least seven years.
A depicted individual may sue for actual or statutory damages (up to $75,000 per violation, whichever is greater), punitive damages, attorney’s fees, and injunctive relief; a public prosecutor can seek a $25,000 civil penalty per violation.
The statute treats each full calendar day that qualifying content remains online beyond a 48‑hour removal window as a separate violation, multiplying potential liability exposures for operators.
Section-by-Section Breakdown
Definitions and scope
This section sets the boundaries: it narrows what counts as a “pornographic internet website” and excludes email and basic cloud-storage services. It defines “sexually explicit content” to include AI-generated and substantially digitized imagery and creates the operative statutory concept of a “depicted individual.” For implementers, the definition package is important because it determines when the rest of the chapter applies and explicitly pulls AI imagery into the same regulatory net as photographed content.
Operator duty of care
The bill requires operators to exercise ordinary care and reasonable diligence (a civil‑law negligence standard) to prevent hosting content that includes a depicted individual. That language is flexible by design: it creates a litigation standard rather than prescribing a checklist. Operators will need documented policies, vendor contracts (for ID or consent verification), and audit trails to demonstrate they met this standard in the face of claims.
Uploader certification, email verification, and records
This is the operational core. Uploaders must submit a certification that each person shown was an adult at creation, consented to being depicted, and consented to the upload; operators must verify the uploader’s email before permitting upload, may mandate a particular submission mechanism, and must retain the certification and contact details for seven years. Failure to obtain the certification creates a presumption of operator liability under Section 22606, though the operator can rebut that presumption with evidence it took alternative verification steps consistent with its duty of care.
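For platform engineers, the certify-verify-retain flow described above maps naturally onto a small data model. The sketch below is a hypothetical illustration of one reasonable implementation, not a schema the bill prescribes; names such as `UploadCertification` and `may_permit_upload`, and the exact gating logic, are assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

RETENTION_YEARS = 7  # the bill requires records kept "readily available" for seven years


@dataclass
class UploadCertification:
    """Hypothetical record of the uploader statement required before upload."""
    uploader_email: str
    email_verified: bool            # operator must verify the email before permitting upload
    all_adults_at_creation: bool    # every person depicted was an adult at creation
    consented_to_depiction: bool
    consented_to_upload: bool
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @property
    def retain_until(self) -> datetime:
        # Approximate seven-year retention window (ignores leap-day edge cases).
        return self.submitted_at + timedelta(days=365 * RETENTION_YEARS)


def may_permit_upload(cert: UploadCertification) -> bool:
    """Gate an upload on email verification plus every element of the certification."""
    return (cert.email_verified
            and cert.all_adults_at_creation
            and cert.consented_to_depiction
            and cert.consented_to_upload)


cert = UploadCertification(
    uploader_email="uploader@example.com",
    email_verified=True,
    all_adults_at_creation=True,
    consented_to_depiction=True,
    consented_to_upload=True,
)
print(may_permit_upload(cert))  # True only when every element is affirmed
```

The retained record itself is what creates the presumption-rebutting evidence trail, which is why the sketch stores the timestamp and a retention horizon alongside the certification fields.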
Enforcement: private suits and prosecutor actions
Depicted individuals gain a private right of action against both operators and uploaders; remedies include actual or statutory damages up to $75,000 per violation (whichever is greater), punitive damages, attorney’s fees, and injunctive relief. A public prosecutor may also sue for civil penalties of $25,000 per violation and equitable relief. The section also defines per‑day liability for content that remains accessible after a 48‑hour removal window, dramatically increasing exposure for delayed takedowns.
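To see how the per-day rule compounds exposure, a back-of-the-envelope calculation helps. This is an illustration under stated assumptions only: the statute's triggering mechanics are not fully specified (see Key Issues below), so the sketch counts one initial violation plus one per full calendar day past a 48-hour window, each at the $75,000 statutory ceiling. A court's actual award could differ substantially.

```python
STATUTORY_MAX_PER_VIOLATION = 75_000  # private statutory damages ceiling per violation
PROSECUTOR_PENALTY = 25_000           # civil penalty per violation in a prosecutor suit


def max_private_exposure(days_online_after_notice: int) -> int:
    """Upper-bound statutory damages for one item, assuming each full
    calendar day beyond the 48-hour window is a separate violation."""
    extra_days = max(0, days_online_after_notice - 2)  # 48 hours ~ 2 full days
    violations = 1 + extra_days  # the initial violation plus one per extra day
    return violations * STATUTORY_MAX_PER_VIOLATION


# An item left up for 10 days after notice: 8 full days past the window,
# so 9 violations at the statutory ceiling.
print(max_private_exposure(10))  # 675000
```

Even under conservative counting assumptions, a single delayed takedown can push theoretical exposure well into six figures, which is the practical force behind the 48-hour window.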
Cumulative remedies and fiscal language
The statute makes remedies cumulative (it doesn’t preclude other legal claims) and includes standard state fiscal language about local costs tied to new infractions. Practically, cumulative remedies mean plaintiffs can pursue this statute alongside other state claims (civil or criminal) unless another law expressly preempts recovery.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Depicted individuals who did not consent or were minors: They gain a clear statutory claim, access to statutory damages up to $75,000 per violation, attorney’s fees, and injunctive relief to remove images and seek compensation.
- Prosecutors and public enforcers: The bill creates a civil enforcement tool with a $25,000 per‑violation penalty and injunctive authority to pursue platforms that repeatedly host nonconsensual or underage content.
- Competing platforms that already verify age/consent: Sites that maintain robust verification workflows will gain a competitive advantage because they can rely on their documented processes to rebut presumptions of liability.
Who Bears the Cost
- Operators of pornographic websites and marketplaces: They must build or buy verification systems, change upload flows to collect and store certifications, manage takedown workflows tied to the 48‑hour standard, and absorb higher litigation risk and potential statutory damages.
- Independent creators and sex workers who upload content: The certification and potential identity demands can erode anonymity, increase friction to publish, and create privacy and safety risks—some creators may be unable or unwilling to comply and lose distribution channels.
- Platform compliance and moderation teams (and their vendors): Expect increased staffing, legal review, ID/consent verification vendor costs, and records‑management burdens tied to seven‑year retention and potential subpoena or discovery in litigation.
Key Issues
The Core Tension
The bill tries to protect victims of nonconsensual or underage sexual imagery by shifting verification duties to platforms. Yet doing so compels platforms to collect and hold sensitive proof of age and consent and to adopt cautious moderation practices that may reduce legitimate expression and anonymity. The statute thus trades stronger victim remedies against increased privacy, operational, and economic burdens on platforms and lawful creators.
The bill blends bright‑line procedural requirements (a written uploader certification, email verification, seven‑year retention) with open‑ended legal standards (ordinary care and reasonable diligence). That mix creates a regime that is both enforceable and uncertain: operators can be penalized for failing to collect a certification under a presumption rule, but the statute leaves it to courts to define what alternative verification steps are sufficient to rebut that presumption.
The result will be heavy litigation over what counts as reasonable diligence and whether specific vendor checks, contractual warranties, or spot audits meet the standard.
There are practical and privacy tradeoffs. Requiring sites to retain certification records for seven years creates a trove of sensitive data tied to sexually explicit material that increases breach‑risk, triggers privacy compliance obligations (and potential civil discovery), and may deter uploaders who fear loss of anonymity or retaliation.
The inclusion of AI‑generated or heavily digitized images expands coverage but raises thorny evidentiary questions about proving age and consent when no physical model exists. Finally, the statute references a 48‑hour removal window and imposes per‑day liability beyond it, but it does not clearly spell out the triggering mechanics for that window or the operator’s specific removal obligations, producing an implementation gap that courts or regulators will need to fill.