California SB1276 defines ‘sexual abuse’ and expands sexual‑exploitation definitions to cover digital and AI depictions

Sets statutory definitions for sexual assault, sexual exploitation, and commercial sexual exploitation—explicitly covering digitally altered and AI‑generated depictions of minors and clarifying certain age‑and‑consent exclusions.

The Brief

SB1276 adds a consolidated definition of “sexual abuse” to the California Penal Code and then defines its two components: “sexual assault” and “sexual exploitation.” The text cross‑references existing sex‑offense statutes to enumerate acts that qualify as sexual assault, sets narrow exclusions for certain voluntary conduct among minors, and specifies acts that constitute sexual exploitation, including producing, duplicating, accessing, or exchanging images that depict minors in obscene sexual conduct. The bill also defines “commercial sexual exploitation” to include child sex trafficking and exchanges of food, shelter, or payment for sexual acts.

This is primarily a definitional bill: it does not create new offense headings but reshapes the statutory language courts, prosecutors, and investigators will use to charge, evaluate, and prove child sexual‑abuse cases. Its notable innovations are the express inclusion of digitally altered and AI‑generated depictions of minors and a clarified carve‑out for consensual conduct between peers (with an adult/child age cutoff).

Those changes will affect charging decisions, evidence handling for digital media, platform content moderation practices, and defense strategies.

At a Glance

What It Does

The bill defines “sexual abuse” to mean either sexual assault or sexual exploitation, then lists specific acts that qualify as sexual assault and sexual exploitation. It makes clear that creating, duplicating, downloading, streaming, accessing, or exchanging images depicting minors in obscene sexual conduct—including digitally altered or AI‑generated matter—falls within sexual exploitation, subject to limited law‑enforcement exceptions.

Who It Affects

Prosecutors and defense counsel who litigate child sexual‑abuse cases, law enforcement units that investigate digital evidence, online platforms that host user content, and residential care institutions and parents referenced as persons responsible for a child’s welfare.

Why It Matters

By consolidating definitions and expressly covering digital and AI‑generated depictions, the bill narrows interpretive gaps that prosecutors and courts have had to work around and forces platforms and investigators to treat a broader set of digital media as actionable. That changes evidentiary and operational practices without creating new categories of criminal conduct.

What This Bill Actually Does

SB1276 creates a single statutory anchor—“sexual abuse”—and tells practitioners to read it as either sexual assault or sexual exploitation. Instead of inventing new crimes, the bill sets out what counts under those labels by pointing to existing penal sections and by listing concrete behaviors that will be treated as qualifying conduct.

That approach pulls together scattered definitions so charging documents and caselaw have a uniform starting point.

On sexual assault, the bill references a long list of existing Penal Code sections (rape, statutory rape, incest, sodomy, oral copulation, lewd acts on a child, sexual penetration, and child molestation) and supplies an explicit list of examples: penetration (however slight), oral‑genital contact, intrusion with an object, intentional touching of intimate parts for sexual arousal, and a perpetrator masturbating in the presence of a child. It also adds an exclusion: certain voluntary conduct that otherwise violates specific sections will not count as sexual assault if there are no indicators of abuse—except where the older‑partner/young‑minor scenario applies (a person 21 or older and a minor under 16).

For sexual exploitation, the bill covers both the supply side (preparing, selling, distributing obscene matter; employing a minor to perform obscene acts; inducing a child into prostitution or obscene live performances) and the media side (depicting or knowingly developing, duplicating, downloading, streaming, accessing, or exchanging images or recordings showing a minor engaged in obscene sexual conduct).

Critically, SB1276 expressly includes digitally altered and AI‑generated matter depicting persons under 18 engaged in obscene sexual conduct. The text preserves explicit exceptions already in Penal Code section 311.3 for law enforcement and prosecution activities.

Finally, the bill defines “commercial sexual exploitation” by tying it directly to the sexual trafficking provision in Section 236.1 and by adding language that makes it an exploitative act when anyone furnishes food, shelter, or payment to a child in exchange for sexual acts.

That wording aligns the commercial‑exchange concept with both trafficking prosecutions and common‑sense evidence of exploitation in transactions that fall short of formal trafficking charges.

The Five Things You Need to Know

1. SB1276 defines “sexual abuse” to mean either “sexual assault” or “sexual exploitation” as newly elaborated in the statute.

2. The bill enumerates specific acts that qualify as sexual assault, including any penetration, oral‑genital contact, object intrusion, intentional touching of intimate parts for arousal, and masturbating in a child’s presence.

3. It excludes certain voluntary conduct that violates specified Penal Code sections from the sexual‑assault label when there are no indicators of abuse—except when the perpetrator is 21 or older and the victim is under 16.

4. The statute expressly includes creating, developing, duplicating, downloading, streaming, accessing, or exchanging images depicting minors in obscene sexual conduct—explicitly covering digitally altered and AI‑generated matter—as sexual exploitation, with law‑enforcement exceptions tied to section 311.3.

5. “Commercial sexual exploitation” is defined to include child sexual trafficking under Section 236.1 and any provision of food, shelter, or payment to a child in exchange for sexual acts described in the bill.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

11165.1 (a)

Sexual assault: cross‑reference and age carve‑out

Subsection (a) anchors the definition of “sexual assault” to a long list of existing Penal Code offenses (rape, statutory rape, incest, sodomy, oral copulation, lewd acts on a child, sexual penetration, child molestation). That cross‑referencing means prosecutors don’t need to prove a novel statutory element—they prove the underlying offense—while defense counsel must attend to how those underlying statutes interact with this article. The subsection’s exclusion for voluntary conduct removes certain consensual peer activity from the article’s reach unless the conduct involves a 21‑plus individual and a minor under 16, which creates a bright‑line risk threshold for older‑partner cases.

11165.1 (b)

Enumerated conduct that qualifies as sexual assault

Subsection (b) lists concrete acts—penetration, oral contact, object intrusion, intentional touching of intimate parts, and masturbation in the child’s presence—and supplies limited exceptions: acts for valid medical purposes, normal caretaker interactions, and demonstrations of affection. That text is operational: it narrows disputes over which types of touching qualify and flags commonly litigated defenses (medical procedure, caregiver duties). For prosecutors, the list supports charging decisions; for investigators, it clarifies which behaviors to document.

11165.1 (c)

Sexual exploitation: prostitution, obscene performances, and media

Subsection (c) covers conduct that induces a child into prostitution or obscene performances and criminalizes the production and promotion of obscene material involving minors. Crucially, it expands the media prong beyond physical film or photographs to expressly include digital behaviors—developing, duplicating, downloading, streaming, accessing, or exchanging—and it names digitally altered and AI‑generated matter depicting persons under 18 in obscene conduct. The provision preserves the statutory exceptions for law‑enforcement uses specified elsewhere, but otherwise signals that a wide range of digital activity will qualify as sexual exploitation.

11165.1 (d)

Commercial sexual exploitation: trafficking and exchanges

Subsection (d) ties commercial exploitation to two concrete pathways: sexual trafficking under Section 236.1 and transactional exploitation where food, shelter, or payment is provided in return for sexual acts. This dual framing lets prosecutors pursue either trafficking enhancements or straightforward exploitation charges when financial or in‑kind inducements are present, broadening the tools available to target exploitative arrangements that fall below trafficking thresholds.

Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Child victims and their advocates — They gain clearer statutory language to describe harms and broader coverage of digital and AI‑generated depictions, closing gaps where exploitative images were previously treated ambiguously.
  • Prosecutors and child‑abuse units — The consolidated definitions and explicit media provisions simplify charging choices and provide textual support to treat certain digital conduct as criminal exploitation.
  • Law enforcement digital evidence teams — The bill’s specific enumeration of digital acts (download, stream, access, exchange) clarifies investigative avenues and justifies forensic processes for seizing and analyzing digital media.
  • Child welfare and residential‑care oversight bodies — By naming persons responsible for a child’s welfare in the exploitation context, the bill strengthens the basis for internal investigations and administrative actions against caregivers who permit or encourage abuse.

Who Bears the Cost

  • Online platforms and content hosts — Platforms will face increased pressure to detect and remove a broader set of images (including AI‑altered material) and to preserve and produce evidence for investigations, raising moderation and compliance costs.
  • Defense counsel and accused individuals — Broader definitions and the inclusion of digital/AI content may expose defendants to more charges based on non‑traditional media and increase litigation over how to prove age and obscenity in manipulated images.
  • Small residential institutions and caregivers — Facilities and staff may face criminal exposure if a court interprets “permits or encourages” broadly, creating compliance costs and potential staffing or screening burdens.
  • Courts and prosecutors’ offices — These offices can expect an increased caseload tied to digital evidence disputes, threshold motions over AI content, and the need for technical experts, which can strain resources.

Key Issues

The Core Tension

The central dilemma is protecting children from modern forms of sexual exploitation—especially digitally created or circulated material—while avoiding overbroad criminalization that chills legitimate research, journalism, or defensive platform moderation. Tightening definitions helps enforcement, but it transfers complexity to courts, investigators, and private platforms, and the statute offers no clean operational answers.

The bill’s inclusion of digitally altered and AI‑generated matter closes an important substantive gap but creates practical dilemmas. First, proving that a digital file depicts a real minor versus a synthetically generated likeness can be technically and legally fraught; prosecutors will need reliable forensic methods and expert testimony, and courts will face new threshold battles over admissibility and whether AI‑generated material is functionally equivalent to images of real children.

Second, the statute uses terms like “obscene sexual conduct” and “indicators of abuse” without defining them here, meaning courts must import standards from other sections and case law—an ambiguity that can produce uneven application across jurisdictions.

Operationally, platforms and intermediaries will shoulder much of the compliance burden without explicit safe‑harbor guidance. The statute criminalizes a wide range of digital actions (accessing, downloading, streaming, exchanging), which could sweep in ordinary user behavior or investigative research absent clear intent standards.

The exceptions for law enforcement align with existing practice, but the bill does not set out protections for researchers, child‑protection NGOs, or journalists who may handle sensitive material as part of lawful work. Finally, the 21/under‑16 exclusion for voluntary conduct narrows exposure for some peer activity but raises edge cases (e.g., 20 and 15) that remain unresolved by the text.
