Codify — Article

California SB 1159 carves AI out of statutory definitions of “person”

The bill bars AI systems, robots, and autonomous agents from qualifying as a “person” or similar public participants under several key California statutes, reshaping who can request records and participate in government processes.

The Brief

SB 1159 adds Section 17.5 to the California Government Code to specify that terms such as “person,” “interested person,” “participant,” and “member of the public” in a set of California statutes do not include artificial intelligence systems, autonomous agents, robots, or other nonhuman entities, whether physical or digital. The statutes named in the bill are the California Public Records Act, the Bagley-Keene Open Meeting Act, the Ralph M. Brown Act, the Political Reform Act of 1974, the Administrative Procedure Act, and the California Environmental Quality Act (CEQA).

The change is narrowly technical but consequential: it removes any statutory argument that a machine can claim the rights or duties accorded to a human “person” under those laws. That affects who can lodge public-records requests, participate in meetings, file CEQA comments or appeals, or trigger administrative procedures that depend on status as a “person.” The bill also includes legislative findings intended to satisfy the California Constitution’s requirement for changes affecting local agency access obligations.

At a Glance

What It Does

The bill amends the Government Code by adding Section 17.5 to say that the defined statutory terms for people who may engage with government do not include AI systems, autonomous agents, robots, or other nonhuman entities, physical or digital. It lists six existing bodies of law to which the exclusion applies.

Who It Affects

State and local public agencies that administer CPRA, Bagley‑Keene, Brown Act, Political Reform Act, APA, and CEQA; private parties and vendors that operate or deploy AI-driven bots or autonomous systems that interact with government; and members of the public who use automated tools to engage with government processes.

Why It Matters

By closing a potential statutory door for nonhuman actors, the bill reduces the risk that automated systems can claim procedural rights or trigger obligations intended for humans or legal persons. That reshapes compliance, records handling, and participation rules across open‑government and administrative processes in California.


What This Bill Actually Does

SB 1159 answers a straightforward but increasingly urgent question: when a law uses words like “person,” “interested person,” “participant,” or “member of the public,” does that include an artificial intelligence system? The bill’s single, targeted move is to say “no” for a set of six California statutes that govern public access, meetings, campaign and ethics rules, administrative procedure, and environmental review.

Those statutes have long used broad definitions of “person”; SB 1159 narrows the practical reach of those terms by excluding nonhuman entities explicitly.

Practically, the exclusion means agencies can treat submissions, requests, or comments that originate from machines differently from those originating from natural persons or legal entities. It removes a statutory basis for an AI system to claim the procedural status that triggers disclosure obligations, meeting participation rights, CEQA standing, or administrative review rights.

The bill explicitly covers both physical robots and purely digital autonomous agents, and it instructs agencies and courts to read the named terms in those statutes without counting nonhuman systems among eligible actors. The bill also contains legislative findings targeted at satisfying a California constitutional requirement for changes that affect local agency obligations regarding access to records and meetings. Those findings are a technical but necessary step to ensure the enactment operates consistently with the state constitutional standard that protects public access.

The text refers to “artificial intelligence, as defined,” but the provided excerpt does not include the bill’s operative statutory definition, which leaves a practical question about thresholds and edge cases for implementation. Because SB 1159 narrows definitional reach rather than creating new permissions or a new regulatory regime, its immediate effect will be on how governments, vendors, and litigants interpret standing and participation across a cluster of statutes, not on creating new enforcement pathways. That makes the bill most consequential for compliance officers and legal teams, who must revise policies, forms, and intake procedures to reflect the exclusion.

The Five Things You Need to Know

1

SB 1159 adds a new Section 17.5 to the Government Code that excludes AI systems, autonomous agents, robots, and other nonhuman entities from statutory terms like “person,” “interested person,” “participant,” and “member of the public.”

2

The exclusion applies explicitly to six California laws: the California Public Records Act, the Bagley‑Keene Open Meeting Act, the Ralph M. Brown Act, the Political Reform Act of 1974, the Administrative Procedure Act, and CEQA.

3

The bill covers both physical machines (robots) and purely digital systems (autonomous agents and AI systems), stating the exclusion applies whether entities are physical or digital.

4

SB 1159 includes legislative findings intended to satisfy the California Constitution’s requirement for statutes that amend public‑access laws affecting local agencies’ obligations.

5

The excerpt references “artificial intelligence, as defined,” but the provided bill text does not include the operative statutory definition, leaving implementation details open until that definition is visible.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 17.5(a)

Scope — statutes to which the exclusion applies

This paragraph lists the specific statutes to which the new exclusion applies: the California Public Records Act, Bagley‑Keene, the Ralph M. Brown Act, the Political Reform Act, the Administrative Procedure Act, and CEQA. For compliance teams, this is the operative sweep — changes will be required in procedures and forms tied to each named statute rather than across the entire Government Code. The practical implication is targeted administrative work for agencies that handle records requests, public meeting participation, campaign filings, administrative rulemaking, and environmental review.

Section 17.5(b)

Substantive exclusion of nonhuman actors

This provision declares that terms referring to eligible participants—“person,” “interested person,” “participant,” “member of the public,” and similar expressions—do not include artificial intelligence systems, autonomous agents, robots, or other nonhuman entities, whether physical or digital. Operationally, that lets agencies refuse to treat machine-originated inputs as triggering statutory duties or rights reserved for human or legal persons. It also preempts arguments that automated systems could stand in for natural persons for purposes such as requesting records or formally participating in meetings.

Section 17.5(c)

Breadth of excluded terms

The bill’s language covers a range of statutory terms—beyond just the single word “person”—so the exclusion reaches procedural concepts like who may appeal, who may submit comments, and who qualifies for notice or party status under the listed statutes. This matters because many statutory triggers depend on being an “interested person” or “participant”; the exclusion eliminates a route for machines to claim those procedural designations.

Section 17.5(d)

Legislative findings and constitutional compliance

SB 1159 includes findings declaring that the enactment furthers the constitutional purpose of public access to government records and meetings, a required step under the California Constitution when changing statutory public‑access rules that affect local agencies. Those findings are largely procedural but necessary to insulate the amendment from constitutional challenges or to satisfy procedural prerequisites for local‑agency obligations to remain consistent with state constitutional access protections.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Human members of the public who rely on procedural protections — The bill reduces the chance that machine actors can displace human participants in processes like public comment, records requests, or CEQA participation, preserving human-centric engagement pathways.
  • State and local agencies — Agencies gain clarity and a statutory basis to screen out automated actors when administering records requests, meeting rosters, campaign reporting interactions, and CEQA participation, simplifying intake and reducing risk of automated manipulation.
  • Compliance and legal teams at regulated entities — The bill narrows the scope of who can claim procedural rights under the named statutes, making it easier for lawyers and compliance officers to advise clients about eligibility and to defend administrative decisions against claims premised on machine status.

Who Bears the Cost

  • Vendors and developers of automated systems — Companies that operate bots, autonomous agents, or AI-driven advocacy tools lose a statutory footing to act as formal participants or to file requests and may need to redesign products or add human wrappers to access government processes.
  • Advocacy groups and researchers using automation — Groups that use automated scraping, bulk requests, or automated comment-generation tools may see their workflows limited or forced to ensure a human sign‑off to meet statutory thresholds.
  • Public‑facing agency staff and IT teams — Agencies must update policies, forms, intake systems, and verification processes to detect and document human versus nonhuman submissions, creating administrative cost and potential friction for legitimate hybrid human‑AI submissions.

Key Issues

The Core Tension

The central tension is between protecting public processes from automated manipulation and preserving broad, practical access to government: the bill gives governments a clear tool to exclude nonhuman actors, but doing so risks excluding beneficial automated assistance and forces agencies to build verification systems that may burden legitimate human participants.

The bill is tightly focused but raises several practical implementation questions. The text provided refers to “artificial intelligence, as defined,” but does not include the operative definition in the excerpt; how broadly or narrowly that definition reads will determine many edge cases—for example, whether AI‑assisted human submissions (a human using an AI drafting tool) count as human, and whether automated processes that act on behalf of a legal person are excluded.

Agencies will need clear guidance to avoid over‑broad exclusions that block legitimate public engagement.

Detecting and proving that a submission is nonhuman creates enforcement and operational burdens. Agencies will face choices about verification (e.g., requiring attestations, CAPTCHA, or identity proofing), which could raise privacy and administrative‑cost concerns and might slow genuine public participation.
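To make the intake-screening problem concrete, the sketch below models a submission record with a human-attestation flag and applies an SB 1159-style eligibility check. The schema, field names, and screening logic are hypothetical illustrations of one approach an agency might take; they are not drawn from the bill text or any actual agency system.

```python
from dataclasses import dataclass

@dataclass
class Submission:
    """A public-comment or records-request intake record (hypothetical schema)."""
    requester_name: str
    human_attestation: bool   # signer affirms a natural person or legal entity originated the request
    automated_origin: bool    # intake tooling flagged the submission as machine-generated

def qualifies_as_person(sub: Submission) -> bool:
    """Treat a submission as coming from a statutory 'person' only when it is
    attested as human-originated and not flagged as purely automated."""
    return sub.human_attestation and not sub.automated_origin

# Example intake decisions
requests = [
    Submission("Jane Doe", human_attestation=True, automated_origin=False),
    Submission("records-bot-7", human_attestation=False, automated_origin=True),
]
eligible = [s.requester_name for s in requests if qualifies_as_person(s)]
print(eligible)  # → ['Jane Doe']
```

Note the design trade-off the article identifies: any such check depends on self-attestation or detection heuristics, so it shifts the hard problem from statutory interpretation to identity verification.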

There is also a risk that actors will impersonate humans to evade the exclusion, shifting the problem from statutory interpretation to identity verification and fraud prevention.

Finally, narrowing statutory definitions alters litigation strategy and standing doctrines tied to CEQA and administrative challenges. Removing machines from the statutory class of “persons” cuts off a theoretical vector for automated legal actions, but it does not resolve how courts will treat AI‑generated submissions attached to human plaintiffs, or whether other statutes or common‑law rights will fill the gap.

Those unresolved interactions make the bill a partial fix rather than a comprehensive governance framework for AI in public processes.
