AB 1609 creates baseline consumer protections for interactions with customer-service chatbots used by large companies that make those systems available to people in California. The bill defines key terms, requires businesses to disclose when a user is interacting with an automated system, and mandates that eligible firms provide an accessible way for customers to reach a human agent during specified hours.
The measure targets firms meeting a national revenue threshold and pairs transparency rules with operational requirements (including maximum wait/hold times in the statutory text). Enforcement is limited to the Attorney General and district attorneys, with civil penalties for violations and no private right of action.
The draft contains several internal drafting inconsistencies—most notably multiple, conflicting time figures and overlapping definitions—that will matter at implementation.
At a Glance
What It Does
Requires large private businesses (>$500 million annual revenue) that offer customer-service chatbots to Californians to disclose that the system is not human and to provide a clear, persistent option to reach a human agent during core business hours. The text sets wait-time and hold-time limits in multiple places and authorizes civil penalties enforced by the Attorney General or a district attorney.
Who It Affects
Large national firms making chatbots available to California residents, their contact-center vendors, and third‑party chatbot providers. It also affects state enforcement offices that will interpret and enforce the statute.
Why It Matters
This bill would be among the first state-level laws to couple mandatory AI disclosure with enforceable service-level commitments for consumer-facing bots, forcing operators to provision human support or redesign their customer routing to meet California-specific standards.
What This Bill Actually Does
AB 1609 builds a consumer-facing rule set around customer-service chatbots. It starts with a definitions section that attempts to capture ‘‘artificial intelligence,’’ ‘‘customer service chatbot,’’ ‘‘customer,’’ and a revenue-based scope marker called ‘‘large private business’’ (entities with more than $500,000,000 in gross annual revenue nationally).
The definitions also carve out ‘‘extraordinary or emergency situations’’ and exclude services regulated under the California Public Utilities Commission’s General Order 133.
The bill requires that any chatbot interaction that a reasonable person could mistake for a human must carry a clear and conspicuous disclosure that the user is interacting with an automated system. For voice-only interfaces that means an audible notice that can be repeated on request; for other media the disclosure must be presented in the same medium and remain accessible throughout the interaction.
The text emphasizes plain language and continuous availability of the disclosure.

On operational safeguards, AB 1609 obligates covered businesses to offer a conspicuous way for customers to contact a human customer service agent during an identified daily business window (the draft repeatedly references 8 a.m.–6 p.m.). Once a customer requests human help, the business must make a good‑faith effort to connect the customer to an agent within the timeframes stated in the text.
The bill also puts limits on telephonic holds and requires online platforms that offer services for California customers to provide telephonic support and post a phone number prominently.

Enforcement is reserved to state public prosecutors: the Attorney General and district attorneys can bring actions and seek civil penalties up to $10,000 per violation. The statute expressly disallows a private right of action, allows the Attorney General to issue implementing regulations, and suspends enforceability during ‘‘extraordinary or emergency situations.’’ The draft also contains several conflicting numeric edits and duplicated or overlapping terms, which will require clarification before agencies or courts can reliably apply the obligations in practice.
The Five Things You Need to Know
The bill applies only to ‘‘large private businesses’’ defined as entities with more than $500,000,000 in gross annual revenue nationally that make a chatbot available to a person in California.
It requires a clear, conspicuous disclosure that a customer is interacting with an artificial intelligence system or chatbot; for audio‑only interfaces the disclosure must be audible and repeatable on request.
During the bill’s core business window (referenced as 8 a.m.–6 p.m. daily), covered firms must provide a conspicuous customer feature to request human assistance and make a good‑faith effort to connect the requester to a human agent within the timeframes stated in the draft (the text includes 15‑minute connection targets in multiple places).
For telephonic platforms the statute sets per-hold and cumulative hold limits in the draft language (the text bars placing a caller on hold for more than 15 minutes at a time after the call is answered and caps cumulative hold time for a call at one hour), and it requires that customers who reach a chatbot by phone be given the option to request a human agent.
Enforcement is limited to the Attorney General and district attorneys, with civil penalties up to $10,000 per violation, no private right of action, and a specific carve-out making the chapter unenforceable during declared extraordinary or emergency situations.
Section-by-Section Breakdown
Definitions and scope
This section sets out the bill’s operative vocabulary: ‘‘artificial intelligence,’’ ‘‘customer,’’ ‘‘customer service agent,’’ ‘‘customer service chatbot,’’ ‘‘extraordinary or emergency situations,’’ and—critically—‘‘large private business’’ (the revenue threshold). The draft contains duplicated and partially inconsistent clauses (for example, stray references to ‘‘operator’’ alongside ‘‘large private business’’ and two subparagraph labels both marked (b)), which creates uncertainty about who precisely is covered and whether smaller firms are implicated by any stray terminology.
Mandatory chatbot disclosure rules
This provision requires businesses to disclose that a user is interacting with a chatbot or automated AI system when a reasonable person could be misled into thinking the counterpart is human. The disclosure must be clear, conspicuous, and presented in the same medium as the interaction; for voice interfaces it must be audible and repeatable upon request, and it must remain readily accessible during the exchange. Practically, companies will need to bake persistent disclosures into UI flows and IVR scripts and maintain transcripts or logs showing the notice was presented.
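To illustrate the kind of record-keeping this provision implies, here is a minimal sketch of a disclosure log. The session structure, channel names, and log format are hypothetical illustrations, not anything specified in the bill; the point is simply that a compliant operator would timestamp each notice and tie it to the medium in which it was shown.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class DisclosureLog:
    """Append-only record showing the automated-system notice was presented."""
    events: list = field(default_factory=list)

    def record(self, channel: str, text: str) -> None:
        # Store the channel (e.g. "web", "ivr"), the exact notice text,
        # and a UTC timestamp, so the operator can later demonstrate the
        # disclosure was presented in the same medium as the interaction.
        self.events.append({
            "channel": channel,
            "text": text,
            "shown_at": datetime.now(timezone.utc).isoformat(),
        })

    def disclosed_on(self, channel: str) -> bool:
        # A session should not deliver bot messages on a channel
        # until this returns True for that channel.
        return any(e["channel"] == channel for e in self.events)
```

In practice the same check would gate an IVR script (play the audible notice, log it, then proceed) and a web widget (render the persistent banner, log it, then open the chat).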
Human‑assistance and hold‑time requirements
This section imposes operational obligations during the statute’s core business window (the draft repeatedly references 8 a.m.–6 p.m.). Covered firms must provide a conspicuous means to request human customer-service assistance, give an estimated wait time, and allow customers to accept immediate connection or schedule an appointment. The statute sets time-based performance targets for connecting customers to humans and contains telephonic rules limiting how long a customer can be placed on hold (the draft text includes per-hold and cumulative caps). The language instructs firms to offer multiple contact channels (text, email, telephone) and to post telephone numbers prominently on websites that serve California customers.
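The timing rules above can be sketched as simple compliance checks. Because the draft's numbers conflict, the figures below (a 15-minute per-hold cap, a 60-minute cumulative cap, and the 8 a.m.–6 p.m. window) are taken from one set of the draft's edits purely for illustration and should be treated as placeholders, not settled law.

```python
from datetime import time

# Placeholder figures from one of the draft's (conflicting) numeric edits.
PER_HOLD_CAP_MIN = 15       # max minutes for any single hold after answer
CUMULATIVE_CAP_MIN = 60     # max total hold minutes across one call
WINDOW_START, WINDOW_END = time(8, 0), time(18, 0)  # 8 a.m.-6 p.m.


def in_business_window(t: time) -> bool:
    """True if a human-assistance request falls inside the core window."""
    return WINDOW_START <= t <= WINDOW_END


def holds_compliant(hold_minutes: list[float]) -> bool:
    """Check each individual hold and the call's cumulative hold time."""
    return (all(h <= PER_HOLD_CAP_MIN for h in hold_minutes)
            and sum(hold_minutes) <= CUMULATIVE_CAP_MIN)
```

For example, three holds of 10, 12, and 14 minutes would pass both tests, while a single 20-minute hold would fail the per-hold cap even though the cumulative total stays under an hour.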
Enforcement, penalties, and exemptions
Enforcement is delegated to the Attorney General and district attorneys; the bill provides civil penalties not exceeding $10,000 per violation and expressly denies a private right of action. The Attorney General may promulgate implementing regulations and guidance. The chapter is suspended during ‘‘extraordinary or emergency situations’’ and excludes services compliant with the California PUC’s General Order 133. It also provides a defense where a large private business cannot comply due to unforeseen circumstances beyond its reasonable control.
Interaction with existing law
This short section confirms that the duties in the chapter are cumulative and do not relieve covered entities from other legal obligations. That keeps existing consumer‑protection, privacy, and telecommunications rules in play and signals that firms must comply with overlapping regulatory frameworks in designing chatbot disclosure and routing systems.
Who Benefits and Who Bears the Cost
Who Benefits
- California consumers who prefer or require human assistance — they gain an explicit mechanism to request human support and statutory expectations about how quickly they should be connected.
- Consumers with accessibility or language needs — the bill’s disclosure and channel‑choice requirements (including audible, repeatable notices and the option to select text, email, or phone) create clearer routes to human help that can assist users who struggle with automated interfaces.
- Consumer protection enforcers and advocacy groups — the Attorney General and district attorneys receive a statutory hook to pursue company practices around chatbot transparency and responsiveness.
- Customer-service employees — firms facing new legal minimums may be less able to replace human agents entirely with automation, reinforcing demand for live agents or higher-skilled escalation staff.
Who Bears the Cost
- Large private businesses (>$500M revenue) that make chatbots available to Californians — they must implement disclosures, staffing or queueing systems to meet connection and hold-time targets, and maintain records to demonstrate compliance.
- Third‑party chatbot vendors and contact-center providers — these vendors will need to adapt products and service-level agreements to supply the disclosure mechanics, logging, and routing capabilities their covered-business clients need in order to comply.
- State enforcement offices — the Attorney General and district attorneys will need to interpret ambiguous drafting, develop compliance guidance or regulations, and allocate resources to investigate and litigate violations.
- Operational budgets and possibly consumers — firms may incur higher labor or contracting costs to meet human‑access commitments and may pass some costs through to customers via prices or service changes.
Key Issues
The Core Tension
The central dilemma is balancing consumer expectations for reliable human access and transparent AI interactions against businesses' need to scale automated support and control costs. The bill privileges low-latency human connection as a consumer right, but guaranteeing that right at the scale of large national providers requires more staffing, reengineered routing systems, or potentially higher prices. There is no cost-free way to deliver both instant human service and the efficiency advantages companies seek from automation.
The draft contains drafting collisions that matter in practice. It alternates between the terms ‘‘operator’’ and ‘‘large private business,’’ and it includes multiple, inconsistent numeric edits for connection and hold times (5 minutes, 15 minutes, 10 minutes, and one hour appear in different places).
Those internal inconsistencies create real legal uncertainty about which timing standards apply and which entities are covered. Implementing agencies will need to resolve whether the shorter or longer time frames govern and whether each customer interaction or each pattern of conduct counts as a separate ‘‘violation’’ for penalty purposes.
Operationally, the statute forces firms to choose among increasing live‑agent staffing, more sophisticated callback/appointment systems, or routing designs that could be gamed to appear compliant (for example, giving an ETA or scheduling an appointment instead of providing immediate assistance). The PUC exemption and the emergency carve‑out create additional edge cases: firms that straddle regulated telecom services and consumer-facing platforms will need to determine exactly when General Order 133 applies, and the emergency exception could swallow enforcement during rolling outages or public-safety power shutoffs if not narrowly construed.
Finally, cross-border application remains an open question; an out-of-state company that makes a chatbot ‘‘available to a person in the state’’ may be bound by these rules, raising jurisdictional and notice‑to‑users issues.