AB 2412 amends Government Code section 11549.66 to require California state agencies and departments that use generative artificial intelligence (GenAI) when communicating with the public to include explicit disclosures and contact information for a human employee. The bill replaces language limiting the rule to communications “regarding government services and benefits” with a broader trigger — any communication with the public — and specifies where and how disclosures must appear across media types.
Why this matters: the change expands transparency obligations from transactional, service-related messages to a potentially wide range of public-facing communications (news releases, social posts, automated answers). Agencies will need to map where GenAI is used, update templates and interfaces to show disclosures consistently across written, continuous online, audio, and video channels, and ensure links or instructions to reach a human are available and maintained.
At a Glance
What It Does
The bill requires state agencies that use GenAI in communications with the public to include a clear disclaimer that the content was generated by GenAI and to provide information or a link explaining how to contact a human employee. It prescribes media-specific placement rules for written, continuous online, audio, and video formats.
Who It Affects
State agencies and departments, their contractors and vendors who supply GenAI tools (chatbot providers, IVR vendors, video platforms), and public-facing communications teams that publish or distribute automated content. Californians who interact with state digital and media channels will see the disclosures.
Why It Matters
By widening the rule from service-focused exchanges to any public communication, the bill forces system-wide inventory, governance, and user-notice changes and creates ongoing maintenance obligations for links and contact pathways; it also establishes a de facto standard for how government must label GenAI outputs across formats.
What This Bill Actually Does
AB 2412 rewrites a single California statute to make two concrete changes: (1) expand when disclaimers are required, and (2) spell out how disclaimers must appear depending on the medium. The statutory trigger becomes any use of GenAI in communications with the public, rather than only direct, service-oriented exchanges with an individual about government benefits.
That broadening means a wide set of materials — from automated social media posts and press advisories drafted with GenAI to chatbot responses on agency websites — will fall within the rule.
The bill then details placement rules. For discrete written items (letters, emails, one-off messages), the disclaimer must appear prominently at the start.
For continuous online interactions — the typical chatbot or conversational UI that maintains an ongoing session — the bill requires the disclaimer to be visible throughout the interaction rather than only at the opening. For audio, agencies must state the disclaimer orally at both the start and the end of the interaction, a requirement that affects phone systems and voice assistants.
For video content, the disclaimer must be displayed prominently for the duration of the interaction or presentation.

Finally, AB 2412 requires that each GenAI-generated communication include either information or a link to an internet page describing how a person may contact a human employee. That provision is deliberately flexible about whether the contact detail is a phone number, a staffed chat option, or a link to a contact page, but it creates an ongoing duty to maintain a working human-contact pathway tied to any automated outreach.
The statute does not add enforcement mechanisms, fines, or implementation timelines; it focuses on disclosure placement and the human-contact requirement, leaving agencies to decide operational approaches to comply.
The Five Things You Need to Know
The bill amends Government Code section 11549.66 to expand the disclosure requirement from communications “regarding government services and benefits” to any communications with the public.
For discrete written communications (letters, email, occasional messages), the statute requires the GenAI disclosure to appear prominently at the start of the message.
For continuous online interactions, such as chatbots, the law requires the disclosure to be visible throughout the user’s session, not only at the opening prompt.
Audio interactions must include an oral GenAI disclosure both at the start and at the end of the interaction; video interactions must display the disclosure throughout the presentation.
Each GenAI-generated public communication must include information or a link to a website explaining how to contact a human employee of the agency; the bill does not define the required response time or staffing level for that human contact.
Section-by-Section Breakdown
Broadened scope: ‘communicates with the public’ replaces service-only trigger
The amendment removes the narrower trigger tied to communications “regarding government services and benefits” and instead applies the statute whenever an agency “communicates with the public” using GenAI. Practically, that change can sweep in press releases, social media, FAQs, and proactive outreach that previously fell outside the rule; agencies must therefore reassess which outputs are generated or assisted by GenAI and update policies and inventories accordingly. The provision does not add exemptions or carve-outs for specific channels.
Media-specific placement rules for GenAI disclaimers
Subsection (a) codifies four placement rules: (1) for one-off written communications, the disclaimer must appear at the start; (2) for continuous online interactions (e.g., chatbots), the disclaimer must be displayed throughout the session; (3) for audio, the disclaimer must be spoken at the start and end; and (4) for video, the disclaimer must be displayed throughout. Each rule focuses on visibility rather than precise wording, but terms like “prominently” and “throughout the interaction” introduce interpretive discretion that agencies will need to resolve in internal guidance and interface design standards.
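One way to resolve that interpretive discretion consistently is to encode the four placement rules once, in a shared content pipeline, rather than leaving each team to improvise. The sketch below is purely illustrative; the rule names, disclaimer wording, and helper functions are assumptions for demonstration, not statutory language:

```python
# Hypothetical encoding of AB 2412's four media-specific placement rules
# in an agency content pipeline. Wording and structure are illustrative.

DISCLAIMER = "This communication was generated by generative artificial intelligence."

# Media type -> where the disclaimer must appear under subsection (a).
PLACEMENT_RULES = {
    "written": "start",                # one-off letters, emails, messages
    "continuous_online": "persistent", # chatbots: visible throughout session
    "audio": "start_and_end",          # spoken at both ends of the interaction
    "video": "persistent",             # displayed for the full duration
}

def apply_written_disclaimer(body: str) -> str:
    """Prepend the disclaimer to a discrete written communication."""
    return f"{DISCLAIMER}\n\n{body}"

def audio_prompt_sequence(prompts: list[str]) -> list[str]:
    """Wrap an audio prompt flow with spoken disclaimers at start and end."""
    return [DISCLAIMER, *prompts, DISCLAIMER]
```

Centralizing the rules this way also gives internal guidance a single place to define what "prominently" means for each channel.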
Duty to provide information or link to human contact
Subsection (b) requires every GenAI-generated public communication to include either direct information or a link to a webpage describing how to contact a human employee. The statute is intentionally technology-neutral about the contact method, allowing agencies to use phone numbers, staffed chat options, ticket portals, or contact pages. However, it creates an operational obligation to maintain that pathway and ensure links are current; failure to do so could create reputational, access, and legal risks even though the bill contains no express penalty scheme.
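Because every automated communication must point to a working human-contact pathway, agencies will likely want routine automated checks that those links stay live. A minimal sketch, assuming a scheduled job pointed at the agency's real contact pages (the function name and threshold are hypothetical):

```python
# Illustrative health check for a "contact a human" link. In practice an
# agency would run this on a schedule against its actual contact pages.
from urllib.request import urlopen

def contact_link_ok(url: str, timeout: float = 5.0) -> bool:
    """Return True if the human-contact page responds with HTTP 2xx."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (OSError, ValueError):
        # Network failures, timeouts, and malformed URLs all count as broken.
        return False
```

A failed check could feed an internal alert so a broken contact link is fixed before it undermines the disclosure's purpose.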
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Californians interacting with government online — they gain clearer notice when content is machine-generated, reducing the risk of mistaking automated outputs for human statements and helping them decide whether to seek a human interlocutor.
- Journalists, researchers, and watchdogs — consistent disclosures improve transparency and make it easier to track where and how agencies rely on GenAI, supporting accountability and FOIA-like review of agency communications practices.
- Agency transparency and ethics officers — the statute creates an explicit standard they can enforce internally and use to justify resources for governance, audits, and public-facing documentation.
Who Bears the Cost
- State agencies and departments — they must inventory GenAI use, redesign templates and UI, add disclosure text to material across many channels, and maintain contact links; these are recurring operational and staffing costs.
- Vendors and contractors that supply GenAI chatbots, IVR systems, and video production — they will need to build configurable disclosure mechanisms into products and possibly support logging to prove compliance.
- Communications teams and contact centers — they may face higher volumes of ‘escalate to human’ requests and must coordinate to keep contact pathways staffed and up to date, increasing training and resource needs.
Key Issues
The Core Tension
The core tension is between transparency and usability. The bill prioritizes clear notice that a government communication was produced by GenAI, protecting the public from misleading automated content. But applying that notice universally and conspicuously can undermine efficient, scalable digital service delivery, increase operational burdens on overstretched agencies, and create user friction, especially where immediate human help is limited or impractical.
AB 2412 draws a bright line on labeling GenAI outputs but leaves multiple practical questions unresolved. The statute uses broad terms — “prominently,” “throughout the interaction,” and “communicates with the public” — that invite divergent interpretations across agencies; without implementing guidance, agencies will reach different conclusions about what content needs a label and how intrusive the disclosure must be.
The audio requirement (start and end) poses particular practical challenges: inserting spoken disclaimers into IVR systems and short voice prompts changes call flows, potentially lengthening interactions and increasing caller frustration unless carefully scripted.
The bill also imposes an operational duty to provide human-contact information without specifying staffing, response times, or verification standards. That gap creates a potential mismatch between disclosure and actual access: agencies could satisfy the letter of the law by providing a link to a general contact page while lacking the capacity to respond meaningfully.
Finally, because the bill lacks enforcement provisions, compliance will depend on agency governance, budget decisions, and external pressure; that structure risks uneven execution and potential litigation over vagueness or constitutional issues if agencies overreach in applying the rule.