The bill directs the Election Assistance Commission (EAC), within 60 days of enactment, to issue to Congress, state and local election offices, and the public a report containing voluntary guidelines for using artificial intelligence in election administration, developed with the National Institute of Standards and Technology (NIST).
The guidelines must address the risks and benefits of AI in election administration, the cybersecurity risks posed by AI technologies, how AI-generated information can affect the sharing of accurate election information, how election offices should respond, and how AI can be used to spread disinformation that undermines public trust.
In addition, the act requires a study of AI technologies used in the 2024 federal elections, conducted in coordination with NIST and due by July 31, 2026, along with a process to review and update the voluntary guidelines based on the study's findings.
At a Glance
What It Does
Within 60 days of enactment, the EAC must issue a report that includes voluntary guidelines for AI use in election administration, developed in consultation with NIST. The guidelines are publicly available and nonbinding.
Who It Affects
State and local election offices, election administrators, and cybersecurity staff; federal agencies coordinating election security; and AI vendors serving election offices.
Why It Matters
The act creates a structured, nonbinding framework to anticipate AI-related risks in elections, promote information integrity, and guide orderly adoption of AI tools based on evidence from the 2024 elections.
What This Bill Actually Does
The Preparing Election Administrators for AI Act requires the Election Assistance Commission to publish voluntary guidelines within 60 days of enactment. These guidelines, developed with the National Institute of Standards and Technology, will address how AI can be used in election administration, the cybersecurity risks posed by AI technologies, and how AI-generated information can affect the sharing of accurate election information and contribute to disinformation that erodes public trust.
Importantly, the guidelines are voluntary and intended to inform state and local election offices and other stakeholders, not to impose mandatory requirements.
Separately, the bill directs the EAC to study AI technologies used in the 2024 federal elections. The study, performed in coordination with NIST, must assess how AI was used, how AI-generated information was shared, and the implications for election administration.
Based on the study results, the EAC is to review and update the voluntary guidelines as appropriate. The overall aim is to provide a practical, evidence-based framework that keeps pace with AI developments while supporting reliable, trustworthy elections.
The Five Things You Need to Know
The bill requires the EAC to issue voluntary AI guidelines for election administration within 60 days of enactment.
Contents must address AI risks and benefits, cybersecurity risks, information sharing, and AI-driven disinformation.
Guidelines are voluntary and publicly available, not mandatory.
Section 3 requires a study of AI technologies used in the 2024 elections, due by July 31, 2026.
Study findings can prompt updates to the voluntary guidelines.
Section-by-Section Breakdown
Every bill we cover gets an analysis of its key sections.
Short title and purpose
Defines the act as the Preparing Election Administrators for AI Act and establishes its purpose: to prepare election administrators for AI by enabling a framework of voluntary guidelines that address AI use in election administration.
Voluntary guidelines for AI in election administration
Directs the EAC, in consultation with NIST, to issue a report including voluntary guidelines within 60 days of enactment. The guidelines cover the risks and benefits of AI in election administration, cybersecurity risks, and how AI-generated information can affect the sharing of accurate election information and how offices should respond to disinformation that undermines public trust.
Study on AI technologies in the 2024 elections and updating guidelines
Requires the EAC to study AI technologies used in federal elections held in 2024, with NIST, and to issue a publicly available report by July 31, 2026. The act also requires the EAC to review and update the voluntary guidelines based on study findings.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- State and local election offices gain a practical, nonbinding framework to assess and manage AI risks in administration.
- Election officials and staff benefit from clearer guidance to handle AI-enabled tasks and information flows.
- Election security professionals at federal, state, and local levels gain structured guidance for risk assessment and incident response.
- Voters benefit from improved information integrity and reduced risk of AI-driven disinformation that undermines trust in elections.
- The Election Assistance Commission (EAC) and the National Institute of Standards and Technology (NIST) have a formal mandate to develop and refine AI-related election guidance based on real-world data.
Who Bears the Cost
- State and local election offices may incur costs to train staff and adjust processes to align with the guidelines, even though adoption is voluntary.
- Taxpayers fund the study and guideline development efforts undertaken by EAC and NIST.
- EAC and NIST must devote staff time and resources to coordination, data gathering, and analysis.
- Vendors of AI-enabled election tools may face expectations to align products with guideline recommendations, potentially incurring update costs.
- State and local IT/security teams may need to allocate resources to evaluate AI applications and implement recommended safeguards.
Key Issues
The Core Tension
The central dilemma is whether to push out timely, nonbinding guidance to address AI risks now, or to wait for study-driven, potentially more robust standards that could take longer to finalize and may slow adoption. This balance between immediacy and evidence-based rigor defines the bill's policy trade-offs.
Because the guidelines are voluntary, adoption may vary across jurisdictions, potentially creating uneven readiness for AI-enabled election administration. The bill hinges on the quality and timeliness of the 2024-election study, and on whether the resulting updates to guidance keep pace with rapidly evolving AI technologies.
Additionally, the interaction between voluntary guidelines and existing laws—privacy, civil rights, and election administration statutes—raises questions about enforcement, scalability, and the risk of inconsistent standards across states.