Codify — Article

Artificial Intelligence Civil Rights Act of 2025: federal framework for algorithmic discrimination

Sets a civil-rights-style prohibition on discriminatory algorithms, creates mandatory pre-deployment and annual impact checks, public disclosures, and new enforcement tools.

The Brief

The bill defines “covered algorithms” broadly and bars developers and deployers from offering or using algorithms in ways that cause or contribute to discrimination or disparate impacts in contexts the law calls “consequential actions” (employment, housing, health care, credit, criminal justice, elections, and similar high-stakes domains). It establishes procedural obligations—evaluations before deployment, annual post-deployment assessments, public disclosures, and contract rules between developers and deployers—to detect and mitigate harms.

The measure centralizes oversight with the Federal Trade Commission (with expanded investigatory authority), preserves a private right of action, enables state attorneys general enforcement, requires transparency (public repository, short-form notices, multilingual and accessible disclosures), and directs rulemaking on human alternatives and appeals. For compliance officers and legal teams, the bill replaces informal best practices with statutory duties, prescriptive reporting deadlines, and civil liability exposure for failures to identify, mitigate, or disclose algorithmic harms.

At a Glance

What It Does

Creates a federal civil-rights protection for people affected by computational decision systems and requires developers and deployers to identify plausible harms, conduct independent audits when harm is plausible, and take reasonable mitigation steps. It mandates public, accessible disclosures and summaries of evaluations and requires deployers to perform annual impact assessments and share summaries with developers and the Commission.

Who It Affects

Developers that design or substantially modify machine-learning or other advanced computational systems and any commercial deployer that uses those systems for consequential actions (e.g., lenders, employers, housing platforms, health providers, election infrastructure vendors). It also touches auditing firms, state attorneys general, civil-rights groups, and federal agencies that use or procure algorithms.

Why It Matters

This is the first federal statute to treat algorithmic harms as a civil-rights problem with affirmative duties (not just consumer privacy or unfair practices). It shifts risk onto developers and deployers through mandatory assessments, recordkeeping, disclosures, and both public and private enforcement routes—changing how compliance, procurement, and product teams must document and govern AI systems.


What This Bill Actually Does

The bill opens by defining the universe of systems it regulates: covered algorithms include machine learning, natural language processing, and other similarly complex computational processes whenever they make, facilitate, or materially inform consequential actions that significantly affect people’s lives. “Consequential actions” is an expansive category intentionally tied to concrete domains—employment, credit, housing, health care, elections, criminal justice, benefits, utilities, and any other area the FTC later designates.

Before a covered algorithm is used in a consequential setting, the sponsoring party (developer or deployer) must do a two-step assessment. First, they run a short plausibility screen: is harm possible?

If yes, an independent auditor must perform a full pre-deployment evaluation. Those evaluations are detailed technical documents: model design and methodology, training and test datasets and metrics, representation of demographic groups, stakeholder consultation logs, outputs observed during testing, anticipated harms and disparate impacts, and concrete mitigation recommendations.
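That two-step gate can be sketched as a minimal decision flow. This is purely illustrative: the statute prescribes a process, not code, and every function and step name below is hypothetical.

```python
# Illustrative sketch of the bill's two-stage pre-deployment pathway.
# All identifiers are hypothetical; the statute defines duties, not code.

def predeployment_steps(harm_plausible: bool) -> list[str]:
    """Return the compliance steps triggered before deployment."""
    steps = ["plausibility_screen"]
    if harm_plausible:
        # A plausible-harm finding triggers a full independent-auditor
        # evaluation, a public summary, and submission to the FTC.
        steps += ["independent_audit", "public_summary", "ftc_submission"]
    return steps

# If the screen finds harm plausible, the full evaluation pathway applies;
# otherwise the screen itself is the only pre-deployment step.
full_path = predeployment_steps(True)
short_path = predeployment_steps(False)
```

The point of the sketch is the asymmetry: a negative screen ends the pre-deployment inquiry, while a positive screen commits the party to the entire audit-and-disclosure pipeline.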

Deployers likewise must assess necessity and proportionality relative to the baseline process being replaced. Both developer- and deployer-facing reports are then summarized for public consumption and submitted to the Commission.

After deployment, the deployer carries the main monitoring duty: an annual preliminary impact review that escalates to a full independent-auditor impact assessment if harms occurred or were reasonably likely.

Deployers must give developers summaries of those assessments so developers can compare field performance to lab testing and decide whether product-level fixes are needed. Parties must retain evaluations and records for ten years.

Trade secrets and personal data are protected by redaction rules, but regulatory and public-summary obligations remain.

On transparency, the bill requires clear, accessible public disclosures about an algorithm’s purpose, data categories used, transfer practices, and how individuals can exercise rights (appeal, opt-out, request human alternatives). Deployers must present a concise short-form notice (capped in length by regulation), make the full set of summaries searchable via an FTC-hosted public repository, and ensure multilingual and disability-accessible presentation.

The FTC is given rulemaking authority to define the details (templates, auditor standards, what constitutes a material change) and must study whether and how to require technical explanations of algorithmic outcomes.

Enforcement is layered. The FTC enforces the statute as unfair or deceptive practices and gains explicit jurisdictional reach.

State attorneys general can bring parens patriae suits with statutory civil penalties and restitution remedies. The law also creates a private right of action for individuals and invalidates pre-dispute arbitration or class-waiver clauses for claims under this Act, making litigation an available path for harms that slip past enforcement.

Finally, the bill funds federal capacity-building (new algorithm-auditing occupational series at OPM and up to 500 new FTC hires) and directs rulemaking timelines for human alternatives and appeals.

The Five Things You Need to Know

1

If a preliminary evaluation finds that harm is plausible, the developer or deployer must engage an independent auditor to perform a full pre-deployment evaluation before offering or deploying the covered algorithm.

2

Developers and deployers must retain all pre-deployment evaluations, impact assessments, and related reviews for at least 10 years and submit full evaluations and summaries to the FTC within 30 days of completion.

3

State attorneys general can recover civil penalties of $15,000 per violation or 4% of the defendant’s average gross annual revenue over the preceding three years, whichever is greater, and may sue as parens patriae.

4

The law invalidates pre-dispute arbitration agreements and pre-dispute joint-action waivers for claims under this Act; plaintiffs can recover treble damages or $15,000 per violation (whichever is greater) plus attorneys’ fees, punitive, and equitable relief.

5

The Commission must issue rules within two years establishing when deployers must offer human alternatives and an appeals mechanism; short-form consumer notices are limited by regulation to a concise, user-facing template (the statute requires the Commission to provide a model).
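The “whichever is greater” formulas in items 3 and 4 reduce to simple arithmetic. A minimal sketch, using the dollar figures from the summary above and made-up company numbers (this is illustration, not legal advice):

```python
# Sketch of the "whichever is greater" penalty math described above.
# Dollar figures come from the bill summary; the example inputs are invented.

def state_ag_penalty(violations: int, avg_annual_revenue: float) -> float:
    """Greater of $15,000 per violation or 4% of average gross annual revenue."""
    return max(15_000 * violations, 0.04 * avg_annual_revenue)

def private_damages(violations: int, actual_damages: float) -> float:
    """Greater of treble actual damages or $15,000 per violation."""
    return max(3 * actual_damages, 15_000 * violations)

# Hypothetical deployer: 100 violations, $50M average annual revenue.
# Here 4% of revenue ($2M) exceeds the $1.5M per-violation total.
ag_exposure = state_ag_penalty(100, 50_000_000)

# Hypothetical plaintiff class: 10 violations, $20,000 actual damages.
# Here the $150,000 statutory floor exceeds $60,000 in trebled damages.
private_exposure = private_damages(10, 20_000)
```

Because both formulas take a maximum, the revenue-based and statutory-floor prongs dominate for large firms and small-damages claims respectively, which is what gives the penalties bite at both ends of the scale.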

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 101

Prohibition on algorithmic discrimination in consequential contexts

This section makes it unlawful for a developer or deployer to offer, license, or use a covered algorithm that causes or contributes to a disparate impact or otherwise makes unavailable equal enjoyment of goods or services tied to a consequential action on the basis of a protected characteristic. It is not a pure intent standard: disparate impact is unlawful unless the developer or deployer proves necessity and lack of a less discriminatory alternative. The section contains narrow operational exceptions (research, self-testing, or diversity-expanding uses). For compliance teams, this means documenting necessity analyses and alternative assessments will be central to legal defense.

Section 102

Pre-deployment evaluations, annual post-deployment impact assessments, and recordkeeping

This procedural core sets out a two-stage pre-deployment pathway: a lightweight plausibility screen followed by a mandatory independent-auditor evaluation if harm is plausible. The statute prescribes the scope of technical disclosures auditors must examine—training datasets, performance metrics, testing across demographic groups, and stakeholder engagement—and it imposes annual deployer assessments after deployment. Reports and summaries must be submitted to the FTC and retained for ten years; summaries must also be published. Practically, product, data, and legal teams must create audit-ready artifacts and formal stakeholder logs to survive an audit.

Title II (Sections 201–203)

Operational standards, contract duties, and human alternatives

Developers and deployers must certify that benefits outweigh harms, keep algorithms aligned with advertised performance, and prohibit off-label uses (deployment outside evaluated uses). Contracts between developers and deployers must include data-processing instructions and notification provisions for material changes, and must bar mixing data collected from other clients. The Commission must write rules on when deployers must offer human alternatives and implement appeal mechanisms—these will shape customer-service and escalation workflows for regulated actors.

Section 301

Disclosure, multilingual short-form notices, and public repository

The bill mandates public, plain-language disclosures about identity, data categories, transfers, and how to exercise statutory rights. Short-form consumer notices must be concise, accessible, and provided at first interaction or on-site where no relationship exists. The FTC must build and maintain a searchable public repository for the submitted evaluations and summaries. Operationally, companies will need compliance webpages, multilingual assets (10 most-spoken U.S. languages), accessibility support, and processes to notify affected individuals of material changes.

Sections 401–403

Enforcement architecture: FTC, states, and private rights

The FTC enforces the Act as an unfair or deceptive practice with expanded jurisdictional reach; the statute preserves other agency authorities and allows the FTC to issue rules. State attorneys general can sue on behalf of residents, seek penalties and restitution, and must notify the FTC before suing (with a right for the FTC to intervene). Individuals have an express private right of action with statutory damages, and the law invalidates pre-dispute arbitration clauses for covered claims. Compliance and litigation risk therefore run on three parallel tracks—agency enforcement, state suits, and private litigation.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Individuals subjected to consequential automated decisions — they gain statutory protections against disparate impact and routes (appeal, opt-out, human alternatives) to challenge or avoid harmful algorithmic outcomes.
  • Civil-rights organizations and consumer advocates — access to audit summaries and a public repository provides evidence for systemic patterns and supports strategic enforcement and advocacy.
  • Independent auditing and compliance firms — the law creates sustained market demand for qualified independent auditors and for services to prepare, review, and remediate pre-deployment and annual assessments.
  • State attorneys general and regulators — clear statutory causes of action, defined remedies, and mandatory submissions to the FTC improve investigatory leverage and enforcement coordination.

Who Bears the Cost

  • Developers of complex models — must fund independent audits, prepare extensive technical documentation, and possibly reengineer models and datasets to meet necessity and nondiscrimination tests.
  • Deployers (especially SMEs and sector-specific vendors) — must run annual impact assessments, maintain notification systems, provide human alternatives and appeal workflows, and face contractual constraints with developers.
  • Companies with trade secrets — must balance redacting proprietary material against the statutory push for public summaries and FTC review, increasing legal and operational friction when publishing evaluations.
  • FTC and federal budget — statute authorizes significant new staffing and systems (occupational series and up to 500 hires) and requires building/maintaining a public repository, all of which need appropriations and sustained funding.

Key Issues

The Core Tension

The central dilemma is balancing civil-rights-style protection (transparency, accountability, and meaningful remedies) against practical constraints on innovation, trade-secret protection, and operational feasibility. Strong duties and public disclosures help identify and remedy harms, but they impose compliance costs and commercial risks that may concentrate liability on smaller deployers and reshape product design in ways that slow fast-moving innovation. There is no frictionless way to serve both aims simultaneously.

The bill imposes prescriptive technical and documentary obligations on both creators and users of advanced computational systems, but leaves key implementation choices to the FTC’s rulemaking. That delegation helps the statute adapt to evolving technology but also creates uncertainty during the interim: companies must anticipate standards that may require rework of models, audit processes, and contracts.

The independent-auditor requirement addresses conflict-of-interest concerns but raises questions about the supply, accreditation, and liability of auditors—an emergent market where credentialing and standards-setting will determine audit quality and cost.

Transparency requirements aim to make audits intelligible to the public, but the statutory balance between disclosure and trade-secret protection is delicate. Firms can redact trade secrets and personal data, and the FTC may withhold information under FOIA exemptions, but heavy redaction risks hollowing out the repository’s usefulness.

Similarly, protecting individuals’ privacy while mandating detailed dataset provenance and output traces will require careful technical and legal guardrails. Finally, the law’s private right of action and its ban on pre-dispute arbitration waivers create litigation risk that may accelerate defensive compliance but could also skew enforcement toward well-resourced plaintiffs and states, producing uneven outcomes across sectors.
