Codify — Article

Facial Recognition Act of 2025 narrows law‑enforcement use, adds testing, reporting, and penalties

Sets warrant-first rules and audits for law enforcement facial recognition, mandates NIST testing and bias checks, and creates suppression and civil remedies for violations.

The Brief

The Facial Recognition Act of 2025 creates a federal framework that tightly restricts when and how investigative and law enforcement officers may use facial recognition systems. It requires judicial orders (with narrow exceptions for emergencies and deaths), mandates accuracy and operational testing, creates public reporting and independent audits, and authorizes suppression and civil claims for unlawful uses.

This is a compliance‑forward bill: agencies that keep using facial recognition must redesign policies, submit to recurring testing (NIST benchmarks and independent operational tests), maintain detailed logs, and face both programmatic penalties (suspension, loss of certain DOJ grant funds) and private liability if they violate the statute. The bill also preserves state authority to set stricter rules and limits how driver’s license photos are shared for bulk searches.

At a Glance

What It Does

The bill bars most searches of reference photo databases via facial recognition unless a prosecutor obtains a court order that meets a specified evidentiary showing, with narrowly drawn exceptions (e.g., deceased persons, AMBER Alerts, booking photos, and true emergencies). It requires annual benchmark and operational testing, public reporting, and independent audits of agencies that use facial recognition.

Who It Affects

State and local law‑enforcement agencies, federal prosecutors and agencies that operate or access facial recognition systems, DMV operators whose photos serve as reference databases, vendors of biometric systems, and individuals subject to identification by these systems.

Why It Matters

It replaces ad hoc local policies with a national compliance architecture: judicial gatekeeping, technical validation (NIST benchmarks and operational testing), transparency through reporting, and robust remedies that can exclude evidence and trigger significant civil damages.


What This Bill Actually Does

The Act defines the covered technologies and data (including “facial recognition,” “reference photo databases,” and “arrest photo databases”) and then builds a layered compliance regime around their law‑enforcement use. For reference databases (driver’s licenses, passports, commercial lists, and arrest photos), the baseline rule is clear: officers may search only with a court order that a prosecutor requests and the judge issues, except for limited initial exceptions.

Orders must list the probe images to be searched, name the requesting officer and authorizing official, describe prior investigative steps, identify the database to be searched, and include a deadline for the search (no more than seven days). Courts may issue orders only where the application establishes probable cause tied to specified serious offenses.

The bill preserves a short emergency pathway: officers may run facial recognition in narrowly described urgent circumstances (e.g., imminent risk of death or when a suspect cannot be identified by other means), but prosecutors must seek judicial approval within 12 hours; if approval is denied or not sought, the results must be destroyed and their admissibility blocked. For booking and arrest contexts, the Act explicitly allows searches during lawful arrest and booking, and it imposes a maintenance rule requiring custodians of arrest photo databases to remove photos of juveniles, uncharged persons, those whose charges were dropped, and acquitted people on a recurring schedule.

Technical and programmatic controls are central. The Act requires annual benchmark testing through the National Institute of Standards and Technology (NIST) and independent operational testing that evaluates systems as used in the field (including human reviewer effects). Agencies may not use systems that fail accuracy standards or demonstrate significant variance by race, ethnicity, gender, or age; the Assistant Attorney General (Civil Rights Division) must define what "sufficiently high" accuracy means by rule, after consulting outside experts.

Agencies must log uses, publish policies online, provide DMV notices to the public, and submit recurring reports to state bodies and the Administrative Office of the U.S. Courts; independent audits can force suspension of use until violations are fixed.

Enforcement combines criminal‑procedure remedies and private rights: courts must suppress evidence obtained in violation of the Act; individuals may sue for civil relief and recover damages (the greater of actual profits related to the violation or $50,000 per violation), punitive damages in appropriate cases, and attorneys’ fees. The Act includes a two‑year statute of limitations for civil claims and a good‑faith defense for officers who reasonably relied on warrants or an express legal authorization.

The bill also ties compliance to federal grant funding: failure to substantially comply risks a 15 percent cut to certain DOJ Byrne/JAG-style grants for the following fiscal year.

The Five Things You Need to Know

1. A court order is required to search a reference photo database with facial recognition unless a narrow exception applies; orders must include the probe images, the identity of the requesting officer, the database, and a search window no longer than seven days.

2. Emergency use is allowed, but prosecutors must apply for retroactive court approval within 12 hours; absent approval, the results must be destroyed and are inadmissible.

3. Custodians of arrest photo databases must, beginning 180 days after enactment and every six months thereafter, remove photos of minors, people released without charge, persons whose charges were dropped, and those acquitted.

4. No agency may use a facial recognition system unless it passes NIST’s annual benchmark test and independent operational testing; the Civil Rights Division must promulgate a rule defining the required accuracy and acceptable variance across race, ethnicity, gender, and age.

5. Violation remedies include suppression of evidence, suspension of agency use after audit findings, and civil liability with damages equal to the greater of profits from the violation or $50,000 per violation, plus attorneys’ fees; civil suits must be filed within two years of discovering a violation.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 2

Grant penalty for noncompliance

If a State or local government that receives certain DOJ grants fails to substantially comply with the Act for a fiscal year, the Attorney General must cut that jurisdiction’s next-year award under the listed Omnibus Crime Control grants by 15 percent. Practically, this creates a financial lever to push compliance even for jurisdictions that prefer looser local rules; grantees will need to map facial recognition policies to grant reporting and demonstrate substantial compliance to avoid automatic cuts.
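The penalty mechanics reduce to a flat percentage cut on the following year's award. The sketch below is illustrative only; the function name and the flat-rate application are our assumptions, not statutory text.

```python
# Illustrative sketch of Section 2's grant penalty: a 15 percent reduction to
# the next fiscal year's award for a jurisdiction that fails to substantially
# comply. Names and the flat-rate application are illustrative assumptions.
PENALTY_RATE = 0.15

def next_year_award(base_award: float, substantially_compliant: bool) -> float:
    """Apply the 15% reduction when a grantee is not in substantial compliance."""
    if substantially_compliant:
        return base_award
    return base_award * (1 - PENALTY_RATE)

# A $2,000,000 award shrinks to roughly $1,700,000 after a noncompliant year.
print(next_year_award(2_000_000, substantially_compliant=False))
```

In practice the "substantially comply" determination is the contested input; the arithmetic itself is mechanical once that finding is made.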

Section 101(a)–(b)

Reference databases, required removal, and court‑order regime

The bill treats reference photo databases (driver’s licenses, passports, commercial lists, and arrest photo files) as the central data source for law enforcement facial searches. It mandates semiannual removal of certain arrest photos and requires agencies that maintain such databases to adopt procedures to ensure those removals. For searches, a prosecutor must apply and a judge must issue a narrowly circumscribed order; search orders must specify probe images, the databases to be searched, the requesting officer, and a seven‑day window for the search, limiting open‑ended or ubiquitous searching.

Section 101(c)

Emergency and limited exceptions

The Act permits facial recognition without prior judicial approval in tightly drawn circumstances: identifying deceased or incapacitated persons, AMBER Alert suspects, during lawful booking, and in immediate‑danger emergencies. However, emergency use tied to imminent danger triggers a mandatory after‑the‑fact application by a prosecutor within 12 hours; if authorization is not obtained the records must be destroyed. That structure attempts to balance urgent investigative needs with judicial oversight, but it places operational strain on prosecutors and records management systems.

Section 102

Civil‑rights and civil‑liberties prohibitions

The Act forbids using facial recognition to create records about constitutionally protected activities (assembly, speech, association) and prohibits selection of subjects based on protected characteristics absent trustworthy, locality‑specific, time‑bound information linking those characteristics to a particular criminal incident. It also bans facial recognition searches using body‑worn or vehicle cameras and bars using the technology for immigration‑law enforcement, narrowing common operational use cases and creating compliance boundaries for multi‑mission agencies.

Sections 104–105

Logging, public reporting, and audits

Agencies must log uses to support public reporting to state agencies, the Bureau of Justice Assistance, and the Administrative Office of the U.S. Courts. State judges and prosecutors must report specifics of orders and searches annually; the AOUSC publishes an annual public report. Independent audits (GAO for federal agencies; an independent state agency for state/local) review compliance and can trigger suspension of facial recognition use until violations are remedied, along with public notice of suspension.

Section 106

NIST benchmarks, operational testing, and rulemaking

NIST must run ongoing benchmark tests aligned with law‑enforcement data and operational characteristics and develop protocols for field (operational) testing that measure system accuracy and human‑reviewer effects. Agencies must submit systems to both benchmarks and independent operational testing annually; the Civil Rights Division must define the minimum accuracy standard (including acceptable variance by demographic groups) by rule. The statute sets an 18‑month delay before these accuracy/testing requirements take effect, giving agencies and vendors a transitional window.

Sections 107–108

Remedies, civil actions, and notice to identified individuals

Evidence derived from uses that violate the Act is inadmissible. Individuals may sue for civil relief and recover damages (greater of profits from the violation or $50,000 per violation), punitive damages where appropriate, and attorneys’ fees; suits must be filed within two years of discovering the violation. Agencies must provide arrested individuals with notice that identifies the agency, the database used, the probe and candidate list, the order authorizing the search, accuracy reports, and related police documentation, and must provide this material in an appropriate language if the arrestee is not fluent in English.
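The damages provision is a simple greater-of computation. The sketch below is illustrative; the assumption that the $50,000 floor applies per violation (rather than in aggregate) is ours.

```python
# Illustrative sketch of the civil-damages formula in Sections 107-108:
# recoverable damages for a violation are the greater of the actual profits
# attributable to that violation or a $50,000 statutory floor. Names and the
# per-violation application are illustrative assumptions.
PER_VIOLATION_FLOOR = 50_000

def damages_for_violation(actual_profits: int) -> int:
    """Greater of profits attributable to the violation or the statutory floor."""
    return max(actual_profits, PER_VIOLATION_FLOOR)

print(damages_for_violation(12_000))   # floor controls: 50000
print(damages_for_violation(220_000))  # profits control: 220000
```

The floor means even violations with no measurable commercial gain carry meaningful exposure, which is what gives the private right of action its deterrent weight.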


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • People falsely identified or wrongfully searched: The suppression remedy and civil damages give individuals legal recourse and a financial deterrent against careless or unlawful facial recognition use.
  • Civil‑rights and privacy organizations: The bill creates enforceable limits (e.g., bans on body‑cam searches and face surveillance) and mandates reporting and audits that increase transparency and oversight.
  • Judges and prosecutors seeking procedural guardrails: The court‑order regime and reporting requirements supply a uniform legal framework and evidence for oversight that can reduce ad hoc or opaque operational practices.

Who Bears the Cost

  • State and local law‑enforcement agencies: Agencies must develop court‑order workflows, logging systems, compliance programs, and policies, pay for independent operational testing and audits, and face potential suspension and grant penalties for violations.
  • Biometric vendors and system integrators: Vendors will need to submit algorithms to NIST and to independent operational tests, and to provide documentation and support for accuracy/bias evaluations—raising product development and validation costs.
  • State departments of motor vehicles: DMVs must post multilingual notices, provide informational materials online and on request, and are restricted from bulk transfers of photos for facial‑recognition searches without a court order—requiring policy and systems changes.

Key Issues

The Core Tension

The bill balances the investigative utility of facial recognition in serious, time‑sensitive cases against the risks of systemic misidentification, mass surveillance, and disparate impacts across demographic groups. It attempts to resolve this tension through judicial gatekeeping, technical validation, and vigorous remedies, but doing so imposes burdens on prosecutors, courts, agencies, and vendors, and leaves crucial policy pivots to later rulemaking and testing regimes.

The Act stitches technical, judicial, and administrative controls together, but several implementation issues are unresolved. The seven‑day search window and strict ordering framework reduce expansive searching but will require substantial new judicial capacity and standardized application forms; differing state court procedures could create inconsistent access across jurisdictions.

The bill’s maintenance requirement for arrest databases (removing certain categories of photos every six months) will oblige agencies to audit historical holdings and may be technically difficult where images are embedded in legacy case files or shared with third parties.

Operational testing and accuracy thresholds are central to the statute’s protective aims, yet they rest on NIST and DOJ rulemaking and independent testing capacity that may lag. The Civil Rights Division’s eventual definition of “sufficiently high” accuracy will determine which systems survive in practice; until that rule is finalized, agencies and vendors face legal uncertainty.

Finally, the emergency exception—permitting immediate searches with only 12‑hour after‑the‑fact court filings—protects urgent investigations but hinges on prosecutorial judgment and swift judicial review, creating a tension between speed and oversight and opening risk for claims of post hoc justification.
