Codify — Article

AI Fraud Accountability Act creates federal crimes and FTC powers for digital impersonation

Establishes criminal penalties and civil enforcement for AI-generated 'deepfakes', directs NIST to publish technical best practices, and authorizes international cooperation.

The Brief

The AI Fraud Accountability Act amends Section 223 of the Communications Act to criminalize the use of visual or audio digital impersonations—whether of real, identifiable people or fictitious persons—when used with intent to defraud. It creates a parallel civil enforcement path by treating the same conduct as an unfair or deceptive practice enforceable by the Federal Trade Commission.

Beyond enforcement, the bill directs the National Institute of Standards and Technology to convene a multidisciplinary working group to produce, publish, and annually update technical best practices for detecting, tracing, and preventing digital impersonations, and requires federal agencies to coordinate on international cooperation and treaty-level assistance for cross-border cases. For practitioners, the package combines a narrow criminal offense, broad FTC civil authority, asset forfeiture rules, and an explicit extraterritorial reach—plus a First Amendment savings clause and exemptions for authorized law enforcement and intelligence activity.

At a Glance

What It Does

The bill adds a new subsection to 47 U.S.C. §223 making it a federal crime to deploy AI-generated audio or visual impersonations intended to defraud, authorizes criminal forfeiture, and establishes extraterritorial jurisdiction. It separately designates the conduct as an unfair or deceptive act under the FTC Act so the FTC can pursue civil remedies and monetary penalties.

Who It Affects

Affected parties include creators and distributors of AI-generated media, digital platforms hosting impersonations, financial services and other fraud targets, forensic labs, and vendors of detection tooling. Federal and state law enforcement and intelligence agencies are carved out for authorized operations but will be involved in cross-agency coordination and international assistance.

Why It Matters

This bill creates a legal framework that treats certain deepfake-driven schemes as both criminal fraud and consumer-protection violations. That dual framing raises direct compliance and takedown obligations for platforms and tool vendors, while the NIST mandate shapes the technical standards that will guide industry and investigators.


What This Bill Actually Does

The Act creates two parallel enforcement tracks for AI-enabled impersonation fraud. First, it amends the Communications Act to add a criminal prohibition on using a “digital impersonation” in interstate or foreign communications to defraud someone of money or other value.

The statute defines digital impersonation to cover both manipulated authentic depictions and wholly synthetic (imaginary) individuals when the result is, to a reasonable viewer or listener, indistinguishable from an authentic depiction. Convictions carry fines, up to three years’ imprisonment, and a statutory forfeiture regime that targets proceeds and tools used to facilitate the crime.

The statute also establishes extraterritorial jurisdiction and criminalizes threats to commit the offense for intimidation or extortion.

Second, the bill treats the same conduct as an unfair or deceptive act under the FTC Act, bringing civil enforcement, injunctions, and monetary remedies into play. The FTC gets the same investigatory and remedial powers it uses for other deceptive-practice cases and must coordinate its civil enforcement with its preserved broader authority.

The bill preserves First Amendment protections such as parody and satire and exempts lawfully authorized law enforcement and intelligence activity.

To bridge law, policy, and technology, the Act requires the Department of Commerce—acting through NIST—to convene a technical working group that includes federal and state law enforcement, industry sectors likely to be targeted (finance, health care, retail, telecom, digital platforms), and technical experts in AI and digital forensics. NIST must publish best practices and recommendations within one year, hold at least one public workshop, and review and update those practices annually.

That working-group mandate sunsets after ten years. The Act also instructs the FTC and the Department of Justice to pursue international cooperation: the FTC will identify the top 10 foreign source countries for offenses and can negotiate enforcement agreements, while the DOJ will review and, where appropriate, modify existing law-enforcement cooperation to support cross-border investigations and prosecutions.

The Five Things You Need to Know

1. The bill inserts subsection (i) into 47 U.S.C. §223 to criminalize using digital impersonations in interstate or foreign communications with intent to defraud.

2. ‘Digital impersonation’ covers both AI-manipulated authentic depictions and fully synthetic (imaginary) individuals whose audio/visual output a reasonable person would find indistinguishable from authentic content.

3. Conviction carries up to three years’ imprisonment, fines under title 18, and criminal forfeiture using the procedures in 21 U.S.C. §853 (except subsections (a) and (d)).

4. The FTC can enforce the civil prohibition by treating violations as unfair or deceptive acts under the FTC Act, giving it injunctive powers and remedies separate from criminal prosecution.

5. NIST must convene a public working group, publish technical best practices within one year, and review them annually; the working-group requirements expire after 10 years.

Section-by-Section Breakdown

Every bill we cover gets an analysis of its key sections.

Section 2 (47 U.S.C. §223(i))

New federal crime for digital impersonation fraud

This provision defines ‘digital impersonation’ and creates the criminal offense for using such impersonations in interstate or foreign communications with intent to defraud. Practical implications include a ‘reasonable person’ indistinguishability test that will drive litigation and forensic disputes, explicit extraterritorial jurisdiction, and a separate threat offense for intimidation or extortion. Courts will apply the Controlled Substances Act’s forfeiture procedures (21 U.S.C. §853) to strip proceeds and property used in offenses, which raises asset-tracing and seizure issues familiar from drug and financial-crime prosecutions.

Section 3

Civil enforcement by the FTC as an unfair or deceptive act

Section 3 treats the same factual conduct as an unfair or deceptive practice under the FTC Act, importing the FTC’s investigatory tools, civil penalties, and remedial authority. That dual track means an actor can face both criminal charges and FTC civil enforcement for the same conduct; the bill explicitly preserves the FTC’s broader authority under other statutes. Firms should expect administrative investigations, possible consent decrees, and coordination between criminal prosecutors and the FTC on matters like evidence preservation and consumer redress.

Section 4

NIST working group and technical guidance

NIST must convene a cross-sector working group including federal/state law enforcement, industry representatives (finance, health care, retail, telecom, platforms), and AI/digital-forensics experts to develop best practices for recognition, detection, prevention, and tracing of digital impersonations. The Director must publish a report within one year, hold at least one public workshop to solicit feedback, and update the guidance annually; the statutory duties for the working group expire after 10 years. The guidance itself is nonbinding but will inform investigations, platform practices, procurement, and potential private‑sector standards.

Section 5

International cooperation and reporting requirements

The FTC, DOJ, and State must coordinate to identify the top 10 foreign source countries for offenses affecting U.S. persons; the FTC may negotiate cooperation agreements with those countries subject to statutory constraints. The FTC must report annually to relevant congressional committees on agreements, negotiations, and cooperation challenges. The DOJ must review and, as appropriate, modify international law-enforcement agreements on a five‑year cycle and report recommendations for strengthening cross-border enforcement—recognizing that extradition, mutual legal assistance, and divergent data‑access laws will complicate enforcement.

Section 6

First Amendment savings clause and authorized activity exemption

The Act explicitly preserves parody, satire, journalism, and other First Amendment protections and exempts lawfully authorized investigative, protective, or intelligence activities by U.S. and state agencies and U.S. intelligence agencies. The exemptions place the burden on prosecutors and civil enforcers to distinguish illicit deception-for-profit from protected speech and legitimate law‑enforcement tradecraft, creating predictable evidentiary and legal thresholds for a prosecution or civil action to proceed.


Who Benefits and Who Bears the Cost

Every bill creates winners and losers. Here's who stands to gain and who bears the cost.

Who Benefits

  • Targets and victims of impersonation fraud — individuals and businesses that suffer monetary loss or reputational harm gain a federal criminal remedy, civil FTC enforcement options, and access to forfeiture as a restitution pathway.
  • Financial services and high-risk sectors — banks, insurers, and payment processors may see fewer AI-driven social‑engineering losses and clearer grounds to demand takedowns, preserve evidence, and push for platform cooperation.
  • Digital forensics and detection vendors — demand for forensic analysis, attribution services, and detection tools will expand because investigators, the FTC, and platforms will rely on technical best practices developed by NIST.

Who Bears the Cost

  • Platforms and hosting services — will face increased takedown requests, compliance burdens, and potential civil liability when impersonations facilitate fraud; they may need to adopt detection tools and stronger identity-verification processes.
  • AI model providers and startups — may incur compliance, labeling, and tooling costs to prevent misuse, and could face operational disruption if content or model outputs are implicated in enforcement actions.
  • Law enforcement and prosecutors — federal and state agencies will need resources and technical expertise to investigate complex AI-manipulation cases, run forensic analyses, and coordinate cross-border actions; failure to resource investigations could blunt enforcement.
  • International partners and ML research community — countries and researchers may bear diplomatic and operational burdens as the U.S. seeks cooperation and as norms around detection and attribution solidify, potentially chilling some cross-border research collaborations.

Key Issues

The Core Tension

The central dilemma is balancing deterrence and remediation for AI-powered impersonation fraud against preserving lawful speech and innovation: stronger criminal and civil rules reduce fraud risk and pressure platforms to act, but they also risk chilling legitimate expression and imposing high compliance costs on technology creators and hosts—especially where the legal test of ‘indistinguishable’ content and international enforcement remain unsettled.

The bill tries to thread a narrow criminal needle while giving the FTC broad civil authority, but that design creates implementation frictions. The criminal definition hinges on whether a digital impersonation is “indistinguishable” to a reasonable person; that subjective standard invites litigation over admissible expert testimony, the sufficiency of automated detection, and contested recreations used for demonstration in court.

Applying the forfeiture regime from drug law raises procedural questions about notice, third-party claims to proceeds, and the administrative capacity of courts to handle novel digital-asset tracing.

Civil enforcement by the FTC duplicates criminal tools but with different objectives and standards, creating potential coordination challenges—should the FTC pause a civil case when criminal investigators are active, and how will evidence sharing and grand-jury secrecy rules be managed? The NIST working group will produce nonbinding best practices, but industry uptake is voluntary; absent mandated standards or safe harbors, platforms may adopt inconsistent measures that shift risk rather than eliminate it.

Finally, cross-border enforcement is a practical headache: the bill directs identification of source countries and negotiation of cooperation agreements, but legal differences in privacy, evidence-gathering, and classification of speech mean that international partners may be limited or slow, leaving victims with enforcement gaps.
