The bill creates a new federal offense for using a digital impersonation—audio or visual content created or altered with AI or other technology that a reasonable person would mistake for authentic—to defraud someone in interstate or foreign communications. Conviction carries up to 3 years’ imprisonment and criminal forfeiture of proceeds and tools used; the offense reaches conduct abroad through extraterritorial jurisdiction, and threats to commit it are also punishable.
Separately, the bill makes the same conduct an unfair or deceptive practice enforceable by the Federal Trade Commission (FTC), directs the National Institute of Standards and Technology (NIST) to convene a technical working group to publish best practices within one year and update them annually, and requires the FTC and Department of Justice to pursue international cooperation and reporting. The statute contains explicit exceptions for authorized law-enforcement and intelligence activities and a First Amendment savings clause.
At a Glance
What It Does
The bill adds subsection (i) to 47 U.S.C. §223 to criminalize using AI- or software-generated audio/visual impersonations to defraud across interstate or foreign communications, and treats the same conduct as an unfair or deceptive act the FTC can enforce civilly. It directs NIST to develop and publish technical best practices and requires FTC/DOJ actions to coordinate international enforcement.
Who It Affects
Individuals and organizations that create, distribute, host, or monetize realistic AI-generated audio or video—including AI developers, social media and communications platforms, financial institutions, e-commerce firms, and telecommunications carriers—plus federal, state, and local law enforcement, and NIST.
Why It Matters
The bill ties criminal, civil, technical, and diplomatic levers together: it creates a prosecutable offense, gives the FTC a civil pathway to stop harms, charges NIST with technical norms, and compels cross-border engagement—shifting how private-sector platforms and AI toolmakers must manage and mitigate realistic impersonation risk.
What This Bill Actually Does
The Act creates two parallel enforcement tracks. First, it amends the Communications Act to add a criminal offense for falsely posing as an identifiable or an "imaginary" individual in an audio or visual depiction generated or materially altered by software, machine learning, AI, or similar technology, when the impersonation is intended to defraud someone of money or other value.
The criminal provision includes traditional sentencing (up to three years), allows forfeiture of proceeds and instruments used in the offense, treats threats to commit the offense as a separate punishable act, and expressly asserts extraterritorial jurisdiction.
Second, the bill gives the Federal Trade Commission civil authority by declaring the same conduct an unfair or deceptive act under the FTC Act. That means the FTC can bring administrative or civil actions using its existing investigatory and remedial powers, and victims can seek relief through the Commission’s enforcement mechanisms.
The Act preserves existing FTC authority and imports penalties, privileges, and procedures from the FTC Act for civil enforcement.

To support detection and attribution, the Secretary of Commerce, through NIST, must convene a working group of government, private-sector, and technical experts to develop best practices for recognizing, detecting, preventing, and tracing digital impersonations used in fraud. NIST must publish those recommendations within one year, hold at least one public workshop to solicit comment, and review and update the guidance annually for up to ten years.
The working group explicitly includes representatives from financial services, health care, retail, telecommunications, digital platforms, law enforcement, and AI/digital-forensics scientists.

The bill also tackles the cross-border nature of impersonation fraud. The FTC, DOJ, and State Department are directed to identify the top foreign origins of impersonation harms, negotiate cooperative agreements with foreign law enforcement, and report to Congress on those efforts.
The DOJ will periodically review and, if necessary, modify international agreements to strengthen assistance in cases originating abroad. Finally, the Act includes carve-outs for authorized law-enforcement and intelligence activities and a First Amendment savings clause protecting parody, satire, and journalism.
The Five Things You Need to Know
The bill defines “digital impersonation” to include both AI-generated depictions of real, identifiable people and AI-generated depictions of imaginary persons that are, to a reasonable viewer or listener, indistinguishable from a real person.
The criminal offense applies when a person uses a digital impersonation in interstate or foreign communications with intent to defraud and carries penalties of up to 3 years’ imprisonment, criminal forfeiture of proceeds and tools, and extraterritorial jurisdiction.
The FTC gets civil enforcement authority: violations are treated as unfair or deceptive acts under the FTC Act, allowing administrative orders, civil penalties, and other remedies available to the Commission.
NIST must convene a working group and publish technical best practices and recommendations within one year, hold public workshops, and conduct annual reviews and updates for up to 10 years.
Within 90 days the FTC (with DOJ and State) must identify the top 10 foreign source countries for these harms; the FTC may negotiate cooperative agreements, and the Attorney General must review and, if necessary, modify international law-enforcement agreements at least every five years.
Section-by-Section Breakdown
Creates a federal crime for fraudulent digital impersonation
This section inserts a new subsection into the Communications Act making it a crime to use AI- or software-generated audio/visual impersonations to pose as an identifiable or imaginary individual with intent to defraud in interstate or foreign communications. It supplies a precise definition of “digital impersonation” (including adaptation or manipulation of authentic material) and of “identifiable individual” (face, likeness, voice, or other distinguishing characteristic). Practically, that means generating a realistic deepfake intended to trick a target into transferring money or sensitive documents can be charged federally.
Penalties, threats, forfeiture, and extraterritorial reach
The bill sets criminal punishment at fines under Title 18 and up to three years’ imprisonment for violations, treats intentional threats to commit the offense as punishable, and mandates forfeiture of gross proceeds and property used to facilitate the crime. Forfeiture procedures incorporate the Controlled Substances Act’s criminal forfeiture process, with an exception for certain subsections. The provision explicitly authorizes extraterritorial jurisdiction, enabling prosecution where conduct crosses borders.
Civil enforcement via the Federal Trade Commission
Section 3 makes the same impersonation conduct an unfair or deceptive act under the FTC Act, importing the FTC’s investigatory authority, remedies, and procedural framework. The FTC can use administrative orders, civil penalties, and injunctive relief under its existing toolkit. The text preserves other FTC authorities, meaning the Commission can combine this civil route with other consumer-protection actions where appropriate.
NIST working group to create and update technical best practices
This section requires the Commerce Secretary (via NIST) to form a multi-stakeholder working group—government law enforcement, private industry (finance, health, retail, telecom, platforms), and technical experts—to develop and publish best practices for recognizing, detecting, preventing, and tracing digital impersonations. NIST must hold at least one public workshop, publish the report within one year, and update it annually, with the mandate sunsetting after ten years. The requirement is advisory: NIST issues guidance, not binding technical standards.
International cooperation and reporting
The Commission, DOJ, and State Department must coordinate on international enforcement. The FTC must, within 90 days, identify the top 10 foreign source countries for impersonation harms to U.S. persons and may enter into cooperation agreements subject to existing FTC requirements. The Attorney General must review and, when necessary, modify international law-enforcement agreements at least every five years and report to multiple congressional committees on these efforts and enforcement challenges.
First Amendment savings clause
The bill expressly states it does not restrict parody, satire, journalism, or other First Amendment protections. That preserves constitutional defenses but creates an evidentiary line-drawing problem: whether a given realistic AI-generated clip is protected expressive activity or punishable fraud will often depend on intent and context, which enforcement actors must evaluate case-by-case.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Consumers targeted by impersonation scams — clearer federal remedies and both criminal and FTC civil avenues increase the chance of redress and deterrence against financially motivated deepfakes.
- Financial institutions and payment processors — stronger legal tools for identifying fraud, pursuing forfeiture, and securing cooperation from platforms can reduce charge-offs and fraud losses.
- Victim-facing industries (healthcare, retail, professional services) — sector-specific inclusion in the NIST working group aims to surface practicable detection and mitigation steps tailored to industry workflows.
Who Bears the Cost
- AI developers and tool providers — must assess and possibly modify offerings, incorporate detection or watermarking, respond to takedown or information requests, and face potential civil liability if their technology facilitates fraud.
- Online platforms and communications providers — increased compliance, monitoring, and content-mitigation obligations; platforms may need to enhance detection pipelines, legal review, and international cooperation capabilities.
- NIST and federal law-enforcement bodies — required to organize, staff, and sustain a multi-year working group and cross-border cooperation activities without dedicated funding in the text, creating administrative and technical burdens.
Key Issues
The Core Tension
The central dilemma is balancing robust protection against realistic AI-driven impersonation fraud (which requires clear definitions, investigative reach, and enforcement tools) with the need to protect legitimate expression, respect international legal constraints, and keep technical compliance feasible. Measures that effectively deter fraud also risk chilling valuable uses of synthetic media and imposing heavy compliance costs on technology and platform actors.
The statute packs multiple legal tools into one Act—criminal law, FTC civil remedies, technical guidance, and international diplomacy—which is rhetorically tidy but operationally complex. Key implementation questions are unresolved: how prosecutors and regulators will prove the mental-state element (intent to defraud) when impersonation technology can be used for plausible deniability; how the “reasonable person” test will be applied to rapidly improving synthetic media; and how protection for parody, satire, and journalism will be distinguished from fraudulent impersonations in close cases.
On the technical side, the bill directs NIST to publish best practices but stops short of mandatory standards or funding for their adoption. That makes the guidance advisory; platforms and vendors will weigh reputational and enforcement risk against the cost of technical changes.
International enforcement is similarly aspirational: the FTC and DOJ are empowered to seek agreements and to identify problem countries, but cross-border evidence-gathering, differing privacy regimes, and reluctant foreign authorities may limit practical cooperation. Finally, the forfeiture provision borrows the criminal forfeiture framework from the Controlled Substances Act but excludes certain subsections, a drafting choice that could raise procedural or constitutional challenges in complex asset-recovery cases.