The TAKE IT DOWN Act amends 47 U.S.C. 223 to criminalize the knowing publication of nonconsensual intimate visual depictions—including AI-generated ‘digital forgeries’—and adds forfeiture and restitution tools. It also requires covered platforms to adopt a formal notice-and-removal process and to remove reported material within 48 hours; the Federal Trade Commission will enforce platform duties as unfair or deceptive acts.
This law targets two problems at once: it creates a federal criminal backstop for victims and establishes a tight administrative removal duty for platforms. The combination raises immediate operational questions for content hosts, clarifies enforcement paths for victims, and creates potential constitutional and compliance tensions that platform operators, counsel, and compliance officers will need to navigate.
At a Glance
What It Does
Adds a new subsection to 47 U.S.C. 223 making it a federal crime to knowingly publish nonconsensual intimate images or AI-forged intimate images of identifiable people; establishes criminal penalties, forfeiture, and mandatory restitution. Separately, it obligates covered platforms to implement a written notice-and-removal process and to remove validly reported items within 48 hours, with the FTC authorized to enforce those duties as unfair or deceptive acts.
Who It Affects
Major and mid‑sized user‑generated content platforms that ‘serve the public’ and routinely host user content, platform legal and trust & safety teams, creators and journalists who publish visual material, victims of nonconsensual intimate imagery, and federal prosecutors and the FTC. Smaller startups and noncommercial sites may be affected if they meet the statutory definition of a covered platform.
Why It Matters
It combines criminal law and civil regulatory pressure to force faster takedowns and to penalize perpetrators, including creators of deepfakes, an area where state law and platform policy are currently patchy. Compliance teams must build processes to meet strict procedural requirements and think through constitutional risk and cross‑border enforcement implications.
What This Bill Actually Does
The bill creates a federal crime for publishing nonconsensual intimate visual depictions and for publishing digital forgeries (deepfakes) that portray identifiable individuals. The criminal offense covers adults and minors separately, carries prison terms (longer where minors are involved), and includes ancillary penalties: courts must order forfeiture of materials and proceeds tied to the offense and must order restitution to victims.
The statute defines key terms—consent, digital forgery, identifiable individual, and intimate visual depiction—so the criminal prohibition targets both authentic leaked imagery and AI‑generated fakes that are indistinguishable from real images when judged by a reasonable person.
On the civil/compliance side, the Act forces covered platforms to create an accessible, documented notice-and-removal process within one year. A complaint must be in writing, include a signature, identify and locate the disputed image, state in good faith that the image was posted without consent, and give contact information.
Platforms must post a clear notice of the process and, once they receive a valid request, remove the item and make reasonable efforts to take down known identical copies within 48 hours. The FTC is given explicit authority to treat platform failures to follow the takedown procedure as unfair or deceptive acts or practices; it enforces the removal duty with its usual investigatory and remedial powers, and the Act clarifies that the FTC’s reach extends to non‑profit entities. The statute also preserves exceptions for authorized law enforcement or intelligence activity, legitimate medical or educational uses, reporting obligations, and cases where the depicted person published the image themselves.
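For trust & safety teams, the statutory request elements and the 48‑hour clock map naturally onto an intake schema. The sketch below is illustrative only, assuming a hypothetical `TakedownRequest` record inside a platform's own pipeline; the field names and the completeness check are shorthand for the statutory elements, not statutory text, and a real system would add requester verification, audit logging, and duplicate matching.

```python
# Illustrative intake model for a notice-and-removal pipeline (not statutory text).
# The TakedownRequest record and its field names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

REMOVAL_WINDOW = timedelta(hours=48)  # statutory removal deadline


@dataclass
class TakedownRequest:
    signature: str             # physical or electronic signature of the requester
    content_locator: str       # URL or identifier sufficient to locate the reported item
    good_faith_statement: str  # statement that the depiction was published without consent
    contact_info: str          # how the platform can reach the requester
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def is_facially_valid(self) -> bool:
        """Minimal completeness check mirroring the statutory request elements."""
        return all([self.signature.strip(), self.content_locator.strip(),
                    self.good_faith_statement.strip(), self.contact_info.strip()])

    def removal_deadline(self) -> datetime:
        """Latest time by which the item (and known identical copies) should be removed."""
        return self.received_at + REMOVAL_WINDOW
```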
Finally, the bill amends defenses in the Communications Act to reflect the new subsection and adds technical conforming language.
The Five Things You Need to Know
The bill requires covered platforms to remove validly reported nonconsensual intimate images and make reasonable efforts to remove known identical copies within 48 hours of receiving a compliant takedown request.
Platforms have one year from enactment to establish a written notice-and-removal process; a valid request must include a signature, information sufficient to locate the content, a good‑faith statement that the image was nonconsensual, and contact information for the requester.
Federal criminal penalties for knowingly publishing nonconsensual intimate depictions are up to 2 years’ imprisonment for adults and up to 3 years when victims are minors; threats to publish such material, whether authentic or a digital forgery, carry separate, shorter maximum terms.
The FTC enforces platform takedown obligations by treating violations as unfair or deceptive acts or practices, and the Act expressly authorizes the FTC to use its full powers—extending enforcement to entities that aren’t organized for profit.
The statute defines ‘digital forgery’ as an AI‑or‑software‑created intimate visual depiction that a reasonable person, viewing the image as a whole, would find indistinguishable from an authentic depiction.
Section-by-Section Breakdown
Short title
Declares the statute’s formal name: the 'TAKE IT DOWN Act' (Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act). This is purely stylistic but establishes the bill’s branding for citations in later rules, guidance, and litigation.
New federal crimes for nonconsensual depictions and digital forgeries
Amends section 223 of the Communications Act to add subsection (h), laying out two parallel offenses: publication of authentic nonconsensual intimate depictions and publication of digital forgeries (deepfakes). Each offense is split into adult and minor categories, with element lists (consent, exposure, public concern, intent or actual harm). Practically, prosecutors must prove knowledge and the listed elements; defense rules are adjusted so that preexisting defenses in section 223(e)(1) also encompass the new subsection.
Criminal sanctions plus mandatory forfeiture and restitution
Specifies custodial maximums (2 years adults, 3 years minors) and separate threat penalties for extortionate or intimidating threats involving these materials. The statute requires courts to order forfeiture of the material, proceeds traceable to the offense, and any property used to commit it, using procedures borrowed from the federal criminal forfeiture regime; courts must also order restitution under the existing victims’ restitution framework at 18 U.S.C. 2264. That combination increases the possible financial exposure for offenders.
Platform notice-and-removal process and FTC enforcement
Requires covered platforms to create an accessible notice-and-removal procedure within one year and to remove validly reported content within 48 hours, including taking 'reasonable efforts' to remove known identical copies. The provision limits platform liability for good‑faith takedowns and makes noncompliance an FTC enforcement matter by classifying failures as unfair or deceptive acts or practices; the FTC’s full investigatory and remedial powers apply and its jurisdiction explicitly covers non‑profit actors for purposes of this section.
Definitions and severability
Defines 'covered platform' narrowly (public-facing services primarily hosting user-generated content, with explicit exclusions for broadband and email) and adopts the criminal‑law definitions added to 47 U.S.C. 223. Includes a standard severability clause so that if a provision is struck down, the rest remains effective; operationally this signals drafters anticipated potential constitutional challenges and wanted to preserve as much of the law as possible.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Survivors of image-based abuse — faster removal timelines, criminal tools, restitution, and forfeiture provide victims clearer remedies and decrease the time harmful content remains accessible. The statutory definitions also capture AI‑generated fakes so victims of deepfakes gain explicit protection.
- Platforms that build compliant systems quickly — by meeting the one‑year setup and 48‑hour takedown standard, compliant platforms reduce litigation and regulatory risk and gain a statutory safe harbor for good‑faith removals. Clear procedures also reduce case-by-case friction for trust & safety teams.
- Federal prosecutors and law enforcement — the statute creates a clean federal cause of action and statutory tools (forfeiture, restitution) that simplify charging decisions in interstate and cross‑platform cases.
- Advertisers and brand managers — faster removals and a statutory framework for handling nonconsensual imagery reduce brand safety risks and lower the chance that ads appear alongside abusive content.
Who Bears the Cost
- User‑generated content platforms (legal and trust & safety teams) — must design, staff, and document a new takedown pipeline, meet the 48‑hour removal clock, and implement identification systems for known identical copies, which could be costly and operationally complex.
- Small or niche platforms and startups — the one‑year compliance window and ambiguous thresholds for being a 'covered platform' create legal uncertainty and compliance costs that may disproportionately burden companies with limited moderation budgets.
- Publishers, journalists, and researchers — the statute’s criminal and takedown language plus broad FTC authority may chill legitimate publishing in gray‑area cases (‘public concern’ determinations are fact‑sensitive) or require additional legal review before posting or republishing sensitive visual material.
- FTC and federal courts — enforcement will demand agency resources and will likely spawn litigation over definitions, constitutional limits, and the scope of the FTC’s expanded jurisdiction, increasing administrative and judicial burdens.
Key Issues
The Core Tension
The central tension is between rapid, effective protection for victims (speedy takedowns, criminal penalties, restitution) and the risk that broad, expedited takedown duties and criminalization will overreach—chilling protected speech, imposing heavy operational costs on platforms (especially smaller ones), and inviting constitutional and jurisdictional challenges that could undercut enforcement or create uneven outcomes.
The bill mixes criminal law and administrative regulation, and that blend raises practical and constitutional questions. Several operative terms—'intended to cause harm,' 'reasonable expectation of privacy,' and 'matters of public concern'—are fact‑intensive and will drive litigation over mens rea and scope.
Prosecutors must prove knowledge and the other elements, but the statute also leaves open civil/regulatory remedies via the FTC for platforms that fail to implement the takedown regime. That dual pathway creates overlapping exposures: an individual can face criminal prosecution while a platform can be subject to FTC action for not removing content quickly enough.
Operationally, the 48‑hour removal deadline and the 'reasonable efforts' obligation to remove known identical copies are blunt instruments. Identifying identical copies across a sprawling content ecosystem requires hashing, perceptual similarity indexing, or human review—each has trade‑offs between under‑ and over‑blocking.
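To make those trade‑offs concrete, the sketch below pairs an exact cryptographic hash (which catches only byte‑identical re‑uploads) with a minimal average perceptual hash (which also catches re‑encoded or resized copies, at the cost of occasional false matches). The registry structure, threshold, and function names are hypothetical, and it assumes Pillow is available for image decoding; production systems typically use more robust perceptual hashing (e.g., PDQ or pHash) backed by human review.

```python
# Sketch of a "known identical copies" matcher against a registry of hashes
# for images already removed under a valid takedown request. Illustrative only;
# the registry layout and the distance threshold are assumptions.
import hashlib
from PIL import Image  # assumes Pillow is installed


def sha256_of_file(path: str) -> str:
    """Cryptographic hash: matches only byte-identical copies."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


def average_hash(path: str, size: int = 8) -> int:
    """Tiny perceptual hash: a 64-bit fingerprint that survives re-encoding and resizing."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


def is_known_copy(path: str, exact_registry: set[str],
                  phash_registry: list[int], max_distance: int = 5) -> bool:
    """Flag uploads matching previously removed items, exactly or approximately."""
    if sha256_of_file(path) in exact_registry:
        return True
    candidate = average_hash(path)
    return any(hamming(candidate, known) <= max_distance for known in phash_registry)
```

Tightening `max_distance` reduces over‑blocking but lets more altered copies through; loosening it does the reverse, which is exactly the under‑ versus over‑blocking tension noted above.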
The Act’s limited liability for good‑faith takedowns reduces some risk of over‑removal, but the specter of FTC enforcement for noncompliance provides a counter‑pressure that will push platforms toward conservative takedowns. Additionally, defining 'covered platform' leaves borderline services uncertain; the exclusions for email and broadband are clear, but hybrid services and sites with mixed content may need legal interpretation to determine coverage.
Finally, the statute’s reach into digital forgeries invites First Amendment and federalism challenges. Criminalizing the publication of expressive imagery—especially where the image touches public interest—will force courts to reconcile victim protection with free‑speech doctrine.
State laws on revenge porn, civil remedies, and platform obligations already vary; the federal overlay will preempt neither constitutional review nor routine jurisdictional frictions in cross‑border cases, and it will likely generate precedent that narrows or reshapes enforcement over time.