The TAKE IT DOWN Act amends section 223 of the Communications Act to make it a federal offense to publish nonconsensual intimate visual depictions — including AI-generated “digital forgeries” — and to threaten to publish such material for coercive purposes. The statute distinguishes offenses involving adults from those involving minors, creates criminal penalties, and adds forfeiture and restitution tools upon conviction.
The bill also imposes a statutory notice-and-removal duty on covered platforms: platforms that host user-generated intimate content must operate a process enabling victims (or their authorized representatives) to request removal, and must then disable access to the reported depiction and make reasonable efforts to remove identical copies. Enforcement is delegated to the Federal Trade Commission, which the bill empowers to treat noncompliance as an unfair or deceptive practice.
At a Glance
What It Does
The bill adds a new subsection to 47 U.S.C. 223 that criminalizes the knowing publication of nonconsensual intimate visual depictions and AI-crafted digital forgeries, and it requires covered platforms to implement a formal notice-and-removal process for those depictions.
Who It Affects
Individuals depicted in intimate materials, creators and traffickers of such content (including deepfake creators), and websites or apps that primarily host user-generated content or regularly publish such imagery. The FTC gains an enforcement role for platform noncompliance.
Why It Matters
This is a hybrid criminal-and-regulatory approach: it makes image-based abuse a federal crime while also shifting moderation duties onto platforms with a statutory remedy and federal enforcement, extending legal exposure to both content authors and intermediaries.
What This Bill Actually Does
The bill defines core terms and then draws two parallel tracks of liability. First, it defines “digital forgery” to capture intimate visual depictions created or materially altered by software, machine learning, AI, or similar technological means — specifically those that a reasonable viewer would find indistinguishable from an authentic image.
It pairs that definition with a plain-English consent standard: consent must be affirmative, conscious, and voluntary, and the statute makes clear that consenting to the creation of an image, or to sharing it with another person, is not automatically consent to its public publication.
Second, the text separates conduct involving adults from conduct involving minors. For adults the statute targets knowing publication of nonconsensual intimate depictions that were created or obtained when the subject had a reasonable expectation of privacy, were not voluntarily exposed in a public or commercial setting, and either were intended to cause harm or actually caused harm.
For minors the statute focuses on publication with intent to abuse or humiliate the minor, or to arouse sexual desire. The bill also criminalizes intentional threats to publish such material for purposes of intimidation, coercion, or extortion.
On the platform side, the bill requires covered platforms to publish and operate a clear, plain-language process that lets an identifiable individual — or an authorized representative — notify the platform and request removal of an intimate visual depiction published without consent.
The notice process must enable the platform to locate the content and to contact the requester. Once a platform receives a valid request, the legislation requires the platform to disable access to the content and to use reasonable measures to take down known identical copies; the statute also protects platforms from liability when they act in good faith to disable access or remove material that, based on the facts before them, appears to be an unlawful intimate depiction.
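The bill does not say how a platform should locate “known identical copies,” only that the measures be reasonable. As a minimal sketch, one plausible approach is fingerprinting content with a cryptographic hash, which flags byte-for-byte duplicates; the index structure and function names below are assumptions for illustration, not anything the text prescribes.

```python
import hashlib

# Hypothetical in-memory index mapping content hashes to content IDs.
# A real platform would use a persistent store; all names here are
# illustrative, not anything the bill prescribes.
_hash_index: dict[str, set[str]] = {}
_blocked_hashes: set[str] = set()

def fingerprint(data: bytes) -> str:
    """SHA-256 digest: identical bytes always produce identical digests."""
    return hashlib.sha256(data).hexdigest()

def register_upload(content_id: str, data: bytes) -> bool:
    """Index a new upload; reject it if it duplicates removed material."""
    digest = fingerprint(data)
    if digest in _blocked_hashes:
        return False  # byte-identical copy of previously removed content
    _hash_index.setdefault(digest, set()).add(content_id)
    return True

def process_takedown(data: bytes) -> set[str]:
    """Block the reported item's hash and return IDs of identical copies."""
    digest = fingerprint(data)
    _blocked_hashes.add(digest)
    return _hash_index.pop(digest, set())
```

Exact hashing misses re-encoded or cropped variants, so a platform might layer perceptual matching on top; the statute leaves that judgment to whatever counts as reasonable.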
Finally, the bill layers in enforcement and sentencing mechanics. It amends existing Communications Act provisions and provides criminal penalties for violators, alongside criminal forfeiture and a directive that courts order restitution to injured individuals. For platform compliance failures the bill invokes the Federal Trade Commission by treating violations of the notice-and-removal obligation as unfair or deceptive acts or practices, and it incorporates the FTC’s remedial toolbox and investigatory authority to police adherence.
The Five Things You Need to Know
- A covered platform must remove a validly reported nonconsensual intimate visual depiction, and take steps to remove known identical copies, within 48 hours of receiving the request.
- Platforms have one year from enactment to establish the statutorily required notice-and-removal process, which must accept written requests from victims or their authorized representatives.
- The statute penalizes individuals who knowingly publish nonconsensual intimate depictions involving adults with criminal fines and imprisonment of up to 2 years, and it raises the maximum imprisonment for offenses involving minors.
- A removal notice must include a physical or electronic signature, information sufficient to locate the content on the platform, a brief good-faith statement that the depiction was published without consent, and contact details for the requester (see the sketch after this list).
- The Federal Trade Commission enforces the platform obligations by treating noncompliance as an unfair or deceptive practice, and the bill explicitly extends that enforcement scope to organizations not operated for profit.
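To make the notice mechanics concrete, here is a minimal sketch of how a platform might model an incoming request, validate the four statutory elements, and track the 48-hour clock. The class and field names are illustrative assumptions; the statute prescribes the elements, not any data format.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical data model; field names are illustrative, not statutory text.
@dataclass
class RemovalNotice:
    signature: str             # physical or electronic signature
    content_locator: str       # information sufficient to locate the depiction
    good_faith_statement: str  # brief statement that publication was nonconsensual
    contact_info: str          # how the platform can reach the requester
    received_at: datetime      # when the platform received the notice

def is_valid(notice: RemovalNotice) -> bool:
    """A notice is actionable only if all four statutory elements are present."""
    elements = (notice.signature, notice.content_locator,
                notice.good_faith_statement, notice.contact_info)
    return all(e.strip() for e in elements)

def removal_deadline(notice: RemovalNotice) -> datetime:
    """Valid requests must be acted on within 48 hours of receipt."""
    return notice.received_at + timedelta(hours=48)
```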
Section-by-Section Breakdown
New federal offenses for nonconsensual intimate images and digital forgeries
This section inserts a new subsection into the Communications Act that defines several terms (consent, digital forgery, identifiable individual, intimate visual depiction) and then sets out four core criminal rules: publication of nonconsensual authentic intimate depictions of adults; publication of such depictions of minors; publication of digital forgeries of adults; and publication of digital forgeries of minors. Each rule carries a required mental state (knowingly, plus specific-intent standards for minors), exceptions for law enforcement, medical, and legal uses, and an explicit rule that consent to creation or private sharing does not equal consent to publication. The provision also establishes ancillary criminal tools: forfeiture tied to conviction through a procedural cross-reference to federal forfeiture law, restitution modeled on existing federal restitution statutes, and separate penalties for threats.
Short legal housekeeping and defenses
The bill amends an existing defenses provision so that the new subsection is covered by the same defenses already available under section 223. It also makes a purely technical edit to label definitions within the section. These changes preserve previously recognized exceptions while folding the new offenses into the existing statutory structure for enforcement and defense.
Mandatory notice-and-removal regime for covered platforms
Section 3 directs covered platforms to create a plain-language process through which an identifiable individual or their authorized representative may notify the platform and request content removal. The statute requires that requests supply identifying and contact information, and it conditions a platform’s takedown duties on receipt of a valid request. The text protects platforms from civil liability when they act in good faith to disable access or remove material that appears unlawful on its face. It also treats violations of these duties as unfair or deceptive acts or practices, to be pursued by the FTC under the agency’s existing consumer-protection authority.
Definitions and the scope of covered platforms
Section 4 fixes the statutory glossary for the bill and sets out what counts as a covered platform: broadly, public-facing websites, online services, or apps that primarily provide a forum for user-generated content or whose regular business includes publishing or hosting nonconsensual intimate visual depictions. The provision draws lines around excluded services — for example, basic email and certain non-UGC, publisher-driven sites — to focus obligations on services that function as public forums for user content.
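Purely as an illustration of that scoping logic, the sketch below compresses the coverage test into a boolean predicate. The statutory definition is a legal standard, not a checklist, and every field name here is an assumption.

```python
from dataclasses import dataclass

# Illustrative only: these booleans stand in for fact-intensive legal questions.
@dataclass
class Service:
    public_facing: bool
    primarily_hosts_user_content: bool   # forum for user-generated content
    regularly_publishes_ncii: bool       # regular course of business
    is_basic_email: bool                 # example of an excluded service
    is_publisher_driven_site: bool       # non-UGC editorial sites are carved out

def is_covered_platform(s: Service) -> bool:
    """Rough sketch of the coverage test described above."""
    if not s.public_facing:
        return False
    if s.is_basic_email or s.is_publisher_driven_site:
        return False
    return s.primarily_hosts_user_content or s.regularly_publishes_ncii
```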
Severability
A standard severability clause preserves the remainder of the Act if any provision is held unenforceable. That keeps other criminal rules, platform duties, and enforcement mechanisms intact even if a court strikes part of the statute.
Who Benefits and Who Bears the Cost
Every bill creates winners and losers. Here's who stands to gain and who bears the cost.
Who Benefits
- Individuals targeted by nonconsensual intimate imagery — the bill gives victims a statutory takedown path and creates criminal penalties and restitution options against perpetrators.
- Law enforcement and prosecutors — they gain a dedicated federal statute tailored to image-based sexual abuse and digital forgeries that includes tools (forfeiture, restitution, and specified offenses) aligned with existing federal enforcement structures.
- Advocacy organizations that assist victims — the statute permits authorized representatives to submit takedown requests on behalf of victims, clarifying standing for third‑party assistance and improving access to remedies.
Who Bears the Cost
- Covered platforms and large UGC services — they must build and operate a complaint intake and triage process, staff content review, and implement mechanisms to find and remove identical copies across their services.
- Small and medium-sized platforms that host user content — while the bill targets platforms that primarily host UGC, compliance complexity (identity verification, notice tracking, duplicate detection) will create operational and legal costs for smaller operators that meet the definition.
- Content-moderation teams and vendors — increased volume of removal requests and faster turnaround expectations will raise labor needs, accelerate automation deployments, and increase legal review demands to avoid improper takedowns or liability.
Key Issues
The Core Tension
The central dilemma is protecting victims and deterring exploitative creators while avoiding overbroad suppression of speech and unsustainable operational burdens on platforms. The bill speeds removal and creates criminal liability to protect privacy and dignity, but those same mechanisms encourage rapid, imperfect takedowns and force platforms into case-by-case adjudications that they may be poorly placed to resolve.
The bill attempts a hybrid approach — criminalizing harmful individual conduct while placing civil‑regulatory duties on platforms — but that blend creates implementation puzzles. The statutory standard for a “digital forgery” hinges on whether a reasonable person would find an image indistinguishable from authentic material; litigating that standard will require technical and expert proof that could be expensive and inconsistent across jurisdictions.
Platforms that must act quickly to remove reported content will face tough choices between automated removal (which risks over‑blocking legitimate speech and lawful reporting) and slower human review (which can leave victims exposed). The statute’s good‑faith safe harbor for takedowns reduces legal risk for platforms that act promptly, but it also creates an asymmetric incentive to take content down first and defend later — a design that favors quick suppression over careful adjudication.
Enforcement via the FTC pulls the agency into content moderation disputes and broadens its jurisdictional reach, including over nonprofit organizations, which may stretch FTC resources and invite novel administrative litigation. The bill also leans heavily on victim-submitted notices that require a signed statement and contact information; while that helps prevent fraudulent or frivolous claims, it could raise safety concerns for victims who fear retaliation or for whom providing identifiable information is not feasible.
Finally, exceptions for law enforcement, medical, and legal uses are necessary but will require careful operational rules so platforms can recognize and process legitimate disclosures without creating loopholes that bad actors exploit.