Federal Law Creates ‘Delete Button’ for AI-Generated Deepfakes and Revenge Porn

President Trump signed the TAKE IT DOWN Act into law yesterday, creating the first federal framework for removing non-consensual intimate images—including AI-generated deepfakes—from online platforms. The law gives victims a mechanism to request removal and imposes criminal penalties on those who distribute such content.

Why This Matters: The AI Deepfake Problem

Artificial intelligence has made it trivially easy to create convincing fake intimate images of anyone. You don’t need technical skills or expensive software—just a few photos scraped from social media and a free AI tool. The results can be disturbingly realistic.

The statistics are stark: 96% of deepfake videos online are pornographic, and research suggests that 90-95% of these are non-consensual. Women are overwhelmingly targeted, representing 99% of victims in deepfake pornography. And the problem is accelerating—what once required sophisticated expertise now takes minutes.

Until now, victims had limited recourse. Some states had laws addressing “revenge porn,” but they varied widely and didn’t always cover AI-generated content. There was no federal law and no consistent platform removal process.

What the TAKE IT DOWN Act Does

The law operates on two tracks: immediate criminal penalties and future platform requirements.

Criminal Provisions (Effective Immediately)

It is now a federal crime to knowingly distribute non-consensual intimate images in interstate or foreign commerce. This includes:

  • Authentic images: Real intimate photos or videos shared without consent
  • “Digital forgeries”: AI-generated deepfakes that a reasonable person couldn’t distinguish from real images

The penalties are significant: up to two years in prison when the person depicted is an adult, and up to three years when the victim is a minor. The law also clarifies an important distinction: consent to create an intimate image does not equal consent to publish it.

Platform Requirements (Effective May 19, 2026)

Starting one year from now, “covered platforms” must establish procedures to receive and process takedown requests for non-consensual intimate images. The key requirements:

48-Hour Removal Deadline: Platforms must remove reported content within 48 hours of receiving a valid notice.

Hash-Based Blocking: After removing an image, platforms must make “reasonable efforts” to prevent identical copies from being re-uploaded. This typically involves creating a digital fingerprint (hash) of the removed content and blocking future uploads that match.
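The mechanics can be sketched in a few lines. This is illustrative only: the statute does not prescribe a technique, and the function names here are hypothetical. Note that a cryptographic hash such as SHA-256, used below for simplicity, only catches byte-identical copies; production systems typically use perceptual hashes (e.g., PhotoDNA or PDQ) so that re-encoded or lightly altered copies still match.

```python
import hashlib

# Fingerprints of content removed under valid takedown notices.
blocked_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Compute a digital fingerprint (here, a SHA-256 hex digest)."""
    return hashlib.sha256(content).hexdigest()

def record_removal(content: bytes) -> None:
    """After removing reported content, remember its fingerprint."""
    blocked_hashes.add(fingerprint(content))

def is_blocked(upload: bytes) -> bool:
    """Reject any new upload whose fingerprint matches removed content."""
    return fingerprint(upload) in blocked_hashes

record_removal(b"reported-image-bytes")
assert is_blocked(b"reported-image-bytes")       # exact copy is blocked
assert not is_blocked(b"different-image-bytes")  # unrelated upload passes
```

The trade-off is why the law says "reasonable efforts" rather than guaranteeing perfect blocking: exact-match hashing is cheap and false-positive-free, while perceptual matching is more robust but requires tuning a similarity threshold.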

FTC Enforcement: The Federal Trade Commission can enforce these requirements. Violations are treated as unfair or deceptive practices under existing consumer protection law.

Safe Harbor: Platforms that remove content in good faith are protected from liability—even if the content is later determined to be lawful. This encourages platforms to err on the side of victims.

For Victims: What This Means

If you’re a victim of non-consensual intimate images—whether authentic or AI-generated—here’s what you need to know:

Federal Criminal Law: Distribution of your images without consent is now a federal crime. While this doesn’t guarantee prosecution in every case, it provides law enforcement with a consistent tool across all 50 states.

Platform Removal (Starting 2026): Once the platform requirements take effect in May 2026, you’ll be able to submit takedown requests to major social media platforms, forums, and other sites. They’ll have 48 hours to remove the content and must take steps to prevent it from being re-uploaded.

Covers AI Deepfakes: The law explicitly covers “digital forgeries”—AI-generated images that appear real. The standard is whether a “reasonable person” would be unable to distinguish the image from an authentic one. This addresses a key gap in existing state laws, many of which only covered authentic images.

No Need for a Lawyer: The law is designed to allow victims to submit takedown requests directly. You won’t need to hire an attorney to use this process.

For Platforms: What You Need to Know

If you operate a platform that hosts user-generated content, you have one year to implement compliance procedures.

Who Is Covered?

The law applies to “covered platforms”—services that host user-submitted content where intimate images could appear. This likely includes:

  • Social media platforms (Facebook, Instagram, X/Twitter, TikTok, Snapchat)
  • Forums and message boards (Reddit, Discord)
  • Video and image hosting sites (YouTube, Imgur)
  • Cloud storage services with sharing features
  • Adult content platforms

Private, end-to-end encrypted messaging (like Signal or WhatsApp) and email are likely not covered, as they’re not platforms where content is “hosted” for public or group access.

What You Must Do

By May 19, 2026, you must:

  1. Establish a Takedown Process: Create a clear procedure for receiving and reviewing notices of non-consensual intimate images.
  2. Remove Within 48 Hours: Once you receive a valid notice, you have 48 hours to remove the content.
  3. Block Identical Copies: Implement systems (typically hash-matching) to prevent the same content from being re-uploaded.
  4. Respond to the FTC: If the FTC investigates, you must be able to demonstrate your compliance procedures.
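The 48-hour clock in step 2 can be tracked with a simple deadline check. This is a minimal sketch for compliance tooling, not anything the statute prescribes; the helper names are hypothetical, and a real system would also log notice validity review and removal timestamps for step 4.

```python
from datetime import datetime, timedelta, timezone

# The statute's removal window: 48 hours from receipt of a valid notice.
REMOVAL_WINDOW = timedelta(hours=48)

def removal_deadline(notice_received: datetime) -> datetime:
    """Deadline by which the reported content must be removed."""
    return notice_received + REMOVAL_WINDOW

def is_overdue(notice_received: datetime, now: datetime) -> bool:
    """True if the removal window has elapsed without action."""
    return now > removal_deadline(notice_received)

received = datetime(2026, 6, 1, 9, 0, tzinfo=timezone.utc)
deadline = removal_deadline(received)       # June 3, 2026, 09:00 UTC
assert not is_overdue(received, received + timedelta(hours=47))
assert is_overdue(received, received + timedelta(hours=49))
```

Using timezone-aware UTC timestamps avoids ambiguity about when the clock started if notices arrive from users in different time zones.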

Safe Harbor Protection

The law provides important protection: if you remove content in good faith based on a takedown notice, you’re protected from liability even if it turns out the content was lawful. This is meant to encourage quick action without fear of being sued by users whose content was removed.

Timeline and What Comes Next

Now: Criminal provisions are in effect. Law enforcement can prosecute those who knowingly distribute non-consensual intimate images.

May 19, 2026: Platform takedown requirements become enforceable. The FTC can take action against platforms that don’t comply.

Coming Soon: The FTC is expected to issue guidance on what constitutes a “covered platform,” what makes a notice “valid,” and what “reasonable efforts” means for hash-blocking.

What This Doesn’t Do

This law creates a framework for removing content from platforms and prosecuting distributors. It does not:

  • Criminalize the creation of deepfakes (only distribution)
  • Require AI tools to have built-in safeguards
  • Create a private right of action (victims cannot sue in civil court under this law)
  • Guarantee that all content will be removed (enforcement depends on platform compliance and jurisdictional reach)

The Bottom Line

The TAKE IT DOWN Act represents the first comprehensive federal attempt to address non-consensual intimate images in the age of AI. For victims, it provides a clear path to removal and federal criminal penalties for perpetrators. For platforms, it sets a national standard with a one-year implementation deadline.

Whether this will prove effective against the scale and sophistication of AI-generated abuse remains to be seen. But for the first time, there’s a federal “delete button” that victims can use—and platforms must honor.


This analysis is based on the text of S.146 as signed into law on May 19, 2025. Platform compliance requirements take effect May 19, 2026. For victims seeking immediate assistance, contact the Cyber Civil Rights Initiative (cybercivilrights.org) or the National Sexual Assault Hotline (1-800-656-4673).