Combating Deepfake Exploitation: What You Need to Know About the TAKE IT DOWN Act
Aug 22, 2025
By: Marisa McAdams, Administrative Assistant | Digital4Good
In an age where generative artificial intelligence can imitate real people’s voices and likenesses with chilling accuracy, synthetic non-consensual intimate imagery (NCII) has emerged as a pervasive form of digital abuse. From AI-edited nude images to pornographic videos, these digital violations are increasingly targeting women, minors, and public figures.
Now, with the TAKE IT DOWN Act signed into law on May 19, 2025, the United States government has taken a direct step to confront the harmful rise of deepfake abuse.
Why Was It Passed?
Victims have long lacked strong legal tools and protections to combat the spread of sexually explicit images created and distributed without consent, especially those generated by AI. Since generative AI tools became widely available, deepfakes (synthetic media made to simulate real people) have flooded social media platforms with fabricated pornographic content. In many cases, the targets remain unaware until the damage is done.
In October 2023, 14-year-old Elliston Berry of Aledo, Texas, didn't find out what had happened until she received a text from a friend: photos taken from her social media account had been altered to make her appear nude. Within a day, the doctored images of Berry and several of her female classmates were circulating around their high school. That same month, 14-year-old Francesca Mani discovered that she and her classmates at Westfield High School in New Jersey had been targeted in the same way.
The targeted students were left with little recourse to reclaim their privacy or to see the perpetrators punished. When Berry and Mani reported the violations to the social media platforms and school officials, their concerns were dismissed or outright ignored.
Berry said, “School is supposed to be our safe place. It's where we go to learn, get our education. Yet, there's so many cases around the world that people are not even wanting to go to class because they're being tormented on social media.”
Mani, Berry, and Berry’s mother, Anna, brought the issue to the attention of Senator Ted Cruz’s office. Subsequently, Cruz and Senator Amy Klobuchar introduced the “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act” (or the “TAKE IT DOWN” Act), which was swiftly passed with bipartisan support in the Senate and the House of Representatives.
Regarding the passage of the act, Elliston Berry commented, “I knew I could never go back and undo what [the perpetrator] did, but I wanted to do anything to help prevent this from happening to others. With the passage of the TAKE IT DOWN Act, we can protect future generations from having to experience the pain I went through.”
On May 19, 2025, President Donald Trump signed the act into law, with First Lady Melania Trump acting as co-signer.
What Does It Say?
The TAKE IT DOWN Act makes it a federal crime to knowingly publish NCII, both authentic and AI-generated, or to threaten to do so. Penalties include fines and prison time: up to two years for intimate visual depictions of adults, and up to three years for depictions of minors.
The Act also grants victims the right to request removal of NCII from social media platforms. Platforms must remove reported content within 48 hours of receiving a valid request and make reasonable efforts to remove any copies. To comply, platforms must establish a clear process for users to report NCII and have it taken down.
What Was the Response?
The TAKE IT DOWN Act has been widely endorsed by victim advocacy groups and technology companies. RAINN (Rape, Abuse, & Incest National Network) hailed the Act as a “landmark victory for survivors,” while Meta called it an “important step forward in fighting [intimate image] abuse and supporting those affected across the internet.”
While broadly popular, the Act has sparked First Amendment and privacy concerns among free speech advocates and digital rights groups. The EFF (Electronic Frontier Foundation) warned that the Act “pressures platforms to actively monitor speech, including speech that is presently encrypted.” The CCRI (Cyber Civil Rights Initiative) likewise criticized the lack of “safeguards against false or malicious reports,” which leaves the reporting process “highly susceptible to abuse.” Still, lawmakers argue that the urgent need to protect victims outweighs the risks.
What Can We as Citizens Do?
The TAKE IT DOWN Act represents a major milestone in legislation against intimate image abuse, offering long-overdue protections in an era where AI can too easily become a weapon. While challenges and concerns about free expression and digital privacy remain, the message is clear: technology companies must take responsibility, and victims of digital exploitation deserve justice.
If you or someone you know becomes the target of NCII, real or AI-generated, you now have a legal path to fight back. Victims can file takedown requests directly with social media platforms, and advocacy groups like RAINN and NCMEC (National Center for Missing & Exploited Children) offer resources to assist with reporting and recovery.
Furthermore, families, educators, and community leaders have a responsibility to help young people understand their rights and responsibilities in the digital era. To this end, Digital4Good has developed tailored curriculum packages to help schools and districts teach digital literacy and develop strong policies for responding to online abuse. To join the movement and stay updated on our latest resources for schools, please subscribe to our newsletter.