New Federal AI Deepfake Law Takes Effect: 4 Steps Schools Must Take Under the “Take It Down” Act
Insights
6.03.25
A sweeping new federal law gives schools a powerful tool to fight the growing threat of AI-generated deepfake pornography and nonconsensual explicit content. Signed into law on May 19 and taking effect immediately, the “Take It Down” Act makes it a federal crime to knowingly publish sexually explicit images – real or digitally manipulated – without the depicted person’s consent. The bipartisan legislation, which passed Congress nearly unanimously, comes amid a surge in deepfake harassment affecting students and educators, especially teenage girls targeted by explicit AI-generated content. Schools must now prepare for new legal duties, reporting processes, and the potential for increased investigations. Here are four steps your institution should take right now to comply with the law and protect your school community.
Take It Down Act, in a Nutshell
Victims of revenge porn and explicit deepfakes have previously faced substantial difficulty removing explicit content online. But the Take It Down Act gives victims a nationwide remedy against the publishers of explicit content and “covered online platforms” that host explicit content. A covered platform includes public websites, online services, and applications that primarily provide a forum for user-generated content.
Authentic Intimate Visual Depictions and Digital Forgeries
The Act outlaws the knowing publication of “authentic intimate visual depictions” and “digital forgeries.” The Act’s definitions of these categories differ between adults and minors.
Authentic Intimate Visual Depiction
An actionable authentic intimate visual depiction of an adult must meet the following elements:
- Obtained or created when the person knew or reasonably should have known the identifiable individual had a reasonable expectation of privacy;
- Not voluntarily exposed by the identifiable individual in a public or commercial setting;
- Not a matter of public concern; and
- Publication was intended to cause harm or causes harm, including psychological, financial, or reputational harm, to the identifiable individual.
An actionable authentic intimate visual depiction of a minor, however, only needs to meet one of the following elements:
- Published with the intent to abuse, humiliate, harass, or degrade the minor; or
- Published with the intent to arouse or gratify the sexual desire of any person.
Digital Forgery
An actionable digital forgery of an adult must meet the same elements as an authentic intimate visual depiction, except that the first element instead requires a showing that the digital forgery was published without the consent of the identifiable individual.
An actionable digital forgery of a minor requires the same elements as an authentic intimate visual depiction.
Exceptions
The Act contains a few exceptions to criminal liability, including disclosures made as part of a lawfully authorized investigative process, medical education or treatment, legitimate scientific purposes, or those made by the depicted person themselves.
Penalties
Those convicted of publishing authentic intimate visual depictions or digital forgeries face up to two years of imprisonment for content depicting adults and up to three years for content depicting minors. The Act also penalizes threats involving authentic intimate visual depictions and digital forgeries, with imprisonment of up to two years when the threatened content depicts an adult and up to 30 months when it depicts a minor.
Notice and Removal of Nonconsensual Intimate Visual Depictions
Perhaps the most important provision of the Act is the requirement that covered platforms must, within one year of enactment (by May 19, 2026), establish a process by which an identifiable individual (or an authorized person acting on behalf of that individual) may notify the platform of a nonconsensual intimate visual depiction and request that the platform remove it.
The notification and request for removal must include:
- A physical or electronic signature of the identifiable individual;
- An identification of, and information reasonably sufficient for the covered platform to locate, the intimate visual depiction of the identifiable individual;
- A brief statement that the identifiable individual has a good faith belief that any intimate visual depiction identified is not consensual; and
- Information sufficient to enable the covered platform to contact the identifiable individual (or an authorized person acting on behalf of such individual).
4 Things Your School Should Do
As a result of this new law, your school should consider taking these four steps.
1. Ensure Your Student Conduct Policies and Procedures Are Up to Date
Consider whether your student conduct policies and procedures are up to date with the latest developments and address the following situations:
- How will your school handle violations involving students or even faculty members and parents?
- Is your school prepared to cooperate with law enforcement when students or faculty members are involved in investigations concerning violations of the law?
- What educational programs has your school implemented to explain to students the dangers of deepfakes?
2. Recognize That the Law Also Applies to Images of Faculty and Parents
The language of the Act isn’t just limited to sexually explicit content involving minors. If there is reason to believe that a faculty member or parent is subject to or threatened with nonconsensual sexually explicit content, members of your school’s community now have more options to seek that content’s removal.
3. Prepare for Subpoena Compliance
Expect an increase in litigation involving families affected by revenge porn and deepfakes, whether or not your school is directly named as a party to the lawsuit. Remember, authentic intimate visual depictions and digital forgeries may be shared without penalty as part of an ongoing legal proceeding or legally authorized investigation. Consider whether your school maintains any records of such content as part of the student discipline process, as those records may be subject to subpoena.
4. Consider How Your School Will Handle Records of Student Misconduct Involving Sexually Explicit Content
Covered platforms have a year to implement a notice and removal process, and once it takes effect, victims will need to submit information sufficient to locate the explicit content along with a brief statement of their good faith belief that it was not consensual. Given that months may pass before covered platforms’ notice procedures are in place, ensure that your school maintains ample documentation of student misconduct issues involving deepfakes. What your school knows about these incidents may be critical for students, faculty members, and parents alike to seek the recourse they deserve under the Act.
Conclusion
Please consult your Fisher Phillips attorney, the authors of this Insight, or any attorney on our Education Team to obtain practical advice and guidance on how to navigate changes in federal and state law impacting explicit content and schools. Please also make sure you are subscribed to Fisher Phillips’ Insight System to get the most up-to-date information.
Related People
- Preston L. Buchanan, Associate
- Jennifer B. Carroll, Partner