On April 28, the House of Representatives voted 409-2 to pass the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act (“TAKE IT DOWN Act”), which criminalizes the publication of nonconsensual intimate visual depictions (“NCII”) and requires online platforms to establish a notice and takedown process for NCII. The Act, which previously had been passed by the Senate, now goes to the President’s desk for signature. President Trump has indicated that he intends to sign the bill into law.
The TAKE IT DOWN Act prohibits using an interactive computer service to knowingly publish NCII depicting minors or adults, including certain NCII created using AI or other computer-generated or technological means. Penalties for violations range from a maximum of 18 months to three years of imprisonment, depending on the nature of the offense.
In addition, the bill requires online platforms to establish a notice and takedown process for NCII. Specifically, the bill requires a “covered platform” to establish a process by which individuals (or their authorized representatives) can notify the platform of NCII depicting the individual that was published without the individual’s consent. Upon receipt of a valid removal request, the covered platform must remove the NCII as soon as possible, and in no event later than 48 hours after receiving the request. Covered platforms must also “make reasonable efforts to identify and remove any known identical copies of such depiction.”
“Covered platform” is defined in the bill as a website, online service, online application, or mobile application that either “primarily provides a forum for user-generated content” or, in the regular course of business, publishes, curates, hosts, or makes available NCII, subject to certain exceptions.
Failure to comply with the bill’s notice and takedown requirements is enforceable by the Federal Trade Commission and constitutes a violation of a rule defining an unfair or deceptive act or practice under Section 18(a)(1)(B) of the FTC Act. The bill provides that a covered platform is not liable for any claim based on the platform’s “good faith disabling of access to, or removal of, material claimed to be a nonconsensual intimate visual depiction based on facts or circumstances from which the unlawful publishing of the intimate visual depiction is apparent.”