Karen Painter Randall, chair of the Cybersecurity, Data Privacy and Incident Response Group, was interviewed by the NJ Law Journal regarding a new lawsuit filed in New Jersey federal court that will address whether victims of “deepfake” pornography generated by artificial intelligence can recover damages.
The suit’s plaintiff is a 15-year-old high school student who says one of her classmates used an AI application called ClothesOff to create and distribute fake nude images of her without her consent, according to the NJ Law Journal. The victim seeks damages under tortious causes of action as well as two federal statutes, 15 U.S.C. §6851 and 18 U.S.C. §2252A. The first allows a person whose intimate image was disclosed without consent to recover liquidated damages of $150,000 plus the costs and expenses associated with litigation; the second affords civil remedies to victims of child pornography.
New Jersey legislation cited in the lawsuit, N.J.S.A. 2A:58D-1, also allows child pornography victims to recover damages. However, no federal or state law currently imposes civil or criminal penalties specifically for using AI technology to create and disseminate pornographic deepfakes without the subject’s consent.
Randall told the NJ Law Journal that this incident “signifies the immediate need for federal legislation to regulate AI and the production of pornographic deepfakes, including imposing civil and criminal penalties.” She said the fact that pop music star Taylor Swift was recently the victim of a similar incident might prompt lawmakers to move more quickly and seriously consider federal legislation criminalizing the digital manufacture of nude images without the subject’s consent.
Meanwhile, a bill that would impose criminal and civil penalties for the nonconsensual disclosure of deepfake pornography has been introduced by New Jersey State Senators Kristin Corrado and Jon Bramnick and is pending before the Senate Judiciary Committee.