A New Jersey high school student who found herself the unwitting subject of AI-generated deepfake porn circulated last fall on Snapchat is fighting back against the bogus photos with a very real federal lawsuit—which accuses one of her own classmates of perpetrating the lewd con.

In it, 15-year-old “Jane Doe” says a fellow pupil at Westfield High School identified as “K.G.” used an AI app to make her appear fully nude in perfectly innocent pictures he downloaded off social media. Doe is asking for $150,000 per “intimate visual depiction,” plus an injunction and temporary restraining order prohibiting K.G. from further disseminating the offending material.

“Based on the timing of the events at issue in this case and the photos appearing on her Instagram account at such time, Jane Doe believes that the original photo [K.G.] used to create a nude image or images of her was taken when Jane Doe was 14 years old and depicts her posing for a picture taken by another person in a public setting in which Jane Doe’s face and the entire front side of her (clothed) body, including her chest and pelvis, are visible,” states the lawsuit, which was filed Feb. 2 and has not been previously reported.

K.G. also victimized “several other minor girls” with similarly manipulated “nudes” taken from their social media accounts, according to the lawsuit. All of them were clothed in the original images, it says.

The situation made national headlines last November and triggered a police investigation. It also led to calls for stronger laws and regulations that would better protect unwitting targets from being victimized by nonconsensual deepfakes.

In January, after phony nude images of Taylor Swift flooded X, formerly known as Twitter, a bipartisan group of U.S. senators introduced a bill that would “hold accountable those who are responsible for the proliferation of nonconsensual, sexually-explicit ‘deepfake’ images and videos.”

On Thursday, Doe’s lawyer, Shane Vogt, told The Daily Beast that the unusually sensitive nature of the case precludes him from discussing it in detail until it has been adjudicated.

“Out of respect for the court and legal process and to ensure the anonymity of everyone involved, we cannot comment on any specifics about this case but hope it is successful and will demonstrate that there is something victims can do to protect themselves from the AI pornography epidemic,” Vogt said. “All the girls and women being victimized by nonconsensual pornography created using artificial intelligence deserve to have someone willing to fight for them and their privacy, and we are extremely proud and humbled to have been given the responsibility to do that for our incredibly brave client in this case.”

Vogt declined to say how many doctored pictures of his client he believes are out there.

Attorney Christopher Adams, who is representing K.G. and his parents, told The Daily Beast, “I’m the first to admit that artificial intelligence has taken us to a place that we never envisioned. But I can tell you that it has not created a federally justiciable issue. There are no laws [here] that have been broken, and no causes of action have been met. What I want to make very clear is that AI images such as those alleged here are computer-generated. They’re not of an individual. They are generated through artificial intelligence to appear as a computer thinks they should. It’s no different than CGI or anime.”

Adams would not disclose K.G.’s age or whether he remains a student at Westfield High.

The case can be traced back to August 2023, when Doe received a follow request from K.G. on Instagram, where she keeps her account private, according to the lawsuit. After Doe accepted, K.G. downloaded or screenshotted pictures—fully clothed—of Doe that she had posted, along with ones of “several other minor girls from their social media accounts,” the lawsuit states.

K.G. then used an AI app called Clothoff.io to artificially undress Doe and the others, according to the suit.

“ClothesOff [sic] touts its ability to ‘be used quickly’ and ‘easily’ to ‘undress anybody,’” it says. “... ClothesOff [sic] also specifically highlights its proficiency in producing nude images of women and girls from their ‘[I]nstagram screenshots’ and allowing users to ‘strip your dream girl naked.’”

Once K.G. created the ersatz nudes, which the lawsuit calls “virtually indistinguishable from real, unaltered photos,” he distributed them to his and Doe’s classmates via a Snapchat group, it continues, arguing that a “reasonable, unsuspecting viewer” wouldn’t have any inkling the pictures were computer-made.

“Jane Doe and her parents did not learn about the existence of the Nude Photos until October 20, 2023, when Jane Doe’s parents received an email from her school and Jane Doe’s mother received a phone call from the school’s assistant principal,” the lawsuit goes on. “During this phone call on October 20, 2023, Jane Doe’s mother was informed that nude images of students generated using artificial intelligence were created and distributed amongst students and that Jane Doe was a ‘confirmed victim’ of pornography being circulated.”

The same day, K.G.’s father contacted Doe’s parents about what “[their son] did,” according to the suit. The Westfield Police Department quickly launched an investigation, which led Doe to believe “that some actions would be taken to hold [K.G.] accountable for his conduct, ensure the Nude Photos had been deleted, determine the extent of their dissemination, and try to ensure that no further dissemination occurred,” the lawsuit states.

“However, Jane Doe and her parents were informed on January 24, 2024, that charges could not [be] pursued at that time because the facts gathered by Jane Doe’s school could not be used to support the investigation and because [K.G.] and other potential witnesses failed to cooperate with, speak to, or provide access to their electronic devices to law enforcement.”

The photos “are lasting images depicting Jane Doe and the other girls, all identifiable minors, seemingly posing nude in a sexually explicit manner,” which “reasonably conveys the false impression that Jane Doe and the other girls depicted therein willingly posed and were photographed nude and disseminated their own nude photos to others,” the suit states.

What K.G. is alleged to have done is not constitutionally protected speech, the suit contends, citing past court decisions holding that “child pornography created by imposing the actual faces of real children onto the bodies of others” is a criminal violation. If the court does not step in to ensure K.G. is stopped, Doe “is powerless against the sense of hopelessness and fear she must endure from knowing that the nude images of her… exist and will almost inevitably make their way onto the Internet (if they have not already).”

There, Doe’s sexualized image will be at the whim of “others, such as pedophiles and traffickers, resulting in a sense of hopelessness and perpetual fear that at any time such images can reappear and be viewed by countless others, possibly even their friends, family members, future partners and employers, or the public at large,” according to the lawsuit.

Lawyers for both sides are scheduled to appear in court on Feb. 14.
