Schools face a new threat: “nudify” sites that use AI to create realistic, revealing images of classmates

In October last year, a 14-year-old girl named Francesca Mani was sitting in her high school history class when she heard a rumor that some boys had naked photos of female classmates. She soon learned her picture was among them, but the images were doctored – created with artificial intelligence using what’s known as a “nudify” website or app, which turns real photos of someone fully clothed into real-looking nudes. We’ve found nearly 30 similar incidents in U.S. schools over the last 20 months, and plenty more around the world. We want to warn you: some of what you’ll hear – and see – is disturbing, but we think unveiling these “nudify” websites is important, in part because they’re not hidden on the dark web. They are openly advertised, easy to use, and, as Francesca Mani found out, there isn’t much that’s been done to stop them.

Anderson Cooper: When you first heard the rumor, you didn’t know that there were photos– or– or a photo of you?

Francesca Mani: No. We didn’t know. I think that was like the most chaotic day I’ve ever witnessed. 

Anderson Cooper: In a school, somebody gets an inkling of something, and it just spreads.

Francesca Mani: It’s like rapid fire. It just goes through everyone. And so then when someone hears– hears this, it’s like, “Wait. Like, AI?” Like, no one thinks that could, like, happen to you.

Francesca Mani knew nothing about “nudify” websites when she discovered she and several other girls at Westfield High School in New Jersey had been targeted. According to a lawsuit later filed by one of the other girls through her parents, a boy at the school uploaded photos from Instagram to a site called Clothoff. We are naming the site to raise awareness of its potential dangers. 

Dorota Mani and Francesca Mani / 60 Minutes

There are more than a hundred of these “nudify” websites – a quick search is all it takes to find them. Clothoff is one of the most popular, with more than 3 million visits last month, according to Graphika, a company that analyzes social networks. It now offers to “nudify” males as well – but female nudes are far more popular. “Have someone to undress?” Clothoff’s website asks. You can upload a photo – or get a free demonstration, in which an image of a woman appears with clothes on, then a few seconds later, her clothes are gone. We are blurring it out but the results look very real.

Francesca Mani never saw what had been done to her photo, but according to that lawsuit at least one girl’s AI nude was shared on Snapchat and seen by several kids at school. What made it worse, Francesca says, is that she and the other girls found out they were the victims when they were called by name to the principal’s office over the school’s public address system.

Francesca Mani: I feel like that was a major violation of our privacy while, like, the bad actors were taken out of their classes privately. When I left the principal’s office, I was walking through a hallway, and I saw these group of boys laughing at these group of girls crying. And that’s when I realized I should stop crying and be mad, because this is unacceptable.

That afternoon, Westfield’s principal sent this email to all high school parents, informing them “some of our students had used artificial intelligence to create pornographic images from original photos.” The principal also said the school was investigating and “at this time we believe that any created images have been deleted and are not being circulated.” 

Francesca’s mom Dorota, who’s also an educator, was not convinced. 

Anderson Cooper: Do you think they did enough?

Dorota Mani: Well, I don’t know, Anderson. You work in television. Is anything deleted in the digital world?

Anderson Cooper: You feel like even if somebody deletes something somewhere, who knows where these images may be?

Dorota Mani: Who printed? Who screenshotted? Who downloaded? You can’t really wipe it out.

Dorota says she filed a police report, but no charges have been brought. She was shocked by the school’s handling of the whole incident.

Dorota Mani: The principal informed me that one boy receives one-day suspension, and that was it. So I ask her if this is all. Are there gonna be any other consequences? And she said, “No, that’s– for now, this is all that is going to happen.”

The school district wouldn’t confirm details about the photos, the students involved, or any disciplinary action. In a statement to 60 Minutes, the school superintendent said the district revised its Harassment, Intimidation and Bullying policy to incorporate AI — something the Manis said they spent months urging school officials to do. 

Anderson Cooper: You feel like the girls paid a bigger cost in the end-

Francesca Mani: Yeah, they did.

Anderson Cooper: Than the boy or boys–

Francesca Mani: Yeah.

Anderson Cooper: – who were involved in this did?

Francesca Mani: Because they just have to live with knowing that maybe an image is floating, their image is floating around the internet. And they just have to deal with what the boys did.

Kolina Koltai has been looking into Clothoff and other “nudify” sites for more than a year. She’s a senior researcher who specializes in the misuse of AI at Bellingcat, an international investigative group.

Kolina Koltai, a senior researcher at the international investigative group Bellingcat, speaks with Anderson Cooper. / 60 Minutes

Anderson Cooper: This site, as soon as you get there, it says, you have to be 18 or over to use the website. You can’t use others’ photos without their permission. You can’t use pictures of people who are under 18. Is there any way for them to actually check if you’re–

Kolina Koltai: No.

Anderson Cooper: –under 18 or over 18?

Kolina Koltai: You’ll see, as we click accept, that there’s no verification. And now we’re already here.

Anderson Cooper: And immediately, you’re getting very explicit photos.

Kolina Koltai: And then they have the poses feature, which is one of their new settings, which is the different sex poses, which is the premium feature.

Anderson Cooper: Wow. So– wow.

Kolina Koltai: This is the preview. We haven’t…

Clothoff and other “nudify” sites encourage customers to promote their services on social media, and users often show off their favorite before and after AI nudes.

Kolina Koltai: I’ve even seen on social media platforms people showing before and after photos of what are clearly, like, high school girls. And I’ve, like, reverse image searched the original photo. And they’re, like, a high school girl’s, like, swim meet. You’ll see these are very clearly, these are minors, and adult content is being made of them nonconsensually, then also being posted on social media.

Anderson Cooper: I think a lotta parents would be surprised to learn that you post a picture of your child on your Instagram account–your child could end up– a naked photo of your child out there.

Kolina Koltai: Yeah. And so you have a registration…

To “nudify” a photo on Clothoff is free… the first time. After that it costs from $2 to $40. The payment options often change, but there are always plenty to choose from.

Kolina Koltai: It’s giving me everything from crypto using a credit card for a variety of different credit cards. We got PayPal here. Google Pay.

Anderson Cooper: I would imagine, some of these companies are not thrilled that their services are being used by these websites.

Kolina Koltai: Yeah. And in many of these cases, it directly violates their policies.

To trick online payment services, Kolina Koltai says, Clothoff and other “nudify” sites redirect their customers’ payments through phony websites like these, pretending to sell flowers and photography lessons.

Nudify sites: Kolina Koltai speaks with Anderson Cooper. / 60 Minutes

Kolina Koltai: Say, for example, you want to pay through PayPal. So we click this, and it’ll take a second. So it’s now redirecting you.

Anderson Cooper: It’s redirecting through a dummy–

Kolina Koltai: A dummy website.

Anderson Cooper: –website.

Kolina Koltai: So that way, on PayPal’s end, it looks like you may be purchasing anything from–motorcycles or bee-keeping lessons or Rollerblade lessons. And so now we got to a PayPal screen. But we can see down here it says, “Cancel and return to innernookdesigns.motorcycles.”

Anderson Cooper: So that’s what PayPal is being told is the website that’s asking for the– the charge.

Kolina Koltai: Yes.

PayPal told us it banned Clothoff from its platforms a year ago and shuts down the accounts for these redirect sites when it finds them. The problem is Clothoff often just creates new ones.

And that’s not the only deception it relies on. Clothoff’s website lists a name, Grupo Digital, with an address in Buenos Aires, Argentina, implying that’s where the company is based. But when we sent our cameras, there was no Grupo Digital there. It turned out to be the office of a YouTube channel that covers politics, and when we knocked on the door…

…the employee who answered said she had never heard of Clothoff.

Employee (in Spanish): No, no, no, no, we don’t do that, we’re not that company.

Clothoff also made up a fake CEO, according to Kolina Koltai, complete with what she says is an AI-generated headshot.

Kolina Koltai: There is a really inherent shadiness that’s happening. They’re not being transparent about who owns it. They’re obviously trying to mask their payments. But you look at the sophistication of these really large sites, it’s completely different than say some guy in a basement that set up a site that he’s trying to do it on his own. When these sites launched, and the way that they’ve been developing and going this past year, it is not someone’s first rodeo. It’s not the first time they set up a complex network.

Clothoff claims on its website that “processing of minors is impossible.” We emailed what the site says is a press contact, asking for any evidence of that and for a response to a number of other questions. We didn’t hear back.

Yiota Souras: A lot of people might say, “Well, these images are fake.” But we know victims will suffer– humiliation. They’ll suffer– you know, mental health distress and– and reputational harm. In a school setting it’s really amplified, because one of their peers has created this imagery. So there’s a loss of confidence. A loss of trust. 

Yiota Souras, chief legal officer at the National Center for Missing and Exploited Children, speaks with Anderson Cooper. / 60 Minutes

Yiota Souras is chief legal officer at the National Center for Missing and Exploited Children. Her organization regularly works with tech companies to flag inappropriate content on their sites. 

Anderson Cooper: In at least three cases Snapchat was reportedly used to circulate these photos. In one instance a parent told us that it took more than eight months to get the accounts that had shared the images taken down.

Yiota Souras: Their responsiveness to– to victims, that is a recurring problem that we see across tech companies.

Anderson Cooper: So it’s not as– it’s not as easy as a parent–

Yiota Souras: No.

Anderson Cooper: –sending a note through Snapchat, “Hey– this is happening. My child has been exploited.”

Yiota Souras: No. It’s– it’s entirely unclear why it is not a faster process. We can actually notify tech companies as well and ask them to take that content down.

Anderson Cooper: And in your experience do they?

Yiota Souras: Much faster than when an individual calls. Yes. That isn’t the way it should be. Right? I mean a parent whose child has exploitative or child pornography images online should not have to rely on reaching out to a third party, and having them call the tech company. The tech company should be assuming responsibility immediately to remove that content.

Anderson Cooper: Why are they not doing that?

Yiota Souras: Because I do not think there are ramifications to them not doing so.

Social media companies are shielded from lawsuits involving photos someone posts online due to what Yiota Souras considers an outdated law. 

Yiota Souras: Under section 230 of the Communications Decency Act, a law from 1996, so very different world back then. Online platforms have near complete immunity for any liability arising from content that a user puts on their system. The section 230 protection is really what allows this very loose ecosystem to exist in terms of “nudify” apps and websites that cause harm to children.

We asked Snapchat about that parent who told us the company didn’t respond to her for eight months. A Snapchat spokesperson told us they’ve been unable to locate her request and said in part: “We have efficient mechanisms for reporting this kind of content,” and added “We have a zero-tolerance policy for such content” and “…act quickly to address it once reported.”

AI nudes of minors are illegal under federal child pornography laws, according to the Department of Justice, if they depict what’s defined as “sexually explicit conduct.” But Souras is concerned some images created by “nudify” sites may not meet that definition.

Yiota Souras: There is this gap in the law around a “nudify” app that desperately needs to be shut.

Anderson Cooper: What are the gaps in the law?

Yiota Souras: Currently a nude image of a child that does not include sexually explicit conduct is not illegal. And that is a serious gap that exists for real children and that exists certainly for images of nude children that are created by a “nudify” app.

Francesca Mani, speaking at an Encode Justice event: Send a clear message that what the boys had done…

In the year since Francesca Mani found out she was targeted, she and her mom have urged schools to implement policies around AI and worked with members of Congress to try to pass a number of federal bills.

The Take It Down Act, co-sponsored by Sens. Ted Cruz and Amy Klobuchar, made it through the Senate this month and is now awaiting a vote in the House. It would create criminal penalties for sharing AI nudes and require social media companies to take photos down within 48 hours of getting a request.

Anderson Cooper: Schools don’t really know how to address this. Police in many cases don’t do much at this stage. And the sites are making, I presume, millions of dollars off this. So can it be fixed?

Yiota Souras: Absolutely. If we have the appropriate laws we will have the criminal consequences, first of all to deter offenders, and then they’ll be held liable if they are still using these apps. We would have civil remedies for victims. Schools would have protocols. Investigators and law enforcement would have roadmaps on how to investigate. What charges to bring. But we’re a long way from that. We just need the laws in place. All the rest will come from that.

If you or someone you know needs help, contact the National Center for Missing & Exploited Children at 1-800-THE-LOST or www.ncmec.org.

Produced by Nichole Marks and John Gallen. Broadcast associate, Grace Conley. Edited by Daniel J. Glucksman.
