When Elliston Berry was 14 years old, one of her classmates created fake explicit pictures of her using AI and shared them on social media. She didn’t know where to get help or how to remove the photos. But now she wants to make sure other young people don’t go through the same thing.
She has created an online training program to teach students, parents, and teachers about fake explicit images created using AI. The 17-minute online course teaches people how to understand and recognize AI-generated fake images, fake sexual images, and sextortion.
Sextortion is a scam where criminals trick victims into sending explicit photos online, then threaten to share them unless the victim pays money or sends more images. This has affected thousands of teens in recent years and has led to several teen suicides.
The course also links to help resources from RAINN, a support organization for victims of sexual violence. It explains the legal penalties under the Take It Down Act and shows how to get images removed. Berry said it took her nine months to get the fake pictures of her removed from social media. Now, the Take It Down Act requires social media platforms to remove these images within 48 hours after being notified about them.
“It’s not just for the potential victims, but it’s also for the potential perpetuators of these types of crimes,” said Adaptive Security CEO Brian Long. “They need to understand that this isn’t a prank, right? … It’s against the law and it’s really, really harmful and dangerous to people.”
Adaptive Security is offering the course for free to schools and parents.
“I know a handful of girls that this has happened to in just the past month,” Berry said. “It is so scary, especially if no one knows what we’re handling. So, I think it’s super important to take initiative, learn more, educate more and have conversations.”
This type of harassment is becoming more common because AI tools that create fake sexual images are easy to find and use. Recently, Elon Musk's company xAI drew criticism after its AI chatbot Grok was used repeatedly to create nude or sexual AI images of women and children. (The company has since restricted this feature.)
Research from a non-profit organization called Thorn shows that one out of every eight US teenagers knows someone who has been targeted by fake nude images. This is happening even though there’s now a law called the Take It Down Act that makes sharing these images illegal. President Donald Trump signed this law last year, and Berry helped push for it.