Nude Student Photos Roil Another LA County School As Case Is Handed To Prosecutors

A 16-year-old girl and her mother claim the girl's former friend filmed her while she was taking a shower.

CALABASAS, CA — An investigation into allegations that a Calabasas High School student created nude images of a friend without her consent and distributed them throughout the school has been handed off to the Los Angeles County District Attorney's Office, the DA's office confirmed to Patch Thursday.

The case centers on nude images of a 16-year-old sophomore, which she claims were shared with classmates. The girl and her mother sat down for several TV interviews that aired this week, where they said the girl was secretly filmed in the shower by her former friend.

Authorities were investigating to determine how exactly the images were created: Reports varied about whether the student used artificial intelligence or other digital means to put the girl's face on nude bodies.

The case, along with its fallout and the trauma it has caused, reflects a larger problem confronting schools everywhere. In a world of photo sharing where readily available technology blurs the line between reality and deepfakes, students are becoming the creators and distributors of child pornography.

The situation began in August but became more widely publicized following a series of media reports this week featuring interviews with the 16-year-old and her mother, Jacqueline Smith. The pair claim that the girl's now-former best friend secretly filmed her in the shower and distributed the images to other students at Calabasas High School, ABC 7 and Fox 11 reported.

But in an interview with KTLA, Smith's daughter claimed that some or all of the images in question may have included her face pasted onto a nude body taken from the pornography website Pornhub.

Smith told KTLA that the images were shared with other Calabasas students via AirDrop and Snapchat; ABC 7 reported that some of the images ended up on Pornhub.

The incident was investigated by the Los Angeles County Sheriff's Department Special Victims Bureau, the Lost Hills Sheriff's Station confirmed to Patch.

"The Sheriff has shared with us that they don’t believe the images have been altered," Las Virgenes Unified School District Superintendent Dan Stepenosky told Patch Thursday.

"We take our students' safety very seriously and we're in the middle of investigating claims from both students (and parents) against each other. We have also shared all of the information we have with the Lost Hills Sheriff's Office and the LA County Office of Education," Stepenosky said.

"One of the families has shared that they are experiencing homelessness, which impacts which actions we can take because we must follow the law and have a detailed process while being thoughtful and sensitive. This matter has our full attention, we're in the middle of an active investigation and at this point I don't know where it will land," Stepenosky added.

Smith and her daughter expressed frustration with school officials' response earlier this week.

"I want Las Virgenes school district, starting from Dan Stepenosky on, to be held accountable for allowing this to go on for so long and causing all this pain and suffering for my daughter and our family," Smith told Fox 11.

Patch reached out to the Sheriff's Department seeking additional information about the case. The case has been presented to the DA's office, a spokesperson said, but the office cannot provide additional details because the case involves minors.

The case comes a week after five Beverly Hills eighth graders were expelled due to their involvement in the creation and dissemination of AI-generated nude photos of their classmates.

Beverly Hills Unified School officials determined that "16 8th-grade students were identified as being victimized." Beverly Vista Middle School administrators last month became aware of the images, which featured AI-generated nude bodies with students' faces superimposed onto them.

The Beverly Hills Police Department's investigation was ongoing as of last week, according to school officials.

AI tools can create "deepfakes," where a real-life person's likeness is used to create a kind of digital puppet, allowing them to be depicted doing or saying something they did not do or say in real life. The problem was highlighted earlier this year when fake pornographic images of Taylor Swift went viral on social media.

AI, used in combination with other software, allows for a broad range of sophistication when it comes to the creation of deepfakes — from crude cut-and-paste jobs to eerily convincing simulacra.

As part of their response, Beverly Hills school officials shared resources about how parents should address deepfakes with their children. Experts recommend parents have open conversations — not lectures — with their children about the topic, even if they feel the technology may be too complex for their own understanding.

In a New York Times article in response to the Beverly Hills case, Jessica Grose wrote that the frightening side of AI technology can come into focus when it's "used by teens and tweens, groups with notoriously iffy judgment about the permanence of their actions."

Grose interviewed Devorah Heitner, who writes about the topic in her book “Growing Up in Public: Coming of Age in a Digital World."

"Teach your child the importance of never sharing an explicit message or photograph of another person — especially without that person’s consent. Explain to them that regardless of how they came across the explicit image or message, passing it on to someone else is unethical, perpetuates that person’s violation, and is very likely illegal in their state (especially if the image is of a minor)," Grose wrote.
