Editor's note
This article is the second in a three-part series on deepfake sex crimes at schools. The crimes, involving the manipulation of photos and videos to create explicit content, cause distrust and strain relationships between students, and sometimes between pupils and teachers. The series has been produced in collaboration with Excellence Lab, a team dedicated to investigative journalism at The Hankook Ilbo, the sister paper of The Korea Times. — ED.
In the wake of deepfake sexual crimes affecting schools not only in Korea but also in the United States, 15-year-old Francesca Mani and her mother, Dorota Mani, have transformed their personal ordeal into a powerful campaign for change.
"I wasn't looking for sympathy, I wanted accountability," Francesca told The Korea Times during a recent interview.
"The school needed to make it clear that this behavior is unacceptable and ensure the perpetrators are held accountable for their actions."
Her story stands in stark contrast to similar cases in Korea, where victims often stay silent or face social stigmatization.
Last October, Francesca and several other 10th-grade girls at Westfield High School in New Jersey reported to school administrators that male classmates had used AI software to create and distribute explicit deepfake images of them.
The school launched an investigation later that month. However, the system failed to adequately protect the victims.
While school administrators quietly questioned some of the boys involved, they publicly summoned Francesca and the other girls targeted by the deepfakes to the school office, announcing their names over the intercom.
Mary Asfendis, the school's principal, acknowledged the seriousness of the incident and sent an email to parents, describing the deepfakes as a "very serious matter." Despite concerns from students and parents about the potential circulation of the images, Asfendis reassured them that the school believed all fabricated images had been deleted and were no longer being shared, according to the Mani family.
Although the school acknowledged the seriousness of the incident, it failed to take appropriate action against the offenders.
Just a few days after the deepfake incident came to light, Francesca discovered that one of the boys responsible for creating the manipulated images had received only one or two days of detention.
The offenders continued attending school and even represented the school in sports, behaving as though nothing had happened.
When she questioned the lenient punishment, school officials told her they could do nothing further, citing the absence of legislation addressing such incidents.
"That day, my daughter came home and said, 'That's not right. I refuse to wear the victim's badge,'" Dorota recalled.
"The incident was initially a shock, but over time, the greater disappointment came from the lack of laws, legislation and regulations," she said.
The mother also condemned school administrators for shirking responsibility, as they justified their inaction by citing a lack of familiarity with deepfake technology and its evolving threats.
Determined to ensure that schools could respond more effectively in the future, Francesca told her mother that she wanted to advocate for legal changes and asked for her help.
"That's how it all started," Dorota said, reflecting on the beginning of the efforts she and her daughter are undertaking.
'What seemed hardest was the easiest'
For Francesca and Dorota, the deepfake incident that initially sparked shock and anger became a powerful catalyst for their mission to seek justice through systemic change.
"What I thought would be the hardest part — changing laws and passing legislation — turned out to be the easiest," Dorota said.
They reached out to lawmakers, including Rep. Joe Morelle, who has been advocating against deepfake abuses for seven years.
"Morelle is the first lawmaker we've worked with," Dorota said.
On Jan. 16, Morelle announced his bipartisan proposal, the Preventing Deepfakes of Intimate Images Act, aimed at halting the spread of AI-generated deepfake pornography.
He emphasized that deepfake pornography is a form of sexual exploitation and abuse, and said he was astounded that it is not already classified as a federal crime.
"My legislation will finally make this dangerous practice illegal and hold perpetrators accountable," Morelle told The Korea Times through email.
The Mani family also collaborated with Texas Senator Ted Cruz on his bipartisan legislation, the Take It Down Act.
Introduced on June 18, the proposed legislation seeks to criminalize the publication of non-consensual, sexually exploitative images, including AI-generated deepfakes, and requires online platforms to implement mechanisms allowing victims to request the removal of such content.
"The Take It Down Act really resonates with sexual crimes involving deepfakes in general," Dorota said.
She pointed out that there are victims who, for various reasons — religious, familial or deeply personal — don't feel comfortable coming forward.
"It's not our place to judge or push them, but they need a way to reclaim power over these images," she said, emphasizing the importance of giving victims outlets to take that power back.
As of Dec. 2, the Take It Down Act remains pending in the Senate.
In addition to working with lawmakers, Francesca and Dorota also reached out, through Cruz's office, to Snapchat, a messaging app popular among teenagers and used by approximately 60 percent of them.
"Snapchat shared with us that they've been conducting a three-year study on the effects of their platform on younger generations," Dorota said.
Following recurring incidents involving deepfake abuse, the company developed a tool designed to help educators understand and teach students how to protect themselves in the digital age.
Dorota continues to communicate with Snapchat, exploring opportunities to introduce the tool to educators.
"This will provide educators with resources to better protect their students by equipping them with a deeper understanding of current technologies," she said.
Breaking silence, fighting for accountability
While their proactive efforts led to proposed legislation aimed at amplifying victims' voices, they argued that schools remain largely unchanged, still lagging in establishing effective policies to address such incidents.
The district office overseeing public schools in the region initially focused on controlling the use of generative AI tools like ChatGPT to prevent cheating on assignments but failed to implement measures to address deepfake abuse.
After 10 months, the school finally revised its AI regulations. The updated policies now include lessons for students on the civil and criminal consequences of creating illegal deepfakes, as well as education on the profound impact such actions have on victims.
Despite her ordeal, Francesca described herself as fortunate, as many people supported her efforts to create meaningful change.
"My teachers were incredibly supportive, and so were my friends. It was the school administration that failed us," Francesca said.
She also expressed enthusiasm for continuing her advocacy against deepfakes, highlighting her work as a partner with the U.N. Dynamic Teen Coalition and as an AI ambassador at Yale University.
Her efforts have earned her significant recognition, including being named the youngest individual on the Time 100 AI list this year.
Korea is also taking steps to combat deepfake crimes, with new legislation approved on Oct. 10 calling for tougher penalties for digital sex crimes involving the technology.
The revised sexual crimes punishment law imposes penalties of up to three years in prison or fines of up to 30 million won ($22,251) for those caught possessing, purchasing, storing or viewing deepfake sexual materials and other fabricated videos.
In addition, the government announced measures in November aimed at cracking down more effectively on deepfake-related sexual abuse, including the immediate deletion of exploitative materials and the use of undercover investigations.
Beyond legislation, Francesca's case highlights how the voices of those affected can play a pivotal role in driving systemic change.
Francesca stresses the importance of standing up for one's rights.
"If the system fails you, it's time to revisit it and demand accountability," she said.