A recent incident at Aliso Viejo Middle School in Capistrano Unified, in which a student used artificial intelligence software to create a nude photo of a 13-year-old girl by superimposing her face on another body, illustrates how unprepared schools are for the rise of AI manipulation of student photos, the Orange County Register reported.
According to the Register, the school district didn't contact the girl's stepmother until she filed a complaint more than a week after the incident, and she said the matter was handled poorly. The district's investigation found that the student created AI-generated images of multiple student victims and that another student shared the photos; Capistrano Unified said the students faced "disciplinary consequences," which may have included suspension.
The Capistrano Unified case is not the first such incident, the Orange County Register reported. In April, Laguna Beach High School was investigating the use of AI tools to create nude photos of students, and in March, five Beverly Hills middle school students were expelled for the same behavior.
Experts told the Register that AI-generated sexual harassment and bullying could be occurring on most school campuses.
“If a parent called me and asked, ‘What do I do?’ the first thing you do is go to your school district,” said Rebecca Delfino, an expert on “the intersection of the law and current events and emergencies.” “Most school districts have codes of conduct related to the behavior of their students, and they’re typically broad enough to address what would be considered harassment.”
But like most states, California doesn't have laws banning inappropriate AI-generated images, according to the Register.
“We’re way behind the curve,” John Pizzuro, a former police officer who led New Jersey’s task force on internet crimes against children, told the Register. “There is no regulation, policy or procedure on this.”
Editor's Note: This story was originally published by EdSource.com on April 12.