Source: The Conversation (Au and NZ) – By Gabrielle Hunt, PhD Candidate, Australian Catholic University
A Victorian school community is reeling after fake, sexually explicit images of female students were generated using artificial intelligence and then shared on social media.
About 50 high school students at Bacchus Marsh Grammar had their images altered. A teenage boy has been arrested and released pending further inquiries. Parents have described the images as “incredibly graphic” and “sickening”.
Unfortunately, this is not an isolated case. Last month, a Victorian high school student was expelled after he used AI to generate sexually explicit images of a teacher. This comes amid warnings that AI is leading to an increase in sextortion reports.
It also follows multiple cases where male students have used misogynistic and derogatory language about female peers, including a spreadsheet ranking students from “wifey” to “unrapeable”.
Our research shows it is not unusual for young people to harass and abuse their peers. So how can parents and schools respond?
Just because it’s AI does not mean it is OK
Sharing sexually explicit images of children (those under 18), or sharing such images of anyone without their consent (no matter how old they are), is image-based abuse – even if AI has been used or the images have been altered in some way.
As the eSafety Commissioner explains, it is still abuse if the image or video is:
altered or faked to look like you [or] shared in a way that makes people think it’s you, even when it’s not (such as a nude of someone else tagged with your name).
As the commissioner also explains, there are criminal laws that cover this abuse and police may be able to investigate. Last week, the federal government also introduced new laws to parliament to strengthen protections against “deepfakes” (using AI to generate a false depiction of a real person).
But apart from the legal issues, this abuse is also highly damaging and distressing for those involved.
A 2023 investigation by the Stanford Internet Observatory found some AI models are using a database of existing child sexual abuse material to generate new images. This means there are real children being exploited to generate sexually explicit images of more children.
What does the research say?
Research clearly shows sexual abuse and harassment is a gendered issue. Women, girls and gender diverse individuals are disproportionately affected. And men and boys are overwhelmingly more likely to perpetrate these crimes.
The Australian Child Maltreatment Study has found adolescents also make up a substantial proportion of perpetrators of child sexual abuse and sexual harassment. The results, collected in 2021, also showed this has increased over time.
Overall, 18.2% of participants aged 16–24 reported being sexually abused by an adolescent during their childhood, compared to 12.1% of those aged 45 years and over.
More than 15% of women and 24% of gender diverse participants aged 16 and over reported being sexually harassed by an adolescent during their childhood, compared to 5% of men.
For more than 90% of people who experienced peer sexual harassment, it was inflicted by a male peer.
Why is this happening?
As our research indicates, we have a cultural problem with gender-based violence in Australia.
Easy access to pornography and to technologies such as AI has likely made this problem worse. Exposure to pornography, including violent content, is happening from a young age. A 2024 Australian study showed more than 52% of men and 32% of women reported having viewed pornography by age 14.
Viewing pornography is associated with the sexual objectification of women. Intentionally viewing violent X-rated material is also associated with a significant increase in the likelihood of sexually aggressive behaviour.
What can we do about it?
While one part of the solution may come from age verification technology for adult websites, research also shows we need to take a primary prevention approach. This means trying to stop this sort of behaviour from happening in the first place.
This necessarily involves changing cultural norms around violence and gender with all young people. Parents and schools are key.
What can parents do?
Parents need to make space for their children to talk about tricky and concerning things. You can do this by:
1. Talking early
Help your child feel comfortable about talking to you by starting conversations early. Conversations about bodily autonomy and boundaries can start in primary school. As your child grows, conversations about consent, healthy relationships, porn and sexting can also start. Listen to what your child has to say.
2. Making the conversations regular
We can’t expect children to welcome or respond well to parents’ questions the first time or every time. So keep conversations short and regular. Be guided by your child.
3. Acting on concerns
Stay calm, and talk to and listen to your child. Focus on their wellbeing by asking how they are feeling and what you can do to support them. Also look for signs they might need further support, such as talking to their school or police, or making a report to the eSafety Commissioner.
What should schools do?
Schools are also a significant part of the community’s response. Research shows school programs aimed at addressing sexual violence and cultural norms around gender can be effective. Some things schools can do include:
1. Providing training and resources for staff
Staff need clear policies and procedures on how to respond effectively and report to the relevant authorities when there is sexual harassment, assault or child abuse.
2. Comprehensive sex education for students
Research shows sex education can help prevent harmful behaviours by teaching children and young people about healthy relationships, boundaries and informed and enthusiastic consent. This education needs to include consideration of pornography, sexting and online safety.
3. Providing strong school leadership
Leaders are responsible for the culture and practices of their schools. They need to take a zero-tolerance approach to anything that normalises stereotyping, degrading comments, violence or misogyny. Children in a school should be empowered to raise concerns to adults and know they will be listened to and believed.
If this article has raised issues for you, or if you’re concerned about someone you know, call 1800RESPECT on 1800 737 732 or visit the eSafety Commissioner’s website for help with image-based abuse.
Gabrielle works with the Australian Child Maltreatment Study (ACMS) team as part of her PhD candidature.
Daryl Higgins receives funding from the Australian Research Council, the National Health and Medical Research Council and a range of government departments, agencies, and service providers, including Bravehearts. He is a Chief Investigator on the Australian Child Maltreatment Study.
– ref. Deepfake AI pornography is becoming more common – what can parents and schools do to prevent it? – https://theconversation.com/deepfake-ai-pornography-is-becoming-more-common-what-can-parents-and-schools-do-to-prevent-it-232248