Source: The Conversation (Au and NZ) – By Anastasia Powell, Associate Professor, Criminology and Justice Studies, RMIT University
As we have been cooped up at home during coronavirus lockdown, reports of image-based abuse have skyrocketed.
According to Australia’s eSafety Commissioner, it received more than 1,000 reports of image-based abuse between March and May 2020 – a 210% increase on the average weekly number of reports received in 2019.
There was also a huge spike over the Easter weekend, when reports were almost 600% above usual figures.
The problem is not limited to Australia. The United Kingdom has seen a similar rise: its Revenge Porn Helpline says it opened double the number of cases compared with the previous April.
What are we talking about?
Image-based abuse happens when an intimate image or video is created or shared without the consent of the person pictured. It can also involve threats to share images.
Though it is known colloquially as “revenge porn”, researchers and policymakers have rightly rejected that term. They use “image-based abuse” to better capture the harms of the non-consensual taking, sharing, or threatening to share, nude or sexual images.
Why has there been an increase?
When we all entered lockdown, digital forms of communication and connection became central to our lives – and that includes our dating lives.
In lockdown, the exchange of nude and sexual images is likely to be a more common way to express our intimacy with another person. Unfortunately, this means perpetrators have greater access to victims’ images to threaten and abuse them.
Since lockdown, the eSafety Commissioner has also observed an increasing trend of people being blackmailed over their intimate images, as well as people trying to monetise their intimate content. This includes reports of sextortion scam emails, which eSafety Commissioner Julie Inman-Grant says “scares people into paying cryptocurrency payments”.
Read more: Online sex parties and virtual reality porn: can sex in isolation be as fulfilling as real life?
COVID-19 has seen many people lose their jobs or income. So financial pressures could also be in play as more perpetrators look to exploit non-consensual nude or sexual images for financial or other material gain.
We also know that image-based abuse occurs in the context of domestic and family violence. Victims living in isolation with an abusive partner or family member may be particularly vulnerable to these harms.
According to the UK’s Revenge Porn Helpline, the majority of their increased reports came from victims experiencing image-based abuse by an abusive or controlling intimate partner.
Image-based abuse was already a widespread problem
Before the pandemic, our research found as many as one in three of those surveyed in Australia, the UK and New Zealand (aged 16 to 64 years) had experienced image-based abuse.
Women victims in particular reported greater harms and fear for their safety, as well as experiencing multiple forms of victimisation.
We also found one in six people surveyed reported they had been the perpetrator of image-based abuse.
Perpetration rates were highest among men in their 20s and 30s, with one in three men aged 20 to 29 years disclosing they had engaged in these behaviours.
There are laws against this
In Australia, there are specific laws criminalising image-based abuse in every state and territory except Tasmania.
If you’re a victim of image-based abuse, you can document the evidence, report it to police, and use the eSafety Commissioner’s online portal to request the images be removed. The eSafety Commissioner succeeds in having images removed in over 90% of cases. You can also seek support from national helplines such as 1800 RESPECT.
But some victims find it difficult to come forward for help.
Many victims of image-based abuse report experiencing shame and humiliation. They often feel violated and exposed by the perpetrator’s actions.
It is sometimes friends or family who first see the images when they are distributed online. Sadly, victims can feel judged rather than supported by these vital social lifelines.
It’s not the victims’ fault
Too often we blame the victims. We ask why they took or sent images of themselves in the first place. But now, more than ever, it must be made clear that it is not the exchange of intimate images between consenting adults that is the problem.
It is the non-consensual taking, sharing or threatening to share these images that is wrong.
Read more: Revenge porn is sexual violence, not millennial negligence
We need to educate the community about the seriousness of these non-consensual and criminal harms. And Australians need to know that they can take action.
As a community, we must challenge the attitudes that minimise the abuse, blame victims and make excuses for perpetrators.
We can do better as a community
The last National Community Attitudes Survey showed many Australians minimise image-based abuse and other forms of technology-facilitated abuse. As friends and family members, we can instead provide support to victims and let them know we do not blame them for someone else’s actions.
If we become aware someone is misusing intimate images without consent, we can and should call out their behaviour. As the current national Our Watch campaign says: “there is no excuse for abuse”.
It is vital that we take positive action as bystanders, supporting victims and challenging perpetrators, if we are going to get ahead of this issue. This is particularly important at a time when many of us are using technology to consensually express our intimacy in an otherwise isolated setting.
You can report image-based abuse to the eSafety Commissioner at esafety.gov.au
If you or someone you know is impacted by sexual assault or family violence, call 1800RESPECT on 1800 737 732 or visit www.1800RESPECT.org.au. In an emergency, call 000.
– ref. Reports of ‘revenge porn’ skyrocketed during lockdown, we must stop blaming victims for it – https://theconversation.com/reports-of-revenge-porn-skyrocketed-during-lockdown-we-must-stop-blaming-victims-for-it-139659