“I think you’ve been a victim of sextortion.”
Inside the office of the Revenge Porn Helpline, Amanda Dashwood is speaking to a distressed caller who has fallen victim to catfishing.
“They start bombarding you with terrifying messages, usually beginning with ‘I’m going to ruin your life’, or ‘I’m going to make this content go viral’,” she adds.
Seduced by a scammer using a fake social media profile, the caller sent them intimate images.
Now, he is being threatened with those images being leaked online if he doesn’t pay up. After she gets off the call, Amanda will start gathering evidence. If the images are posted online, she will try to get them taken down.
There is a process to follow, and while the team encourages people to report it to the police, often the caller’s main priority is to get the images offline.
“He felt guilt for sharing that content in the first place,” Amanda says. “We try to get across the message that you haven’t done anything wrong.”
Embarrassment, humiliation, feeling violated – these are all emotions that callers to the helpline go through.
“A typical phrase will be: ‘I should have known better’,” says call handler Hayley Laskey. “I get passionate about reminding them it isn’t their fault.”
The helpline was set up in 2015, when revenge porn was first made a crime. Since then, it has helped remove more than 300,000 intimate images and videos. The most a client has paid a scammer to take down images, Amanda says, is £30,000.
It operates out of a small office on an industrial estate in Devon. From the outside, there is no trace of the type of work that occurs there.
Inside, the only clue is a sign hanging above the desks that reads: “Caution: Content Being Reviewed”. The team are known as Trusted Flaggers by social networks and adult sites – and for significant cases, they can pick up the phone to raise the alarm.
These days, the helpline has global reach. It liaises with some of the world’s largest social networks and adult sites, from Facebook to OnlyFans, Pornhub and 4chan.
“I’ve had calls with people in the White House,” says manager Sophie Mortimer.
And now, the number of victims they’re helping is on the rise.
Sky News can reveal that in the first nine months of 2023, the helpline received more than 10,000 phone calls and online reports, already above 2022’s total of 8,924. The number of phone calls is 31% higher than in the same period last year.
“There’s not much we haven’t seen online at this point,” says Hayley. “Often, we are the first person that victims speak to. That initial reaction can have a massive impact.”
Revenge porn, or intimate image abuse, usually involves someone uploading private, often sexual, content online without consent.
One of the most high-profile cases involved Love Island star Georgia Harrison, whose ex-boyfriend, Stephen Bear, was jailed for 21 months for sharing a sex video on OnlyFans. But many cases happen out of the limelight.
Briony’s* ex-partner filmed her in the shower without her knowledge – before threatening to share the video.
“He was laughing, saying ‘Oh, you should have heard yourself scream’ – but he refused to delete it,” she says. Briony still lives with the threat that footage might come out.
“It stays with you. Especially as a woman, you think: oh no, someone else will see me naked. That’s completely unfair. My body is my own.”
For other victims, it’s too late.
Footballer Leigh Nicol, who recently retired from playing for Crystal Palace, had naked images of herself uploaded to an adult site in 2018. She was alerted to the leak by a message on Instagram.
“That was a night from hell. I didn’t sleep. I was being sick, while waiting for the police to turn up in the morning. I lost my privacy in a click of a finger – that is something I’ll never get back.”
Since Leigh became a victim of revenge porn, she has worked with the helpline to raise awareness. She now gives talks to teams and players around the country on how to stay safe online.
“It changed my life. I’m much more educated on this issue now but I’ve got scars that people will never see,” she says.
OnlyFans told Sky News it is “incredibly proud to work with organisations like the Revenge Porn Helpline”.
It described the work of the helpline as “absolutely critical in supporting victims of intimate image abuse, raising awareness and helping to prevent future cases.”
Meta, which owns Instagram, told Sky News it has “clear rules against sharing or threatening to share non-consensual intimate imagery and we remove this kind of content and the offending account whenever we become aware of it.”
There are two factors potentially driving the spike in demand for the helpline, says Professor Clare McGlynn, who specialises in violence against women and girls.
“Since the pandemic, we have seen even greater use of smartphones and the internet, which means taking and sharing intimate videos without concern is far easier than it ever has been.”
But, says Prof McGlynn, the presence of “non-consensual porn” on mainstream online porn sites is also normalising revenge porn among viewers. “Non-consensual porn is there as a main genre, which means it’s legitimate and normalised. It’s just not recognised how harmful intimate image abuse is.”
The Crown Prosecution Service told Sky News that when intimate image abuse first became a crime in 2015, 206 charges reached an initial court hearing. In 2022-23, that figure was 631 – an increase of 206%.
Despite the increase, charge and conviction rates are low.
“It’s not taken seriously by police,” says Prof McGlynn. “It’s often assumed that as it’s online, it’s not particularly harmful.” Lack of police resources and training also play a part, she adds.
Back at the helpline, sifting through thousands of extreme images can take its toll.
“I don’t want to say you get desensitised to it,” says Georgia Street, another of the helpline’s trusted flaggers, “but you’re being professional, and you know that that’s the job, so you find a way to separate it.”
The call workers have a dedicated support service.
Counsellor David Humeniuk says that those he sees typically start off putting on “a show of how well they’re doing,” but it’s not unusual for sessions to end in tears. “There is scope for them being traumatised by what they’ve viewed or been witness to,” he adds.
The helpline manages to remove 90% of intimate images and videos reported, but the growing trend of AI and deepfake porn means the numbers will continue to rise.
“We definitely expect to see more cases where content has been created by AI technologies,” says manager Sophie.
“It’s not something that is going to be going away anytime soon. The more time we spend on our phones, the more sites there are, the greater the spread of this content.”
The job the helpline’s team does is difficult and important. For Hayley, the reward comes when those intimate images, which have caused callers so much pain and embarrassment, are finally removed.
“It’s the best feeling ever,” says Hayley. “That win is why we love doing this.” (Sky)