Five years ago, Cosmopolitan UK partnered with the domestic abuse charity Refuge on ‘The Naked Threat’ campaign, a tireless push to make threats to share intimate images illegal. We shared the harrowing stories of survivors like Natasha, whose abuser used intimate photos as tools of coercion, blackmail and control to keep her trapped in a terrifying relationship. Our collective efforts resulted in a monumental victory: the ‘threat to share’ offence was officially written into the Domestic Abuse Act of 2021.

But on the fifth anniversary of that landmark legislation, alarming new data reveals a broken justice system for women. While reports of image-based abuse are soaring, charging rates are dismally low, leaving survivors without justice and perpetrators free to offend.

Freedom of Information requests submitted by Refuge to 43 police forces across England and Wales (and to which 27 responded) paint a bleak picture. Between July 2021 and February 2026, an overwhelming 21,905 intimate image abuse offences were recorded. Yet, a shocking 95.2% of these cases resulted in zero accountability, with only 4.8% (1,047) of perpetrators actually charged or summonsed.

Even more concerning is the trajectory. Rather than improving in the wake of the new legislation, charging rates have steadily declined over time – and while recorded offences surged by 26.9% between 2022 and 2025, the proportion of cases resulting in a charge fell from 5.8% to an abysmal 4.5%.

The specific ‘threat to share’ offence, the exact crime Cosmopolitan UK and Refuge campaigned to outlaw, is experiencing a terrifying boom. Among the police forces able to provide specific case breakdowns, recorded threats to share intimate images skyrocketed by 344% between 2021 and 2025.


So, why are charges so rare? The data points to a systemic failure in supporting vulnerable victims. In over half of cases where a suspect was clearly identified, no charges were brought, with the primary reason for this collapse being victims withdrawing support for the investigation. This mass attrition raises serious questions about the traumatic, unsupported reality of navigating the criminal justice system as a survivor.


Speaking exclusively to Cosmopolitan UK, Gemma Sherrington, CEO of Refuge, explains that these high drop-out rates reflect poor confidence in the justice system, alongside "harmful police responses including victim-blaming, dismissive or inappropriate language, and a failure to recognise the severity of image-based abuse."

"Some survivors we work with have been told by police that they ‘should not have shared images,’ revealing a fundamental misunderstanding of how consent operates," Sherrington told us. She notes that officers too often overlook the fact that these crimes are frequently just one manifestation of a wider pattern of coercive control.


There are also major concerns regarding how digital evidence is handled. Sherrington reveals that Refuge is aware of cases where police have taken a survivor’s device for evidential purposes rather than prioritising the perpetrator’s, or even worse, "returned a perpetrator’s device without removing its contents, thereby putting the survivor at risk of further abuse."

"Access to justice is a fundamental right – yet in practice it has become a postcode lottery, with outcomes tied to where a survivor lives," she adds. "Refuge is clear: this must change."

The consequences on the front line are devastating. Refuge reports that since the introduction of the 2021 law, its specialist Technology-Facilitated Abuse team has not supported a single survivor who has seen their perpetrator successfully convicted, despite reporting to the police – and they work with hundreds of women each year.

Emma Pickering, Head of Technology-Facilitated Abuse and Economic Empowerment at Refuge, warned that survivors are being "failed far too often," adding: "These troubling police figures lay bare the stark disparity between the sheer number of reports compared to the shockingly low charge rates. Without a serious improvement in police response, survivors will continue to miss out on justice."

With tech-facilitated abuse referrals to Refuge soaring by over 62% last year alone, the charity is urgently calling for mandatory, trauma-informed training for all police officers to secure digital evidence quickly and effectively. They are also demanding the government upgrade Ofcom’s guidelines to a legally binding code to hold tech platforms accountable.

In response to the findings, a government spokesperson told Cosmopolitan UK, "Sharing or creating intimate images without consent is a vile crime and we are taking immediate action to tackle this growing issue.

"Soon, tech companies will be legally required to remove intimate images shared without consent within 48 hours of being flagged and we have also established the Policing AI Threat Hub to tackle the criminal misuse of AI."

They added that creating intimate images without consent (a practice known as deepfaking when done with AI) is a crime that could result in a six-month prison sentence. "We are [also] banning AI tools which generate deepfake sexual images of people without consent, with developers and suppliers facing up to three years in prison."

Five years after a triumphant change to the law, it is abundantly clear that we are still waiting on the powers that be to properly enforce it.

Jennifer Savin
Features Editor

 Jennifer Savin is Cosmopolitan UK's multiple award-winning Features Editor, who was crowned Digital Journalist of the Year for her work tackling the issues most important to young women. She regularly covers breaking news, cultural trends, health, the royals and more, using her esteemed connections to access the best experts along the way. She's grilled everyone from high-profile politicians to A-list celebrities, and has sensitively interviewed hundreds of people about their real life stories. In addition to this, Jennifer is widely known for her own undercover investigations and campaign work, which includes successfully petitioning the government for change around topics like abortion rights and image-based sexual abuse. Jennifer is also a published author, documentary consultant (helping to create BBC’s Deepfake Porn: Could You Be Next?) and a patron for Y.E.S. (a youth services charity). Alongside Cosmopolitan, Jennifer has written for The Times, Women’s Health, ELLE and numerous other publications, appeared on podcasts, and spoken on (and hosted) panels for the Women of the World Festival, the University of Manchester and more. In her spare time, Jennifer is a big fan of lipstick, leopard print and over-ordering at dinner. Follow Jennifer on Instagram, X or LinkedIn.