Ofcom, the UK's communications regulator, has published a new report highlighting the abuse and challenges women and girls face online today, along with a draft list of changes that social media companies ought to make. All of this follows the Online Safety Act becoming enshrined in law. But is it anywhere near enough, or realistic? Sadly, I doubt it.

To recap: the new piece of guidance presents itself as a method for "tackling misogyny, pile-ons, online domestic abuse and other harms" and includes a number of sobering stats. For example? Women are five times more likely to be the victims of intimate image abuse (such as so-called revenge porn), 73% of female journalists have experienced abuse online, and a quarter of teenage girls regularly see content on their feeds that demeans women. Ofcom also flags the ever-growing amount of deepfaked content online (AI-generated material that, more than 90% of the time, features women in sexual scenarios they did not consent to).

Add to this that Refuge, the women's charity, says it's seen a 258% rise in reported cases of abuse involving technology – a stat it says is merely the tip of a depressing iceberg.

In a bid to reduce some of the online harms and digital violence women and girls face – including coercive control carried out through digital means – Ofcom is now outlining that good practice should involve platforms 'abuse testing' how a service or feature could be misused or weaponised against women and girls, removing geolocation functions to help prevent stalking, training moderators to spot the signs of domestic abuse, and operating with greater transparency. It also encourages platforms to take appropriate action when gender-based harms occur online and to allow those who experience them to report it easily, and urges platforms to conduct risk assessments of how their products or features could be weaponised for abuse before allowing users to access them.

It has also backed a database to tag and keep track of non-consensually shared images – a welcome initiative, as too many survivors currently face an emotional and time-consuming fight with online platforms to get photographs of themselves removed.

And while this might all sound good on paper, and it's a downer to criticise the chinks of light attempting to shine through during a particularly grim and divided era of social media, a lot of this new guidance is just that: guidance. The reality is that it's one thing to say this is how platforms should operate, and it's another to actually make them do it – and Ofcom have limited powers on that front and will be going up against multi-million pound corporations.

From July 2025, it sounds like Ofcom will at least be able to hit companies with large fines in relation to already-published illegal content (e.g. content that incites violence or promotes suicide, some types of pornography, image-based abuse and threatening messages), but when it comes to the rest of the guidance on proactively keeping women and girls in particular safe, things aren't as clear cut.

An Ofcom spokesperson told Cosmopolitan UK that "All sites and apps operating in the UK must comply with the UK’s online safety laws and from next month, they will have to protect people from illegal content and take it down when they become aware of it. To be clear, controlling or coercive behaviour, stalking and harassment, and intimate image abuse are all illegal. Ofcom is resolved to hold tech firms to account, using the full force of our enforcement powers where necessary."

They added: "Our Guidance is a call to action – there’s not only a moral imperative for tech firms to protect the interests of women and girls, but it also makes sound commercial sense."

But anyone who has been on X since Elon Musk took over knows just how rampant sexually violent material and hate speech is on the platform, and how seemingly unbothered the company is about rooting it out. Ofcom's 'please be nicer!' guidebook also drops just weeks after Meta (WhatsApp, Instagram and Facebook) announced it's getting rid of moderators – essentially saying 'deal with abuse yourself, we cba' and handing 'fact checking' over to the community. Add in that Mark Zuckerberg currently seems more interested in calling for 'more masculinity' in the workplace and trying to get in with the Trump crowd than in actually stopping vitriol on the platforms he leads, and the picture is bleak.

Just this week, MP Emily Darlington confronted Wifredo Fernandez, X's senior director for government affairs, and read out death threats she'd received after petitioning to keep her local Post Office open. She pointed out that the same account which threatened her also had a history of racist and violent posts, all of which have stayed live for months in spite of repeated reporting. When asked why they were not removed, Fernandez acknowledged they were 'abhorrent' but merely replied, while looking uncomfortable, that he'd have his team 'take a look'.


These are the people who are really in control of making the online world better. But frankly, as Ofcom's proposed playbook doesn't outline a clear way of making more money and could risk generating less engagement (outrage/abuse = hard to scroll away from), it's hard to imagine they'll care. It's likely things will just carry on as they are – at best. Or, in reality, they'll probably get worse, as many people online continue parroting that anyone trying to call out rampant hate speech is a 'snowflake', 'woke' or 'the left'. Comments have already appeared online in response to Ofcom's suggestions, asking 'what about all the pile-ons men face on the internet?' (Spoiler! Yep, that's also important to discuss and address) and slut-shaming women whose private images are shared without consent.

Echoing concerns that the well-intended actions from Ofcom hold little weight, Andrea Simon, Director of the End Violence Against Women Coalition (EVAW), commented that the proposals are commendable but far from a silver bullet. "It remains the case that Ofcom is hamstrung by the fact that the proposals are voluntary only, with no actual requirement on tech companies to put in place any of the recommended good practice."

She added that Ofcom and the government need to work hard to make sure the guidelines are firmly enforced: "Key to this work will be the routes through which the regulator will incentivise, and track take up of the guidance. In a landscape where protections for users are being eroded, with a general trend of tech providers delivering the bare minimum when it comes to safety, any next steps from the new government in securing an internet that is safer and freer for women and girls must be to introduce these recommendations into a code of practice.

"This would give Ofcom the power to insist that measures are introduced, and the ability to enforce against bad actor tech companies who continue to prioritise profits over people."

It's a sentiment that Emma Pickering, Head of Technology-Facilitated Abuse at Refuge, agrees with. "The fact that [the new best practice code] is not legally binding means that it falls drastically short of the change that is desperately needed to protect women and girls from online abuse."

So, where do we go from here?

Pickering feels the government needs to throw more weight behind this and give Ofcom greater powers of enforcement. "The Government now needs to give this guidance the weight it requires by making it legally-binding. Without any legal force, it is unlikely these recommendations will amount to much more than a drop in the ocean," she stresses. "While we would like to see tech companies voluntarily comply, the sad fact is that many will continue to prioritise profits over the protection of women and girls."

It's encouraging to hear Ofcom saying all the right things, but unfortunately it seems – for now at least – that this is a clear case of a regulator who is talking the talk without (being able to fully) walk the walk.

Follow Jennifer on Instagram (but not X, because that place is a cesspit)


Jennifer Savin
Features Editor

Jennifer Savin is Cosmopolitan UK's multiple award-winning Features Editor, who was crowned Digital Journalist of the Year for her work tackling the issues most important to young women. She regularly covers breaking news, cultural trends, health, the royals and more, using her esteemed connections to access the best experts along the way. She's grilled everyone from high-profile politicians to A-list celebrities, and has sensitively interviewed hundreds of people about their real life stories. In addition to this, Jennifer is widely known for her own undercover investigations and campaign work, which includes successfully petitioning the government for change around topics like abortion rights and image-based sexual abuse. Jennifer is also a published author, documentary consultant (helping to create BBC’s Deepfake Porn: Could You Be Next?) and a patron for Y.E.S. (a youth services charity). Alongside Cosmopolitan, Jennifer has written for The Times, Women’s Health, ELLE and numerous other publications, appeared on podcasts, and spoken on (and hosted) panels for the Women of the World Festival, the University of Manchester and more. In her spare time, Jennifer is a big fan of lipstick, leopard print and over-ordering at dinner. Follow Jennifer on Instagram, X or LinkedIn.