Despite changes in the law and awareness campaigns around the harms of deepfake porn – in which a person's image is edited, with just a few taps, into a fake sexually explicit picture or video without their consent – 13% of teenagers say they have experience with this new form of image-based abuse. This figure covers sending or receiving a nude deepfake, encountering a nude deepfake online, using a nudifying app, or knowing someone who has used a nudifying app.

The data comes from research by Internet Matters, a non-profit concerned with online safety. Its new report says, "Deepfake nudes have become alarmingly common online and affect children in three ways: child-on-child sexual abuse, adult-perpetrated child sexual abuse on offender networks, and sextortion [e.g. blackmailing children with deepfaked images]."

Internet Matters is now calling on the government to ban nudifying tools and to have a greater focus on media literacy as part of the curriculum, as well as urging Ofcom to better tackle child-on-child abuse online.

While deepfake technology can be used in other ways, for example to create humorous or creative (and crucially consensual) material, sadly the majority of deepfakes are sexually explicit and 99% involve women and girls – many of whom have no idea their image is being abused behind their backs. If discovered, it can have devastating consequences.

Heartbreakingly, deepfake nudes have already been viewed as a possible contributing factor in the suicide of a 14-year-old girl in the UK, and amongst the celebrities most likely to be deepfake victims on popular websites, the most common names are women who became famous at a younger age (with images from their teenage years often used).

Deepfaking, which began in niche forums online in 2017 and predominantly targeted celebrities, has now evolved to feature victims on a mass scale, many of whom are ordinary women and girls without a public persona.


Jodie*, who found out one of her closest friends had posted deepfake nudes of her online, said the discovery led to her crying so hard she burst blood vessels in her eyes. "There's an idea that the perpetrators behind catfishing and deepfakes are strangers, or losers who live in their parents’ basement, but they can be people you trust with intimate details about your life," she said while sharing her story with Cosmopolitan UK.


"I didn't realise how easy it was to make a deepfake until I became a victim, but it turns out you don't even have to pay to do it," Jodie added. "Since experiencing the deep betrayal of having my image abused by a friend, it's completely changed the way that I interact with others."

Worryingly, all it takes to create deepfake porn of someone is a single photo and access to a nudifying app or forum that takes requests. Some of the results are eerily realistic, causing major mental health repercussions for the victims (on par with 'real' image-based abuse), despite it now being illegal to share a non-consensual deepfake.

This all sits firmly in the centre of the online misogyny epidemic that has seen many boys and young men exposed to harmful anti-women content creators online, which teachers have told Cosmopolitan UK is having a noticeable impact in classrooms around the country.

Cosmopolitan UK has urged the government to outlaw the creation of deepfakes, too.

Previously a government spokesperson told us, "The Online Safety Act will create four new offences, including the criminalising of sharing intimate images without consent, which also applies to so-called 'deepfakes'.

"Social media platforms will [now] be subject to robust duties for illegal content and will be required to proactively tackle distressing crimes like so-called revenge pornography, harassment and controlling or coercive behaviour, or face huge fines."

Deepfake porn and the way it is weaponised against women and girls is an issue that Cosmopolitan first highlighted in 2022, when campaigning to have deepfakes mentioned in the Online Safety Act and when pushing for leading websites that host this sort of content to be shut down.

What to do if you've been deepfaked

  • Reach out to the Revenge Porn Helpline (call 0345 6000 459 or email help@revengepornhelpline.org.uk), which can offer you emotional support and advise on the best next steps.
  • Contact Stop Non-Consensual Intimate Image Abuse, which can help you build a case for having the images removed from the internet. It has a 90% success rate of having revenge porn images taken down.
  • Take screenshots as evidence and if you feel comfortable doing so, report what has happened to the police.


Jennifer Savin
Features Editor

 Jennifer Savin is Cosmopolitan UK's multiple award-winning Features Editor, who was crowned Digital Journalist of the Year for her work tackling the issues most important to young women. She regularly covers breaking news, cultural trends, health, the royals and more, using her esteemed connections to access the best experts along the way. She's grilled everyone from high-profile politicians to A-list celebrities, and has sensitively interviewed hundreds of people about their real life stories. In addition to this, Jennifer is widely known for her own undercover investigations and campaign work, which includes successfully petitioning the government for change around topics like abortion rights and image-based sexual abuse. Jennifer is also a published author, documentary consultant (helping to create BBC’s Deepfake Porn: Could You Be Next?) and a patron for Y.E.S. (a youth services charity). Alongside Cosmopolitan, Jennifer has written for The Times, Women’s Health, ELLE and numerous other publications, appeared on podcasts, and spoken on (and hosted) panels for the Women of the World Festival, the University of Manchester and more. In her spare time, Jennifer is a big fan of lipstick, leopard print and over-ordering at dinner. Follow Jennifer on Instagram, X or LinkedIn.