On the evening of 20th November, 2017, Molly Russell watched I’m A Celebrity with her family, completed her homework and then went upstairs, stepped into her bedroom and closed the door. This was the space she had created for herself, a space that was supposed to be safe.

But the outside world had made its way in. It was always there, in her pocket, or waiting in her bag. Taunting her and telling her that she was worthless. That night, Molly took her own life. She was 14 years old.

In the six months leading up to her death, Molly had 2,100 images and videos drip-fed to her, posts that read things like: ‘I hate you. You’re weak. You deserve the pain. You deserve to die.’ They were like a never-ending whisper in her ear, one that she could not escape or silence. But they weren’t sent to her by a real person, a bully. Instead, they came from an algorithm, a machine.

At Molly’s inquest a coroner concluded that social media contributed “more than minimally” to her death. But still, even after her death, what she saw was deemed ‘safe’. It could be, even this very second, twisting its way into the minds of millions of others, turning everything dark.

Today, Molly’s story is the subject of a powerful new Channel 4 documentary, Molly vs The Machines, which examines the tragedy and investigates the power Big Tech holds over our lives, and actions. The documentary shows that the influence can, at times, be an entirely subconscious one, something we think we have power over but… do we really? To delve further into this, Cosmopolitan UK speaks to Charlotte Campbell and Sophie Conlan, Molly’s best friends, about growing up without her, and what role they believe Silicon Valley billionaires played in their best friend’s death.

A bright star

“In PE, we just sang Hamilton over and over, running around, she was – as we were – musical-obsessed,” says Sophie, now 23 years old, smiling and reflecting on her friend who was “so bubbly and so genuinely herself.”


This, in high school, is not easy. It’s a time when your friendships can bond you but also break you. Where competition is rife, and being ‘cool’ can feel like the only thing that matters; make one false move out of line and your life, and social cachet, is ruined… for good. But Sophie and Charlotte, whose fondest memories of Molly are in drama rehearsals - Molly had just got the lead part in the school play - say she wasn’t like that. “She was so understanding, she really cared about you just being yourself and not being a false persona, something that you’re just trying to be,” says Charlotte. She created safety for the pair; she was the person they could be free and silly around.

Sophie Conlan (Channel 4)

“We weren’t aware of what was going on behind closed doors,” says Sophie, while Charlotte nods. Her family say that they’d noticed their daughter becoming more withdrawn but nothing that flagged as extremely worrying, just changes they put down to the natural ups-and-downs of teenage life. Her friends now reflect on the baggier clothes she had started to wear, the longer sleeves.

At her inquest, it was noted that Molly was suffering from a depressive illness but that the content she viewed exacerbated how she was feeling. The loop of negativity, played out to her on her phone, was inescapable, as each post she saw, or saved, made the algorithm think she wanted more, and more of that content. It made her feel as if there was no escape from her darkest thoughts.

“[The content] is like how you have daily affirmations, that are really positive and say ‘I am loved, I am caring, I am kind.’ It’s taken that and done the opposite,” says Charlotte. “It’s telling people that ‘I am worthless, I am horrible, I am not loved.’ When you see it, over time, you will eventually start to believe it.”

Research from Samaritans, in collaboration with Swansea University, found that 83% of social media users were recommended self-harm content on their personalised feeds, and 76% said they had harmed themselves more severely because of the content they had viewed. More than three-quarters of those surveyed had first seen this kind of content at 14 or younger.

When delivering his verdict into the cause of Molly’s death, Coroner Andrew Walker said that the algorithms used by social media resulted in “binge periods” of material. “Some of this content romanticised acts of self-harm by young people on themselves… it is likely that the above material viewed by Molly, already suffering with a depressive illness and vulnerable due to her age, affected her mental health in a negative way and contributed to her death in more than a minimal way,” he said. “In some cases, the content was particularly graphic, tending to portray self-harm and suicide as an inevitable consequence of a condition that could not be recovered from.”

Time for change

Since her death, Molly’s dad, Ian Russell, has thrown himself into campaigning, to discover and expose the truth about harmful online content and to prevent other families from having to go through what his has. He set up the Molly Rose Foundation, which offers support for anyone impacted by harmful online content, and it was watching how he took his pain and poured it into action that inspired Charlotte and Sophie to speak up.

“At first, [after her death] I didn’t want anyone to speak about her,” says Charlotte. “Because she was mine, and [strangers] couldn’t understand. Then Ian went to the news about social media's [role in her death], and it turned into an automatic anger. I was so angry at the world for letting this happen.”

“When he started the foundation, that was when I was inclined to say ‘this isn’t something I have to be quiet about, we don’t have to just let it happen to us; instead we can think: well, what can we do about this?’” adds Sophie.

This is an attitude that they say Molly had: she wanted to help people and right the wrongs of the world. “If anybody had ever met Molly they couldn’t forget her, the energy she had when she walked into a room… There are no words,” Sophie says. “There’s no words to describe how good of a human being she was.”

But, at 14, it can be almost impossible to view yourself how others see you. To have an accurate reflection of what matters. To understand that so much of what you see online is not true. “You begin to think that what you see on your phone is your whole world,” reflects Charlotte.

Charlotte Campbell (Channel 4)

“We’re desensitised as a generation to what we see on social media, things that wouldn’t be normal, or safe, in the real world feel okay online.”

Algorithms have been designed to help us find what we’re looking for and to connect us to new interests: if a friend likes something, the algorithm assumes we will too, serving it to us as soon as we open our phone. Sometimes this can be great; we can find new brands, new fashion designers, new TV shows to follow. But this digital assumption goes the other way too. “It’s like if you look at a post for one second too long, it’s going to promote you more of that,” says Charlotte. Molly, they muse in the documentary, could have been looking for help with how she was feeling before getting trapped in a dark content loop, viewing posts that she hadn’t sought out or wanted in the first place.

“If I want to go on social media, and search for healthy eating content, I’ll then get so many posts that are basically pro-anorexia content,” explains Charlotte.

“Imagine if you showed anyone, any human being, something incredibly traumatic and then you just said afterwards ‘don’t let that affect you,’” adds Sophie. “I think we’re blind to the harms that social media can do, there’s proven research that even a short scroll can change your whole mood of the day. And that’s not even viewing harmful content. But the societal norms for young women in general are abysmal, even beyond [what Molly was viewing].”

Our phones deliver us a trend-led world, where perfection is seemingly available to other women… just not you. Where, if trends are followed and obeyed, success awaits. It’s warped, and we know it’s warped, but that doesn’t stop us from being pulled in, we still want what our phones tell us to want. “In high school, it was all about getting a BBL and having a big bum,” says Sophie. “Now all the celebs are getting their surgeries removed. We’re seeing celebrities walk the red carpet with their collarbones protruding…”

The pair say that, even after Molly’s death, they weren’t aware of how the content they were seeing could be influencing how they felt about themselves. “It was when Ian went to the BBC with the story, and we realised we’d thought we were wrong, but that’s not the case,” says Charlotte. “It’s the social media companies, they’re the ones feeding this information.”

We’re currently in the midst of a reckoning. The wild west of the internet, a place where extreme damage and harm has gone unchecked for too long, is now being policed and examined. We’re looking at what is seemingly acceptable online and recognising that it absolutely wouldn’t be allowed offline. If someone was posting you letters filled with disturbing images and messages telling you that you should hate yourself, and that you should die, you would report that to the police, and it would (hopefully) be dealt with. But, for Molly, because this messaging came to her via her phone, it was deemed okay. (At her inquest, a representative acting on behalf of social media companies said that some of the content she viewed could be seen as helpful to others, making them feel less alone.)

“It shouldn’t be up to a child to try and defeat an algorithm,” says Sophie. “It’s up to [the social media companies]; it’s their platform.” Yet when the regulation in place, the thing that decides whether content is safe or unsafe, is a machine of its own, how can it understand the nuances of harm? After all, there’s a big difference between someone crying, and sharing their emotion online, and a video of someone crying and saying there’s no escape from this pain, that the only way out is death. “AI isn’t human, it does not have emotions,” says Charlotte. “AI doesn’t understand [how] what you see could make you feel incredibly upset.”

“Social media, for me, is an incredibly triggering place,” adds Sophie. “I’m scrolling past all these models, with the big house, the big car, the incredible outfits and I think ‘I’m worthless’ and that’s before I’ve even been served content that tells me I’m worthless. It’s so easy to get trapped in a doom cycle… We want more people to have conversations around social media and its genuine effects. I think people are scared because it’s such a normal thing, they don’t want to say ‘I feel uncomfortable about it’ in case they’re laughed at, or just told to switch their phone off. But it’s not as simple as that. It’s very hard to regulate your social media use, because of how it’s designed.”

As the government proposes plans for an under-16 social media ban, much of the current discussion surrounds its impact on children and young people, but Charlotte and Sophie, both 23, are keen to highlight that this is an issue that affects us all, not just children and their parents. “Yes children are definitely more vulnerable to this content, and we’ve seen from the data that they’re the ones engaging,” says Charlotte. “But there are older generations who will see this, and it can also have an effect on them.” Dr Navin Venugopal, who spoke at Molly’s inquest, said that after viewing the content she had been served, he was “not able to sleep” for weeks afterwards.

While Charlotte and Sophie are keen to encourage us all to seek out community, and friendships, they’re also aware that change has to come from the very top. But the very algorithms that served Molly the content that contributed to her death are the same algorithms that keep us all hooked. And they need us, our eyeballs, to keep the money rolling. Molly was the friend who allowed others the freedom to be themselves, but her ability to see who she was, who she could grow to be, was robbed from her. Now, as they sit, side by side, without her, the question hangs in the air: how many more Mollys do there have to be before something changes?

If you’re struggling, text Shout on 85258, or call Samaritans on 116 123, any time, day or night. If your life is at imminent risk, please call 999 for immediate help.

Catriona Innes is Commissioning Director at Cosmopolitan, you can follow her on Substack and on Instagram.


Catriona Innes is Cosmopolitan UK’s multiple award-winning Commissioning Editor, who has won BSME awards both for her longform investigative journalism as well as for leading the Cosmopolitan features department. Alongside commissioning and editing the features section, both online and in print, Catriona regularly writes her own hard-hitting investigations spending months researching some of the most pressing issues affecting young women today. 


She has spent time undercover with specialist police forces, domestic abuse social workers and even Playboy Bunnies to create articles that take readers to the heart of the story. Catriona is also a published author and poet, and volunteers with a number of organisations that directly help the homeless community of London. She’s often found challenging her weak ankles in towering heels through the streets of Soho. Follow her on Instagram and Twitter.