Ian Russell describes his life as being split into two parts: before and after 20 November 2017, the day his youngest daughter, Molly, took her own life as a result of depression and negative social media content. “Our life before Molly’s death was very ordinary. Unremarkable,” he says. He was a television producer and director, married with three daughters. “We lived in an ordinary London suburb, in an ordinary semi-detached house, the children went to ordinary schools.” The weekend before Molly’s death, they had a celebration for all three girls’ birthdays, which are in November. One was turning 21, another 18 and Molly was soon to be 15. “And I remember being in the kitchen of a house full of friends and family and thinking, ‘This is so good. I’ve never been so happy,’” he says. “That was on a Saturday night and the following Tuesday morning, everything was different.”

The second part of Russell’s life has been not only grief and trauma, but also a commitment to discovering and exposing the truth about the online content that contributed to Molly’s death, and to campaigning to prevent others falling prey to the same harms. Both have taken far longer than he anticipated. It took nearly five years to get enough information out of social media companies for an inquest to conclude that Molly died “from an act of self-harm while suffering from depression and the negative effects of online content”. As for the campaigning, the Molly Rose Foundation provides support, conducts research and raises awareness of online harms, and Russell has been an omnipresent spokesperson on these issues.

He has been busier than ever lately. We meet at a London hotel, a few hours before the House of Lords vote on an amendment to the children’s wellbeing and schools bill that would ban access to social media for under-16s. The amendment was expected to pass – and it did, by 261 votes to 150. Keir Starmer’s government favours a consultation, and a pause to see how Australia’s trailblazing under-16 ban unfolds. But a UK ban has had widespread support: from Conservatives, including Kemi Badenoch, as well as more than 60 Labour MPs, bereaved relatives of children whose deaths were social media-related, celebrities, campaign groups and, it seems, the public – a YouGov poll in December found that 74% of British adults were in favour.

Russell, however, is not in favour of such a ban. Last Sunday he co-signed a joint statement, alongside the NSPCC, Full Fact, the 5Rights Foundation and others, arguing that “blanket bans on social media would fail to deliver the improvement in children’s safety and wellbeing that they so urgently need”.

Ian Russell, photographed at the Zetter Clerkenwell. Photograph: Linda Nylind/The Guardian

An inconvenient split seems to have opened up, at a time when online harms and social media companies are finally in the spotlight. But if anyone has spent time weighing up this issue, it’s Russell, and he doesn’t seem like someone looking for controversy. “As a person, I need to gather information and process it before I can come to a conclusion,” he says. “I often find myself at meetings hearing what’s being said, and it can be difficult to contribute because I haven’t formed my opinions yet.” But on this issue he’s firm in his conviction, even if others find it difficult to understand. “We’re in danger of trying to move too fast and trying to find quick-fix solutions,” he says. “If there were quick-fix solutions, honestly, we would have found them.”

The core arguments laid out by Russell and other opponents of a ban will be familiar to followers of this debate: that children will seek more dangerous alternatives; that they will find ways to circumvent age limits; that they will face a “cliff edge” when they turn 16 and are thrown into the “high-risk” world of social media; that some groups, such as LGBTQ+ and neurodiverse children, will be deprived of valuable online sources of support and connection.

But Russell’s position also hinges on the Online Safety Act, which is just starting to do what it’s supposed to, he argues. The legislation, passed in 2023, obliges online platforms to conduct robust age verification and to prevent harmful content from reaching children. It also gives the government and Ofcom, the independent communications regulator, powers to fine or even take down platforms not playing by the rules. Molly’s death, and Russell’s campaigning, helped push the act through.

Russell admits progress has been slow. “It took five years of parliamentary debate to put the Online Safety Act on the books. It’s taken Ofcom more than two years, but they are now implementing and enforcing that act. It’s too long, but we have literally just arrived at a place, after all that time, when the platforms, in order to operate in the UK, if their services are likely to be used by children, have to take steps to ensure the safety of those children.”

The recent furore over Elon Musk’s X and Grok is a good illustration, he says. The integration into X of Grok AI tools that could manipulate images of women, and even children, to remove their clothing – effectively making a tool for creating deepfake child sexual abuse imagery widely available – horrified pretty much everyone, Russell included. “I don’t understand how X and Elon Musk can even begin to think that that was acceptable,” he says. “Hideous, wrong, disgraceful. But what was the reaction in the UK to that?” After the issue became known, Ofcom opened a formal investigation into X, acting under the powers of the Online Safety Act. Starmer and technology secretary Liz Kendall said they would support Ofcom if it chose to fine or block X. Within days, Musk made a U-turn and removed the software from his platform.

Molly Russell at 11. Photograph: The Russell family

The Online Safety Act did what a social media ban couldn’t, Russell argues. “If a platform is behaving in an appalling, unsatisfactory, abysmally unsafe manner, it shouldn’t be in this country … A ban [for under-16s] removes the impetus to do that. In fact, what it’s likely to do is have a really chilling effect on the Online Safety Act.”

He’s not arguing that the legislation is perfect – X was still allowing some users to post sexualised images generated by Grok after Musk’s U-turn, it took weeks for Ofcom to act, and X escaped censure for what many still consider an egregious offence – “but it’s there to be built on. You don’t get anything right the first time … You also have to accept that the development of tech moves so fast, you’re going to constantly have to be amending it and looking at it and updating it. We have to be thinking ahead of the tech crowd.”

A common argument in favour of a social media ban for under-16s is the alcohol analogy. We don’t allow alcohol to be sold to minors because it is potentially harmful. Sure, some kids might find a way around it, like getting a fake driving licence, but as a society we’ve put down a marker that you have to be a certain age to drink. Shouldn’t it be the same with social media?

Russell is in favour of regulation, but says it should be sensible and proportionate. He counters with another offline analogy: cars and road traffic. “There will be casualties. It’s sad and tragic when it happens, and we accept that. We don’t say: children under 16 shouldn’t ride in cars to protect them. We say children under 12 should always be in a car seat. We say that everyone in a car, however old, should wear a seatbelt. The industry is compelled to comply with safety measures … We don’t blanket ban 16-year-olds from road travel.”

We do ban them from driving though, I point out. “I’m not saying we should let 16-year-olds on platforms that encourage irresponsible behaviour. I’m saying we should let 16-year-olds on platforms that are safe for 16-year-olds.”

Age classification should be on a platform-by-platform basis, in Russell’s view: “If there was a platform that was really safe, really good, and connected people and did all the good things that social media could do, we could say 13 is fine.” Some other platforms might be rated 16 or 18, say. “If you were to differentiate like that, you would then drive platforms who wanted to attract younger people [to] start inventing safer things.”

Some might say social media companies have had every chance to do this already. Instead, despite having been repeatedly confronted with evidence of the harms they have caused, they have routinely denied and evaded accountability, and even scaled down their internal checks and balances. Meta CEO Mark Zuckerberg, for example, apologised to parents of social media victims at a Senate hearing in January 2024, but a year later, following Donald Trump’s re-election, he announced that Instagram, Facebook and Threads would be getting rid of factcheckers, even if, he admitted, “it means we’re gonna catch less bad stuff”.

Ian Russell outside Barnet coroner’s court after the inquest into Molly’s death, September 2022. Photograph: Joshua Bratt/PA

“I don’t trust the platforms at all,” Russell agrees. “You certainly can’t judge a social media platform by what it says; you can only judge by what it does.” And again, he argues, the Online Safety Act is a better weapon to keep them in line. “If you’ve got an effective regulator and strong regulation, then you could go, ‘You’re not doing what you’re telling us you’re doing. You’re not living up to your risk assessment. You’ve got three months to fix it, or you’re out.’ That’s the sort of thing that will make a difference.”

Russell has good reason not to trust the platforms, especially Meta. After Molly’s death, Russell and his family were shocked and confused, until they began to look into her online activity. “We discovered, in a really painful way, because we could see it in her social media accounts, the content that had been fed to her by the platforms.” The feedback loop of the algorithm had served her more and more disturbing content: graphic images and videos relating to suicide, self-harm and depression, often set to music and flashing slogans such as “Fat. Ugly. Worthless. Suicidal.” Even a consultant psychiatrist at the inquest into Molly’s death said that they’d had trouble sleeping after viewing it. “That discovery of this awful, nihilistic world in which she was drip-fed largely black-and-white, depressive content, that led her to the place where she thought that the only way forward was to end her life, was horrifying,” Russell says.

Initially, they reported their findings to Instagram, he recalls. “We thought that they’d say, ‘Thank you for reporting this horrible content. We’ll take it down.’” Instead they received replies along the lines of “this doesn’t infringe our community guidelines”.

Getting a complete picture of Molly’s online activity was a long, difficult, painful process. Cooperation from the tech companies was minimal, but Andrew Walker, the senior coroner at the inquest, was tenacious. Pinterest, which had algorithmically sent harmful content to Molly, complied. Twitter (now X) allowed Russell to download the data from Molly’s account, which wasn’t that helpful without the background context. Meta initially supplied tens of thousands of pages of data, which was all but unsearchable. After more demands, delays and ultimatums, eventually, five years after Molly’s death, Meta found an additional 50 lever-arch files of evidence. “Within that evidence was some of the most harmful content that Molly had seen, which completed the picture.” In her final six months, Molly had been exposed to 2,100 pieces of harmful content on Instagram alone. There were only 12 days in that period on which she does not appear to have interacted with harmful content on the platform.

A new documentary premiering next month, Molly vs The Machines, combines verbatim reconstructions of moments from the inquest (such as when one Meta executive disputed whether the content Molly was consuming was really harmful at all) and interviews with Russell and Molly’s schoolfriends with a broader critique of big tech and surveillance capitalism: infinite scrolling, engagement-seeking algorithms, misinformation, polarisation. This is not just a problem for teenagers, Russell points out. “There’s an online harm working at the global level, and there’s an online harm working at the personal level that Molly experienced. The two are so interconnected, and we need to collectively work to pull that together.”

Russell in Molly vs The Machines. Photograph: Publicity image

The fact that this debate has become polarised is, if not a direct consequence, then at least a reflection of how social media tends to drag every subject to its extremes. The politicisation of the issue is equally unhelpful in this regard: the Lords amendment (which now returns to the House of Commons for consideration) was led by the Tories, and is being seen as a defeat for Starmer’s government. Lady Kidron described Starmer’s stance as “the very epitome of party before country”.

But Russell is all for finding middle ground. “I know it will suit many people to divide the cake and say there’s pro-ban and there’s anti-ban. And they can try to set us against each other if they want, and that might work in some cases. In my case, I’ll never be against anyone; I always want to hear their arguments. The division shouldn’t be there; the division should be: people who want the world to be safer for children; and technology companies who are absolutely uncaring of that and caring of their profits – that’s where the dividing line is.”

On a personal level, can Russell imagine reaching a point where he’ll think his work is done? “I’ve always wanted to just get on with my very ordinary life and remember Molly fondly. But sadly, I don’t see that happening anytime soon, because this is a global problem.” He has tried to keep his life and career as ordinary and unchanged as they were, “so there was something familiar in that second part of my life after Molly’s death.” Russell’s work and his grief are inevitably intertwined. There’s not a day he doesn’t think about her, he says. “Some days, that can invigorate you and it can be tremendously powerful and comforting. And other days it can absolutely stop you in your steps and you can barely bring yourself to come out the door. As time goes on, the harder days are fewer.”

Molly vs The Machines will be in cinemas nationwide on 1 March and will later be shown on Channel 4
