Molly Russell’s father has called for a stronger online safety bill in the UK, including criminal penalties for tech executives who endanger the well-being of children, after criticizing social media platforms’ responses to a coroner’s report on the death of his daughter.
Ian Russell said the inquest into the death of 14-year-old Molly was a “unique” opportunity for the tech industry and government to make online platforms safer. A coroner ruled in September that harmful online content contributed to Molly’s death, stating that she “died from an act of self-harm while she suffered from depression and the negative effects of online content.”
Molly, from Harrow, northwest London, took her own life in 2017 after seeing content relating to suicide, depression and self-harm on sites including Instagram and Pinterest.
Russell said those companies’ response to a set of recommendations from the coroner, which included considering separate platforms for adults and children, was “disappointing and unsurprising.”
He said: “That’s not good enough when young people’s lives are at risk.”
Russell said responses from Pinterest, Snap, the owner of Snapchat, and Meta, the parent of Instagram, underscored the importance of the online safety bill, which receives its third reading in parliament on Tuesday.
“It points to the online safety bill as a really important piece of legislation because I don’t think without effective regulation the tech industry is going to put its house in order to prevent tragedies like Molly’s from happening again,” he said.
Following the inquest, the coroner, Andrew Walker, issued a prevention of future deaths notice. He recommended that the government review the provision of digital platforms to children and consider: separate platforms for children and adults; age verification before a user joins a platform; the provision of age-appropriate content to children; the use of algorithms to serve content; advertising to children; and parental or guardian access to a child’s social media account.
The notice was also sent to Meta, Pinterest and Snap, which were asked to respond with details of the actions they would take, although the coroner’s recommendations are not binding. In their responses, the companies described their efforts to protect children from harmful content.
Pinterest’s response included a commitment to independent scrutiny of its moderation efforts. Snap noted its recent launch of a “family hub” that gives parents insight into who their children are friends with, while Meta outlined policies including a content control tool on Instagram that lets teenage users limit the amount of sensitive material they see. Twitter, which Molly also used before her death, received a copy of the coroner’s notice as well, but its response has yet to be published.
Russell said the responses gave “a business-as-usual feeling,” though Pinterest’s commitment to third-party monitoring of its efforts was a “positive” development. Russell, who has become a leading online safety campaigner and established the Molly Rose Foundation to help young people with mental health issues, added that he kept coming across unsafe content on platforms such as Instagram and TikTok.
Russell said he supported an amendment to the bill that would expose tech executives to criminal liability and a jail sentence of up to two years if they persistently fail to protect children on their platforms. As it stands, the bill threatens executives with jail only if they obstruct investigations by Ofcom, the communications regulator that will oversee the legislation. Companies found to be in breach could be fined 10% of global turnover, which in Meta’s case would be more than $11bn (£9bn).
“The key to making change happen is to change the corporate culture. To clearly focus minds at the top of these corporations, the threat of strict financial sanctions is not enough,” Russell said, adding: “The prospect of prosecution will focus minds.”
Culture secretary Michelle Donelan has said she “does not rule out” backing the amendment, which enjoys strong support among Conservative MPs and is backed by opposition parties including Labour.
In her response to the coroner’s notice, Donelan said the online safety bill had already been strengthened to provide greater protection for children, including by requiring large platforms to publish risk assessments of material on their services that is harmful to children.
Former Conservative leader Iain Duncan Smith urged Rishi Sunak on Sunday to accept the amendment to ensure social media bosses “face punishment” for failing to protect children on their platforms.
“We have all kinds of terrible and harmful nonsense on the Internet, from suicide to extreme levels of child pornography and abuse in general,” he said.
“It’s time for all of us to coordinate and make sure they don’t get away with this very lax system of real child protection.”
In response to Russell’s comments, Pinterest said it was “committed to accelerating its continuous improvements” for user safety and Snap said its family hub tool was designed to “encourage safer online experiences overall.”
Meta declined to comment.