The coroner at Molly Russell’s inquest has recommended that the government consider separate social media platforms for children and adults as he called for a review of children’s use of online content.
The senior coroner Andrew Walker, who presided over the inquest into 14-year-old Molly’s death, has issued safety recommendations that focus on child access to social media content. Molly, from Harrow, north-west London, died in November 2017 after viewing extensive amounts of material related to suicide, depression, anxiety and self-harm on platforms including Instagram and Pinterest.
Walker issued a prevention of future deaths report, recommending that the government review the provision of internet platforms to children. As part of that review, he said, it should look at: separate sites for children and adults; checking a user’s age before they sign up to a platform; providing age-appropriate content to children; the use of algorithms to provide content; advertising to children; and parental or guardian access to a child’s social media account.
In a landmark conclusion to the inquest last month, Walker said social media had contributed to Molly’s death, stating that she had “died from an act of self-harm whilst suffering from depression and the negative effects of online content”.
The prevention of future deaths notice has been sent to Instagram’s owner, Meta, and Pinterest as well as two other platforms that Molly interacted with before her death: Snapchat and Twitter. The report has also been sent to the culture secretary, Michelle Donelan, and Ofcom, the UK communications regulator that will oversee the regime created by the online safety bill. All parties who receive the report must respond by 8 December with details of the actions they propose to take, or explain why they are taking no action.
Walker said in the report that the government should consider setting up an independent body to monitor social media content and should consider legislation to protect children from harmful online material. The online safety bill, which is due to resume its progress through parliament, imposes a duty of care on tech platforms to protect children from harmful content. Walker said platforms should also consider self-regulation.
“Although regulation would be a matter for government, I can see no reason why the platforms themselves would not wish to give consideration to self-regulation,” he wrote.
Responding to the report, Molly’s father, Ian Russell, urged social media platforms to think “long and hard” about whether their services were safe for children and to take action before the online safety bill comes into force.
“We urge social media companies to heed the coroner’s words and not drag their feet waiting for legislation and regulation, but instead to take a proactive approach to self-regulation to make their platforms safer for their young users,” he said. “They should think long and hard about whether their platforms are suitable for young people at all.”
Russell added that the online safety bill should be introduced “as soon as possible”. Referring to the systems that repeatedly pushed harmful content at his daughter before her death, Russell called for tech bosses to face stronger criminal sanctions “if they fail to take action to curb the algorithmic amplification of destructive and extremely dangerous content or fail to remove it swiftly”.
William Perrin, an internet safety expert and trustee of the UK charity Carnegie, said Walker’s report contained recommendations that were mainly covered by the bill, such as giving Ofcom a role in monitoring how platforms deal with harmful content. However, Perrin said action had still not been taken, despite many interventions like Walker’s.
“This is yet another report, and a weighty one, that recommends action on harmful content,” he said. “But the government has yet to take that action. It’s all very well to say the online safety bill will do these things but it has still yet to be implemented.”
Beeban Kidron, a crossbench peer and child internet safety campaigner, said she did not support the concept of separate platforms for children and adults but added: “The coroner is entirely right that a child going online should be offered an age-appropriate experience.”
Merry Varney, who led the Russell family’s inquest team from the law firm Leigh Day, said: “The decision of HM senior coroner Walker to issue this report both to the government and social media companies is very welcome, and action to prevent further harm to children must be taken urgently.”
Donelan said her “thoughts will be with Molly’s family” when the bill returns to parliament shortly and that the coroner’s report matched provisions in the legislation.
She added: “What happened to Molly is heartbreaking, which is why I am considering the coroner’s report into her death so carefully. His recommendations tally with what our world-leading Online Safety Bill already delivers, which is an important step forward.”
A Meta spokesperson said: “We agree regulation is needed and we’ve already been working on many of the recommendations outlined in this report, including new parental supervision tools that let parents see who their teens follow and limit the amount of time they spend on Instagram.”
Pinterest said the coroner’s report would be “considered with care”, while Twitter and Snapchat’s parent company, Snap, confirmed they had received the report. The Department for Digital, Culture, Media and Sport has been approached for comment.