Prince William calls for better online safety following coroner’s verdict on Molly Russell’s death
Prince William has called for improved online safety for children after a coroner ruled social media had contributed to the death of 14-year-old Molly Russell.
The Prince of Wales said: “No parent should ever have to endure what Ian Russell and his family have been through. They were so incredibly brave. Online safety for our children and young people must be a requirement, not an afterthought.”
The schoolgirl, from Harrow, north-west London, was found dead in her bedroom after viewing online content related to suicide, depression and anxiety.
Executives from Pinterest and Meta, Instagram’s parent company, which also owns Facebook, were ordered to attend the inquest in person.
Andrew Walker, the coroner, said he did not think it “would be safe” to cite suicide as the cause of Molly’s death, opting instead to record that she died from an act of self-harm.
Giving his findings on Friday, he said: “Molly was going through a transitional period in her young life that made certain elements of communication difficult.”
She had been “exposed to material that may have affected her negatively”, and her depression had developed into “a more serious depressive disorder”, he told North London Coroner’s Court.
Mr Walker also said the “particularly graphic” content she saw “romanticized acts of self-harm”, “normalized her condition” and focused on a “limited and irrational view with no counterbalance to normality”.
Since Molly’s death in November 2017, her father Ian Russell has campaigned for better protection from potentially dangerous social media algorithms.
Mr Russell said after the inquest: “We have heard a senior Meta executive describe this deadly stream of content delivered by the platform’s algorithms to Molly as ‘safe’ and not violating the platform’s policies.
“If this insane trail of life-stealing content were safe, my daughter Molly would probably still be alive, and instead of being a grieving family of four, the five of us would be looking forward to the life of meaning and promise that lay ahead for our adorable Molly.
“It’s about time the toxic corporate culture at the heart of the world’s largest social media platform changed.”
Dame Rachel de Souza, the Children’s Commissioner for England, told Sky News that social media giants should remove such harmful content from their platforms following the inquest’s findings.
She added it was “despicable” that the companies were “putting profits ahead of child safety”.
Dame Rachel said: “There has been talk of actually fining and jailing designated executives who are responsible. I think it cannot come soon enough.
“Why can’t these companies take this material down now?
“I meet executives from the six big tech companies every six months. They agree to meet me and I keep asking them, ‘How many kids do you have online? Are you taking this material down?’.
“They try to avoid my questions. They are not doing enough. I honestly think we’re getting mealy-mouthed answers.
“As I have said, they need to get a moral compass and sort this out now; they can do it.”
Frances Haugen, a former Facebook employee turned whistleblower, told Sky News the social media giant should treat deaths like Molly’s as preventable.
She said: “What’s so shocking about Molly’s case is that it’s not unique.
“The reality is that these platforms take a child away from an interest like healthy eating and constantly push them towards more extreme content just by the way the algorithms are designed.”
Ms Haugen left the social media giant in May 2021, taking with her thousands of internal documents that sparked a slew of allegations – including that Facebook knew its products harmed teenagers’ mental health.
She continued: “I can imagine what’s happening inside Facebook now. There’s a deep mythology within Facebook that the good they produce outweighs the bad – that there might be a few tragic cases, but there’s so much value in connecting people that they can sleep soundly at night.
“Facebook should look at these deaths and treat them as preventable. I wish they would take some responsibility and act.”
Judson Hoffman, a Pinterest executive, apologised for some of the content Molly had seen and admitted the site was “not safe” when she was using it in 2017.
He said the platform is now using artificial intelligence to remove such content.
Elizabeth Lagone, head of health and wellbeing policy at Meta, which owns Facebook, Instagram and WhatsApp, told the coroner that some of the content Molly had seen leading up to her death was “safe”, while her family argued it promoted suicide.
In a heated exchange, she said the issue of removing content involving suicide or self-harm is “nuanced and complicated” and that it’s “important to give people that voice” when they’re experiencing these feelings.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email firstname.lastname@example.org. Alternatively, letters can be sent to the following address: Freepost SAMARITANS LETTERS.