Meta apologises for harmful Instagram posts seen by Molly Russell
A senior Meta executive has apologised for enabling a British teenager who took her own life to view graphic posts related to self-harm and suicide on Instagram that should have been removed, but defended other controversial content as “safe” for children.
Molly Russell from Harrow, London, died in November 2017 after viewing a large volume of posts on sites like Meta-owned Instagram and Pinterest related to anxiety, depression, suicide and self-harm.
Meta’s head of health and wellbeing, Elizabeth Lagone, told the inquest into Russell’s death at North London Coroner’s Court on Monday that the teenager had “viewed some content that violated our policies and we regret that”.
When asked if she was sorry, she added: “We are sorry that Molly saw content that violated our policies and we don’t want that on the platform.”
The inquest marks a reckoning for social media platforms, which are widely used by young people and whose business models historically prioritised fast growth, engagement and time spent viewing content.
Since Russell’s death, there has been a growing awareness of how algorithms can be designed to spread content that encourages users to engage with it, which has sometimes led to children being exposed to harmful material.
The inquest heard that in the last six months of her life, Russell engaged with around 2,100 posts related to suicide, self-harm or depression.
Lagone said that some posts Russell had interacted with had since been removed because they violated policies that were tightened in 2019 to prohibit graphic self-harm and suicidal content. She conceded that one video was not “suitable for anyone to watch”.
However she defended some self-harm content Russell had viewed as “safe” for children to see.
When asked by the Russell family’s barrister Oliver Sanders KC whether the self-harm and depression-related material Russell viewed was safe for children to see, she said: “Respectfully I don’t find it a binary question,” adding that “some people might find solace” in knowing they were not alone.
Senior coroner Andrew Walker interjected to ask: “So you are saying yes, it is safe . . . ?” to which Lagone replied: “Yes, it’s safe.”
Lagone was taken through a number of posts which Russell engaged with in the months before she died. She described them as “by and large admissive”, meaning they involved individuals recounting their experiences and potentially making a cry for help.
At the time of Russell’s death, Instagram permitted graphic posts that might enable people to seek help and support, but not those that encouraged or promoted suicide and self-harm.
Lagone said Instagram had “heard overwhelmingly from experts” that the company should “not seek to remove [certain content linked to depression and self-harm] because of the further stigma and shame it can cause people who are struggling”. She also said the content was “nuanced” and “complicated”.
In one exchange, Sanders said: “Why on earth are you doing this? . . . you’ve created a platform that’s allowing people to put potentially harmful content on it [and] you’re inviting children on to the platform. You don’t know where the balance of risk lies.”
Russell’s father, Ian Russell, told the inquest last week that he believed social media algorithms had pushed his daughter towards graphic and disturbing posts and contributed to her death.
Last year, a whistleblower leaked internal Instagram research which suggested that the app could have a negative impact on teenagers’ mental health, something the company said was misrepresented. This sparked widespread discussion among lawmakers and parents about the effects of social media on young minds.
A few weeks later, Instagram paused its plans to launch Instagram Kids, an app for under-13s.
Anyone in the UK affected by the issues raised in this article can contact the Samaritans for free on 116 123