Pinterest leaned towards ‘lighter’ moderation of potentially harmful posts, inquest hears

Social media site Pinterest told staff to err on the side of “lighter content moderation” of potentially harmful posts at the time a 14-year-old British girl took her own life after viewing content involving suicide and self-harm, an inquest has heard.

Jud Hoffman, Pinterest’s head of community operations, told the North London Coroner’s Court on Thursday that the company’s guidance was “when in doubt, lean towards . . . lighter content moderation” in November 2017 when Molly Russell from Harrow, London, ended her life.

Russell died after viewing a large volume of graphic content online relating to self-harm and suicide on social media sites including Pinterest and Meta-owned Instagram.

Her father, Ian Russell, told the third day of the inquest on Thursday that he believed social media algorithms had pushed his daughter towards graphic, disturbing posts and blamed tech sites for contributing to her death.

“I believe that social media helped kill my daughter and I believe that too much of that content is still there,” he said.

The two-week hearing is shining a spotlight on the role social media algorithms play in pushing distressing content towards users who seek it out, and has ignited debate about the need for better regulation of technology companies.

Giving evidence on Thursday, Hoffman said that Pinterest’s policy in 2017 was to hide images that “may be considered disturbing” but not remove them. The company removed images or posts only if they promoted harmful behaviour or self-harm.

“It’s not what is in place now, but it’s what was in place at the time,” Hoffman said.

The inquest heard that the content Pinterest had recommended to Molly Russell had included an image of a bloody razor. She was emailed “ten depression pins you might like” by the platform, according to the family’s barrister, Oliver Sanders KC. After her death she continued to receive emails from Pinterest, including one entitled “new ideas for you in depression”.

Hoffman said: “I deeply regret that she was able to access some of the content” that she viewed, adding when pressed by Sanders that he would not show some of it to his own children.

“I find it troubling, I will say that,” he said.

Hoffman said Pinterest’s community guidelines and enforcement tools were tightened up in 2019 to proactively remove harmful content from the site, adding that the “technology that we have in place now was simply not available to us then”.

The inquest heard that at the time Russell died, Pinterest’s “computer vision technology” was being trained to recognise pictures of furniture, but could not spot patterns in other images.

Ian Russell said Pinterest had clearly made improvements to its content moderation, but argued that tech platforms needed to go much further to keep users safe. “If a 14-year-old can find a way of locating that content . . . I find it really hard to believe that some of the most powerful global brands in the world . . . can’t find a way to . . . help prevent the content reaching vulnerable people,” he said.

Hoffman and Elizabeth Lagone, head of health and wellbeing at Meta, have been called to appear in person after the coroner said they were not allowed to testify via remote link.

The high-profile hearing comes as the passage through parliament of the online safety bill, which aims to compel internet companies to keep their platforms safe, has been paused. Liz Truss, the new prime minister, is said to be considering relaxing a controversial clause that would make platforms responsible for removing content that was “legal but harmful”, such as bullying.
