After a 14-year-old Instagram user’s suicide, Meta apologizes for (some of) the self-harm and suicide content she saw



A senior executive at Meta apologized on Monday for allowing a British teenager to view graphic Instagram posts related to self-harm and suicide before she took her own life.

Meta’s head of health and wellbeing, Elizabeth Lagone, told an inquest at North London Coroner’s Court examining the circumstances of Molly Russell’s death that the teenager had “viewed some content that violated our policies and we regret that.”

Russell was a 14-year-old girl from Harrow, London, who killed herself in November 2017 after viewing a large amount of content on Meta’s Instagram and Pinterest relating to anxiety, depression, and self-harm. In the last six months of her life, Russell had saved 16,300 images on her Instagram account, 2,100 of which were linked to depression and suicide.

Lagone told the court, “We are sorry that Molly saw content that violated our policies, and we don’t want that on the platform,” but stopped short of condemning all the controversial content on Russell’s Instagram.

Lagone argued it is not “a binary question” whether the self-harm and depression-related material viewed by Russell that was found to comply with Meta’s policies was safe for children to see, telling the court that “some people might find solace” in knowing they were not alone.

The senior coroner in the inquest, Andrew Walker, interrupted the proceedings to ask Lagone, “So you are saying yes, it is safe . . . ?” Lagone replied, “Yes, it’s safe.”

Nuanced and complicated content

After going through a large number of posts relating to suicide and self-harm that Russell had saved, liked, and shared in the final six months of her life, Lagone argued that most were “by and large admissive,” meaning the people posting them were recounting their experiences with mental health struggles and potentially making a cry for help.

Lagone argued that Instagram had heard “overwhelmingly from experts” that the company should “not seek to remove [certain content linked to depression and self-harm] because of the further stigma and shame it can cause people who are struggling,” noting the content was “nuanced” and “complicated.”

The Russell family’s barrister, Oliver Sanders, grew heated in response to Lagone’s answers, asking, “Why on earth are you doing this? . . . You’ve created a platform that’s allowing people to put potentially harmful content on it [and] you’re inviting children on to the platform. You don’t know where the balance of risk lies.”

“You have no right to. You are not their parent. You are just a business in America,” Sanders argued.

At the time of Russell’s death, Instagram allowed graphic posts referencing suicide and self-harm on its platform, which the company said created a space for people to seek help and support. In 2019, it reversed this policy and banned all graphic images of self-harm, noting at the time, “collectively it was advised [by mental health experts] that graphic images of self-harm – even when it is someone admitting their struggles – has the potential to unintentionally promote self-harm.”

Social media platforms have so far operated in a regulatory wild west, prioritizing fast growth, engagement, and time spent viewing content over safety features. But momentum is now building for greater oversight of how tech giants’ algorithms spread content that may harm the users who engage with it.

One of the most notable whistleblowers in the space, Frances Haugen, leaked a huge trove of internal documents, known as the Facebook Papers, which showed that the social media giant Meta (then still called Facebook) disregarded user safety in the pursuit of profit.

The company’s own internal research found that Instagram was especially harmful to a large portion of young users and had a profoundly negative impact on teenage girls.

Ian Russell, the father of Molly Russell, told the inquest last week that he believes algorithms on the social media platforms pushed his daughter towards graphic and disturbing posts and contributed to her death.

Meta did not immediately respond to Fortune’s request for comment.
