Social media platforms blamed for the death of a teenager


A London coroner ruled on Friday (September 30) that harmful social media content contributed to the death of a teenager in 2017 “in a more than minimal way”.

The ruling is perhaps the first of its kind to directly blame social media platforms for a child’s death.

Molly Russell, a 14-year-old schoolgirl from London, died by suicide in 2017 after viewing online content about suicide and self-harm on platforms like Instagram and Pinterest.

“It wouldn’t be safe to leave suicide as the conclusion. She died from an act of self-harm while suffering from depression and the negative effects of online content,” Chief Coroner Andrew Walker said on Friday. The sites she visited “were not safe because they provided access to adult content that should not have been available to a 14-year-old,” he added.

What did the investigation reveal?

During the two-week inquest into Molly Russell’s death, it was revealed that although she appeared to be doing well in school, she was actually suffering from a depressive illness.

In the six months before her death, Molly had saved, liked or shared 16,300 posts on Instagram, of which more than 2,100, or about 12 a day, were related to suicide, self-harm and depression, The New York Times reported. It also emerged that she had created a digital pinboard on Pinterest with over 400 images on similar topics, The Guardian reported.

Walker told the court that Instagram and Pinterest used algorithms that resulted in “binge periods” of access to harmful material, some of which Molly had never requested.

“These binge periods probably had a negative effect on Molly,” Walker said. “Some of this content idealized acts of self-harm by young people. Other content sought to isolate her and discourage discussion with those who could have helped.”

What are the online platforms saying?

Senior executives from Pinterest and Meta – Instagram’s parent company – were ordered to attend the inquest in person. At a pre-inquest hearing, they had said they would give evidence remotely, citing short notice, Covid risks and busy work schedules, according to The Guardian. However, Walker said they would need to view video footage and documents, which required them to be present in court.

Pinterest’s Judson Hoffman apologized for some of the content the teenager viewed and agreed that Pinterest was “unsafe” when she used it. He said the platform now uses artificial intelligence to remove harmful content.

Elizabeth Lagone, head of health and wellness policy at Meta, told the inquest that the content about suicide and self-harm that Molly had accessed before her death was “safe”. However, she admitted that some of the posts Molly viewed might have violated Instagram’s policies, The Guardian reported.

While Lagone said she was sorry Molly had seen distressing content, she maintained that it was important for online platforms to allow people to express their feelings.

A “tobacco moment” for social networks

The inquest into Molly Russell’s death was significant because, for the first time, senior executives from Meta and Pinterest were summoned to give sworn evidence in a UK court, according to the BBC.

Andy Burrows, head of child safety policy at the National Society for the Prevention of Cruelty to Children (NSPCC), called the ruling “social media’s big tobacco moment”. He added: “For the first time anywhere in the world, it has been ruled that content a child was allowed and encouraged to see by tech companies contributed to their death.”

This is not the first time that social media platforms have been accused of promoting content that is dangerous for children, sometimes with deadly consequences. In July, TikTok was sued in the US by the parents of two young girls who died in 2021 after taking part in the viral ‘blackout challenge’. The platform was accused of ‘intentionally’ serving children deadly videos.

Michele Donelan, UK secretary of state for digital, culture, media and sport, said the inquest had “shown the horrific failure of social media platforms to put children’s wellbeing first”.

Calling the Online Safety Bill the answer to this problem, Donelan said that “through it, we will use the full force of the law to compel social media companies to protect young people from horrific pro-suicide content”.

Proposed changes to the Online Safety Bill

The Online Safety Bill aims to improve safety on the internet while helping to defend freedom of expression. Introduced in the UK parliament in March 2022, the bill, which seeks to lay down rules for how online platforms deal with harmful content, is a modified version of a draft first presented in May 2021.

Among other things, the bill aims to prevent the spread of illegal content and to protect children from harmful material. Platforms likely to be accessed by children will need to tackle content that is harmful to them, which includes, but is not limited to, posts that encourage self-harm or suicide.

Companies that fail to comply with the rules will face fines of up to £18 million or 10% of annual global turnover, whichever is greater, according to the bill.

Calling the ruling historic, Baroness Beeban Kidron, a member of the House of Lords, said she would table an amendment to the Online Safety Bill following the conclusion of the inquest into Molly Russell’s death, The Independent reported.

“I’m afraid my inbox in Parliament is full of people who have unfortunately lost children, and many of them are struggling to get the information they want, to get access, to get that transparency… And I will be bringing an amendment to the Online Safety Bill to the House of Lords that aims to make it easier for bereaved parents to access information from social media companies,” she said.
