Molly Russell: Tech firms still failing after teenager's death, says father


Ian Russell, the father of Molly Russell, has expressed his disappointment at social media companies' continued failure to remove harmful content from their platforms, and fears that more young people will lose their lives as a result. New research by the Molly Rose Foundation indicates that young users can still access suicide and self-harm content on popular platforms such as TikTok, Instagram, and Pinterest. Although these sites have introduced new tools to limit access to harmful material, half of the harmful posts the foundation identified on TikTok had been viewed more than a million times.

An inquest concluded last year that 14-year-old Molly Russell died from an act of self-harm while suffering from depression and the negative effects of online content she had viewed on Pinterest and Instagram. To assess the current situation, a researcher from the foundation studied more than 1,000 posts and videos published between 2018 and October 2021, identified through 15 harmful hashtags that Molly had engaged with.

The research found that almost half of the posts the researcher viewed on Instagram contained content that “displayed hopelessness, feelings of misery, and highly depressive themes.” On Pinterest, the researcher was actively recommended images of “stylized people in freefall through the air, drowning, and people standing on cliff tops.” Online safety campaigner Ian Russell describes this as a “systemic failure” that will continue to cost young lives.

The platforms have been commended for introducing tools to support teens and families, blocking sensitive search terms, and working to detect and remove harmful content quickly. However, the report maintains that they have not done enough to keep their users safe, citing a failure to adequately tackle harmful material, design choices that increase exposure to negative content, algorithms that actively spread harmful content, and community standards that are too narrow.

Regulator Ofcom has drawn up codes of practice that tech companies must follow, which will be enforced by law. The government believes the Online Safety Act, which became law in October 2023, should address these kinds of issues. Professor Louis Appleby, a government adviser on suicide prevention and professor of psychiatry at the University of Manchester, believes it is time for tech companies to take responsibility for their images and algorithms and do more to address these problems.

Read the full article from the BBC.