Facebook and X must comply with UK online safety laws – minister


The recent decision by Meta, the owner of Facebook and Instagram, to change its fact-checking policy applies only to the US and will not affect compliance with UK law, according to UK Science Secretary Peter Kyle. Meta will still be required to remove illegal content within the UK’s jurisdiction. Kyle believes current UK online safety laws are “very uneven” and acknowledges there are gaps, including a lack of specific rules on content promoting self-harm or suicide. This is of particular concern to Ian Russell, whose daughter Molly took her own life after viewing harmful content online and who has urged the UK Prime Minister to tighten internet safety rules.

The Online Safety Act, passed in 2023, originally sought to compel social media companies to take down some content that was legal but harmful, including posts promoting eating disorders. That proposal faced opposition, including from the current Conservative Party leader, Kemi Badenoch, who described the bill as being “in no fit state to become law”, arguing that it could lead to censorship and that legislation was not required for “hurt feelings.” Kyle admits to being frustrated by the change but notes that the act still includes some strong powers for enforcing safety standards. He says he will use those powers assertively, including making sure online platforms provide age-appropriate content.

Under the Online Safety Act, social media companies must act to protect children from various forms of harmful content, including pornography, material promoting self-harm, bullying, content encouraging dangerous stunts, and child sexual abuse. The law also requires companies to take action against state-sponsored disinformation and misinformation. If their services are likely to be accessed by children, platforms will be required to adopt age-assurance technologies to prevent children from viewing harmful content.

Meta’s recent decision to replace fact-checkers with community notes, based on the X model, has been criticised; Ian Russell sees it as a move towards a “laissez-faire, anything goes model”. However, Meta has confirmed that there will be “no change to how we treat content that encourages suicide, self-injury, and eating disorders” and that the company will continue to scan for that high-severity content using its automated systems.

Read the full article from the BBC here: Read More