The UK regulator, Ofcom, has released its final set of rules aimed at better protecting children online, describing them as groundbreaking. Platforms must change the algorithms that recommend content to young users and strengthen age verification processes by July 25, or risk hefty fines. Platforms hosting pornography, or content promoting self-harm, suicide, or eating disorders, must take stronger measures to prevent children from accessing that material.
Ian Russell, chairman of the Molly Rose Foundation, expressed disappointment at the lack of ambition in the newly published codes. Despite such criticism, Ofcom's Dame Melanie Dawes defended the regulations, emphasizing age verification as a crucial first step towards a safer online experience for children. She acknowledged the challenges of enforcing the rules, but said they mark a significant shift in online safety standards.
Former Facebook safety officer Prof Victoria Baines viewed the regulations as a positive development, highlighting the efforts of major tech companies in addressing online safety concerns. The rules require platforms to adjust their algorithms to filter out harmful content from children’s feeds and recommendations, in addition to implementing streamlined reporting and complaint systems. Each platform must appoint a designated individual responsible for children’s safety, with regular reviews of risk management practices.
Failure to comply by the deadline could result in fines or even the blocking of a site or app in the UK, according to Ofcom. The rules, introduced under the Online Safety Act, remain subject to parliamentary approval. Even so, they are widely seen as a significant step towards creating a safer online environment for children in the UK.
Read the full article from the BBC.