The Online Safety Act is one year old. Has it made children safe?


The Online Safety Act (OSA), passed in the UK last year with the aim of making the nation the safest place to be online, faces its biggest test with the emergence of an online forum designed to promote suicide and self-harm. The site is accessible to anyone, including children, and according to a BBC investigation at least five young UK nationals have died after contact with it. Although the communications regulator Ofcom has warned the forum’s administrators that they are breaking UK law, the site remains live and accessible to vulnerable individuals, raising serious questions about the reach of online regulation and the effectiveness of the OSA.

Ofcom recognises that smaller websites based abroad with anonymous users, such as the suicide forum, pose a significant challenge to regulators. By contrast, platforms owned by big tech companies, such as Instagram, Facebook, and WhatsApp, are coming under more intense scrutiny to comply with the new legal measures. Instagram, for example, recently announced major changes to protect children online, including “teen accounts” for under-16s with stronger parental controls. Campaigners nonetheless argue that the OSA is not tough enough and needs to be enforced more aggressively, amid continuing concerns over age verification, censorship, and disinformation online.

The introduction of the OSA is a response to the growing recognition that big tech social media platforms hold significant sway over the public yet are rarely held accountable. The law covers a range of issues, including access to pornography, terrorism content, fake news, and children’s safety. The legislation, which is being brought into force in three phases, is backed by the possibility of multi-million-pound fines against platforms, of up to £18 million or 10 per cent of global turnover, and even criminal sanctions for tech bosses who repeatedly refuse to comply.

There is no doubt that the OSA has sparked much-needed change in the industry. But big tech platforms are transnational, and only a global approach can force meaningful change. In the absence of a uniform international legal framework and consistent political accountability, sites such as the suicide forum can continue to operate, putting vulnerable young people at risk.

Read the full article from the BBC here.