Social media firms asked to toughen up age checks for under-13s

UK regulators have urged major technology companies to implement stronger age verification measures to better protect children under 13 on their platforms. This call comes from both the media regulator Ofcom and the Information Commissioner’s Office (ICO), targeting popular social media and video services including Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X. The agencies want these platforms to adopt more rigorous methods than the current self-reporting system, which is easily bypassed by underage users.

Ofcom’s Chief Executive Melanie Dawes criticized tech firms for failing to prioritize children’s safety in their product design. Many companies, however, argue they already have sufficient safeguards in place. Google, which owns YouTube, expressed surprise at Ofcom’s stance and recommended the regulator concentrate its efforts on services presenting higher risks. Nevertheless, both Ofcom and the ICO emphasize the need for a stricter approach to preventing children under 13 from registering accounts, since most platforms set 13 as the minimum age but rely heavily on users’ honesty about their birthdates.

Research from Ofcom highlights that 86% of children aged 10 to 12 have social media profiles, underscoring the gap between age limits and actual usage. While legally enforced age verification already applies to services featuring adult content, such as online pornography, equally stringent checks to keep young children off social media remain voluntary. The ICO has also raised concerns that platforms which fail to enforce their own minimum age policies may be unlawfully processing the personal data of children under 13, further stressing the importance of compliance.

Technology companies responded with details about their current efforts. Meta, which owns Facebook and Instagram, stated it employs artificial intelligence and facial age estimation technologies, and supports age verification at the app-store level to reduce repeated data requests from families. Snapchat and TikTok are trialing or using enhanced tools to identify and remove underage accounts, with TikTok reporting it deleted over 90 million suspected accounts belonging to children under 13 in a recent twelve-month period. Roblox highlighted new safety features and mandatory age checks for chat access.

Digital mental health specialist Professor Amy Orben welcomed the regulatory initiatives but insisted that safety must be integrated into products from the start rather than treated as an afterthought. Social media analyst Matt Navarra pointed to the challenge of addressing harmful algorithms, noting that verifying users’ ages is only the first step toward creating safer online environments for children.

Read the full article from the BBC.