TikTok probed for ‘providing inaccurate information’ amid clampdown on content harmful to children

The UK communications watchdog has launched a formal investigation into TikTok over allegedly inaccurate information the platform provided in response to a legal request.

Its inquiry stems from a report published by Ofcom on Thursday (14.12.23), evaluating the efforts of video-sharing platforms to prevent children from accessing harmful content.

TikTok, Snap, and Twitch were all called upon by Ofcom to provide details on how they adhere to legal obligations aimed at safeguarding youngsters’ physical, mental, and moral well-being.

The platforms should "explore improving how they identify children and stop them encountering harm", the report says.

While each of the platforms named in the document demonstrated measures to mitigate exposure to harmful videos, Ofcom discovered instances where children could still be at risk while using these services.

Of particular concern was TikTok's parental control system, known as ‘Family Pairing’, introduced in April 2020.

The system empowers parents to link their accounts with their children's, giving them control over screen time, direct messages, content filtering, and privacy settings.

Ofcom found discrepancies in the information TikTok supplied about Family Pairing in response to its legal request. Although TikTok attributed these to a technical glitch and promptly notified Ofcom, the regulator suspects the information provided was inaccurate.

Notably, more than a fifth of children aged eight to 17 reportedly have an online profile registered with an adult age, according to Ofcom's research.

Even though TikTok, Twitch, and Snap set a minimum age requirement of 13, users easily circumvent this by falsifying their age. The report emphasises the need for these platforms to enhance their methods of identifying and preventing children from encountering harmful content.

In contrast, the report highlights OnlyFans, which employs robust age verification measures for its adult content, including facial age estimation and ID checks.

The report also suggests TikTok, Twitch, and Snap should explore similar improvements in their age verification processes to ensure a safer online environment for underage users.

Ofcom notes that the platforms' current methods of identifying underage users, which combine artificial intelligence with human moderation, lack precision, making it difficult to gauge the scale of the issue.

The regulator remains open to updating its findings if TikTok provides more accurate information during the course of the investigation.

© BANG Media International