TikTok is set to roll out new age-verification technology across the European Union in the coming weeks. The move comes as debate intensifies over whether users under 16 should have access to social media at all, particularly in countries such as the UK. The ByteDance-owned platform faces mounting pressure to accurately identify and manage accounts belonging to children.
The age-verification system has been quietly piloted in the UK over the past year. It evaluates profile information, posted videos, and user behavior to determine whether an account likely belongs to someone under 13. TikTok has said that accounts flagged by the system will be reviewed by specialist moderators rather than banned automatically, an approach intended to balance safety with fairness by allowing appeals before an account is removed. Thousands of accounts were removed during the UK pilot.
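TikTok has not disclosed how the system works internally, but the process described above, combining weak signals into a likelihood estimate and routing flagged accounts to human review rather than automatic removal, can be sketched in a few lines. Everything below is a hypothetical illustration: the signal names, scoring, and threshold are assumptions for clarity, not TikTok's actual implementation.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch only; no detail here reflects TikTok's real system.

@dataclass
class AccountSignals:
    profile_text: str          # e.g. bio and display name
    video_captions: List[str]  # text attached to posted videos
    behavior_score: float      # assumed 0-1 score from usage patterns

def estimate_underage_likelihood(signals: AccountSignals) -> float:
    """Toy scoring: fold several weak signals into one likelihood."""
    score = signals.behavior_score
    if "year 7" in signals.profile_text.lower():  # illustrative cue only
        score += 0.2
    return min(score, 1.0)

def route_account(signals: AccountSignals, threshold: float = 0.8) -> str:
    """Flagged accounts go to moderators; nothing is banned automatically."""
    if estimate_underage_likelihood(signals) >= threshold:
        return "human_review"  # specialist moderator decides; appeal possible
    return "no_action"

# Example: a borderline account crosses the threshold and is queued for review.
flagged = AccountSignals("year 7 student", ["first day of school"], 0.7)
print(route_account(flagged))  # -> human_review
```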
As part of a broader trend, Australia's ban on social media for under-16s took effect on December 10, 2025. According to the country's eSafety Commissioner, more than 4.7 million accounts have been removed across ten platforms, including TikTok, since the ban began, raising questions about how effectively social media platforms can enforce age restrictions.
European authorities are scrutinizing how social platforms verify users' ages, particularly in light of the bloc's stringent data-protection rules. In the UK, Prime Minister Keir Starmer has recently expressed openness to a social media ban for young people, citing excessive screen time among children and teenagers, including reports of five-year-olds spending hours on devices each day. Starmer had previously opposed outright bans, warning they could push teenagers toward less safe corners of the internet.
Calls have also grown for parents to be given greater rights over their children's social media accounts. Ellen Roome, whose 14-year-old son, Jools Sweeney, died in circumstances she believes may be linked to an online challenge, has campaigned for bereaved parents to be granted access to their children's accounts, reflecting wider demands for accountability from social media platforms.
The European Parliament is pushing for age limits on social media platforms, while Denmark has proposed a ban for users under 15. Against this backdrop, TikTok confirmed to Reuters that its new age-verification technology was designed specifically to comply with the EU's regulatory requirements and was developed in collaboration with Ireland's Data Protection Commission, TikTok's lead privacy regulator in the EU.
A 2023 report from The Guardian had already raised concerns about existing practices, revealing that moderators were instructed to let suspected under-13s remain on the platform if they claimed a parent was overseeing the account. The revelation further fueled debate over the effectiveness of current age-verification measures and platforms' responsibilities toward younger users.
As TikTok rolls out its enhanced age-verification system, the results will be watched closely by regulators and concerned parents alike, amid growing demand for safer online environments for children.
