Accurately determining user ages on social media has become an increasingly critical challenge, especially in light of recent legislative initiatives proposed by the Australian Government. A revealing statistic from TikTok highlights the scale of the problem:
“Every month, we remove around 6 million accounts globally because we believe that our minimum age requirement has not been met.”
This figure is derived from TikTok’s machine-learning detection and identification methods, which likely capture only a fraction of the younger users attempting to bypass the platform’s age restrictions. It underscores how difficult age verification remains to enforce on social media.
This statement is part of TikTok’s comprehensive update regarding user safety in Europe, particularly focusing on the protective measures being implemented to shield young users from potential harm while using the app.
According to TikTok, the platform has 175 million users across the EU, a figure that includes both a substantial number of young teens eager to engage with the app and users facing mental health challenges.
In response to these concerns, TikTok is introducing a series of significant updates aimed at enhancing user safety, which encompass:
- Collaborating with non-governmental organizations (NGOs) throughout Europe to implement a new in-app feature that connects users who report harmful or distressing content directly to mental health resources and support services
- Implementing restrictions on certain appearance-altering effects for users under 18, aimed at protecting young individuals from negative body image issues
- Advancing to the next phase of its EU data separation initiative (Project Clover) to ensure that the personal data of EU users is stored and processed within the region
The most noteworthy update is the restriction on appearance-altering effects, which is informed by recent research into how teenagers engage with social media apps.
The report indicates:
“When it came to filters and effects, teens and parents expressed concerns that beauty filters could be particularly enticing for girls who feel heightened pressure to compare themselves to their peers and meet unrealistic beauty standards […] Many teens believe that filter disclosures should be mandatory rather than optional. Additional recommendations included limiting the use of filters to older teens, removing filters that subtly alter appearances altogether, and creating barriers to discourage users from applying filters.”
This insight has prompted TikTok to impose restrictions on such filters, which may contribute to alleviating harmful comparison behaviors among users on the app.
These measures are particularly relevant given the large numbers of young teens who continually seek access to the platform.
Addressing this issue has been a longstanding challenge for TikTok, as previous internal reports suggested that approximately one-third of the app’s U.S. user base might be under the age of 14.
It’s essential to note that TikTok sets the minimum age for account creation at 13. However, the Australian Government is currently working on legislation that would prohibit users under 16 from maintaining social media accounts. Other regions are also exploring similar regulations.
Given TikTok’s own statistics, it is evident that a significant number of underage users are trying to access the platform, and under Australia’s proposed rules, failure to keep them out could soon carry financial penalties.
The removal of six million accounts per month from a single platform underscores the enormity of this issue, raising questions about how Australian authorities plan to detect and enforce these new regulations effectively.
However, TikTok, like other social media platforms, is actively working to enhance its detection capabilities regarding age verification and user safety.
Whether these measures will suffice to meet the new legal requirements remains to be seen.