In a significant legislative move, the Australian Government has passed a controversial law banning users under 16 from social media platforms. The decision comes despite mixed opinions about the policy's effectiveness and implications, underscoring the ongoing debate about online safety and youth engagement in digital spaces.
The bill passed late last night, on parliament's last full sitting day of the year. The government was clearly keen to finalize the legislation before the end-of-year recess and the election expected early next year, which points to the political motivations behind the timing.
The amendments to the Online Safety Act introduce several key changes:
- Access to social media platforms will now be limited exclusively to users aged 16 and older.
- Exemptions will apply to messaging applications, online gaming, and any “services primarily aimed at enhancing the health and education of users,” including platforms like YouTube.
- Social media companies will be required to demonstrate that they have taken “reasonable steps” to prevent users under 16 from accessing their services.
- Platforms cannot mandate government-issued identification to verify user age.
- Fines for non-compliance can reach up to AUD 49.5 million (approximately USD 32.2 million) for significant violations.
- Neither parents nor minors will face penalties for violating these new regulations.
The new regulations take effect in 12 months, giving social media platforms time to implement the changes needed to comply.
The Australian Government has billed the initiative as a “world-leading” approach to protecting younger and more vulnerable users from online hazards and harmful content.
However, numerous experts, including some who have previously advised the government, have raised concerns about the law's potential ramifications, questioning whether pushing young people off social media platforms could cause more significant problems than allowing them to participate in these online communities.
Earlier this week, a coalition of 140 child safety specialists issued an open letter urging the government to reconsider its stance.
In the letter, they stated:
“The online environment serves as a vital space for children and teenagers to access information, develop social and technical skills, connect with family and friends, and explore the world around them. These experiences are crucial for children’s growth, supporting their rights and aiding their transition to adulthood.”
Other analysts have cautioned that banning mainstream social media platforms could drive young users toward less regulated alternatives, potentially increasing their risk exposure rather than decreasing it.
The specifics of which platforms will fall under this new legislation remain ambiguous, as the amended bill lacks clear definitions. While the government has confirmed that messaging services and gaming applications will not be subject to these new rules, it has verbally indicated that YouTube will be exempt. The bill broadly states that any platform with the “primary or significant purpose” of facilitating “online social interaction” will be regulated by these new measures.
This broad definition could encompass a wide range of applications, prompting debates about which services will be included. For instance, Snapchat attempted to argue that it functions primarily as a messaging app and should not be subjected to these restrictions; however, the government has stated that it will still be required to adjust its policies accordingly.
The bill's ambiguous language suggests that alternative platforms may emerge to fill any void left by these restrictions. At the same time, children will still be able to use apps like WhatsApp and Messenger, which could be just as dangerous under the new criteria.
It’s important to note that most major social media apps already enforce a minimum age of 13. This legislation effectively raises that threshold by three years, which may not significantly affect overall user engagement for most platforms, with the exception of Snapchat.
The real challenge, as highlighted by many professionals, is that despite existing age restrictions, there are currently no efficient methods for age verification or parental consent validation.
For instance, a 2020 report by The New York Times revealed that approximately one-third of TikTok’s 49 million U.S. users were under 14 years old, based on the platform’s own data. Although TikTok requires users to be at least 13 years old, concerns remained that many users fell below this threshold, and the platform lacked effective means to identify or verify their ages.
Over 16 million accounts potentially created by users under 14 raises significant questions about the reliability of self-reported ages. TikTok has since improved its detection capabilities, as have many other platforms, using AI and engagement tracking to identify underage users. But the reality remains that if 16-year-olds can access these apps, younger teens will likely find ways around any restriction.
In conversations with teenagers throughout the week (as a parent of two teenage children in Australia), I found that most were unconcerned about these new regulations. Many simply remarked, “How will they know?”
These adolescents have been using social media for years, regardless of their parents’ approval, and have become adept at circumventing age verification processes. Consequently, they feel confident that any changes will not affect their access to these platforms.
Given the government’s vague guidelines and descriptions, it’s likely that these teens are correct in their assumptions.
The critical question is what constitutes “reasonable steps” to keep minors off social media. If the measures platforms already employ are deemed “reasonable” under this framework, the legislation is unlikely to change much. Will the government impose stricter age verification requirements? It has already conceded that platforms cannot demand government ID, leaving little room for further action, and despite discussion of alternative age verification strategies, no specific proposals have been put forward so far.
Overall, it’s challenging to envision how the government will effectively implement substantial improvements, especially considering the variable nature of age detection across different platforms. Without establishing its own detection systems, the legal enforcement of these regulations could prove difficult.
For instance, Meta has developed advanced methods for age detection, while X lacks similar capabilities. Should X be held to the same standards as Meta, particularly if it lacks the resources to meet those requirements?
It remains unclear how the government could enforce the law in such cases unless it lowers the bar for what qualifies as “reasonable steps,” allowing platforms with weaker detection systems to comply.
At this juncture, it seems unlikely that this approach will yield effective results, even if one concedes that social media poses risks for teenagers and that restrictions should be implemented.
The debate over the impact of social media on youth continues, and while the Australian Government is grappling with these challenges, the upcoming election appears to be influencing its stance. With a majority of Australians favoring more stringent actions regarding youth online safety, the government seems to view this legislation as a potential electoral advantage.
This appears to be the most significant motivation behind advancing this bill at this time, despite numerous uncertainties and unresolved issues surrounding its implementation.