Understanding Meta’s New Strategies to Protect Children on Facebook and Instagram
Meta has detailed new measures aimed at preventing children under the age of 13 from accessing Facebook and Instagram. Alongside AI systems that analyze contextual clues in user profiles, posts, and captions, such as references to school grades or birthday celebrations, the company is now also scanning images and videos for additional indicators of a user's age.
How Does Meta Differentiate Its AI from Facial Recognition Technology?
In a recent blog post, Meta emphasized, “We want to be clear: this is not facial recognition.” The company says its AI examines general themes and visual cues such as height and bone structure to estimate a person's approximate age without identifying the individual in the image. By combining these visual signals with its analysis of text and user interactions, Meta aims to improve the detection and removal of underage accounts from its platforms.
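The combination of text-derived and image-derived signals described above can be sketched as a simple aggregation. This is a minimal illustration with hypothetical function and parameter names (`estimate_age`, `text_signals`, `visual_signals`); Meta's actual models are not public.

```python
def estimate_age(text_signals: list[float], visual_signals: list[float]) -> float:
    """Combine independent per-signal age estimates (e.g. one derived
    from a caption mentioning a school grade, one from a visual cue
    such as apparent height) into a single estimate by averaging.
    Hypothetical sketch; not Meta's actual method."""
    signals = text_signals + visual_signals
    if not signals:
        raise ValueError("no age signals available")
    return sum(signals) / len(signals)

# Two text-based estimates and one visual estimate combine to 12.0.
print(estimate_age([12.0, 11.0], [13.0]))  # 12.0
```

A real system would weight signals by reliability rather than averaging them equally; the point here is only that text and visual cues feed one shared estimate.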
What Actions Will Meta Take If a User Is Identified as Underage?
Meta says its visual analysis is currently rolling out “in select countries.” If the platform suspects that a user is under the age of 13, it will deactivate their account. The deactivated user must then submit proof that they are 13 or older to regain access. If they cannot provide such proof, Meta will permanently delete the account.
How Is Meta Enhancing User Safety for Teens Aged 13 to 15?
Beyond its efforts for users under 13, Meta is expanding systems that identify users between the ages of 13 and 15 and automatically move them into teen accounts, which come with parental controls and built-in protections. The rollout begins on Instagram in Brazil and will extend to 27 countries in the European Union. These measures are also coming to Facebook for the first time, starting in the United States and expanding to the EU and UK in the following month. On WhatsApp, Meta has recently introduced parent-managed accounts that let users under 13 use the app more safely.
What Regulatory Pressures Is Meta Facing to Protect Young Users?
Meta is under significant scrutiny from various jurisdictions over its responsibility to protect younger users and to keep children under 13 off Facebook and Instagram. The European Commission recently disclosed preliminary findings from an investigation into both platforms, suggesting that Meta may be violating the Digital Services Act by failing to take adequate measures to prevent children from signing up. The company will have the opportunity to review the findings and respond to the investigators' concerns, which may require it to strengthen its protections.