Determining the actual age of a person through a smartphone screen has long been a frustrating game of cat and mouse for social media platforms. For years, the industry relied on the honor system or easily forged birth dates, leaving a wide gap for children under 13 to bypass safety filters. This week, Meta is attempting to close that gap by moving beyond what users say about themselves and instead analyzing how they are physically built. The company is introducing a surveillance layer that uses artificial intelligence to scrutinize the physical proportions of users to enforce its age restrictions.

The Mechanics of Skeletal Age Identification

On Tuesday, Meta announced the activation of an AI system designed to extract visual cues from photos and videos to estimate a user's age. The company is careful to distinguish this technology from traditional facial recognition, which identifies specific individuals. Instead, the system focuses on general anatomical markers, specifically analyzing height and skeletal structure to determine whether a user is likely under the age of 13. By examining limb ratios and the overall frame of the person in the media, the AI generates a probabilistic age estimate.
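Meta has not published how its model works, but the idea of a probabilistic estimate from body proportions can be sketched in miniature. The sketch below is purely illustrative: the two features, the weights, and the logistic form are all invented assumptions, chosen only to show how proportion measurements could map to an under-13 probability (children tend to have a larger head-to-height ratio than adults).

```python
import math

def under_13_probability(head_height_ratio: float, limb_torso_ratio: float) -> float:
    """Illustrative only: map two hypothetical body-proportion features
    to a probability that the subject is under 13. All weights and
    offsets here are invented; Meta has not disclosed its features."""
    # A larger head relative to total height pushes the score up;
    # longer limbs relative to the torso push it down.
    score = 40.0 * (head_height_ratio - 0.16) - 2.0 * (limb_torso_ratio - 1.0)
    return 1.0 / (1.0 + math.exp(-score))  # logistic squash to (0, 1)

# Adult-like proportions (head ~1/8 of height) vs. child-like (~1/5).
adult_p = under_13_probability(0.125, 1.1)
child_p = under_13_probability(0.20, 0.9)
```

The output is a probability rather than a hard label, which is what allows the platform to combine it with other signals before acting.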

This visual analysis does not operate in a vacuum. Meta integrates these skeletal insights with a broader array of behavioral and textual data. The AI scans user-generated content for specific keywords, such as mentions of school grades or birthday celebrations, and analyzes profile biographies for clues that suggest the user is a child. When these disparate data points align to suggest a user is under 13, the system triggers an automatic response. Currently active in select regions, the system is slated for expansion across Instagram Live and Facebook Groups. Once flagged, the account is immediately deactivated, and the user must complete a formal age verification process to avoid permanent deletion.
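The flagging logic described above, a visual estimate fused with textual cues and checked against a threshold, can be sketched as follows. Everything here is an assumption made for illustration: the keyword patterns, the 60/40 weighting, and the threshold are invented, since Meta has not disclosed its actual signals or model.

```python
import re

# Hypothetical keyword evidence of an underage user; Meta's real
# signal list is not public.
UNDERAGE_PATTERNS = [
    r"\b(6th|7th)\s+grade\b",
    r"\bturning\s+1[0-2]\b",      # "turning 10" through "turning 12"
    r"\bmiddle\s+school\b",
]

def text_signal(bio_and_posts: str) -> float:
    """Score 0..1 from keyword evidence in bios and posts."""
    hits = sum(bool(re.search(p, bio_and_posts, re.IGNORECASE))
               for p in UNDERAGE_PATTERNS)
    return min(1.0, hits / len(UNDERAGE_PATTERNS) * 1.5)

def should_flag(visual_p_under_13: float, bio_and_posts: str,
                threshold: float = 0.75) -> bool:
    """Fuse the visual age estimate with textual cues; crossing the
    threshold would trigger deactivation and formal age verification."""
    fused = 0.6 * visual_p_under_13 + 0.4 * text_signal(bio_and_posts)
    return fused >= threshold

flagged = should_flag(0.9, "so excited for 7th grade and turning 12 soon!")
not_flagged = should_flag(0.1, "weekend hiking photos and coffee reviews")
```

Requiring multiple independent signals to align before acting is what distinguishes this from a simple keyword filter: a single birthday mention alone would not cross the threshold.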

From Textual Filters to Biometric Surveillance

For the better part of a decade, age gating was a matter of text-based filtering. Platforms looked for keywords or relied on the user's own input during sign-up. The shift to visual skeletal analysis represents a fundamental change in the depth of surveillance. By moving from the analysis of what a user writes to the analysis of how a user is physically constructed, Meta is implementing a filter that is significantly harder to spoof. A child can lie about their birth year in a text field, but they cannot easily alter their skeletal proportions in a video.

This technical pivot coincides with the aggressive rollout of Teen Accounts, a suite of restrictive settings designed to create a walled garden for adolescent users. These accounts are private by default, restrict direct messages to only those the teen follows, and automatically hide harmful comments. While these features are already active in Brazil and 27 European Union countries, Meta is now expanding the rollout to the United States, the United Kingdom, and other European markets. The integration of skeletal AI provides the enforcement mechanism that the Teen Accounts policy previously lacked, ensuring that the protections are applied to the right demographic.

The timing of this deployment is not coincidental. It follows a significant legal blow in New Mexico, where a jury recently ordered Meta to pay $375 million in civil penalties. The court found that Meta had misled consumers regarding the safety of its platforms and had exposed children to undue risks. The skeletal analysis system is a direct response to this legal pressure, serving as a technical shield against further accusations of negligence. By automating the removal of underage users through physical analysis, Meta is attempting to prove to regulators that it can proactively police its own borders.

As the precision of these models increases, the boundary between safety and surveillance continues to blur. The platform's authority now extends beyond the monitoring of speech and interaction, reaching into the analysis of the human body itself.