Meta will start using AI to scan photos and videos for visual clues that a user is under 13 and should be removed from Facebook and Instagram, the company announced on Tuesday. These visual clues include a person’s height or bone structure, it said.
“We want to be clear: this is not facial recognition,” Meta explained in its blog post. “Our AI looks at general themes and visual cues, for example height or bone structure, to estimate someone’s general age; it does not identify the specific person in the image. By combining these visual insights with our analysis of text and interactions, we can significantly increase the number of underage accounts we identify and remove.”
The visual analysis system is now operating in select countries, but Meta says it’s working toward a broader rollout.
Meta says this system is part of its efforts to keep kids under 13 off its platforms. These efforts include using AI to analyze entire profiles for contextual clues, such as birthday celebrations or mentions of school grades. The company looks for these signals across different formats, such as posts, comments, bios, captions, and more. Meta plans to expand this technology to more parts of its apps, including Instagram Live and Facebook Groups, in the future.
If Meta determines that a person may be underage, it will deactivate their account, and the user will need to prove their age through the company’s age verification process to prevent the account from being deleted.
The announcement comes weeks after a New Mexico jury ordered Meta to pay $375 million in civil penalties for misleading consumers about the safety of its platforms and putting children at risk. The company was also ordered to implement fundamental changes to its platforms. Meta has s …