Meta is taking further action as part of its long-running promise to combat sextortion and other forms of child sexual abuse material (CSAM). The company has revealed that Facebook and Instagram are founding members of Take It Down, an initiative from the National Center for Missing and Exploited Children (NCMEC) that helps young people and their parents remove intimate photos posted online. The system works from photos stored on a user's own device, and is designed so the images themselves never have to be shared.
Instead of uploading the photos themselves, concerned users visit Take It Down to submit hashes generated from the images on their own devices. If Facebook, Instagram and other program members spot those hashes elsewhere, they can pull and block the matching content so it won't proliferate. Meta notes that this isn't just for those under 18, either. Parents can act on a child's behalf, and adults can scrub images taken of them when they were younger. The NCMEC warns that platforms may have "limited capabilities" to remove content that's already online, but this could still help mitigate or undo the damage from unwanted sharing. We've asked Meta for clarification.
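The hash-matching idea can be sketched in a few lines. This is only an illustration, not Take It Down's actual implementation: the names here are invented, and production systems use purpose-built perceptual hashes (such as the PDQ algorithm Meta has open-sourced) so that resized or re-encoded copies still match, whereas the plain SHA-256 used below only catches byte-for-byte duplicates.

```python
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Compute a one-way fingerprint of an image locally.

    Only this digest ever leaves the device -- the photo itself is
    never uploaded. (Illustrative: real systems use perceptual
    hashing, not SHA-256.)
    """
    return hashlib.sha256(image_bytes).hexdigest()


class HashRegistry:
    """Toy stand-in for the shared list of reported hashes."""

    def __init__(self) -> None:
        self._reported: set[str] = set()

    def report(self, digest: str) -> None:
        # A user (or a parent acting on their behalf) submits only
        # the hash of the image they want taken down.
        self._reported.add(digest)

    def should_block(self, image_bytes: bytes) -> bool:
        # At upload time, a participating platform hashes the
        # candidate image and checks it against the reported set.
        return fingerprint(image_bytes) in self._reported


# A reported image is flagged on re-upload; unrelated images pass.
registry = HashRegistry()
private_photo = b"example private photo bytes"
registry.report(fingerprint(private_photo))
print(registry.should_block(private_photo))   # True
print(registry.should_block(b"other photo"))  # False
```

The design choice worth noting is that matching happens entirely on the platform side against opaque digests, which is why the NCMEC can operate the registry without ever holding the underlying images.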
Meta announced its anti-sextortion plans in November as part of a broader crackdown against "suspicious" adults messaging teens. The project is a follow-up to the StopNCII technology the company developed to fight revenge porn, and shares a similar implementation. This is the latest in a string of efforts to protect teens on Meta's social networks. The company already limits sensitive content for teen Instagram users and restricts ads targeting young audiences, for instance.
The action isn't entirely voluntary. Meta is under pressure from state attorneys general and other government bodies to show that it protects teens, particularly in light of whistleblower Frances Haugen's accusations that the firm downplayed research into Instagram's effects on mental health. The new takedown platform may lift some of that pressure even as it gives abuse survivors more control over their online presence.