“Once you send that photo, you can’t take it back” is a common warning to teenagers, but it often overlooks the fact that many teens send explicit images under pressure or without fully grasping the consequences.
A new online tool aims to give some control back to these teens, or to adults whose images were taken when they were minors, by helping remove explicit images and videos of themselves from the internet.
The tool, called Take It Down, is operated by the National Center for Missing and Exploited Children (NCMEC) and partially funded by Meta Platforms, the owner of Facebook and Instagram.
It allows users to anonymously create a digital fingerprint, or “hash,” of an image without uploading the actual photo. The hash is stored in a database, and participating tech companies use it to detect and remove matching images from their services.
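The article does not name the hashing algorithm Take It Down uses, but the general idea of a fingerprint computed entirely on the user’s own device can be sketched with a standard cryptographic hash. The SHA-256 choice and the filename below are illustrative assumptions, not the service’s actual method:

```python
import hashlib

def fingerprint(image_path: str) -> str:
    """Compute a hex digest of an image file entirely on the local device.

    Only this short string would ever be submitted to a database;
    the photo itself never leaves the device.
    """
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical usage: the digest identifies the file without revealing its contents.
print(fingerprint("photo.jpg"))
```

A service holding only such digests can check whether an uploaded image matches a reported one by hashing the upload and comparing strings, without ever storing the sensitive image itself.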
Currently, the participating platforms include Meta’s Facebook and Instagram, Yubo, OnlyFans, and Pornhub (owned by MindGeek).
Images on other sites, or those sent through encrypted platforms like WhatsApp, will not be removed. Additionally, if an image is altered—for instance, by cropping, adding an emoji, or turning it into a meme—it will require a new hash.
Visually similar images, such as the same photo with or without an Instagram filter, will have nearly identical hashes, differing by just one character.
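Behavior like this, where small visual edits leave the fingerprint nearly unchanged, is characteristic of perceptual hashing rather than cryptographic hashing. The article does not specify which algorithm Take It Down uses, but the idea can be sketched with the open-source Python imagehash library; the filenames and the match threshold here are illustrative assumptions:

```python
from PIL import Image
import imagehash

# Hypothetical files: the same photo with and without a filter applied.
original = imagehash.phash(Image.open("photo.jpg"))
filtered = imagehash.phash(Image.open("photo_with_filter.jpg"))

# imagehash overloads subtraction to return the Hamming distance:
# the number of bits at which the two fingerprints differ.
distance = original - filtered

# A small distance suggests the images are visually the same;
# the threshold of 5 is an illustrative choice, not a standard.
print("likely match" if distance <= 5 else "different image")
```

Under a scheme like this, a filtered copy of a photo yields a fingerprint only a few bits away from the original, while a heavily cropped or meme-edited version can drift far enough that it must be reported and hashed separately.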
“Take It Down is designed specifically for people who have an image that they believe is already on the web or could be,” said Gavin Portnoy, a spokesperson for NCMEC.
“If you’re a teen dating someone and you share an image, or if someone extorts you by demanding an image, this tool is for you.”
Portnoy explained that teens might feel more comfortable using this tool rather than involving law enforcement, as it offers anonymity.
“To a teen who just wants the image taken down without involving law enforcement, this is a significant solution,” he said.

NCMEC has seen a rise in reports of online exploitation of children: its CyberTipline received 29.3 million reports in 2021, a 35% increase over 2020.
Meta previously attempted a similar tool for adults in 2017, asking users to send their encrypted nudes to Facebook; the idea was poorly received and was tested only briefly in Australia.
However, online sexual extortion and exploitation have worsened since then for both children and adults. Many tech companies already use this kind of hash-matching to detect and remove images of child sexual abuse and to report them to law enforcement.
Portnoy said the goal is for more companies to join the initiative, noting, “We never had anyone say no.”
As of now, Twitter and TikTok have not committed to the project and did not respond to a request for comment.
Antigone Davis, Meta’s global head of safety, stated that Take It Down is one of several tools the company uses to combat child abuse and exploitation on its platforms.
“In addition to supporting this tool, we have reporting and blocking systems on our platform. We also implement measures to prevent such situations, such as not allowing unconnected adults to message minors,” Davis said.
The site works with real images as well as AI-generated ones, including “deepfakes” created to look like real people doing things they didn’t actually do.