Facebook has revealed more details about how its test program for combating revenge porn works. The social network has been trialing a system that allows users to upload naked images of themselves to Facebook so the company can detect and block any attempt by others to share them.
People were rather shocked at this idea, but even more so when it transpired that the uploaded nudes would be reviewed by Facebook employees. Keen to calm the storm that has whipped up around the issue, Facebook's global head of safety, Antigone Davis, has penned a blog post explaining that "we want to be clear about how it works."
She explains that the system has been piloted in Australia with the help of the eSafety Commissioner to allow people to be proactive in combating revenge porn. Rather than waiting for intimate images of themselves to appear online, users can take pre-emptive steps to block the appearance of specific images. It is described as an "emergency option," and it's something that is likely to be used by celebrities, or by people who find themselves in a position where they fear revenge porn is likely.

Davis explains the steps the process involves:
• Australians can complete an online form on the eSafety Commissioner's official website.
• To establish which image is of concern, people will be asked to send the image to themselves on Messenger.
• The eSafety Commissioner's office notifies us of the submission (via their form). However, they do not have access to the actual image.
• Once we receive this notification, a specially trained representative from our Community Operations team reviews and hashes the image, which creates a human-unreadable, numerical fingerprint of it.
• We store the photo hash -- not the photo -- to prevent someone from uploading the photo in the future. If someone tries to upload the image to our platform, like all photos on Facebook, it is run through a database of these hashes and if it matches we do not allow it to be posted or shared.
• Once we hash the photo, we notify the person who submitted the report via the secure email they provided to the eSafety Commissioner's office and ask them to delete the photo from the Messenger thread on their device. Once they delete the image from the thread, we will delete the image from our servers.
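The hash-and-block flow described above can be sketched in a few lines of Python. Note that Facebook has not published its actual matching code, and in practice a system like this would use a perceptual hash that tolerates resizing and re-encoding rather than a plain cryptographic digest; the function names here and the use of SHA-256 are purely illustrative of the idea that only fingerprints, not photos, are stored and compared.

```python
import hashlib

# The "database of hashes" -- no photo bytes are ever stored here.
blocked_hashes: set[str] = set()

def hash_image(image_bytes: bytes) -> str:
    """Create a human-unreadable numerical fingerprint of the image.
    (Illustrative: a real system would use a perceptual hash, not SHA-256.)"""
    return hashlib.sha256(image_bytes).hexdigest()

def register_report(image_bytes: bytes) -> None:
    """A reviewer hashes the reported image; only the hash is retained."""
    blocked_hashes.add(hash_image(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Every new upload is checked against the stored fingerprints."""
    return hash_image(image_bytes) not in blocked_hashes

# Usage: a reported image is blocked on re-upload, unrelated photos pass.
reported = b"example reported image bytes"
register_report(reported)
print(allow_upload(reported))          # False -- matching upload is blocked
print(allow_upload(b"another photo"))  # True  -- unrelated photo is allowed
```

The key design point the bullet list emphasizes survives in the sketch: once `register_report` runs, the image bytes themselves can be deleted, because matching needs only the stored hash.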

Defending the scheme, Cindy Southworth, executive vice president and founder of the Safety Net Technology Project, says:
If you've never tried to end a relationship with an abusive, controlling, and violent partner, there is no way you'd understand the very real terror victims feel of how much damage an abuser can and will do by sharing intimate images. This voluntary option provides another tool to victims to prevent harm.