Facebook will work to create tool to 'stop the sharing' of intimate photos before they are posted to the platform

Facebook is working to create a tool to help stop the initial posting of non-consensual photos on the platform, according to Antigone Davis, global head of safety at Facebook. According to Davis, Facebook would intervene before someone could post a non-consensual, intimate photo of another person.

“We are trying to develop a program which you would be able to stop the sharing of those images before that initial share,” Davis said. “To remove the ability for someone to coerce somebody or threaten somebody. Hopefully, we will find something in the next few months.”

One in 25 Americans is a victim of nonconsensual image sharing, known as revenge porn, according to a 2016 study from the Data & Society Research Institute.

“The use of those images to coerce people into behaving in ways they don’t want to behave,” Davis said. “These are abhorrent, despicable behaviors.”

On Thursday, at a Bipartisan Task Force to End Sexual Violence Congressional briefing on online violence and harassment, Davis discussed non-consensual images and victims of intimate privacy alongside Delaney Henderson, an advocate for the prevention of sexual assault and a sexual assault survivor; Mary Anne Franks, a professor of law at the University of Miami; Danielle Keats Citron, a professor of law at the University of Maryland; and Brian O’Connor of Futures Without Violence.

“Nonconsensual pornography and online violence and harassment continues to grow and destroy lives,” said Rep. Jackie Speier, D-Calif., who has sponsored the Intimate Privacy Protection Act to legislate on this issue. “This is a vile form of sexual abuse and survivors are often left with no, or inadequate legal recourse. Although 35 states have adopted laws on nonconsensual pornography, this patchwork…varies in effectiveness.”

On the Facebook front, the future tool Davis teased builds on one the social network launched in April: if an inappropriate image is posted on the platform, users can report it even if they are not tagged in it. A trained professional then reviews the image and determines whether it violates Facebook’s community standards; if it does, the image is removed and the account that shared it may be disabled. That feature uses photo-matching technology to stop the image from spreading across Facebook’s other platforms, such as Messenger and Instagram.

“Most recently, we launched a tool that…if someone reports these images to us…if someone tries to reshare those images, they will be blocked from sharing them,” Davis said. “We’d like to go even further than that because there’s also a threat that comes with sharing those images.”
