As calls come for Facebook to crack down on how it moderates Live posts, The Drum looks back at some of the challenges the social network has faced over the past year when it comes to self-censorship.
At the end of last week Facebook faced mounting pressure to impose stricter measures on its Live video feature after allowing a disturbing video depicting torture to resurface on the site.
Just days after being removed for violating the platform's community standards, the video showing the attack on a young man with disabilities in Chicago resurfaced within Facebook's walls, having been repackaged and re-uploaded by right-wing news site the Daily Caller, attracting millions of new viewers.
The case comes as increasing scrutiny falls on Mark Zuckerberg's site around its status as a media company and highlights the mammoth challenge faced by the Silicon Valley giant when it comes to moderating content.
Over the past 12 months in particular, the social network has struggled to lock down its handling not only of sensitive editorial content, but also of provocative video, photo and live content from advertisers and ordinary users.
Facebook addressed the difficulty of real-time moderation in a blog last year, saying that on Live one of the most sensitive situations involves "people sharing violent or graphic images of events taking place in the real world."
"In those situations, context and degree are everything. For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video," it added.
Facebook is thought to employ roughly 800 to 1,000 moderators who speak 24 languages between them to moderate the billions of posts on the platform each day. According to We Are Social Media, most content is examined by algorithms first of all, and everything users report is sorted by category and sent to moderation teams.
Here, The Drum looks back at the instances in which Facebook's self-censorship has been challenged in the past year, and how the social behemoth has responded.
Nick Ut's 'Napalm Girl'
Facebook U-turned on a decision to censor the iconic 'Napalm Girl' photo taken during the Vietnam War amid increasing pressure from the global media in September last year.
Taken by photographer Nick Ut for the Associated Press, the 1972 Pulitzer Prize-winning picture was deemed unsuitable for Facebook due to the fact it depicted an unclothed child.
A media campaign spearheaded by Norway's largest newspaper Aftenposten gained momentum online after the editor penned a scathing letter to Mark Zuckerberg, and ultimately Facebook caved and reversed its decision.
"Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal," it said at the time.
Philando Castile police shooting
The removal of footage showing the aftermath of Philando Castile's death, after he was fatally shot by a police officer, was put down to a "technical glitch", and Facebook was quick to restore the video in this instance.
In July 2016 the Minnesota father was shot dead by police in his car as he reached for his license after being asked by officers to show it. His fiancée Diamond Reynolds went live on Facebook immediately afterwards to show how the unprovoked attack had played out. The 10-minute video was removed a day later; however, a spokesperson put it down to a technical error and restored the footage.
The cultural fallout from the tragedy marked a turning point in citizen journalism’s parity with news broadcasters in the mind of the viewer.
"The images we've seen this week are graphic and heartbreaking, and they shine a light on the fear that millions of members of our community live with every day," said Mark Zuckerberg at the time.
"While I hope we never have to see another video like Diamond's, it reminds us why coming together to build a more open and connected world is so important -- and how far we still have to go."
Pink Ribbon Germany 'Check It Before It's Removed'
In August last year Pink Ribbon Deutschland actually set out to violate Facebook's community guidelines with its 'Check It Before It's Removed' breast cancer awareness campaign.
A series of images from the charity aimed to bring attention to self-examination using photos of women exposing their breasts, but the posts were swiftly removed by Facebook and Instagram for breaching the platforms' policies.
"We don’t allow nudity on Instagram… it also includes some photos of female nipples," the company said at the time.
'Sexually explicit' Neptune
Just this month the social network faced a backlash for blocking a picture of a 16th-century statue of Neptune, which it deemed "sexually explicit".
Bolognian writer Elisa Barbari posted a picture of the sculpture, which stands in the city's Piazza del Nettuno, to feature on her Facebook Page 'Stories, curiosities and views of Bologna'.
In a standard message Facebook told Barbari that the photo of the nude Renaissance art was "explicitly sexual", but it soon reversed the censorship, describing it as "an error".
Swedish Cancer Society 'Breast School'
Facebook was forced to apologise last October after it accidentally removed a video campaign from the Swedish Cancer Society about breast cancer awareness.
Unperturbed by the initial removal, the group came up with a clever way to avoid Facebook's censorship: turning breasts square (top picture).
Facebook originally banned the ad, which depicted an animated walk-through of how to self-examine breasts, because it believed it to be marketing "sex products or services or adult products or services"; however, it was reinstated after the campaign gained traction online.