
Facebook is hiring 3000 people to tackle graphic and violent Facebook Live videos

By Rebecca Stewart, Trends Editor

May 3, 2017 | 4 min read

Facebook has said it will invest heavily in manpower dedicated to preventing the spread of graphic and illegal content on Facebook Live as it looks to ease user and advertiser concern.

The boss announced the news via a Facebook post on Wednesday

Despite earlier reports that the tech behemoth wanted to solve the issue with artificial intelligence, it has announced it will add 3,000 people to its community operations team internationally.

Chief executive and founder Mark Zuckerberg said: "Over the last few weeks, we've seen people hurting themselves and others on Facebook – either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community."

"If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down."

Over the past few months Facebook has faced mounting pressure to impose stricter measures on its Live feature. A number of incidents were reported in April, including footage of a man killing his daughter and several real-time suicide streams.

Two days prior to Facebook's F8 developer conference, a Cleveland man posted a video of himself shooting and killing a 74-year-old man. Zuckerberg addressed the incident during his keynote speech, and the company admitted it was too slow to remove the footage, acknowledging that it needed to improve its processes.

Videos and updates which glorify violence are against Facebook's rules. However, in most instances, users have to report offending content to moderators before it is reviewed and removed.

Revealing that 3,000 people will join the team responsible for monitoring all content on the site, Zuckerberg said: "These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation. And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it – either because they're about to harm themselves, or because they're in danger from someone else."

In February, Zuckerberg revealed that the company was working on developing AI which could detect and read video content; however, he conceded it was still in “very early development”, which may explain why the company has turned to human intervention as a matter of urgency.

Facebook has long faced challenges when it comes to self-censorship and moderating content. In the past, the business has struggled to settle on how best to handle sensitive editorial content as well as provocative posts.

While there's not yet an ad unit for Facebook Live, the company has been serving mid-roll ads against premium video content from publishers since the start of the year. At the time, it said it was expanding the ability to test ad breaks in Facebook Live to US pages or profiles with 2,000 or more followers that have reached 300 or more concurrent viewers in a recent live video.

However, recent events mean advertisers are still likely to be wary of testing the waters when it comes to Live – particularly given YouTube's recent brand safety woes.
