Facebook has conceded that it must do more to combat the proliferation of fake news on its network, acknowledging that ‘there’s so much more we need to do’ to deal with the problem.
It follows widespread criticism of the social media site for serving as a source of misinformation during the US election campaign, which some view as a contributing factor in Donald Trump’s surprise election victory.
In a statement given to TechCrunch, Facebook’s vice president of product management, Adam Mosseri, wrote: “We take misinformation on Facebook very seriously. We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation. In News Feed, we use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution.
“In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing. Despite these efforts we understand there’s so much more we need to do, and that is why it’s important that we keep improving our ability to detect misinformation. We’re committed to continuing to work on this issue and improve the experiences on our platform.”
The problem has grown worse since Facebook replaced human editors with algorithms to select trending topics, leading to one egregious case in which an article falsely claiming that Trump nemesis Megyn Kelly secretly supported Hillary Clinton appeared in its trending topics.