The Fake News Network: a look at Facebook’s role in a crisis of trust, democracy and security

By Jessica Goodfellow, Media Reporter

July 14, 2017

Fake news is not only perceived as a threat to the future of journalism, democracy and freedom of speech; it also poses a security risk, given the rise of falsehoods aimed at stoking the fires of extremism. And while Facebook’s role in disseminating fake content grows ever more complex as its identity becomes muddied, one thing is clear: it’s dangerous.

Facebook's role in the fake news crisis - and what needs to be done to fix it

Such were the conclusions drawn by the New York Times and Channel 4 at an event hosted by the Westminster Media Forum, which saw media organisations, policy experts, lawyers and regulators clash over viable solutions to fake news – the learnings from which will be shared with MPs and could have an impact on public policy.

Fake news is seen as a symptom of something much bigger and broader at play, involving the decline of trust in mainstream media, the polarisation of politics in the UK and the US and the consequent echo chamber effect on social media – where people are increasingly looking for views which reinforce their own, without regard to the veracity of the source.

To understand what fake news is, it’s important to define what it isn’t. The term has been weaponised by President Donald Trump as a pejorative label to undermine the legitimacy of the established news media.

The New York Times, Trump’s local newspaper, is one of the established media organisations under attack. Steve Erlanger, its London bureau chief, says: “We are in the middle of a fight with Donald Trump; it’s not a fight we wanted. He uses us as props in his play. We are being set up by Donald Trump to appeal to his base. He lives on partisanship, he identifies everything coming from the mainstream media as fake news, and when we produce real news he wants to undermine the credibility of what we produce.”

Using the term fake news in this way puts “the civic role of journalism in danger”, says Jonathan Heawood, chief executive of Impress, and muddies the waters around what fake news really is. Fake news – or, as Facebook calls it, false news – is a fabricated story invented to mislead for political purposes or to make money from advertising. It is not something you merely dispute or disagree with, nor is it satire or parody.

The most popular form of fake news is not that which seeks to influence political votes but that which makes money from advertising. Thanks to the nature of social media, which until recently favoured virality over truth, coupled with a ‘virtually untraceable’ ad tech value chain, teenagers in Macedonia found a way to “print money”, as Dr Laura Dornheim, head of public affairs at Adblock Plus, puts it.

‘One person’s freedom of speech is another person’s hate speech’

The question puzzling experts is: who is responsible? The people who create it, clearly, but they get no traction without social media. Yet social media, chiefly Facebook, has managed to escape blame by claiming it is a platform, not a publisher, and is therefore not responsible – or liable – for what is published on it. There are laws in the US that give blanket protection to these ‘pipes’ in order to protect a free internet – the power of which can be seen perhaps most insidiously in the ongoing case against Backpage.com over its ties to child sex trafficking.

Critics argue that as Facebook pushes further into original content commissioning, it can no longer claim to be simply a platform. Julian Coles, the digital media policy consultant who drafted the BBC’s online guidelines and policy for social media, suggests that Facebook is in practice “more like a hybrid” – part platform, part publisher.

Facebook’s director of media partnerships for EMEA, Patrick Walker, believes that the company is a “very different type of social media company”, and that “old media terms” cannot be applied to what it does.

“We need terminology that applies to a very different technological world,” he says. Walker also believes that the problems of fake news and the safety of journalism exist not because of social media but because the internet has advanced faster than humans can regulate it: “Even without Facebook’s existence we would still be facing these problems today.”

“We have a responsibility, of course. We don’t allow anyone to travel the road. We have clear policies; we have to improve them in terms of their application. But that is way beyond a Facebook challenge,” he adds.

Accepting that it is a media company that should be regulated in the same way as the press and broadcasters brings with it many challenges. Facebook founder Mark Zuckerberg wants to protect freedom of speech and information on the platform, both of which are put at risk when the company acts as an editor. The platform has previously come under fire for its murky censorship decisions, including the censoring of an iconic Vietnam war photo, a picture of a 16th-century statue of Neptune, and a photograph of topless Aboriginal women wearing ceremonial paint as part of a protest, leading to accusations of ‘racially biased censorship’.

Walker explains: “Human intervention, erring on the side of freedom of speech, is a challenge. One person’s freedom of speech is another person’s hate speech.”

Matt Tee, chief executive of IPSO, the regulator representing the majority of the UK press, believes that irrespective of how the platforms define themselves, they need to take greater responsibility given that they are regarded as publishers by their users and the industry.

The New York Times’ Erlanger believes the platforms’ evasion of responsibility represents a much more alarming threat: “For social media to pretend that it is simply the road on which anyone can travel is very dangerous, more dangerous than anything which is on the road.”

The solution

In any case, there is no silver bullet for the fake news problem. Facebook has introduced functionality to tag unverified content, but many are sceptical that the kind of users who share fake news across the internet – chiefly to reaffirm their own political views – will care about the tag.

IPSO is making its own moves in this area, revealing that it is developing a kitemark symbol for publishers that comply with a set of standards to make it easier for users to identify news they can trust.

Erlanger explains: “The problem with social media is that it appeals to people who find the complexity of the world too confusing, who want simple explanations for everything.”

Instead, policy consultant Coles suggests an algorithm change, where platforms have a quality rating and the quality stories rise to the top of the feed. IPSO’s Tee suggests Facebook should carry the branding of the publishers that first posted the content: “Newsbrands still inspire trust and that is important in the way they are presented on the internet.”
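
Coles’s proposal was only sketched in outline at the forum, but the idea can be illustrated with a purely hypothetical example: blend a publisher ‘quality rating’ (of the kind a kitemark-style scheme might supply) with an engagement signal, so that higher-quality stories rise to the top of a feed. The publisher names, ratings and weighting below are invented for illustration and bear no relation to Facebook’s actual news feed code.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    publisher: str
    engagement: float  # normalised engagement signal, 0..1

# Hypothetical quality ratings (0..1), e.g. supplied by a kitemark-style scheme.
QUALITY_RATINGS = {
    "Established Newspaper": 0.9,
    "Unverified Blog": 0.2,
}

def rank_feed(stories, quality_weight=0.7):
    """Order stories by a blend of publisher quality and engagement."""
    def score(story):
        # Unknown publishers get a neutral rating rather than a penalty.
        quality = QUALITY_RATINGS.get(story.publisher, 0.5)
        return quality_weight * quality + (1 - quality_weight) * story.engagement
    return sorted(stories, key=score, reverse=True)

feed = [
    Story("Shocking claim goes viral", "Unverified Blog", engagement=0.95),
    Story("Budget analysis", "Established Newspaper", engagement=0.40),
]

for story in rank_feed(feed):
    print(f"{story.publisher}: {story.title}")
```

With the weighting above, the highly shared story from the low-rated source drops below the less viral story from the trusted publisher – the ‘quality rises to the top’ effect Coles describes.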

Regulation of content is chiefly seen as a last resort; regulation is more readily favoured for the digital advertising market, which has grown into a duopoly and become increasingly anti-competitive. Getting the Competition and Markets Authority to review the digital ad market could help support a more equal advertising market and, in turn, the journalism that relies on ad-funded models, suggests Matt Rogerson, head of public policy at Guardian Media Group.

But regulation of content is much more complex, given that media regulation is not comparable across countries and governments.

“A global answer is overstepping what is possible now,” says Tee.

The key lies in better educating internet users in media literacy. Coles believes the rapid rise of fake news provides clear evidence that there is a need for more critical awareness online. In April, Facebook introduced an educational list on its platform that featured bullet points on how to spot fake news, but this was live for only a few days.

Nonetheless, Facebook admits that its core value of “just connecting people” isn’t good enough anymore, but Walker is confident that technology has solved similar issues before and can do so again.

“Spam is a good example of what technology can do to get rid of things. Technology has found ways to track down the bad actors; you can see that more and more in the news feed,” he says.

“The most important thing is about education, collaboration, finding ways to support journalism. Our monetisation issues are improving; they haven’t been as good as they could have been. Together with the industry we can solve the problems.”
