
The Drum Review 2018: fake news (part one)

By Mike Nutley, Editor

November 22, 2018 | 9 min read

This time last year, “fake news” was announced as the Collins English Dictionary Word of the Year, with Collins’ head of language content Helen Newstead describing the term as “inescapable”.

Fake news timeline: The Drum Review 2018

Not much has changed in the year since then. “Fake News” - and fake news - continues to dominate the headlines, with two big stories ensuring the term has become a concern for everyone, not just people who work in the media.

In February, 13 Russians and three Russian entities, one of them the Internet Research Agency, were indicted in the US on charges of interfering with the 2016 US election. A month later, The Observer alleged that Cambridge Analytica, which worked on Donald Trump’s presidential campaign, had harvested data from 50m Facebook profiles without users’ consent and used it to build targeting profiles for political advertising. As a result, Mark Zuckerberg was questioned by Congress, while Cambridge Analytica’s CEO, Alexander Nix, was called before MPs in the UK. Cambridge Analytica was investigated by the Information Commissioner’s Office in the UK, and the company announced it was shutting down in May. In October the ICO fined Facebook £500,000 for serious breaches of data protection law.

The ICO has also been carrying out a broad review of the use of data analytics for political purposes, and its report was published on 6 November. One of its recommendations is that the government should issue a statutory code of practice around the use of data in political campaigning, and it is now consulting on the idea.

At the same time the Commons Committee for Digital, Culture, Media and Sport (DCMS) has been investigating disinformation. Its interim report, published in July, warned of a crisis for democracy and called for the Electoral Commission to be given greater powers, and for more regulation of social media firms’ behaviour during elections. Its final report is due in December.

Here’s a look back at some of the other key “Fake News” stories from this year. All of them are real.

January

Facebook founder Mark Zuckerberg announces plans to combat fake news by asking users to identify which news sources are trustworthy. “As part of our ongoing quality surveys, we will now ask people whether they're familiar with a news source and, if so, whether they trust that source,” he says in a blog post. Later in the month the survey is revealed to consist of two questions. Users will be asked whether they recognise a set of websites, then to what extent they trust each of them on a sliding scale from ‘entirely’ to ‘not at all’.

The UK Government announces plans for a “rapid response social media capability to deal quickly with disinformation and reclaim a fact-based public debate”. This will be led by a new team based in the Cabinet Office. The move is one of eight “professional challenges” set for the Government Communication Service in 2018 by its executive director, Alex Aiken.

Edelman’s Trust Barometer reports trust in traditional media rose 13% in 2017, to 61%, with trust in online-only media up 5% to 45%. But overall trust in the media as an institution fell from 26% to 24%. The PR firm carried out the survey across 28 markets. In the UK, only 32% of the public trust the ‘media in general’, a figure which drops to 23% among young people.

February

YouTube announces plans to label videos uploaded by news broadcasters that receive some level of government or public funding. It will also allow users and publishers to send their feedback through a ‘Send feedback’ form.

Facebook begins testing a “downvote” button in the US, allowing users to flag comments to the company that are inappropriate, uncivil, or misleading.

Unilever CMO Keith Weed uses the IAB Annual Leadership Meeting in the US to urge the digital media industry to clean up its act, saying the FMCG giant will no longer invest in platforms “that create division in society”. “2018 is the year when social media must win trust back,” he says. “Across the world, dramatic shifts are taking place in people’s trust, particularly in media. We are seeing a critical separation of how people trust social media and more ‘traditional’ media. In the US only less than a third of people now trust social media (30%), whilst almost two-thirds trust traditional media (58%).”

Researchers at the University of Cambridge design a game called Bad News to educate people about how fake news and conspiracy theories travel online. The researchers believe that exposing people to the tactics used by fake news producers can act as a “psychological vaccine” against bogus campaigns.

March

The Associated Press extends its collaboration with Facebook to “identify and debunk” false and misleading stories on the platform related to the US Mid-Term Elections in November at local, state and national levels.

An 11-year study by the Massachusetts Institute of Technology reveals that false news was 70% more likely to be retweeted than true stories; that true stories took six times as long to reach 1,500 people; and that while true stories were rarely shared by more than 1,000 people, fake news items could reach up to 100,000.

YouTube announces plans to use Wikipedia content to help curb the spread of conspiracy theory videos. Such videos will now include text from Wikipedia pages that users can click on to learn more about the topic in question.

Google teams up with US fact-checking project First Draft to build a ‘Disinfo Lab’ to identify fake news stories and remove them from the Google News Feed. It also announces a partnership with MediaWise – a US-based educational organisation which trains young consumers how to identify fake news.

April

The Indian Government says it will cancel the accreditation of any journalists found guilty of writing fake news.

The European Commission plans to crack down on social media companies that spread fake news, amid fears that next year’s European Parliamentary elections could be vulnerable to online “disinformation”. Sir Julian King, the European Commissioner for security, leads calls for a ‘clear game plan’ for social media companies’ behaviour during election periods, including a ‘more binding approach’ than self-regulation.

May

Online video advertising platform SpotX teams up with web content categorisation specialist Zvelo in a bid to improve brand safety. The partnership aims to remove sites that disseminate fabricated news and disinformation from the ad inventory SpotX offers.

A University of Buffalo study finds that fewer than 10% of Twitter users questioned the veracity of the tweets they were reading around incidents including the Boston Marathon bombing and Hurricane Sandy. Fewer than 10% took the time to delete tweets that were later proved to be false, and fewer than 20% actively shared the debunking with their own followers.

June

The Ogilvy Global Media Influence Survey reports a continuing drop in trust in traditional media outlets. The survey finds 50.4% of reporters feel that traditional media is the most trusted news platform, down from 72% in 2016. Owned media is seen as the second most trusted platform in EMEA at 34%, followed by social media with 10%.

Facebook announces updates to its fact-checking programme, including removing fake accounts, partnering with fact-checkers and promoting news literacy. It extends its programme to include fact-checking photos and videos, and to cover new countries.

Later in the month both Facebook and Twitter reveal plans to bolster their transparency tools ahead of the US Mid-Term Elections. Twitter launches a ‘political campaigning policy’ that will show billing information, ad spend, demographic data and impression data to everyone, regardless of whether they have an account. Facebook will enable users on Instagram, Facebook, Messenger or any of its partner platforms to view all ads as page posts, including creative and copy, and to see more page information, including recent name changes and page creation date.

July

WhatsApp launches a call for research proposals into the processing of problematic content, election-related information, network effects and virality, digital literacy and misinformation, and the detection of problematic behaviour within encrypted systems.

Facebook announces it has shut down 32 pages and accounts that it deemed “bad actors” spreading misinformation to users of the social platform and Instagram.
