Why fake news is a bigger problem for Google than Facebook
Since the fake news scandal emerged, Google and Facebook have barred fake news sites from using their ad networks, and both Facebook and its CEO Mark Zuckerberg have detailed plans for combating fake stories. Google, however, has not been quite as forthcoming.
Google isn't saying much about it, but fake news is likely a big problem it hasn't fully figured out yet.
This could be in part because Facebook took the majority of the heat for fake news initially and, as The Wall Street Journal reported, Google avoided blame in part because its algorithm favors quality sites that include links from other established sites – making the dissemination of fake news less likely.
It could also be because, in the end, Facebook and Google are apples and oranges.
In fact, according to Brian Ussery, director of SEO at interactive agency SapientRazorfish, Google is more concerned with the quality of its organic results than anything else – perhaps even the almighty dollar…to a point.
“Internally at Google, paid and organic are siloed and organic is the most important because no one would click on paid results if they didn’t come for the quality of the organic results,” Ussery said.
In other words, it’s possible Facebook is less concerned with the quality of the content users see, so long as they stay on the platform, click on ads and keep the money coming in.
In a post on the topic, however, Zuckerberg said “more than 99% of what people see” on Facebook is authentic. But that claim is difficult to verify.
What’s more, it’s worth noting Facebook is in all likelihood making fistfuls of cash off of fake news – although it is not clear how much. A story in Forbes cited reports that say fake news could account for more than half of the platform’s ad revenue – which, based on its third quarter revenue, was billions of dollars. The Forbes contributor, Peter Cohan, however, surmised it is “probably well below half.” Regardless, the odds are good that it’s still a pretty penny and a pessimist might argue it is no wonder Facebook has been so proactive in assuring the public the fake news problem is under control.
‘A rather big problem’
But, as time has passed, it has become clear fake news is a sizable problem for Google, too.
Google’s new-ish Top Stories feature – which prominently showcases three stories atop search results – has included fake news on controversial topics like aliens and the Loch Ness Monster. But, as Mediaite reported, a search for a less salacious topic, “Final Vote Count 2016”, surfaced as the top news story a post from the blog 70News that falsely claimed Donald Trump won the popular vote. (This also appeared prior to Google’s Top Stories update.)
For its part, Google did not respond to questions about which topics generate Top Stories or what criteria go into determining a so-called Top Story for a given topic.
According to Larry Kim, chief technology officer at online advertising firm WordStream, however, Google has been weighting user engagement metrics more heavily in its organic search rankings, rather than relying solely on signals from authoritative sites – and this means fake news in Top Stories is a genuine problem.
“[User engagement metrics include] stuff like click-through rates, dwell time, etc. Basically [Google is] trying to figure out if the content is popular and useful or not,” Kim said. “As a result, they’re giving more weight to ‘popularity’ over ‘authority,’ which has been the historical primary ranking factor [like links from Harvard University, etc.].”
Kim said Google has done this to improve search result quality because the older system was vulnerable to link spam – and user engagement can “see through” content that is artificially inflated.
“However, in the case of fake news, this seems to be a rather big problem because the people who are searching for this type of content prefer clicking on and engaging with the content that matches their pre-existing biases,” Kim said. “This high engagement rewards the fake news with even more visibility, including in suggested searches and in trending news topics. I think it’s a bigger issue than Google would like to admit it is.”
Google’s senior manager of global communications and public affairs, Andrea Faville, said the platform “[did not] have much to say on this beyond what we've already said.”
In a previous statement, Google said it wants to provide users with “high-quality and authoritative results for their search queries” including “a breadth of content from a variety of sources.” The statement also noted the search engine is “committed to the principle of a free and open web” and that “understanding which pages on the web best answer a query is a challenging problem and we don’t always get it right.”
In fact, Peter Meyers, marketing scientist at SEO firm Moz, said he suspects Google’s handling of the “did the Holocaust happen” query involved some manual intervention – and could have been done to avoid PR problems.
“Google hates manually intervening on a query-by-query level and would much rather automate the solution – and bake it into the algorithm,” he said. “Obviously, though, the latter is much, much more difficult in this case.”
Meanwhile, Google faces seemingly infinite queries on par with “did the Holocaust happen,” which raises the question of what platforms should do when users actively seek out what could be considered objectionable content. In other words: Are they obligated to simply give users what they want – or to take a stand and intervene?
According to Meyers, this is another example of the difference between Facebook and Google.
“I think the question about users wanting to find certain content is a good, and difficult one, especially for Google. I think Facebook can say, ‘Hey, if you friend people who believe what you do, and they share certain articles, who are we to intervene?’ and have a leg to stand on,” Meyers said. “Google will have a tougher time arguing that you should be allowed to find bad information, if that's what you want, because the algorithm applies to everyone.”
Meyers also noted most of this is speculation right now: “I don't think Google has an answer, and they've made it very clear that they don't want to talk about fake news publicly, if they can avoid it."
‘A bit of a dodge’
That being said, it’s not completely unfamiliar territory for the search engine. Look at searches for medical problems, for example, Ussery said. These queries yield tons of bogus information, which prompted Google to team up with Harvard Medical School and the Mayo Clinic to fact check some of the most common medical queries in order to provide verified content in the Knowledge Graph.
In a similar vein, Facebook’s fake news plan includes third party verification with fact-checking organizations, as well as input and guidance from journalists “and others in the news industry.”
And, again, it’s unclear precisely how Google plans to deal with fake news, but Meyers said the recent change to Top Stories could be related, even though Google is being “very quiet” and is saying – mostly off the record – that Top Stories was not part of this effort.
“Personally, I think Top Stories lets them continue to use broad sources, which launched with In the News, while pulling back on the idea that it's all ‘news’ in a traditional sense,” Meyers said. “I think the rename is a bit of a dodge, honestly – a way of saying, ‘Hey, we're not claiming these are all news sources.’ Some of that's legitimate – they want to provide stories for a wide range of topics that traditional news sources don't cover. Part of it, though, is probably [cover your ass] on their part.”
This means Google will likely have to make some hard choices ahead as news sources become more subjective and potentially less trustworthy, Meyers added.
And this is where we start to get into some heady philosophical territory about what news is in 2017.
One industry executive who asked not to be named, citing potential objections from the legal department at his agency, said most Americans simply don't want news – which is a bigger issue we have yet to collectively tackle.
“Too many people just want talking points. They don't want to think about things or create their own argument or use logic. They just want retorts they can use,” he said. “I think the answer is education. But I'm not sure how we go about that.”
This is what Cohan called confirmation bias and what Kim referred to when he spoke of users who prefer “clicking on and engaging with the content that matches their pre-existing biases.”
As one fake news writer told the Washington Post:
Nobody fact-checks anything anymore — I mean, that’s how Trump got elected. He just said whatever he wanted, and people believed everything, and when the things he said turned out not to be true, people didn’t care because they’d already accepted it. It’s real scary. I’ve never seen anything like it.
‘All news becomes fake’
Trump does indeed have a unique relationship with the media – and social media.
In a story in the Columbia Journalism Review, Nic Dawes, the former editor in chief of South Africa’s Mail & Guardian and chief content officer at India’s Hindustan Times, wrote about his advice for US journalists in an environment in which Trump calls out his media enemies on Twitter and “[lets] the trolls do the rest.”
“The basic idea is simple: to delegitimize accountability journalism by framing it as partisan,” Dawes wrote. “Why should anyone care about your investigation of the president’s conflicts of interest, or his tax bills, if they emanate from the political opposition? The scariest thing about ‘fake news’ is that all news becomes fake. Yours too.”
Arbiters of truth
Indeed, the very concept of what the truth is seems to be up for grabs now.
Or, as Zuckerberg put it, “Identifying the ‘truth’ is complicated.”
And he’s right. The truly outrageous stories are easy to classify, but there are also those that get “the basic idea right but some details [are] wrong or omitted,” Zuckerberg wrote. “An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual,” he added.
And the industry executive who asked not to be named pointed to perceived biases among consumers about media outlets themselves as another factor that muddies the waters about what is accurate and trustworthy.
“One day machine learning will be able to fact check and see if it's properly cited or not, but we aren't there yet,” he said.
This, in turn, raises the question of who gets to make the call about what is true – and what is news – in the meantime.
Over and over again, Zuckerberg – and other Facebook representatives – have said the platform does not want to be an “arbiter of truth.”
And Meyers said he suspects Google is also hesitant to step into this role as “there is a lot of PR and probably regulatory concerns to going that route.”
So what happens now?
At the end of the day, Ussery said the determination of what fake news is may come down to public opinion. But, again, what is fake and what is genuine is different to different people, which he noted puts Google in a difficult position.
Ussery said the problem may also go back to consumers who don’t realize journalists sometimes have ulterior motives – like getting page views.
“It used to be that [journalists] wrote unbiased content that was fair and evenly balanced and it seems like some people may be pushing their own views or trying to achieve their own goals [now],” Ussery said. “It could be political, it could be to get [readers] to click on ads or to get more traffic or more tweets.”
And so the onus may fall on journalists to return to fair and balanced reporting, he said. But it may also mean lawmakers will have to get involved.
In the meantime, Dawes said US journalists facing a (potentially) fake-news-fueled Trump presidency will have to get used to having less access, spending more time in court, being stigmatized as the opposition – and they will have to get organized.
But the good news – if any – is that the media industry is doing just that. While it may be too late for the US – at least for the next four years – elections are looming in France, Germany and the Czech Republic, and media organizations from around the world have banded together to form the First Draft Partner Network to improve the verification of online news and to expose the perpetrators of fake news.
In a column in The Drum, former Independent media editor Ian Burrell wrote that this is “the news industry’s chance to show that it is committed to public service and education rather than mere entertainment and titillation.”
Time will tell if the news pendulum will swing back – and how consumers and platforms alike will respond – or if the world has forever changed.