
YouTube’s algorithm is spreading a series of unfortunate far-right events


By Samuel Scott, The Promotion Fix columnist

May 14, 2019 | 19 min read

So I was walking with a friend to see Avengers: Endgame a week before Holocaust Remembrance Day here in Israel. On the way, he told me something completely out of the blue: ‘The Nazi Party was liberal!’ He said he learned it online.


My friend had always been conservative. But one day, he decided to become more ‘spiritual’. He started to watch ‘documentaries’ on politics and ‘mystical’ Judaism. On the positive side, he stopped smoking and exercised more. But then he started to talk about esoteric mumbo jumbo and drifted increasingly rightward, to the point where liberals had evidently caused the Second World War.

That is not being ‘conservative’ – it is becoming brainwashed.

It is sad that – in 2019 – I need to remind anyone that the Nazis were far-right, genocidal ultra-nationalists. Further, Adolf Hitler’s main European ally, Benito Mussolini, led the National Fascist Party in Italy. Their opponents? Centrist democracies and the communist Soviet Union. Seriously – did schools stop teaching this at some point?

Human stupidity, reinforced by YouTube’s algorithm, may kill us all, because the company’s recommendation engine is helping to spread a series of unfortunate far-right events.

A dangerous YouTube query on Nazis

Say you’re a high-school student who legitimately wants to learn about Nazism. Like many other young people today, you may turn to YouTube first. Well, I searched for ‘Nazism’ and saw this.

[Screenshot: YouTube search results for ‘Nazism’]

The first result was a video from Yad Vashem, Israel's official memorial to the victims of the Holocaust. The third was from the History Channel in the United States. Both are perfectly credible.

But the second result – the second! – was a far-right, libertarian lecture describing how Nazism was ‘socialist’. If you search YouTube specifically for that subject, you will see other examples. (For more on the topic, see the Data & Society Research Institute’s Reactionary Right on YouTube PDF report.)

The message in such videos is spreading. Two US Republican congressmen recently described Hitler as a ‘socialist’. Far-right American pundits Glenn Beck and Dennis Prager make similar points. Australia’s Sky News has been broadcasting such opinions. Brazil’s new right-wing president, Jair Bolsonaro, said ‘there is no doubt’ that Nazism was a leftist movement.

Any student of actual history can tell you what happens when the far right feels emboldened. Neo-Nazis march through a German town as a historian warns of the ‘siren call’ of a Fourth Reich. Holocaust survivors say that history may be repeating. A UN official describes the climate in Europe as similar to the 1930s.

Following a right-wing terrorist’s massacre of Muslims in New Zealand two months ago, YouTube – among other online platforms – was unable to stop the immediate spread of sick videos showing the carnage.

Measles is also spreading

[Screenshot: anti-vaccination videos on YouTube]

It’s not just far-right propaganda. Measles is also spreading around the world due to anti-vaccination misinformation online.

Measles was declared ‘eliminated’ in the US in 2000. Yet so far this year alone, the US Centers for Disease Control and Prevention has confirmed 704 cases in 22 states. The World Health Organization also reported that measles cases in the first quarter of this year were up 300% over the same period in 2018.

And what is to blame? In part, videos on YouTube such as those pictured above. The world needs a vaccine against them. US states are looking into revoking the right of parents to refuse vaccines for their children on religious grounds. Germany’s health minister has proposed a $3,000 fine for such people.

Flat-earthers are the epitome of gullibility

[Screenshot: flat-earth videos on YouTube]

Far-right propaganda is dangerous. Measles is deadly. But flat-Earth theories are just stupid. Still, a YouGov poll in the US last year found that a significant number of younger people have doubts about something the world has known since the ancient Greeks.

66% of people aged 18-24 said they ‘have always believed the world is round’. In other words, 34% of them believe the earth is flat, have doubts or are unsure – and with roughly 30 million Americans in that age bracket, that amounts to more than 10 million people. Archimedes wept. Just look at the Google Trends data showing rising searches for the topic.

Asheley Landrum is an assistant professor of science communication at Texas Tech University who has led the institution’s research into the subject. She has presented her findings at an annual meeting of the American Association for the Advancement of Science and visited flat-earth conferences to interview attendees.

“There’s a lot of helpful information on YouTube but also a lot of misinformation,” Landrum recently told The Guardian. “Their algorithms make it easy to end up going down the rabbit hole by presenting information to people who are going to be more susceptible to it. Believing the Earth is flat in of itself is not necessarily harmful, but it comes packaged with a distrust in institutions and authority more generally.”

In other words, people who are gullible enough to believe that the earth is flat are also more likely to live in a filter bubble where vaccines cause autism and Nazism is a liberal ideology. If a person is brainwashed enough to believe any one of these things, he is more likely to believe the others.

YouTube’s algorithm is a problem

Belief in a flat earth had reportedly all but died out until platforms such as YouTube arrived. Their algorithms have been a big part of the problem.

“People think it’s suggesting the most relevant, this thing that’s very specialised for you. That’s not the case,” Guillaume Chaslot, a former YouTube engineer who was on the team that created the platform’s algorithm in 2010, recently told The Daily Beast. He said the algorithm “optimizes for watch-time,” not for relevance. “The goal of the algorithm is really to keep you online the longest.”

“I realized really fast that YouTube’s recommendation was putting people into filter bubbles. There was no way out. If a person was into Flat Earth conspiracies, it was bad for watch-time to recommend anti-Flat Earth videos, so it won’t even recommend them.”
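
To make Chaslot’s description concrete, here is a minimal, hypothetical sketch in Python of a recommender that ranks on nothing but predicted watch time. Everything in it – the `Video` class, the `predicted_watch_time` model and the sample catalogue – is invented for illustration; it is not YouTube’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    challenges_viewer: bool  # True for debunking/corrective videos

# Hypothetical model: minutes this believer is predicted to keep watching.
def predicted_watch_time(video: Video) -> float:
    # A viewer deep in a conspiracy bubble tends to click away from
    # videos that contradict them, so the model scores those lower.
    return 1.5 if video.challenges_viewer else 12.0

def recommend(catalog: list[Video]) -> list[Video]:
    # Watch time is the only objective: accuracy, balance and relevance
    # never enter the ranking, so corrective videos sink to the bottom.
    return sorted(catalog, key=predicted_watch_time, reverse=True)

catalog = [
    Video("Why the Earth is round", challenges_viewer=True),
    Video("FLAT EARTH PROOF #27", challenges_viewer=False),
]
for video in recommend(catalog):
    print(video.title)  # the conspiracy video always ranks first
```

Under such an objective, recommending the debunking video is simply a losing move – exactly the filter bubble Chaslot describes.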

At South by Southwest last year, YouTube chief executive Susan Wojcicki said: “We’re really more like a library in many ways because of the sheer amount of video that we have”. But libraries do not use algorithms that suggest increasingly extremist books to readers for the library’s own benefit.

When I was a child, my mother banned me from watching a few cable TV channels out of the 35 in our package. But what can parents do when kids are a few clicks away from billions of videos on who-knows-what? Just read what happened when a mother’s teenage son fell into the alt-right after, in part, watching YouTube videos.

After all, a team of BuzzFeed News reporters found that it took only nine clicks through YouTube’s recommendations to go from a bland American public television clip about the US Congress to an anti-immigrant video produced by a hate group.

How YouTube’s algorithm works

Zeynep Tufekci, an associate professor at the University of North Carolina’s School of Information and Library Science, spoke last week at a Columbia Journalism Review symposium on disinformation warfare. She also described elsewhere how YouTube’s algorithm works and added that the platform “may be one of the most powerful radicalising instruments of the 21st century”.

Say a person watches a random video on some topic. According to YouTube’s critics, the recommendation algorithm baits him to keep watching – and seeing more ads – by suggesting a slightly more provocative video on the same subject. And then more and more until someone interested in healthier eating has become a militant vegan and another researching immigration has decided all brown people are bad.
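
Here is a hypothetical sketch of that escalation loop, with an invented ‘provocativeness’ score standing in for whatever signals a real system might learn. None of these names or numbers comes from YouTube; they only illustrate the critics’ claim.

```python
from typing import Optional

# Hypothetical escalation: always recommend the same-topic video that is
# one step more provocative than the one just watched.
def next_recommendation(current: dict, catalog: list[dict]) -> Optional[dict]:
    hotter = [v for v in catalog
              if v["topic"] == current["topic"]
              and v["provocativeness"] > current["provocativeness"]]
    # The smallest step up keeps the viewer comfortable while
    # steadily ratcheting the content toward the extreme.
    return min(hotter, key=lambda v: v["provocativeness"], default=None)

catalog = [
    {"title": "Healthy eating tips", "topic": "diet", "provocativeness": 1},
    {"title": "Why sugar is poison", "topic": "diet", "provocativeness": 3},
    {"title": "Militant veganism manifesto", "topic": "diet", "provocativeness": 9},
]
video = catalog[0]
while video is not None:
    print(video["title"])
    video = next_recommendation(video, catalog)
```

Run it and the viewer walks, one comfortable step at a time, from cooking tips to the manifesto.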

Social network executives often state that algorithms are neutral, but algorithms are editorial decisions written into code. People decide to prioritize and promote certain media over others in line with desired business objectives. Then, they create algorithms to do just that. And that makes those humans responsible for what the algorithms do.
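
As a small illustration of that argument, consider a toy scoring function. The weights below are made up, but somebody has to choose them – and that choice is an editorial one.

```python
# The coefficients are editorial decisions written into code: a person
# chose to weight engagement far above source credibility, and the
# algorithm merely executes that choice at scale.
ENGAGEMENT_WEIGHT = 0.9   # chosen to serve a business objective
CREDIBILITY_WEIGHT = 0.1  # raising this would also be a human decision

def score(predicted_minutes_watched: float, source_credibility: float) -> float:
    return (ENGAGEMENT_WEIGHT * predicted_minutes_watched
            + CREDIBILITY_WEIGHT * source_credibility)
```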

YouTube is not legally liable – yet

From a strictly legal standpoint, YouTube is not responsible in the US for the videos that the platform spreads.

Section 230 of the Communications Decency Act (part of the 1996 US Telecommunications Act) states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider”.

“YouTube is not acting as an information content provider when its users upload their videos,” T. Barton Carter, a professor at Boston University’s College of Communication, said. “Similarly, a newspaper is not liable for comments posted on their sites by others. Even if an interactive computer service is aware of illegal material on its site, there is no liability or obligation to take it down.”

I believe that the law needs to change. YouTube reserves the right to remove videos and decides which types are recommended by the algorithm. YouTube also produces original entertainment programming and will stream live US baseball games. Those editorial decisions make YouTube in practice both an “interactive computer service” and a media company.

It is Section 230 that has allowed social media platforms to do whatever they want to maximize profits without any fear of consequences.

“I’ve made it my personal mission to educate people about Section 230 as no one seems to know about it, yet it is the source of so many of our problems related to bad online information,” Forrester Research vice president and principal analyst Sucharita Kodali said. “When aggregators of content like YouTube put out irresponsible drivel, it’s a problem – and they have no incentive for fixing it or filtering that content at the moment. Nothing will change until the law is changed.”

Kodali, for example, thinks that “massive class-action lawsuits” should be allowed to combat material such as online misinformation on measles.

“We need a law that allows tech companies to be fined substantially if they distribute dangerous content,” she said. “These companies have had the privilege of policing themselves, and they’ve abused it. The only way they will act more conservatively is if they are forced to abide by the same standards as other companies that publish content.”

YouTube responds

Late last month, Alphabet – the parent company of Google and YouTube – reported that year-on-year ad revenue growth had slowed from 24% to 15% and said YouTube was at least part of the reason.

YouTube says the company is making efforts to combat misinformation – but it is unclear whether those actions have had a direct effect on ad revenue. For this column, I contacted the company for comment and sent an extensive list of specific questions.

In response, Susan Cadrecha, YouTube’s communications manager for the UK, provided the following statement:

"Over the past two years, our primary focus has been tackling some of the platform’s toughest content challenges, taking into account feedback and concerns from users, creators, advertisers, experts and employees,” she said.

“We’ve taken a number of significant steps including updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content and reviewing and updating our policies – we made more than 30 policy updates in 2018 alone. And this is not the end: responsibility remains our number one priority.”

YouTube said that the company has been working to increase the prominence of videos by credible news sources in search and recommendation algorithms, provide information panels with links to authoritative resources on topics such as vaccines and reduce recommendations of material that could misinform users in harmful ways. YouTube also said it would demonetize anti-vaccination videos – meaning that the company would no longer let ads appear next to the material, thereby removing an incentive to make them.

According to YouTube, the company has also been continually working to improve the platform’s algorithm. One stated effort was to reduce clickbait by changing the focus from 'number of views' to 'watch time'. YouTube also denied that its first priority is to maximise the number of videos watched and advertisements seen, saying instead that responsibility and satisfaction are the company’s main focuses.
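
As a rough sketch of what such a metric change means in practice – with invented field names, not YouTube’s – compare ranking on clicks with ranking on time actually watched:

```python
# Hypothetical before-and-after of the ranking objective.
def rank_by_clicks(videos: list[dict]) -> list[dict]:
    # Rewards sensational thumbnails, even if viewers leave in seconds.
    return sorted(videos, key=lambda v: v["clicks"], reverse=True)

def rank_by_watch_time(videos: list[dict]) -> list[dict]:
    # Rewards videos that people actually sit through.
    return sorted(videos,
                  key=lambda v: v["clicks"] * v["avg_minutes_watched"],
                  reverse=True)
```

The switch does curb pure clickbait, though, as the critics quoted above argue, optimising for time watched carries its own side effects.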

As for Section 230 and whether YouTube is in practice a media company, a tech platform or both, YouTube pointed to Sky and Virgin Media, which are considered platforms that carry channels containing third-party material; under the EU’s Audiovisual Media Services (AVMS) rules, it is the channels that are regulated rather than the carrying platforms. YouTube said the same idea should apply to online platforms such as itself. YouTube also said that it is right for the company to be subject to different rules for its original programming but that this does not change what the platform is at its core.

‘Don’t be a sucker’

One of my favorite YouTube videos is this archived 1947 clip (start at 2:20) from the US War Department warning people against falling for fascist or racist propaganda against any minority group. Everyone should watch it.

If I worked for YouTube, I would tag this video to appear whenever someone searches for political extremism, immigration or any other query that may relate to potential bigotry. Someone could even create a new version starring Chris Evans as Captain America.

After all, it is impossible to snap our fingers and turn all extremist YouTube videos into dust or go back in time to rewrite Section 230. But it is only a matter of time before countries throughout the world crack down on all interactive computer services. I just hope it will not be too late for my friend.

The Promotion Fix is an exclusive biweekly column for The Drum contributed by global keynote marketing speaker Samuel Scott, a former newspaper editor and director of marketing in the high-tech industry. Follow him on Twitter. Scott is based out of Tel Aviv, Israel.
