
YouTube begins bouncing terror searches to anti-hate videos

The measure is the latest designed to assuage advertisers

YouTube has made good on its promise to crack down on hate speech online by beginning to redirect searches for terrorist propaganda toward anti-hate videos in a bid to halt the potential indoctrination of viewers.

The move will blacklist a set of unspecified terror-related keywords, redirecting users who search for them to a curated playlist of videos designed to ‘debunk violent extremist recruiting narratives’.

The video streaming giant will also develop video content of its own designed to counter extremist narratives at every stage of indoctrination.

YouTube is moving to restore trust in the wake of an advertiser boycott in which some brands froze spend with the platform amid fears they might inadvertently become associated with videos espousing violence, extremism and hate speech.

The new tool is the result of research by Alphabet-owned think tank Jigsaw, which collaborated with anti-extremism organisation Moonshot in order to create the Redirect Method – a means of steering potential terror recruits away from their online recruiters.

Over the coming weeks this technique will be rolled out internationally by incorporating a broad range of non-English search queries – each of which will be dynamically updated by machine learning algorithms in response to real-world data.

Announcing its long-awaited move, YouTube said in a blog post: “We hope our work together will also help open and broaden a dialogue about other work that can be done to counter radicalization of potential recruits.”

The measure is the latest designed to assuage advertisers that YouTube’s house is now in order after it clarified last month that material deemed to be ‘hateful’, already ostensibly banned, will be ineligible to benefit from advertising revenue.

Google has already revealed a raft of measures designed to crack down on extremist content, including an expansion of its trusted flagger programme and tighter oversight of borderline videos that test the bounds of its policies.
