Conspiracy theory videos on the site will now include text from Wikipedia pages that users can click on to learn more about the topic in question. For instance, someone watching a video about chemtrails would see a “companion unit” featuring Wikipedia’s “Chemtrail conspiracy theory” page.
According to Wojcicki, the feature is set to roll out in the coming weeks. While she did not say exactly how YouTube plans to determine what counts as a conspiracy theory video, she did say that the site will be “using a list of well-known conspiracies from Wikipedia” to help it decide which videos should receive the additional information.
“When there are videos that are focused around something that’s a conspiracy, we will show as a companion unit next to the video information from Wikipedia,” said Wojcicki. “People can still watch the videos, but they actually have access to additional information.”
The move comes weeks after YouTube was criticized for letting a conspiracy theory video about last month’s mass shooting in Parkland, Florida, take the top spot in its “Trending” section. The video accused David Hogg, a survivor of the shooting who has since spoken out for gun control, of being a “crisis actor.” YouTube eventually pulled the video for violating its policies.
Over the past year, YouTube has struggled to keep its platform free of extremist and offensive content. About a year ago, brands including Verizon and PepsiCo pulled advertising from the platform due to concerns that their ads were appearing next to videos that promote terrorist groups.