Wikimedia responded that it had not been made aware of yesterday's SXSW announcement by YouTube chief executive Susan Wojcicki that the video platform would add Wikipedia content to help curb the spread of conspiracy-theory videos.
In a tweet to its followers, the company said, "We are happy to see people, companies, and organizations recognize Wikimedia's value as a repository of free knowledge. In this case, neither Wikipedia nor the Wikimedia Foundation are part of a formal partnership with YouTube. We were not given advance notice of this announcement."
— Wikimedia (@Wikimedia) March 14, 2018
These new "information cues" that Wojcicki spoke of will pop up during videos played on YouTube; users can click on these cues to find more information on the topic in question. She used examples of the moon landing and chemtrails during her SXSW speech.
“When there are videos that are focused around something that’s a conspiracy, we will show as a companion unit next to the video information from Wikipedia,” Wojcicki said during her speech. “People can still watch the videos, but they actually have access to additional information.”
Google already draws on Wikipedia when users search for individuals or entities, at times pulling summary information from the openly edited encyclopedia into a separate box on its search results pages. Although Wikimedia did not directly reference YouTube in the rest of its statement, according to The Verge, the organization retweeted remarks from long-time Wikipedia contributor Phoebe Ayers, who raised concerns about the world's second-largest search engine treating the site as an academic fact-checking resource.
"I don't think YouTube can rely on our irregularly updated *encyclopedia* to solve its ranking algorithm/hate speech issue," she tweeted, "people don't read ref[erence]s."
1) ok, seems fine. Hope they credit us appropriately
2) I don't think YouTube can rely on our irregularly updated *encyclopedia* to solve its ranking algorithm/hate speech issue; ppl don't read refs.
3) web 2.0 is wierd, man.
/goes back to working on Wikipedia articles https://t.co/f9G1JZLTu2
— Phoebe Ayers (@phoebe_ayers) March 14, 2018
Ayers also questioned the potential short- and long-term impact of YouTube, an ad-supported platform, mining the resource as "free labor" for its own brand safety.
"YouTube should probably run some A/B tests with the crew at @WikiResearch first," she said. "Does linking result in increased traffic? Increased vandalism? It's not polite to treat Wikipedia like an endlessly renewable resource with infinite free labor; what's the impact?"
Back in December, YouTube announced a four-step plan to combat the brand-safety issues that have plagued the video site, spurred by Verizon's and PepsiCo's decisions to pull ads from the platform for fear those ads would appear next to terrorism-related videos. Wojcicki did not give details on how the video site would determine what counts as "conspiracy," but she did say the site will be “using a list of well-known conspiracies from Wikipedia” to help decide which videos should receive the additional information.