YouTube CEO addresses conspiracy theory problem

Susan Wojcicki struggled to address core issues.

Audra Schroeder

YouTube CEO Susan Wojcicki talked about how the company is pushing back against conspiracy theory videos and radicalized content at SXSW on Tuesday.

Wired Editor-in-Chief Nicholas Thompson interviewed Wojcicki and pressed her on several issues that have plagued YouTube over the last year. Regarding misinformation on the platform, she talked about how the company changed the homepage in early 2017 to add breaking news in an effort to give viewers more accurate and authoritative content, as well as adding the Top News shelf. 

What does “authoritative” news mean? Wojcicki didn’t detail how the company’s algorithms make that determination, but she said “journalistic awards” and an outlet’s traffic are sometimes taken into consideration. However, she also stated, “We’re not a news organization,” which only muddied the previous statement.

The big announcement was the upcoming rollout of “information cues” to combat the epidemic of conspiracy videos on YouTube, which became an issue once again after February’s Parkland shooting. Additional information from alternative sources will appear alongside videos that might be conspiracy-related. Where will that information come from? Apparently one source is Wikipedia, a site anyone can edit. Wojcicki used the moon landing as an example of an event questioned by conspiracy theorists: hypothetically, a link to historical context will debunk the misinformation. But what about conspiracy videos that YouTube’s own algorithm recommends?

Regarding YouTube’s hiring of 10,000 humans to review content, Wojcicki said that last year, for the first time, the company used machine learning to remove extremist content. However, she cautioned, “We still need humans,” and that continues to be true. She said YouTube will start limiting part-time moderators to four hours a day of viewing disturbing content, which still seems like a lot.

The radicalization of YouTube is another big issue, and Thompson referenced a recent New York Times op-ed that posited that the company might be benefitting from serving extreme content: “What we are witnessing is the computational exploitation of a natural human desire: to look ‘behind the curtain,’ to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”

Wojcicki said the company always wants to do the right thing for the user, but again stayed vague when it came to solutions. Early in the interview, she said she considers YouTube more of a library, something educational, which sounds almost quaint; the platform has long since outgrown that analogy.

The Daily Dot