
YouTube plans to have 10,000 moderators battling deluge of awful videos

YouTube is seriously beefing up its moderation team.


Christina Bonnington


In an effort to battle the ever-growing issue of extremist content on its site, YouTube plans to grow its moderation team to 10,000 people in 2018. Algorithms can only go so far, YouTube CEO Susan Wojcicki said.


“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content,” Wojcicki wrote in a blog post detailing the announcement.

YouTube’s moderators have manually examined more than 2 million videos since June, and the company is looking to expand this human team even further in the coming year to identify and remove content that violates its guidelines more quickly and efficiently. It also aims to be more transparent about how it handles such “problematic content.”

Right now, YouTube uses machine learning to initially flag videos for review by moderators. Since deploying this method in June, YouTube has removed more than 150,000 violent extremist videos. Its machine learning algorithms were able to spot 98 percent of such videos on the site, helping moderators remove five times more videos than they previously could. Staffers can also take down content faster than before: Half of the offending videos are removed within two hours of upload, and 70 percent within eight hours.


So far, YouTube’s machine learning technology has been fine-tuned specifically to identify violent extremist videos. However, the company is working to customize its algorithms for other areas, such as hate speech and child safety. The latter has proved a serious issue for the video platform.

Beginning in 2018, YouTube will regularly publish reports on how it’s enforcing its community guidelines. These reports will detail the types of flags YouTube receives, as well as the actions the company takes to remove inappropriate comments and video uploads.

After discovering that ads had been placed against more than 2 million inappropriate videos, YouTube also says it will take a new approach to advertising. The company plans to expand its team of ad reviewers to perform more manual curation and pair ads with videos more appropriately. This should benefit advertisers, who shouldn’t find their products paired with unsavory content, as well as creators, who aim to make money from their uploads. It should benefit viewers, too, who should see more appropriate advertising on the videos they watch and less unsavory content on the site.

H/T The Hill
