
YouTube is quick to remove Islamic extremist content—but not Nazi videos

Perhaps only human flaggers can help.


Josh Katzowitz


While YouTube remains a haven for white supremacy groups, an investigation by Motherboard shows the streaming site has been more effective at deleting Islamic extremist videos.


According to Motherboard, YouTube videos made by Nazi groups have been left on the platform for months and, in some cases, years. But YouTube was much quicker to delete the content of Islamic extremists; often, those videos were taken down within hours of being uploaded.

Late last month, YouTube said it wouldn’t censor white nationalist channels like Atomwaffen, a group that has been implicated in five murders in the last 10 months, or the Traditionalist Worker Party. That stance runs counter to YouTube’s own terms of service, which say the platform will ban “content that promotes violence against or has the primary purpose of inciting hatred against individuals or groups based on certain attributes, such as: race or ethnic origin, religion, disability, gender, age, veteran status, sexual orientation/gender identity.”

Since then, YouTube has deleted the Atomwaffen channel, but Motherboard reported that copies of the group’s videos still exist on the site.


Google, which owns YouTube, said last year it would “increas[e] our use of technology” and find capable human flaggers to fight terrorism online.

“We tightened our policies on what content can appear on our platform, or earn revenue for creators,” YouTube said in a blog post in December. “We increased our enforcement teams. And we invested in powerful new machine learning technology to scale the efforts of our human moderators to take down videos and comments that violate our policies … 98 percent of the videos we remove for violent extremism are flagged by our machine-learning algorithms.”

While that appears to be working for pro-ISIS content, YouTube needs more help in determining whether pro-Nazi videos are actually hate speech and should be removed from the platform.

“The hard part is actually joining that up with a sort-of context in order to make a judgment on whether the image that you’re looking at is being used for a white supremacist purpose or not,” ex-NSA hacker Emily Crose told Motherboard.


Perhaps that will be the job of the more than 10,000 human flaggers YouTube has said it would employ in 2018.

Click here to read Motherboard’s entire report.
