
Just how trustworthy is YouTube’s ‘Trusted Flagger’ program?

Membership gives users access to “advanced flagging tools.”


Audra Schroeder


With the volume of videos being uploaded to YouTube every day, it’s impossible for the company to police and screen every single piece of content in-house. That’s where YouTube’s Trusted Flagger program comes in.


The program has existed since October 2012 as an effort to root out videos that might contain pornography, hate speech, animal abuse, or copyright infringement. YouTube hasn’t been very public about recruitment, which is invite-only, though interested parties can submit their names for consideration. According to the program’s official page, “Membership in the Trusted Flagger Program gives users access to more advanced flagging tools as well as periodic feedback, making flagging more effective and efficient. As always, the policy team at YouTube makes the final determination of whether content should be removed.”


This process came under scrutiny last week, when the Financial Times reported that Google, YouTube’s owner, allegedly gave expedited “super-flagging” privileges to certain users, including British security officials, as a means to root out extremist material “at scale”—in practice, flagging up to 20 videos at a time. James Brokenshire, the U.K.’s security and immigration minister, told the Financial Times that the government is trying to do more to address Internet content “that may not be illegal but certainly is unsavoury.”


YouTube maintains it has the final say over whether a video is removed. A Google spokesperson told the Wall Street Journal, “Any suggestion that a government or any other group can use these flagging tools to remove YouTube content themselves is wrong.”

A Wall Street Journal source familiar with the program further explained that a “vast majority of the 200 participants in the super flagger program are individuals who spend a lot of time flagging videos that may violate YouTube’s community guidelines. Fewer than 10 participants are government agencies or non-governmental organizations such as anti-hate and child-safety groups.”

This situation is especially timely, as questions about the ethical boundaries of U.K. intelligence agencies abound. Google may simply be extending more power to trusted flaggers, but their flags allegedly carry far more weight than a report from a casual viewer: the Wall Street Journal’s source reported that 90 percent of content marked by super flaggers is either removed or restricted.

An email to YouTube about the program was not returned.


Illustration by Jason Reed
