YouTube hasn’t cleaned up its conspiracy-themed videos problem. Instead, the issue worsens every time a new mass shooting or terrorist event occurs. That’s the takeaway from a data researcher who started with an extensive search for “crisis actor” videos and followed YouTube’s recommendations from them to as many as 9,000 other conspiracy-themed videos that had been watched nearly 4 billion times.
According to professor and journalist Jonathan Albright, YouTube is unwittingly helping the conspiracy theory industry grow with each new mass shooting because the platform incentivizes these disinformation campaigns: content creators can upload their videos and make money while doing so.
“Every time there’s a mass shooting or terror event, due to the subsequent backlash, this YouTube conspiracy genre grows in size and economic value,” Albright wrote in a Medium post on Sunday. “The search and recommendation algorithms will naturally ensure these videos are connected and thus have more reach. In other words, due to the increasing depth of the content offerings and ongoing optimization of YouTube’s algorithms, it’s getting harder to counter these types of campaigns with real, factual information.
“I hate to take the dystopian route, but YouTube’s role in spreading this ‘crisis actor’ content and hosting thousands of false videos is akin to a parasitic relationship with the public.”
YouTube did not respond to a Daily Dot request for comment on Albright’s assertions.
Interest in YouTube’s conspiracy-themed videos was renewed this month in the wake of the Parkland shooting, when a video accusing survivor David Hogg of being a “crisis actor” landed in the top spot on the trending page. That continued a conspiracy-tinged trend seen after the Las Vegas and Sutherland Springs shootings.
If you searched YouTube for David Hogg on Feb. 21, the top three results came from conspiracy channels.
As a result, YouTube reportedly gave Alex Jones’ Infowars channel a strike for the video, which was eventually deleted. If Jones’ channel receives two more strikes in the next three months, YouTube will terminate his account.
But Jones’ channel isn’t the only one making money off these videos.
As Albright explained, 50 of the most-watched mass shooting-related conspiracy videos have been viewed about 50 million times, and if you keep following YouTube’s recommendation algorithm, it leads to content that has been viewed billions of times.
In his study of what YouTube recommends while somebody is watching a conspiracy video—he began by searching for “crisis actor” videos—Albright wrote that 90 percent of the titles are “a mixture of shocking, vile and promotional. Themes include rape game jokes, shock reality social experiments, celebrity pedophilia, ‘false flag’ rants, and terror-related conspiracy theories dating back to the Oklahoma City attack in 1995.”
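To give a rough sense of how a crawl like this can be approximated, here is a minimal, hypothetical sketch of a breadth-first walk over recommended videos. The seed IDs, crawl depth, and the fetch_related() helper are illustrative assumptions rather than Albright’s actual tooling; YouTube’s “Up next” list isn’t fully exposed through a public API, so a real crawl would plug in whatever related-video source the researcher has access to.

```python
# Hypothetical sketch of a breadth-first recommendation crawl in the spirit of
# Albright's "crisis actor" study. The seed IDs, crawl depth, and the
# fetch_related() helper are illustrative assumptions, not his actual code.
from collections import deque
from typing import Dict, List


def fetch_related(video_id: str) -> List[str]:
    """Placeholder: return IDs of videos surfaced alongside video_id.

    A real implementation would call whatever related-video endpoint or
    scraper is available. Returning an empty list keeps this sketch
    runnable without network access or API credentials.
    """
    return []


def crawl_recommendations(seeds: List[str], max_depth: int = 2,
                          max_videos: int = 9000) -> Dict[str, int]:
    """Breadth-first walk over recommended videos, recording each ID's depth."""
    seen: Dict[str, int] = {}
    queue = deque((vid, 0) for vid in seeds)
    while queue and len(seen) < max_videos:
        video_id, depth = queue.popleft()
        if video_id in seen or depth > max_depth:
            continue
        seen[video_id] = depth
        for related_id in fetch_related(video_id):
            if related_id not in seen:
                queue.append((related_id, depth + 1))
    return seen


if __name__ == "__main__":
    # Seed IDs below are placeholders; Albright seeded his crawl with the
    # results of "crisis actor" searches.
    network = crawl_recommendations(["SEED_VIDEO_ID_1", "SEED_VIDEO_ID_2"])
    print(f"Collected {len(network)} videos across the recommendation graph")
```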
From Albright’s perspective, no matter how much YouTube tries to clean up these conspiracy-themed videos, it is empowering the people who create them. It’s a problem YouTube hasn’t figured out how to solve, and at this point, there’s an argument to be made that it perhaps can’t be solved at all.