
Most videos people regretted watching on YouTube came from its recommendation algorithm

Mozilla said 71 percent of ‘regret reports’ came from recommended videos.


Andrew Wyrich

The YouTube logo. (Rego Korosi/Flickr, CC-BY-SA)

A majority of YouTube videos that internet users regretted watching were recommended to them by the platform, a new report from the Mozilla Foundation has found.


Mozilla, the internet company behind the Firefox browser, released findings today from its “RegretsReporter” browser extension, which it launched last year. The extension allowed users to report videos they regretted watching and allowed Mozilla to crowdsource information about YouTube’s recommendation algorithms.

YouTube’s recommendations have long been criticized for sending users down extremism rabbit holes.

Mozilla said 37,380 people have installed RegretsReporter since it launched last year. In its report released today, Mozilla said it gathered 3,362 reports from 1,662 volunteers across 91 countries.


The reports were submitted between July 2020 and May 2021.

Specifically, Mozilla said 71 percent of the regret reports came from videos that were recommended to users. Recommended videos were also 40 percent more likely to be regretted than videos a user searched for. Additionally, Mozilla said there were “several cases” where YouTube’s recommended videos actually violated the company’s own community guidelines.

Overall, Mozilla said the kinds of videos that were most frequently regretted were misinformation, violent or graphic content, hate speech, and spam and scams.


Finally, Mozilla said non-English speakers are “hit the hardest.” The rate of YouTube regrets reported to Mozilla was 60 percent higher in countries where English is not a primary language.

“Our research suggests that the corporate policies and practices of YouTube, including the design and operation of their recommendation algorithms, is at least partially responsible for the regrettable experiences that our volunteers had on the platform,” Mozilla wrote in its report. “We believe [what] our research has revealed is only the tip of the iceberg, and that each of these findings deserves and requires further scrutiny.”

In a statement to the Daily Dot, YouTube asserted that recent changes have decreased the number of times “borderline content” is viewed by users through recommendations.

“The goal of our recommendation system is to connect viewers with content they love and on any given day, more than 200 million videos are recommended on the homepage alone. Over 80 billion pieces of information is used to help inform our systems, including survey responses from viewers on what they want to watch. We constantly work to improve the experience on YouTube and over the past year alone, we’ve launched over 30 different changes to reduce recommendations of harmful content. Thanks to this change, consumption of borderline content that comes from our recommendations is now significantly below 1%,” a YouTube spokesperson told the Daily Dot in a statement.


This article has been updated with a statement from YouTube.
