
YouTube has been recommending explicit self-harm videos

This is potentially very dangerous.


Josh Katzowitz


YouTube videos showing graphic images of people who have self-harmed have been appearing in users’ recommended videos, according to a report by the Telegraph.


The newspaper discovered at least 12 explicit self-harm videos. Business Insider noted that YouTube took down two of them but that others remained on the platform.

One, uploaded in January 2017, shows a YouTuber displaying her self-harm scars, though she says she is showing her body to discourage other people from cutting themselves. Nearly 400,000 people have watched the vlog, in which she displays dozens of scars on her legs, arms, and hands, including an old cut on her wrist from a suicide attempt. The video that autoplayed immediately afterward, in which a YouTuber describes how her school reacted to her self-harm scars, had more than 1.5 million views.

The Telegraph also discovered search term recommendations for “how to self-harm tutorial,” “self-harming girls,” and “self-harming guide.” After being informed, YouTube removed those recommendations.


YouTube continues to try to improve its algorithm so that it doesn’t recommend conspiracy videos or videos inappropriate for kids. But the company has so far found it impossible to keep every harmful or distasteful video from making its way through the algorithm and onto users’ recommended lists, and eventually their screens.

“We know many people use YouTube to find information, advice or support sometimes in the hardest of circumstances,” YouTube told Fox News in a statement. “We work hard to ensure our platforms are not used to encourage dangerous behavior. Because of this, we have strict policies that prohibit videos which promote self-harm and we will remove flagged videos that violate this policy. Our policies also prohibit autocomplete predictions for these topics, and we will remove any suggestions which don’t comply with our policies.”

YouTube also has a page in its help section with guidance on how people can find support by phone, text, or online. Writes YouTube, “If you come across content in which someone is suicidal or engaging in self harm, please contact local authorities and flag the video to bring it to our immediate attention. We reach out to these individuals with resources and work with suicide prevention agencies to provide assistance when possible. If you believe that someone is in imminent danger, please call the police.”

YouTube isn’t the only social media platform grappling with this issue: Instagram said it will begin using “sensitivity screens” to blur images of suicide or self-harm.


For more information about suicide prevention or to speak with someone confidentially, contact the National Suicide Prevention Lifeline (U.S.) or Samaritans (U.K.).
