
YouTube limps into 2018 with bigger problems than Logan Paul

The algorithm goes off the rails.


Audra Schroeder


As 2018 was ushered in, Logan Paul, a 22-year-old YouTube star with 15 million followers, many of them teens and younger, posted to his channel a video of a dead body hanging from a tree.


It was eventually removed (by Paul, not YouTube), and he issued two apologies this week in an effort to move past the outrage, but something about this felt different from other YouTube scandals. It showed the chasm between content and creator, and the consequences of the raw impulse to document everything. To viewers, it was insensitive and sickening. To Paul, it was “a moment in YouTube history.”

YouTube isn’t experiencing growing pains. It has reached a point where it promotes cognitive dissonance and accountability is absent. Advertisers have pulled money after ads were shown next to extremist content; elsewhere, the LGBTQ community spoke out about YouTube apparently restricting content from certain channels as inappropriate. At the same time, YouTube is attempting to trot out original content and live TV, with ad revenue to match, and its YouTube Red originals are apparently seeing millions of views. But YouTube’s house is not in order.

‘Please the algorithm’

A few months ago, I fell asleep to a YouTube video about aliens. It was one of those oddly automated clips categorizing the different “kinds” of aliens who have visited Earth. The tone was light, and the video featured cartoonish illustrations. When I awoke roughly 20 minutes later, YouTube’s algorithm had brought me to a channel that trades in videos about hidden symbols in entertainment and secret reptilian celebs, as well as more nebulous videos about “hidden messages” in Pepe the Frog and repurposed news stories about Vladimir Putin. (The channel lists its country as Russia.)


It’s not surprising this content exists, but the swiftness with which I arrived there caught me off guard. YouTube’s always leaned on its algorithms to “learn,” but they’ve taken over in more ominous ways.

While creator-first content and community-shaping personalities still thrive, cracks have started to show. Most troubling, its algorithm surfaced thousands of videos containing abusive content aimed at kids or involving kids. Elsewhere, pedophiles were allegedly using YouTube to post illicit clips of children. (YouTube declined to comment for this article. Instead, we were pointed to two blog posts by CEO Susan Wojcicki about combating harassment and abuse in the community.)

A December BuzzFeed report revealed that YouTube channels depicting child abuse or endangerment under the guise of “family” entertainment were lucrative. In late 2017, the popular channel Toy Freaks, which featured Greg Chism and his two daughters engaged in distressing activities, was shut down, and Chism was investigated, though not charged. Elsewhere, the weird subgenre of superhero videos aimed at children but containing very adult (and often violent) themes was flagged by rapper B.o.B., among others. Reached for comment on those videos in July, a YouTube rep claimed: “We’re always looking to improve the YouTube experience for all our users and that includes ensuring that our platform remains an open place for self-expression and communication. We understand that what offends one person may be viewed differently by another.”

In that same BuzzFeed report, a father who runs a “family” channel and had his account demonetized because of questionable videos offered a telling quote: “In terms of contact and relationship with YouTube, honestly, the algorithm is the thing we had a relationship with since the beginning. That’s what got us out there and popular. We learned to fuel it and do whatever it took to please the algorithm.”


To reverse this new kind of worship, YouTube ostensibly needs more humans, and it has said it’s hiring a team of 10,000 moderators to review offensive content. But, as Facebook’s attempts at moderation have shown, what humans see in the course of reviewing content has lasting effects. And the guidelines YouTube gives its human reviewers are apparently flawed: BuzzFeed recently interviewed people tasked with reviewing and rating videos and found that it was often unclear what they were rating; YouTube’s guidelines put the emphasis on high-quality videos, even if those videos were disturbing.


Free speech vs. hate speech

While Twitter has finally started purging hate accounts, YouTube’s scale and automation make that difficult. Fake reports and conspiracy videos about the Las Vegas shooting showed up at the top of search results in October. An August NYT report outlined the growing far-right presence on YouTube, one that’s young and “acquainting viewers with a more international message, attuned to a global revival of explicitly race-and-religion-based, blood-and-soil nationalism.”

Felix Kjellberg, aka PewDiePie, one of the most popular personalities on YouTube with more than 50 million subscribers and brand recognition, yelled the N-word during a gaming livestream and paid two men to hold up a sign that said “Death To All Jews.” The Wall Street Journal found nine videos on his channel that include Nazi imagery or anti-Semitic jokes. This led to his contract with Disney being terminated and his show Scare PewDiePie canceled. Many fans claimed this was a violation of free speech, but roughly a year later, his subscriber count has not dwindled. (Kjellberg did not respond to a request for comment.) In a piece for BuzzFeed, Jacob Clifton argued that Kjellberg isn’t some lone monster; he’s just one “symptom of a majority illness” that’s sweeping online platforms.

In July, poet and musician Joshua Idehen tweeted a lengthy thread about the “YouTube thingy” and the state of gaming culture, racism, and harassment on YouTube, including PewDiePie’s stunt.


He told the Daily Dot he used to be an “edgelord,” a term for those men (and it’s typically men) who engage in provocative behavior and ideologies for laughs, but that he “only made it out with my humanity thanks to many, many good women.” Asked what YouTube should do to combat this sweeping tide, he offers: “Ban Nazis. Ban hate speech. Ban targeted harassment. Hire a dedicated moderation staff with all that world domination money.”


YouTuber Megan MacKay says “we’re getting a glimpse of the dark side of the democratization of content creation.”

“When anyone can pick up a camera and make a video without being beholden to anyone, we’re bound to eventually get content that crosses the uncrossable line,” she says. “I think clarifying and enforcing the terms of service is really the only way major platforms can ensure the safety of their community while also stemming the flow of hate speech and alt-right garbage, but even when these corp[orations] take a performative stand against this type of content, they seem to fail or waffle when it comes to actually doing something about it. I can’t speak to why exactly they continue to drop the ball, whether it’s fear of losing big-name users or struggles with scale, but it’s a major problem that only ever seems to be halfheartedly addressed.”

Bad actors

This was reflected in YouTube’s official response to the Logan Paul video on Tuesday, which falls in line with so many of its other responses. A YouTube spokesperson said, “Our hearts go out to the family of the person featured in the video,” but beyond that, the statement just reiterated the Community Guidelines. It offered no solutions or answers, and Paul will likely continue making money.


YouTube has always banked on drama, and in the process, it’s fostered fanbases that know no boundaries. In November, fans of Alissa Violet and Ricky Banks trashed a Cleveland bar online and menaced innocent residents of the city after the two were thrown out of the venue. Jake Paul, the younger brother of Logan Paul, is also wildly popular with kids and teens, but people who don’t enjoy his sophomoric, braying brand of humor weren’t too happy about living next to his chaotic prank house. He was called out for being a racist, after he joked that a fan from Kazakhstan might “blow someone up,” and a bully, after fellow YouTubers the Martinez twins accused him of abuse. (His “pranks” on the twins involved airhorns and leaf blowers.) After Hurricane Harvey, he showed up in San Antonio to “help” and his rabid fans created even more chaos in a Walmart parking lot.

In a 2015 profile of Logan Paul, who got his start on Vine, he expressed his desire to move past “clean” content and expand his fanbase: “I want to be the biggest entertainer in the world. That’s my deal. I’ll do whatever it takes to get that.” Two years later, he posted a video of a dead body and YouTube’s response was essentially the shrug emoji.

YouTube has built a massive global audience in its decade-plus of existence, one that hinges on self-starting. We’re discovering what happens when self-starting has no boundaries and when an open platform keeps revising its Community Guidelines instead of ripping them up and starting over. In her December blog post, Wojcicki claimed that YouTube’s goal for 2018 “is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube.”

But what if some bad actors are your most popular creators? And is one step really enough?

 