
Reddit’s war on trolls isn’t about making the site PC—it’s about harassment

Why doesn’t the Internet understand the difference?


S.E. Smith


Reddit made headlines last week when the site upped the stakes in its ongoing quest to clean up its reputation as an “anything goes” website. Reddit removed five subreddits known for their harassing content, though it left plenty more behind. The site may have infuriated many of its users, but Reddit is only following a larger online trend—as a growing number of websites recognize the need to get proactive about online harassment. While Reddit’s interim CEO Ellen Pao may be taking heat for the move, it likely didn’t originate with her alone.


https://twitter.com/arthur_affect/status/609200472260358144

https://twitter.com/penguinman2/status/608736291614674944


When it comes to confronting harassment, Reddit’s users, along with many others on the Internet, struggle with an important distinction: offensive content versus harassing content. This lack of nuanced understanding when it comes to content and community standards lies at the root of many ardent defenses of moderation-free, no-holds-barred communities, and it makes the Internet a more dangerous place.

The site’s decision and open discussion of the logic behind it reflected its commitment to transparency, especially when it comes to moderation and community standards. Reddit administrators have historically adopted a hands-off approach to site moderation, leaving the issue up to individual subreddits, but the site’s new approach to protecting users and maintaining the community involves more aggressive measures. It also illustrates that the site is responding to user concerns, as some 50 percent say they wouldn’t recommend Reddit due to the prevailing site culture.


Reddit’s announcement was clear on the subject:

We want to be open about our involvement: We will ban subreddits that allow their communities to use the subreddit as a platform to harass individuals when moderators don’t take action. We’re banning behavior, not ideas.

The result was rather predictable: Many Reddit users flew into a frenzy of rage, struggling with the departure from the site’s historical norms. This and other recent changes at Reddit reflect a new era for the site, and given that it’s long been one of the Web’s most notably open and unmoderated communities, it also sets a precedent for the larger Internet. When even Reddit takes a stance on moderation, it’s notable.

Some users cried “censorship,” reflecting a lack of understanding about what censorship actually is: a concerted government effort to suppress free speech. Had the U.S. government stepped in to shut down some or all of Reddit, users would have had good cause to complain about censorship, but the site’s decision to stop hosting some kinds of content was simply an administrative decision. Everyone is free to have an opinion, but that doesn’t mean everyone’s obligated to host that opinion, especially when it leads to harassment.


Users, in turn, were free to stop supporting Reddit, and at least some did just that.


Voat, a Switzerland-based site that presents itself as “censorship free,” is being inundated with Reddit users abandoning ship. Ars Technica’s Megan Geuss noted that, ironically, the move actually resulted in more harassment from infuriated users.

What critics of Reddit’s new policies don’t understand is that there’s an important distinction between the kind of behavior Reddit moved to address and the expression of opinions and beliefs. Reddit has clearly indicated that it plans to make no moves to remove “questionable” content, with the exception of nonconsensual nudity and child pornography. It will, however, move to delete subreddits that promote harassment and related behaviors, likely based on harassment complaints from users.

Offensive content is the cornerstone of the Internet—people active to any degree on sites like Reddit and platforms like Twitter have opinions, and they want to voice them. The Internet provides a means for those thoughts to be shared, and not all of them are pleasant to view. However, people have a right to air those beliefs, and Reddit staunchly stands in support of that freedom. The front page of the Internet doesn’t act as judge or jury when it comes to the opinions expressed by its members.

Case in point is r/fatpeoplehate, which is generating the bulk of the controversy over the bans. Defenders of the subreddit, which revolved around mocking fat people, argued that those offended by it could choose to leave. These defenders even included precisely the kind of people who would have been targeted.


The problem with r/fatpeoplehate—and the reason Reddit chose to remove it—wasn’t the opinions voiced on it or even the ubiquitous use of photographs posted without their owners’ consent. The problem was that the subreddit encouraged harassment, doxing, and other behaviors that put people directly at risk. Its users frequently spilled over onto the rest of the site to stalk and harass the people targeted in its threads, and they didn’t stop there, hounding targets on other social media accounts and turning the Internet into a firing range.


“While we do not always agree with the content and views expressed on the site,” administrators wrote in their announcement, “we do protect the right of people to express their views and encourage actual conversations according to the rules of Reddit.” Administrators recognized that it’s difficult to have a functional conversation with the looming threat of harassment in the background.

Online harassment can take a number of forms, and it can have a serious effect on those targeted by it. For people with depression and underlying mental health conditions, it can exacerbate these issues and may contribute to suicidal ideation or depressive episodes that interfere with the ability to function. When harassment escalates to doxing, it can force people out of their homes, interfere with their jobs, and lead to threats to their family and friends, as well as themselves.


One of the starkest examples of online harassment has been going on for over a year, as dogged Gamergaters continue to persecute women who speak out about diversity and sexism in video games. Several women, like Anita Sarkeesian and Brianna Wu, have been driven out of their homes by harassers, while others have been forced to cancel public appearances and have had problems at work and with their families as a result of endless harassment. Gamergate would have exploded with or without Reddit, but it was an outgrowth of an Internet culture in which numerous sites make no move to stop behavior when it crosses the line from offensive to dangerous.

Advertisement

Reddit’s shift toward tighter community controls involves keeping a closer watch on known problem areas to determine when their content is offensive—including content actively mocking and belittling individuals, sometimes by name—and when it constitutes harassment. 

Defining harassment can be a nebulous and challenging task, and for Reddit administrators, it’s complicated further by the judgment call needed to determine when a community engages in systemic harassment and when it’s an isolated problem. A single thread encouraging people to stalk someone might not be indicative of an entire subreddit’s culture, for example, although it might be grounds for a stern warning to the moderators.

Decisions about where to interact on the Internet are a personal matter, and sites need to decide for themselves what kind of community they want to cultivate. Some sites want to create a safe space for specific conversations, with aggressive moderators taking a firm hand. Others want to tread a middle ground, admitting lively debate but attempting to stem actively offensive content. Sites like Reddit promote a free exchange of all ideas, including offensive ones, but now apparently wish to draw the line at harassment.

For Reddit to move so openly is a sign that the Internet is undergoing some fundamental changes. Harassers will always be able to find new sites to host their content, even if they have to resort to hosting it themselves rather than relying on the benevolence of the administrators of a social networking site. Decisions like Reddit’s, however, are an important step toward choking harassment out of the Internet, making it increasingly difficult for people to find a space in which to make the lives of random strangers a living hell.


Far from getting soft in its old age, Reddit is finally turning into a grownup. 

S.E. Smith is a writer, editor, and agitator with regular appearances in the Guardian, AlterNet, and Salon, along with several anthologies. Smith also serves as the Social Justice Editor for xoJane and will be co-chairing Wiscon 40—the preeminent feminist science-fiction conference—in 2016.

Photo via ssoosay/Flickr (CC BY 2.0)

 