
Fox News misses the point about Wikipedia and porn

The article claimed that the Wikimedia Foundation had failed to address the problem of explicit images on Wikipedia. In fact, Wikimedia made its decision months ago.


Kevin Morris


There’s porn on Wikipedia. It’s hard to aspire to hold the sum total of human knowledge without a little bit of smut.


Should users have the option to filter explicit images of sex acts—everything from bukkake to bestiality to anal sex? That’s a serious debate that’s raged on for years at the “free encyclopedia that anyone can edit.” A recent Fox News story raised important questions about Wikipedia’s odd relationship with explicit content but missed the real story entirely.

Israeli company NetSpark has a technology that, it insists, Wikimedia, the nonprofit foundation that runs Wikipedia, needs to check out right now, because it’s offering a low, low, low special introductory price. NetSpark is shocked that Wikimedia won’t use its image filtration software, which it promises is the best out there. The technology is best known for helping make Israel’s Internet “kosher.” And Wikimedia has in the past promoted the idea of filtration software, which would let users avoid explicit content on the site with the click of a button.

Why did Wikimedia ignore the calls? That was the unfortunate subject of the Fox News article published Tuesday, which parroted the frustrations of NetSpark’s sales team rather than digging into the real story at hand.


The article read: “Representatives for NetSpark, an Israeli-based Internet-solution company, say they have developed a filtering technology that not only detects and blocks questionable imagery, but is also able to learn over time, making it ideal for determining the appropriateness of an image.”

It’s a good sales pitch, but it’s a sales pitch. And Wikimedia has no responsibility to answer a telemarketer’s phone call. “As a top five, global web project we get dozens or hundreds of offers, like any popular web project would,” Jay Walsh, Wikimedia’s head of communications, told the Daily Dot.

The Fox News story suggested that Wikimedia was failing to live up to promises, painting the foundation as duplicitous. It went on to conclude: “A vigorous debate about the topic appears to have ended, with no resolution in sight.” And: “Oddly, the Wikimedia Foundation is in favor of filters—it simply seems paralyzed, unable to adopt a solution.”

Both sentences are false. Wikimedia has been far from paralyzed. The foundation has adopted a solution. It dumped the idea of a filtering system months ago.


As Walsh pointed out to the Daily Dot, Wikimedia announced its decision on July 16:

Following a community poll organized by the Foundation, and extensive discussion in various venues, it has become clear that this issue can be highly divisive and distracting to the Wikimedia community. We trust our community, and we respect the arguments that have been made opposing the feature as well as those in support of it. We affirm our support for better user choice and user preferences, but do not want to prescribe a specific mechanism for offering that choice. Therefore we rescind the request to develop this feature.

That’s a momentous decision with huge repercussions for Wikimedia users and the Internet at large—and certainly one that shouldn’t be muddied by the accusations of one jilted company.

The debate over explicit images has raged within the Wikimedia community for years, and the controversy has even claimed victims among top Wikimedia brass. In July, Wikimedia U.K. chairman Ashley Van Haeften resigned after becoming embroiled in the dispute.


The most vocal critic of the site’s porn problem has been Larry Sanger, Wikipedia’s estranged cofounder, who left the project in 2002. As the parent of two children, Sanger sees the community’s free-speech absolutism as potentially harmful to minors.

“The presence of enormous amounts of unfiltered adult content, the ‘educational’ purpose of which is questionable for anyone, directly conflicts with the goal of supporting the education of children,” Sanger wrote in a blog post earlier this year. (Sanger also advised NetSpark on its proposal to Wikimedia; you can see the idea’s gestation in another of his blog posts from earlier this year.)

Sanger’s critique, “What should we do about Wikipedia’s porn problem?” was widely shared in May 2012. Wikipedia’s other cofounder, Jimmy Wales, who remains an influential member of the foundation’s board, responded to Sanger by explicitly supporting the idea of an opt-in image filter.

But even Wales’s significant influence didn’t matter in the end. The Wikipedia community has clearly decided that the benefits of unfiltered information outweigh the potential harm to minors who see explicit images. The Wikimedia Foundation, for its part, concluded that filtration software was simply too divisive an issue for the community and dumped the proposal entirely.


Is that the right decision for the foundation, and for the millions of people who use the encyclopedia every day? The proposal, after all, was for a simple filter, and an opt-in one at that: if you don’t like it, you don’t have to use it.

What, exactly, is so scary about choice?

Photo by bestofyou/Flickr

 