
Why can’t Facebook admit that its fake news problem changed the election?

The first step is getting Facebook to admit it has a problem.


Jay Hathaway

Photo via Imgur Remix by April Siese

Two days after Donald Trump won the presidency, and with people desperate to explain how every poll-based projection could have been wrong, Facebook CEO Mark Zuckerberg addressed his website’s role in the election: “Personally, I think the idea that fake news on Facebook… influenced the election in any way is a pretty crazy idea,” he said on a livestream from the Techonomy conference.


https://twitter.com/jwherrman/status/794593339995783170




That’s either naïve or extremely disingenuous. It’s not “crazy” to acknowledge that the rise of misleading, inflammatory partisan news sites, which have become some of the most-shared information sources on Facebook, had a role in this election. It’s crazy to pretend it didn’t.

A Pew Research Center study published in May showed that 62 percent of Americans get their news on social media, and 66 percent of Facebook users say the site is a news source for them. And there are a ton of voting-age Facebook users in the U.S.—more than 150 million of them, give or take some fake accounts and tweens lying about their ages.  

“The two-thirds of Facebook users who get news there, then, amount to 44 percent of the general population,” the study concludes.
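The arithmetic behind that figure is worth spelling out: the same Pew report puts Facebook’s reach at roughly 67 percent of U.S. adults, and 66 percent of those users get news on the site, so 0.67 × 0.66 ≈ 0.44, or about 44 percent of all American adults.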


A lot of what those millions of people see comes from nakedly biased “news” operations. John Herrman of the New York Times recently explored the ecosystem of political pages, ranging from the ultra-liberal US Uncut to the thoroughly Trumpian Make America Great Today, and demonstrated the low barrier to entry for a page or group of pages to reach millions of riled-up partisans.

Craig Silverman, perhaps the internet’s foremost crusader against fake news, reported for BuzzFeed on a group of young people in Macedonia who’ve built an entire microeconomy of pages targeting Trump supporters with exciting fabricated news. These are the guys who pushed the quickly debunked story that Pope Francis had endorsed Donald Trump.

Watchdogs like Silverman and Snopes.com can debunk the fake stories, but it’s a hopeless game of whack-a-mole. Once something like the fraudulent pope endorsement makes it into Trump supporters’ Facebook news bubble, it’s self-reinforcing, shared over and over with family and friends. 

All told, Herrman estimated on Twitter, this one version of the false Pope story reached more than 1.2 million people. Where did it come from? Very reliable news source “fantasynewswebsite@gmail.com.” 


https://twitter.com/jwherrman/status/794590826437472256

https://twitter.com/jwherrman/status/794591524206088196

https://twitter.com/jwherrman/status/794591699259494400

https://twitter.com/jwherrman/status/794591854826319872


https://twitter.com/jwherrman/status/794592215238639616

https://twitter.com/jwherrman/status/794592422143594496

The reach of something like this is startling, and Facebook is doing so little to stop it that it won’t even admit there’s a problem.

The company’s main rationalization for allowing fake news to spread is that Facebook is a tech company, not a media company. The company rarely exercises human editorial control over what appears on its website, and when it does, it’s only to promote “a really safe community.” Doing anything more, COO Sheryl Sandberg has argued, would compromise Facebook as “a platform for all ideas.” Including, we now understand, fake ideas spread by Macedonian Trump-teens. 


This refusal to admit that Americans’ new favorite news source is part of “the media” serves Facebook in a couple of really big ways. One is (semi)plausible deniability. As Max Read argued at New York Magazine in a piece called “Donald Trump won because of Facebook,” Facebook’s free-market, non-editorial approach means that users experience fake news and good journalism exactly the same way: as catchy headlines with equal algorithmic and visual weight. The more people share the catchy hoaxes that seem to confirm their existing beliefs, the more Facebook’s algorithms feed their hunger for similar platefuls of crap.

And why wouldn’t people click on them? Facebook is essentially laundering bogus news, of unknown provenance and little or no sourcing, through your family and friends. Most people wouldn’t intentionally visit a no-name disinformation site they’d never heard of, but they’d certainly remember a shocking headline shared by someone they know. They don’t even have to click on the story or look at the URL to fall into the trap and re-share it so it can ensnare others.

This is all great for Facebook’s engagement numbers, and it also shifts responsibility from the company to the consumer. You can’t possibly blame your big blue overlords for giving you more of what you asked for, can you?

On Tuesday, Republicans who argued that Trump’s unprecedented social media engagement meant more than his hopeless poll numbers were proven right. But Zuckerberg can still come out two days later and say with a straight face that fake news “surely had no impact” on the election. He could ask, seriously, “Why would you think there would be fake news on one side and not on the other?” 


Gosh, how could the market fail to self-correct and, if not eliminate the hoaxes entirely, at least cause them to offset each other? 


Silverman at BuzzFeed explained exactly how, looking at popular partisan pages on both sides over seven days in September. Left-wing and right-wing pages both published a shocking amount of false information compared to mainstream news sources, but the right-wing pages were by far the worst offenders.

But Zuckerberg washed his hands of even partial responsibility, saying, “There’s a profound lack of empathy in asserting that the only reason someone could’ve voted the way they did is because they saw fake news.”


There’s a whole lot of territory between “fake news had no effect” and “fake news is the only reason,” though, and Zuck seems unwilling to wade into it. That middle ground is the domain of the news media, an industry both widely loathed and increasingly unprofitable. To understand how little Facebook values the media, consider that one of its board members is wealthy Silicon Valley technocrat Peter Thiel, who funded a legal crusade that destroyed Gawker.com, a news outlet he disagreed with (and, full disclosure, my former employer).

Facebook’s board chose to retain Thiel as a member after he revealed his role in crushing Gawker, and again after he went on to donate $1.25 million to the Donald Trump campaign. On Friday, the Trump campaign revealed that Thiel will be part of the executive committee of Trump’s transition team.

“Peter did what he did on his own,” Sheryl Sandberg said. “There’s been no implication that [Thiel] was speaking for Facebook.”


No one speaks for Facebook on matters like these, because Facebook wants to be the neutral blue backdrop against which you consume all your information. Its hand is meant to be invisible, and blameless. 

As Kate Knibbs recently wrote at The Ringer, “It wants to be the stadium, not the umpire.”

Facebook can’t let itself become the media. It can’t get into important arguments about what journalism is and what constitutes good and bad journalism, even though those arguments are hugely important to civil society. It can’t even dip its toes in by blocking some of the more obvious and popular hoax sites. Doing so would be a distraction from the main objective of Facebook: getting you to use Facebook and click on shit. If the fake news problem isn’t making people quit the site, why fix it?

As New York‘s Read puts it, “Every buzzkill debunking, every warning of caution, makes the site as a whole less engaging.”


And even if Facebook wanted to fix it, how? Algorithms are ill-equipped for fact checking at this point in technological history, and Facebook has already shown it doesn’t want the potential backlash that comes with paid human curators. After Facebook’s trending topics team came under fire for alleged partisanship, Facebook laid them all off and turned the job of selecting stories over to the machines.

The machines immediately coughed up a fake story about Fox News’s Megyn Kelly, and trended a story about a man fucking a McChicken sandwich.

Facebook loses either way, so it went with the less-accountable computers over humans, who ruin the illusion of a perfectly neutral universe of information (and cost money to employ). There’s no reason to think it would, for example, hire a human ombudsman to protect users from news hoaxes.

Why bother, when it can foist the responsibility of sorting the nutritious news from the poisonous onto its users for absolutely free? Facebook can even spin that, as Zuckerberg seems to be doing, as faith in the wisdom of crowds, the intelligence of the electorate.


One thing Facebook can do, though, is make the fake news problem less obvious and harder for concerned parties to track. This week, Facebook bought CrowdTangle, one of the few tools that provided insight into the spread of stories and news sources on the site.

https://twitter.com/jwherrman/status/797105641865314304

https://twitter.com/MikeIsaac/status/797115072136888320

https://twitter.com/bafeldman/status/797112023695978496


https://twitter.com/bafeldman/status/797110498709897216

Not only is Facebook unwilling to deal with the problem, it’s actively working to control the tools available to outside parties who might want to help. 

We’re left with noble efforts like FakeNewsWatch.com, Snopes, and Silverman’s BuzzFeed column, but debunkers can’t hope to achieve the same reach as the flashy conspiracy theories they’re trying to debunk. The most intractable thing about Facebook’s fake news problem is that audiences want to believe, and they’ll do so as long as Facebook allows them to hook up to a limitless IV of garbage information with no intervention.

If Facebook took away the partisan news that has ruled users’ lives for the past two long, long years of the U.S. election, there’s a chance those users would stop logging on, furiously posting and sharing every day. They might even go elsewhere. And, for Facebook, that’s a bigger, more existential problem than accidentally electing President Trump.
