
Facebook execs nixed measures for fixing radicalization problem

It’s more concerned with appearing neutral than promoting social good.


Claire Goforth


It’s largely undisputed that Facebook divides society. The company has been aware of this for years, and has come up with numerous solutions—but top executives have killed or severely limited them.


A new story in the Wall Street Journal today details Facebook’s longtime awareness of, and indifference to, its deleterious effects on discourse.

Some of its findings have been stunning. For example, in 2016, a Facebook researcher found that 64% of users who joined extremist groups did so at the prompting of Facebook’s own algorithmic recommendation tools, such as “Groups You Should Join.”

Even more stunning is how unwilling Facebook is to change—even when it claims that it will.


The 2016 election and its aftermath exposed much of Facebook’s dirty laundry to the public. Not only was its algorithm rewarding bad behavior, but the platform was also being used by the Russian government as a tool for election interference.

Facebook subsequently vowed to improve. To that end, in February 2017, it overhauled its mission statement for the first time.

Its former goal was “to give people the power to share and make the world more open and connected.” The new one: “To give people the power to build community and bring the world closer together.”

Mark Zuckerberg said at the time, “Look around and our society is still so divided. We have a responsibility to do more, not just to connect the world but to bring the world closer together.”


The company also endeavored to find out if it did in fact encourage division. The answer was yes.

According to the Wall Street Journal, in 2018, slides from an internal presentation warned, “Our algorithms exploit the human brain’s attraction to divisiveness.” “If left unchecked,” it continued, Facebook would feed users “more and more divisive content in an effort to gain user attention & increase time on the platform.”

Facebook’s unwillingness to change in a meaningful way was reportedly driven by concerns that doing so would come across as “paternalistic,” or that its efforts would affect more conservatives than liberals. The company also worried that changes would lower engagement. In the end, it adopted a far more laissez-faire approach than the one urged by some within the company.

The company did come up with some solutions, the Wall Street Journal reports. However, proposals that would’ve made the greatest impact on how Facebook divides and radicalizes were ultimately either shelved entirely or implemented in severely weakened form.


One proposal was to limit the reach of content favored by super sharers, which would have elevated posts by more moderate users. Another was to broaden the types of groups that it suggests users join. Yet another involved diminishing the spread of clickbait, which more often contains misinformation.

Facebook implemented the first proposal in a greatly weakened form on Zuckerberg’s orders. Sources told the Wall Street Journal that Zuckerberg also told his team not to bring him decisions like that again, indicating he was losing interest in altering the platform to promote social good.

Facebook found that a small number of hyperpartisan users were responsible for a disproportionate amount of bad behavior on the platform. Among these, a large number were super sharers who engaged in suspicious behavior like spamming or being on the platform for up to 20 hours a day, suggesting the accounts were inauthentic. It also found that there were more accounts of this nature on the far right than the far left, the Wall Street Journal reports.

Thus, even an apolitical intervention tactic would disproportionately impact conservatives. This proved too tough a sell for a company that has fought accusations of leftist bias for years.


The company did implement some changes, such as promoting news stories that a larger user base engages with, and penalizing publishers that repeatedly post false stories. The Wall Street Journal reports that these were implemented with a more limited impact than some Facebook employees recommended.

More recently, Facebook has shifted to focus on individual value rather than social good.

“You can’t impose tolerance top-down,” Zuckerberg reportedly said in an October speech at Georgetown University. “It has to come from people opening up, sharing experiences, and developing a shared story for society that we all feel we’re a part of. That’s how we make progress together.”

