
Put down that pitchfork: Facebook isn’t reading your unpublished posts

Ever hear of the boy who cried wolf? These are the people who cried privacy scare.  


Kate Knibbs


Over 27,000 users have signed a petition on Care2 called “Facebook Stop Stalking Our Unposted Thoughts!” The petition is a response to a Slate article detailing research conducted by Facebook employees on how people censor themselves while writing status updates, wall posts, and comments. The researchers examined a random sample of 5 million English-speaking Facebook users in the U.S. and U.K., tracking how often people started typing a post but decided not to publish it. According to the study, 71 percent of Facebook users self-censor at some point, which suggests there’s a ton of unpublished content. The work was handed to an intern over the summer, which suggests Facebook didn’t consider the information particularly sensitive.


The petition signers are understandably creeped out by the idea of a Facebook intern reading the things they ultimately decided not to share. People second-guess their drafts precisely because they don’t want a certain message out there, so it’s disconcerting to imagine Facebook’s employees watching you compose your rough drafts and botched attempts at profundity.

But Facebook flatly denies it’s doing any such thing. And while Facebook has previously shimmied harder around privacy issues than a caffeinated flapper, in this case the company is telling the truth. The JavaScript used for this research isn’t even capable of collecting what you type. It’s not Facebook reading your posts; it’s code running in your own browser, and that code sends Facebook only a signal about whether a post went unpublished, not what the post contained.

What Facebook received was a binary flag, “censored” or “uncensored,” based on whether something typed into the box was published within 10 minutes. Even if frantic, desperate, pure-of-heart police officers contacted Facebook knowing a kidnapper was about to type his location into his status update box and then not hit publish, and discovering that location were a matter of life or death for his three infant victims, Facebook couldn’t have helped them using this program.
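To make that distinction concrete, here’s a minimal sketch in TypeScript of how this kind of client-side logging could work. This is not Facebook’s actual code: the element selectors, the endpoint name (loosely echoing the “censor logger” that shows up in the network activity described below), and the five-character threshold are assumptions for illustration. The structural point is what matters: the draft text is only ever checked for its length, and the only thing transmitted is a boolean.

```typescript
// Minimal sketch (assumptions throughout, not Facebook's real code):
// watch a compose box, and when a time window expires, report only
// whether a draft was "censored" (typed but never posted).

const WINDOW_MS = 10 * 60 * 1000; // the study's 10-minute window

function watchComposer(form: HTMLFormElement, box: HTMLTextAreaElement): void {
  let typedEnough = false; // did the user type a non-trivial draft?
  let published = false;   // did the draft actually get posted?

  box.addEventListener("input", () => {
    // The text is read only to check its length; it is never stored or sent.
    if (box.value.length >= 5) typedEnough = true;
  });

  form.addEventListener("submit", () => {
    published = true;
  });

  setTimeout(() => {
    if (!typedEnough) return; // nothing meaningful was ever typed
    const censored = !published; // typed something, never posted it
    // Only this boolean leaves the browser; box.value is not transmitted.
    navigator.sendBeacon("/ajax/censor_logger", JSON.stringify({ censored }));
  }, WINDOW_MS);
}

// Hypothetical selectors for the composer on the page.
const form = document.querySelector<HTMLFormElement>("form.composer");
const box = document.querySelector<HTMLTextAreaElement>("textarea.status");
if (form && box) watchComposer(form, box);
```

Under a design like this, even a subpoena served on Facebook’s servers would turn up nothing but a pile of true/false flags, because the content never leaves the browser in the first place.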


Even if Boo the Dog’s eerily adorable life were on the line and instructions from the only veterinarian who could save him were typed on Mark Zuckerberg’s wall and then never sent, Facebook would have had to let Boo tragically perish, powerless to use this program to uncover the wee pup’s whereabouts. The metadata simply doesn’t hint at the content, something the research paper makes clear. This has been verified independently, although when I repeated the verification myself (opening the developer tools in Chrome, typing and then deleting a status after a minute, and checking the network activity for the “censor logger” in the form data for my post), I didn’t see anything indicating the logging is still running. (If you have a different experience, please let me know in the comments.)

In short, the Care2 petition misunderstands the technical capabilities of the research project when it rails against Facebook by saying, “If they choose to save them as they claim their policies enable them to, it could mean that every key stroke entered at Facebook could be sent to a government agency.” That’s not what this program can do, although Facebook did not respond when I reached out and asked whether it could create a different program that did measure content… which isn’t very reassuring.

Facebook’s privacy policies have created a deep reservoir of bad faith among users. In this specific case, the company’s behavior isn’t anything to get worked up about, since it isn’t looking at content users assumed was private. But Facebook’s habit of downplaying legitimate privacy concerns makes it hard to give the company the benefit of the doubt. And while the anger isn’t warranted here, it’s smart to be skeptical of any company that profits off the information you provide it. If Facebook were doing what the petition claims, there would be real cause for concern.

It might be hard to swallow, but Facebook users need to acknowledge that they’re handing information to a company whose value rests almost entirely on being a vast aggregator of personal data. That doesn’t mean Facebook is evil and must be stopped; it means we need to adjust our thinking about the company’s overall attitude toward privacy. It’s appropriate to protest Facebook’s behavior and to use tools like petitions to push the company toward more transparent, user-friendly policies. But protests are more effective when they’re reserved for actual bad behavior. Here, the petitioners assumed the worst without checking the details, and protestations like this undermine the broader effort to get the company to enact better policies, because they make people who worry about privacy look like hand-wringers who can’t read.


H/T Mashable | photo credit: Flickr/Joe Lodge

 