To close out 2014, Facebook unveiled its Year In Review app, which gathered users’ photos from the past year into a celebratory slideshow. While innocuous enough on its face, the app also stood to resurface 2014’s more painful moments, shoehorning an ex-boyfriend or a lost loved one into decorative clip art better suited to beach and party photos.
This is what happened to Web designer Eric Meyer, who, despite never having installed the app, was shown an involuntary preview that used a photo of his recently deceased daughter to lure him into it. In a blog post entitled “Inadvertent Algorithmic Cruelty,” Meyer notes that the inclusion was a mistake rather than malice, urging Facebook to practice “empathetic design.” There was no way for Facebook or the app to know it was dredging up a painful memory for Meyer; software has no reliable way to distinguish the sorrow a photo can stir from the joy.
Which is one of the central problems of a service like Facebook. You’re urged to document your entire life, for better or worse, on a site that has no ability to distinguish between the two extremes. The compassion that the death of a child deserves is not within the computational reach of an app meant to act like an automated photo album. By uploading the totality of human existence to automated services like Facebook, we’re exhibiting the same machine-like lack of depth.
How we mourn on social media has been the topic of discussion since the days of MyDeathSpace.com, which published and highlighted the MySpace profiles of the recently deceased. Memorials on your Facebook feed are as sadly typical as memorials along a highway. As Lia Zneimer wrote for Time last May, “Mourning online allows us to stake our claim on the effect of a tragedy—even one that doesn’t have a direct impact on us.”
Social media, and Facebook in particular, is ripe for the mishandling of death and loss. Even when accidental, as in Meyer’s case, it highlights the cold, mechanical way social media handles all the information we entrust to it. The “Like” button, for example, is all but useless for expressing anything other than approval. While Facebook did toy with the idea of a “Sympathize” button, Mark Zuckerberg himself has defended the clunkiness of the site’s single-option voting system.
The inclusion of a “Sympathize” button could have helped the Year In Review app—perhaps spawning a twin “In Memoriam” app—but it would be a faulty adapter between Facebook’s algorithm and human grief. As the “Like” button itself can attest, the intent of a technology matters little. It could become a joke, a sarcastic jab, or even a hurtful slur. Suddenly a post about your social life being “over” because of your work is grouped in with photos of your dead loved ones.
The Year In Review app could have avoided Meyer’s situation fairly simply, but larger design problems remain. In his post, Meyer notes that the app used his photo in a preview without even asking him; making the feature opt-in would have been an obvious fix. The bigger issue, however, is how the app selected photos. As the app’s product manager, Jonathan Gheller, told the Washington Post, the heaviest factor in a photo’s inclusion in the slideshow is its sheer number of interactions (likes, comments, etc.).
Any heavy user of Facebook can see where that would cause problems. A sonogram of your new baby could earn as many likes and comments as a photo of your dead grandmother or even a rant about a recent divorce. The Year In Review app was merely assembling a collage of each user’s most popular photos, never accounting for how sensitive those photos might be.
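The selection logic Gheller described can be sketched in a few lines. This is a hypothetical simplification, not Facebook’s actual code; the field names, captions, and numbers are all invented for illustration. The point is that an interaction count carries no information about emotional tone, so a memorial photo with many condolence comments outranks a vacation snapshot:

```python
# A rough sketch of engagement-based ranking as the article describes it.
# All data and names here are hypothetical, not Facebook's implementation.

photos = [
    {"caption": "Day at the lake!",       "likes": 40,  "comments": 5},
    {"caption": "Rest in peace, Grandma", "likes": 120, "comments": 85},
    {"caption": "New apartment",          "likes": 30,  "comments": 10},
]

def engagement(photo):
    # "Sheer number of interactions": likes plus comments, nothing else.
    return photo["likes"] + photo["comments"]

# Rank photos purely by popularity, highest first.
year_in_review = sorted(photos, key=engagement, reverse=True)

print(year_in_review[0]["caption"])  # prints "Rest in peace, Grandma"
```

Because the only signal is volume of attention, the saddest photo of the year, the one people rallied around in condolence, is exactly the one the algorithm promotes to the top of the slideshow.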
And how could it? Facial recognition technology, an area of intense interest for Facebook, is pointless here. In the photo the app grabbed, Meyer’s daughter looks perfectly happy. That doesn’t mean it’s appropriate to frame her with clip-art illustrations of a party. The app would genuinely need to learn and understand Meyer’s thoughts and feelings about each of his photos.
But what if it could parse what others were saying about individual images? Facebook drew heavy criticism for an experiment in which it introduced more “negative” content into some users’ automated News Feeds to see how they would react. In the study, Facebook analyzed the text of posts, showed certain users more pessimistic posts, or simply posts about sadder things, and found that those users then produced more negative content in response.
The potential implications of that study here are considerable. While human nature consists of far more than binary classifications of negative and positive, it’s a start, one that could lead to genuinely parsing users’ posts in a way the algorithm can act on. This presents the same problem as the theoretical “Sympathize” button, however. People use phrases like “RIP” and “sorry for your loss” both sarcastically and sincerely. Users are as likely to upload a photo of a shattered iPhone screen and express sorrow over it as they are over their grandmother, but the two should be kept nowhere near each other.
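That ambiguity is easy to demonstrate. Below is a minimal sketch of the kind of keyword-based negativity flagging the article describes; the phrase list and function name are assumptions for illustration, not any real Facebook system. A naive text classifier flags a genuine bereavement and a cracked phone screen identically:

```python
# A deliberately naive "grief detector" to show why keyword matching fails.
# The phrase list and logic are hypothetical, invented for this sketch.

NEGATIVE_PHRASES = {"rip", "sorry for your loss", "miss you"}

def looks_like_grief(post: str) -> bool:
    # Flag any post containing a "mourning" phrase, with no sense of context.
    text = post.lower()
    return any(phrase in text for phrase in NEGATIVE_PHRASES)

print(looks_like_grief("RIP Grandma, we miss you every day"))  # True
print(looks_like_grief("RIP my iPhone screen :("))             # True, same flag
```

Both posts get the same label, so any app built on this signal would group a joke about a broken phone with a genuine memorial, exactly the failure mode the “Sympathize” button would share.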
But even the most advanced text-mining technology might find itself doing just that. The technique might work to, say, analyze Yelp reviews to predict the success or failure of restaurants, but Facebook wants to manage our entire lives. Facebook wants you to use nothing but Facebook, all of the time, feeding the beast with your memories, your jokes, and your losses. It’s their business model. That’s a far more realized, and far more complex, totality of the human experience than your thoughts on a taco truck.
As Meyer’s case highlights, it’s a job Facebook often doesn’t seem entirely ready for. Technology changes, and social media grows more intelligent by the day. But it also stands to mold us in its image, altering the way we mourn instead of adjusting to it. As selfies are taken at deathbeds and the death of a beloved comedian becomes an opportunity to rack up likes and followers, it’s more important than ever to strengthen the humanity in our websites rather than let the ghostly machinations of algorithms and the temptations of popularity strip us of it.
Photo via Gadjou/Flickr (CC BY-SA 2.0)