In his 1922 autobiography My Life and Work, the historic automaker Henry Ford famously quipped about the Ford Model T, "Any customer can have a car painted any color that he wants, as long as it is black." The Ford Model T, much like the first few generations of the iPod, sold immensely well but provided largely the same experience to every customer, an astonishing feat of consistency. In the century of mass production since, we've largely recreated that arrangement: Consumers get the same experience with a product no matter who they are, freed from that uniformity only by their own individuality.
Silicon Valley has predominantly (though not always successfully) sought to squash that feature of 20th-century production, offering products carefully tailored to the individuals using them. Facebook is the chief example: Its product is artfully curated for, and to a certain extent by, each user. Its algorithms adjust what you see on Facebook based upon how you use Facebook, with little effort required on your part. Pandora and Last.fm offer similar capabilities in a more straightforward manner, allowing each user to experience their products according to their own desires.
If the Ford Model T were invented in Silicon Valley, it would alter its color based on your habits, your moods, your lifestyle, and an analysis of your preferences so close the car might know you better than you do.
This aesthetic is being taken to its extreme by Affectiva, a startup specializing in software that promises to analyze your emotions through your facial expressions. By paying close attention to the movements of your lips, eyes, nose, and other facial features, Affectiva's Affdex software can immediately assess your reaction to an advertisement, a political debate, or even a horror movie.
Affectiva also promises not only to analyze your emotions but to rebuild what you're reacting to in accordance with them. If the horror movie is shocking you too much, the software might tone down the intensity. If a video game is making you too frustrated, it will note your stern eyes and reddened face and respond by making the game easier.
Spun out of MIT's Media Lab, the software analyzes over 11 billion facial tics and movements to find your true feelings about what you're seeing; it can read your heart rate from just a blush. Affdex has analyzed the emotions of 2.8 million faces in 75 countries. Using the software, Hershey's has built a display stand that will give you free samples if your response to the logo is positive, and CBS has used it to test audience enjoyment of new programming.
Unlike the comparatively clunky Facebook or Pandora, software built with Affdex technology doesn't require the user to volunteer their preferences with a click. With Affdex, users must either police how their faces react to what they're watching (a tiresome prospect) or allow the machine to alter their experience of media around their involuntary reactions. It's the ultimate goal of personalization: a distinct experience for each consumer, custom-fit to their desires and built around the user as the star.
What this spells is the end of what social theorists call "monoculture." When we each experience things, such as TV shows or news events, within our own separate cliques, very little crossover occurs. When there were only three TV channels and a handful of national papers, it was easy for everybody to experience things at the same time in the same way, creating a single mass culture.
As cable TV and print media splintered, we all gained the ability to adjust our consuming habits to our preferences. At some point in the last decade, however, the monoculture returned: While we each have our own corners of the Internet where we experience things, we often do experience them together.
Few things displayed this more than the events of this past February 26. The same day the FCC enshrined equality on the Internet, two llamas escaped their enclosure in Arizona. The image of the wily animals evading police and bounding through the streets caught social media's attention, and we all experienced it together; those who came late could easily find the footage online. Those llamas and their popularity were a feature of the monoculture that social media creates.
Later that day, another event had the opposite effect. The spread of "The Dress" forced the monoculture to analyze (and analyze and analyze) the unavoidable ways our individual experiences of the same thing can differ, no matter how communal our culture has become. It's an effect that does not blend well with forced curation, and it's one reason the free-range feel of Twitter has been winning the battle for live-event coverage against the master-controlled Facebook.
Were Affdex to walk into this minefield, it could leave each user experiencing events and products in isolation. Were a music recommendation service to use the software, it could measure your enjoyment of particular songs and adjust your playlist accordingly. If an episode of Game of Thrones is grossing you out, Affdex might swap the gruesome shots for PG-13 edits. A news aggregator might use Affdex to recommend stories based on what elicits a positive reaction, a dangerous possibility given the disastrous extent to which the Internet is already a political echo chamber.
There is a benefit, as Henry Ford might have said, to all of us owning the same car in the same color. When we do, we can all own and reflect upon the same successes and failures. A shared experience invites us to see worlds we might otherwise not, be that llamas on the run or a little-heard news story deserving of more attention. When algorithms and software like Affdex lock us up inside our own preferences, our experience online becomes isolated, to our own detriment.
If we do face the possibility of having our world rebuilt according to whatever triggers a positive reaction, we stand to have our minds mass-produced by the cold, profit-hungry logic of machines rather than shaped by the chaotic connections we make with one another.
Gillian Branstetter is a social commentator with a focus on the intersection of technology, security, and politics. Her work has appeared in the Washington Post, Business Insider, Salon, The Week, and xoJane. She attended Pennsylvania State University. Follow her on Twitter @GillBranstetter.
Photo via la_farfalla_22/Flickr (CC BY 2.0)