Two years ago, “Tasha,” a social media star with over one million followers, found an X-rated photo of herself on the internet—only, the photo wasn’t actually of her. Tasha’s face had been crudely edited onto a pornographic image. Her head sat atop the body of a naked woman with her legs spread-eagled.
“I just kind of looked at it with disgust and said ‘grosss,’” she said in an email. “Since then, I haven’t searched into it any further or seen any other pictures.”
But many more exist of Tasha, who requested anonymity to avoid encouraging people to look up the images. In just one thread on the website FakeTheBitch.com, which is dedicated to these kinds of Photoshopped “fakes,” she’s shown holding two dicks while ejaculate drips down her bare chest. Another photo renders her in a threesome with two men, one of whom appears to be ejaculating into her mouth. There are others featuring a gangbang, a glass dildo, and a facial.
Tasha is just one of countless women who, whether they know it or not, have ended up within the thriving subculture of pornographic fakes. Across more than a dozen websites, men with varying skill levels edit women’s faces onto explicit photos and gifs. Many are hack jobs that look like horny adolescent collages, but others are so strikingly realistic as to be virtually indistinguishable from the real thing. The targets range from everyday women to minor internet stars to major celebrities.
The news has been chock-full in recent years of stories about the harms of so-called “revenge porn”—a somewhat misleading term applied to all sorts of nonconsensual pornography, regardless of whether it’s a video posted by a spiteful ex or an image leaked in a celebrity photo hack. We’ve heard all about the humiliation that victims experience when the most private of moments are made public, and seen several attempts to legislate against such trespasses. But fake revenge porn exists in a much trickier, much grayer area—both legally and morally.
Message boards like Fake The Bitch allow users to make personalized requests for fakes. Requesters supply photos of their step-moms, sisters, neighbors, teachers, and classmates, along with messages like, “Fake this busty slut” and “Please help make my dreams cum true.” Or they might embed a favorite photo of a celebrity or a few Instagram shots of a particular YouTube star. Often, a resident photo expert will comply with an image of someone’s stepsister engaged in double-penetration or a gif of “Game of Thrones’” Maisie Williams receiving an epic facial, and onlookers will join in with virtual high-fives.
When they aren’t sharing images, photo experts dispense photo-editing tips and provide tutorials on things like realistically covering women’s faces with fake cum and something called “bubbling.” The latter is a technique where a woman—often in a bikini—is partially obscured so that the bikini-covered bits are hidden, but her arms, legs, face, and stomach remain visible, allowing the viewer to imagine that she’s naked underneath. (It’s a lot easier to understand if you just click this SFW link.)
Outside the message board realm are sites typically dedicated to celebrities or a particular star. Here we see things like fake, Photoshopped images of Ariana Grande giving a hand job, Jennifer Lawrence masturbating, and Emma Watson being, well, face-fucked. In the sub-genre of “fuxtaposition,” videos combine a close-up clip of a celebrity with footage of a pornographic doppelgänger—so one might show Margot Robbie walking through a bathroom, followed by a blonde porn star performing oral sex in a similarly styled loo.
In this realm, it isn’t just women who are targeted, either. There are plenty of examples of male stars being Photoshopped—say, into a photo of Zayn Malik eating Justin Bieber’s ass or a gif of Chris Evans giving head—although men typically aren’t fuxtaposed or bubbled.
When it comes to digitally manipulated images of non-famous women, they often exist solely within forums like Fake The Bitch, where they are typically attached to first names or broad identifiers like “my stepsister.” The women in them will likely never even find out about them—nor will anyone in their lives. (I was, however, able to identify one woman—rendered into a fake image in which she’s having sex with her dad—just via the minor details in her social media photos that the requester provided. The woman, a student attending a Christian college, did not respond to a request for comment.)
On rare occasions, though, these photos appear outside the virtual dens of fake pornographers, popping up everywhere from so-called “revenge porn” sites to Tumblr, and can be attached to women’s full names and social media profiles. Take the case of a Reddit user who discovered that fake porn of her had been posted on revenge porn site MyEx.com, as well as on Tumblr and in a Backpage sex ad. “They took real pictures of me from my Facebook and Instagram and slapped them next to random nude pictures where the girls have no faces,” she wrote. “Isn’t this identity theft? Extortion? Slander? Destruction of character? Sexual harrassment? (sic) Should I contact the police? A lawyer? What kind?”
In recent years, thirty-four states have instituted laws against revenge porn, and a bill was introduced this summer that would make nonconsensual pornography a federal crime. But these laws are based on the protection of what Mary Anne Franks, a professor at the University of Miami School of Law, calls “truthful private information.” She explained, “By definition, manipulated images are not true, so a private-information approach is not a good fit. The harm done to the victim of manipulated sexual imagery derives from its falsity, not its truth.”
For that reason, she says laws around defamation would be a better approach to targeting these images. Depending on the circumstances, she says “the conduct could also be actionable as harassment or as a form of intentional infliction of emotional distress.” But both of those approaches, she says, “sit a bit uneasily with First Amendment doctrine.” That is especially true if the person in the image is a public figure.
“While the cases you describe are horrific, there are more ambiguous scenarios: the Kanye West video that displays realistic nude replicas of famous people, the naked Donald Trump statues that appeared in various cities across the U.S.,” she said. “Photo manipulation is often used for artistic, political, and educational purposes, and as such any law regulating it runs the risk of chilling expression protected by the First Amendment.” It’s worth noting that digitally simulated child pornography that is indistinguishable from the real thing—for example, realistically editing a child’s face onto a pornographic image—is illegal in the U.S.
Even absent these legal issues, there is the simple fact that pursuing legal action is time-consuming and expensive. That’s not to mention that trying to remove offensive things from the Web is like keeping frogs in a bucket. “I absolutely think it is disgusting and should be stopped,” said Tasha, “but I do not have the interest, time, or resources to fight all the internet’s perverts.”
It’s also the case that not everyone who finds themselves the subject of a pornographic fake is disturbed by it. “I’m not too surprised this stuff is out there and it doesn’t affect me much,” one social media star who has been faked told me. “I don’t think I’m sensitive to this stuff.” Another said, “To be quite honest, people [know] that I wouldn’t do that. Which is good enough for me.” Most of the women I contacted did not respond at all.
Even Tasha, although she finds these images “disturbing and wrong,” is able to sympathize with why someone might make a fake. “Simply put, many people find it easier or more comfortable to be sexually aroused by someone they feel like they ‘know,’” she said, noting that her followers get a glimpse of her life on a daily basis. “This makes them feel like, on some level, they know me.”
David Ley, a clinical psychologist specializing in sexuality and author of “Ethical Porn for Dicks: A Man’s Guide to Responsible Viewing Pleasure,” sees the fake porn phenomenon as a form of fantasy fulfillment. “It’s a creative visual version of the same drive that leads people (mostly females) to write slash fan fiction about sexual relationships between characters from favorite television or book series,” he said in an email. “Females are interested in the relational aspects and express it in written word. Males are more stimulated by the visual medium and create cobbled together porn pics.” Research has indeed backed up the notion that male arousal is generally more visually driven, although researchers have raised questions about how this might be influenced by socialization.
There are also much more innocuous applications of fake porn: Ley has seen cases where a man commissions a fake of his wife so he “can experience the sexual fantasy of seeing his wife with another man, even when the wife would be unwilling to fulfill the fantasy in real life.”
It’s the aspect of fantasy fulfillment—not necessarily the inauthenticity—that makes fakes generally less problematic than revenge porn. They tend to lack the malice of revenge pornographers, who make every effort to spread their content and link it to the woman starring in it. The intent of fake pornographers is often to get off, not to name, shame, and destroy. While there are certainly revenge applications, which should be considered separately, as well as rampant misogynistic language used on fake porn message boards, these images largely exist in a world of masturbatory make-believe. It’s the spank bank made virtually real.
That isn’t to deny that fakes are disturbing in their sexualizing of unwilling participants, but the truth is that fantasies are often—in reality, if not in their imagined scenarios—nonconsensual. Few people give permission to be masturbatory material. And, even absent the issue of consent, the particulars of personal fantasies tend to be disturbing to anyone other than the person doing the fantasizing. It’s just that technology has given us the ability to artificially manifest those fantasies, and then share them with the entire world.
Of course, once they are made real online, the original intent doesn’t necessarily matter. The image can be shared or commented on in any number of ways, and it could prove just as harmful as a photo shared with malice. That’s the other side of the world technology has created: our likenesses are not always our own.
This story originally appeared on Vocativ and has been republished with permission.