Having your content, or worse, your entire profile removed from a social media platform without explanation or recourse is an alienating experience that a growing number of people are going through. To reflect what it can mean for people’s communities, mental health, and even livelihoods, London-based creative agency RANKIN launched a project meant to re-platform hundreds of people whose content had been removed from spaces like Instagram, TikTok, Facebook, YouTube, and Twitter.
Called The UNSEEN, it tells the stories of over 350 people who have been subjected to unfair or unequal moderation, either for publishing content that doesn’t violate platform community guidelines or because the guidelines were applied inequitably.
A few months back, while working on what was supposed to be a two-page special for Hunger magazine, the RANKIN team put out a call on its social media profiles asking people to share their experiences of being unfairly deplatformed. Content that was taken down because it contained scientifically inaccurate information, porn, hate speech, organized crime, active self-injury, the promotion of eating disorders, or graphic violence was explicitly excluded from the project.
“We got hundreds and hundreds of entries, hundreds and hundreds of words from people sharing how intensely this experience had affected them, their livelihoods, and their mental health,” Opal Turner, who coordinated the project alongside Luke Lasenby, told the Daily Dot. The two-page article became a massive multimedia project: thirteen of the participants had professional portraits taken by John Rankin Waddell, the celebrity photographer and founder of the RANKIN agency, and the photos were first displayed alongside hundreds of censored photos, videos, and other content at Quantus Gallery in London.
As was to be expected, many of the experiences collected in the RANKIN archive relate to the censorship of female and gender non-conforming bodies. There’s Jude Guaitamacchi, whose artistic photoshoot meant to display “trans joy” on Transgender Day of Visibility was removed by Instagram for allegedly violating the company’s rules on sexual content despite not displaying full nudity, and was only reinstated after a long fight.
There’s artist Bri Cirel, whose work often depicts naked bodies; her profile was deleted from Facebook, and she says she was later repeatedly shadowbanned on Instagram as a result, even though community standards allow nudity in photos of art.
There’s photographer Amy Woodward, whose work showing the highs and lows of maternity has been removed from Instagram for “adult sexual solicitation, nudity, sexual activity,” leaving her feeling as if “motherhood needs to stay behind closed doors,” as if “it’s too much, offensive, somehow violating.” And there’s pole-dance instructor and academic researcher Carolina Are, who was already studying content removals related to sexual content before her own account was banned from Instagram.
“Celebrity nudity, or nudity posted by brands like Playboy, is accepted, while the same nudity or even less nudity posted by sex workers or people who are unverified or aren’t famous in general is coded as immediately dangerous,” she told the Daily Dot.
“It’s happened to me so many times that posts that are sensual and show a bit of nudity but are not straight up sexual would be censored, and I think it’s quite worrying that a lot of non-sexual nudity is just seen as sexual by default, too,” she added. “Whether you’re performing a sex act or not, women just existing in their body are coded as sexual by platforms, which is terribly worrying. We are objectified just by existing, even when we’re trying to use our voice and our bodies to express ourselves. And this is striking, because it turns the platforms into arbiters of taste, of what is allowed.”
Are welcomed RANKIN’s idea of replatforming censored posts as a way of acknowledging all the work and time that goes into creating content and building communities, which can now be destroyed by an algorithm’s split-second decision. The archive includes both specific posts that were removed and examples of content that had been published on now-banned profiles.
“As a creator who is often censored, it makes you feel seen for the posts that you’ve spent ages creating to appear somewhere: it signals that what you did and the time you spent mattered, it celebrates your work,” Are said. “At the same time, historically, it’s quite worrying that platforms can get rid of content so easily, because posts are memories, too. They’re a log of your journey. So having these posts archived somewhere is a testament to your history and your journey, as well as a kind of reward for your work.”
Social media platforms are now too essential to our daily lives for being kicked off or deplatformed to count as a minor nuisance. Artists, photographers, small business owners, and influencers depend on them to reach audiences, interact with their communities, redirect people to their shops, and build professional networks. Even people who don’t depend on them for work need them to stay in contact with friends and family. Bans are no longer just bans: they can have deep consequences for people’s income, reputation, social lives, and mental health.
The RANKIN team doesn’t see the exhibition and the online archive as the final steps of The UNSEEN project. Their next goal, they say, is to keep collecting stories in order to showcase as diverse and inclusive a range of issues as possible, while also partnering with advocacy groups to promote equality in digital spaces.
“Those that are affected by these policies need to be involved in setting them, it’s this simple,” Opal Turner said. “Part of the problem is that content moderation is so obscure and opaque that the process to get your content or your account back after it was removed is currently a matter of who you know working at the platforms—and that’s obviously fundamentally inequitable.”
Users whose content or accounts have been deleted by mistake have very few options to reverse the situation. One of the fastest ways is to get in touch with someone working at the platform, but many people don’t have that kind of access, and they can remain unjustly banned indefinitely.
Aside from making this process less opaque, Turner and Lasenby also suggest that platforms should move beyond simply removing content when dealing with posts that some users might consider inappropriate, such as sexually explicit content.
“Platforms like TikTok and Instagram have already built protection mechanisms to protect you from sensitive content you might not want to see,” Turner explained. “They just don’t use them as much.”
And by pivoting away from a punitive approach to policing content, platforms could instead foster the exact kind of safe communities they claim to want to host.