The recent abrupt ban of a popular pole dancing teacher and blogger has reignited questions about how Instagram and other social media platforms enforce their controversial "sexual solicitation" policies.
As a City University of London lecturer and researcher, Dr. Carolina Are has been studying how major social media platforms moderate nudity and sexuality. As an activist, she has been calling tirelessly on Instagram to reconsider its policies since November 2020, when the company first announced new terms of use further restricting sexually explicit content.
As a pole dancing teacher and blogger who has been using her Instagram account to network and spread her work for almost a decade, she has long seen her engagement metrics fluctuate due to what many refer to as “shadow bans.” Some of her Stories have been removed, but her profile had never been deleted—until July 20.
"So I've just logged onto Instagram to find that my account, @bloggeronpole, has been disabled. I do not know why and have not received insights as guideline I've violated. @InstagramComms care to explain? Might it be that I shared this petition link?" Are tweeted, linking to a Change.org petition she launched in November 2020 alongside several other activists, artists, and performers.
The petition, titled “Instagram, stop unfairly censoring nudity and sexuality,” has collected almost 120,000 signatures so far.
The next morning, her account was still gone, and the researcher shared her distress on Twitter.
“Being deleted by @instagram without a warning is such a distressing experience. In 9 years of digital labour to build a reputation & a network, it’s become my main comms channel with my community. It’s my research, it’s a huge part of my work. Now it’s gone without an explanation,” she wrote.
The last post on her profile, which is followed by over 18,000 people, was an innocuous picture with her grandma. But several of her previous posts called the platform out for its censorship of sexually explicit content, an issue that many people who publicly publish content related to sexuality—be they activists, performers, educators, or sex workers—have been raising for months.
Instagram's Terms of Use change is part of a larger trail of decisions by mainstream platforms—Reddit, Craigslist, and Tumblr as early as 2018, TikTok in late 2020—that are linked to twin laws passed by the U.S. Congress in 2018: the Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act, together referred to as SESTA/FOSTA.
Lawmakers billed the laws as a way to combat online sex trafficking, but they have been widely criticized since they passed. SESTA/FOSTA amended Section 230 of the Communications Decency Act so that the liability immunity it affords all websites for user-generated content can be taken away if sites "knowingly" allow sex trafficking.
As a consequence, social media platforms introduced a number of policies that have had far-reaching effects. The policies have alienated not only sex workers but also people who publish sex education content or pursue passions deemed too sexually explicit, such as pole dancing or striptease.
The consequences for sex workers in particular have been dire: a 2020 report by the sex workers' rights tech collective Hacking//Hustling shows these policies have severely increased economic instability among sex workers and driven them toward less safe options.
As Vox reported in 2019, SESTA/FOSTA forced numerous sites that sex workers used for advertising to close down, along with message boards that sex workers used to talk about and screen clients. With safe online spaces gone, the law led to sex workers becoming houseless and being killed, according to Vox.
One of the measures Instagram adopted because of SESTA/FOSTA is its sexual solicitation policy, which prohibits any content "that is implicitly or indirectly offering or asking for sexual solicitation."
As Are explained in one of her more recent posts, all anyone currently has to do on Instagram to get flagged for sexual solicitation is post a "suggestive element" accompanied by a "method of contact," such as a link to a subscription-based website or a sentence like "slide in my DMs."
“This policy disproportionately affects sex workers, who are working online during the pandemic, but also creators posting nudity and sexuality, which may be mistaken for soliciting,” she wrote in the post.
Are suspects Instagram's approach to solicitation is also what got her account banned in the first place.
“I am heavily inspired by sex workers as a pole dance instructor and activist, since strippers popularized pole dancing and are often leading the charge in anti-censorship activism, but since I’m not a sex worker by trade it can’t be said I solicit through my account. I share links to online classes, to my blog, my writing, or my research,” she told the Daily Dot.
Are added: “It’s very weird that the account got deleted exactly when I started sharing the link to the petition again. It looks like it was an algorithmic decision, since all the people I’ve been in touch with at Instagram and Facebook can’t tell me what happened, but in case the issue was that the petition link got interpreted as a solicitation attempt, it would be extremely problematic.”
Her Instagram profile was reinstated a day later, and Instagram told Are it had been disabled in error. The whole experience, though, shone a light on the platform's double standards when it comes to handling deleted profiles.
Instagram did not respond to a request for comment by the Daily Dot.
“I got my account back because 1) of your help 2) because I emailed IG press in London and FB policy in the US 3) because of a Twitter shitstorm 4) because of media and academic contacts. This is important because despite all these contacts, I still got deleted. This means no one is safe,” Are wrote on a new Instagram post.
Citing the cases of two sex worker activists, Gemma Rose and Akynos, whose profiles got deleted in the past few weeks, she noted that her case “shows you how freak cases—aka an online moderation expert like me being censored—are trickier for the platform’s image than the everyday deletions of sex workers, activists, other users. This is worrying and unfair. And don’t get me started on platforms ruling over people’s livelihoods.”