
TikTokers report losing 90% of their views after trolls flag their videos

Is TikTok’s haphazard moderation system harming creators long-term?

Elisa Shoenberger

A woman looks out the window with a phone in her hands.
rawpixel (CC-BY)

TikTok creator @Zevulous (Zev) is waiting. It’s been over 66 days since he first posted a video asking TikTok to release its content moderation guidelines and explain why creators have recently been having their videos taken down.


It all started when Zev began posting videos to help people recognize Nazi and white supremacist dog whistles in the wake of the Jan. 6 Capitol riot. His friends felt the platform had a white supremacy problem, and he wanted to help TikTokers recognize symbols that would signal which creators to watch out for. But his video about the “flag of Kekistan,” an alt-right symbol, was taken down. At first he thought it was a fluke, but then he was tagged in more videos by the user he had initially highlighted, who was still brandishing the flag. That user’s content hadn’t been taken down.

So he decided to post a video every day asking TikTok directly to release its internal moderation standards, both to give the community greater transparency and to let users educate people about white supremacy.

Zev soon found out that he was not the only one with this problem. Many creators who talked about racism, sexism, transphobia, and other issues were being censored or suppressed by the platform, while creators who advocated racism, sexism, transphobia, and even animal abuse were not similarly affected.


And TikTokers are finding out that having a video taken down can have a lasting impact on their reach.

Creators attribute a big part of the problem to a combination of weaponized reporting and automated moderation. In an interview, Patrick Loller called himself “hyperpolitical” on TikTok. He frequently talked about mental health and spoke out against both transphobia and former President Donald Trump. Occasionally his videos were taken down, but he’d appeal and they’d be reinstated, like a video discussing polling data during the 2020 election. Then he noticed his live streams were being taken down constantly and that more trolls were commenting on his account. Soon many of his videos started getting pulled, especially after he began speaking out against Super Straights, a transphobic TikTok movement.

Loller has stopped doing live streams and believes he’s close to being permanently removed from the app. He believes trolls have figured out how to weaponize TikTok’s reporting feature: if enough people report a video or live stream, it usually gets taken down. It’s a pattern seen on other social media platforms, such as YouTube and Twitter.

TikTok creators believe the site’s moderation standards exacerbate the problem. Catie Osborn, who goes by @Catieosaurous, has also been a victim of TikTok’s flagging and censorship system. She says creators speculate that it takes only a handful of reports, perhaps 8 to 12, for a live video to get shut down automatically. It’s a perfect storm of weaponized reporting and automated moderation.
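For illustration, here is a minimal Python sketch of the report-threshold behavior creators describe, assuming a flat count with no human in the loop; the function names and the threshold of 10 are invented for this example, not TikTok’s actual code.

    # Hypothetical sketch of threshold-based auto-takedown: once a live
    # stream accumulates enough reports, it is pulled automatically,
    # regardless of what is actually in the stream.

    REPORT_THRESHOLD = 10  # creators speculate the real number is 8-12

    def take_down(stream_id: str) -> None:
        print(f"Live stream {stream_id} terminated after mass reports")

    def handle_report(report_counts: dict[str, int], stream_id: str) -> bool:
        """Record one report; return True the moment the stream is cut off."""
        report_counts[stream_id] = report_counts.get(stream_id, 0) + 1
        if report_counts[stream_id] == REPORT_THRESHOLD:
            take_down(stream_id)  # automatic, content never examined
            return True
        return False

    # A dozen coordinated trolls are enough to end any stream:
    counts: dict[str, int] = {}
    for _ in range(12):
        handle_report(counts, "example-live-stream")

Under this model, nothing distinguishes a dozen genuine reports from a dozen coordinated ones, which is exactly the weakness creators say trolls exploit.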


Creators like Zev, Loller, Osborn, and Cecelia Gray are asking for greater transparency from TikTok. Gray says, “Creators have no insight into internal processes. They are constantly shifting goalposts. One day you can say Nazi, another day you can’t.” Loller noted that creators have to guess which words will cause a video to get flagged. He has found that videos containing words ranging from “depression” to “podcast” get taken down; he even went so far as to hold up the word “podcast” written on a piece of paper to get around moderators, but it didn’t work.

While TikTok has community guidelines, creators don’t find them sufficient to explain the constant removal of posts. Osborn had two live streams flagged and taken down, the first for “vulgarity” while she was talking about a lost dog, and the second for talking about the first takedown. Soon she found that most of her videos had been removed from the platform without explanation. After the hashtag #bringbackcatieosaurous gained significant traction, her account was abruptly restored, again without notification. Despite reaching out to TikTok repeatedly, Osborn never got an explanation for why her account was suppressed.

Many creators have similar stories; Loller notes he has had videos taken down for “community guidelines” violations without any further explanation of what exactly the problem was.

What little is known about TikTok’s inner workings is troubling. Leaked documents obtained by the Intercept in 2020 “instructed moderators to suppress posts created by users deemed too ugly, poor, or disabled for the platform,” as well as to censor political speech in live streams. A spokesperson for TikTok told the Intercept that those guidelines were never implemented and are no longer in use, claiming they were a failed attempt to prevent bullying.


Some creators see the problems as stemming from TikTok’s attempts to deal with bullying on the platform. Gray ascribes the response more to incompetence than malice: it’s easier to have an algorithm ban a few words than to send a human moderator to handle an issue that might require subtlety. That’s why creators who speak out against racism, sexism, and other bigotry are getting flagged, while bigots have gotten better at their dog whistles. When Gray posted a video about how her family had converted from Judaism at some point, someone commented with a Nazi dog whistle. But when she called them out for it, her video was taken down for violating community guidelines. Zev is growing more skeptical as time goes on, saying his patience is wearing thin.

But the issue goes further than stopped live streams and removed videos. Creators are noticing that their views drop significantly even after their video and live stream issues are resolved. Loller said his views went from 200,000-400,000 a day to flatlining for a month, with nothing topping 40,000; now a good video gets 50,000 views where he previously would get a couple million. Zev noticed a similar dynamic: when he first started his campaign, he was getting 30,000-50,000 views, and now he gets about 10% of that. Osborn saw a dramatic decrease in her views, which she found particularly curious after the big campaign to reinstate her account, and people have told her that her videos no longer show up in their “For You” feeds.

TikTok’s For You page delivers videos that are supposed to be individualized to each user’s interests. It’s how many people find creators’ videos and a key way to get views.

Keeping creators out of it over arbitrary moderation decisions can harm their ability to grow on the platform.


Os Keyes, Ph.D. student and researcher at the University of Washington Department of Human Centered Design and Engineering, theorizes to the Daily Dot that “Day-to-day prioritization … is presumably based on user popularity and status. One can imagine all-too-easily that being shut down counts as a black mark on one’s record, or nullifies the old data altogether, dropping someone to the bottom of the prioritization queue for appearing in other people’s feeds.”
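A toy Python model of that theory: if feed placement is ranked on a creator’s engagement history, a strike that discounts or nullifies that history would bury them even after reinstatement. The fields and the halving weight below are assumptions made for illustration, not TikTok’s algorithm.

    # Hypothetical feed-ranking penalty: each moderation strike acts as a
    # "black mark" that discounts a creator's accumulated popularity.

    from dataclasses import dataclass

    @dataclass
    class Creator:
        handle: str
        avg_daily_views: float
        strikes: int  # takedowns / guideline flags on record

    def feed_priority(c: Creator) -> float:
        # Assumed penalty: each strike halves the ranking score, so past
        # popularity is effectively nullified after a few takedowns.
        return c.avg_daily_views * (0.5 ** c.strikes)

    before = Creator("example-creator", avg_daily_views=300_000, strikes=0)
    after = Creator("example-creator", avg_daily_views=300_000, strikes=3)
    print(feed_priority(before))  # 300000.0
    print(feed_priority(after))   # 37500.0 -- roughly the sub-40,000 ceiling Loller describes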

In fact, several creators the Daily Dot talked to worried that speaking with a journalist might further tank their numbers or, worse, get them kicked off the platform and cost them the communities they’ve built. “The fear and apprehension I have about speaking out is profound. I don’t want to lose this, it is so important to me. But I would rather speak out about what is happening to smaller creators, diverse creators. It’s not right. It’s not fair,” Osborn says.

Some creators have banded together to try to figure out how to deal with these issues. In August 2020, Cecelia Gray (@ceceliaisgray) co-founded the Online Creators Association (TOCA) as a place where creators can discuss problems outside the TikTok platform. Through TOCA, creators are realizing their issues with the app are not one-offs but systemic.

With auto-moderation, creators have noticed something is off, Gray said. When a user reported a video for pedophilia, they received a message two minutes later saying the video was not in violation. The video was obviously problematic, Gray said: a man filming up a 15-year-old girl’s skirt. Two minutes is simply not enough time for a human moderator to review the issue and judge it appropriately. What might be happening is that the auto-moderation system looks for particular signals, like words and symbols, that a creator can avoid using to slip past; a human moderator would likely not make the same mistake, since they’d see what was going on and act. The group hopes to present its findings to TikTok and have the company take them seriously.
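A minimal sketch of the kind of context-blind keyword matching the group suspects, with an invented blocklist: it removes a video calling out a slur while approving an evasively spelled one, roughly the failure pattern creators report.

    # Hypothetical keyword-based auto-moderation: flags any caption
    # containing a banned string, with no sense of context or intent.

    BANNED_TERMS = {"nazi", "kekistan"}  # assumed blocklist entries

    def auto_moderate(caption: str) -> str:
        text = caption.lower()
        if any(term in text for term in BANNED_TERMS):
            return "removed"   # fires on context-blind substring matches
        return "approved"

    # An educational call-out gets removed; evasive spelling sails through:
    print(auto_moderate("How to spot Nazi dog whistles on your feed"))  # removed
    print(auto_moderate("Proud citizen of k3k1stan"))                   # approved

The sketch also shows why a decision can come back in two minutes: there is no human in the loop at all, only a string lookup.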


Back in November, TikTok appeared to be on a hiring spree for human moderators, but given how recent these incidents are, the problems seem to persist.

“I would want them to know it would be less work for them to hire some people instead of constantly having to clean up their messes,” Gray says.

