
How police persecute the LGBTQ community—and how big tech can help its most vulnerable users

‘They’re forcing their way to them manually, very easily.’


Viola Stefanello


The Tunisian policemen had been tipped off by a local informer, who had noticed something “unusual,” “abnormal”: three men, “effeminate looking and dressing,” often visited a café and appeared to be offering sex for money, the source told them.


The officers headed for the venue, where they found one of the men. They led him straight to the police station and searched his phone: Facebook Messenger, Tinder, all of his pictures. The only things they could find were a couple of tame messages he had sent to men on Messenger: one read “I like you.” Another read “You are charming.” He was sentenced to six months in jail under Article 230 of the penal code, which criminalizes sodomy and homosexuality.

It was 2019. Today, LGBTQ+ people still face jail time for their orientation or gender identity in over 70 countries, mostly due to laws that criminalize “sodomy,” “buggery,” “unnatural offenses,” or “cross-dressing”. And in some of those countries, technology is proving extremely useful to law enforcement seeking to persecute people for their queer identities.

In a new, one-of-a-kind study, digital rights researcher Afsaneh Rigot looked into how police have been relying on digital technology to find proof of suspects’ queerness in order to arrest, prosecute, or harass members of the LGBTQ+ community in Egypt, Lebanon, and Tunisia. 


The three countries—which Rigot chose because of her personal understanding of their legal system and society—have varying levels of tolerance for LGBTQ+ people, but they all have laws against “indecency,” “debauchery,” or “sexual acts against nature” that are broadly interpreted by law enforcement to criminalize queerness and gender nonconformity. 

In practice, this means queer people have to watch their every step, fearing that their neighbors could report them to the authorities, that they could be stopped in the street for looking too effeminate, or that the person they’re talking to on a dating app could be an undercover cop trying to entrap them, and that they could end up in jail for the crime of being themselves.

Titled Digital Crime Scenes, the study paints a frightening picture of how apparently innocuous digital footprints, like taking selfies or using dating apps, play an increasingly important role in the police’s search for proof of “deviant” behavior. And it raises questions about what tech companies can do to minimize risks for their most marginalized users.

While most attention has been paid to government requests for personal data and to technical solutions, such as end-to-end encryption, that can keep users safe from authorities, Rigot’s study shines a light on the fact that law enforcement agents often access digital information in a much more direct way. Her report suggests that, at least in the countries she examined, police search digital devices as part of traditional policing methods, such as street-level stop-and-frisk operations.


For instance, Rigot tells the story of Ahmed and Sameer, a couple arrested on suspicion of homosexuality in 2015 while grocery shopping in the suburbs of Beirut. The police confiscated and searched their phones, looking for evidence of “immoral behavior”: they found nudes and photos of the two kissing. The couple was convicted under a law prohibiting “sexual acts against nature” in 2016, and the sentence was confirmed in 2019. They had to pay a large fine but avoided the jail time that, under a colonial-era article of the Lebanese penal code, they could have been sentenced to.

The study, published with support from Harvard Law School’s Cyberlaw Clinic, the Berkman Klein Center for Internet and Society at Harvard, and the data rights organization Article 19, analyzes 29 similar cases that occurred in the three countries between 2011 and 2020. It is also informed by dozens of interviews with victims, local lawyers, and advocates. Although individual cases have at times made the news or been picked up by international human rights organizations, the study is the first to examine the phenomenon systematically.

“We really need to pay attention to how we talk about sophisticated surveillance technologies and mass surveillance. Manual methods of policing and other least resource-intensive methods are the most useful for the police to target marginalized groups,” the author told the Daily Dot. “In a lot of these cases, they’re not hacking the apps to access the data: they’re forcing their way to them manually, very easily.”

Entrapment—a technique that has historically been used on a large scale to identify queer people—has also been made much easier thanks to the spread of apps that cater to the LGBTQ+ community, especially in Egypt. Rigot tells the story of Adel, a gay man from Egypt who, in 2020, was meant to meet up with someone he had been chatting with on Grindr, only to be arrested for debauchery, cybercrimes, and telecommunication crimes. Evidence from his phone, like incriminating photos and chats, was also used to make the prosecution’s case. 


According to Bedayaa, an Egyptian NGO fighting for LGBTQ+ rights, entrapment has overtaken street arrests as the main way law enforcement persecutes the community. Yet a victim quoted in the report underlines that physical surveillance—which quickly turns into a digital invasion of privacy—still happens: “They target gay hot spots, like places where people usually meet and hang out. They just randomly arrest people based on their looks and then they search their phones and if they find anything on the phone they use it as evidence to build the case further.”

As a result, queer users in these countries—and in others where LGBTQ+ identities are criminalized—have often fine-tuned a series of protection mechanisms: Some may use fake names on dating apps, avoid sexting, and routinely delete apps like Grindr or even WhatsApp from their phones before leaving the house. According to the report, WhatsApp screenshots in particular were used in almost 30 cases to try to prove a person’s queerness.

“Some of these platforms are really vital modes of communication, and connection and community: not having them is not an option, because other people really rely on them,” Rigot told the Daily Dot. “But it becomes a double-edged sword, because this sort of homophobic policing will go to the place where these communities are.”

With countries like Lebanon, Saudi Arabia, Indonesia, Turkey, Qatar, and the United Arab Emirates banning Grindr, and others closing down digital queer communities and scanning social media to identify LGBTQ+ advocates, the internet is turning into yet another place where it’s dangerous for people to be their authentic selves. Yet the web is notoriously central to many queer people’s understanding of themselves, as well as to the creation of vast communities where they can feel free to voice their concerns, share their emotions and fears, celebrate their accomplishments, and find companionship in an accepting, safe space.


“The more you try and push people on the ground, the more people become savvy and creative to find a way to continue to exist,” Rigot says. “And that’s why some of the companies become really important. Because if you’re providing a space to all of these individuals who are trusting your products and tools to exist on them, you have a responsibility towards them to provide the safety tools.”

The lawyers she interviewed recommended several features tech companies could incorporate into their products to make them safer for vulnerable users: ephemeral or disappearing messages (as seen on Signal, Telegram, Instagram, Wire, and more recently WhatsApp); personalized PIN codes to unlock the app; a warning when somebody takes a screenshot of a chat; and an end to the current practice of automatically saving sent and received photos to the phone’s gallery. Greater acceptance of anonymity and a “panic button” to immediately delete all data in high-stress situations, such as checkpoint interrogations and other forced access to the device, were also suggested; a rough sketch of such a panic feature follows below.
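To make that last idea concrete, here is a minimal sketch, in Kotlin, of what a “panic wipe” handler might look like inside an Android messaging app. Every class, method, and preference name is hypothetical and invented for illustration; none of it comes from Rigot’s report or from any existing product.

```kotlin
import android.content.Context

// Hypothetical example: a helper the app could invoke when the user triggers a
// hidden panic gesture, deleting locally stored chats, media, and credentials.
class PanicWipe(private val context: Context) {

    fun wipeLocalData() {
        // Delete files and cached data the app has written to local storage,
        // including any media it saved automatically.
        context.filesDir.deleteRecursively()
        context.cacheDir.deleteRecursively()

        // Clear stored preferences such as session tokens and settings
        // ("app_prefs" is an assumed preferences file name).
        context.getSharedPreferences("app_prefs", Context.MODE_PRIVATE)
            .edit()
            .clear()
            .apply()
    }
}
```

A local wipe like this only covers data on the device itself; messages held on a provider’s servers or in cloud backups would have to be handled separately.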

Some companies are taking notice: Grindr, for example, provides features such as screenshot blocking and a more discreet version of the app, built precisely for users in high-risk countries.

“If you’re designing for those most at risk, you’re keeping everyone safe and you’re coming up with really innovative, necessary ICT changes,” Rigot says, stressing the importance of “designing from the margins.” “You can’t just create something in Silicon Valley, and then expand internationally without understanding the global context and the necessities of people on the ground. Building features from the standpoint of those most at risk isn’t an abstract concept. More companies should do it.”

