Facebook has confirmed that it scans your Messenger threads for links and photos that violate its policies, Bloomberg reported Wednesday. Automated systems also review conversations that get flagged and take down any content that breaks the rules.
“On Messenger, when you send a photo, our automated systems scan it using photo matching technology to detect known child exploitation imagery or when you send a link, we scan it for malware or viruses,” a Facebook spokesperson said. “Facebook designed these automated tools so we can rapidly stop abusive behavior on our platform.”
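Facebook has not published the details of these systems, but photo-matching tools of this kind generally work by computing a fingerprint of each image and checking it against a database of fingerprints of known violating material. The Python sketch below is purely illustrative and assumes a simple exact-hash lookup; the hash set and function names are hypothetical, and production systems rely on perceptual hashing (such as Microsoft's PhotoDNA) that tolerates resizing and re-encoding.

```python
# Illustrative sketch only: compares an exact cryptographic hash of an image file
# against a set of known-bad hashes. Real systems use perceptual hashes that
# survive resizing and re-encoding; this simplification is for clarity.
import hashlib

# Hypothetical database of hashes of known violating images.
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_fingerprint(path: str) -> str:
    """Return the SHA-256 hex digest of the file's bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_violation(path: str) -> bool:
    """Flag the file if its fingerprint matches a known-bad hash."""
    return image_fingerprint(path) in KNOWN_BAD_HASHES
```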
The company was pressed into disclosing the practice after an interview between Mark Zuckerberg and Vox’s Ezra Klein raised questions about how Facebook monitors Messenger without intruding on users’ conversations. During the interview, Zuckerberg told a story about how he learned people in Myanmar were spreading sensational propaganda on Messenger to incite violence. The company blocked those messages from going through.
“In that case, our systems detect what’s going on,” Zuckerberg said. “We stop those messages from going through.”
For years, Facebook has used algorithms to scan what people post on their walls. If it discovers images, words, or malicious links that violate its Community Standards, it removes the content and may punish the offender by suspending their account. While users feared the same level of scrutiny would be applied to their private messages, Facebook never divulged the extent to which it searches them.
Perhaps spurred by its growing effort to be transparent about user data, Facebook opened up to Bloomberg, admitting that while messages are private, it applies the same tactics to Messenger that it employs on its main site.
The company later emphasized to SlashGear that the review process is not done by humans.
The revelation couldn’t come at a worse time for Facebook, which finds itself in the midst of a privacy crisis that won’t go away anytime soon. In March, it was revealed that political data firm Cambridge Analytica exploited the data of up to 87 million users. In the short time since, Facebook’s reputation has taken a beating, its market value has tumbled, and Mark Zuckerberg is scheduled to testify before Congress.
The company has scrambled to change how it handles the endless mounds of information it collects. Earlier Wednesday, Facebook rewrote its Terms of Service and Data Use Policy to more clearly define how it uses personal data.
If you’re worried about Facebook reading your Messenger conversations, you have some options short of deleting the app entirely. The easiest is to use Messenger’s built-in end-to-end encryption, but you have to turn it on manually. To do so, go into any conversation, tap the “i” in the top right corner, and select “Go to Secret Conversation.” Unfortunately, you’ll have to do this for each individual thread, and we couldn’t find an option for group chats.
Alternatively, you can use messaging apps designed for privacy, like Signal and WhatsApp, which come with end-to-end encryption enabled by default. But, of course, the only foolproof way to protect yourself from Facebook is to delete your account entirely.
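For readers curious what end-to-end encryption buys you in practice: messages are encrypted on the sender’s device and can only be decrypted on the recipient’s device, so the server in the middle only ever relays ciphertext it cannot read. The sketch below is a minimal illustration of that property using the third-party PyNaCl library; it is not the Signal Protocol that Signal and WhatsApp actually use, and the names and message are made up.

```python
# Minimal illustration of the end-to-end encryption property with PyNaCl
# (pip install pynacl). A relay server that holds neither private key
# cannot decrypt the message it passes along.
from nacl.public import PrivateKey, Box

# Each participant generates a key pair on their own device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The server only ever sees `ciphertext`, which it cannot read.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at noon"
```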