Last week, Facebook came under fire when it censored the famous “Napalm Girl” photograph from the Vietnam War. The photograph, which depicts a nude 9-year-old girl running and crying in the streets, was deleted on the grounds that it contained child nudity.
A 14-year-old Irish girl is now suing Facebook for not doing enough to censor a revenge porn photo of her.
On Monday, Facebook lost a legal bid to keep the teen from suing the company for hosting the nude photo that was posted on a “shame page” repeatedly without her permission. She’s suing for misuse of private information, negligence, and breach of the U.K.’s Data Protection Act, and is also suing the man who allegedly posted the photo.
Facebook’s community guidelines state that it will “remove photographs or videos depicting incidents of sexual violence and images shared in revenge or without permissions from the people in the images,” and Facebook claims it always removed the teen’s photo when it was notified it had been reposted. However, according to the Guardian, the girl’s legal team is arguing “Facebook had the power to block any re-publication” from the get-go by using a certain technology, but didn’t. That technology, PhotoDNA, creates a digital fingerprint of known child exploitation images so that any attempt to re-upload them can be detected and blocked automatically—and it is still used by Facebook today, CNN reports.
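For readers wondering what “blocking any re-publication” might look like under the hood, here is a rough sketch of hash-based matching. PhotoDNA’s actual algorithm is proprietary to Microsoft, so this example stands in with the open-source Python packages imagehash and Pillow; the blocklist hash and the distance threshold are made-up placeholders, not real PhotoDNA parameters.

```python
# Illustrative sketch of hash-based re-upload blocking. This is NOT the
# PhotoDNA algorithm (which is proprietary); it uses a generic perceptual
# hash from the open-source "imagehash" package instead.
from PIL import Image
import imagehash

# Hypothetical blocklist: perceptual hashes of images that moderators
# have already confirmed as abusive (placeholder value).
BLOCKED_HASHES = {
    imagehash.hex_to_hash("c3c3c3c3c3c3c3c3"),
}

# Maximum Hamming distance at which two hashes count as "the same image";
# this threshold is an illustrative assumption.
MATCH_THRESHOLD = 5

def should_block_upload(path: str) -> bool:
    """Return True if the uploaded image matches a known blocked image,
    even after minor edits such as resizing or recompression."""
    upload_hash = imagehash.phash(Image.open(path))
    return any(upload_hash - blocked <= MATCH_THRESHOLD
               for blocked in BLOCKED_HASHES)

# Example usage (assumes a file named "incoming_upload.jpg" exists):
#   if should_block_upload("incoming_upload.jpg"):
#       reject_the_upload()
```

The point of a perceptual hash, as opposed to an exact file hash, is that a cropped, resized, or recompressed copy of a banned image still lands within the match threshold, which is how a platform could stop the same photo from resurfacing without anyone having to report it again.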
Regarding the Napalm Girl photo, Sheryl Sandberg said, “We don’t always get it right. Even with clear standards, screening millions of posts on a case-by-case basis every week is challenging.” The options seem to be automated filtering, which can needlessly erase images like the Napalm Girl photo, or case-by-case review, which relies on user reporting and is often slow—a dangerous delay in cases of revenge porn. It’s unclear why PhotoDNA didn’t catch the image in the first place, but maybe Facebook could put some work into that.