Code-hosting site GitHub is banning copycat versions of “DeepNude,” an app that uses neural networks to remove women’s clothing from photos.
The app, first reported on by Motherboard in June, was pulled offline after the creator received an overwhelming amount of backlash.
DeepNude’s creator conceded that “the probability that people will misuse it is too high.”
“We don’t want to make money this way,” the creator wrote in a statement on Twitter. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones to sell it.”
https://twitter.com/deepnudeapp/status/1144307316231200768
Despite the creator’s last-minute attempt to keep the app from spreading, countless users had already downloaded the tool and begun creating their own versions of it. The code for those new versions was uploaded to GitHub and downloaded further from there.
GitHub responded by removing the new apps from its site, pointing to its Community Guidelines.
“Don’t post content that is pornographic. This does not mean that all nudity, or all code and content related to sexuality, is prohibited,” the guidelines read. “We recognize that sexuality is a part of life and non-pornographic sexual content may be a part of your project or may be presented for educational or artistic purposes. We do not allow obscene sexual content or content that may involve the exploitation or sexualization of minors.”
While GitHub’s action sends a strong signal about its stance on the issue, the proverbial genie is unfortunately already out of the bottle. The apps have continued to spread through other online channels.
DeepNude, which was programmed to work only on photos of women, raises serious questions about how society views and regulates such programs.
Lawmakers have attempted to tackle similar issues such as deepfakes, which have been used to place female celebrities and everyday women into pornographic videos. At the beginning of July, Virginia updated a 2014 law banning the distribution of revenge porn to also include deepfake videos.
READ MORE:
- Virginia just banned deepfake revenge porn
- ‘DeepNude’ app that removes women’s clothing from photos pulled offline
- An impressionist morphed into 11 different celebrities in this deepfake
- Jon Snow apologizes for the final season of ‘Game of Thrones’ in this deepfake
H/T Vice