A program that uses a machine learning algorithm to remove women’s clothing from photos was pulled offline after its creator received widespread attention and backlash.
The DeepNude app, first reported on by Motherboard Wednesday, digitally places realistic-looking breasts and female genitals on pictures of clothed women at the click of a button. The app’s anonymous creator, who goes by the alias “Alberto,” reportedly fed an algorithm 10,000 photos of nude women to teach his program how to make subjects appear naked.
Photos produced by DeepNude included a large watermark unless users paid $50 to have it removed.
“I’m not a voyeur, I’m a technology enthusiast,” the creator told Motherboard. “Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That’s why I created DeepNude.”
When questioned on the ethics of his app, Alberto argued: “If I don’t do it, someone else will do it in a year.”
But the developer’s tune changed shortly after the website for his app began crashing under overwhelming traffic. The app, which does not work on photos of men, also drew criticism for objectifying women.
Alberto wrote on the DeepNude Twitter account early Thursday that he will not continue working on the app.
“Despite the safety measures adopted (watermarks), if 500,000 people use it the probability that people will misuse it is too high,” the statement said.
Although the app was taken down, users who already downloaded the software can still use it and spread the images it generates online. And now that the concept has surfaced, copycat apps are likely to follow.