The inventors of an application that allows users to digitally “undress” women using artificial intelligence have shut it down after a viral backlash and social media uproar over its potential for abuse.
“We never thought it would be viral and (that) we would not be able to control the traffic,” the DeepNude creators, who listed their location as Estonia, said in a statement posted on Twitter.
— deepnudeapp (@deepnudeapp) June 27, 2019
DeepNude used artificial intelligence and neural networks to create “deepfake” images: realistic-looking nude photos of women.
The app was offered in both a free version and a $50 paid version. The paid version, however, merely placed a small stamp reading “FAKE” in the upper-left corner of the pictures. The app was designed to work only on images of women.
Eh in few months there will be another software so it's not a big loss. Good on you for pulling out early.
— rapbeast (@rapbeastly) June 27, 2019