A terrifying app for making any woman appear naked was killed off by its creator

San Francisco (CNN Business) On Wednesday afternoon, I clicked on a picture of a woman on a website called Deepnude.com. Suddenly, her outfit disappeared, and naked breasts were on my computer screen. It was transfixing and nauseating. I felt like I had just peeped through a stranger's window, utterly violating her privacy.

A day later, the website had disappeared; its creator apparently had a crisis of conscience. If you type in the URL, you'll see a blank white page and the words "not found." But before it vanished, it offered visitors like me free previews of a horrific AI-enhanced world in which photos of women (any woman, really) could be undressed via algorithms and shared with reckless abandon. Like the woman I saw, the resulting nudes weren't real. But they certainly looked like it.
The website, first reported by Samantha Cole at Vice's Motherboard on Wednesday, had begun selling a $50 Windows and Linux application just a few days earlier that could take a photo of a clothed woman and, using artificial intelligence, replace it with a fairly realistic-looking naked image of her. There was a free version, too, which placed a large watermark on the resulting images (the paid version of the app, according to Vice, instead stamped "FAKE" in one corner).
DeepNude was meant to work on women, specifically. (Vice reported that it would insert a vulva, in place of pants, in a photo of a man.) It is the latest example of how it is becoming increasingly easy to use technology to shame and demean women online. The images it created in the online samples I saw didn't look perfect, but they were good enough to make a casual observer gasp. And such AI-crafted images are likely to keep spreading: any copies of DeepNude already out there could easily be replicated, and similar programs are likely to pop up.
The anonymous person behind the app, who reportedly spoke with Vice, said DeepNude was trained on more than 10,000 photos of naked women and used the AI technique behind many deepfake videos, known as generative adversarial networks, or GANs. A GAN consists of two neural networks pitted against each other in an effort to come up with new outputs (which could range from realistic-looking faces to paintings or, in this case, nude women) that mimic those in a mountain of training data.
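To make that adversarial setup concrete, here is a minimal GAN training sketch in Python using PyTorch. It is not DeepNude's code, which was never published; the network sizes, learning rates, and stand-in data below are all illustrative assumptions chosen for brevity.

```python
# Minimal GAN sketch (PyTorch). Illustrates the two-network adversarial
# training described above; NOT DeepNude's implementation. All dimensions,
# hyperparameters, and data here are illustrative assumptions.
import torch
import torch.nn as nn

LATENT_DIM, DATA_DIM = 64, 784  # e.g. a 28x28 image, flattened

# Generator: maps random noise to a synthetic sample.
G = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, DATA_DIM), nn.Tanh(),
)

# Discriminator: scores how "real" a sample looks (1 = real, 0 = fake).
D = nn.Sequential(
    nn.Linear(DATA_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

def train_step(real_batch: torch.Tensor) -> None:
    n = real_batch.size(0)
    ones, zeros = torch.ones(n, 1), torch.zeros(n, 1)

    # 1) Train the discriminator to tell real samples from fakes.
    fake = G(torch.randn(n, LATENT_DIM)).detach()
    d_loss = loss_fn(D(real_batch), ones) + loss_fn(D(fake), zeros)
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # 2) Train the generator to fool the discriminator.
    fake = G(torch.randn(n, LATENT_DIM))
    g_loss = loss_fn(D(fake), ones)  # generator wants D to say "real"
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# Usage: feed batches drawn from the training data, e.g.
# train_step(torch.rand(32, DATA_DIM) * 2 - 1)  # stand-in for real images
```

As each network improves, the other is forced to improve in turn, which is why a GAN trained on a large enough pile of example images can eventually produce convincing fakes.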
      "Digital tools, once they're in the wild, are cheap and easy to replicate, and cheap and easy to use, increasingly," Danielle Citron, a law professor at Boston University and author of the book "Hate Crimes in Cyberspace," told CNN Business.
      The swift rise and fall of DeepNude shows there is a demand for such software. In the hours after Vice's story appeared on Wednesday, a Twitter account that appears to be linked to the DeepNude website indicated its server was down due to "unexpected traffic." A few updates later, the account tweeted that DeepNude would be back online "in a few days." Then, on Thursday afternoon, a new message appeared on Twitter, saying that DeepNude had been killed off due to concerns about potential misuse in the wake of "greatly underestimated" demand.
Though the free version of the application produced watermarked images, the Twitter message read: "if 500,000 people use it, the probability that people misuse it is too high. We don't want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones that sell it."
        "The world," it concluded, "is not yet ready for DeepNude."
        According to GoDaddy, the service through which DeepNude.com was registered, the site was registered by an organization in Estonia calling itself DeepInstruction. A message sent to the domain registrant via GoDaddy received no immediate response, nor did a tweet directed at the DeepNude Twitter account.
        Though DeepNude could create images that look like realistic nudes, they're not actual photos of naked bodies. So while there are a growing number of laws criminalizing non-consensual pornography, such as revenge porn, the images churned out by such AI-assisted software aren't covered by existing legislation, Citron explained.
          Yet Citron, who said she has spoken to women who have been the subject of such ersatz images, stressed that those depicted in them feel the same kind of invasion of sexual privacy as those who have had actual naked photos spread around.
          "It has the same impact," she said. "You feel like you have 1,000 eyes on your body."