Copies of AI deepfake app DeepNude are easily accessible online — and always will be

Once something has been shared online, it never truly goes away. This adage is particularly relevant for DeepNude, software that uses AI to create fake nude images of women.

The app came to public attention last week after a report from Motherboard highlighted its existence. Shortly afterward, the app’s creator pulled it from the web, saying that the probability the software would be misused to harass and shame women was “too high.”

Of course, the app is still available, with numerous copies floating around forums and message boards. The Verge was able to find links that ostensibly offer downloads of DeepNude in a variety of places, including Telegram channels, message boards like 4chan, YouTube video descriptions, and even on the Microsoft-owned code repository GitHub.

The report from Motherboard found that the app was being sold on a Discord server (now removed) for $20. The anonymous sellers said they had improved the stability of the software, which was prone to crashing, and removed a feature that added watermarks to the fake images (supposedly to stop them from being used maliciously).

[Image: An example of the sorts of images the DeepNude app produces. The watermarks can be removed; the censorship bars were added by The Verge.]

“We are happy to announce that we have the complete and clean version of DeepNude V2 and cracked the software and are making adjustments to improve the program,” wrote the sellers on Discord.

The individual who uploaded an open-source version of DeepNude to GitHub claimed they were annoyed that people were trying to “censor knowledge.” However, the uploader also included a screenshot of news coverage of the app from Vox and mocked concerns expressed in the article that the app could be harmful to women.

While The Verge was not able to test all of the links mentioned, we did verify that several copies of the software are being shared on forums, including a version that was tweaked to remove all watermarks. As with any modified software distributed this way, it is likely that some versions have been altered to include malware, so extreme caution is advised.

We noted in our original coverage of DeepNude that the nonconsensual nude images this software creates are often of dubious quality, and, indeed, many people sharing this software say they’re disappointed by its output. But while these images are easy to identify as fake, that doesn’t necessarily minimize their threat or the impact they could have on people’s lives.

Since the term “deepfake” was coined, the technology has consistently been used to target women. People can use deepfakes to create pornographic and nude images of co-workers, friends, classmates, and even family members, and the realism of this content has only increased over time. The best images created by DeepNude look real at a glance, and that’s all that might be needed to cause terrible damage to someone’s life.

DeepNude represents a depressing milestone in the history of this technology, making the creation of nonconsensual nudes as easy as clicking a button. Now that the software has been released, it will continue to be shared and spread all over the web. We’ve seen this dynamic already with deepfake porn videos, which sites like Pornhub pledged to remove but which remain easily accessible.

The conclusion here is obvious but worth repeating: technology like this is not going to go away. People will continue to refine the quality and accessibility of deepfakes, and the resulting software will cause real harm to people’s lives. DeepNude is just the tip of the iceberg.

https://www.theverge.com/2019/7/3/20680708/deepnude-ai-deepfake-app-copies-easily-accessible-available-online