An app that makes deepfake nudes was shut down by its creator — but it’s probably not enough


The conversation about deepfakes has recently turned to videos targeting public figures. The fear is that these videos, which depict political leaders and other powerful people saying and doing things they never actually would, will continue to erode trust and allow people to further distance themselves from reality. Valid as those concerns are, politicians and celebrities aren't the only targets of deepfakes. Average, everyday people are as well — and they don't have the same tools to fight disinformation available to them. That makes the DeepNude app, which allowed basically anyone with a computer to create nude images of a woman without her permission, particularly frightening.

DeepNude first cropped up online a few months ago, but a publicly available app wasn't released until June 23. The app, which could be downloaded for Windows or Linux machines, let anyone upload a picture of a woman and have the algorithm generate a realistic-looking nude image of her. The free version stamped a large watermark on the result, but users could pay $50 for the full version to have it removed. Even the paid version placed a stamp reading "fake" on every image, though it would be easy for anyone to crop out.


In a conversation with Vice, the creator of DeepNude said that the program is based on pix2pix, an open source algorithm developed at the University of California, Berkeley. The algorithm uses generative adversarial networks (GANs) to produce images. The process works like this: two neural networks are trained against each other using the same dataset (in the case of DeepNude, about 10,000 nude photos of women). One network, known as the generator, creates new images based on that dataset. It presents those images to the second network, the discriminator, which is tasked with determining whether each image came from the original dataset or was fabricated. Over many rounds of training, the generator gets good enough to fool the discriminator, and the end result for DeepNude is realistic-looking naked pictures of women. Generating an image takes about 30 seconds, and while some results are more believable than others, the low time cost makes the process essentially trivial to repeat. If the outcome is bad, there's nothing stopping a person from simply trying another image until they get one that is passable.
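DeepNude's actual model isn't public, and pix2pix is a more elaborate image-to-image version of the idea, but the adversarial loop described above can be sketched in a few dozen lines. The PyTorch snippet below is a toy illustration only: the "real" data is a cloud of 2-D points rather than photographs, and the network sizes and learning rates are arbitrary assumptions, not anything drawn from DeepNude itself.

```python
# Minimal GAN sketch: a generator learns to mimic "real" data while a
# discriminator learns to tell real samples from generated ones.
import torch
import torch.nn as nn

latent_dim, data_dim = 8, 2

# Generator: maps random noise to a fake sample.
generator = nn.Sequential(
    nn.Linear(latent_dim, 32), nn.ReLU(),
    nn.Linear(32, data_dim),
)

# Discriminator: outputs the probability that a sample is real.
discriminator = nn.Sequential(
    nn.Linear(data_dim, 32), nn.ReLU(),
    nn.Linear(32, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(2000):
    # Toy stand-in for the training photos: points from a shifted Gaussian.
    real = torch.randn(64, data_dim) * 0.5 + 3.0
    noise = torch.randn(64, latent_dim)
    fake = generator(noise)

    # Train the discriminator to label real samples 1 and fakes 0.
    d_opt.zero_grad()
    d_loss = loss_fn(discriminator(real), torch.ones(64, 1)) + \
             loss_fn(discriminator(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    d_opt.step()

    # Train the generator to make the discriminator call its fakes real.
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_loss.backward()
    g_opt.step()
```

The same push-and-pull applies at any scale: swap the toy Gaussian for a large photo dataset and the small networks for convolutional ones, and the generator's output drifts toward images the discriminator can no longer distinguish from the originals.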

The creator of DeepNude has maintained that the app isn't about voyeurism or trying to do harm to people, but is solely an experiment in the potential of the technology. To that end, he pulled the app entirely on June 27. "Despite safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high," he said in a statement on Twitter. He also noted that he didn't want to profit off potential abuse of the technology and is offering refunds to everyone who purchased the full version of the app.

While the decision to pull DeepNude is a morally correct one, there is a problem: the software is already out there. The creator of the app acknowledged as much, noting, "Surely some copies of DeepNude will be shared on the web." While he said that sharing and using the software without permission is against the terms of service, there's not a lot to stop its spread across more nefarious corners of the web. There's even less to stop someone with a much looser moral compass from building their own version of it.

Deepfakes first gained prominence in exactly this kind of situation: the technology drew widespread attention when it was used to create nude images and videos of women without their consent. This has always been one of the primary uses of deepfakes, and the defenses against it are few and far between. While tools to detect deepfakes of politicians and celebrities are being developed, those who aren't public figures are stuck dealing with a new kind of revenge porn: images that put them in exposed and compromising situations they were never actually a part of.

Stopping the spread of deepfakes is like stopping the spread of a rumor: there are no clear mechanisms for combating the issue. Because they aren't actually nude images of the victim, it's hard to say whether they fall under revenge porn laws. And while social networks like Facebook prevent nude images from being posted and shared publicly, nothing stops people from spreading an image through chat apps, which makes reporting it to prevent further sharing much harder.

The Electronic Frontier Foundation has suggested that some existing laws can be used to fight back against deepfakes: defamation claims or even copyright law could hold the keys to legal challenges against these images. The problem is that most people aren't going to want to go through a legal battle just to get a fake picture of themselves removed from the internet. The Cyber Civil Rights Initiative has a comprehensive guide on how to request the removal of content from a variety of online platforms, ranging from Facebook to Snapchat, and the organization also offers a 24/7 crisis hotline for victims of non-consensual pornography. These resources may help combat the spread of deepfakes, but it's becoming increasingly clear that not much will stop people from making them.