People have been making fake sex pics since before computers, by clipping heads out of photos and pasting them onto nude bodies. The same was true of the first photo and video cameras. It is no surprise whatsoever that one of AI's first uses is for porn. You simply can't stop it, and only a control freak would even try. As is usual for the modern leftist, their answer is "Ban all the things!" Sorry, but I think it's hilarious to see articles like this with people so outraged over a fake photo. Do it to men too, we don't care, lol.

After 4 days on the market, the creator(s) of DeepNude, the AI that "undressed" women, retired the app following a glut of backlash from individuals including leaders of the AI community.

Although DeepNude's algorithm, which constructed a deepfake nude image of a woman (not a person, a woman) based on a semi-clothed picture of her, wasn't sophisticated enough to pass forensic analysis, its output was passable to the human eye once the company's watermark over the constructed nude (in the free app) or the "FAKE" stamp in the image's corner (in the $50 version) was removed.

Does that mean DeepNude is gone? Quite the opposite: it's back as an open source project on GitHub, making it more dangerous than it was as a standalone app. The upside for potential victims is that the algorithm is failing to meet expectations. The downside of DeepNude becoming open source is that the algorithm can be trained on a larger dataset of nude images to increase ("improve") the accuracy of the resulting nude images.

If technology's ability to create fake images, including nudes, well enough to fool the human eye isn't new, why is this significant? Thanks to applications such as Photoshop and the media's coverage of deepfakes, if we don't already question the authenticity of digitally produced images, we're well on our way to doing so. In the below example, Photoshop is used to overlay Katy Perry's face onto Megan Fox's (clothed) body:

DeepNude effectively follows the same process. What's significant is that it does so very quickly, via automation. And instead of overlaying one person's face onto another person's body, it's a machine learning algorithm trained on a dataset of over 10,000 images of nude women, so reverse-engineering its output images back to their component parts would be nearly impossible.

All this raises the question: how should we respond? Can we prevent victimization by algorithms like these? If so, how?

Given that an AI spurred this ethical debate, what about a technological response? Should DeepNude and other AIs be expected, or required, to implement something like facial recognition-based consent from the person whose image will be altered?

Should we respond legislatively? Legally, creating a DeepNude of someone who didn't provide consent could be treated as a felony similar to blackmail (independent of how the fake image is used). Just this month, one U.S. state passed an amendment expanding its ban on nonconsensual pornography to include deepfakes. If the response should be legislative, how should different countries and regions account for the global availability of DeepNude's source code? If it becomes illegal to have or use the algorithm in one country and not another, should the code be subject to smuggling laws?

Should our response be social? Is it even possible to teach every person on the planet (including curious adolescents whose brains are still maturing and who may be tempted to use DeepNude indiscriminately) that consent must be asked for and given freely?

What role does corporate responsibility play? Should GitHub, or Microsoft (its parent company), be held accountable for taking down the DeepNude source code and implementing controls to prevent it from reappearing until victimization can be prevented?

What do you think? How should we as human beings respond to DeepNude's return, and the moral hazards it and similar AIs create? How should we protect potential victims? And who is responsible for doing so? Join the conversation and leave your thoughts below!

Update, July 9, 7:55 p.m. EST: GitHub removed the DeepNude source code from its website. Read more here.
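One of the questions raised above is whether tools like this should be required to implement "facial recognition-based consent" before altering anyone's image. As a purely illustrative sketch of what such a gate might look like, here is a minimal Python mock-up. Everything in it is hypothetical (the names `ConsentRegistry`, `face_ids_in`, and the string face identifiers are stand-ins); a real system would need an actual face-recognition model and a far more robust identity scheme.

```python
# Hypothetical sketch of consent-gated image editing: before an AI app
# alters a photo, it checks every detected face against a registry of
# people who have explicitly opted in. All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class ConsentRegistry:
    # Maps a face identifier to an opt-in flag.
    _consents: dict = field(default_factory=dict)

    def record_consent(self, face_id: str) -> None:
        self._consents[face_id] = True

    def has_consented(self, face_id: str) -> bool:
        return self._consents.get(face_id, False)

def face_ids_in(image: dict) -> list:
    # Stand-in for a real face-recognition step that would return a
    # stable identifier for each face detected in the image.
    return image.get("faces", [])

def edit_if_consented(image: dict, registry: ConsentRegistry) -> dict:
    ids = face_ids_in(image)
    refused = [f for f in ids if not registry.has_consented(f)]
    if refused:
        # Refuse to run the generative edit at all.
        return {"status": "blocked", "missing_consent": refused}
    return {"status": "edited"}

registry = ConsentRegistry()
registry.record_consent("alice")
print(edit_if_consented({"faces": ["alice", "bob"]}, registry))
# blocked, because "bob" never opted in
```

Even this toy version makes the open problem visible: the gate only works if the tool's operator chooses to run it, which is exactly why the questions above about legislation and platform responsibility don't go away.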