Convert photos to nudes
4/13/2024

Update, 28th June, 2019: The deepfake app DeepNude has now been shut down. The DeepNude team posted on Twitter that they "greatly underestimated the request" and that, despite the "safety measures", the probability of people misusing the app is too high. The team will not be releasing any future versions of the app.

Deepfakes have become all too common in today's world. What started out as a Reddit thread by a user named "deepfakes", who posted fake and explicit celebrity videos made with deep learning, has become a toxic tool for bullying and harassing people. Now some app makers are moving beyond the concept of deepfakes. One such AI app, called DeepNude, can create nudes of women from their fully clothed pictures using neural networks.

The app only works on women: it swaps their clothes with intimate body parts. What's even more pathetic is that the app doesn't work on men at all, as reported by Vice. Nor does it work well on pictures where the person is heavily clothed (winter wear, for example); it performs best on pictures of women in swimsuits and short dresses. Similarly, if a picture has poor lighting or an awkward angle, or if it is animated, the app performs poorly.

The creator of DeepNude used pix2pix, an open-source algorithm from the University of California, Berkeley, based on generative adversarial networks (GANs). The algorithm was trained on a large dataset that included nude pictures of more than 10,000 women, and it continues to learn and improve over time. The creator said he also wants to make the app work on pictures of men, but since nude pictures of women are more easily available online, he built the female version first. The app is free to download and try, but charges $50 to remove the "FAKE" watermark from the generated image.

"I also said to myself: the technology is ready (within everyone's reach). So if someone has bad intentions, having DeepNude doesn't change much. If I don't do it, someone else will do it in a year," the DeepNude creator told Vice, adding, "I'm not a voyeur, I'm a technology enthusiast."

Although the creator claims he is merely "improving the AI algorithm", the very concept of the app screams misogyny and raises some major concerns. Firstly, the app feeds the whole idea of "revenge porn", which has long been a bone of contention for organizations trying to get the deepfakes situation under control; DARPA, for instance, is known to be working on AI forensic tools to catch deepfakes. Secondly, the fact that the app charges $50 by prompting people to remove the "FAKE" watermark reeks of hypocrisy on the creator's part: he clearly believes people will pay to make the picture look real. And thirdly, if he really just wanted to improve the algorithm, he could have come up with other app ideas to test it. With all the existing hate and controversy surrounding the toxic nature of deepfakes, the app only exacerbates their misuse.

The problem also runs deeper than any one app. AI training data is filled with racist stereotypes, pornography, and explicit images of rape, researchers Abeba Birhane, Vinay Uday Prabhu, and Emmanuel Kahembwe found after analyzing a data set similar to the one used to build Stable Diffusion; it is notable that their findings were only possible because the LAION data set is open source. And because the internet is overflowing with images of naked or barely dressed women, and with pictures reflecting sexist, racist stereotypes, the data set is also skewed toward these kinds of images. This leads to AI models that sexualize women regardless of whether they want to be depicted that way, Caliskan says, especially women with identities that have been historically disadvantaged.