Photo-morphing websites have become deeply controversial, and the most notorious of them is Deep Nude, a site that was in the news for all the wrong reasons before being shut down and has now become operational once again.
It has caught the attention of Indian cybercriminals, who are using it to prey on women. The site still appears when the term is searched online.
Cybercrime officials in India are constantly tracking several apps and websites that use algorithms to generate nude photographs, or convert ordinary photos into fake nudes. These photos have been used to blackmail victims, exact revenge, or commit fraud on social networking and dating sites.
So what is a deep nude, and why should you be concerned?
A website like Deep Nude requires a user to upload any picture and, within seconds, generates a nude version of the subject of the picture, who is typically a woman.
What is more worrying, officials said, is that there are several versions of Deep Nude on the Internet, including an app and a Twitter handle; one page even seeks donations from users.
Cybercriminals and websites like Deep Nude use Artificial Intelligence (AI) software, now easily available in apps and on websites, to superimpose a digital composite (assembled from multiple media files) onto an existing video, photo or audio clip.
Deep nudes are computer-generated images and videos. In March 2018, a fake video of then US First Lady Michelle Obama appeared on Reddit. An app called FakeApp was used to superimpose her face onto the video of a pornstar.
In 2017, a pornographic video featuring actor Gal Gadot surfaced on the Internet, created using the same AI technology. Other deepfake videos have used the facial features of Daisy Ridley, Scarlett Johansson, Maisie Williams, Taylor Swift and Aubrey Plaza.
And it’s not just restricted to nudes or pornography. In 2018, comedian Jordan Peele used Adobe After Effects and FakeApp to make a video in which former US President Barack Obama appears to be voicing his opinion on the Hollywood film Black Panther and commenting on current President Donald Trump. In the recent Delhi riots case, a Hindi video message of Delhi BJP president Manoj Tiwari was recreated with English audio.
Essentially, using AI algorithms, a person’s words, head movements and expressions are transferred onto another person so seamlessly that it is difficult to tell the media file is a deepfake unless one observes it closely.
When were deep nudes/deep fakes first found?
In 2017, a Reddit user with the name “deepfakes” posted explicit videos of celebrities. Since then, several instances have been reported along with the development of apps and websites that were easily accessible to an average user.
The debate around “deep nudes” and “deep fakes” was rekindled in July 2019 with the popularity of applications such as FaceApp (used for photo-editing) and DeepNude, which produces fake nudes of women.
Criticism of and objections to such software
Because of how realistic deepfake images, audio and videos can be, the technology is vulnerable to misuse by cybercriminals, who could spread misinformation to intimidate or blackmail people. In a presentation, Fayetteville State University in North Carolina called it one of the “modern” frauds of cyberspace, along with fake news, spam/phishing attacks, social engineering fraud, catfishing and academic fraud.
Can anyone create or produce a deep nude or a deep fake?
According to a CSIRO Scope article from August 2019, “Creating a convincing deepfake is an unlikely feat for the general computer user. But an individual with advanced knowledge of machine learning (the specific software needed to digitally alter a piece of content) and access to the victim’s publicly-available social media profile for photographic, video and audio content, could do so.”
Even so, various websites and applications with AI built into them have made it much easier for lay users to create deepfakes and deep nudes. As the technology improves, the quality of deepfakes is also expected to get better.
Experts said that once a nude of any woman was generated, the possibilities for misuse were endless. Already, information about some women being targeted had started trickling in.
Disclaimer: thenewsfacts does not endorse or support such software. We urge our readers to exercise deep caution and refrain from visiting such websites or using such software.