
An Invention That Went From Saving And Helping Humans To Exploiting Mankind, Particularly Women And Children: AI Has Evolved Like A Phoenix!

Many women are already being digitally inserted into porn photographs or movies without consent, a problem that is rarely discussed. As we head towards an AI boom, these kinds of unrepentant acts show that a deeply sexist culture persists!

Times have changed, but the problems women face never take a back seat. Just a few days after the deepfake video of Indian actress Rashmika Mandanna went viral, creating havoc in the world of news and seeming to justify Elon Musk’s warning that “AI will be dangerous”, here arrives more saddening news for women. With the passage of time, apps that undress women’s photos have surged in popularity. What? You read that right. After pornographic content and violence against women, here is another sign that women, from ancient times to the modern day, are surrounded by goons whose only purpose is to toss aside ‘her modesty’.

DeepNude further undermines the notion that this technology might be utilised for purposes other than claiming control of women’s bodies. With its promise of one-click undressing of any woman, it made it simpler than ever to fabricate naked images and, by extension, to exploit those bogus nudes to harass, blackmail, and publicly embarrass women everywhere. According to a new social media study, apps that employ AI to undress individuals in images are becoming increasingly popular. These applications alter existing photographs and videos of real people to make them look naked without their consent. Many of these “nudifying” applications are effective only on women. And if you think this is the first case of such an event, just hold on.

In 2019, the creator of DeepNude, an app that undresses photos of women, took it offline, citing server overload and potential harm. The application was designed by a programmer who took advantage of neural networks to remove clothes from photos of women, making them appear realistically nude. He tweeted at the time that he had created the project for users’ entertainment! The creator claimed that he never thought it would go viral and that the company would not be able to handle the traffic. Despite the safety measures adopted, like watermarks, if 500,000 people use it, the probability that some will misuse it is far too high. It is so shameful that we are still entertained by ideas that objectify women!

How pathetic a world we are living in! AI, which was once conceptualised to make human processes easier and less time-consuming, and was intentionally built to predict future dangers and save mankind, is now one of the most powerful tools to exploit mankind. If you search the net, you will be horrified to see headlines like ‘12 Best Undress AI APP 2023’, ‘Undress App: Remove Clothes for Free’, ‘Top 8 Undress AI Apps to Undress Girls for Free and Safe’ and many more.

Despite witnessing such grave ethical concerns, the creator claimed that the world is simply not yet ready for DeepNude! One creator may have experienced a moral crisis and removed his application from the market, but programs like this will continue to appear, and they will get more sophisticated. Not to mention that DeepNude is not the only tool that allows you to generate realistic-looking fake nudes. Many women are already being digitally inserted into porn photographs or movies without consent, a problem that is rarely discussed. Though the creator pulled the app, these kinds of unrepentant comments show that a deeply sexist culture persists!

Even if you go back to 2017, there is a video on the internet of Gal Gadot appearing to have sex with her stepbrother. But it’s not truly Gadot’s body, and her face is hardly visible. It’s an approximation, with her face swapped into an existing incest-themed porn film. This new form of fake pornography, along with the Adobe tool that can make people appear to say anything and the Face2Face algorithm that can alter a recorded video with real-time face tracking, indicates that we are quickly approaching a future in which it will be incredibly simple to create convincing videos of people saying and doing things they never did, even engaging in sexual activity.

This sexist culture is not a new topic in town; it has been hovering around for ages. From painting nude portraits to photoshopping images to creating nudes with AI, the practice has evolved like a phoenix into DeepNude. For millennia, the nude, the naked or partially dressed human body, has been portrayed in European art. After 1400, as the Middle Ages came to an end, painters showed nudes as more three-dimensional, bright, and lifelike; in other words, more alive and real.

The nude, as a type of visual art focusing on the naked human figure, is a long-standing tradition in Western art. It was a concern of Ancient Greek art, and it returned to prominence with the Renaissance following a semi-dormant era in the Middle Ages. Unclothed people are frequently used in different forms of art, including historical painting, allegorical and religious art, portraiture, and the decorative arts. Nude female figures were often regarded as emblems of fertility or well-being from prehistory to the early civilisations. The nude female figures, which were once celebrated as symbols of fertility, are now exploited to endanger women’s morality.

How have nudes been used for ages as a tool to cover up wrongdoing?

One might think that the concept of DeepNude is harmful only because AI now has the advantage of neural networks and it is difficult to determine who will be the next prey. However, this is not a new phenomenon. Nudes have been used throughout history as a deceptive tool to gain an undue advantage. Recall ‘Screw’, the pornographic magazine aimed at heterosexual males that was initially published as a weekly tabloid newspaper in the United States.

The newspaper, which was described as “raunchy, obnoxious, usually disgusting, and sometimes political”, was a pioneer in introducing extreme pornography into the American mainstream during the late 1960s and early 1970s. Al Goldstein, the founder, won a series of nationally significant court cases involving obscenity.

It would not be wrong to call this man the godfather of nudity. See how this culpable yet intelligent mind gave birth to the world of nudes, exploiting even his loved ones. Goldstein kept his firm running despite his difficulties. In 1974, he established Midnight Blue, a thrice-weekly public access TV show that aired on Manhattan’s Channel J for over 30 years and included interviews with porn stars as well as advertisements for bordellos and sex hotlines.

Goldstein discovered First Amendment loopholes in federal laws that prevented the cable system from refusing to carry his program. And in 1977, Screw placed an advertisement for “Al Goldstein’s Cinema,” a porn theatre on 8th Avenue near Times Square that functioned for a while but did not appear in Screw’s 1979 list of porn theatres. 

He soon got wealthy enough to purchase a townhouse on Manhattan’s Upper East Side as well as a residence in Pompano Beach. However, his vindictive personality ruined his personal life. He had five marriages and divorces. When his only son, Jordan, refused to invite him to his Harvard Law School graduation, Goldstein publicised doctored images of Jordan having sex with men and even with his own mother, Goldstein’s third ex-wife. That’s what happens when one uses their wit to harm society: they end up hurting their own loved ones.

But such was the nature of this man that he never gave up. His gritty images hit the gutter when, in 1973, his magazine published naked paparazzi pictures of former First Lady Jacqueline Kennedy Onassis. But with his sudden celebrity as New York’s Smut King came controversy and legal issues. In the same year, the United States Supreme Court ruled in Miller v. California that obscene materials were not entitled to First Amendment protection, and that “obscenity” covered anything lacking “serious literary, artistic, political, or scientific value.” This conservative view gave federal, state, and local prosecutors around the country the authority to charge Goldstein.

Goldstein, notorious for his vindictiveness, capitalised on his accusers’ legal difficulties by writing scathing editorials about them and even producing composite picture collages portraying them as stars of his pornography. Despite its petty nature, this rage ended up serving as a “get out of jail free” pass. It is clear that even way back then, it was understood that nudes (today, DeepNude) could be used as a powerful tool to defame somebody, to the extent that one could get out of jail even after committing shameful acts.

Coming back to the creator of DeepNude (in 2019), he argued that he is merely a “technology enthusiast” driven by curiosity and a desire to learn. This is the same message given to Motherboard by the creator of Deepfakes in December 2017: he was merely a programmer with an interest in machine learning. However, as the subsequent growth of fake revenge porn generated using deepfakes shows, tampering with women’s bodies is a harmful and often life-destroying enterprise for the victims of these “enthusiasts.”

Why is a law needed to regulate DeepNude apps?

According to news reports from August 2023, cyber specialists say there has been a tenfold spike in complaints relating to modified photographs or deep nudes generated using advanced techniques. The instances involve influencers and other popular figures who have been blackmailed with threats that deepfake photographs will be released unless a ransom is paid. According to the data acquired, phrases such as ‘Remove Dress AI’ and ‘Deep Fake AI tools’ have experienced a significant increase in searches, with related queries pointing to websites such as Undress AI, Magic Eraser AI, and Soulgen AI.

In the last week of August, searches for ‘Soulgen AI’ increased 1,000%, while searches for ‘Soulgen’ increased 1,450%. Soulgen touts itself as a universe where there are no limits to one’s creativity, billing itself as AI-powered image software “that brings your wildest imaginations to life.” These and other applications that appear to be innocuous AI image generators allow users to input a picture of their choosing and obtain a naked version of it, for free or at a cost.

According to privacy experts, what was previously limited to celebrities and individuals in the public spotlight may now be used increasingly to target average citizens in order to extort, intimidate, embarrass, or destroy reputations. According to public policy expert Kanishk Gaur, the barrier between public and private life is becoming increasingly blurred. He notably advised against ‘sharenting’, the habit of parents posting sensitive information about their children on social media networks, and urged parents to use caution and restraint while posting images of their children online, recognising the hazards and prioritising their children’s safety and privacy.

In recent times, these methods have been widely utilised by fraudulent loan applications that acquire access to a person’s gallery and then use modified nude photographs of the user to extort money from them. According to Jaspreet Bindra, founder of Tech Whisperer, technology must progress to the point where, in addition to downloading an anti-virus, consumers also have a ‘classifier’ technology that distinguishes between what is legitimate and what is phoney.

“The solution has to be two-pronged – technology and regulation,” he added. “Classifiers are needed in society to distinguish between what is real and what is not. Similarly, the government should require that everything developed by AI be explicitly labelled as such. The government should consider including this provision in the Digital Personal Data Protection Bill.”
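To make the idea of such a ‘classifier’ concrete, here is a minimal sketch in Python of what a consumer-side detector could look like, assuming a hypothetical binary model fine-tuned to separate genuine photographs from AI-manipulated ones. The weights file detector_weights.pt, the class ordering, and the choice of a ResNet-18 backbone are illustrative assumptions only, not a description of any product or proposal mentioned in this article.

# A minimal, illustrative sketch (hypothetical, not any real product): a binary
# "real vs. AI-manipulated" image classifier built on a standard torchvision backbone.
# The weights file "detector_weights.pt" and the class order (0 = real, 1 = manipulated)
# are placeholders; a working detector would need training on labelled genuine and
# altered photographs.
import torch
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet-style preprocessing for the backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# ResNet-18 backbone with a two-class head: "real" vs. "manipulated".
model = models.resnet18(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, 2)
model.load_state_dict(torch.load("detector_weights.pt", map_location="cpu"))  # hypothetical weights
model.eval()

def classify(path: str) -> str:
    """Return a coarse verdict for a single image file."""
    image = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(image), dim=1)[0]
    return "likely manipulated" if probs[1] > probs[0] else "likely real"

print(classify("suspect_photo.jpg"))  # example usage with a hypothetical file

In practice, any such detector would have to be trained on a large labelled corpus and retrained continually as generation techniques evolve, which is precisely why experts argue that regulation, such as mandatory labelling of AI-generated content, must accompany the technology.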

The patriarchal technologies that are used as weapons against women on the internet lack a firewall, rendering the world a bitter place for women.

Powerful women are particularly susceptible to cruel internet assaults when they stir up trouble. The weight of these societal outcries falls on women. The capacity to recast a character makes it possible to target vulnerable people like women and girls and create a never-ending stream of hoaxes, fake news, and scenarios featuring nude and sexual content, as well as revenge porn. The DeepNude App, according to Mutale Nkonde, a fellow at the Data and Society Research Institute in New York, “proves their worst fears about the unique way audio-visual tools can be weaponised against women,” ultimately changing the perceptions of people and controlling women’s bodies.

Chakraborty

Chakraborty is a Journalist at Inventiva who drafts content on current social topics. Her forte is producing opinionated content based on data, facts, and numbers while adhering to media ethics, going beyond simply crafting news headlines. Her core intent in writing such content is that every word her readers take in should add meaningful insight to the time they spend on her articles.
