Behind the Mask: The Rising Issue of Deepnude AI and the Corresponding Legal Landscape

Author: Riddhima Gupta is a 2nd year law student at the Institute of Law, Nirma University.

Introduction

Artificial intelligence, born from the dreams of silicon minds and nurtured by the digital dawn, has seen its most prosperous period over the past few years, mainly because it became accessible to the common people at large. Several generative AIs took the world by storm. Within the realm of AI, generative systems are those capable of producing a wide range of content, including text, images, audio, and synthetic data. In contemporary times, artificial intelligence is expanding at a pace unmatched by any other sector.

Deepfake and Deepnude are two generative AIs that have been in the spotlight recently because of their controversial methods and capabilities. Deepfake is an AI technique that manipulates videos or images to create a false impression; it has been misused over the years, and several nations have consequently restricted it. Deepnude, however, is a relatively new form of deepfake that needs to be talked about, and this blog therefore discusses the legal issues surrounding it. Deepnude is designed to generate realistic-looking nude images and videos of women using deep learning algorithms and artificial intelligence. The software takes a clothed photograph of a woman and applies a sophisticated algorithm to remove the clothing, creating an image or video that appears to be a nude version of the original. The finished picture gives the false impression that the person was originally photographed naked. This poses multiple threats to individuals and raises serious ethical questions. These tools operate as paid services, charging their customers a modest fee.

Deepnude raises several concerns, most importantly severe privacy risks for individuals. When such services are freely accessible on the internet, they encourage misuse for a variety of purposes, including but not limited to defamation, revenge pornography, harassment, and cyberbullying. This significantly raises the likelihood of cybercrime, leaving no internet user truly safe. The boundaries between truth and falsehood are blurred.

Violation of Right to Privacy

Deepnude violates Article 21 of the Indian Constitution, which guarantees the right to life and personal liberty. The scope of this article has been expanded to include the right to privacy through several significant judgments, most notably Justice K.S. Puttaswamy v. Union of India (2017), which recognised privacy as a fundamental right. The release of fraudulent photos violates an individual's ability to live with dignity, which is itself a fundamental right. When a woman's image is used against her will, it is a violation of her right to privacy.

In the recent case of Kunal Kamra v. Union of India, a writ petition was filed in 2023 challenging the 2023 amendment to the IT Rules. The petitioner contended that the rules are ineffective against false news and information, and that the government has not succeeded in detecting such fraudulent content or the people behind it. Deepfake AI was specifically addressed in the proceedings, with the observation that it needs to be dealt with at the earliest. He also submitted that the amended rules violate Articles 14, 19(1)(a) and 19(1)(g) of the Constitution, as they are contrary to the principles of natural justice and restrict freedom of speech. Recently, a deepfake video of actress Rashmika Mandanna was circulated, causing her severe mental anguish. There have been many such instances globally where celebrities have suffered because of Deepfake and Deepnude.

Revenge pornography is another category of abuse that has grown with the introduction of technologies like Deepnude. People use nude pictures to blackmail their former partners as a method of revenge, uploading these pictures to the internet without the woman's consent, and technological advancements make it easier to procure such images. This is a major example of the bane that the technology brings with it. It is a rampant and disturbing act against women, causing emotional and psychological trauma that can be serious and long-lasting. Even though revenge pornography is punishable under Section 354A of the IPC and Sections 66 and 67 of the IT Act, 2000, there is a real chance that the culprit will escape legal consequences. Catching the culprit is tricky, since the crime is committed from behind a screen.

If the photographs are made public, a woman's professional and personal life is compromised, largely because the picture spreads faster than the fact that it is a Deepnude. In the eyes of society, this remains detrimental to both personal and professional reputation. Relationships with friends, family, and co-workers can become strained as a result of the proliferation of false nude photographs. Victims may become socially isolated and face difficulties in both their personal and professional lives because of the stigma attached to their experience. The victim's mental health is severely impacted; she may suffer extreme emotional anguish, along with guilt, embarrassment, anxiety, and other negative emotions.

Section 66E of the IT Act does punish the act of intentionally capturing and publishing nude pictures of a person, but it does not cover the use of AI to generate and publish such pictures. Because Deepnude does not satisfy this criterion, using the AI is not expressly prohibited by law.

Criminal Culpability of Deepnude

The central difficulty with AIs committing offences is determining liability for the crime and the consequent punishment. The primary task is to determine the criminal intent of those who make such AIs and those who use them. It is essential to decode the intention of the algorithm's makers, for a certain level of criminal intent lies behind the creation of such tools. The mens rea, the mental element of the crime, can be traced to both the inventors and the users of the AI. A guilty mind is present from the moment the inventor conceives of building an AI with such capabilities, while the user's mens rea arises when he instructs the AI to convert the picture. The criminal liability of both parties must therefore be examined.

Section 354B of the IPC penalises the use of criminal force with intent to disrobe a woman. The section's punishment, however, extends only to cases where a woman is physically disrobed. Deepnude images, because of their false nature, do not technically disrobe the woman physically, since there is no physical assault; yet they give the impression that her clothes have been removed. The function of Deepnude is similar to the conduct described in Section 354B, but the means is different, since it disrobes using algorithms. This has not yet been penalised in India. Technology advances faster than the law, and developers constantly look for legal loopholes to exploit. Cyber-based sexual offences are not punished anywhere in the IPC's chapter on sexual offences.

Users of Deepnude may also be held liable for stalking under Section 354D of the IPC. There are typically two ways to obtain an image and turn it into a nude image: taking a photograph with a camera, or using an existing picture of the individual obtained from social media or another source. Obtaining images by either of these means can constitute stalking.

Where the videos and photos are used to harass a person in a way that causes her to reasonably fear for her safety, this constitutes criminal intimidation under Section 503 of the IPC. Blackmailing women with the threat that their private pictures will be leaked, disrupting their personal lives, also counts as criminal intimidation.

Figuring out who is ultimately responsible for the crime is a challenging endeavour for the legal system. Several questions need to be answered by the law. Should the algorithm be held accountable, and how could it be penalised? What is the criminal culpability of the makers? Where does the criminal culpability of the users stand? Dealing with these intricate matters under the existing laws is a significant challenge for the legal system.

The Way Ahead

Currently, the Information Technology Act, 2000 caters to the needs relating to cybercrime. In recent times, however, these AIs have outgrown the scope of the Act's provisions; the Act could not have anticipated the mischief AI would cause. Amendments to the current regulations are required to punish the misuse of such AIs, along with their makers and users. An entirely new act may be needed to meet the diverse challenges posed by the countless AIs. The punishment handed out to those who abuse the capabilities of the internet should be severe. At present, owing to loopholes in the system, criminals escape easily.

Further, a blanket ban should be imposed on Deepnude and other AIs of this nature. Some jurisdictions, including parts of the US and Canada, have already moved to outlaw such tools. The makers of these algorithms should be held criminally liable for encouraging and motivating people to act in a manner they otherwise would not have. The legal sector should step up to keep pace with AI and leave no opportunity to misuse loopholes, and the government needs to take strict action. For instance, pornography is legally banned in India, yet it is widely available on the internet for anyone who wants to watch. A ban on Deepnude should actually be enforced for people to take it seriously; no such website should remain accessible to the public online. Unless strict action is taken, the makers of such algorithms will continue to cause similar harm.
