The Legal Youngster
Empowering Future Legal Minds

Mitigating Deep Fakes: Insights From The Digital Personal Data Protection Act 2023

Tvesha Uniyal
Symbiosis Law School, Hyderabad 

 

“AI is a mirror, reflecting not only our intellect but our values and fears.”
                                                                                                                      – Ravi Narayanan

In the vast expanse of our digital realm lies a region filled with wonders and mysteries, where we confront the intriguing phenomenon of deepfakes—a digital tapestry in which reality and fabrication blend seamlessly. Faces change, voices imitate, and reality becomes a subjective concept. How can we recognize the truth in an era when the simple push of a key may alter reality? Our screens act as portals to many places, blurring the distinction between truth and fiction. This confluence challenges us to navigate a terrain where deceit lurks around every corner.

UNDERSTANDING THE CONCEPT OF DEEP FAKES

Deepfakes, a portmanteau of ‘deep learning’ and ‘fake’, carry significant negative implications, from creating illicit content to influencing political narratives. They employ Generative Adversarial Networks (GANs), in which a generator that creates realistic fake images or videos is trained against a discriminator that tries to tell them apart from real ones. To produce a deepfake video, an AI system is fed a set of pictures or videos of the target, from which it learns the facial expressions, movements, and other characteristics of the target’s face.
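The adversarial setup behind a GAN can be sketched in a toy form. The affine generator, logistic discriminator, sample data, and parameter values below are illustrative assumptions only, not a real deepfake pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def generator(z, theta):
    # Toy "generator": maps latent noise z to a fake sample.
    a, b = theta
    return a * z + b

def discriminator(x, phi):
    # Toy "discriminator": logistic score that x is real (in (0, 1)).
    w, c = phi
    return 1.0 / (1.0 + np.exp(-(w * x + c)))

def gan_losses(real, noise, theta, phi):
    fake = generator(noise, theta)
    d_real = discriminator(real, phi)
    d_fake = discriminator(fake, phi)
    eps = 1e-9
    # Discriminator wants d_real -> 1 and d_fake -> 0.
    d_loss = -np.mean(np.log(d_real + eps) + np.log(1.0 - d_fake + eps))
    # Generator wants d_fake -> 1, i.e. to fool the discriminator.
    g_loss = -np.mean(np.log(d_fake + eps))
    return d_loss, g_loss

real = rng.normal(3.0, 1.0, size=256)   # stand-in for "real" data
noise = rng.normal(0.0, 1.0, size=256)  # latent noise fed to the generator
d_loss, g_loss = gan_losses(real, noise, theta=(1.0, 0.0), phi=(1.0, -1.5))
```

Training alternates between lowering the discriminator loss and lowering the generator loss; as the two improve against each other, the fakes become progressively harder to distinguish from real samples.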

Technological advancement has made it possible to produce near-real videos that cannot be differentiated from the originals. Creating deepfakes was once expensive and technically demanding; freely available online tools have changed that. Beyond video, deepfake technology extends to voice cloning, which opens opportunities for criminal activity such as phone scams in which criminals persuade victims to make transactions in the belief that they are dealing with real people. It also enables elaborate fake photo and text generation, fuelling misinformation on social media platforms and elsewhere.

ETHICAL ISSUES REGARDING DEEP FAKES

On November 17, 2023, PM Modi identified deepfakes as a significant threat to the Indian system, warning of potential societal chaos. At the BJP’s Diwali Milan event in Delhi, he referenced a deepfake video depicting him performing Garba, which he described as remarkably convincing. Emphasizing the need for the media to raise awareness, PM Modi highlighted the importance of educating the public about the growing issue of deepfake manipulation. He also noted that he had not “played Garba” since he was young.

In another case, on July 9, 2023, Kozhikode resident Radhakrishnan P S fell victim to Kerala’s first suspected AI deepfake fraud. Believing a WhatsApp message from an unknown number, he sent Rs 40,000 to a supposed longtime friend for his sister’s surgery. Four months later, Kozhikode city police arrested Shaik Murtuzamiya Hayat Bhai from Gujarat, marking the region’s first cybercrime case involving deepfake technology.

Considering these malicious cases, it is clear that deepfake technology raises multiple ethical issues. Deepfakes pose a serious threat to society through deception and manipulation. They can fuel political agendas by depicting politicians doing or saying things they never did. Moreover, the increasing realism of deepfake videos and their spread on social media intensify social problems such as the distribution of fake news and propaganda, which can incite violence and instability and damage the reputation of people and organizations. Because media and social network platforms amplify misinformation and falsehoods, greater public awareness, precautions, and regulation are needed to limit their effects on civil society.

 

Furthermore, deepfakes used in non-consensual pornography and defamation raise serious privacy and ethical dilemmas. Superimposing a person’s face onto obscene images or generating offensive material can inflict emotional distress and reputational harm. Knowledge of deepfakes, and especially of the law surrounding them, therefore becomes a must.

DIGITAL PERSONAL DATA PROTECTION ACT, 2023

Data that relates to an individual, directly or indirectly, is collected and used by organizations and authorities for several purposes, including delivering products and services, targeting advertisements, and aiding enforcement agencies. However, unchecked processing of such data can infringe on privacy rights, resulting in harms such as impersonation, reputational damage, and financial loss.

India had no specific data protection law, with the Information Technology (IT) Act, 2000 functioning as the primary regulation. To address this, the Committee of Experts on Data Protection was set up in 2017, leading to the Personal Data Protection Bill, 2019. Following the withdrawal of that bill in 2022, it was replaced by the Digital Personal Data Protection Bill, 2023. The new legislation provides comprehensive regulations for digital personal data processing in India, covering data collected online, offline data in digital format, and data processed outside India in connection with services offered within the country. On 11 August 2023, the Indian Parliament enacted the Digital Personal Data Protection Act, 2023 (DPDPA), marking India’s first comprehensive data protection law.

THE DATA PROTECTION PROTOCOLS OF THE DPDPA, 2023

Data protection protocols define two key roles. A Data Fiduciary, whether an individual or a legal entity, determines the purpose and means of processing personal data and must inform the Data Principal of what data is processed and why. This duty applies to entities such as businesses and government institutions, which must manage individuals’ information ethically and responsibly. A Data Principal is the individual to whom the processed data relates; in the case of minors and persons with disabilities, it includes the parent or lawful guardian acting on their behalf. The Act protects their rights through stringent data protection rules.

Personal data may be processed only if the Data Principal consents or if processing is required by law. Where consent is sought, the Data Fiduciary must give the Data Principal notice of the intended processing of their data and of their rights in the process.

Permissible data processing covers cases where Data Fiduciaries process personal data with consent or for compliance with legal obligations. Data Fiduciaries are legally required to meet all of the Act’s requirements, ensure the accuracy of the data they hold, adopt appropriate technical safeguards, and report breaches. They must also erase personal data once its purpose is served or consent is withdrawn, and put grievance redressal procedures in place.

Children’s and disabled persons’ data call for special protection: such data may be collected only with a parent’s or lawful guardian’s consent, and it cannot be used in ways that cause harm or for tracking, behavioural monitoring, or targeted advertising. The Central Government may exempt certain Data Fiduciaries from some of these obligations if their processing of the data is considered demonstrably safe.

The Central Government may notify certain entities as Significant Data Fiduciaries, distinguished by the volume and sensitivity of the data they process and the risk posed to Data Principals’ rights and to national security. These Fiduciaries must appoint a Data Protection Officer, carry out risk assessments, and follow the other procedures prescribed by the Act.

RIGHTS AND RESPONSIBILITIES OF DATA PRINCIPAL

The Act balances individuals’ rights with corresponding responsibilities. Data Principals have the right to be informed of how the data controller and processor handle their data and of the third parties that receive it. This right has exceptions, such as where data is shared with legally authorized entities for purposes like crime prevention, reflecting the balance between openness and confidentiality in data management.

Data Principals also have the right to access their data and to rectify inaccurate or irrelevant entries. Among other things, this empowers an individual to correct, supplement, or erase their data where the law permits and where the data is inaccurate or no longer pertinent.

The Act provides legal avenues through which people can seek redress for violations of their data rights or improper handling of their data. As a first-line recourse, they can use the internal grievance mechanisms of the Data Fiduciary or the designated Consent Manager before approaching the regulatory bodies, ensuring accountability in data processing. Moreover, the Act provides that Data Principals may nominate a representative to exercise their rights if they are unable to do so themselves, so that the protection of rights continues even in such circumstances.

Data Principals are equally expected to uphold duties under the Act. These include complying with applicable laws, keeping their data accurate, and avoiding impersonation or other misleading tactics, which in turn improves data security and the relationship between Data Principals and Fiduciaries.

BEYOND LEGISLATION: PROACTIVE MEASURES

Maintaining the credibility of media content from the moment an idea is conceived and approved, through production, distribution, and dissemination, is essential, particularly in sensitive areas like the media where confidence and credibility must be preserved. Verification procedures apply several techniques, such as watermarking, media provenance checks, and chain-of-custody recording. However, verification has limits and can cover only some of the material posted online.
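One building block behind such verification is a cryptographic fingerprint of the published file: if even a single byte changes, the fingerprint changes completely. The snippet below is a minimal sketch using SHA-256; the byte strings stand in for real media files and are purely illustrative:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # SHA-256 digest acts as a tamper-evident fingerprint of the file.
    return hashlib.sha256(data).hexdigest()

# Placeholder byte strings standing in for actual video content.
original = b"frame-bytes-of-a-published-video"
received = b"frame-bytes-of-a-published-video"
tampered = b"frame-bytes-of-a-doctored-video"

assert fingerprint(received) == fingerprint(original)  # unchanged copy matches
assert fingerprint(tampered) != fingerprint(original)  # any alteration is detected
```

A publisher can record the fingerprint at release time; anyone who later receives the file can recompute it and compare, without needing access to the original.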

Another feasible option is to use blockchain technology, known for its decentralized and secure mechanisms, to build a reliable and trustworthy electronic registry. Such a ledger records the authenticity of pictures and videos, making content more trustworthy and raising a significant barrier to the spread of deepfakes. A blockchain-based registry can also help identify fake personas by managing user identities based on submitted videos and social endorsements, which record key parameters such as name, a brief description, and a photo.
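A simplified sketch of such a registry is shown below, assuming a plain hash-chained ledger rather than any particular blockchain platform; the media names and digests are illustrative placeholders:

```python
import hashlib
import json

def make_block(record: dict, prev_hash: str) -> dict:
    # Each block binds a media record to the hash of the previous block,
    # so altering any earlier entry invalidates everything after it.
    body = {"record": record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def chain_is_valid(chain) -> bool:
    # Recompute every block's hash and check the links between blocks.
    for i, block in enumerate(chain):
        body = {"record": block["record"], "prev": block["prev"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# Register two (hypothetical) media items; digests are placeholders.
chain = [make_block({"media": "video-1", "sha256": "ab12..."}, prev_hash="0" * 64)]
chain.append(make_block({"media": "video-2", "sha256": "cd34..."}, chain[-1]["hash"]))
```

Because each block commits to its predecessor, a verifier can confirm that a registered media record has not been silently rewritten after publication.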

At the individual level, verifying an unexpected request through a second, trusted channel, such as calling the person back or asking a mutual friend to confirm, is a simple safeguard. In addition, maintaining security requires solutions like multi-factor authentication technologies that incorporate biometrics. These technologies, which may include voice recognition and face authentication, provide trusted methods for detecting deepfakes and safeguarding the integrity of important activities such as elections. Such authentication procedures enhance general security and make it difficult to impersonate a genuine identity, especially that of a politician or spokesperson, thereby strengthening the credibility and accuracy of electoral processes.

CONCLUSION

There is certainly no single solution to the threats that deepfake technology brings. The Digital Personal Data Protection Act, 2023 enshrines appropriate measures to protect personal data and prevent the misuse of deepfakes. By setting out the duties and responsibilities of data fiduciaries and principals in the proper use of personal data, the Act facilitates the ethical management of data. But it takes more than the passage of laws. Industries must also adopt practices like media verification, blockchain technology, and multi-factor authentication to keep information credible and secure. Legal safeguards combined with the development of novel technical solutions are key to protecting society against the adversities of deepfakes.
