Facial Recognition Technology and The Right to Privacy in India
- Lets Learn Law
- Oct 13
Imagine walking into a temple or boarding a train, only to discover later that your face has been scanned and stored without your knowledge or consent. In India, such scenarios are no longer hypothetical. From the Digi Yatra program at airports to facial recognition being introduced in schools, temples, and crowded public spaces, the spread of facial recognition technology (FRT) is rapid and often unregulated. This brings us to an urgent question: Can India reconcile the adoption of powerful surveillance tools with its constitutional guarantee of privacy?
The applications of FRT in India are wide-ranging. At airports, the Digi Yatra initiative allows passengers to board flights using their face as a digital boarding pass. While the program is marketed as convenient and paperless, questions about data retention and independent audits remain unanswered. On Indian Railways, especially in Bihar and Jharkhand, AI-equipped cameras now scan crowds in real time to identify individuals with criminal records. In Delhi and Karnataka, schools have experimented with FRT to monitor students, drawing criticism from activists who argue that exposing minors to biometric surveillance is a grave violation of their rights. Even religious institutions are deploying the technology: temples across several states have begun installing FRT-enabled cameras, raising concerns about profiling and mass surveillance in sensitive spaces. At the national level, the National Crime Records Bureau (NCRB) has proposed the Automated Facial Recognition System (AFRS), which aims to create one of the world’s largest biometric databases for policing.
The constitutional framework is clear in principle. In Justice K.S. Puttaswamy v. Union of India, (2017) 10 SCC 1, the Supreme Court recognized privacy as a fundamental right flowing from Articles 14, 19, and 21 of the Constitution. Any intrusion into privacy, the Court held, must meet the tests of legality, necessity, proportionality, and procedural safeguards. However, India’s current legal landscape falls short. The Digital Personal Data Protection Act, 2023 (DPDP Act) provides general protections for personal data but carves out broad exemptions for State surveillance in the name of “public interest.” This creates a wide gap between constitutional promises and practical safeguards. The Information Technology Rules of 2011, which govern the handling of sensitive personal data such as biometrics, are outdated for technologies as invasive as FRT.
The risks of unchecked deployment are substantial. First, there is the issue of consent. In most cases, individuals are not informed when their facial data is collected, nor do they have any control over its retention or use. Second, FRT suffers from algorithmic bias. Research has shown that these systems have higher error rates for women and for darker-skinned individuals, raising the danger of false identifications and wrongful arrests. A 2021 study titled “Cinderella’s Shoe Won’t Fit Soundarya” highlighted inconsistencies in FRT performance on Indian faces, especially among women. Third, there is the problem of function creep, where tools initially introduced for safety or convenience are gradually repurposed for unrelated uses. During the 2020 anti-CAA protests, for example, Delhi Police reportedly used facial recognition to track protestors, a practice that threatens the freedoms guaranteed under Articles 19(1)(a) and 19(1)(b). Fourth, there are grave data security risks. In 2023, a data breach leaked biometric information, including facial scans, of police recruitment applicants, exposing the dangers of centralizing sensitive data without robust safeguards. Finally, unchecked FRT concentrates excessive power in the State, enabling surveillance without legislative scrutiny or judicial oversight.
The way forward lies in creating a strong legal and institutional framework. India needs a dedicated law regulating FRT deployment, defining permissible uses, retention periods, and independent oversight mechanisms. Procedural safeguards such as judicial warrants, mandatory audits, and public disclosure of deployment locations must be put in place. The Data Protection Board of India, established under the DPDP Act, should be empowered to act independently and to investigate misuse of FRT by both state and private actors. Technology itself must embed privacy-by-design principles, ensuring anonymization, data minimization, and time-bound deletion. Beyond law and technology, there is a need for public awareness campaigns that educate citizens about their rights and the implications of biometric surveillance. Particularly sensitive contexts such as schools and religious sites should be designated “no-go zones” for FRT.
In conclusion, the unchecked growth of facial recognition in India poses a constitutional challenge. While the technology promises efficiency and security, without clear regulation it risks creating a surveillance state where privacy becomes an illusion. Protecting the right to privacy does not mean rejecting technology. It means governing it with fairness, transparency, and accountability. India must act decisively through law, oversight, and civic dialogue to ensure that innovation enhances democracy rather than erodes it.
References:
1. Justice K.S. Puttaswamy (Retd.) v. Union of India (2017) https://indiankanoon.org/doc/91938676/
2. Digital Personal Data Protection Act, 2023 https://www.meity.gov.in/digital-personal-data-protection-act-2023
3. Digi Yatra Program https://www.digiyatra.gov.in/
4. National Crime Records Bureau, Automated Facial Recognition System (AFRS) Tender Document https://ncrb.gov.in/sites/default/files/AFRS-Tender.pdf
5. Internet Freedom Foundation (IFF) Report, Panoptic India https://internetfreedom.in/panoptic/
6. The Hindu, Delhi Police used facial recognition at CAA protests (2020) https://www.thehindu.com/news/national/delhi-police-used-facial-recognition-software-at-anti-caa-protests/article30877223.ece
7. Vidushi Marda & Shivangi Narayan, Cinderella’s Shoe Won’t Fit Soundarya (2021) https://doi.org/10.2139/ssrn.3774990
8. Carnegie India, Regulating Facial Recognition Technology in India (2022) https://carnegieindia.org/2022/03/09/regulating-facial-recognition-technology-in-india-pub-86694
9. BBC News, India’s Use of Facial Recognition Raises Concerns (2020) https://www.bbc.com/news/world-asia-india-51499498
10. Medianama, Biometric data of police applicants leaked (2023) https://www.medianama.com/2023/07/223-facial-recognition-biometric-data-leak/
11. Internet Freedom Foundation, Surveillance and the Right to Privacy https://internetfreedom.in/surveillance-and-the-right-to-privacy/
12. Supreme Court Observation – Anuradha Bhasin v. Union of India (2020) https://indiankanoon.org/doc/75363923/
13. Brookings India, Biometric surveillance and democracy (2021) https://www.brookings.edu/articles/biometric-surveillance-and-democracy-in-india/
14. Economic Times, Facial recognition cameras in Indian Railways (2021) https://economictimes.indiatimes.com/industry/transportation/railways/indian-railways-to-install-facial-recognition-cameras-to-identify-criminals/articleshow/81263564.cms
This article is authored by Sreshta Ann John, a law student from India and a trainee of the Lets Learn Law Legal Research Training Programme. The views and opinions expressed in this piece are solely those of the author.