The main concerns with the government’s proposed Automated Facial Recognition System (AFRS) are the absence of a law governing personal data protection, the absence of checks and balances, and societal biases that could lead to violence.
“Every breath you take… every smile you fake… I’ll be watching you” — the lyrics of the 1983 Police song seemed to be on the Government of India’s mind when, on June 28, the National Crime Records Bureau (NCRB) released a tender for the Automated Facial Recognition System (AFRS). The NCRB contends the AFRS is a system capable of ‘modernizing the police force, information gathering, criminal identification, verification’.
The AFRS, upon implementation, would help in the automatic identification and recognition of individuals from digital sources, including CCTV feeds and newspaper images, by creating a repository of facial images that could be used to identify individuals. For this purpose, the AFRS is intended to access photographs uploaded in ‘Passport, CCTNS, ICJS and Prisons, Ministry of Women and Child Development (KhoyaPaya), State or National Automated Fingerprint Identification System or any other image database available with police/other entity’, and to ‘match suspected criminal face from pre-recorded video feeds obtained from CCTVs deployed in various critical identified locations, or with the video feeds received from private or other public organization’s video feeds’.
The tender indicates two key stakeholders: the NCRB and the State police forces. The system would help the NCRB identify criminals, missing persons, unidentified dead bodies, and unknown traced persons across India. With a repository of photographs of criminals, it would help detect crime patterns across States and communicate with State police forces for crime prevention. For the police, such a system would help personnel check suspects against a hotlist of criminals.
Previously, in PUCL v Union of India, the Supreme Court of India laid down guidelines for the interception of telephone calls to safeguard an individual’s privacy rights. More recently, privacy was held to be a ‘fundamental right’ by the Supreme Court in the landmark decision of Justice K.S. Puttaswamy v Union of India. This decision overruled an earlier decision in Kharak Singh v The State of UP, which had maintained that privacy was not protected under the Constitution.
The Puttaswamy case maintains, however, that privacy is not an absolute right and is subject to restrictions on grounds involving the sovereignty and integrity of India, the security of the State, friendly relations with foreign states, public order, decency or morality, or in relation to contempt of court, defamation or incitement to an offence, as per Article 19(2) of the Constitution. The Puttaswamy judgment laid down a three-fold test for determining whether a restriction on privacy is justified: (a) legality; (b) legitimate state aim; and (c) proportionality.
In response to a notice issued by the Internet Freedom Foundation, a non-profit organisation, questioning the legality of the AFRS, the Union Home Ministry responded that the system had received the Union Cabinet’s approval. It further contended that the system did not violate ‘consent’, given that it was primarily to be used for the identification of recovered children and of dead bodies.
The primary concerns with imposing such a system are threefold: the absence of a law, the absence of safeguards, and societal biases.
Absence of a Law: India desperately needs legislation governing personal data protection; however, the draft Bill, made public in 2018, is only expected to be tabled in the ongoing winter session of Parliament. In the absence of such legislation, there is no legal recourse available for violations of fundamental rights arising from the implementation of such a system.
Absence of safeguards: The Puttaswamy judgment prescribed the three-fold test for any restriction on privacy. The AFRS in its current form suffers from a complete lack of checks and balances that would ensure the prescribed guidelines are complied with.
Societal Biases: Despite the best efforts to maintain security, the threat of such a database falling into the wrong hands is a very palpable one. In December, it was revealed that the London Gangs Matrix, a database of suspected gang members in the United Kingdom, had leaked, resulting in the targeting of listed individuals and increased acrimony towards suspected members, with neighbours resorting to increased security measures against ‘suspects’.
The Gangs Matrix was created after the 2011 London riots, but the Information Commissioner’s Office (ICO) said it had been so poorly managed since its inception that it consistently failed to properly differentiate between dangerous offenders and their victims. It is contended that a similar ‘leakage’ of the AFRS could lead to equally disastrous circumstances and result in violence against members of particular communities.
In 2018, it was reported that China was building the world’s largest surveillance system, consisting of hundreds of cameras equipped with facial recognition software, combined with Artificial Intelligence (AI), that were able to track individuals with great accuracy. It was later reported that the system had propagated the segregation of ethnic groups. Whether India too is headed this way, time will tell.

Vikram Koppikar is senior legal and privacy counsel at Tata Consultancy Services. Views are personal.