Slamming the use of facial recognition technology by the Greater Chennai Police (GCP), Lok Sabha MP Karti P Chidambaram said that the use of such technology without any data protection law is illegal and "has the potential to turn Tamil Nadu into an Orwellian Surveillance State".
This comes a few days after the GCP admitted to using the technology in response to a tweet by a Chennai resident. "Facial recognition is being used during night hours to verify the persons moving around at night. The system is very useful in identifying the criminals instantaneously. Nothing to worry," police had tweeted.
In a letter written on December 13 to Shankar Jiwal, commissioner of the GCP, Chidambaram asked the official to share the legislation that authorises the TN Police to use the technology; details of the stakeholder consultations undertaken by the police in this regard; and the standard operating procedure for the use of such technology.
Apart from that, the Congress politician asked the police commissioner to furnish details of the total amount allocated, sanctioned, and utilised for the deployment of facial recognition, along with a list of all officials who have access to the technology.
"The police department assured citizens that there is 'nothing to worry'. But how can citizens not be concerned? Particularly, in the absence of a data protection law, or regulation aimed specifically at FRT," the Lok Sabha MP from Sivaganga constituency said.
The Indian government has floated the draft Digital Personal Data Protection Bill (DPDP) for public consultation.
"In the absence of legal provisions and safeguards that clearly lay down the FRT regime, the use of such systems is illegal. Their use is in clear violation of the fundamental right to privacy upheld by the Hon'ble Supreme Court in Justice K. S. Puttaswamy vs Union of India (2017)," Chidambaram added.
"Further, it is unclear whether the use of FRT by the Greater Chennai Police fulfils the four-fold test of legality, legitimate aim, proportionality, and procedural safeguards as laid down in the judgement," he added.
Citing studies conducted by experts, Chidambaram said that FRT is inaccurate and poses the risk of citizens being wrongly identified as criminals.
"Further, owing to inherent biases prevalent in policing there may be disproportionate targeting of certain groups of people. The impact of FRT may also depend on distribution and placement of police stations and CCTV cameras. Areas with more CCTVs are likely to be over-policed and over-surveilled, which will subsequently mean that FRT will have an uneven impact, posing a serious challenge to the fundamental right to equality," he further said.
The use of facial recognition technology by the Indian government has increased since the onset of the pandemic in 2020. Critics of the technology and digital rights groups have repeatedly pointed out that the country still lacks a data protection law to guard against the misuse of sensitive data such as biometrics.