
Indian law enforcement agencies have begun to attach great importance to facial recognition technology. Delhi Police, who are investigating people involved in civil unrest in northern India over the past few years, say they consider an accuracy rate of 80 percent or higher a “positive” match, according to documents obtained by the Internet Freedom Foundation through a public records request.
The arrival of facial recognition technology in India’s capital region marks an expansion of Indian law enforcement’s use of facial recognition data as evidence for potential prosecutions, raising alarm among privacy and civil liberties experts. There are also concerns about the 80 percent accuracy threshold itself, which critics argue is arbitrary and far too low given the consequences for those flagged as a match. The absence of a comprehensive data protection law in India makes the practice even more worrying.
The documents further state that a match rate below 80 percent would be treated as a “false positive” rather than a negative, which would leave the individual “subject to appropriate verification by other corroborating evidence.”
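In effect, the decision rule described in the documents treats any similarity score of 80 percent or more as a positive identification, while lower scores still keep the person under scrutiny pending corroboration. The sketch below is a minimal, hypothetical illustration of that kind of thresholding logic; the function name, score values, and category labels are illustrative only and do not come from the Delhi Police documents.

```python
# Hypothetical sketch of the thresholding logic described in the documents.
# The function name, scores, and labels here are illustrative only.

POSITIVE_THRESHOLD = 0.80  # 80 percent, per the documents obtained by the IFF


def classify_match(similarity_score: float) -> str:
    """Map a face-similarity score to the categories described in the documents."""
    if similarity_score >= POSITIVE_THRESHOLD:
        # Scores at or above 80 percent are treated as a "positive" match.
        return "positive"
    # Below the threshold, the documents call the result a "false positive",
    # which still leaves the person subject to further corroboration.
    return "false positive (subject to further verification)"


print(classify_match(0.83))  # -> positive
print(classify_match(0.61))  # -> false positive (subject to further verification)
```

The point critics make is visible in the sketch: the 0.80 cutoff is a single number chosen by the operator, and nothing below it actually clears the person.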
“This means they will continue to investigate even if facial recognition does not give them results at the threshold they have decided for themselves,” said Anushka Jain, deputy policy adviser for surveillance and technology at the IFF, who filed the request for this information. “This could lead to harassment of individuals simply because the technology suggests they look similar to the person the police are looking for.” She added that the Delhi Police’s approach could also lead to the harassment of people from communities that have historically been targeted by law enforcement.
In response to the IFF’s records request, the police said they were using criminal photographs and file photographs for facial recognition, and that these could serve as evidence, but they declined to share further details. They did clarify that if a match is returned, police conduct further “empirical investigations” before taking any form of legal action. Delhi Police did not respond to WIRED’s emailed request for comment.
Divij Joshi, who researches the legitimacy of facial recognition systems, says the 80 percent match threshold is practically meaningless. The accuracy numbers, he explained, depend largely on the conditions under which facial recognition models are tested against specific benchmark datasets.
“The normal accuracy of a facial recognition or machine learning system is determined by comparing a model developed on training and validation data against a benchmark dataset,” said Joshi, a PhD student at UCL. “Once the training data has been tuned, the model has to be benchmarked against a third-party dataset or a slightly different dataset.” Such benchmarks, he said, are typically what produce the quoted accuracy percentages.
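To make the benchmarking idea concrete, the sketch below computes a top-line accuracy figure by scoring a model’s predictions against a held-out benchmark set. Everything in it, the toy labels, the stand-in model, and the `benchmark_accuracy` helper, is hypothetical; it only illustrates why the resulting percentage depends entirely on which benchmark dataset is used.

```python
# Hypothetical illustration of how a single "accuracy" percentage is produced:
# a trained model's predictions are scored against a held-out benchmark set.
from typing import Callable, Sequence


def benchmark_accuracy(
    predict: Callable[[str], str],
    benchmark_images: Sequence[str],
    benchmark_labels: Sequence[str],
) -> float:
    """Fraction of benchmark items the model labels correctly."""
    correct = sum(
        predict(image) == label
        for image, label in zip(benchmark_images, benchmark_labels)
    )
    return correct / len(benchmark_labels)


# Toy stand-ins: a different benchmark set would yield a different number,
# which is Joshi's point about accuracy claims being context-dependent.
toy_images = ["img_a", "img_b", "img_c", "img_d"]
toy_labels = ["person_1", "person_2", "person_1", "person_3"]
toy_model = lambda image: {
    "img_a": "person_1",
    "img_b": "person_2",
    "img_c": "person_3",
    "img_d": "person_3",
}.get(image, "unknown")

print(f"Accuracy on this benchmark: {benchmark_accuracy(toy_model, toy_images, toy_labels):.0%}")
```

Swap in a different benchmark and the same model produces a different headline number, which is why a bare “80 percent” threshold says little about real-world performance.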
Evidence of racial bias in facial recognition models has long made the use of the technology problematic. While many variables can affect the accuracy of a facial recognition system, it is highly unusual for police to widely deploy one with an overall accuracy threshold of just 80 percent. A 2021 study by the National Institute of Standards and Technology found that a system used to match a single scan of a traveler’s face against a database containing their photos was 99.5 percent accurate or better. Other studies, however, have found error rates as high as 34.7 percent when such systems are used to identify women with darker skin.