
Five days after Russia launched its all-out invasion of Ukraine a year ago, US-based facial recognition company Clearview AI offered the Ukrainian government free access to its technology, suggesting it could be used to reunite families, identify Russian agents, and combat misinformation. Shortly afterward, the Ukrainian government revealed that it was using the technology to scan the faces of dead Russian soldiers in order to identify the bodies and notify their families. By December 2022, Mykhailo Fedorov, Ukraine's Deputy Prime Minister and Minister of Digital Transformation, was tweeting a photo of himself with Clearview AI CEO Hoan Ton-That, thanking the company for its support.
Tracing the whereabouts of the dead and informing families of the fate of their loved ones is a human rights obligation enshrined in international treaties, protocols, and laws such as the Geneva Conventions and the International Committee of the Red Cross (ICRC) guidelines on the dignified management of the dead. It also touches on deeper obligations. Caring for the dead is among the oldest human acts, one that makes us human as much as language and the capacity for self-reflection do. In his epic meditation The Work of the Dead, historian Thomas Laqueur writes: "As far back as the subject has been discussed, caring for the dead has been regarded as foundational: to religion, polity, clan, and tribe, to the ability to mourn, to an understanding of finitude, to civilization itself." But using facial recognition to identify the dead harnesses the moral weight of this care to legitimize a technology that raises serious human rights concerns.
In Ukraine, in Europe's bloodiest war since World War II, facial recognition can look like just another tool brought to the grim task of identifying the dead, alongside digitized mortuary records, mobile DNA labs, and the exhumation of mass graves.
But does it work? Ton-That says his company's technology "works effectively whether or not the face of the deceased is damaged." Little research supports this claim, though the authors of one small study found "promising" results even for faces in states of decomposition. But forensic anthropologist Luis Fondebrider, a former head of forensic services at the ICRC who has worked in conflict zones around the world, casts doubt on these claims. "This technology lacks scientific credibility," he said. "It is absolutely not widely accepted by the forensic community." (DNA identification remains the gold standard.) The forensic field "understands the importance of technology and new developments," but in Fondebrider's view the rush to use facial recognition is "political and commercial," with "little science" behind it. "There are no magic solutions for identification," he said.
Using unproven technology to identify fallen soldiers could lead to mistakes and traumatize families. But even if the forensic use of facial recognition were backed by scientific evidence, it should not be used to name the dead. It is too dangerous for the living.
Groups including Amnesty International, the Electronic Frontier Foundation, the Surveillance Technology Oversight Project, and the Immigrant Defense Project have declared facial recognition technology a form of mass surveillance that menaces privacy, amplifies racist policing, threatens the right to protest, and can lead to wrongful arrest. Damini Satija, head of Amnesty International's Algorithmic Accountability Lab and deputy director of Amnesty Tech, said that facial recognition technology "reproduces structural discrimination at scale and automates and deepens existing societal inequalities," violating human rights in the process. In Russia, facial recognition is being used to quash political dissent. It has been deployed by law enforcement in the UK and US in breach of legal and ethical standards, and used against marginalized communities around the world.
Clearview AI, which sells its services primarily to police, has one of the largest known databases of faces, with 20 billion images, and plans to collect a further 100 billion, the equivalent of 14 photos of every person on Earth. The company has promised investors that soon "virtually everyone in the world will be identifiable." Regulators in Italy, Australia, the UK, and France have declared Clearview's database illegal and ordered the company to delete their citizens' photos. In the EU, Reclaim Your Face, a coalition of more than 40 civil society organizations, has called for a total ban on facial recognition technology.
AI ethics researcher Stephanie Hare said Ukraine was "using a tool, and promoting a company and a CEO, who have behaved not only unethically but illegally." She speculated that it was a case of the ends justifying the means, but asked: "Why is it so important that Ukraine is able to identify dead Russian soldiers using Clearview AI? How does that matter in defending Ukraine or winning the war?"