Ukraine uses facial recognition software to identify Russian soldiers killed in combat
March 25, 2022
Ukraine is using facial recognition software to help identify the bodies of Russian soldiers killed in combat and track down their families to inform them of their deaths, Ukraine’s vice-prime minister told the Reuters news service.
Mykhailo Fedorov, Ukraine’s vice-prime minister, who also runs the ministry of digital transformation, told Reuters his country had been using software from the facial recognition provider Clearview AI to find the social media accounts of dead Russian soldiers.
“As a courtesy to the mothers of those soldiers, we are disseminating this information over social media to at least let families know that they’ve lost their sons and to then enable them to come to collect their bodies,” Fedorov said in an interview, speaking via a translator.
Ukraine’s Ministry of Defense this month began using technology from Clearview, which scrapes images on the web to match with faces featured in uploaded photos. Reuters first reported Ukraine’s use of Clearview earlier this month, but it was not clear at that time how the technology would be used.
Clearview offered its service free of charge to Ukraine after the Russian invasion and has said its search engine includes more than 2bn images from VKontakte, a popular Russian social media service. VKontakte did not respond to a request for comment.
Clearview AI, a New York-based software company, has drawn criticism over its privacy practices from users and authorities around the world.
Just this month, Italy fined the company €20m for violating EU consumer privacy laws and ordered it to delete all its data on Italian residents. Earlier, both the UK Information Commissioner’s Office and authorities in France demanded that Clearview AI stop processing all user data.
The company is also battling a lawsuit in US federal court in Chicago filed by consumers under the Illinois Biometric Information Privacy Act. The ongoing case concerns whether the company’s gathering of images from the internet violated privacy law.
Clearview has said its actions have been legal, and that its face matches should only be a starting point in investigations.
Several reports have also raised questions about the technology’s reliability. Studies have shown that facial recognition software often fails to identify Black and brown faces and can introduce biases in policing. Clearview has disputed such assertions.
Richard Bassed, head of the forensic medicine department at Monash University in Australia, said facial recognition can be unreliable when used to identify the dead and that fingerprints, dental records and DNA remain the most common ways of confirming someone’s identity.
Obtaining pre-death samples of such data from enemy fighters is challenging, though, opening the door to innovative techniques such as facial recognition.
But clouded eyes and injured and expressionless faces can render facial recognition unusable on the dead, said Bassed, who has been researching the technology.
“If the technology is truly only used for identifying the dead, which I’m quite skeptical of, the biggest risk is misidentification and wrongfully telling people that their loved ones have died,” said Albert Fox Cahn, the founder of Surveillance Technology Oversight Project, a privacy advocacy group.
Fedorov, the Ukrainian vice-prime minister, declined to specify the number of bodies identified through facial recognition but he said the percentage of recognized individuals claimed by families has been “high”. Reuters and the Guardian were unable to independently verify that claim.
Fedorov said Ukraine was not using the technology to identify its own troops killed in battle. He did not specify why.
In the US, the Armed Forces Medical Examiner System said it has not adopted automated facial recognition because the technology is not generally accepted in the forensic community.
In addition to concerns about reliability and breaches of privacy, there are also questions about what Clearview AI will do with the data it collects, including “photos of battlefield casualties”, said Cahn.
“I have no transparency around how that data is used, retained, and shared,” he said. “But it’s hard to imagine a situation where it is harder to enforce any restrictions on the use of biometric tracking than an active warzone. Once the technology is introduced into the conflict for one reason, it will inevitably be used for others. Clearview AI has no safeguards against that sort of misuse of the technology, whether it’s investigating people at checkpoints, interrogations, or even targeted killings,” he said.
Clearview said in a statement it is ensuring each person with access to the tool is trained on how to use it safely and responsibly. “War zones can be dangerous when there is no way to tell apart enemy combatants from civilians. Facial recognition technology can help reduce uncertainty and increase safety in these situations,” the company said. It added that some tests have shown the software is bias-free and can pick the correct face out of a lineup of over 12m photos at an accuracy rate of 99.85%.
A Kremlin spokesperson told Reuters that Moscow has “no knowledge” of Ukraine’s use of Clearview software. “There are too many fakes coming out of Ukraine,” the spokesperson added. The spokesperson did not provide further details.
Ukraine’s military has said some 15,000 Russian soldiers have been killed since Russia invaded on 24 February. Russia has not updated its casualty figures since 2 March, when it said 498 soldiers had been killed in what it describes as a “special military operation” to demilitarize Ukraine.
Asked about the recovery of soldiers’ bodies from Ukraine, the Kremlin spokesman said the question of casualties from the military operation was the competence of the defense ministry. Russia’s defense ministry did not immediately respond to a request for comment.