“I don’t think taking technology away from [the police] is going to solve the problem,” iOmniscient CEO Rustom Kanga told CNN Business.
iOmniscient makes systems that can detect faces and analyze behavior in a crowd, and says its technology has been deployed by companies and governments in over 50 countries. Kanga supports protests against excessive force by police, and sees some behavior by US officers as “unduly brutal.” But ultimately, he said, this is a “management issue.”
“We need to give the police the tools to do their jobs, to find lost children and stop terrorism,” Kanga said.
Civil society groups, academics and some politicians disagree. They warn that use of facial recognition technology by governments and police poses huge risks in a democratic society, permitting surveillance in public spaces and vastly expanding the powers of law enforcement to secretly identify and trace citizens. Some algorithms are less effective at identifying people of color, raising fears that their use will harm minority communities.
Silicon Valley’s exit
The large US tech companies put facial recognition projects on hold as protests rocked the United States after George Floyd, an unarmed Black man, was killed by police officers in Minneapolis, Minnesota.
“Technology can increase transparency and help police protect communities but must not promote discrimination or racial injustice,” IBM CEO Arvind Krishna said in a letter to Congress, highlighting the need for bias testing to be audited and reported.
“They are innovators … but they’re not very significant players in the product market,” Gartner analyst Ingelbrecht said of the big US tech companies.
Use of facial recognition technology has exploded among police departments over the past two decades, touted by both suppliers and law enforcement as an efficient and accurate way to narrow down leads or identify people wanted by authorities. There are more than 80 vendors around the world offering facial recognition or facial verification capabilities, according to a Gartner report from 2019.
Most commonly, officers use these systems to try to match a photo of a person against a database of images. There are also systems that allow law enforcement to use facial recognition to verify a person’s identity, like at airport passport gates. Some cities have signed contracts for live facial recognition technology and at least one — London — has used it to scan crowds for people on a watchlist in real time.
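To illustrate in rough terms how such a one-to-many search works, the sketch below compares a probe face embedding against a gallery of enrolled embeddings using cosine similarity. The names, threshold and randomly generated embeddings are hypothetical stand-ins; real police systems rely on proprietary face models and far larger databases.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face embeddings."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_gallery(probe, gallery, threshold=0.6):
    """One-to-many ("identification") search.

    The probe photo's embedding is compared against every enrolled
    embedding; matches above the threshold are returned as candidate
    leads, ranked by similarity.
    """
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    candidates = [(name, s) for name, s in scores.items() if s >= threshold]
    return sorted(candidates, key=lambda pair: pair[1], reverse=True)

# Hypothetical 128-dimensional embeddings standing in for a real face model's output.
rng = np.random.default_rng(0)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(5)}
probe = gallery["person_3"] + rng.normal(scale=0.05, size=128)  # noisy photo of an enrolled person

print(search_gallery(probe, gallery))
```

A verification system, by contrast, makes a single one-to-one comparison: the same similarity score, but against one enrolled identity rather than an entire database.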
But the technology is largely unregulated, raising concerns about misuse and bias, which can arise when the datasets used to train algorithms aren’t sufficiently diverse.
Clare Garvie, a senior associate at the Georgetown Law Center on Privacy and Technology, worries about the consequences of deploying such technology within a society that “disproportionately surveils and incarcerates Black people.”
“Facial recognition will be disproportionately used on communities of color” in the United States, she said.
The companies still selling
Despite the announcements by IBM, Amazon and Microsoft, the smaller companies that provide the technology to police around the world have no plans to exit the market. Multiple executives at these firms said the decisions made by their larger rivals were motivated by political considerations.
“There’s no reason not to provide facial recognition technology to them,” Ayonix CEO Sadi Vural said in an interview with CNN Business, calling police departments “good customers.”
Vural said Ayonix — a Japanese firm that’s a smaller player in the US market — has between 60 and 70 police contracts, including some in the United States. NEC and Cognitec, which also sell facial recognition technology used at airports, have a bigger presence, as does Clearview AI.
In response to questions from CNN Business, the facial recognition companies stressed the ways their technology can help law enforcement, including its ability to prevent human trafficking and identify individuals who have left suspicious packages in public places.
Kanga of iOmniscient said that America has a “specifically unique problem” and that the company was selective in choosing its business partners. He also defended the company’s sales to Hong Kong, saying those deals were made before local officers began to act like the “long arm of the Chinese police.”
Elke Oberg, marketing manager for Cognitec, told CNN Business that facial recognition technology should be used exclusively as a “lead generation tool,” and that Cognitec had stopped offering facial recognition technology that allows officers to identify suspects in the field, even though it’s still listed on the company’s website.
In a statement, NEC Corporation of America President Mark Ikeno expressed “sadness, anger, grief, frustration, and a strong desire for change” as a result of Floyd’s death. He said the company, however, will maintain its partnerships with law enforcement to “cooperatively ensure that our efforts to make society safer equally make society more just and inclusive.”
For these companies, facial recognition is a key or growing part of their business, making it harder for them to exit the market than it was for Amazon or Microsoft, Gartner’s Ingelbrecht said.
“This is their bread and butter,” he said. “They’ve got a lot more to lose if the facial recognition market runs into headwinds.”
Moving ahead
Restricting police use of facial recognition in the United States would deal a major blow to companies like NEC and Cognitec, which have homed in on the US market.
NEC, in particular, is a popular choice due to its high performance. Its algorithms were among the most accurate of those tested by the National Institute of Standards and Technology, and one was found to yield no detectable demographic bias.
Police departments from Texas to Virginia have purchased software that uses NEC’s algorithms, either directly or through companies that supply technology to law enforcement, such as South Carolina-based DataWorks Plus, according to NEC materials and queries from the Georgetown Law Center on Privacy and Technology.
Georgetown’s Garvie thinks the bill introduced by Democrats last week is “finally” strict enough to address the problems she believes are inherent to facial recognition technology. Its prospects under a highly partisan Congress aren’t clear, though restricting facial recognition has garnered bipartisan support in the past.
Police departments say they should have access to facial recognition technology, which the iPhone has already helped make part of our daily lives. They note that officers, not the software, decide whom to arrest, and that in some cases the technology can provide a crucial assist.
While the United States is an important market, it’s far from the only country where law enforcement is relying on facial recognition technology.
Cressida Dick, commissioner of London’s Metropolitan Police, has ardently defended the force’s live facial recognition program, citing public support.
London police haven’t used live facial recognition technology since the city was locked down in March, but that could change as restrictions on movement are eased.
“At present, we have not announced any specific plans for future deployments but will do so based on the need to deploy aimed at tackling violence and other serious offenses,” London’s Metropolitan Police said in a statement to CNN Business.
Stephanie Hare, a London-based independent researcher who focuses on tech ethics, doesn’t think use of the technology is going away, even if the software needs to be tweaked for the age of face masks.
“If anything, I see a greater demand,” Hare said, pointing to a post-pandemic world where both governments and citizens agree to higher levels of surveillance as a means of protecting public health.
Civil liberties groups have also expressed concern that police have the capacity to deploy facial recognition technology against Black Lives Matter protesters. They worry this could dissuade people from exercising their right to assemble.
“We’ve never before had the ability to surveil people, to identify people, out of a crowd,” Garvie said. “[That] has huge consequences.”