What Can We Learn From a Flawed Live Facial Recognition Experiment?

Facial recognition technology can be used to prevent criminal activity. But in London, one study shows, the police system gets it wrong 81 percent of the time.
Photo caption: A passenger uses his biometric passport at an automated ePassport gate equipped with a facial recognition system at the British border of the Eurostar at the Gare du Nord in Paris on February 17, 2017.

On 10 occasions between 2016 and 2019, London’s Metropolitan Police Service tested the use of live facial recognition (LFR) to match thousands of faces in public spaces against offenders on a watch list. The trials took place at soccer matches, music festivals, and transport hubs.

Now, an independent study commissioned by the police and conducted by the University of Essex argues that the experiment neglected fundamental human rights concerns. The study also raises serious doubts about the experiment’s effectiveness as a deterrent to criminal activity, as well as about the legal basis for its use.

From June 2018 to February 2019, researchers observed the last six trials. Of the 42 matches considered eligible for analysis, only eight proved correct: fewer than 20 percent.
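Those two headline figures, fewer than 20 percent correct and 81 percent wrong, describe the same data from opposite directions. A minimal sketch of the arithmetic, using the report’s counts of 42 eligible matches and eight verified ones (the script itself is illustrative, not part of the study):

```python
# Counts reported by the University of Essex study: 42 computer-generated
# matches judged eligible for analysis, of which 8 were verified correct.
eligible_matches = 42
correct_matches = 8

accuracy = correct_matches / eligible_matches  # ~0.19, i.e. under 20 percent
error_rate = 1 - accuracy                      # ~0.81, the "81 percent" figure

print(f"Accuracy: {accuracy:.0%}")      # Accuracy: 19%
print(f"Error rate: {error_rate:.0%}")  # Error rate: 81%
```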

Police use of this controversial technology, which relies on detecting faces in a live video stream and comparing them with images in a database of suspects or missing people, has twice been challenged in United Kingdom courts for violating privacy and data protection rights. It has also raised concerns about gender and racial bias. In the United States, San Francisco became the first city to ban the technology’s use by local agencies in May, amid broader scrutiny across California. This new study represents yet another blow to the technology’s enthusiasts and shines an even brighter light on the ethical and practical issues that may arise from its widespread use.
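To make the matching step in that description concrete, here is a minimal, hypothetical sketch of threshold-based face matching against a watch list. Every name, the embedding representation, and the 0.6 threshold are illustrative assumptions, not details of the Metropolitan Police system:

```python
# Hypothetical sketch of the live facial recognition loop described above:
# faces detected in a video stream are reduced to numeric embeddings and
# compared against a watch-list database; sufficiently similar pairs are
# flagged as "matches" for officers to review.
from dataclasses import dataclass

@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]  # precomputed face embedding

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def flag_matches(detected: list[float],
                 watchlist: list[WatchlistEntry],
                 threshold: float = 0.6) -> list[WatchlistEntry]:
    """Return watch-list entries whose similarity exceeds the threshold.

    Raising the threshold trades missed suspects for fewer false alerts;
    the Essex study's finding is that, in the field, the alerts produced
    were wrong far more often than lab benchmarks would imply.
    """
    return [
        entry for entry in watchlist
        if cosine_similarity(detected, entry.embedding) >= threshold
    ]
```

Whatever the real system’s internals, the trade-off that threshold controls is the crux: every alert it lets through must still be verified by officers on the ground, which is where the mistaken stops described below occurred.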

Necessary in a Democratic Society?

Taking into account potential risks to privacy, freedom of expression, and freedom of association, researchers found that the trials as operated by the London police didn’t meet a basic standard under the European Convention on Human Rights, which requires that any interference with individuals’ rights be “necessary in a democratic society.” The report adds that the experiment would likely be deemed unlawful if challenged in court, since British law contains no explicit authorization for the use of live facial recognition.

“Of particular concern is the lack of effective consideration of alternative measures, the absence of clear criteria for inclusion on the watchlist, including with respect to the seriousness of the underlying offence, and the failure to conduct an effective necessity and proportionality analysis,” the study states.

Who Is Wanted?

The definition of “wanted” for the purposes of inclusion on the police watch list was ambiguous: It covered not only people of interest to the police but also people of interest to the courts, and it spanned a wide range of offenses. Moreover, because the underlying records were out of date, individuals whose criminal cases had already been resolved were stopped by the police. “Ensuring accurate and up-to-date information from across these different data sources posed a significant challenge,” the researchers write. “Such difficulties made compliance with overall standards of good practice complex.”

In one instance reported by the researchers, a 14-year-old boy in uniform was mistakenly stopped, surrounded by five officers, and taken to a side street. “He was visibly distressed and clearly intimidated,” according to the report.

Oversight Needed

Alongside the ethical issues associated with police use of LFR, such as consent and the risk of eroding public trust, the researchers criticized the lack of government oversight at the national level to determine whether and how trials should be conducted. “Ultimately, the impression is that human rights compliance was not built into the Metropolitan Police’s systems from the outset, and was not an integral part of the process,” Daragh Murray, co-author of the report, said in a statement.
