UK police’s use of facial recognition does not meet ‘legal and ethical standards’

UK police’s use of live facial recognition technology does not meet “minimum ethical and legal standards” and should be banned from public places, say researchers from the University of Cambridge.

A group of researchers at the Minderoo Center for Technology and Democracy analyzed three separate cases of facial recognition technology (FRT) used by two police forces—South Wales Police and the Metropolitan Police Service (MPS). In each case, the FRT was found to potentially violate human rights.

The researchers created an audit tool to check FRT deployments against current legal guidelines—including the UK’s Data Protection and Equality Acts—as well as outcomes from UK court cases.

They applied their ethical and legal standards to three UK police uses of FRT. In two cases, the technology was used by the MPS and South Wales Police to scan crowds and compare individuals with those on a criminal database and “watch list.” In the third case, officers from South Wales Police used FRT smartphone apps to scan crowds and identify “wanted” people in real time.

In all three cases, the researchers found a lack of transparency, accountability, and oversight in the use of the FRT.

The study found that important information about police use of FRT is “kept out of sight,” such as demographic data on arrests or other outcomes, which researchers say makes it difficult to assess whether the tools “perpetuate racial profiling.” The police had not released internal audits into whether their technology was biased.

In addition to a lack of transparency, the researchers found that there was very little accountability for police — with no clear recourse for individuals or communities adversely affected by police use or misuse of the technology. “Police forces are not necessarily responsible or held liable for harm caused by facial recognition technology,” said Evani Radiya-Dixit, lead author of the report.

“There is a lack of robust redress mechanisms for individuals and communities harmed by police deployment of technology. To protect human rights and improve accountability in how technology is used, we need to ask what values we want to embed in technology,” said Radiya-Dixit.

Professor Gina Neff, Executive Director at the Minderoo Center for Technology and Democracy, said: “In recent years, police forces around the world, including in England and Wales, have developed facial recognition technologies. We aimed to assess whether these deployments used known practices for the safe and ethical use of these technologies.

“Building a unique control system allowed us to address the issues of privacy, equity, accountability and oversight that should accompany any police use of such technologies,” Neff said.

The researchers have joined experts from the EU and the UN High Commissioner for Human Rights in calling for a ban on FRT in public places.

British police have been testing the use of FRT for years in multiple situations to fight crime and terrorism. Its first documented use in the UK was in 2015 by Leicestershire Police on festival goers. It has since been used extensively by South Wales Police and the Metropolitan Police to scan hundreds of thousands of faces at demonstrations, sporting events, concerts, Notting Hill Carnival, train stations and busy shopping streets.

There is worldwide concern about the use of FRT by police forces. The same technology used by the Metropolitan Police was found to misidentify black men. In 2020, Amnesty International led a call to ban the use of FRT by police forces as it could “exacerbate human rights abuses.”
