Big Brother Watch is calling for police to stop using facial recognition technology, which it claims is “dangerous and inaccurate,” after its research revealed potential human rights violations.
The group’s Face Off report was launched in parliament on Tuesday, with shadow home secretary Diane Abbott and shadow policing minister Louise Haigh slated to speak at the event.
The research details Freedom of Information (FOI) responses from three police forces, including the Met and South Wales Police, which use the controversial technology at sporting events and similar large gatherings to identify suspects in real time.
Big Brother Watch claimed that the tech is “almost entirely inaccurate,” with a false positive rate of 98% at the Metropolitan Police, despite millions of pounds of taxpayers’ money being spent on it.
There are also serious privacy concerns: South Wales Police is said to have stored, for a year and without their knowledge, the images of 2,400 innocent people incorrectly matched by facial recognition.
The use of this technology could breach the Human Rights Act, according to the group.
“Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK. Members of the public could be tracked, located and identified — or misidentified — everywhere they go,” warned Big Brother Watch director, Silkie Carlo.
“We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals. It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to our freedoms.”
The campaign has the backing of MP David Lammy and 15 rights and race equality groups including Liberty, Article 19 and the Race Equality Foundation.
Big Brother Watch also raised the wider issue of police handling of custody images, claiming that photos of innocent members of the public remain on file even when they are released without charge.
These photos can then end up on the Police National Database and be turned into facial biometrics used to identify individuals via specialized software. The rights group argues that police should delete such images, as they do fingerprints and DNA, once an individual is found innocent or released without charge.