Cardiff man gets go-ahead to bring first UK legal challenge to police use of facial recognition technology on the streets
Posted on 02 Jul 2018
A Cardiff resident has been given the go-ahead to start the first legal challenge to a UK police force’s use of automated facial recognition (AFR) technology, in what will be a critical nationwide test of the state’s power to deploy radical biometric surveillance methods.
Ed Bridges – represented by human rights organisation Liberty – had threatened legal action against South Wales Police if it did not immediately end its use of AFR technology in public spaces.
Chief Constable Matt Jukes has now confirmed the force will not seek to prevent the case from taking place – paving the way for the High Court to review South Wales Police’s ongoing deployment of the intrusive technology. The Chief Constable has said that South Wales Police welcomes the scrutiny of the Court on the issue.
Surveillance cameras equipped with AFR software scan the faces of passers-by, making unique biometric maps of their faces. These maps are then compared against facial images held on bespoke, often error-ridden police databases.
South Wales Police has used facial recognition in public spaces on at least 22 occasions since May 2017. Ed believes his face was scanned by South Wales Police at both an anti-arms protest and while doing his Christmas shopping.
He will seek to challenge the use of AFR technology in court because it violates the privacy rights of everyone within range of the cameras, has a chilling effect on protest, discriminates against women and BAME people, and breaches data protection laws.
Members of the public have so far donated more than £3,450 to Ed’s challenge via crowdfunding site CrowdJustice.
Ed Bridges said: “This dystopian style of policing has no place in Cardiff or anywhere else and I am delighted this legal challenge will go ahead. Without warning the police have used this invasive technology on peaceful protesters and thousands of people going about their daily business, providing no explanation of how it works and no opportunity for us to consent.
“The police’s indiscriminate use of facial recognition technology on our streets makes our privacy rights worthless and will force us all to alter our behaviour – it needs to be challenged and it needs to stop.”
Megan Goulding, Lawyer for Liberty and solicitor for Ed Bridges, said: “We are pleased South Wales Police has recognised the importance of this issue and agreed to a judge reviewing its actions. The police’s creeping rollout of facial recognition is not authorised by any law, guided by any official policy or scrutinised by any independent body.
“Scanning the faces of thousands of people whenever they see fit and comparing them to shady databases which can contain images sourced from anywhere at all has seriously chilling implications for our freedom.”
South Wales Police and facial recognition
AFR technology scans the faces of all passers-by in real time. The software measures their biometric facial characteristics, creating unique facial maps in the form of numerical codes. These codes are then compared against those derived from other images held on bespoke databases.
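As a rough illustration of the matching step described above – not a description of South Wales Police's actual system – the sketch below treats each "facial map" as a numerical code (a vector) and flags a match when its similarity to a watchlist entry clears a threshold. All names, sizes and values here are hypothetical.

```python
import numpy as np

# Hypothetical: each "facial map" is a fixed-length numerical code (embedding).
# Real AFR systems derive these from camera frames; here they are random stand-ins.
rng = np.random.default_rng(0)
watchlist = {f"person_{i}": rng.normal(size=128) for i in range(1000)}  # bespoke database

def cosine_similarity(a, b):
    """Similarity between two facial codes, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_passer_by(face_code, watchlist, threshold=0.6):
    """Compare one scanned face against every watchlist entry.
    Returns the best-scoring identity if it clears the threshold, else None."""
    best_id, best_score = None, -1.0
    for identity, stored_code in watchlist.items():
        score = cosine_similarity(face_code, stored_code)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else None

# A scanned passer-by who is not on the watchlist can still exceed the threshold,
# which is how false matches of innocent members of the public arise.
print(check_passer_by(rng.normal(size=128), watchlist))
```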
Three UK police forces have used AFR technology in public spaces since June 2015 – South Wales, the Metropolitan Police and Leicestershire Police. South Wales has been at the forefront of its deployment, using the technology in public spaces at least 20 times.
Ed’s face has likely been mapped and his image stored at least twice. He believes he was scanned as a passer-by on a busy shopping street in Cardiff in the days before Christmas, and then again while protesting outside the Cardiff Arms Fair in March 2018.
South Wales Police has admitted it has used AFR technology to target petty criminals, such as ticket touts and pickpockets outside football matches, but it has also used it on protesters.
On 27 March 2018, the police used AFR technology at a protest outside the Defence Procurement, Research, Technology and Exportability Exhibition – the ‘Cardiff Arms Fair’. Ed attended the protest and he believes he, like many others there, was scanned by the AFR camera opposite the fair’s main entrance.
Protesters were not aware that facial recognition would be deployed, and the police did not provide any information at the time of the event.
Freedom of Information requests have revealed that South Wales Police’s use of AFR technology has produced ‘true matches’ less than nine per cent of the time – 91 per cent of ‘matches’ were misidentifications of innocent members of the public.
South Wales Police has wrongly identified 2,451 people, 31 of whom were asked to confirm their identities. Only 15 arrests have been linked to the use of AFR.
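Taking the figures above at face value – 2,451 false matches making up roughly 91 per cent of all ‘matches’ – a back-of-envelope calculation (an illustrative sketch, not an official figure) implies only around 240 correct identifications in total:

```python
# Back-of-envelope check of the reported figures (illustrative only).
false_matches = 2451          # people wrongly identified, per the FOI responses
false_match_rate = 0.91       # "91 per cent of 'matches' were misidentifications"

total_matches = false_matches / false_match_rate      # roughly 2,690 alerts in total
true_matches = total_matches - false_matches          # roughly 240 correct identifications

print(f"implied total matches: {total_matches:.0f}")
print(f"implied true matches:  {true_matches:.0f}")
```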
On one occasion – at the 2017 Champions League final in Cardiff – the technology was later found to have wrongly identified more than 2,200 people as possible criminals.
Images of all passers-by, whether or not they are true matches, are stored by the force for 31 days – potentially without their knowledge.
Members of the public scanned by AFR technology have not provided their consent and are often completely unaware it is in use. It is not authorised by any law and the Government has not provided any policies or guidance on it. No independent oversight body regulates its use.
The case
Ed Bridges is taking legal action against South Wales Police because its use of AFR technology:
- Violates the general public’s right to privacy by indiscriminately scanning, mapping and checking the identity of every person within the camera’s range, capturing personal biometric data without consent – and can lead to innocent people being stopped and questioned by police.
- Interferes with freedom of expression and protest rights, having a chilling effect on people’s attendance at public events and protests. The presence of a police AFR van can be highly intimidating and affect people’s behaviour by sending the message that they are being watched and can be identified, tracked, and marked for further police action.
- Discriminates against women and BAME people. Studies have shown AFR technology disproportionately misidentifies female and non-white faces, meaning they are more likely to be wrongly stopped and questioned by police and to have their images retained.
- Breaches data protection laws. The processing of personal data cannot be lawful because there is no law providing any detailed regulation of AFR use. The vast majority of personal data processed by the technology is also irrelevant to law enforcement – belonging to innocent members of the public going about their business – and so the practice is excessive and unnecessary.
Contact the Liberty press office on 020 7378 3656 / 07973 831 128 or pressoffice@libertyhumanrights.org.uk