Report: Policing by machine
Posted on 01 Feb 2019
Predictive policing and the threat to our rights
Policing by Machine: Predictive Policing and the Threat to Our Rights collates the results of 90 Freedom of Information requests sent to every police force in the UK, laying bare the full extent of biased ‘predictive policing’ for the first time, and showing how it threatens everyone’s rights and freedoms.
It reveals that 14 forces are using, have previously used, or are planning to use opaque algorithms which ‘map’ future crime or predict who will commit or fall victim to crime, all built on biased police data.
The report exposes:
- police algorithms entrenching pre-existing discrimination by directing officers to patrol areas that are already disproportionately over-policed (see the sketch after this list)
- predictive policing programs which assess a person’s chances of being victimised, vulnerable, reported missing, or subjected to domestic violence or a sexual offence, based on offensive profiling
- a severe lack of transparency: the public is given very little information about how predictive algorithms reach their decisions, and even the police do not understand how the machines come to their conclusions
- the significant risk of ‘automation bias’: a human decision-maker simply deferring to the machine and accepting its indecipherable recommendation as correct.
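The entrenchment problem in the first point above is, at root, a feedback loop. The following deliberately simplified Python sketch is a hypothetical model, not any force’s actual system: the two areas, the crime rates and the “patrol where crime was recorded” rule are all invented for illustration. It shows how such a rule can preserve a historical bias in the data even when two areas have identical real levels of crime:

```python
import random

random.seed(0)

TRUE_CRIME_RATE = [0.5, 0.5]  # two areas with identical underlying crime
recorded = [30.0, 10.0]       # area 0 starts with more *recorded* crime,
                              # e.g. from historically heavier policing

for step in range(50):
    # "Predictive" allocation: patrol share is proportional to past records
    total = sum(recorded)
    patrol_share = [r / total for r in recorded]

    for area in (0, 1):
        # crimes occur at the same real rate in both areas...
        crimes = sum(random.random() < TRUE_CRIME_RATE[area] for _ in range(100))
        # ...but are only recorded when officers are present to observe them
        detected = sum(random.random() < patrol_share[area] for _ in range(crimes))
        recorded[area] += detected

print("recorded crime:", [round(r) for r in recorded])
print("patrol share:  ", [round(r / sum(recorded), 2) for r in recorded])
# Despite identical true crime rates, area 0 keeps roughly three-quarters
# of the patrols: the model simply reads its own history back as prediction.
```

Because crime is only recorded where officers are sent, and officers are sent where crime was recorded, the initial disparity never washes out. That is the sense in which these systems entrench, rather than discover, patterns of policing.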