Police forces in the UK are using crime prediction tools built on discriminatory data, effectively “supercharging racism,” according to a new report from Amnesty International UK.
Released late last week, the 120-page report found that at least 33 police forces, including those covering the Midlands, Greater Manchester and Essex, have used some type of predictive profiling or risk-prediction system.
These systems either target specific geographic locations or place individuals in secret databases that profile them as likely to commit crime. In the case of systems such as the Metropolitan Police Service’s Violence Harm Assessment, people can be profiled without ever having committed an offense.
In these databases, particularly the Greater Manchester Police’s gang profiling, Black people are overrepresented, leading to disproportionate targeting as police seek to ban profiled individuals from events in Manchester.
The same overrepresentation persisted in other systems, such as that of the West Midlands Police, as police disproportionately stopped and searched Black people. According to the Amnesty International UK report, in 2023 there were 24.5 stop and searches per 1,000 Black people, compared with just 5.9 per 1,000 white people.
“The evidence that this technology keeps us safe just isn’t there. The evidence that it violates our fundamental rights is clear as day. We are all much more than computer-generated risk scores,” Sacha Deshmukh, Chief Executive of Amnesty International UK, said in a statement. “The future they are creating is one where technology decides that our neighbours are criminals, purely based on the colour of their skin or their socio-economic background.”
The latest report comes amid warnings from activist groups that similar technology in the hands of U.S. police poses a threat to citizens’ rights as well.
Describing a “vicious cycle of discrimination” in a separate post, Amnesty UK noted that roughly 75% of police forces around the UK are using technology to “predict crime,” resulting in the racial profiling of whole communities, adding: “Eroding our rights does not keep us safe.”
In a statement released in late 2024, the American Civil Liberties Union highlighted the harm that generative AI can cause when U.S. police departments use it to write police reports. Along with opening the door to errors, such as the AI tool inventing details to fill gaps in the audio, the ACLU emphasized how AI perpetuates bias, including built-in racial profiling in its handling of African American English (AAE).
The AI program, known as Draft One, is still being used by police forces throughout the nation.