The American Civil Liberties Union recently called on U.S. police departments to stop using generative AI to write police reports, warning that the technology could threaten citizens’ civil liberties.
In a six-page white paper published last week, the American Civil Liberties Union, also known simply as the ACLU, argued that a human element is essential to a role so crucial to the judicial system, and denounced the use of “Draft One.”
Draft One is currently used to transcribe audio from body camera footage and turn it into a report, which officers then submit, “swearing” that the events happened as described.
Per the ACLU, however, using this technology opens the door to errors known as “hallucinations,” in which the AI tool misinterprets body camera audio, mixing up situations and, in some cases, fabricating them entirely.
The nonprofit civil rights organization also emphasized the need to move away from the AI tool and toward reports grounded in officers’ own memories and subjective experiences, recorded before any footage is viewed.
The ACLU also pointed to a lack of transparency about how each report is generated, including what prompt is entered, as well as the loss of the reflection that writing a report imposes on officers about how they exercised their legal powers.
“Because police reports play such an important role in criminal investigations and prosecutions, introducing novel AI language-generating technology into the criminal justice system raises significant civil liberties and civil rights concerns,” said ACLU researcher Jay Stanley per the report. “In the end, we do not think police departments should use this technology.”
As part of its research, the ACLU also highlighted biases the AI tool may perpetuate, citing its poor transcription of African American English (AAE).
Previous studies have already documented AI’s biases against AAE. In a study published by the University of Chicago, when researchers fed sentences written in AAE to AI models, the technology generated negative stereotypes, including “ignorant,” “lazy” and “stupid.” These associations, according to the researchers, are the worst ever recorded, exceeding even those expressed by humans during the Jim Crow era.
The models were also less likely to associate AAE speakers with employment, more likely to assign fictional characters who spoke AAE to less prestigious occupations, and more likely to convict them in hypothetical criminal cases.
“We’re in a moment where you have all these emergent ideas about how to use this technology,” said the study’s researcher, University of Chicago Asst. Prof. Sharese King. “These findings are really a word of caution about how we’re considering its use and how it might disproportionately affect one group more negatively than another.”