Police use of facial recognition software in criminal cases is rarely disclosed to defendants, according to a new investigation by the Washington Post.
The newspaper obtained records from police departments in over 15 states covering more than 1,000 criminal investigations, about 30 of which involved facial recognition. In none of those cases were the accused informed that the technology had been used, stripping them of the right to contest whether the tool made an error.
Instead, police told defendants that "investigative means" had been used and attributed the identifications to witnesses and officers.
Overall, a majority of police departments claimed that, while they incorporated the technology into their investigations, they did not make arrests based on its findings alone. Others reportedly said that officers are not required to reveal its use.
To use the software, images from the crime scene are run through facial recognition tools that compare the suspect against photos pulled from websites and social media; in the case of Clearview AI, that comparison database contains billions of images.
Hundreds of Americans have been arrested after being connected to a crime by facial recognition software, a Washington Post investigation has found, but many never know it because police seldom disclose their use of the controversial technology. https://t.co/NUnybJOgag
— The Washington Post (@washingtonpost) October 6, 2024
Across demographic groups, BIPOC individuals, women and the elderly are the most likely to be misidentified because of their underrepresentation in the data used to train these systems.
“One of the basic tenets of our justice system is due process, is knowing what evidence there is against you and being able to challenge the evidence that’s against you,” said Carlos J. Martinez, a chief public defender from Miami, per the Washington Post. “When that’s kept from you, that is an all-powerful government that can trample all over us.”
The investigation's findings come months after the city of Detroit agreed to pay a settlement to a man whom police wrongly accused of a crime after relying on facial recognition software.
In July, the American Civil Liberties Union and the Civil Rights Litigation Initiative announced that Robert Williams would receive $300,000 as part of the settlement because the technology incorrectly matched a low-quality still image of the actual perpetrator to an expired driver's license photo of Williams.
Williams was wrongfully accused of shoplifting watches at a Shinola store six years ago.
As part of the agreement, the city of Detroit also promised to conduct an audit of cases involving facial recognition from 2017 to 2023.