Detroit police change AI software policies after Black man wrongfully arrested in 2020.

Robert Williams' false arrest resulted in reforms within the Detroit PD.

June 29, 2024

In a recent development, the Detroit Police Department has made significant changes to its facial recognition software policies as part of a legal settlement filed in U.S. District Court. The decision follows the wrongful arrest of Robert Williams, a Black man who was mistakenly identified as a theft suspect because of flaws in the technology. The Detroit Free Press reported that the city council approved a $300,000 payment to Williams in May as part of the agreement.

The underlying theft occurred in 2018, and Williams' case was previously covered by BLACK ENTERPRISE. Williams was arrested in 2020, in front of his family, for the Shinola store theft, a crime he did not commit. The detective responsible for his arrest had relied on a poor-quality image from security footage, which produced a false match with Williams' expired driver's license photo. Williams was detained for 30 hours before the case was eventually dismissed. He then took legal action against the Detroit Police Department, citing the distress and trauma inflicted on him and his loved ones.

At a press conference on June 28, the American Civil Liberties Union (ACLU) of Michigan, which represented Williams, announced sweeping changes to the Detroit police software policies. The new guidelines prohibit arrests based solely on facial recognition results or related photo lineups. Officers must also disclose the technology's limitations and its role in any arrest, including cases where it fails to identify suspects or produces conflicting results. The revised policies mandate comprehensive training on the software's risks, particularly its higher error rate with people of color, and require an audit of all cases involving facial recognition technology since 2017.

These revised policies, which are enforceable in federal court for four years, aim to prevent future miscarriages of justice. Phil Mayor, a senior staff attorney at the ACLU of Michigan, believes Detroit's new guidelines will serve as a benchmark for law enforcement agencies nationwide on the ethical use of facial recognition software. Detroit police officials likewise expressed satisfaction with the policy overhaul, saying they believe the changes will position the department as a national model for best practices in facial recognition. They also highlighted the common ground between their mission and that of the ACLU.
