Police in Detroit sued for reckless use of facial recognition tech, yet again.

Six people falsely accused of crimes by facial recognition tech, all of them Black. The ACLU had warned that deploying the technology would harm Black citizens.

October 16, 2023.

The Detroit Police Department's use of facial recognition technology has come under intense scrutiny after officers relying on the technology to solve crimes made three wrongful arrests.

The most recent case is that of Porcha Woodruff, an expectant mother wrongfully accused of carjacking and robbery by Det. LaShauntia Oliver. Despite knowing that the suspect was not pregnant, Oliver failed to investigate the case properly, and Woodruff was detained in front of her sobbing children and questioned for 11 hours. She was also subjected to a phone search and, under the stress, suffered severe dehydration and contractions. Only after posting her $100,000 bond was she taken to the hospital for treatment.

The federal lawsuit filed against the city underscores the need for more accurate investigative methods, as the use of facial recognition technology has had a disproportionately negative effect on Black people. The ACLU took the Detroit Police Department to court in April 2021 after Robert Williams was wrongfully arrested in 2020 based on the same technology.

The use of artificial intelligence in policing has also become a civil liberties issue. In September, the nonprofit civil liberties group Electronic Privacy Information Center (EPIC) sent a letter to United States Attorney General Merrick Garland asking him to investigate whether cities using ShotSpotter were violating the Civil Rights Act. ShotSpotter, a technology meant to detect gunshots, has been deployed predominantly in areas with a high concentration of Black residents, yet its compliance with the Civil Rights Act has never been seriously assessed.

Senator Ron Wyden, a key figure on privacy issues, has said he will push for Garland to accept EPIC's recommendations. He noted that technologies such as ShotSpotter do nothing to stop crime and instead have a well-documented discriminatory impact on marginalized and vulnerable communities.
