Some Capitol Hill Democrats and civil rights advocates are raising concerns about how police and prosecutors use algorithms that can amplify racial bias.
A Democratic senator said the U.S. Department of Justice should investigate whether the law enforcement technologies it funds fuel biased policing and lead to wrongful arrests.
Oregon Democrat Sen. Ron Wyden was responding to an Associated Press investigation into the use of gunshot detection evidence in court. The Justice Department has helped fund such systems, which law enforcement agencies in more than 110 communities use to detect gunfire and respond quickly to crime scenes.
Amid a national debate over policing in the United States, critics warn that algorithms and technologies used in investigations, such as ShotSpotter, can reinforce racial bias and help send innocent people to prison.
Chicago prosecutors last year relied on audio captured by ShotSpotter sensors as evidence that Michael Williams had shot someone inside a car, even though ShotSpotter says its system has trouble detecting gunfire in enclosed spaces. Williams spent about a year in jail before the case was dismissed late last month because of insufficient evidence.
“Essentially, these tools are handing critical policing decisions about people like Michael Williams over to a computer,” Wyden said.
In Chicago, where Williams was arrested, community members gathered in front of a police station on Thursday to demand that the city terminate its contract with ShotSpotter.
The Chicago Police Department on Friday defended the technology against calls to terminate the city’s ShotSpotter contract. Chicago is ShotSpotter’s largest customer.
The department did not address the hundreds of shootings that ShotSpotter failed to report, but said in a statement to the AP that the technology is one of the many tools it relies on to “protect public safety and ultimately save lives.”
ShotSpotter’s instant alerts, the department said, allow officers to respond more quickly than waiting for someone to call 911 to report a shooting.
“The system gives law enforcement the opportunity to serve and protect these residents and to build bridges with those who do not want to be identified,” the department said.
ShotSpotter uses a closely guarded algorithm to analyze sounds picked up by sensors mounted on light poles and buildings. Employees at the company’s review centers in Washington, D.C., and Newark, California, examine the audio waveforms and make the final call before police are alerted.
“The point is to have eyes and ears on every gunshot,” ShotSpotter CEO Ralph Clark said in an interview. “Human eyes and ears, OK?”
Civil rights advocates counter that human review can introduce its own bias.
Wyden said he and seven other Democratic lawmakers are still waiting for a response from the Department of Justice about the federal funds that flow to local law enforcement agencies to buy artificial intelligence technologies, including gunshot detection systems. In addition to Wyden, the letter was signed by Sens. Ed Markey and Elizabeth Warren of Massachusetts, Alex Padilla of California, Raphael Warnock of Georgia and Jeff Merkley of Oregon, as well as Reps. Yvette Clarke of New York and Sheila Jackson Lee of Texas.
“These algorithms, which automate policing decisions, may not only fail to deliver meaningful public safety benefits, but they may also amplify discrimination against marginalized groups,” the lawmakers wrote to Attorney General Merrick Garland.
The Department of Justice declined to comment to the AP.
———
Mendoza reported from Newark, California.