Is incarceration in Chicago dependent on artificial intelligence?

A 65-year-old man was jailed for murder based on evidence from police software that has since been deemed unreliable. Prosecutors were ultimately forced to drop the case because of the technology's shortcomings.

Michael Williams spent almost a year in jail before prosecutors asked a judge to dismiss his case for insufficient evidence. The 65-year-old was held on charges that he shot a man who was riding in his car. Now he is suing the city of Chicago for using an unreliable artificial intelligence system called ShotSpotter as critical evidence to support the murder charge. The suit, filed by the MacArthur Justice Center, a human rights advocacy group based at Northwestern University, claims that city police relied solely on the technology and did not pursue other leads in their investigation.

The suit seeks damages from the city for mental anguish, lost income and legal bills on behalf of Williams, who still suffers from hand tremors that developed during his detention. The filing also details the case of Daniel Ortiz, who was arbitrarily arrested and jailed by police responding to a ShotSpotter alert, and it seeks class-action status for all Chicago residents who have been stopped by police acting on such alerts, the Associated Press reports. The MacArthur Justice Center has even asked the court to ban the use of the technology in the nation's third-largest city.

Speaking about his ordeal, Williams explained that while he is free, he doesn't think he will ever get over the effects of what happened to him. "As my hands shake, I keep going back to the thought that I was in that place. I just can't put my mind at ease," he told the AP.

If the lawsuit succeeds, Chicago would have to cease all use of ShotSpotter. That would be awkward for the city, which quietly renewed its $33 million contract with the company last year. According to The Byte, the case could be a turning point in the spread of AI-based policing.

This is not the first time ShotSpotter has come under scrutiny. A 2021 report by the MacArthur Justice Center claimed that 89 percent of the technology's alerts produced no on-the-ground evidence of gunfire. That same year, a Vice investigation recounted the death of an unarmed 13-year-old Black boy who was shot by Chicago police responding to a ShotSpotter alert, and accused the software of racial discrimination.

The new 103-page filing does not name the technology's maker as a defendant, but it alleges that the company's algorithmic technology is flawed. The lawsuit also calls it racial discrimination that the city placed most of its gunshot-detection sensors in predominantly Black and Latino neighborhoods. Meanwhile, police never established a motive for Williams, who was accused of shooting 25-year-old Safarian Herring while driving him home from a protest against police brutality, according to Engadget. All they had was a silent security-camera recording of a vehicle and an AI alert.

ShotSpotter, however, has denied the claims of inaccuracy and racial bias, insisting that its system has an overall accuracy of 97 percent for real-time detections across all customers. According to the statement, the system's placement is guided by objective historical data on shootings and murders.
