A report by the Chicago Police Department (CPD) that said officers respond more quickly to ShotSpotter alerts than 911 calls doesn't make a convincing argument for keeping the technology, according to experts who reviewed it. Eight university professors specializing in data science, sociology, and criminology said the report lacked a number of key statistical measurements, and several questioned the accuracy of the report's response time data.
Ald. David Moore (17th Ward) and other City Council members who are advancing an ordinance that would allow them to keep ShotSpotter in their wards requested the data from CPD. The ordinance, which the Reader reported was written with help from a ShotSpotter lobbyist, would also direct CPD to collect data on the number of shell casings and weapons recovered as a result of alerts. On April 1, the Committee on Police and Fire advanced the ordinance to the City Council, setting up a clash with Mayor Brandon Johnson, who announced in February that the City's contract with ShotSpotter would expire in November. Moore did not respond to a request for comment.
On May 1, the Sun-Times reported that the CPD data bolsters the arguments of alderpersons who support the ordinance. But the report showed that officers were far more likely to render aid to victims, recover firearms, or make arrests when responding to alerts combined with 911 calls than to ShotSpotter alerts alone.
The Weekly obtained the CPD report via a public records request. The report used arrival times logged by responding officers to show that between January 2018 and April 2024, officers responded to ShotSpotter alerts more than two minutes faster on average than to 911 calls alone or 911 calls that were accompanied by ShotSpotter alerts. The difference in response times shrank to seventy seconds between January and April 2024.
A 2023 scientific study found a much smaller gap in police response times to 911 calls and ShotSpotter alerts than CPD reported. That study, by researchers at Northeastern University in Boston, used GPS coordinates from officers' patrol cars rather than self-reported arrival data to measure response times. The results showed Chicago police arrived at the scenes of nonfatal shootings and shots-fired calls only about ten seconds faster when responding to ShotSpotter alerts than to 911 calls. That study also found officers arrived at fatal shootings more than three seconds slower when responding to ShotSpotter alerts than to 911 calls.
Eric Piza, the co-director of Northeastern's Crime Prevention Lab who co-authored that study, said via email that arrival times logged by officers aren't always accurate. "I've had police officers in other cities tell me they often mark themselves on-scene before arriving to a shooting as a safety precaution," he said. "If they are going to arrive to an active shooting, they'd rather devote their full attention to the situation."
Piza added that patrol car GPS data isn't skewed by inaccuracies in officer-reported arrival times. "That likely explains why our findings differ from the CPD numbers," he said.
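To make the methodological difference concrete, here is a minimal sketch; every timestamp and coordinate, and the fifty-meter arrival threshold, are invented for illustration, not taken from the Northeastern study. A GPS-derived response time runs from dispatch to the first patrol-car ping near the scene, while the self-reported figure depends on when an officer presses "on-scene," which, as Piza describes, can happen before arrival.

```python
from datetime import datetime
from math import hypot

# Toy comparison of the two measurement approaches; all values below are
# invented for illustration and are not CPD or Northeastern data.
dispatch = datetime(2023, 5, 1, 21, 0, 0)
logged_on_scene = datetime(2023, 5, 1, 21, 4, 0)  # officer pressed "on-scene" early

# Patrol-car GPS pings: (timestamp, position in meters relative to the scene)
pings = [
    (datetime(2023, 5, 1, 21, 2, 0), (1800.0, 900.0)),
    (datetime(2023, 5, 1, 21, 4, 0), (600.0, 200.0)),
    (datetime(2023, 5, 1, 21, 5, 10), (40.0, 25.0)),  # first ping at the scene
    (datetime(2023, 5, 1, 21, 5, 40), (10.0, 5.0)),
]

ARRIVAL_RADIUS_M = 50.0  # hypothetical threshold for "at the scene"

gps_arrival = next(t for t, (x, y) in pings if hypot(x, y) <= ARRIVAL_RADIUS_M)

print("self-reported response time:", logged_on_scene - dispatch)  # 0:04:00
print("GPS-derived response time:  ", gps_arrival - dispatch)      # 0:05:10
# The GPS figure doesn't depend on when the officer presses the button,
# which is why it sidesteps the early- or late-marking problem Piza describes.
```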
The CPD report's section on ShotSpotter and 911 response times acknowledged issues with officer-reported data in a footnote: "The calculation for average response time was dependent on the 'on-scene' timestamps associated with the events," it read. "Factors such as responding units who failed to mark themselves as 'on-scene' or marked themselves as 'on-scene' significantly later than when they arrived at the scene affected the average response time calculation."
CPD declined the Weekly's request to interview Noé Flores, the assistant director of the Strategic Initiatives Division, which prepared the report.
Missed Shots
The CPD report also includes data on the number of gunshots that CPD reported were missed by ShotSpotter sensors. The company's contract requires ShotSpotter to detect at least 90 percent of unsuppressed outdoor gunfire in the twelve police districts that make up its coverage area. The police department is also required to report to the company, via an online portal and email, verified gunfire incidents for which there was no ShotSpotter alert.
According to the CPD report, the department reported 205 misses to ShotSpotter in 2023, a year that had 43,503 ShotSpotter alerts.
An investigation the Weekly published in January found CPD reported far more misses to ShotSpotter that year. Between January 1 and December 18, 2023, CPD emailed the company 575 times to report unique gunfire incidents that were missed or mislocated by ShotSpotter sensors.
Ravi Shroff, an applied statistician at New York University, noted that comparing ShotSpotter alerts to the misses CPD reported doesn't accurately measure how much gunfire the technology is missing. "That's probably an underestimate of the actual misses," Shroff said. "I don't think it's absurd to measure that way (it's sort of hard to measure things you don't have recorded in data), but I wouldn't say that's convincing evidence."
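A back-of-the-envelope calculation, not one that appears in the report, illustrates Shroff's point: dividing alerts by alerts plus reported misses yields an implied detection rate, but its denominator omits any gunfire that nobody recorded. The sketch below uses only the counts cited above; the framing as a detection rate is our own.

```python
# Back-of-the-envelope check of the detection rate implied by the reported
# figures. Counts come from the article; gunfire that produced neither an
# alert nor a verified report is, by definition, absent from both numbers.

alerts_2023 = 43_503          # ShotSpotter alerts in 2023 (per the CPD report)
misses_reported_cpd = 205     # misses CPD says it reported to the company
misses_found_weekly = 575     # unique missed/mislocated incidents the Weekly found

def implied_detection_rate(alerts: int, misses: int) -> float:
    """Share of *recorded* gunfire that produced an alert (assumes every
    alert corresponds to real gunfire, which overstates the rate)."""
    return alerts / (alerts + misses)

print(f"Using CPD's count:    {implied_detection_rate(alerts_2023, misses_reported_cpd):.1%}")
print(f"Using Weekly's count: {implied_detection_rate(alerts_2023, misses_found_weekly):.1%}")
# Both figures clear the contractual 90 percent threshold, but only because
# unrecorded gunfire is invisible to the calculation. The true denominator
# is unknowable from this data, which is why Shroff calls it an underestimate
# of the actual misses.
```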
Several of the researchers noted that the CPD report also failed to include statistics that are commonly used in data analyses, such as the median, variance, and standard deviation. Together, these measurements indicate how much the underlying data varies, and thus how much confidence an average deserves.
David Buil-Gil, a quantitative criminologist at the University of Manchester, said the failure to include such information means there's no way to tell how reliable the average response times in the report are.
"We do not know if the proportion of cases in which the responding unit did not mark themselves as 'on-scene', and took long to do so, vary systematically depending on who initiated the call," Buil-Gil said via email. "Sometimes average scores are highly skewed by a few very high or very low values, which we call outliers, but we do not know if this is the case here or if outliers have been removed."
Without such information, Buil-Gil said it's impossible to know how accurate the average response times are or whether the response times reported for ShotSpotter alerts and 911 calls are statistically different from one another.
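A minimal sketch with invented response times (not CPD data) shows what Buil-Gil is warning about: a few records logged long after arrival can drag the mean far above the median, and only the omitted statistics would reveal it.

```python
import statistics

# Hypothetical response times in seconds; NOT CPD data. Most responses
# cluster around five minutes, but two records were logged very late,
# mimicking officers who marked themselves "on-scene" long after arriving.
times = [290, 300, 305, 310, 320, 330, 2400, 3600]

print(f"mean:   {statistics.mean(times):.0f} s")    # pulled up by the late records
print(f"median: {statistics.median(times):.0f} s")  # barely affected
print(f"stdev:  {statistics.stdev(times):.0f} s")   # large spread flags the problem
# mean ~982 s vs. median ~315 s: without the median, variance, or standard
# deviation, a reader can't tell whether a reported average is typical or
# an artifact of a few bad timestamps.
```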
Robert Vargas, the deputy dean of social sciences and director of the Justice Project at the University of Chicago, has been a critic of ShotSpotter; in January he wrote an op-ed in the Sun-Times calling ShotSpotter's claims of effectiveness "questionable." Like Buil-Gil, Vargas immediately noted the lack of a median or minimum and maximum values in the CPD report.
"The average can be misleading," Vargas said. "It's really puzzling to me why anyone would put so much weight on a single report based on a single statistic."
Other researchers who reviewed the report said the results showing faster response times for ShotSpotter alerts are probably correct, even if the officer-reported response times aren't all accurate.
Journal of Quantitative Criminology co-editor Greg Ridgeway said the caveat about response times means the true times could be shorter than the report shows, but added that ShotSpotter alerts nonetheless probably get faster responses than 911 calls.
"The average response times shown are overestimates," he said via email. "If an officer delays or forgets to log the arrival time, then some fraction of the arrival timestamps could be much later than when the arrival actually happened."
Assuming officers forget to log their arrival times at roughly the same frequency for ShotSpotter alerts and 911 calls, the CPD results are probably roughly correct, he added. "It would seem strange (but not impossible) if ShotSpotter calls had more officers recording their timestamps later than their arrival times," Ridgeway said. Based on the report's totals, his "best guess" is that police probably are responding more quickly to ShotSpotter than 911.
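Ridgeway's reasoning can be sketched as a small simulation; the parameters below are entirely hypothetical and chosen only to illustrate the logic. If late logging inflates both averages at a similar rate, each is an overestimate, but the gap between them survives.

```python
import random

random.seed(0)

def simulate(true_mean: float, n: int = 10_000,
             p_late: float = 0.10, late_delay: float = 600.0) -> float:
    """Average *logged* response time when a fraction p_late of officers
    mark themselves on-scene late_delay seconds after actually arriving.
    All parameters are hypothetical, not drawn from CPD data."""
    total = 0.0
    for _ in range(n):
        t = random.expovariate(1 / true_mean)  # true response time
        if random.random() < p_late:           # some arrivals are logged late
            t += late_delay
        total += t
    return total / n

# Suppose ShotSpotter responses are truly 120 seconds faster on average.
logged_ss = simulate(true_mean=280.0)
logged_911 = simulate(true_mean=400.0)
print(f"logged ShotSpotter avg: {logged_ss:.0f} s")   # above 280, an overestimate
print(f"logged 911 avg:         {logged_911:.0f} s")  # above 400, an overestimate
print(f"logged gap:             {logged_911 - logged_ss:.0f} s")
# Both averages are inflated by roughly p_late * late_delay = 60 seconds, so
# the ~120-second gap persists. The comparison only breaks if late logging is
# much more common for one call type than the other, which Ridgeway calls
# strange but not impossible.
```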
University of Chicago data scientist Amanda Kube Jotte agreed with Ridgeway but added that the CPD report should have broken the response times down by police district. Jotte said the report makes a "decent" case for ShotSpotter. "I don't see anything [in the report] that makes me think that ShotSpotter is not useful," she said.
For Vargas, regardless of the figures in the report, CPD did not provide anything linking faster response times to lives saved. Vargas also noted that the report did not include controls or offer any explanations for the results, both of which are standard practices in rigorous scientific research.
"It's meaningless. It's just numbers," Vargas said. "The report itself is not saying that ShotSpotter is effective."
According to the CPD report, between January 2018 and April 2024, when someone did call 911, officers made 1,173 more arrests, recovered 883 more guns, and found 112,592 more shell casings than when they only got ShotSpotter alerts. Between 2021 and 2024, CPD rendered aid to gunshot victims 349 times when they got a 911 call along with a ShotSpotter alert, and 103 times when there was an alert but no 911 call.
"One of the ways that ShotSpotter sells itself is it allows [police] departments to respond faster to shooting incidents, and to the extent that you believe those [CPD] numbers, I feel like that maybe suggests that claim is true," Shroff said. "Now the question is, are they actually finding anything when they go there?"
Evaluating ShotSpotter's value is more difficult and multifaceted than simply looking at response times, according to Shroff. "It's hard for me to say objectively that this [report] validates a claim that ShotSpotter is a good thing or a bad thing," he said.
In 2018, when then-Mayor Rahm Emanuel announced the contract with ShotSpotter, his administration cited the company's ability to "reduce gun violence." That was a central claim in ShotSpotter's marketing materials at the time. A 2011 study commissioned by the company claimed: "Gunfire crime has been reduced since installation, which commanders at least indirectly attribute to ShotSpotter."
The study also claimed ShotSpotter can help police "get to the scene faster than 911 alone, increasing the likelihood of an arrest and, as we have seen, decreasing the time to get gunshot victims to life saving medical treatment."
Studies published by the Chicago Office of Inspector General (OIG) and the MacArthur Justice Center in 2021 found alerts rarely led to documented evidence of gun crimes or prosecutions.
In February, the Cook County State's Attorney's Office published a review that confirmed the OIG's findings. "ShotSpotter is not making a significant impact on shooting incidents," the review found, and alerts led to arrests in only about 1 percent of cases over a five-year period. Of those arrests, almost a third did not involve a firearm. Less than a quarter resulted in criminal charges tied to gun violence.
A 2024 working paper about police response times found that ShotSpotter causes slower 911 responses overall because "police officers are forced to allocate a significant portion of their time to fulfill ShotSpotter requirements, thereby incapacitating them from attending to 911 calls." According to the authors of that paper, the increase in 911 response times led to a decline in arrest rates for domestic violence.
Michael Topper, a Ph.D. candidate at the University of California, Santa Barbara, who co-authored the working paper, said that while policing technologies like ShotSpotter may seem like attractive solutions in communities experiencing high levels of gun violence, they nevertheless involve trade-offs because of how much officer time they demand.
"Departments really need to think about their resource constraints before jumping on board with something like [ShotSpotter]," he said.
David Carter, the director of Michigan State University's Intelligence Program, said via email that recent research has shown response time is less important than other factors when it comes to assessing ShotSpotter's impact on gun violence. "It seems that in most cases…the response time did not have an effect on identifying the source of the shots fired," Carter said. "What determines if we find someone at the scene of gunfire? [That] would include ShotSpotter notifications, response time, information from callers to the police."
He added that Kansas City and Indianapolis both canceled their ShotSpotter contracts as a result of that research.
The timing of the report's release struck Vargas as an attempt by CPD to aid the company in what he called a "product defense strategy," as ShotSpotter has come under intense scrutiny by academics, journalists, and policymakers in recent years.
"The City Council and a number of elected officials have been asking the police department for this data for a long time. So you just can't help but be skeptical that this is finally being shared only after several studies have shown that this product is ineffective," Vargas said. "If the City were to base any of its decisions in this manner, it would set a really poor precedent for the use of evidence to inform policy."
Jim Daley is the Weekly's investigations editor. Max Blaisdell is the Weekly's Investigative Hub coordinator, a staff writer for the Hyde Park Herald, and an Invisible Institute fellow.