A ShotSpotter sensor is attached to a chimney in Chicago. Credit: Jim Daley

A report by the Chicago Police Department (CPD) that said officers respond more quickly to ShotSpotter alerts than 911 calls doesn’t make a convincing argument for keeping the technology, according to experts who reviewed it. Eight university professors specializing in data science, sociology and criminology said the report lacked a number of key statistical measurements, and several questioned the accuracy of the report’s response time data.

Ald. David Moore (17th Ward) and other City Council members who are advancing an ordinance that would allow them to keep ShotSpotter in their wards requested the data from CPD. The ordinance, which the Reader reported was written with help from a ShotSpotter lobbyist, would also direct CPD to collect data on the number of shell casings and weapons recovered as a result of alerts. On April 1, the Committee on Police and Fire advanced the ordinance to the City Council, setting up a clash with Mayor Brandon Johnson, who announced in February that the City’s contract with ShotSpotter will expire in November. Moore did not respond to a request for comment.

On May 1, the Sun-Times reported that the CPD data bolsters the arguments of alderpersons who support the ordinance. But the report showed that officers were far more likely to render aid to victims, recover firearms or make arrests when responding to alerts combined with 911 calls than ShotSpotter alerts alone. 

The Weekly obtained the CPD report via a public records request. The report used arrival times logged by responding officers to show that between January 2018 and April 2024, officers responded to ShotSpotter alerts more than two minutes faster on average than to 911 calls alone or 911 calls that were accompanied by ShotSpotter alerts. The difference in response times shrank to seventy seconds between January and April 2024.

A 2023 scientific study found a much smaller gap in police response times to 911 and ShotSpotter than CPD reported. That study, by researchers at Northeastern University in Boston, used GPS coordinates from officers’ patrol cars rather than self-reported arrival data to measure response times. The results showed Chicago police arrived at the scenes of nonfatal shootings and shots-fired calls only about ten seconds faster when responding to ShotSpotter alerts than 911. That study also found officers arrived at fatal shootings more than three seconds slower when responding to ShotSpotter alerts than 911 calls. 

Eric Piza, the co-director of Northeastern’s Crime Prevention Lab who co-authored that study, said via email that arrival times logged by officers aren’t always accurate. “I’ve had police officers in other cities tell me they often mark themselves on-scene before arriving to a shooting as a safety precaution,” he said. “If they are going to arrive to an active shooting, they’d rather devote their full attention to the situation.” 

Piza added that patrol car GPS data isn’t skewed by inaccuracies in officer-reported arrival times. “That likely explains why our findings differ from the CPD numbers,” he said. 

The CPD report’s section on ShotSpotter and 911 response times acknowledged issues with officer-reported data in a footnote: “The calculation for average response time was dependent on the ‘on-scene’ timestamps associated with the events,” it read. “Factors such as responding units who failed to mark themselves as ‘on-scene’ or marked themselves as ‘on-scene’ significantly later than when they arrived at the scene affected the average response time calculation.”

CPD declined the Weekly’s request to interview Noé Flores, the assistant director of the Strategic Initiatives Division, which prepared the report.

The CPD report also includes data on the number of gunshots that CPD reported were missed by ShotSpotter sensors. The company’s contract requires ShotSpotter to detect at least 90 percent of unsuppressed outdoor gunfire in the twelve police districts that make up its coverage area. The police department is also required to report to the company, via an online portal and email, verified gunfire incidents for which there was no ShotSpotter alert.

According to the CPD report, the department reported 205 misses to ShotSpotter in 2023, a year that had 43,503 ShotSpotter alerts. 

An investigation the Weekly published in January found CPD reported far more misses to ShotSpotter that year. Between January 1 and December 18, 2023, CPD emailed the company 575 times to report unique gunfire incidents that were missed or mislocated by ShotSpotter sensors.

Ravi Shroff, an applied statistician at New York University, noted that comparing ShotSpotter alerts to the misses reported by CPD doesn’t accurately measure how much gunfire the technology is missing. “That’s probably an underestimate of the actual misses,” Shroff said. “I don’t think it’s absurd to measure that way—it’s sort of hard to measure things you don’t have recorded in data—but I wouldn’t say that’s convincing evidence.”


Several of the researchers noted that the CPD report also failed to include statistics that are commonly used in data analyses, such as the median, variance, and standard deviation. Together, these measurements indicate how much the underlying data vary and therefore how reliable an average is; none of them appear in the report.

David Buil-Gil, a quantitative criminologist at the University of Manchester, said the failure to include such information means there’s no way to tell how reliable the average response times in the report are.

“We do not know if the proportion of cases in which the responding unit did not mark themselves as ‘on-scene’, and took long to do so, vary systematically depending on who initiated the call,” Buil-Gil said via email. “Sometimes average scores are highly skewed by a few very high or very low values, which we call outliers, but we do not know if this is the case here or if outliers have been removed.” 

Without such information, Buil-Gil said it’s impossible to know how accurate the average response times are or whether the response times reported for ShotSpotter alerts and 911 calls are statistically different from one another.

Robert Vargas, the deputy dean of social sciences and director of the Justice Project at the University of Chicago, has been a critic of ShotSpotter; in January he wrote an op-ed in the Sun-Times calling ShotSpotter’s claims of effectiveness “questionable.” Like Buil-Gil, Vargas immediately noted the lack of a median or minimum and maximum values in the CPD report.

“The average can be misleading,” Vargas said. “It’s really puzzling to me why anyone would put so much weight on a single report based on a single statistic.” 
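To illustrate the researchers’ point with invented numbers rather than CPD data, the short Python sketch below shows how a few extreme values, such as officers marking themselves on-scene long after arriving, can pull an average response time far above the median:

```python
import statistics

# Hypothetical response times in seconds (invented, not CPD data): most arrivals
# cluster around five minutes, but two officers log themselves on-scene very late.
response_times = [290, 300, 310, 305, 295, 315, 300, 2400, 3600]

mean_time = statistics.mean(response_times)      # pulled up by the two late log entries
median_time = statistics.median(response_times)  # barely affected by them
spread = statistics.stdev(response_times)        # a large spread signals an unreliable average

print(f"mean:   {mean_time:.0f} s")    # ~902 s
print(f"median: {median_time:.0f} s")  # 305 s
print(f"stdev:  {spread:.0f} s")
```

Reporting the median and standard deviation alongside the mean, as the researchers suggest, is what would let a reader judge whether a figure like the two-minute gap reflects typical responses or a handful of extreme timestamps.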

Other researchers who reviewed the report said the results showing faster response times for ShotSpotter alerts are probably correct, even if all of the officer-reported response times aren’t.

Journal of Quantitative Criminology co-editor Greg Ridgeway said the caveat about response times suggests they could be shorter than what the report shows, but added that ShotSpotter alerts nonetheless probably get faster responses than 911 calls.

“The average response times shown are overestimates,” he said via email. “If an officer delays or forgets to log the arrival time, then some fraction of the arrival timestamps could be much later than when the arrival actually happened.”

If officers forget to log their arrival times at roughly the same frequency for ShotSpotter alerts and 911 calls, the CPD results are probably roughly correct, he added. “It would seem strange (but not impossible) if ShotSpotter calls had more officers recording their timestamps later than their arrival times,” Ridgeway said. Based on the report’s totals, his “best guess” is that police probably are responding more quickly to ShotSpotter than 911.
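Ridgeway’s reasoning can be checked with a toy simulation, again using invented numbers rather than CPD data: if late or missed “on-scene” logging affects both kinds of calls at roughly the same rate, both averages are overestimated but the gap between them survives.

```python
import random

random.seed(0)

def simulate_logged_average(true_mean, n=10_000, late_rate=0.2, late_delay=600):
    """Average *logged* response time when a fraction of officers mark
    themselves on-scene `late_delay` seconds after they actually arrive."""
    total = 0.0
    for _ in range(n):
        actual = random.gauss(true_mean, 60)  # true arrival time in seconds
        logged = actual + (late_delay if random.random() < late_rate else 0)
        total += logged
    return total / n

# Hypothetical true averages: ShotSpotter responses 120 seconds faster than 911.
shotspotter_avg = simulate_logged_average(true_mean=300)
call_911_avg = simulate_logged_average(true_mean=420)

print(f"logged ShotSpotter average: {shotspotter_avg:.0f} s")  # inflated above 300
print(f"logged 911 average:         {call_911_avg:.0f} s")     # inflated above 420
print(f"gap:                        {call_911_avg - shotspotter_avg:.0f} s")  # still roughly 120
```

If late logging were more common for one type of call than the other, the comparison would be biased, which is the systematic difference Buil-Gil says the report gives no way to rule out.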

University of Chicago data scientist Amanda Kube Jotte agreed with Ridgeway but added that the CPD report should have broken the response times down by police district. Jotte said the report makes a “decent” case for ShotSpotter. “I don’t see anything [in the report] that makes me think that ShotSpotter is not useful,” she said.

For Vargas, regardless of the figures in the report, CPD did not provide anything linking faster response times to lives saved. He also noted that the report did not control for other variables or offer any explanation of the results, both of which are standard practices in rigorous scientific research.

“It’s meaningless. It’s just numbers,” Vargas said. “The report itself is not saying that ShotSpotter is effective.”

According to the CPD report, between January 2018 and April 2024, when ShotSpotter alerts were accompanied by 911 calls, officers made 1,173 more arrests, recovered 883 more guns, and found 112,592 more shell casings than when they responded to ShotSpotter alerts alone. Between 2021 and 2024, CPD rendered aid to gunshot victims 349 times when a 911 call came in along with a ShotSpotter alert, and 103 times when there was an alert but no 911 call.

“One of the ways that ShotSpotter sells itself is it allows [police] departments to respond faster to shooting incidents, and to the extent that you believe those [CPD] numbers, I feel like that maybe suggests that claim is true,” Shroff said. “Now the question is, are they actually finding anything when they go there?” 

Evaluating ShotSpotter’s value is more difficult and multifaceted than simply looking at response times, according to Shroff. “It’s hard for me to say objectively that this [report] validates a claim that ShotSpotter is a good thing or a bad thing,” he said. 

In 2018, when then-Mayor Rahm Emanuel announced the contract with ShotSpotter, his administration cited the company’s ability to “reduce gun violence.” That was a central claim in ShotSpotter’s marketing materials at the time. A 2011 study commissioned by the company claimed: “Gunfire crime has been reduced since installation, which commanders at least indirectly attribute to ShotSpotter.”

The study also claimed ShotSpotter can help police “get to the scene faster than 911 alone, increasing the likelihood of an arrest and, as we have seen, decreasing the time to get gunshot victims to life saving medical treatment.”

Studies published by the Chicago Office of Inspector General (OIG) and the MacArthur Justice Center in 2021 found alerts rarely led to documented evidence of gun crimes or prosecutions. 

In February, the Cook County State’s Attorney’s Office published a review that confirmed the OIG’s findings. “ShotSpotter is not making a significant impact on shooting incidents,” the review found, and it only led to arrests in about 1 percent of cases over a five-year period. Of those arrests, almost a third did not involve a firearm. Less than a quarter resulted in criminal charges tied to gun violence.

A 2024 working paper about police response times found that ShotSpotter causes slower 911 responses overall because “police officers are forced to allocate a significant portion of their time to fulfill ShotSpotter requirements, thereby incapacitating them from attending to 911 calls.” According to the authors of that paper, the increase in 911 response times led to a decline in arrest rates for domestic violence.

Michael Topper, a Ph.D. candidate at the University of California, Santa Barbara, who co-authored the working paper, said that while policing technologies like ShotSpotter may seem like attractive solutions in communities experiencing high levels of gun violence, they nevertheless involve trade-offs because of how much officer time they demand.

“Departments really need to think about their resource constraints before jumping on board with something like [ShotSpotter],” he said.

David Carter, the director of Michigan State University’s Intelligence Program, said via email that recent research has shown response time is less important than other factors when it comes to assessing ShotSpotter’s impact on gun violence. “It seems that in most cases…the response time did not have an effect on identifying the source of the shots fired,” Carter said. “What determines if we find someone at the scene of gunfire? [That] would include ShotSpotter notifications, response time, information from callers to the police.” 

He added that Kansas City and Indianapolis both canceled their ShotSpotter contracts as a result of that research.

The timing of the report’s release struck Vargas as an attempt by CPD to aid the company in what he called a “product defense strategy,” as ShotSpotter has come under intense scrutiny by academics, journalists, and policymakers in recent years. 

“The City Council and a number of elected officials have been asking the police department for this data for a long time. So you just can’t help but be skeptical that this is finally being shared only after several studies have shown that this product is ineffective,” Vargas said. “If the City were to base any of its decisions in this manner, it would set a really poor precedent for the use of evidence to inform policy.”

✶ ✶ ✶ ✶

Jim Daley is the Weekly’s investigations editor. Max Blaisdell is the Weekly’s Investigative Hub coordinator, a staff writer for the Hyde Park Herald, and an Invisible Institute fellow.
