
Police technology expert discusses AI-powered gunshot detection


In mid-November, the Fayetteville City Council decided to contract with ShotSpotter, a company that deploys a gunshot detection system using artificial intelligence in cities across the state and country. The one-year contract is contingent on the city holding three public forums to get community input.

The contract with ShotSpotter is controversial among city council members in Fayetteville, as well as in Durham, where a one-year deployment of the technology has been indefinitely delayed.

To learn more about ShotSpotter and police technology in general, Carolina Public Press spoke with Daniel Lawrence, research scientist at the Center for Justice Research and Innovation at CNA.

You can read the transcript below.

Intro to the interview 

Ben Sessoms: Welcome, everyone. This is Ben Sessoms, reporter at Carolina Public Press. Today we’ll be discussing the California-based company ShotSpotter. 

ShotSpotter operates a gunshot detection system that uses artificial intelligence to pinpoint the location of a gunshot. To do this, ShotSpotter places acoustic sensors in neighborhoods with high gun crime. The company then uses these sensors to detect the gunshots. 

To help address increasing gun violence, cities across the country have contracted with ShotSpotter and brought the technology to their communities. These cities include Chicago; Albuquerque, N.M.; and Houston, among others. Here in North Carolina, the technology is used in cities such as Greenville and Winston-Salem, among others across the state.

There has been some tension among other city councils in the state regarding bringing ShotSpotter to their communities. For instance, the Durham City Council approved a contract with the company in September, but the council decided earlier in November to indefinitely delay the implementation of ShotSpotter technology in the city. 

The Fayetteville City Council has gone back and forth on the issue. After initially approving its own contract with the company in August, the council later decided to reconsider until finally approving the contract with ShotSpotter on the condition that public forums be held to gather community input first. 

Both of the contracts would cost Durham and Fayetteville over $197,000 each for one year of use. 

To discuss ShotSpotter further, I spoke with Dr. Daniel Lawrence, research scientist at the Center for Justice Research and Innovation at CNA. Dr. Lawrence researches police technologies such as ShotSpotter, but he also studies body-worn police cameras and public surveillance systems, among many other research areas that he’s interested in. My interview with Dr. Lawrence was recorded on Nov. 28, 2022. Hope you enjoy.

Start of interview on Nov. 28, 2022

Ben Sessoms: Welcome, Dr. Lawrence. Thank you for speaking with me today.

Daniel Lawrence: Thanks for having me.

Ben Sessoms: So, just to start off, I wanted to talk to you about ShotSpotter, one area of study that you’ve analyzed the data for. Here in North Carolina, there are a few cities that have the gunshot detection software — we have Rocky Mount and Greenville — and it’s been controversial in a few other places.

Here in Fayetteville, where I’m based, there’s been some back and forth on the City Council. They did finally decide to implement the technology and go forward with the contract a few weeks ago. There’s also some controversy in Durham, where they keep going back and forth. So just to start off with, based on your knowledge of ShotSpotter and the data that you look at, what are your general thoughts on the effectiveness of this technology?

Daniel Lawrence: Sure. So let me give a little bit of background before I begin about where my knowledge comes from. 

So back in 2016, I was awarded a grant from the National Institute of Justice that funded a three-city project in Denver, Milwaukee and Richmond, California, to evaluate the ShotSpotter system. As part of that three-city evaluation, my team and I went in and did community member focus groups. We talked to community members about their experiences with gun violence. We also did interviews with police officers and police sergeants about the technology. And then we collected a whole host of different administrative data, as well as doing reviews of case files.

So this was a rather large study, and there are a lot of mixed results associated with it. But I’m happy to talk about some of the main findings. 

So, the way that I see ShotSpotter being implemented is that there are actually three levels at which a police agency can implement this technology. We have a published article out that looks at these three different approaches.

And there’s the first approach, so level one, for example, is just using the technology in and of itself. So getting the technology and having officers respond to those gunshot notifications. And I should also mention here, too, how the technology works. So ShotSpotter is a network of acoustic sensors that are placed in a city primarily based on crime data where shootings occur. So, this information is collected from community members’ call for service from their 911 calls, saying that they heard a shooting or they witnessed a shooting. And then that data is compiled and police agencies are able to identify where the prominent areas of the city experienced gun violence. So that’s where the technology is going to have, for lack of a better word, the most bang for the buck, right. 

So this type of technology, which is meant to identify gunfire and get officers to it, is going to be placed in areas with a known history of gun violence. The acoustic sensors are placed at high points across a neighborhood. When a gunshot occurs — I think they actually use four or more sensors now — the acoustic sensors identify that noise as a possible gunshot, and that information can triangulate the location of the gunshot. That information gets sent to ShotSpotter’s headquarters out in California, in the Bay Area, and they have an individual, a real person, review the file and either confirm that it’s a gunshot or say it’s not. Things like helicopters going by, construction noises or a train going over tracks can also be picked up as anomalies in noise and flagged as possible gunshots. But then they have an individual who’s reviewing it and confirming whether or not it’s a gunshot.
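
To make the triangulation step concrete, here is a minimal sketch of locating a sound source from arrival-time differences at four sensors. ShotSpotter’s actual algorithm is proprietary; the sensor coordinates, timings and least-squares solver below are illustrative assumptions only.

```python
# Toy multilateration: recover a sound source from arrival-time differences
# at four sensors. All positions and times here are invented for illustration.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # meters per second in air, at roughly 20 C

sensors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])
true_source = np.array([180.0, 320.0])  # pretend gunshot location (meters)
arrival_times = np.linalg.norm(sensors - true_source, axis=1) / SPEED_OF_SOUND

def residuals(guess):
    # Compare time DIFFERENCES relative to sensor 0, so the unknown
    # moment of firing cancels out of the equations.
    predicted = np.linalg.norm(sensors - guess, axis=1) / SPEED_OF_SOUND
    return (predicted - predicted[0]) - (arrival_times - arrival_times[0])

fit = least_squares(residuals, x0=np.array([250.0, 250.0]))
print(f"estimated source: {fit.x.round(1)}")  # ~[180. 320.]
```

With clean timings, four sensors pin down a point; echoes and timing noise in real streets are part of why a human review step follows.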

Once confirmed, the information is sent back to the department, and it goes into its CAD (computer-aided dispatch) system. Then an officer is assigned. Everything I just described happens really fast — within 30 seconds from gunshot to an officer being assigned. And officers have actually been found to respond faster to these types of events because the assignment is much quicker for them.

So, that’s level one is getting the officer to the scene as quickly as possible.  I should mention, too, so community members, depending on community relations with the law enforcement agency, depending on how community members view police officers, how they face crime in their city, many community members may or may not report gunfire. It can become something where they’re so accustomed to it that they don’t report it. Or even in this study, we heard that when a department gets ShotSpotter, community members are actually less likely to report it because they assume that the department now is aware of it automatically. 

But what we found in the NIJ study was that the number of gunfire notifications in these three cities — Denver, Milwaukee and Richmond, Calif. — increased almost twofold, by 1.8 times. So previously, the departments weren’t aware of roughly 80% more gunfire than they had been responding to. Now they’re being sent to respond to these notifications to a much higher degree, increasing officers’ workload. In the three cities, we found that the workload for officers responding to these events increased by a range of 108% to 191%.
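
As a back-of-envelope check on those figures — the 1.8x multiplier comes from the interview, while the baseline of 100 calls is invented purely to make the percentages concrete:

```python
calls_before = 100                        # hypothetical 911 shots-fired calls
notifications_after = 1.8 * calls_before  # "1.8 times more notifications"
newly_known = notifications_after - calls_before
print(f"previously unreported gunfire events: {newly_known:.0f} "
      f"(+{newly_known / calls_before:.0%} workload)")  # 80 events, +80%
```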

Level two is when the officers respond to the gunshot alert and actually get out and canvass the area. It’s one thing just to get to the scene and drive through — make sure there are no injuries, no people running with a gun, no fighting going on or anything like that. That’s level one.

Level two is getting to the scene, securing the area, getting out of the vehicle and conducting a canvass. Canvasses are meant to collect evidence and information from community members. The officers can get out of their vehicles, look for shell casings, bullet debris and bullet holes in surrounding buildings or trees, and then do door knocks, talking to community members.

And we found that canvasses for gunfire cases increased from 58% prior to ShotSpotter to about 71%. So getting the officers out of their vehicles to conduct these investigations increased as a result of this technology. That corresponded with increases in interviews with residents as well as in finding shell casings: The share of cases with shell casings found increased from 55% to about 70%.

So the technology, this level two approach, is to get officers out and start collecting evidence. But level three is then what you do with that evidence. And it’s really the integration with other technologies. 

So connecting shell casings from a single event to the ATF’s [U.S. Bureau of Alcohol, Tobacco, Firearms and Explosives] NIBIN — the National Integrated Ballistic Information Network — can help police agencies connect a single event to other events across the city. When the same gun is used, it creates a ballistics fingerprint on the shell casing, and they can connect that information across a network of events. So if there’s a shooting and they have a casing but no suspect, and they connect it to the NIBIN network, they may be able to identify a person they know has used that gun in other events elsewhere in the city. There are other technologies they can use, such as eTrace, which helps identify ownership of weapons. And there’s also more recent technology that connects to public surveillance cameras — having ShotSpotter automatically turn a camera when a gunshot occurs near that area. So the level three approach is really getting to the scene, conducting canvasses and connecting collected evidence with other technologies the department uses.
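
The level-three idea of linking casings across events boils down to matching a shared ballistic signature across a database. NIBIN is a real ATF system, but the toy data model below is invented for illustration:

```python
from collections import defaultdict

# (event_id, ballistic_signature) pairs recovered from canvasses; signatures
# stand in for the toolmark "fingerprint" a gun leaves on a casing.
casings = [("evt-101", "sig-A"), ("evt-207", "sig-B"), ("evt-338", "sig-A")]

by_signature = defaultdict(list)
for event_id, signature in casings:
    by_signature[signature].append(event_id)

# Events that share a signature are candidates for the same firearm.
linked = {sig: events for sig, events in by_signature.items() if len(events) > 1}
print(linked)  # {'sig-A': ['evt-101', 'evt-338']}
```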

Ben Sessoms: So, speaking of collecting evidence in levels two and three: Earlier this year, The Associated Press did a profile on a man who said he was wrongfully accused of killing someone and was jailed for nearly a year based on evidence from ShotSpotter. Later in that report, the AP published findings of an investigation, quote, “Based on review of thousands of internal documents, emails, presentations and confidential contracts, along with interviews with dozens of public defenders in communities where ShotSpotter has been deployed.” The AP said it had “identified a number of serious flaws in using ShotSpotter as evidentiary support for prosecutors.”

Based on the studies you’ve done and what you were just saying about police officers collecting evidence, do you think there’s any concern about ShotSpotter producing evidence that isn’t accurate and could result in wrongful convictions of people who weren’t involved in a shooting?

Daniel Lawrence: So I think ShotSpotter shouldn’t be viewed as the panacea to gun violence here. It’s one single tool that can be used in a broad array of different law enforcement approaches. 

So, like I was discussing earlier, ShotSpotter is meant to get the officers to the scene to collect evidence. But then it’s about connecting additional information: statements from witnesses, from victims, from individuals who were in the area and may have heard the gun violence. They can connect that to cellphones, to cellphone towers. They can look at public surveillance systems. A detective is meant to do their due diligence to create a case that would lead to a conviction. They pass that information along to the prosecutor, the prosecutor reviews it and decides whether or not to prosecute, and then it goes to court. From there, it goes through the court system — either a plea bargain process, if a suspect wants to go through that, or a judge or a jury trial.

So there’s all these checks and balances along the way. So I get a little — that’s sensitive — but I get a little, I guess, concerned when I think that a lot of these studies or these different news articles are emphasizing that it’s ShotSpotter that is resulting in these different outcomes. 

I myself haven’t studied the direct correlation or causation of ShotSpotter’s impact on court outcomes, court dispositions, such as being charged with something. But I will say that in our study, we did find, and this is where the results are a little bit mixed, more mixed for ShotSpotter, is the impact on crime. So a lot of news articles, a lot of reporters, they really emphasize the effect it has on arrests, and I can understand that. 

You want a technology that’s going to help officers make good arrests, and you want them to get people off the streets who are committing serious violent crimes. So you associate this type of technology with getting those individuals off the street via an arrest.

In Richmond, we actually found no impact from ShotSpotter on arrests. The same was true in Milwaukee. In Denver, though, we did see an increase in arrests for crimes that involved a firearm. So those are very mixed results.

Other cities have also found not much of an impact on arrests for firearm violence connected to this technology. And I would reiterate, that’s largely because the technology’s primary focus is to get officers to the scene as quickly as possible and collect whatever evidence is available.

In a logical and ideal world, that evidence would then build a case against an individual, but these are very complex, complicated events. A shooting, where there may or may not be an injured individual, may or may not yield sufficient evidence to lead to an arrest.

So in terms of ShotSpotter, in and of itself, leading to negative results for an individual: I’d say ShotSpotter is just one piece of the overall case, of the overall evidence collection. And that evidence may or may not support a valid outcome.

Ben Sessoms: Would you say, then — since, as you’re saying, ShotSpotter is being used to get police officers out there to investigate a shooting — that ShotSpotter really shouldn’t be used exclusively as an evidence-gathering tool, but more as a tool that can help police officers gather evidence?

Daniel Lawrence: That’s right. Yeah. I mean, it can’t be used exclusively as the only tool. I don’t think any prosecutor would just say, “Because a ShotSpotter alert happened in this location, we’re going to prosecute this individual.” It needs to be connected to a whole host of different supporting evidence to back that claim.

Ben Sessoms: And just to follow up on the study that you’ve been talking about. So, you said Denver, Milwaukee and Richmond, California. What was the time frame on those studies? You said it started in 2016?

Daniel Lawrence: So the study itself ran from 2016 to 2019. But it looked back across the different agencies’ implementation of ShotSpotter, and that ranged from — I’d have to look back at my report — but I think around 2010 to 2016 or so.

Ben Sessoms: OK. And you said there were some mixed results with this study. I know you already mentioned the impact on arrests — not much impact except in Denver. What other mixed results did you find in that study?

Daniel Lawrence: So, one of the general positive findings I found consistently across my analyses is that ShotSpotter does result in a reduction in response time.

In Denver, for example, the response time for 911 calls for service for shots fired was about 8½ minutes. With gunshot detection technology, with ShotSpotter, that decreased to about four minutes and 45 seconds.
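
Both totals in that comparison come from the interview; the arithmetic below just makes the size of the improvement explicit:

```python
from datetime import timedelta

baseline_911 = timedelta(minutes=8, seconds=30)      # 911 shots-fired calls
with_shotspotter = timedelta(minutes=4, seconds=45)  # ShotSpotter alerts
saved = baseline_911 - with_shotspotter
print(f"time saved per response: {saved} ({saved / baseline_911:.0%} faster)")
# time saved per response: 0:03:45 (44% faster)
```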

So there was this dramatic decrease in response time, which is a positive thing. And when you break response time apart, there are three main components.

There’s the call or the notification of the alert to an officer assignment. There’s the assignment to their arrival. And then there’s the fuller picture, which is the call to the arrival, which is what community members are most interested in, because they call 911, and they want to get an officer there as quickly as possible. 

But when you break response time apart into those three domains, what you actually find is that officers aren’t responding quicker to ShotSpotter calls because they view them differently. The reality is that there’s a faster response time from the notification, or call, to the assignment — period — because it’s automated. When a community member calls 911 to report shots fired, they might be stressed, they might be scared. The dispatcher is trying to collect information from them: the location, any details about the suspect, the clothes they’re wearing, their demographic characteristics. That information takes time to get from the community member, and time to put into the computer-aided dispatch system. So that can delay the response.

ShotSpotter is more automated in that it automatically identifies the location, and an officer can start responding. ShotSpotter will also feed in information. So if the GPS coordinates of the event put it on a roof, they can put that in the notes, and the officer knows, “OK, I’m looking at that roof instead of down at street level.”

And I will say there is some inaccuracy in the specific location. It’s not down to the foot — I think they give something like an 80-foot circle around the location. So that’s generally the area the gunshot occurred in, but it does get them to the general block in a quicker way.

So that’s a positive thing. Response times are faster. I already mentioned previously that the workload increases. So, the amount of gunshots that officers are responding to increases. That could be viewed as a positive, or it could be viewed as a negative. That’s going to require more money for officers. If they’re spending more time on gunshots, they’re spending less time on other things, then that might require additional officers being hired by the department. 

And then, in terms of crime — I mentioned arrests, but there are mixed results on the impact on crime. In Richmond, we actually found a reduction in violent crime, pre- and post-ShotSpotter implementation. That primarily came from robberies. We didn’t study this directly, but the hypothesis is that as more officers respond to these priority-one events — gunfire — they’re arriving in the area with sirens and lights flashing, and that might actually reduce the opportunity for street violence, for street robberies.

So, we’re seeing in Richmond, and in two of the locations in Milwaukee, decreases in the amount of robberies. But in the other two locations in Milwaukee, and in Denver, we didn’t see this decrease. There was no impact on robberies or violent crime. So, it’s a very generally mixed findings on the impact it has directly on crimes. 

But I would also emphasize that it’s getting officers to the scene faster. And I should mention, too, there’s been some medical research in this area: Because officers are getting there faster, how are people who are struck by gunfire affected? Researchers found that ShotSpotter alerts are correlated with individuals with more severe injuries being more likely to be taken to the hospital directly by the officer — getting to the hospital quicker because, instead of waiting for an ambulance or EMT, the officer will put them in the cruiser and get them there as quickly as possible. That being said, multiple studies looking at this didn’t find any effect on mortality levels. Whether police were notified by 911 or through a ShotSpotter alert, the mortality levels are equivalent between those two groups.

Ben Sessoms: Yeah, and you were talking about the impact on arrests earlier. Here in North Carolina, based on data that ShotSpotter itself puts out — in Greenville, in 2019, the data shows that incidents of gun violence and homicides both decreased.

They report similar numbers in St. Louis, Mo. 

But there’s other cities such as Charlotte, who ended their program in 2016 because they said the city’s police force didn’t make arrests or identify crime victims to the level that they thought that they could — kind of going back to what you’re saying with the impact on arrests. 

There’s also a report last year from the Chicago’s Office of Inspector General that concluded, quote, that the technology, quote, “Rarely produce evidence of gun related crime, rarely gave rise to investigatory stops and even less frequently led to the recovery of gun crime related evidence during an investigatory stop.” Does this go into what you’re talking about with the mixed results that there are some benefits, where you were talking about some studies conclude that officers can get medical attention to gun crime victims quicker, but in other places where, maybe such as Charlotte, maybe such as Chicago, where it’s not getting the amount of arrests and evidence that they thought it would. Is this the mixed results that you’re talking about? Does this make sense with the studies that you’ve done?

Daniel Lawrence: So yes, I think that’s right. I would say that reflects the Chicago study and implementation in major cities — they had it in San Francisco; I’m not sure if they still do in Chicago. This technology is ideal in open areas. Just imagine an open field: It’s going to be very accurate in an open field.

But in an urban setting, you have buildings, and you have what are called reports, or echoes, from gunshots, which can be picked up as well. If there’s a shooting next to a vehicle like a box truck, or in an alleyway, that’s going to affect the accuracy of this technology. And it’s difficult and complicated to actually collect this type of evidence. Some offenders may be aware that ShotSpotter is in place and may resort to using revolvers, which don’t leave a shell casing behind, so that type of evidence can’t be collected. And keep in mind, these types of events are extremely quick. You might have a shooting, and then everyone in that immediate area leaves. So the officer arrives, and nobody’s there — and then it’s a question of whether they actually get out at that point and conduct a canvass, trying to collect information from residents.

Some programs — I think they were doing this in Denver and Milwaukee — acknowledged that community members may not want to be seen speaking with officers. So they’ll leave little note cards or door hangers saying, “There was a shooting in your neighborhood. If you heard or saw anything, please call us at this number.” That way the community member can contact them in a more private way. All that said, the technology is primarily meant to help officers collect evidence, but these types of events are challenging to collect evidence from. It’s an aid to officers in that way: It gets them to the scene, to the pinpoint of where it occurred, and gets them there faster. But that doesn’t necessarily mean, as the Chicago report found, that they’re actually going to find evidence they can use to initiate an investigation, make an arrest and so on through prosecution.

Ben Sessoms: So, I also want to talk about some privacy concerns that some people have about ShotSpotter. First off, I know you broke this down earlier, but could you again break down how this technology works? You have acoustic sensors that are listening for gunshots. Once they hear that loud noise, can you go through the process of how it gets from that to police on the scene?

Daniel Lawrence: Sure. It’s worth noting here that the acoustic sensors are always on. They’re always attempting to sense those anomalies in noise — those loud bangs that may or may not be a gunshot.

When that occurs, it collects a four-second segment of audio: two seconds before that noise, two seconds after. Or, if the noise itself is longer — say it’s a gunbattle, where two people with guns are fighting it out — that segment will be much longer. During that segment of gunfire, noise is being collected, and that can include people yelling, people screaming, people saying names: “Don’t fire back,” “Leave the area.” Things like that.
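
A minimal sketch of that capture behavior — scan a mono audio stream for loud impulsive samples and keep roughly two seconds on either side. The threshold, file names and 16-bit mono format are assumptions; ShotSpotter’s real detection model is proprietary:

```python
import numpy as np
from scipy.io import wavfile

THRESHOLD = 0.8  # normalized amplitude treated as a "bang"; an invented value

rate, samples = wavfile.read("street_audio.wav")     # hypothetical recording
audio = samples.astype(np.float32) / np.iinfo(np.int16).max

loud = np.flatnonzero(np.abs(audio) > THRESHOLD)     # indices of impulses
if loud.size:
    start = max(loud[0] - 2 * rate, 0)               # 2 s before the first bang
    end = min(loud[-1] + 2 * rate, len(audio))       # 2 s after the last one,
    clip = audio[start:end]                          # so a gunbattle runs long
    wavfile.write("segment_for_review.wav", rate,
                  (clip * np.iinfo(np.int16).max).astype(np.int16))
```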

But what it doesn’t do, it doesn’t constantly record a community. It’s not being used outside of that focus on gunshot activity. When that anomaly occurs, that noise anomaly occurs, the information is sent to ShotSpotter’s headquarters. There, they have trained people who review that information. They actually can listen to the sound segment. They can better identify the type of weapon it might be. Or they can make a decision. If it’s actually a train going over tracks. They actually review the .wav file to the sound file. So, sound, when it’s collected by sensors creates a sound file that you can actually better assess to know if it’s a gunfire or not. Different sounds make different types of waves. So they can actually assess, based on .that .wav file, if it’s a gunfire. And then all that information. So, the departments can actually access this information in a number of ways. 

So, once a gunshot is published by ShotSpotter — which means it’s confirmed as gunfire — it goes to the department’s computer-aided dispatch. It can get to the dispatcher and then be assigned to an officer; that happens pretty automatically through the department’s dispatch. But it also goes to the department’s ShotSpotter software. So police agencies can access this information in police cruisers — they might have the program on their computers, and every time there’s a gunshot alert, it will make a noise. They can go into the program and see where it is. They can actually hear the .wav file. They can assess if people are getting hit, if people are screaming, things like that. They can also access that information on their smartphones, if the department provides them with one or if they have a ShotSpotter account.

So they have a number of different ways to access the ShotSpotter alert that gets them to the scene — either through their dispatcher, or through the software itself on the computers in their vehicles or on their smartphones.

Ben Sessoms: So the noises and the data that these acoustic sensors are collecting, it’s not completely interpreted by artificial intelligence? There’s people listening to determine if this is actually a gunshot?

Daniel Lawrence: Right. The artificial intelligence is meant to flag instances where there’s a higher likelihood of gunfire, but all those incidents — to my knowledge, at least when I was doing the evaluation of ShotSpotter — go through an employee of ShotSpotter, I forget what they call them, who reviews the alert and then publishes it to the department.

Ben Sessoms: So, kind of going into what you were saying — ShotSpotter says the acoustic sensors are high up, out of earshot of private conversations. I do think some people may feel cautious when they hear that, because there are always concerns about surveillance, especially with modern technology.

Do you think that’s an issue here? With having acoustic sensors in the neighborhood? Do you think there’s any concerns, or legitimate concerns, about privacy beyond the loud noise where, like you said, it could pick up someone screaming, but it’s not going to pick up private conversations? Right?

Daniel Lawrence: So, I mean, I think there are valid concerns there, in that we’re currently in a time when we’re sensitive to Big Brother. We’re sensitive to law enforcement taking control of the public aspects of our lives, and we want to be sensitive to that. I think that’s valid. And I also think there are valid concerns about the implementation of this technology primarily in communities of color.

It’s unfortunate, but that American society, the way things have developed in the past 200 years, there’s a high correlation between high social economic poverty and being a community of color. So, when you have increased poverty, you have increased violence, increased crime. And that’s associated with increased shootings, which is then taking this technology and implementing it in those areas that are predominantly communities of color. And that’s an unfortunate reality of the society that we’ve created. 

In terms of the audio sensitivities — I thought this was interesting. I read an article recently about Durham’s implementation of ShotSpotter. At a City Council meeting, it was decided that the schools wouldn’t partner with ShotSpotter to put the acoustic sensors on their buildings. The reason given was that they didn’t want, for lack of a better term, microphones on their school properties recording students.

And what caught me by surprise was that the schools already have public surveillance cameras. And while public surveillance cameras aren’t collecting audio with the recordings, they’re much more intrusive, at least in my view, in recording students and their activities.

So, I just think there’s a lot of misconceptions about this technology and misunderstandings of what it’s meant to do and how it’s used. So, all this said, I think, yes, there are valid concerns, because we don’t want police departments just recording everything and listening in on public conversations. And there’s lots of due process issues associated with that. 

But that being said, in my experience, that’s not how I observed this technology being implemented. I mean, I’ve listened to ShotSpotter recordings — actually, I think all the ones I listened to were just gunfire. That’s all you hear in the recording.

One officer did mention a case where the prosecutor used the audio from ShotSpotter because a community member yelled out someone’s name and something like, “Don’t do that,” and then there was a shooting. So they used that audio as part of their case to connect people to the event that ShotSpotter identified.

Ben Sessoms: But that was just the one recording the officer mentioned. In all the ones you’ve heard, there weren’t any voices? It was just gunshots or loud noises?

Daniel Lawrence: Right.

Ben Sessoms: Yeah. I also wanted to go back to — and I’m glad you brought this up earlier, about this being deployed in communities of color. As you talked about, we have a long history of socioeconomic conditions in communities of color, due to a history of systemic racism, and that’s impacted crime rates and such. So, yeah, I did want to ask you about that.

So, these acoustic sensors are often deployed in high crime neighborhoods, which as you were just saying, those are usually communities of color. I know there’s a lot of concern about over-policing in Black and Brown communities in this country. 

Do you think there’s any concerns about ShotSpotter being used as a tool to contribute to that overpolicing of Black and Brown communities?

Daniel Lawrence: So, yes and no. As I mentioned previously, the technology does increase the number of gunfire notifications that police departments are aware of. So that’s going to result in more officers responding to those gunfire events, more officers present in those communities.

That being said, the implementation of the technology is based on community members’ calls for service — 911 calls for gunfire. So it’s a known issue that these areas of a city experience gunfire, and any assistance is going to be helpful. We also conducted community focus groups as part of the NIJ study.

We did focus groups with community members within the ShotSpotter coverage areas as well as outside of those coverage areas. Within the coverage areas, communities want police to respond to this violence. They don’t want high crime. They don’t want gunfire going off when they’re walking home. They don’t want to fear for their children’s lives. They don’t want to have to put kids in bathtubs when they hear gunfire outside their door. So they want the police to respond, and that’s predominantly what you find — you can see in Gallup national surveys that these communities want police to respond to violent crime. And this is a technology that just gets those officers to the scene in a much more efficient way.

Ben Sessoms: So yeah, I mean, obviously everyone in a community, like you were saying, wants the police to respond to gun violence. I think anyone along the political spectrum wants to reduce gun violence — people may disagree on how to get there, but that’s the common goal. Based on studies and data that you’ve seen, is ShotSpotter a useful tool for preventing that gun violence? Or, given the mixed results you were talking about earlier, do you think it’s more of a problem than it’s worth?

Daniel Lawrence: Yeah, I mean, we haven’t even talked about the cost, so that comes into this conversation. It is an extremely expensive technology for cities to implement. I think we found it was around $65,000 per square mile, or something like that.

So, it’s an extremely expensive technology middleman for cities. So, you see these huge contracts for ShotSpotter’s implementation, and then most chiefs and city councils will ask, “Is it worth that amount of money? Is it actually having an impact?” And, like I said previously, it’s very specific to how cities use the technology. You can have a city that’s just responding to the alerts themselves, getting to the scene, driving through making sure that there’s no one injured, no one with the gun, and then leaving at that. 

That can happen in these really high-crime areas where officers are required to go from call to call to call and can’t actually get out of their car for an hour and investigate whether a shooting occurred. And that’s what I suspect is likely happening in Chicago, where officers just have so many priority calls that if they drive through and don’t see a person injured or evidence of gunfire, they’re not getting out of their car. They’re just going on to the next call.

And in terms of whether it’s having the effect it’s meant to have — I would say it is, because it’s primarily meant to be a response tool to collect additional evidence, and the NIJ study in Denver, Milwaukee and Richmond found evidence of that. Officers are responding faster, they’re doing canvasses, they’re collecting casings. But is it reducing crime? No — that’s a much more complex issue. And it may, based on how a city implements it: whether they’re just responding, versus collecting evidence, versus connecting it to other technologies to build a case. So it’s all city-specific. It also relies a lot on how cities manage the program — whether they have a sergeant whose full-time job is to be responsible for this single program in the city, making sure officers are held accountable to do canvasses. If an officer comes back and says, “There were no casings found,” that sergeant may go out within 24 hours and conduct their own canvass. They do that in Milwaukee, and they do that in Denver. And they actually found instances where an officer says, “Yeah, I did a canvass, but we didn’t find anything,” and then the sergeant goes back within 24 hours and finds shell casings under leaves or something like that.

There’s other aspects of technologies such as using canines to sniff for shell casings. They also have magnets that can pick up shell casings. So all these different aspects to making sure that the program, and the technology itself is going to be as successful as it can be. And that correlates with again, that terrible pun for this conversation. But is it the best bang for the buck? Are you actually getting the most return for the investment that you’re doing? 

And I’ll also say, too — we didn’t publish this; it’s in our final report for the NIJ study — but we did look at a cost-benefit ratio. And it’s really complicated, because the cost estimate associated with a homicide, I think, is just shy of $10 million. That includes things like the societal impact, the impact on family members, the reduced taxes collected from that individual over their lifetime, all these other components. So when you’re looking at something like gunfire, where the likelihood of death is much higher than with other crimes, and you’re looking at the impact on homicides associated with the technology, you can find dramatic shifts in whether or not it was cost-efficient, depending on how you include homicide in that model.
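
A toy version of that sensitivity analysis — the roughly $10 million homicide-cost estimate and the $65,000-per-square-mile figure come from the interview, while the coverage size and homicides-averted values are invented inputs:

```python
coverage_sq_miles = 3
annual_cost = 65_000 * coverage_sq_miles  # per-square-mile figure cited above
homicide_cost = 10_000_000                # "just shy of $10 million"

for homicides_averted in (0.0, 0.02, 0.1):
    ratio = homicides_averted * homicide_cost / annual_cost
    print(f"{homicides_averted:.2f} homicides averted -> "
          f"benefit/cost = {ratio:.1f}")
# Small changes in the assumed homicides averted swing the ratio from 0 to >5,
# which is the "dramatic shift" Lawrence describes.
```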

Ben Sessoms: I think some people, when they look at gun violence, would point to what they’d call the root cause — addressing poverty and socioeconomic conditions. Based on your research on addressing gun violence and crime in general, do you think something like ShotSpotter — if it fits the city, if a city finds it’s worth the cost of this expensive technology — should be used in tandem with other things that aren’t necessarily police-related, addressing socioeconomic conditions as well as using something like ShotSpotter to directly address crime?

Daniel Lawrence: Yeah, so I think we’re at a point in society — and it was more prevalent a couple of years ago than it is now — but we’re at a point in society where we’re really trying to question the application of police officers and how they are present in our communities and the things that they respond to. Can we have social workers respond to different types of events? All those different questions. 

So, I don’t think it’s an either-or type of question, like you were saying, it needs to be done in tandem. So a city paying a million dollars for ShotSpotter. Could that million dollars have been implemented to create community programs to get teenagers summer jobs, for example? The answer is, yes, that certainly could happen. 

But at the same time, you’re taking that money away from the police department, and they’re going to have slower response times, less knowledge of gun violence in the community and all these other things. So maybe the answer is — and I don’t know the perfect ratio — but say $500,000 to a community organization and $500,000 to ShotSpotter: working city budgets in a way that can make these types of community violence programs most beneficial at reducing violence within the community, through community programs, as well as supporting the traditional law enforcement response to these types of events. So yes, it needs to be in tandem. It can’t be all the money goes to the community or all the money goes to the police department. It really is a partnership that’s necessary and that’s going to have the largest impact.

Ben Sessoms: Well, thanks for your time, Dr. Lawrence. I appreciate you talking about this technology with me.

Daniel Lawrence: Thanks so much.

End of interview

ShotSpotter responds to points in the interview

Ben Sessoms: Thank you, everyone, for listening to that interview with Dr. Lawrence regarding ShotSpotter. CPP contacted ShotSpotter Inc. after the interview was completed to give the company a chance to respond to some of the studies and criticisms of its technology that came up. Representatives from ShotSpotter responded via email, separate from the interview; some of their points are included below:

  • Regarding the lack of increased arrests found in the NIJ study that Dr. Lawrence cited throughout the interview, ShotSpotter said this, quote, “ShotSpotter’s goal is to save lives and improve public safety. Perpetrators do not typically remain at the scene of a gunfire incident awaiting police response. Thus, the number of arrests made at the scene or suspects named in an initial police report is not an accurate way to measure ShotSpotter’s effectiveness. Rather, ShotSpotter’s value is in helping police officers find and aid victims faster, increasing evidence collection and building community trust,” unquote.
  • Regarding the cost of ShotSpotter’s gunshot detection system, ShotSpotter said this, “The National Institute for Criminal Justice Reform has conducted research on the cost of gun violence and found that gun violence costs the U.S. economy an estimated $299 billion every year, significantly more than ShotSpotter’s average cost of $7.99 per square mile per hour of coverage. By itself, ShotSpotter is not a cure-all, but studies have shown it’s a critical part of a comprehensive gun crime response strategy that saves lives,” unquote.
  • Regarding the Chicago Office of Inspector General’s report from last year that I mentioned in the interview with Dr. Lawrence, ShotSpotter said this, “The Chicago Police Department consistently described ShotSpotter as a critical part of how they respond to and solve crime. Nonetheless, partisans twist the OIG report’s conclusions in order to spread a false narrative about ShotSpotter’s effectiveness. It is wrong to call an alert without immediate evidence of a crime a false alert and the OIG report did not specifically suggest that ShotSpotter alerts are not indicative of actual gunfire. In fact, Inspector General Deborah Witzburg stated: ‘Our study of ShotSpotter data is not about technological accuracy, it’s about operational value,’ and ‘It’s entirely possible that every one of the 50,000 alerts correctly identified the sound of gunfire.’”
  • ShotSpotter went on to say regarding this Chicago OIG report that: “Linking an alert with evidence of a shooting can be challenging as a high number of alerts happened late at night, making evidence collection and identification of witnesses difficult. In fact, calls to 911 for gunshot incidents in the same location led to evidence recovery in only 4% of incidents. Superintendent David Brown credited ShotSpotter alerts with 125 lives saved in the last five years, recovery of 2,985 firearms and 24,421 pieces of evidence. And in a recent survey, 72% of Chicago residents showed support for gunshot technology,” unquote.
  • In response to ShotSpotter being used in communities of color, ShotSpotter said this, “ShotSpotter coverage areas are determined by police using objective, historical data on shootings and homicides to identify areas most impacted by gun violence. All residents who live in the communities experiencing persistent gunfire deserve a rapid police response, which gunshot detection enables regardless of race or geographic location,” unquote.
  • In regards to the privacy concerns that were brought up in my interview with Dr. Lawrence, ShotSpotter said this, “ShotSpotter sensors are designed to record only loud impulsive sounds — pops, booms and bangs. An independent audit conducted by the New York University Policing Project concluded that the risk of voice surveillance is extremely low. Also, in 2019, the Privacy Advisory Commission of the City of Oakland unanimously approved the Oakland Police Department’s continued use of ShotSpotter despite one of the strongest surveillance ordinances in the country,” unquote.
  • In regard to The Associated Press report that I brought up during one of my questions to Dr. Lawrence, ShotSpotter has a very lengthy response on their website regarding this Associated Press report. I will link that in the transcript of this interview, in the article where you found this story. I’ll also link all the studies and independent audits that ShotSpotter sent me, if you want to take a look at those. So again, thank you everyone for listening to my interview with Dr. Daniel Lawrence, and I hope you enjoyed it. I hope you learned something, and I hope you have a great rest of your day.
 

Ben Sessoms is a Carolina Public Press staff writer based in Fayetteville. Send an email to bsessoms@carolinapublicpress.org to contact him.

Carolina Public Press is an independent, in-depth and investigative nonprofit news service for North Carolina.

