Of Kitchen Strainer Algorithms, Cherrypicked Data, and Ignoring Positive News

Sept. 1, 2021
ShotSpotter President and CEO Ralph Clark responds to recent reports attacking the company's technology and services.

By ShotSpotter President and CEO Ralph Clark

When I joined ShotSpotter 11 years ago, I thought our toughest battle would always be the fight against gun violence. I was wrong. I never imagined it would be trying to set the record straight after weeks of baseless, flawed attacks on our technology and services.

Advocates and reporters with a poor understanding of police data, armed with inept analyses, have used classic, misleading rhetorical tricks to falsely portray ShotSpotter technology. I’m here to set the record straight with three key facts.

1. The Kitchen Strainer Algorithm

First, expert human reviewers — not some mysterious, secret algorithm — decide whether or not to publish gunfire alerts to police.

If you read recent attacks, you may think we have some diabolical computer code that decides what is a gunshot and whether or not to send police to the scene of a shooting. But this is nonsense. Our algorithms don’t do this.

To understand what our algorithms do, we need to zoom out a bit.

ShotSpotter’s technology and services are actually simple and transparent. We place acoustic sensors around neighborhoods that suffer from gun violence to detect and time-stamp the distinct audio signatures of impulsive noises — like pops, booms, and bangs. Critics have claimed that these sensors are placed along racial lines; that is again not true. Sensor placement is determined by historical gunfire data, so that communities underserved by police responses to gunfire are protected.

ShotSpotter places enough sensors across a given coverage area that the location of an impulsive event can be determined by comparing the time stamps recorded at each sensor. This is called “multilateration” — a basic technique that has been used in various forms since World War I. The essential math behind it is no secret, though ShotSpotter scientists have refined it to achieve a greater degree of accuracy. It is simple math and physics that engineers have been using for a long time.
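
The underlying calculation is easy to sketch. Below is a minimal, generic multilateration example in Python: given arrival times of the same bang at sensors with known positions, the source location and emission time fall out of a least-squares fit. This is not ShotSpotter’s actual code; the sensor layout, speed of sound, and solver choice are illustrative assumptions.

```python
# Minimal multilateration sketch (illustrative only, not ShotSpotter code).
# Each sensor records the arrival time of the same impulsive sound; we
# recover the source position and emission time by least squares.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s, approximate value in air at 20 degrees C

def residuals(params, sensors, arrival_times):
    """Observed minus predicted arrival times for a candidate source."""
    x, y, t0 = params  # unknown source coordinates and emission time
    dists = np.linalg.norm(sensors - np.array([x, y]), axis=1)
    return (t0 + dists / SPEED_OF_SOUND) - arrival_times

# Hypothetical sensor layout (meters) and the time stamps (seconds)
# each sensor would record for a source at (120, 340).
sensors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])
true_source = np.array([120.0, 340.0])
arrival_times = np.linalg.norm(sensors - true_source, axis=1) / SPEED_OF_SOUND

# Solve the nonlinear least-squares problem from a rough initial guess.
fit = least_squares(residuals, x0=[250.0, 250.0, 0.0],
                    args=(sensors, arrival_times))
print(fit.x[:2])  # approximately [120. 340.]
```

With time stamps from three or more sensors the position is pinned down, which is why spreading enough sensors across a coverage area matters.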

The other way we use algorithms — or machine learning computer code — is to weed out sounds that our sensors detect that may seem like gunfire but are not. Our sensors may detect a car backfiring or a firecracker exploding, but ShotSpotter has no interest in sending police to investigate a car that needs to go to the mechanic or kids lighting firecrackers.

To prevent that, we have two levels of review. First, an algorithm analyzes the sounds and disregards those that are not gunfire, known as “non-gunshots.” It acts like a kitchen strainer: impulsive events that are likely not gunshots pass down the drain, while what remains in the strainer is kept for reviewers to assess, clearing out the noise. The remaining information — sensor audio and other data — is sent to our specially trained human review team to further evaluate and determine whether or not there was a gunshot.
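
To make the strainer metaphor concrete, here is a toy sketch of that first, machine stage. ShotSpotter’s actual classifier, features, and thresholds are proprietary and not public; the score, cutoff, and event names below are invented purely for illustration.

```python
# Toy "kitchen strainer" triage (illustrative only): a classifier score
# strains out likely non-gunshots; everything else goes to human review.
from dataclasses import dataclass

@dataclass
class ImpulsiveEvent:
    audio_id: str
    gunshot_score: float  # hypothetical classifier confidence in [0, 1]

STRAINER_THRESHOLD = 0.2  # hypothetical cutoff for discarding non-gunshots

def triage(events):
    """Stage 1: discard likely non-gunshots; queue the rest for humans."""
    review_queue = [e for e in events if e.gunshot_score >= STRAINER_THRESHOLD]
    return review_queue, len(events) - len(review_queue)

events = [
    ImpulsiveEvent("car-backfire-17", 0.05),
    ImpulsiveEvent("firecracker-03", 0.12),
    ImpulsiveEvent("candidate-gunshot-42", 0.91),
]
queue, strained_out = triage(events)
print(f"{strained_out} events strained out; {len(queue)} sent to human review")
```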

While I recognize that algorithms — or artificial intelligence — have a bad rap and the potential to cause serious violations of one’s civil rights or privacy, our “kitchen strainer” version does not qualify.

In fact, ShotSpotter’s algorithm serves the opposite function from the one it has been wrongly smeared with: it prevents police officers from going to locations where no gunshot occurred by eliminating similar-sounding noises before they even reach our reviewers.

The classification algorithm and ShotSpotter’s human review process are not only good at what they do, but they are getting better every day. How do we know? Because if ShotSpotter didn’t meet customer expectations, we would have a high false positive rate. Even worse, we would have a high false negative rate, meaning there would be many instances where police determined a gunshot occurred in a neighborhood covered by ShotSpotter, but we did not publish an alert. With our technology and human review team, that happens quite rarely.
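
For readers who want those two error rates spelled out, here is the arithmetic with hypothetical counts (these are not ShotSpotter’s figures):

```python
# Hypothetical counts, for illustration only.
confirmed_alerts = 970   # published alerts confirmed as gunfire
unconfirmed_alerts = 30  # published alerts where no gunfire was found
missed_gunshots = 5      # police-confirmed gunfire we never published

# False positive rate: share of published alerts that were not gunfire.
fp_rate = unconfirmed_alerts / (confirmed_alerts + unconfirmed_alerts)
# False negative rate: share of actual gunfire incidents never published.
fn_rate = missed_gunshots / (confirmed_alerts + missed_gunshots)
print(f"false positive rate: {fp_rate:.1%}, false negative rate: {fn_rate:.1%}")
```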

Overall, ShotSpotter operates at a 97% aggregate accuracy rate for real-time detections across all customers — a figure verified independently by Edgeworth Analytics, a data science firm. This means our machine classification is only weeding out “non-gunshots,” so we can pass a manageable number of incidents to human reviewers for analysis. And those human analysts do an outstanding job.

Before they take a seat at one of our Incident Review Centers (“IRCs”), each individual undergoes significant training and must prove that they can differentiate gunshots from similar sounds with at least 99.96% accuracy to be signed off as an official reviewer.

Finally, uninformed voices are asserting that ShotSpotter evidence has not been tested sufficiently. It is important to note that, every time ShotSpotter evidence is introduced at trial, it is open to inspection and cross-examination by the other side. In fact, ShotSpotter evidence and ShotSpotter expert witness testimony have been successfully admitted in over 200 court cases in 20 states. ShotSpotter evidence has prevailed in 13 Frye challenges and one Daubert challenge throughout the United States.

ShotSpotter evidence can be helpful to either the defense or the prosecution. In the Chicago case, The People of the State of Illinois v. Michael Williams, we believe ShotSpotter’s forensic report, and our willingness to testify to the facts in that report, may in fact have contributed to the charges against Mr. Williams being dropped and to his release from prison.

2. Faulty Assertions Based on Incomplete, Limited Data Sets

This leads to the second rhetorical sleight of hand that critics have recently employed: cherry-picking poor data and treating it as definitive evidence.

In the Covid-19 era, when data has become so important to how we navigate this deadly pandemic, one would think we would have learned a valuable lesson. Unfortunately, that isn’t the case.

The recent MacArthur Justice Center (“MJC”) report relied on data too limited to support the judgments it made. ShotSpotter commissioned Edgeworth Analytics to examine the report, and Edgeworth found that 911 call data alone could not possibly support a conclusion that a gunfire incident did not occur. As Edgeworth put it, that data set alone does not tell the full story — not with victims and perpetrators leaving the scene or investigations playing out after the incident.

By contrast, the City of Chicago Office of Inspector General made clear in its report last week that more, and higher quality, data would help it understand the operational value of how ShotSpotter is deployed in Chicago, and it acknowledged that it might not have had all of that data before it. Contrary to how the media and others interpreted the results and data, the Inspector General’s Office was not making the case that ShotSpotter technology does not work.

3. ShotSpotter’s Positive Results

Over the last month, what confounds me most is the media’s complete disregard of the positive impact ShotSpotter has made in the communities we serve. The articles offer critiques, but they have not shared the overwhelming number of positive results we achieve.

Imagine: a lifeguard jumps into the water to save someone from drowning, but in reporting the incident, onlookers focus only on the drink the lifeguard spilled on the way into the water. That is exactly what news outlets have been doing these past few weeks. They’ve focused on a handful of rare errors but never once mentioned the legion of success stories across the country.

Examples that recent media coverage has ignored are easily found. In Oakland, California, 101 gunshot-wound victims were found and aided by police thanks to ShotSpotter alerts when no one called 911. Pittsburgh, Pennsylvania reported a 36% year-over-year reduction in homicides. In Greenville, North Carolina, a 29% reduction in gun violence injuries was reported in the first year ShotSpotter was deployed. In Chicago this April, ShotSpotter alerted police to a shooting, and they arrived at the scene quickly enough to save a 13-year-old boy’s life. Without a ShotSpotter alert, a slower and less precise police response would likely have resulted in the child succumbing to his gunshot wounds. And in the last week alone, ShotSpotter helped in Columbia, Pittsburgh, and Toledo.

Reporters have made no effort to find and speak to victims whose lives were saved because of an alert. No effort to interview police officers who were able to save victims due to ShotSpotter alerts. No effort to speak to neighborhood residents where guns have been taken off the streets thanks to ShotSpotter. Sadly, this has all been about a predetermined cause rather than community — and certainly not about facts.

….

Regardless of how the media continues to cover these issues, we at ShotSpotter know the value that we provide not only to police departments across the country but also to their communities. And, most importantly, so do the police departments and civic leaders that we serve.

At the end of the day, we also know that the current battle we face is a sad distraction from the issue at hand: addressing gun violence to keep our communities safe. This summer, gun violence has surged in many parts of the country, robbing us of American lives. ShotSpotter is a tool that helps law enforcement put a stop to this senseless violence and break the cycle of normalization of gun violence in our communities. We will continue to focus vigorously on doing work that matters: making communities safer for everyone.

Ralph Clark

President and CEO, ShotSpotter, Inc.
