As the debate over data-driven predictive policing continues to heat up, not every department is ready to back away from its use.
Santa Cruz, CA is ending the practice due to concerns over racial profiling. But one Florida sheriff remains all-in, using data to track down, monitor and interrogate those on a watch list, without probable cause, a search warrant or any evidence of a crime, according to a recent investigation published by the Tampa Bay Times.
Pasco County Sheriff Chris Nocco, whose jurisdiction lies just north of Tampa, FL, has for years used databases to predict where crime might occur and who may commit it. But according to the Times investigation, instead of creating a cutting-edge intelligence program to thwart crime before it happens, he has built a system used to continuously monitor and harass residents on a watch list.
The sheriff’s office disputes this.
One instance listed in the article involves a 15-year-old boy whose crime was stealing a motorized bicycle from a carport. The boy was placed on the watch list, and since then sheriff’s deputies have gone to his house 21 times, shown up at a car dealership where his mother works, checked to see if he was at the gym and looked for him at a friend’s house, according to the article.
Rio Wojtecki, who became a sheriff’s office target last year, already had a state juvenile probation officer checking on him. But that did not stop the surveillance and visits to his house.
“More than once, the deputies acknowledged that Rio wasn’t getting into trouble. They mostly grilled him about his friends, according to body-camera video of the interactions. But he had been identified as a target, they said, so they had to keep checking on him,” The Times reported. “Since September 2015, the Sheriff’s Office has sent deputies on checks like those more than 12,500 times, dispatch logs show.”
Analytics are key to predictive policing
Predictive policing applies mathematical modeling, predictive analytics and other analytical techniques to law enforcement in an effort to identify potential criminal activity. It falls into four general categories: methods for predicting where and when crimes will occur, methods for predicting likely offenders, methods for predicting perpetrators’ identities and methods for predicting crime victims.
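The first of those categories, place-based prediction, is in essence a ranking problem over historical data. As a rough illustration only (not any vendor’s actual method), the simplest version buckets past incidents into grid cells and ranks cells by frequency; the coordinates and cell size below are invented for the sketch.

```python
from collections import Counter

def rank_hotspots(incidents, cell_size=0.01):
    """Map (lat, lon) incidents into grid cells and rank cells by count.

    cell_size is the grid resolution in degrees; both the function name
    and the parameters are hypothetical, chosen for this illustration.
    """
    counts = Counter(
        (round(lat / cell_size), round(lon / cell_size))
        for lat, lon in incidents
    )
    return counts.most_common()  # [(cell, count), ...] highest first

# Invented example data: two incidents at the same location, one elsewhere
incidents = [
    (28.244, -82.719),
    (28.244, -82.719),
    (28.252, -82.731),
]
print(rank_hotspots(incidents)[0])  # the most frequent grid cell
```

Commercial systems layer far more elaborate statistical models on top of this kind of aggregation, but the underlying input is the same: where crime has been recorded before.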
The methods, though, are under continued scrutiny amid a national reckoning over police brutality and systemic racism, at a time when this type of data-driven prevention tool is still in use, MIT Tech Review reported.
In Santa Cruz, the technology simply was not having a positive impact, said Mayor Justin Cummings. After discussions with the police chief, he decided to end the use of predictive policing earlier this year.
“Some members of the community brought it to my attention because they were just concerned with that level of surveillance,” Cummings said. “When I met with the police chief, he had actually mentioned that when he came on in 2017, he didn’t find the technology to be effective. As I continued to look into it, we decided to end it. What comes out of the algorithms can be biased. It was around the time when social unrest after the murder of George Floyd was building emotion within the community to do more to protect people of color. This was one more factor biased against people of color.”
Santa Cruz also put a ban on the books against facial recognition software, although it was not in use there.
“There is no real positive impact of using predictive policing and we haven’t seen any changes in policing as a result,” Cummings said. “There was no drop in crime” while using the data-driven method of policing.
Pasco denies abuse
In the case of Pasco County, Nocco’s office denies any abuse of the system and takes exception to the Times article.
“We have several fundamental issues with their reporting, as it strays far from fact and attempts to pigeonhole an integral part of our operations in Intelligence-Led Policing as a unit with the sole purpose of harassing citizens prior to them committing a crime,” said Pasco lead Public Information Officer Amanda Hunter.
She said the sheriff was not available to respond directly for this article, but the office will “vigorously defend the good work of our members, nor will we apologize for keeping our community safe.”
John Hollywood, senior scientist for the Rand Center for Quality Policing, who studies predictive policing, called Pasco’s use of this data-driven technique “a conflation of ideas thrown together. Occasionally we hear about this where things are getting lost in translation, using prediction in the intelligence cycle. As someone who has done a lot of work in IT and homeland security, it does seem to be a confusion on what you are supposed to do with that (information).”
Data-driven policing has not delivered the expected results, the MIT Tech Review states. Activists, mathematicians and academics across the country have denounced the untested software police are using.
Andrew Ferguson, author of The Rise of Big Data Policing: Surveillance, Race and the Future of Law Enforcement and a professor at American University’s Washington College of Law, said the goal of predictive policing was to put officers at the right place and the right time to reduce or deter crime. “That, over time, evolved into person-based policing which we’ve seen in Los Angeles and Chicago and in Pasco County.”
If a department is using person-based predictive policing, it must consider the financial costs, opportunity costs and “what you are not focusing resources on, and also the downstream impacts on communities, individuals’ lives, liberty and relations between police and the people they police. There has been a series of experiments with person-based policing and none of them have worked.”
What the Pasco Sheriff’s Office is doing, he said, is determining who belongs in a community and who it wants out.
“They are proactively saying certain individuals are not wanted in the community. That is not preventing crime,” Ferguson said. “It is certainly not solving crime. What it is doing is undermining respect for police and the law. It’s not healthy. If police did this without the pretend legitimate theory of predictive policing, they would be considered rogue officers. That would seem completely against good policy. That they are claiming this is based on a theory of predictive policing policy doesn’t legitimize bad policing.”
New technology changes police methods
His book, written in 2017, focuses on how new technologies are changing where police patrol and whom they target and investigate, and on how those technologies are reshaping relationships “because they are redefining racial inequity in our society and building off of a false myth that data can be objective and can work independently. The book looks at technology and policing and traces through some of the failed experiments.”
There are less harmful examples of how to use predictive policing, Ferguson said. “Figure out where in a society there is need, where there are gaps and perhaps redirect financial or social resources to those gaps. The analytics might spotlight a young man who needs help or the location of potential crime. Try to help that young man or figure out why there is environmental vulnerability.”
The American Mathematical Society, in an open letter, urged its fellow experts to stop assisting law enforcement in this way.
“In light of the extrajudicial murders by police of George Floyd, Breonna Taylor, Tony McDade and numerous others before them, and the subsequent brutality of the police response to protests, we call on the mathematics community to boycott working with police departments,” the letter stated.
“There are also deep concerns about the use of machine learning, AI (artificial intelligence) and facial recognition technologies to justify and perpetuate oppression.”
The MIT Tech Review article states that predictive policing using historical data has been used to target the poor, minorities, low-income neighborhoods and people involved in low-level crimes. That has led to the controversy over its use and decisions to end the practice in some places.
The Rand Center’s Hollywood agrees.
The Chicago Police Department, which used what it called a Strategic Subject List, or SSL, quietly ended the data-gathering effort of predicting who would commit crime or become a victim.
The Chicago Tribune reported in January that the department acknowledged the effort did not reduce violence. It cited a national study that found SSL ineffective. The Los Angeles Police Department made a similar move in 2019.
“One of the higher performing districts in Chicago where intervention-based policing was used, it might look at where they just had a shooting with two gangs,” Hollywood said. “It is dealing with it on a more human level (than the previous SSL) and using common sense.”
If police know there is a dispute between two gangs, they can work at that human level to try to quell violence, he said. He does not believe Pasco County is attempting to do that.
Police can make a box and determine what is going on in that geographic space, Hollywood said.
“Statistically I can say yes, a robbery is much more likely in this box over the next month. Practically, it depends. We know that in general if you have more visible patrols and are doing more policing activity you do tend to see crime reduction. There is a lot of published research that you get moderate to very good effects. The question is, what are the underlying things that are driving crime and what can I do to work with the community, whether it is services or improvements to make this space safer? It could be the physical environment or working with the convenience store that is being robbed so many times to figure out why it is happening.”
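The statistic Hollywood describes, “a robbery is much more likely in this box,” amounts to comparing one cell’s historical rate against the citywide baseline. A minimal sketch of that comparison, with invented monthly counts (the function name and the 100-box city are assumptions of the example, not figures from the article):

```python
def relative_risk(cell_counts, city_counts, n_cells):
    """Ratio of a box's mean monthly incident count to the per-box city mean.

    A value well above 1.0 is the kind of signal behind the statement
    that a robbery is "much more likely in this box over the next month."
    """
    cell_mean = sum(cell_counts) / len(cell_counts)
    city_mean_per_cell = sum(city_counts) / len(city_counts) / n_cells
    return cell_mean / city_mean_per_cell

# Invented data: 12 months of robbery counts in one box vs. a 100-box city
box = [3, 2, 4, 3, 5, 2, 3, 4, 3, 2, 4, 3]
city = [110, 95, 120, 100, 130, 90, 105, 115, 100, 95, 125, 105]
print(round(relative_risk(box, city, n_cells=100), 1))
```

As Hollywood notes, the harder question is not computing the ratio but deciding what to do with it: services, environmental changes, or working with a repeatedly robbed business, rather than surveillance of individuals.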
Predictive policing, he said, works better when police go beyond the algorithm, using it instead as a starting point to figure out how to make communities safer by solving underlying crime problems.