Is the NYPD Juking Its Stats?

Brandon Fuller

Does the NYPD have an entrenched culture of manipulating its stats — downgrading crimes to lesser offenses or intentionally ignoring victims’ complaints of alleged crimes? Based on evidence from a recent email survey of retired police officers, the results of which were previewed in The New York Times, Professors Eli Silverman and John Eterno believe the answer is “yes.”

Yet, as the Times article points out, the notion that the NYPD systematically manipulates crime stats seems at odds with Franklin Zimring’s research, summarized in his recent book about NYC’s two-decade-long decline in crime.

Zimring, a criminologist at Berkeley Law School, compared the department’s crime data for homicide, robbery, auto theft, and burglary to insurance claims, health statistics, and victim surveys, and found a near-exact correlation.

Silverman and Eterno’s survey went out to 4,069 police officers who had retired since 1941. Roughly half of the retired officers responded. Of the respondents, 871 had retired since 2002. More than half of these recent retirees said they knew of manipulation, and more than 80 percent of recent retirees said they knew of three or more instances of manipulation.

“I think our survey clearly debunks the Police Department’s rotten-apple theory,” said Eli B. Silverman, one of the criminologists, referring to arguments that very few officers manipulated crime statistics. “This really demonstrates a rotten barrel.”

This assertion, along with “smoking gun” language from the summary, is somewhat surprising given the limitations of this sort of survey. It will be interesting to see the full wording of the survey along with the authors’ analysis of how the self-selected respondents in their sample differ from the population of retired NYC police officers.

Responding to a previous survey conducted by Silverman and Eterno in 2008, former police commissioner William Bratton raised a number of criticisms:

[C]ategories of crime that are nearly impossible to downgrade, notably homicide and auto theft, have declined much more than the categories that might be more readily manipulated. Auto thefts, which must be reported accurately because victims need crime reports to make insurance claims, are down 90 percent since 1993, the year before CompStat was inaugurated. In contrast, grand larceny, the category that can be most readily downgraded (by reducing the value of the property stolen), has declined only about 55 percent. Homicides, which generally report themselves when the body is discovered, are down about 76 percent, from 1,951 in 1993 to 471 in 2009.
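Bratton’s homicide arithmetic is easy to verify. A quick sanity check (in Python, not part of the original post) using the figures he cites:

```python
# Homicide figures quoted by Bratton: 1,951 in 1993, 471 in 2009.
homicides_1993 = 1951
homicides_2009 = 471

# Percentage decline over the period.
decline = (homicides_1993 - homicides_2009) / homicides_1993
print(f"Homicide decline, 1993-2009: {decline:.1%}")  # → 75.9%, i.e. "about 76 percent"
```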

This is not to suggest that numbers games never occur, but the idea that NYC’s crime drop is the illusory product of systemic underreporting and manipulation seems inconsistent with the broader evidence. Any organization committed to data-driven performance assessment may see some efforts to “juke the stats.” The more important question is whether the behavior is widespread. That seems like a question that a survey is incapable of answering on its own, particularly when other sources of evidence contradict the results.

That said, the problem is certainly worth keeping an eye on. Numbers-driven management, which arguably made the NYPD more effective and accountable, can also be put to use in ensuring that police reports of crime reflect reality. As cities continue to adopt better data-driven management techniques they’ll need to think carefully about how to discourage manipulation of the numbers — about the deep question of “Who guards the guardians?” In many cases, including the reporting of crime statistics by the police, independent auditors can help to both validate hard-won successes and identify potential problem areas.
