
Studying New York City’s Crime Decline: Methodological Issues

 

Abstract

Methodological issues that must be considered in doing research on the New York City crime drop include the choice of a spatial unit of analysis, the choice of a mathematical representation of the processes responsible for the drop, and the choice of estimators. This paper considers the strengths and weaknesses of three approaches to studying the drop: a time series analysis of data for New York alone, a panel analysis of the city’s precincts, and a panel analysis of a sample of cities. The possibilities and limitations of precinct-level data are illustrated with annual precinct data for New York between 1988 and 2001. The paper considers static and dynamic fixed effects panel models estimated in various ways, including difference and system generalized method of moments. These analyses find no evidence that misdemeanor arrests reduced levels of homicide, robbery, or aggravated assault. Felony arrests reduced robberies, but only to a modest degree. Most of the decline in these three felonies had other causes.
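As a rough illustration of the kind of static fixed effects specification referred to in the abstract, the sketch below fits a two-way (precinct and year) fixed effects model to a synthetic precinct-year panel. The variable names, the synthetic data, and the use of the linearmodels package are assumptions made purely for illustration; they are not the paper’s data or code.

import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS  # pip install linearmodels

# Hypothetical precinct-year panel; names and values are invented
rng = np.random.default_rng(0)
idx = pd.MultiIndex.from_product(
    [range(1, 76), range(1988, 2002)], names=["precinct", "year"]
)
df = pd.DataFrame(
    {
        "misd_arrest_rate": rng.gamma(2.0, 5.0, len(idx)),
        "felony_arrest_rate": rng.gamma(2.0, 3.0, len(idx)),
    },
    index=idx,
)
df["log_homicide_rate"] = (
    0.3 - 0.02 * df["felony_arrest_rate"] + rng.normal(0.0, 0.5, len(df))
)

# Static two-way fixed effects model: precinct (entity) and year (time) effects
mod = PanelOLS.from_formula(
    "log_homicide_rate ~ misd_arrest_rate + felony_arrest_rate"
    " + EntityEffects + TimeEffects",
    data=df,
)
print(mod.fit(cov_type="clustered", cluster_entity=True))

A dynamic specification would add a lag of the dependent variable to the right-hand side, which is what motivates the difference and system GMM estimators mentioned above.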

Acknowledgments

I am grateful to Richard Rosenfeld for making the NYC precinct data set available, and to Steven Messner, Ann Piehl, Richard Rosenfeld, David MacDowall, and the anonymous reviewers for their helpful comments and suggestions.

Notes

1. Between 1992 and 2010, the rate of violent crimes reported to the police dropped by 46.7% nationally. For property crime rates the drop was 40.0%. The comparable figures for the City of New York are 72.6 and 73.0%.

2. Between 1996 and 2011, the number of marijuana arrests rose from 9,433 a year to 50,684 (http://www.marijuana-arrests.com/graph8.html, last accessed June 12, 2012). Liquor law violations rose by 127,262 between 1990 and 2000 (Chilton and Chambliss, 2012). Between the start of 2002 (when the current mayor, Michael Bloomberg, took office) and the present, street stops increased by 600% (Taylor, 2012).

3. Critics have also faulted the NYPD’s tactics for disproportionately targeting racial and ethnic minorities, and violating constitutional standards for conducting involuntary searches (Fagan & Davies, 2000; Gardiner, 2012; Gelman, Fagan, & Kiss, 2007; Gonzalez, 2012; Stoudt, Fine, & Fox, 2011/12). The large and increasing volume of these searches, and the very small percentage of cases in which contraband such as illegal drugs or weapons is found, render it implausible that all of them take place under conditions that would justify reasonable suspicion of a crime, or fear that an officer’s safety is at risk. According to the New York Civil Liberties Union (2012), in the first three months of 2012 the NYPD made 203,500 street stops; 89% of those stopped were completely innocent. Eighty-seven percent of these stops were of blacks and Hispanics. In 2011, guns were recovered in approximately 1 in 1,000 stops (Taylor, 2012). Presumably in response to sharp criticism, police stops dropped by 34% in the second quarter of 2012 (Goldstein & Ruderman, 2012).

4. In fact, our estimates would remain unbiased in the presence of under-reporting and under-recording if the proportion of crimes undercounted remained the same for each precinct over time.
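To see why, consider a sketch under an assumed log-linear fixed effects specification (an illustrative assumption; the functional form used in the analysis may differ). If recorded crime in precinct i is a constant fraction c_i of true crime, then

y_{it}^{\mathrm{obs}} = c_i\, y_{it}, \qquad \log y_{it} = \alpha_i + \beta x_{it} + \varepsilon_{it}
\;\Longrightarrow\;
\log y_{it}^{\mathrm{obs}} = (\alpha_i + \log c_i) + \beta x_{it} + \varepsilon_{it},

so the undercount factor \log c_i is absorbed into the precinct fixed effect and the slope \beta is untouched. If c_i varied over time, this absorption would fail.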

5. Wendel et al. (2011) assert that a massive shift in drug use patterns did occur for this reason. However, our data raise doubts about this argument. The rate of deaths from cocaine overdose was .35 per 10,000 in 1990, and .44 in 2001. This does not point to a decline in cocaine use.

6. ARIMA modeling (a common strategy for analyzing time series), which we discuss below, is generally said to require a minimum of 100 observations to achieve trustworthy identification.

7. One weakness of unit root tests is that, in time series containing a structural break or an outlier, they tend to confirm the existence of a unit root even when there is none. Previous studies that claim to find unit roots (Greenberg, 2001; Hale, 1998) may have done so mistakenly for this reason. Indeed, Cook and Cook (2011) present evidence that this is in fact the case for US crime rates.
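This behavior is easy to reproduce in a small simulation (a sketch only; the series length, break size, and test settings are arbitrary choices, not the crime data analyzed here):

import numpy as np
from statsmodels.tsa.stattools import adfuller  # pip install statsmodels

rng = np.random.default_rng(1)
n = 60  # a fairly short series, as is typical of annual crime data
t = np.arange(n)

# Trend-stationary series with a one-time downward level shift at mid-sample;
# by construction there is no unit root.
y = 10.0 + 0.05 * t + np.where(t >= n // 2, -8.0, 0.0) + rng.normal(0.0, 1.0, n)

# The augmented Dickey-Fuller test will often fail to reject the unit-root
# null here, mimicking the problem described in this note.
stat, pvalue, *_ = adfuller(y, regression="ct")
print(f"ADF statistic = {stat:.2f}, p-value = {pvalue:.3f}")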

8. A word may be in order about the inclusion of misdemeanor arrests in a model for homicide, robbery and assault rates. According to a strict rational choice model based on perfect information, someone’s decision whether to commit one of these felonies should be uninfluenced by the level or likelihood of a misdemeanor arrest, because someone who commits one of these felonies would not be at risk of an arrest on misdemeanor charges. However, high levels of misdemeanor arrests in a precinct could remind a prospective felon of the possibility of being caught and prosecuted on felony charges. Misdemeanor arrests could also reduce felonies through their impact on the imprisonment rate. Some misdemeanants are jailed, and consequently unable to commit felonies against the general public while serving their jail sentences.

9. This implies that there are no substantial displacement or diffusion-of-benefits effects of policing across the boundaries of precincts. The evidence regarding the existence of such effects is limited, and largely confined to locations in very close proximity to a location that has received particularly intense police activity (Braga, 2007).

10. It is important to remember that while de-meaning protects the analysis from potential bias due to time-invariant omitted predictors, it does not protect against bias due to omitted time-varying predictors.
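In generic notation (a sketch, not the paper’s specific equation), suppose the true model contains an omitted time-invariant term u_i and an omitted time-varying term w_{it}:

y_{it} = \alpha_i + \beta x_{it} + \gamma u_i + \delta w_{it} + \varepsilon_{it}
\;\Longrightarrow\;
y_{it} - \bar{y}_i = \beta (x_{it} - \bar{x}_i) + \delta (w_{it} - \bar{w}_i) + (\varepsilon_{it} - \bar{\varepsilon}_i).

De-meaning eliminates \alpha_i and \gamma u_i, but w_{it} - \bar{w}_i survives and will bias the estimate of \beta whenever it is correlated with x_{it} - \bar{x}_i.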

11. Some researchers extend this strategy by introducing unit-specific linear and quadratic terms in time (Raphael & Winter-Ebmer, 2001), but they, too, only model change without explaining it.

12. Though there are some differences between the two models as to which predictors are statistically significant, the magnitudes of the differences tend to be small. For example, the coefficient for percent black is .014 (significant) in the first model, and .012 (not significant) in the second model.

13. An alternative approach to the incorporation of a lagged endogenous variable can be found in the papers by Harcourt and Ludwig (2006) and Rosenfeld, Fornango, and Rengifo (2007). It entails controlling for the value of the dependent variable at t = 1, i.e. at the first wave of observations. To see why this is inadvisable, consider Equation 3, and for simplicity, assume that x is time-invariant. The equation says that when t = 2, a one-unit increase in x changes y by the amount b2. When t > 2, the equation says that no further change in the outcome takes place. This is because y3 − y1 = (y3 − y2) + (y2 − y1). If the left-hand side and the expression in the second set of parentheses on the right-hand side both equal b2, then the expression in the first set of parentheses must be zero. This is not a plausible way to model change, given that x, by assumption, continues to characterize the precinct for the entire time span covered by the data set. The modeling strategy adopted here avoids this difficulty.

14. Judson and Owen’s (1999) simulations show that the bias can be substantial even in the presence of 20 waves, but is greater for the lagged endogenous coefficient than for the coefficients representing exogenous variables.
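For orientation, a standard large-N approximation for the within (fixed effects) estimator’s bias in a pure first-order autoregressive panel with T waves, quoted here only for illustration and not taken from Judson and Owen, is

\operatorname{plim}_{N \to \infty}\,(\hat{\rho} - \rho) \approx -\frac{1 + \rho}{T - 1},

so with T = 20 and \rho = 0.8 the coefficient of the lagged endogenous variable would be understated by roughly 0.1.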

15. Estimation of Granger causality models is another approach, but it has the drawback of assuming that influences among the variables are lagged by one or more time units. Marvell and Moody (1996) adopt this approach; it assumes that contemporaneous influences across variables are absent.
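As a schematic illustration of what such a test does and assumes (the variable names and data below are invented, not taken from Marvell and Moody or from this study):

import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests  # pip install statsmodels

rng = np.random.default_rng(2)
n = 40  # e.g. 40 annual observations

# Hypothetical series: crime responds to arrests only with a one-period lag
arrests = rng.normal(size=n)
crime = 0.5 * np.roll(arrests, 1) + rng.normal(size=n)
data = pd.DataFrame({"crime": crime, "arrests": arrests}).iloc[1:]

# Tests whether lags of the second column help predict the first column.
# The built-in assumption criticized above: only lagged influence is allowed,
# so any contemporaneous effect of arrests on crime is ruled out by design.
results = grangercausalitytests(data[["crime", "arrests"]], maxlag=2)
print(results[1][0]["ssr_ftest"])  # (F statistic, p-value, df_denom, df_num) at lag 1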

16. One might hope to reduce the contribution from the linear and quadratic trends by adding more predictors to the model, but when one considers the number of variables already in the model, this approach does not appear promising.

17. In 2002 the police documented 97,296 friskings (some friskings, however, may have taken place without being written up); in 2011, 685,724, a roughly sevenfold increase. There were 1,821 shootings in 2002, and 1,892 in 2011, an increase of just under 4% (Weiss, 2012). It is difficult to reconcile these figures with claims that high levels of frisking or searching pedestrians, or of making misdemeanor arrests on such charges as possession of marijuana, have contributed to the crime drop by discouraging people from carrying guns. If the rate of shootings has been flat but homicide rates have been dropping, one might wonder whether improved medical care for shooting victims is contributing importantly to the crime drop.

18. There are grounds for reservations about some details of Kubrin et al.’s analysis. Their model does not include a trend variable or felony arrests other than those for robbery, ignoring the possibility that felony arrests for other offenses could influence prospective robbers. Nor does it include imprisonment. It does not estimate the contributions of predictors to a trend (and does not claim to do so).

Additional information

Notes on contributors

David F. Greenberg

David F. Greenberg is Professor of Sociology at New York University.
