Bias in Citizen Complaint Data

Adopted by more than 200 cities in the U.S., 311 service request systems represent one of the most significant links between citizens and city government, accounting for more than 8,000,000 requests annually in New York City alone. Increasingly, these data are being used to develop predictive models of citizen concerns and problem conditions within the city. However, predictive models trained on these data can suffer from biases embedded in the data themselves: the propensity to complain (or make a request) can vary with the socio-economic and demographic characteristics of an individual or neighborhood, cultural differences can affect citizens’ willingness to interact with their government, and local conditions diverge widely. The goal of this project is to analyze the factors that influence citizen reporting and to develop models that account for individual and neighborhood differences in reporting behavior. Objectives include the identification and measurement of algorithmic bias in city service delivery and the development of fairer, more transparent methods for improved resource allocation.
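To make the reporting-bias problem concrete, the sketch below uses entirely synthetic data (all rates and neighborhood labels are hypothetical, not drawn from the project's data) to show how differing propensities to complain distort naive estimates of problem conditions, and how one standard correction, inverse propensity weighting, can recover the underlying rates when reporting propensities can be estimated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: two neighborhoods with the SAME true rate of problem
# conditions (30%) but very different propensities to file a 311 request.
n = 10_000
neighborhood = rng.integers(0, 2, size=n)            # 0 or 1
true_problem = rng.random(n) < 0.30                  # same underlying condition rate
report_propensity = np.where(neighborhood == 0, 0.8, 0.3)
reported = true_problem & (rng.random(n) < report_propensity)

# Naive estimate from complaint counts alone: the low-reporting neighborhood
# looks artificially "healthier" than it really is.
naive = [reported[neighborhood == k].mean() for k in (0, 1)]

# Inverse-propensity-weighted estimate: each report counts as 1 / P(report).
# Here the propensities are known by construction; in practice they would
# have to be modeled from socio-economic and demographic covariates.
ipw = [
    (reported[neighborhood == k] / report_propensity[neighborhood == k]).mean()
    for k in (0, 1)
]

print("naive per-neighborhood rates:", naive)
print("IPW-corrected rates:        ", ipw)
```

A model trained on the naive rates would allocate resources away from the under-reporting neighborhood; the weighted estimates bring both back near the common true rate, which is the kind of correction the project's reporting-behavior models aim to support.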