Bias and Fairness in Data-Driven Decision-Making

Adopted by more than 200 cities in the United States, 311 service-request systems are one of the most significant links between residents and city government, accounting for more than eight million requests annually in New York City alone. Increasingly, these data are being used to develop predictive models of resident concerns and problem conditions across a city. However, predictive models trained on these data can suffer from biases embedded in the propensity to complain (or make a request) that can vary based on socio-economic and demographic characteristics of an individual or neighborhood, cultural differences that can affect residents’ willingness to interact with their government, and widely divergent local conditions. The goal of this project is to analyze the factors that influence resident reporting and to develop models to account for individual and neighborhood differences in reporting behaviors. Objectives include the identification and measurement of algorithmic bias in city management and the development of more fair, transparent methods for improved public-sector resource allocation.


Featured Blog Posts

Blog / Aug 26, 2021

Constantine Kontokosta mentioned in Article

Blog / Oct 09, 2017

Equity in 311 Reporting

Understanding Sociospatial Differentials in the Propensity to Complain

by Constantine Kontokosta
