Project Lead: Mike Katell, University of Washington Information School
Community Engagement Lead: Meg Young, University of Washington Information School
Faculty Advisor: Peaks Krafft, Oxford Information Institute
Data Science Lead: Bernease Herman
DSSG Fellows: Corinne Bintz, Aaron Tam, Vivian Guetler, Daniella Raz
Community organizations across the U.S. are concerned that surveillance technologies deployed in their communities profile and target minorities and other disadvantaged groups. These organizations are calling for algorithmic equity (accountability, transparency, and fairness) through legislation such as city surveillance ordinances that govern the acquisition and use of surveillance technology. Seattle, Berkeley, Nashville, Cambridge, and other cities have implemented ordinances that differ in their scope, process, and power to regulate government technologies. However, most U.S. technology policy legislation fails to address the growing use of Automated Decision Systems such as facial recognition and predictive policing algorithms.
The Algorithmic Equity Toolkit gives community civil rights advocates a way to voice their concerns about these technologies during the decision-making process for their acquisition.
This toolkit aims to help community civil rights activists, grassroots organizers, and grasstops leaders identify a government surveillance or Automated Decision System (ADS) tool, understand how it works, and recognize the potential harms of its use. An Automated Decision System is a computerized implementation of algorithms that assists in decision-making. ADSs are increasingly used in our society to analyze data and make decisions more quickly and efficiently; however, their complexity and the lack of public awareness about how they work reduce transparency and accountability. We hope that with this toolkit civil rights activists can distinguish surveillance tools from ADS tools and be empowered to challenge the implementation and expansion of both kinds of technology by asking the right questions.
When do I use this toolkit? The toolkit is designed for community members, civil rights organizations, and anyone interested in algorithmic equity who wants to voice concerns about the use of harmful ADS technologies in their community.
You can use this toolkit when engaging with policymakers or government representatives, or when you want to learn about ADS technologies and their potential harms.
What is included in this toolkit? This toolkit comprises three components: