
Diversity in Data - Graduate Specialists

Algorithmic Bias

Katherine Lee, Diversity in Data Graduate Specialist, explored various aspects of Algorithmic Bias.

The algorithmicbias site (algorithmicbias.github.io) provides a web-based version of Katherine Lee's research (as of Spring 2023).

The talks below cover the same topics from the Spring 2022 Diversity in Data series.

Who is Responsible for Bias in Algorithmic Decision Making?

In the past several years, we have seen acutely how bias in algorithms can affect people on a daily basis, from Zoom failing to recognize Black professors using virtual backgrounds to hiring algorithms that discriminate against women. As facial recognition, deep fakes, and "smart" technologies become ever more present in our lives, who bears the responsibility for bias in algorithms? Technology is as fallible as the developers who create it. But is it possible to prevent bias from entering algorithms entirely? And how can we best standardize practices to detect and mitigate bias so that users remain safe and protected?


How Tech Culture Allows for Algorithmic Bias and Unfairness

We need a cultural shift in technology. Let's discuss how the "move fast, break things" tech mentality and the practice of putting tech companies and leaders on pedestals further algorithmic bias.


Case Studies on How Algorithmic Bias Arrives in Our Tech

This workshop will focus on specific examples of algorithmic bias in popular technologies, how they were resolved in both the short term and the long term, and what we can learn from these errors.


Combating Algorithmic Bias 101: How Individuals, Institutions, and Governments Can Regulate and Mitigate Bias in Technology

We know algorithmic bias exists in our tech. So, what can we do?

Katherine Lee

Katherine Lee served as Diversity in Data Graduate Specialist during the 2021-2022 and 2022-2023 academic years. Her research into algorithmic bias (problems, examples, and solutions) resulted in the presentations described on this page.