On Thursday, March 11, Cornell’s Digital Due Process Clinic held a screening and panel discussion of the documentary Coded Bias, moderated by Prof. Malte Ziewitz, science and technology studies. The discussion focused on facial recognition technology’s inherent racial biases and the questions of privacy that come up when it is integrated into public life.
Shalini Kantayya, director of Coded Bias, had little computer science experience before researching the film, and said she was surprised that “various systems we are trusting so implicitly . . . are not vetted for racial and gender bias.” The movie opens with an explanation from Joy Buolamwini, a Black Ph.D. student at MIT, who realized during one of her experiments that facial recognition software could not detect her face because of her race.
Simone Browne, associate professor at the University of Texas at Austin and another panelist, elaborated on the racism inherent in facial recognition, explaining that whiteness has served as the template for the technology since the field’s earliest research. Given that history, it is no surprise that these systems have misidentified innocent people of color as criminal perpetrators.
Kantayya found it “really important to connect to the communities that are most vulnerable to [technology’s] impacts.” The movie focuses on an apartment complex in Brownsville, Brooklyn, where the landlord wanted to install facial recognition technology on the building’s front door. The complex housed mainly women of color, and tenants felt the new technology was meant to appeal to future residents drawn in by the area’s increasing gentrification.
The lower-income tenants could neither resist the new technology their landlord brought in nor afford to move to another apartment. Having relatively new and experimental technology forced on them illustrates Kantayya’s point: the most vulnerable populations are the most affected by new technologies. Reflecting on this, Prof. Ziewitz applauded the movie’s humanitarian emphasis: “Many of us are trained in these fields as scientists and engineers, so we are really concerned about getting the algorithm right, making sense of the data, and getting it statistically right… It is important to be a critical mind and keep in the back of your mind that everything you’re going to do will at some point affect a real human being.”
Coded Bias seeks to demystify the inner workings of technology and algorithms, allowing people to think critically about the role technology plays in their lives. If we are technologically literate and aware of these ethical dilemmas, we will be able to fight these injustices. As Kantayya explained, “The way we level the playing field is through technological literacy.”
Other panelists contributed fascinating insights as well. Prof. Shobita Parthasarathy, public policy at the University of Michigan, emphasized the government’s dual role as both an early adopter and a regulator of these technologies, with the power to shape their usage and implementation.
Coded Bias makes this unsettling dynamic clear by examining the Metropolitan Police’s use of facial recognition on the streets of London. The film focuses on the United Kingdom because its transparency laws allow human rights groups to examine the police’s use of data, revealing how dangerously inaccurate the technology can be in misidentifying innocent people as criminal suspects.
In contrast, the United States has no such laws, and many police departments use the technology without oversight. As this investigation unfolds, Kantayya comes to the unsettling conclusion that “democracies are picking up the tools of authoritarian states with no guardrails.”
Meanwhile, Luke Stark, assistant professor at the University of Western Ontario and a former Microsoft researcher, spoke about the challenging role of ethics in these technological fields: although many coders want to make a positive impact, there is often no framework or leadership push to achieve it. He ultimately called for integrating ethics into the computer science curriculum and encouraging unionization to ensure ethical coding, stating that “unionization is critical for justice in a society that involves tech.”
Together, the panelists and the movie illustrate the importance of actively pursuing algorithmic justice. As members of this society, we all have a responsibility to be informed citizens and to advocate on behalf of those most likely to suffer the negative consequences of our rapidly expanding technological capacity. In an interview with The Sun following the event, Prof. Ziewitz highlighted the numerous resources Cornell possesses to help people get involved with digital advocacy: “We have the Digital Due Process Clinic who hosted this event, but that’s just one small aspect. There’s a ton of classes and research opportunities [across numerous departments] you can take that touch upon this area… So reaching out to the people who are doing this research can be very promising because a lot of these folks are looking for people to work with [from all academic backgrounds].”
Christina Ochoa is a sophomore in the College of Agriculture and Life Sciences. She can be reached at email@example.com.