When someone types a simple Google search, a vast array of automated systems has the power to decide which links or topics occupy the coveted top results page.
However, automated scoring systems may contain hidden bias for a variety of reasons: system designers may bring their personal biases to the algorithms they build, or the data sets used for machine learning may already be skewed.
The proliferation of rating or ranking systems in everyday life often leads to complaints of online misrepresentation.
Cornell’s Due Process Clinic, an undergraduate “clinical” course designed to understand automated scoring systems such as credit scores and search engine rankings, started sending student researchers to collect qualitative data and build their own case studies on these systems.
Clinic director Prof. Malte Ziewitz, science and technology studies, founded the clinic because he wanted to apply legal frameworks to research non-legal situations.
The clinic tries to assess how someone without access to a public relations expert could deal with online backlash or poor reviews, hoping to understand the consequences for those who have been misrepresented or ranked improperly. For example, a poor credit score could create obstacles to buying a home, or potential employers could stumble upon a misrepresented online incident.
“In the legal system you usually have some minimum support or representative like a lawyer or legal aid, but in these computer systems there is no such thing,” Ziewitz said.
Ziewitz’s background in law and his work studying the relationship between algorithms and the interactions among technology, people and institutions led to his specific interest in the ways people may feel misrepresented by scoring systems. These systems tend to be opaque and can be incomprehensible to the general population.
The clinic has 11 members: Ziewitz, a clinical research fellow, an undergraduate research assistant and eight student researchers. Ziewitz sought out student researchers with different majors to form a multidisciplinary team, remarking that it is “not just enough to have an engineer that might explain the tech or a sociologist that might study inequality.”
Student researchers are split into four teams of two, and each team is assigned a real-world case. This semester, the clinic will tackle four cases related to the theme of misrepresentation in web search engine results by interviewing clients in the local Ithaca community.
By conducting clinical research, students hope to gain a better understanding of how search engines “[impact] people in their daily lives and what we can do about it,” said Ciarra Lee ’21.
The researchers are currently conducting desk research related to their respective cases and practicing mock interviews — they will be going out into the Ithaca community to interview individuals who have suffered as a result of misrepresentation starting next week.
Web search results are a major concern for misrepresentation because people rely on web searches daily. For example, one of the case studies involves researching the experiences of social activists, particularly those involved in political or environmental protests.
Both the social activists who protest fracking and the companies that invest in it want to present themselves favorably to their audiences, investing resources to maintain a positive online image.
Not everyone has the expertise or means to curate a positive online image, but the clinic is working towards describing cases of misrepresentation in the hopes of understanding “the problem of fair representation from the margins of the system, that is, through the eyes of those who have to live with it,” Ziewitz said.