
September 30, 2018

MORADI | CS Needs an Ethics Requirement


Here’s a frightening proposition for this recruiting season: Cornell’s computer science undergraduates are woefully underprepared for careers in tech.

Cornell’s CS undergrads are bright and technically apt. They learn from some of the best minds in the field, and they score some of the most coveted positions in the industry. It’s not that these future developers can’t solve the problems put before them. It’s that they often have no idea what the problems are in the first place.

For a long time, the prevailing Silicon Valley philosophy was best summarized in the famous Facebook creed: “Move fast and break things.” In other words, innovate at all costs. Mending what you broke comes way later: Silicon Valley’s approach has historically been to expeditiously develop, then try to fix up major issues like data privacy, foreign election interference, algorithmic discrimination or Nazis after disaster has already struck.

“We allow the market to rule and only in hindsight think, ‘Who did I harm? What did I miss?’” says Ji Su Yoo, a researcher at Harvard University’s Data Privacy Lab and my research supervisor from this past summer. “The dominant thinking in technical circles is that you push the law not to the moral end, but to the degree that it allows for profit or disruption.”

The prevalence of this precarious, retrospective approach to tech development inspired Yoo and Prof. Latanya Sweeney, Harvard’s Professor of Government and Technology in Residence and director of the Data Privacy Lab, to develop the Tech Science program of study for undergraduates. The program, housed under Harvard’s renowned Department of Government, requires courses that specifically address the social and ethical implications of digital technology.

The Tech Science track is part of a larger movement among other high-caliber universities to set up ethics requirements and establish courses that investigate the social implications of technology. In many ways, Cornell is at the forefront of this effort: Many Cornell professors in Computing & Information Science are known both in the academy and in popular media as some of the foremost scholars of the social implications of digital tech. The Information Science majors in Arts & Sciences and CALS actually require students to take INFO 1200: Information Ethics, Law, and Policy, and the department offers a plethora of 3000+ level courses that can be used to fulfill major requirements.

Put simply, info sci students are entering the tech industry with a relatively robust understanding of how their algorithms and designs affect the world. Their peers in computer science? Not as much.

Actually, almost not at all: The computer science major — which spans both the College of Engineering and the College of Arts & Sciences — currently has no ethics requirement. The presumption is that the void is covered by distribution requirements in Arts & Sciences and “liberal studies” distributions in Engineering. Any remaining cracks are filled by the major’s own “external specialization” requirement. But as I’ve written about before, these kinds of requirements hardly support intellectual exploration, let alone target key interdisciplinary gaps in a curriculum.

The outcome is that Cornell is graduating hundreds of technologists every year with hardly any understanding of the sometimes frightening social consequences of their day-to-day work. In a world without ethical developers, we all suffer, especially now that algorithms are driving key decision-making systems in our day-to-day lives.

“There’s been a real shift in the way computing is used,” notes Professor Bart Selman, computer science, who led a seminar on ethics and artificial intelligence with Prof. Joe Halpern, computer science, back in Spring 2017. “Software was a tool that helped companies run more efficiently, like with databases or banking systems. Those systems were ethics-neutral, but that has been shifting in the past five years.” Now, these systems can decide on whether or not you get approved for a loan, what kind of news you see, or whether a cop will patrol your block.

“A requirement could be a step in a positive direction,” says Selman. “Whether separate courses or integrated, I really think it has to be a combination.”

Recent efforts like the AI, Policy and Practice initiative demonstrate that Cornell has the capability and the virtue to take action. And, thankfully, some CS professors have independently been integrating lectures on ethics into their courses: Profs. Nate Foster and Michael Clarkson, computer science, for instance, have added lectures on ethics in CS 3110: Functional Programming and CS 1380: Data Science for All, respectively.

But while all CS majors take CS 3110, not all will be taking Professor Clarkson’s class, and certainly not all of them will be venturing out into the technology law and ethics courses in the information science department. The solution to this vacuum is simple: Cornell CS must incorporate some kind of ethics and social responsibility requirement, whether it be through a separate but specialized course like INFO 1200 or integrated into the relevant existing courses.

As a leading educator of the next generation of developers, Cornell has a responsibility to ensure it is producing young software engineers who are prepared to address the prickly issues that come with creating digital technologies. If we don’t, we’ll find ourselves continuing down the current path of unstable technocracy, where our lives are ruled by software with little care for how it aligns with our humanity.

Pegah Moradi is a senior in the College of Arts and Sciences. All Jokes Aside runs every other Monday this semester. She can be reached at [email protected].

Correction: A previous version of this column incorrectly referred to Prof. Joe Halpern as Jim. It has since been corrected.