March 04, 2021

Teaching Students to Live (Ethically) Debt-Free

A new and timely resource to teach ethical app design.
by Nicholas Zufelt

As software developers, we can take shortcuts, both ethical and technical, when we're coding. But those shortcuts borrow against the future. How do we help our computer science students learn to anticipate the ways people will misuse the software they create?

One key example of this misuse is Zoombombing, the problematic practice of gaining access to video conferencing calls in order to force derogatory, damaging, or otherwise disruptive content upon the call’s attendees. This and other forms of technical malpractice are discussed in a recent video essay created by Casey Fiesler, assistant professor in the Department of Information Science at the University of Colorado Boulder. In the video, titled "Zoombombing, Technology Ethics & Awful People Being Awful: An Introduction to Ethical Debt in Tech", Fiesler proposes that the burden of anticipating such practices falls on the developers of the technologies used.

Fiesler argues that developers need to identify and address problematic practices early on.

She offers a way for us to learn to do so by introducing a concept called ethical debt, which pairs nicely with the well-established notion of technical debt: the implied cost of rewriting poorly written code. Both kinds of debt build up when technology creators take shortcuts, ethical or technical, in the creation of their apps.

As Fiesler says in her video, “The problem with debt is that it has to be paid back with interest.” The issue of Zoombombing, for example, would have been largely mitigated if early discussions about its possibility had led Zoom's developers to make calls password-protected by default.

To avoid generating ethical debt, Fiesler suggests a practice we refer to in our class as creating problematic user personas: taking the time, while your app is still just an idea on paper, to envision the types of people who might want to misuse your app for harm and the ways they might do so.

In my app development class, we put Fiesler’s suggestion to direct use right at the start of the term. One of the first projects my students create is a business card generator, an app in which users submit some personal information in order to have a business card designed for them. At least, that’s how most users might use this app; after watching Fiesler’s video, students were also charged with brainstorming who else might have a use for their app. What about a user stalking a previous romantic partner, or someone looking to collect the personal data of others for blackmail? Students were tasked with, as Fiesler requests of us, “thinking about the risks, the misuse, during the design process.”
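For readers who want a concrete artifact to bring into a design review, the persona exercise can be sketched as a small data structure. This is only an illustration, not part of Fiesler's video or our course materials; the `ProblematicPersona` record and `design_review` helper are hypothetical names, and the example persona echoes the stalking scenario from the business card generator project.

```python
from dataclasses import dataclass, field

@dataclass
class ProblematicPersona:
    """A hypothetical record for one imagined misuser of an app."""
    who: str        # who might misuse the app
    goal: str       # what harm they are trying to cause
    misuse: str     # how the app enables that harm
    # design changes that raise the cost of misuse
    mitigations: list[str] = field(default_factory=list)

# Example persona for the business card generator project
stalker = ProblematicPersona(
    who="Someone stalking a former partner",
    goal="Locate the partner's workplace and contact details",
    misuse="Search generated cards for the partner's name",
    mitigations=[
        "Do not retain submitted personal data after the card is rendered",
        "Provide no public gallery or search over generated cards",
    ],
)

def design_review(personas):
    """Flag personas with no planned mitigations -- unpaid ethical debt."""
    return [p.who for p in personas if not p.mitigations]

print(design_review([stalker]))  # -> [] (this persona has mitigations)
```

The point of the structure is the empty-`mitigations` check: a persona you have imagined but not designed against is exactly the kind of shortcut that accrues interest later.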

When my students get out into the workforce, potentially as future creators of technology, I want them not only to want to build more responsibly, but to know how to do so. And students want this too. One student reflected recently, “I think [these practices] add significantly to the class curriculum. I find that once charged with the ethical side of the equation, my design becomes much more deliberate as to how it will be affecting a user.” I couldn’t agree more.


Nick Zufelt is an instructor in mathematics, statistics, and computer science at Phillips Academy and a Tang Institute fellow. Zufelt is a contributor to the Workshop and the ethi{CS} project.
