robotics and ethics
April 29, 2021

The ethi{CS} Project in Robotics: Guiding Principles

Artacho explains the importance of embedding ethical inquiry into the robotics curriculum.
by Carol Artacho

This post is the second of a three-part series. You can read Carol’s first post here.

Ethics and robots? Why, yes! As we prepare our students to become the future designers and consumers of technology, it is imperative that we embed ethical analysis into our curriculum. In my own journey to design lessons that reach this goal, it quickly became apparent that I would need to scaffold these lessons along two dimensions: the complexity of the issue at hand, as a way to manage cognitive load, and the degree to which students have a personal, emotional connection to the topic, as a way to manage emotional load.

Scaffolding Cognitive Load

Cognitive load is a term coined by educational psychologist John Sweller in the 1980s to describe how much of the brain's working memory is taken up as it acquires and processes information. In layman's terms, it refers to the number of new pieces of information an individual can learn in a given amount of time. This is something that teachers keep in mind when designing engaging learning experiences.

For example, when introducing an ethical matrix as a tool, it's important to balance the load of learning this brand-new tool and way of thinking against the complexity of the ethical question we're actually examining. The topics to which we apply our matrices have to grow in complexity over time, starting with a topic that has little controversy and moving toward more heated debates. In other words, at the beginning of the process, students' cognitive load is mostly taken up by learning how to create a matrix. As matrix creation becomes more natural and consumes less "brain space", some of those thinking cycles can be applied to more complex ethical situations.

How does this play out in real time in the classroom? My students and I started by reading several articles (like this one) on what artificial intelligence is, along with examples of AI. From those examples, we created our first ethical matrix around an issue that is pretty innocuous to a teen: the use of artificial intelligence in eldercare. At first, my students offered a fairly superficial understanding of the issue:

Artacho's students begin to experiment with ethical matrices.

However, after more time and the opportunity to discuss as a class, students were able to identify additional values and stakeholders, while also realizing that we needed more information in order to fully grasp the issue at hand. Our class discussions forced them to look at the process both more broadly and in a more granular way. This is a sample of our second pass:

With practice, students design a more thorough ethical matrix on AI in eldercare.
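For readers who want a more concrete picture of the tool itself: an ethical matrix is essentially a grid with stakeholders along one axis and values along the other, where each cell records that stakeholder's concerns around that value. The sketch below is a hypothetical illustration, not the content of our class's actual matrices; the stakeholders, values, and entries are invented for the eldercare example.

```python
# A minimal sketch of an ethical matrix as a data structure.
# All stakeholders, values, and cell entries are illustrative
# guesses for the AI-in-eldercare topic, not the class's real matrix.

stakeholders = ["Elderly patients", "Family members", "Care workers", "AI developers"]
values = ["Safety", "Privacy", "Autonomy", "Employment"]

# Each cell holds a list of concerns a stakeholder has for a given value.
matrix = {s: {v: [] for v in values} for s in stakeholders}

matrix["Elderly patients"]["Privacy"].append("In-home sensors record daily life")
matrix["Care workers"]["Employment"].append("Automation may reduce staffing")

def print_matrix(matrix):
    """Render the matrix as a simple text grid, one stakeholder per block."""
    for stakeholder, row in matrix.items():
        print(stakeholder)
        for value, concerns in row.items():
            cell = "; ".join(concerns) if concerns else "-"
            print(f"  {value}: {cell}")

print_matrix(matrix)
```

Filling in the grid cell by cell is what forces the "second pass" described above: an empty cell is a visible prompt to ask what a given stakeholder cares about, or what information is still missing.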

As my students became more confident in their ability to apply an ethical matrix to a given situation, we began to investigate topics that were more complicated. For example, we discussed "A Thousand Days of Fear", a documentary on The Manhattan Project (content warning: graphic imagery of war and Nazi paraphernalia), bias in policing, and racism in financial services algorithms, to name a few. By building on the learning and skills they had developed while analyzing "easier" issues, my students were able to engage meaningfully with issues that had more and more layers of complexity.

Scaffolding the emotional load

Zaretta Hammond’s work, Culturally Responsive Teaching and the Brain: Promoting Authentic Engagement and Rigor Among Culturally and Linguistically Diverse Students (2014), connects learning to the brain’s natural responses to the world around us.

Just like our computers, all brains come with a default setting that acts as its prime directive regardless of race, class, language, or culture: Avoid threats to safety at all costs and seek well-being at every opportunity.

Zaretta Hammond, Ph.D.

Dr. Hammond’s framework stresses the importance of emotions and the physical responses that accompany them, especially in situations that the brain can perceive as threatening, like moments of stress or discomfort in the classroom. It highlights the relationship between validating and managing these feelings on the one hand, and learning outcomes on the other.

As many of us have experienced, ethical discussions elicit strong emotional responses, and scaffolding these reactions is an important element of any good course design. I want to make sure that students in my class are not so emotionally invested in what we’re discussing that their brains trigger a fight-or-flight response, but I also want to validate and acknowledge those emotions and the important role they play in ethical decision-making.

As such, our first attempts at ethical matrices addressed situations that presented little risk of strong emotional responses in my students. My students are likely not at a stage of their lives where they need to think about caring for aging parents. World War II is part of a history lesson, both far from their reality in terms of timing and also completely out of their control. What my students do have a great deal of interest in and knowledge about is how their lives are going as students. Inspired by an employer in Wisconsin that gave employees the option of surgically implanting a microchip into their bodies to access workplace resources, students were presented with a (fictional) initiative by our school’s Dean of Students Office to implant a geotracking chip in students during their time at our school. This exercise was the most emotionally involved and complex ethical issue they faced, one they would not have been able to engage with and analyze critically without the practice on “easier” topics and skills.

The task of introducing ethical discussions in technology classrooms can feel daunting. Whether the obstacle is programmatic limitations, a lack of time or confidence, or something else, please don’t let it dissuade you from trying. And we hope this series will help! The third (and final) part in this series will feature resources and lesson plans to get you started, whether you want to try a single lesson or a complete overhaul of your course. It’s an interesting adventure, and it pays off in spades!

Carol Artacho is an instructor in robotics at Phillips Academy and a Tang Institute fellow with the ethi{CS} project.
