Perhaps the most pervasive myth around computer science is that it is a neutral discipline used to express objective truths and linearly enhance our lives. Yet history shows us that any technology can be weaponized, be it medicine or social media.
In fact, human bias seeps into machine learning systems deployed in mortgage lending, courtroom sentencing, and college admissions. Amazon abandoned an AI recruiting tool after realizing it discriminated against women. Facial recognition technologies, notorious for misidentifying Black people as criminal suspects, have been banned for police use in cities like Boston and San Francisco. The Cambridge Analytica scandal surrounding the 2016 U.S. election exposed the vulnerabilities of American democracy in the age of big data, micro-targeted advertising, and social media.
In response to these challenges, universities are increasingly incorporating ethical and policy questions into undergraduate computer-science coursework. They realize the importance of embedding computer science within the social, political, and historical frameworks of the humanities, so that programmers can anticipate the consequences of their work in the real world.
In their compelling piece “Excavating AI,” Kate Crawford and Trevor Paglen argue:
“Struggles for justice have always been, in part, struggles over the meaning of images and representations… Representations aren’t simply confined to the spheres of language and culture, but have real implications in terms of rights, liberties, and forms of self-determination.”
As algorithms increasingly govern our lives, we should care about how we are represented, rendered mathematically, and labeled by others.
What sets computer science apart is that machine learning systems are trained on massive datasets and involve opaque mathematical computations that defy understanding and thus resist argument. We don’t understand them, so we accept them. Yet these same models rely on data from the past, which is steeped in histories of oppression. So while an algorithm may itself be neutral, it often perpetuates and magnifies inequities, especially if the humans programming it aren’t careful.
To fix tech’s ethics problem, we need to begin these critical conversations in K-12.
Our students today are not just digital natives; with so much of their world automated by algorithms, they are also AI natives. We have a responsibility to teach them not only how to code the technology around them, but how to critically consume it; how to see technology as an arena for power; and how to engage in a society that proactively considers the consequences of technological choices.
These conversations are not easy, but they merit our attention and effort as educators. If you are hesitant to dive into these topics in your classroom, that’s okay! Figure out what’s holding you back and strategize a path forward.
CONCERN: I don’t feel prepared to cover this content.
STRATEGY: Don’t avoid critical conversations; educate yourself.
Start by reading some of the literature on the topic. Books in this genre include Weapons of Math Destruction, Automating Inequality, The Black Box Society, and Algorithms of Oppression. Excellent long-form articles include “Machine Bias,” “Austerity Is an Algorithm,” and “Are Algorithms Building the New Infrastructure of Racism?”
You don’t have to be an expert to have critical conversations. Leading vulnerably and learning alongside your students can be a powerful way to invite students into the conversation. Be willing to admit what you don’t know, and find answers with students. Throw away perfectionism and model a growth mindset!
CONCERN: This isn’t part of my curriculum.
STRATEGY: Shift to a mindset that this is NOT extra-curricular; it’s essential.
I’ve noticed myself getting so focused on helping students master loops and procedures, the technical core of a technical discipline, that the human and ethical considerations can fall by the wayside. While that’s understandable, it’s important to remember that these topics are NOT extracurricular. Critical thinking and cross-disciplinary integration are essential skills your students need. Providing a social context can help students connect more deeply with the material, reinforce concepts, and encourage girls and students of color to engage more meaningfully.
CONCERN: I don’t have time to cover this ON TOP of required content.
STRATEGY: Find easy points in your curriculum to have critical conversations and keep conversations brief.
You can strategically intersperse conversations around equity, policy, history, and ethics in your curriculum so that they tie into the content already being taught. They also don’t need to take much time!
- Are you teaching a lesson on privacy? This would be a great time to discuss the pros and cons of facial recognition technology, within the context of 1) school safety, 2) predictive policing, or 3) mass surveillance.
- Are you learning about algorithms? Consider ending the unit with a Crash Course video or a one-day lesson on algorithmic bias.
Ask students how their attitudes shift as the stakes rise or the scenario changes, or have them compare their final views with how they felt before the reading, video, or discussion. Opportunities to discuss with peers also serve as “brain breaks,” and the real-world applications can get students reinvested in the course material.
CONCERN: I don’t have the resources.
STRATEGY: Connect with like-minded teachers at CSTA!
CSTA has a wealth of resources and connections through its annual conferences, chapters, and community hubs.
For exploring on your own, check out these low-lift, high-impact resources on algorithmic bias.
Curriculum/Activities:
Videos (pair with a video guide):
Reading:
As K-12 computer science teachers, we carry the responsibility of curating students’ first formal forays into the digital world. We want to keep it engaging, fun, and relevant. Integrating the humanities into computer science is one critical way we can all achieve our goals.
About the Author
Elizabeth Naameh teaches high school math and computer science, with a focus on equity and engagement. She founded the AP Computer Science program at USC Hybrid High and lectures at UCLA through the AP Readiness Program, which brings high-quality AP instruction to students and professional development for teachers throughout Los Angeles. Elizabeth is a committed advocate for girls and students of color in STEM, working to expand notions of “who” does CS. She has received training through AP CS 50, Code.org, and TEALS. Her accomplishments this year include competing in a triathlon, raising a kitten, starting a book club among friends, and not over-watering her succulents.