Advancing AI Literacy Lessons
Unseen by the Machine: When People and Rocks Are Misread
Science | Grades 9–12
Duration: 60 minutes
Author: Travis Garcia, Computer Science Teacher
This lesson introduces students to the concept of bias in technology through a relatable lens: moments when humans feel “unseen” by machines. Students begin by sharing personal experiences with technology missteps, then connect these experiences to Dr. Joy Buolamwini’s work on coded bias. Through discussion and analysis of bias scenarios, students explore how incomplete or skewed data can lead to harmful outcomes in AI systems. The lesson concludes by bridging to scientific contexts, asking students to consider how misclassification might also occur in geology and other fields, laying the foundation for the next lesson on rocks and AI.
Lesson Objectives
- Explain how technology can misinterpret or fail to recognize individuals or objects.
- Identify examples of bias in everyday technology.
- Connect the concept of bias to real-world implications in science and society.
Essential Questions
- What happens when technology gets it wrong?
- Why do systems fail to recognize certain people or objects?
- How might these mistakes affect people’s lives?