Student Predictions, Student Protections
Data science, predictive analytics, and artificial intelligence have become critical business tools. By learning from past failures in industry, educators can avoid dangerous pitfalls that might harm students. This session examines how algorithmic bias can perpetuate racism and inequality, how systems can bend statistics to hide poor predictive performance, and which key questions you can ask as buyers and consumers to expose these problems and protect your students.
Additional Supporting Materials
- How the algorithms employed by predictive systems can be biased
- How biased predictions can lead to dangerous/prejudicial outcomes
- Key questions to ask and key things to look for when evaluating predictive products to avoid putting bad predictions into practice
- Christopher Walker, Chief Data Scientist, Illuminate Education, Inc.