Student Predictions, Student Protections

Data science, predictive analytics, and artificial intelligence have become critical business tools. By learning from past business failures, educators can avoid dangerous pitfalls that might harm students. We will learn how algorithmic biases can perpetuate racism and inequality, how systems can bend statistics to hide poor predictive performance, and, finally, key questions you can ask as a buyer and consumer to expose these problems and protect your students.
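As a concrete illustration of how statistics can hide poor predictive performance, consider the "accuracy paradox" on imbalanced data. The numbers below are hypothetical, not drawn from any product discussed in the session:

```python
# Hypothetical imbalanced dataset: 5% of students are truly at risk.
actual    = [0] * 95 + [1] * 5   # 1 = at risk, 0 = not at risk
predicted = [0] * 100            # a "model" that never flags anyone

# Overall accuracy looks impressive...
accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)

# ...but recall shows the model identifies zero at-risk students.
true_positives = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))
recall = true_positives / sum(actual)

print(f"accuracy: {accuracy:.0%}")  # 95% -- looks impressive
print(f"recall:   {recall:.0%}")    # 0% -- misses every at-risk student
```

A vendor quoting only the first number would look excellent while delivering no predictive value, which is why per-class metrics matter when evaluating these products.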

Additional Supporting Materials

Learning Objectives

  1. How the algorithms employed by predictive systems can be biased
  2. How biased predictions can lead to dangerous/prejudicial outcomes
  3. Key questions to ask and key things to look for when evaluating predictive products to avoid putting bad predictions into practice
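One simple check a buyer can run against objective 1 is to compare how often a model flags students in different demographic groups. This sketch uses hypothetical counts and the "four-fifths rule," a common screening heuristic (not a legal test), which treats a ratio below 0.8 as worth investigating:

```python
# Hypothetical counts of students flagged by a predictive system,
# broken out by demographic group (group names are placeholders).
flagged = {"group_a": 30, "group_b": 12}    # students flagged per group
totals  = {"group_a": 100, "group_b": 100}  # students per group

# Selection rate for each group, then the ratio of lowest to highest.
rates = {g: flagged[g] / totals[g] for g in totals}
ratio = min(rates.values()) / max(rates.values())

print(f"selection rates: {rates}")
print(f"impact ratio: {ratio:.2f}")  # 0.40 -- well below the 0.8 heuristic
```

A large gap does not prove bias on its own, but it is exactly the kind of evidence that should prompt the evaluation questions this session covers.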

Speakers

Organizer

Christopher Walker, Chief Data Scientist, Illuminate Education, Inc.

