SXSW 2023

Helping People Navigate an AI World


Existing in today’s world increasingly requires AI literacy.
For the sake of people’s safety, their careers, and higher human aims such as freedom and flourishing, there is a growing need to navigate the AI systems and other advanced technologies that pervade our lives.

We must move beyond ascribing the achievements of AI to magic and instead provide transparency into, and education about, AI systems.

In our research and design workshops, we’ve learned how we can help teams increase AI literacy by taking a broader view of explanations. This broad view involves:
* improving explanations at the moment AI makes a decision or inference
* adding explanations throughout user journeys
* and explaining the impacts of AI outside of tech products: in user manuals, journalism, and K-12 curricula.



  1. Explainability, providing human-understandable reasons and context for decisions made by an AI system, is a key aspect of responsible AI development.
  2. We must design experiences that build AI literacy and help people understand the consequences and impacts of AI.
  3. A broad view of explainability, which takes us well beyond today’s in-the-moment explanations, is the best path for improving the UX of AI products.


  • Patrick Gage Kelley, Research, Google


Meta Information:

  • Event: SXSW
  • Format: Presentation
  • Track: Design
  • Level: Beginner
