
Machines: How To Understand and Talk To Humans

Interacting with humans requires all of our technological senses to understand them beyond words and gestures. Computer vision brings awareness of the visual cues in facial expressions, body language, and appearance. Natural language understanding helps us grasp their meaning and intent. Sensing provides the context to choreograph our input and output modes. This session explores the cognitive services from Microsoft, IBM, Google, and others that power well-designed interactions with humans.

Questions

  1. How do I, as a device, robot, automaton, or other machine, effectively model good conversational patterns with humans using the tools at my disposal?
  2. How do I design and deliver appropriate responses based on contextual awareness across speech, sound, visual, haptic, and other outputs for humans?
  3. How do the cognitive services from my friends at Microsoft, IBM, Google, and others compare and contrast in empowering my emotional intelligence?

Organizer

Robert Tuttle, Exec Tech Director, frog

