Machines: How To Understand and Talk To Humans
Interacting with humans requires all our technological senses to understand them beyond words and gestures. Computer vision brings awareness of visual cues in facial expressions, body language, and appearance. Natural language understanding helps us grasp their meaning and intent. Sensing provides the context to choreograph our input and output modes. This session explores the cognitive services from Microsoft, IBM, Google, and others that drive well-designed interactions with humans.
Additional Supporting Materials
- How do I, as a device, robot, automaton, or other machine, effectively model good conversational patterns with humans using the tools at my disposal?
- How do I design and deliver appropriate responses based on contextual awareness across speech, sound, visual, haptic, and other outputs for humans?
- How do the cognitive services from my friends at Microsoft, IBM, Google, and others compare and contrast in empowering my emotional intelligence?
- Robert Tuttle, Executive Technology Director, frog