Understanding and Interpreting Nonverbal Language
ABSTRACT During social interactions we express ourselves not only through words but also through facial expressions, tone of voice, gestures, and body posture. However, today’s computers are still unable to reliably read and understand such nonverbal language. Current home and mobile assistants such as Apple’s Siri, Google Home, and Amazon Echo are limited to listening to their users. I envision a future where a personalised robotic assistant will complement the speech signal by reading the user’s facial expressions, body gestures, emotions, and intentions. In addition, better automatic analysis of human behavior has numerous applications in the fields of human-computer interaction, education, and healthcare.
In this talk I will provide a brief history of work on nonverbal behavior analysis and explore the challenges we still face. I will focus in particular on my work on facial behavior analysis, including facial expression recognition, eye gaze estimation, and emotion recognition. I will also discuss applications of such technologies in healthcare settings.
BIO Tadas Baltrušaitis is a post-doctoral associate at the Language Technologies Institute, Carnegie Mellon University, working with Prof. Louis-Philippe Morency. His primary research interests lie in the automatic understanding of nonverbal human behavior, computer vision, and multimodal machine learning. In particular, he is interested in applying such technologies to healthcare settings, with a focus on mental health. Before joining CMU, he was a post-doctoral researcher at the University of Cambridge, where he also received his Ph.D. and Bachelor’s degrees in Computer Science. His Ph.D. research focused on automatic facial expression analysis in especially difficult real-world settings.