Soo-Young Lee
Korea Advanced Institute of Science and Technology (KAIST), Republic of Korea
Title: Digital Companion with Human-like Emotion and Ethics
Abstract
For successful interaction between humans and digital companions, i.e., machine agents, the digital companions need to be bound by human-like ethics as well as to carry on emotional dialogue, understand human emotion, and express emotion of their own. In this talk we present our approaches to developing human-like ethical and emotional conversational agents as part of the Korean Flagship AI Program. The emotion of human users is estimated from text, audio, and facial expression during verbal conversation, and the emotion of the intelligent agents is expressed through speech and facial expression. Specifically, we will show how our ensemble networks won the Emotion Recognition in the Wild (EmotiW 2015) challenge with 61.6% accuracy in recognizing seven emotions from facial expression. A multimodal classifier then combines text, voice, and facial video for better accuracy. A deep-learning-based Text-to-Speech (TTS) system for expressing emotions will also be introduced. The emotions of the human users and the agents interact with each other during the dialogue. Our conversational agents have chitchat and Question-and-Answer (Q&A) modes, and in chitchat mode they respond differently to different emotional states. The internal states will then be further extended to trustworthiness, implicit intention, and personality. Finally, we will discuss how the agents may learn human-like ethics during human-machine interactions.
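The abstract mentions combining text, voice, and facial video for more accurate emotion recognition. As a minimal sketch (not taken from the talk), assuming separate per-modality classifiers that each output softmax probabilities over the seven EmotiW emotion classes, a simple late-fusion multimodal classifier could look like the following; the modality names, weights, and probability values are hypothetical placeholders.

```python
import numpy as np

# The seven emotion classes used in the EmotiW facial-expression challenge.
EMOTIONS = ["angry", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def fuse_modalities(probs_by_modality, weights=None):
    """Late fusion of per-modality emotion probabilities.

    probs_by_modality: dict mapping a modality name (e.g. "text", "audio",
        "face") to a length-7 array of softmax probabilities.
    weights: optional dict of per-modality weights (for example, derived
        from validation accuracy); defaults to equal weighting.
    Returns the fused probability vector and the predicted emotion label.
    """
    names = list(probs_by_modality)
    if weights is None:
        weights = {m: 1.0 for m in names}
    total = sum(weights[m] for m in names)
    fused = sum(weights[m] * np.asarray(probs_by_modality[m]) for m in names) / total
    return fused, EMOTIONS[int(np.argmax(fused))]

# Hypothetical softmax outputs from three single-modality classifiers.
fused, label = fuse_modalities(
    {
        "text":  [0.05, 0.02, 0.03, 0.60, 0.20, 0.05, 0.05],
        "audio": [0.10, 0.05, 0.05, 0.40, 0.25, 0.10, 0.05],
        "face":  [0.02, 0.01, 0.02, 0.75, 0.10, 0.05, 0.05],
    },
    weights={"text": 0.8, "audio": 0.7, "face": 1.0},
)
print(label, fused.round(3))
```

Weighted averaging of per-modality outputs is only one possible fusion choice; the talk's own multimodal classifier may instead learn the combination jointly (for example, with a trained fusion network over the single-modality features).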