Communication between humans and robots is a key aspect of humanoid robotics. Natural interaction requires robots capable of nonverbal communication; however, despite recent efforts, robots still exhibit only limited expressive capabilities. The purpose of this work is to create a facial expression generator for the 24-DoF head of the humanoid robot KOBIAN-R. In this manuscript, we present a system that, grounded in relevant studies of human communication and facial anatomy, can produce thousands of combinations of facial and neck movements. This wide range of expressions covers not only primary emotions but also complex or blended ones, as well as communicative acts that are not strictly categorized as emotions. Results showed that the recognition rate of expressions produced by this system is comparable to that of the most common facial expressions. Context-based recognition, which is especially important for more complex communicative acts, was also evaluated. Results showed that the generated robotic expressions can alter the meaning of a sentence in the same way as human expressions do. We conclude that our system can successfully improve the communication abilities of KOBIAN-R, making it capable of complex interaction in the future.